Dec 01 08:17:01 crc systemd[1]: Starting Kubernetes Kubelet...
Dec 01 08:17:01 crc restorecon[4686]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 01 08:17:01 crc restorecon[4686]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Dec 01 08:17:01 crc restorecon[4686]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:01 crc 
restorecon[4686]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 08:17:01 crc restorecon[4686]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 01 08:17:01 crc restorecon[4686]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 08:17:01 crc restorecon[4686]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 08:17:01 crc 
restorecon[4686]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 01 
08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin
to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:01 crc restorecon[4686]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:01 crc restorecon[4686]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:01 crc restorecon[4686]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:01 crc restorecon[4686]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:01 crc restorecon[4686]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:01 crc restorecon[4686]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:01 crc restorecon[4686]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:01 crc restorecon[4686]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:01 crc restorecon[4686]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 
08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 08:17:01 crc restorecon[4686]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 08:17:01 crc restorecon[4686]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 01 08:17:01 crc 
restorecon[4686]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc 
restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc 
restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:01 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 01 08:17:02 crc restorecon[4686]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 08:17:02 crc restorecon[4686]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 08:17:02 crc restorecon[4686]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 01 08:17:02 crc restorecon[4686]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 08:17:02 crc restorecon[4686]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 01 08:17:02 crc restorecon[4686]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 
01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]:
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 
crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc 
restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc 
restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc 
restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 08:17:02 crc restorecon[4686]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 01 08:17:02 crc restorecon[4686]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 01 08:17:02 crc restorecon[4686]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0
Dec 01 08:17:02 crc kubenswrapper[5004]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Dec 01 08:17:02 crc kubenswrapper[5004]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Dec 01 08:17:02 crc kubenswrapper[5004]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Dec 01 08:17:02 crc kubenswrapper[5004]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Dec 01 08:17:02 crc kubenswrapper[5004]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Dec 01 08:17:02 crc kubenswrapper[5004]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.575019    5004 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.577794    5004 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.577812    5004 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.577823    5004 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.577827    5004 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.577831    5004 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.577835    5004 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.577838    5004 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.577842    5004 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.577846    5004 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.577862    5004 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.577867    5004 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.577872    5004 feature_gate.go:330] unrecognized feature gate: OVNObservability
Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.577876    5004 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.577880    5004 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.577884    5004 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.577888    5004 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.577892    5004 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.577896    5004 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.577900    5004 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.577905    5004 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.577908    5004 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.577913    5004 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.577916    5004 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.577920    5004 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.577924    5004 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.577928    5004 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.577931    5004 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.577935    5004 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.577939    5004 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.577942    5004 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.577946    5004 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.577949    5004 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.577953    5004 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.577956    5004 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.577960    5004 feature_gate.go:330] unrecognized feature gate: Example
Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.577963    5004 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.577967    5004 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.577971    5004 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.577979    5004 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.577982    5004 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.577986    5004 feature_gate.go:330] unrecognized feature gate: NewOLM
Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.577989    5004 feature_gate.go:330] unrecognized feature gate: PinnedImages
Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.577993    5004 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.577997    5004 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.578001    5004 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.578004    5004 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.578008    5004 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.578011    5004 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.578014    5004 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.578018    5004 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.578023    5004 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.578034 5004 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.578038 5004 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.578042 5004 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.578047 5004 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.578051 5004 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.578056 5004 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.578059 5004 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.578064 5004 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.578069 5004 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.578073 5004 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.578077 5004 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.578081 5004 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.578084 5004 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.578088 5004 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.578093 5004 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.578097 5004 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.578102 5004 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.578106 5004 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.578124 5004 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.578128 5004 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578197 5004 flags.go:64] FLAG: --address="0.0.0.0" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578205 5004 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578218 5004 flags.go:64] FLAG: --anonymous-auth="true" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578228 5004 flags.go:64] FLAG: --application-metrics-count-limit="100" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578234 5004 flags.go:64] FLAG: --authentication-token-webhook="false" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578238 5004 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578243 5004 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578248 5004 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578253 5004 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578257 5004 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578261 5004 
flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578265 5004 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578269 5004 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578274 5004 flags.go:64] FLAG: --cgroup-root="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578278 5004 flags.go:64] FLAG: --cgroups-per-qos="true" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578282 5004 flags.go:64] FLAG: --client-ca-file="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578286 5004 flags.go:64] FLAG: --cloud-config="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578290 5004 flags.go:64] FLAG: --cloud-provider="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578294 5004 flags.go:64] FLAG: --cluster-dns="[]" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578314 5004 flags.go:64] FLAG: --cluster-domain="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578322 5004 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578326 5004 flags.go:64] FLAG: --config-dir="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578331 5004 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578336 5004 flags.go:64] FLAG: --container-log-max-files="5" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578342 5004 flags.go:64] FLAG: --container-log-max-size="10Mi" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578346 5004 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578351 5004 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 
08:17:02.578355 5004 flags.go:64] FLAG: --containerd-namespace="k8s.io" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578360 5004 flags.go:64] FLAG: --contention-profiling="false" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578364 5004 flags.go:64] FLAG: --cpu-cfs-quota="true" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578368 5004 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578373 5004 flags.go:64] FLAG: --cpu-manager-policy="none" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578376 5004 flags.go:64] FLAG: --cpu-manager-policy-options="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578382 5004 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578386 5004 flags.go:64] FLAG: --enable-controller-attach-detach="true" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578391 5004 flags.go:64] FLAG: --enable-debugging-handlers="true" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578395 5004 flags.go:64] FLAG: --enable-load-reader="false" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578399 5004 flags.go:64] FLAG: --enable-server="true" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578403 5004 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578424 5004 flags.go:64] FLAG: --event-burst="100" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578429 5004 flags.go:64] FLAG: --event-qps="50" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578433 5004 flags.go:64] FLAG: --event-storage-age-limit="default=0" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578437 5004 flags.go:64] FLAG: --event-storage-event-limit="default=0" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578441 5004 flags.go:64] FLAG: --eviction-hard="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 
08:17:02.578446 5004 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578451 5004 flags.go:64] FLAG: --eviction-minimum-reclaim="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578455 5004 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578459 5004 flags.go:64] FLAG: --eviction-soft="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578463 5004 flags.go:64] FLAG: --eviction-soft-grace-period="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578467 5004 flags.go:64] FLAG: --exit-on-lock-contention="false" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578471 5004 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578476 5004 flags.go:64] FLAG: --experimental-mounter-path="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578480 5004 flags.go:64] FLAG: --fail-cgroupv1="false" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578484 5004 flags.go:64] FLAG: --fail-swap-on="true" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578487 5004 flags.go:64] FLAG: --feature-gates="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578493 5004 flags.go:64] FLAG: --file-check-frequency="20s" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578497 5004 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578501 5004 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578505 5004 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578510 5004 flags.go:64] FLAG: --healthz-port="10248" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578514 5004 flags.go:64] FLAG: --help="false" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 
08:17:02.578518 5004 flags.go:64] FLAG: --hostname-override="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578522 5004 flags.go:64] FLAG: --housekeeping-interval="10s" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578526 5004 flags.go:64] FLAG: --http-check-frequency="20s" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578531 5004 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578535 5004 flags.go:64] FLAG: --image-credential-provider-config="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578539 5004 flags.go:64] FLAG: --image-gc-high-threshold="85" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578546 5004 flags.go:64] FLAG: --image-gc-low-threshold="80" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578550 5004 flags.go:64] FLAG: --image-service-endpoint="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578567 5004 flags.go:64] FLAG: --kernel-memcg-notification="false" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578572 5004 flags.go:64] FLAG: --kube-api-burst="100" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578576 5004 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578580 5004 flags.go:64] FLAG: --kube-api-qps="50" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578584 5004 flags.go:64] FLAG: --kube-reserved="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578588 5004 flags.go:64] FLAG: --kube-reserved-cgroup="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578600 5004 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578604 5004 flags.go:64] FLAG: --kubelet-cgroups="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578608 5004 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Dec 01 08:17:02 crc 
kubenswrapper[5004]: I1201 08:17:02.578612 5004 flags.go:64] FLAG: --lock-file="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578616 5004 flags.go:64] FLAG: --log-cadvisor-usage="false" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578621 5004 flags.go:64] FLAG: --log-flush-frequency="5s" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578625 5004 flags.go:64] FLAG: --log-json-info-buffer-size="0" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578631 5004 flags.go:64] FLAG: --log-json-split-stream="false" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578635 5004 flags.go:64] FLAG: --log-text-info-buffer-size="0" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578639 5004 flags.go:64] FLAG: --log-text-split-stream="false" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578643 5004 flags.go:64] FLAG: --logging-format="text" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578647 5004 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578651 5004 flags.go:64] FLAG: --make-iptables-util-chains="true" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578660 5004 flags.go:64] FLAG: --manifest-url="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578664 5004 flags.go:64] FLAG: --manifest-url-header="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578669 5004 flags.go:64] FLAG: --max-housekeeping-interval="15s" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578674 5004 flags.go:64] FLAG: --max-open-files="1000000" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578679 5004 flags.go:64] FLAG: --max-pods="110" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578683 5004 flags.go:64] FLAG: --maximum-dead-containers="-1" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578687 5004 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Dec 01 08:17:02 crc 
kubenswrapper[5004]: I1201 08:17:02.578691 5004 flags.go:64] FLAG: --memory-manager-policy="None" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578695 5004 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578699 5004 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578703 5004 flags.go:64] FLAG: --node-ip="192.168.126.11" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578709 5004 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578718 5004 flags.go:64] FLAG: --node-status-max-images="50" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578722 5004 flags.go:64] FLAG: --node-status-update-frequency="10s" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578726 5004 flags.go:64] FLAG: --oom-score-adj="-999" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578730 5004 flags.go:64] FLAG: --pod-cidr="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578734 5004 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578739 5004 flags.go:64] FLAG: --pod-manifest-path="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578744 5004 flags.go:64] FLAG: --pod-max-pids="-1" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578747 5004 flags.go:64] FLAG: --pods-per-core="0" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578751 5004 flags.go:64] FLAG: --port="10250" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578755 5004 flags.go:64] FLAG: --protect-kernel-defaults="false" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578760 5004 flags.go:64] FLAG: 
--provider-id="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578768 5004 flags.go:64] FLAG: --qos-reserved="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578773 5004 flags.go:64] FLAG: --read-only-port="10255" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578777 5004 flags.go:64] FLAG: --register-node="true" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578781 5004 flags.go:64] FLAG: --register-schedulable="true" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578785 5004 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578791 5004 flags.go:64] FLAG: --registry-burst="10" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578795 5004 flags.go:64] FLAG: --registry-qps="5" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578800 5004 flags.go:64] FLAG: --reserved-cpus="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578804 5004 flags.go:64] FLAG: --reserved-memory="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578809 5004 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578813 5004 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578817 5004 flags.go:64] FLAG: --rotate-certificates="false" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578821 5004 flags.go:64] FLAG: --rotate-server-certificates="false" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578825 5004 flags.go:64] FLAG: --runonce="false" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578829 5004 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578833 5004 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578838 5004 flags.go:64] FLAG: --seccomp-default="false" Dec 
01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578842 5004 flags.go:64] FLAG: --serialize-image-pulls="true" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578846 5004 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578850 5004 flags.go:64] FLAG: --storage-driver-db="cadvisor" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578854 5004 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578860 5004 flags.go:64] FLAG: --storage-driver-password="root" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578864 5004 flags.go:64] FLAG: --storage-driver-secure="false" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578868 5004 flags.go:64] FLAG: --storage-driver-table="stats" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578872 5004 flags.go:64] FLAG: --storage-driver-user="root" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578876 5004 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578880 5004 flags.go:64] FLAG: --sync-frequency="1m0s" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578884 5004 flags.go:64] FLAG: --system-cgroups="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578888 5004 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578894 5004 flags.go:64] FLAG: --system-reserved-cgroup="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578898 5004 flags.go:64] FLAG: --tls-cert-file="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578902 5004 flags.go:64] FLAG: --tls-cipher-suites="[]" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578909 5004 flags.go:64] FLAG: --tls-min-version="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578913 5004 flags.go:64] FLAG: 
--tls-private-key-file="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578917 5004 flags.go:64] FLAG: --topology-manager-policy="none" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578921 5004 flags.go:64] FLAG: --topology-manager-policy-options="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578930 5004 flags.go:64] FLAG: --topology-manager-scope="container" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578935 5004 flags.go:64] FLAG: --v="2" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578940 5004 flags.go:64] FLAG: --version="false" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578945 5004 flags.go:64] FLAG: --vmodule="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578950 5004 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.578957 5004 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.579083 5004 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.579088 5004 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.579092 5004 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.579096 5004 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.579099 5004 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.579103 5004 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.579108 5004 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.579112 5004 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.579116 5004 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.579119 5004 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.579123 5004 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.579128 5004 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.579131 5004 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.579134 5004 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.579138 5004 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.579141 5004 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.579145 5004 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.579148 5004 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.579151 5004 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.579156 5004 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.579161 5004 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.579165 5004 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.579169 5004 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.579173 5004 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.579176 5004 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.579180 5004 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.579183 5004 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.579187 5004 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.579190 5004 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.579194 5004 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.579205 5004 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.579211 5004 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.579214 5004 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.579218 5004 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 01 
08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.579221 5004 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.579225 5004 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.579228 5004 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.579232 5004 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.579235 5004 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.579239 5004 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.579242 5004 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.579245 5004 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.579249 5004 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.579254 5004 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.579258 5004 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.579261 5004 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.579264 5004 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.579268 5004 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 
08:17:02.579272 5004 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.579276 5004 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.579279 5004 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.579283 5004 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.579287 5004 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.579290 5004 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.579295 5004 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.579300 5004 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.579305 5004 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.579309 5004 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.579313 5004 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.579317 5004 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.579321 5004 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.579324 5004 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.579328 5004 
feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.579333 5004 feature_gate.go:330] unrecognized feature gate: Example Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.579336 5004 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.579339 5004 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.579348 5004 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.579351 5004 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.579355 5004 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.579358 5004 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.579362 5004 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.579367 5004 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.593775 5004 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.593836 5004 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 
08:17:02.593992 5004 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.594007 5004 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.594017 5004 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.594026 5004 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.594035 5004 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.594044 5004 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.594052 5004 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.594060 5004 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.594072 5004 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.594085 5004 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.594129 5004 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.594137 5004 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.594147 5004 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.594156 5004 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.594165 5004 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.594173 5004 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.594182 5004 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.594190 5004 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.594198 5004 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.594206 5004 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.594214 5004 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.594222 5004 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.594230 5004 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.594238 5004 feature_gate.go:330] 
unrecognized feature gate: NetworkLiveMigration Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.594247 5004 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.594255 5004 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.594263 5004 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.594270 5004 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.594280 5004 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.594288 5004 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.594296 5004 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.594307 5004 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.594318 5004 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.594326 5004 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.594335 5004 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.594343 5004 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.594352 5004 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.594359 5004 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.594367 5004 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.594375 5004 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.594383 5004 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.594391 5004 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.594398 5004 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.594406 5004 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.594415 5004 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.594422 5004 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 
08:17:02.594431 5004 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.594439 5004 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.594447 5004 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.594454 5004 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.594464 5004 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.594475 5004 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.594484 5004 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.594492 5004 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.594499 5004 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.594507 5004 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.594516 5004 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.594524 5004 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.594532 5004 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.594540 5004 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.594548 5004 feature_gate.go:330] unrecognized 
feature gate: AlibabaPlatform Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.594555 5004 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.594610 5004 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.594618 5004 feature_gate.go:330] unrecognized feature gate: Example Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.594628 5004 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.594638 5004 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.594647 5004 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.594655 5004 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.594663 5004 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.594672 5004 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.594682 5004 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.594699 5004 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true 
VolumeAttributesClass:false]} Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.594975 5004 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.594993 5004 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.595002 5004 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.595010 5004 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.595018 5004 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.595027 5004 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.595034 5004 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.595042 5004 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.595052 5004 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.595060 5004 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.595068 5004 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.595075 5004 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.595083 5004 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.595090 5004 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.595098 5004 
feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.595106 5004 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.595114 5004 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.595122 5004 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.595130 5004 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.595138 5004 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.595146 5004 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.595154 5004 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.595162 5004 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.595170 5004 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.595178 5004 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.595186 5004 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.595194 5004 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.595204 5004 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.595214 5004 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.595223 5004 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.595231 5004 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.595240 5004 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.595248 5004 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.595256 5004 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.595263 5004 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.595271 5004 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.595279 5004 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.595288 5004 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.595297 5004 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.595305 5004 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.595313 5004 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.595321 5004 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.595329 5004 feature_gate.go:330] 
unrecognized feature gate: AutomatedEtcdBackup Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.595339 5004 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.595348 5004 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.595355 5004 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.595363 5004 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.595371 5004 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.595379 5004 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.595386 5004 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.595394 5004 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.595402 5004 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.595412 5004 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.595423 5004 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.595434 5004 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.595442 5004 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.595450 5004 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.595458 5004 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.595466 5004 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.595473 5004 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.595481 5004 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.595489 5004 feature_gate.go:330] unrecognized feature gate: Example Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.595497 5004 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.595505 5004 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.595513 5004 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.595521 5004 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.595528 5004 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.595536 5004 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.595544 5004 
feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.595551 5004 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.595588 5004 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.595601 5004 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.596184 5004 server.go:940] "Client rotation is on, will bootstrap in background" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.600772 5004 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.600911 5004 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.601687 5004 server.go:997] "Starting client certificate rotation" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.601723 5004 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.601938 5004 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-11-18 04:18:16.815863779 +0000 UTC Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.602061 5004 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.609100 5004 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.610874 5004 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 01 08:17:02 crc kubenswrapper[5004]: E1201 08:17:02.611085 5004 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.75:6443: connect: connection refused" logger="UnhandledError" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.619308 5004 log.go:25] "Validated CRI v1 runtime API" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.642329 5004 log.go:25] "Validated CRI v1 image API" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.644661 5004 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.647639 5004 
fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-12-01-08-12-05-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.647692 5004 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:41 fsType:tmpfs blockSize:0}] Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.675982 5004 manager.go:217] Machine: {Timestamp:2025-12-01 08:17:02.673840267 +0000 UTC m=+0.238832329 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799998 MemoryCapacity:33654120448 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:77547a65-c048-47ee-89b1-8422fc81b7aa BootID:c8341267-df98-484f-a5e7-cf024fab437c Filesystems:[{Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:41 Capacity:1073741824 Type:vfs Inodes:4108169 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827060224 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827060224 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 
Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365408768 Type:vfs Inodes:821633 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:9e:68:63 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:9e:68:63 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:f7:29:55 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:97:c7:de Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:c8:0b:1a Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:1e:92:4b Speed:-1 Mtu:1496} {Name:eth10 MacAddress:a2:e0:1c:b4:b1:b9 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:9a:67:e9:ab:80:e2 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654120448 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 
Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.676382 5004 manager_no_libpfm.go:29] cAdvisor is build without cgo 
and/or libpfm support. Perf event counters are not available. Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.676620 5004 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.677112 5004 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.677408 5004 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.677452 5004 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":nu
ll,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.678223 5004 topology_manager.go:138] "Creating topology manager with none policy" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.678256 5004 container_manager_linux.go:303] "Creating device plugin manager" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.678415 5004 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.678457 5004 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.678981 5004 state_mem.go:36] "Initialized new in-memory state store" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.679115 5004 server.go:1245] "Using root directory" path="/var/lib/kubelet" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.680008 5004 kubelet.go:418] "Attempting to sync node with API server" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.680039 5004 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.680063 5004 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.680084 5004 kubelet.go:324] "Adding apiserver pod source" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.680102 5004 apiserver.go:42] "Waiting for node sync before watching apiserver pods" 
Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.682811 5004 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.683196 5004 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.75:6443: connect: connection refused Dec 01 08:17:02 crc kubenswrapper[5004]: E1201 08:17:02.683307 5004 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.75:6443: connect: connection refused" logger="UnhandledError" Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.683375 5004 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.75:6443: connect: connection refused Dec 01 08:17:02 crc kubenswrapper[5004]: E1201 08:17:02.683473 5004 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.75:6443: connect: connection refused" logger="UnhandledError" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.683690 5004 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". 
Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.684917 5004 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.685797 5004 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.685845 5004 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.685864 5004 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.685881 5004 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.685909 5004 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.685930 5004 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.685949 5004 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.686009 5004 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.686029 5004 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.686051 5004 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.686073 5004 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.686090 5004 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.686390 5004 plugins.go:603] "Loaded volume plugin" 
pluginName="kubernetes.io/csi" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.687215 5004 server.go:1280] "Started kubelet" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.687450 5004 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.687557 5004 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.75:6443: connect: connection refused Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.687475 5004 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.688132 5004 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Dec 01 08:17:02 crc systemd[1]: Started Kubernetes Kubelet. Dec 01 08:17:02 crc kubenswrapper[5004]: E1201 08:17:02.690455 5004 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.75:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.187d0973f815db63 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-01 08:17:02.687173475 +0000 UTC m=+0.252165497,LastTimestamp:2025-12-01 08:17:02.687173475 +0000 UTC m=+0.252165497,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.692224 5004 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is 
enabled Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.692272 5004 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.692314 5004 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-17 03:06:20.383143538 +0000 UTC Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.692760 5004 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 1122h49m17.690389764s for next certificate rotation Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.692525 5004 volume_manager.go:287] "The desired_state_of_world populator starts" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.693074 5004 volume_manager.go:289] "Starting Kubelet Volume Manager" Dec 01 08:17:02 crc kubenswrapper[5004]: E1201 08:17:02.692462 5004 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 01 08:17:02 crc kubenswrapper[5004]: E1201 08:17:02.692934 5004 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.75:6443: connect: connection refused" interval="200ms" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.692546 5004 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.693544 5004 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.75:6443: connect: connection refused Dec 01 08:17:02 crc kubenswrapper[5004]: E1201 08:17:02.693744 5004 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list 
*v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.75:6443: connect: connection refused" logger="UnhandledError" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.694403 5004 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.694435 5004 factory.go:55] Registering systemd factory Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.694449 5004 factory.go:221] Registration of the systemd container factory successfully Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.695239 5004 factory.go:153] Registering CRI-O factory Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.695286 5004 factory.go:221] Registration of the crio container factory successfully Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.695340 5004 factory.go:103] Registering Raw factory Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.695374 5004 manager.go:1196] Started watching for new ooms in manager Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.695715 5004 server.go:460] "Adding debug handlers to kubelet server" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.697626 5004 manager.go:319] Starting recovery of all containers Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.720804 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.720909 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.720932 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.720953 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.720971 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.720989 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.721006 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.721023 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.721043 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.721061 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.721078 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.721096 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.721115 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.721135 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" 
volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.721151 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.721167 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.721187 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.721204 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.721223 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.721240 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" 
seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.721293 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.721357 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.721378 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.721406 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.721432 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.721458 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.721483 5004 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.721503 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.721527 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.721551 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.721607 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.721629 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.721658 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.721682 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.721705 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.721728 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.721750 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.721769 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.721787 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" 
volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.721806 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.721823 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.721842 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.721859 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.721875 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.721894 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" 
volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.721911 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.722018 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.722042 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.722060 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.722077 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.722094 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" 
volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.722112 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.722134 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.722153 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.722173 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.722214 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.722233 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" 
volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.722252 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.722269 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.722295 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.722312 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.722328 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.722345 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" 
volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.722363 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.722390 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.722412 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.722437 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.722459 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.722476 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.722510 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.722527 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.722543 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.722589 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.722607 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.722624 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" 
volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.722650 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.722669 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.722686 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.722703 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.722722 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.722740 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" 
volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.722756 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.722775 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.722791 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.722809 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.722826 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.722844 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" 
seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.722861 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.722880 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.722898 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.722914 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.722933 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.722950 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.722968 5004 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.722987 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.723004 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.723022 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.723040 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.723057 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.723073 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.723090 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.723107 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.723124 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.723140 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.723164 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.723182 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" 
volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.723201 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.723219 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.723236 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.723255 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.723273 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.723315 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" 
volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.723335 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.723353 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.723372 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.723389 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.723418 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.723437 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" 
volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.723464 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.723482 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.723500 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.723517 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.723534 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.723553 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" 
volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.723767 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.723786 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.723805 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.723824 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.723844 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.723863 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" 
volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.723882 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.723900 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.723918 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.723936 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.723953 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.723970 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" 
volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.723987 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.724004 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.724020 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.724038 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.724056 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.724074 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" 
seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.724091 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.724108 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.724127 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.724144 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.724161 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.724179 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 
08:17:02.724198 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.724215 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.724234 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.724251 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.724268 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.724285 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.724303 5004 reconstruct.go:130] "Volume is 
marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.724320 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.724338 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.724355 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.724374 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.724392 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.724410 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.724427 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.724443 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.724461 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.724480 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.724496 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.724516 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" 
volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.724535 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.724552 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.724593 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.724611 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.724628 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.724644 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" 
seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.724661 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.724679 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.724698 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.724716 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.724733 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.724750 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.724767 5004 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.724837 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.724856 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.724874 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.724893 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.724921 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.724940 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.724957 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.724973 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.724990 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.725008 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.726825 5004 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.726887 5004 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.726913 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.726934 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.726953 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.726972 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.726990 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.727010 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.727027 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.727047 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.727066 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.727084 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.727155 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.727196 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" 
volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.727220 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.727244 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.727267 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.727291 5004 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.727316 5004 reconstruct.go:97] "Volume reconstruction finished" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.727333 5004 reconciler.go:26] "Reconciler: start to sync state" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.730919 5004 manager.go:324] Recovery completed Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.746847 5004 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.748928 5004 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.749032 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.749057 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.750157 5004 cpu_manager.go:225] "Starting CPU manager" policy="none" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.750351 5004 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.750525 5004 state_mem.go:36] "Initialized new in-memory state store" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.754924 5004 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.757551 5004 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.757602 5004 status_manager.go:217] "Starting to sync pod status with apiserver" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.757630 5004 kubelet.go:2335] "Starting kubelet main sync loop" Dec 01 08:17:02 crc kubenswrapper[5004]: E1201 08:17:02.757678 5004 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Dec 01 08:17:02 crc kubenswrapper[5004]: W1201 08:17:02.761266 5004 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.75:6443: connect: connection refused Dec 01 08:17:02 crc kubenswrapper[5004]: E1201 08:17:02.761334 5004 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.75:6443: connect: connection refused" logger="UnhandledError" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.763779 5004 policy_none.go:49] "None policy: Start" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.765989 5004 memory_manager.go:170] "Starting memorymanager" policy="None" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.766037 5004 state_mem.go:35] "Initializing new in-memory state store" Dec 01 08:17:02 crc kubenswrapper[5004]: E1201 08:17:02.793701 5004 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.849101 5004 manager.go:334] "Starting Device Plugin manager" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.849176 5004 manager.go:513] "Failed to read data from 
checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.849190 5004 server.go:79] "Starting device plugin registration server" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.849739 5004 eviction_manager.go:189] "Eviction manager: starting control loop" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.849758 5004 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.849969 5004 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.850161 5004 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.850192 5004 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.858973 5004 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc"] Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.859273 5004 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.861922 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.861983 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.862001 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.862201 5004 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.862434 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 08:17:02 crc kubenswrapper[5004]: E1201 08:17:02.862470 5004 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.862510 5004 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.863626 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.863713 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.863732 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.864011 5004 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.864147 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.864227 5004 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.864161 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.864317 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.864338 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.865980 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.866057 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.866106 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.866012 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.866209 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.866237 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.866634 5004 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 08:17:02 crc 
kubenswrapper[5004]: I1201 08:17:02.866733 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.866962 5004 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.868187 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.868219 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.868242 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.868218 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.868286 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.868306 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.868534 5004 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.868859 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.868922 5004 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.869543 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.869628 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.869654 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.870096 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.870156 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.870170 5004 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.870199 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.870216 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.871375 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.871425 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 
08:17:02.871451 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:02 crc kubenswrapper[5004]: E1201 08:17:02.893800 5004 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.75:6443: connect: connection refused" interval="400ms" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.929739 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.929811 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.929846 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.929878 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 01 08:17:02 crc kubenswrapper[5004]: 
I1201 08:17:02.929908 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.929943 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.929979 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.930006 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.930034 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.930093 5004 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.930120 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.930151 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.930182 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.930210 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.930236 5004 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.950978 5004 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.952487 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.952523 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.952533 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:02 crc kubenswrapper[5004]: I1201 08:17:02.952555 5004 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 01 08:17:02 crc kubenswrapper[5004]: E1201 08:17:02.953052 5004 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.75:6443: connect: connection refused" node="crc" Dec 01 08:17:03 crc kubenswrapper[5004]: I1201 08:17:03.031551 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 01 08:17:03 crc kubenswrapper[5004]: I1201 08:17:03.031661 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" 
(UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 08:17:03 crc kubenswrapper[5004]: I1201 08:17:03.031694 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 08:17:03 crc kubenswrapper[5004]: I1201 08:17:03.031720 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 08:17:03 crc kubenswrapper[5004]: I1201 08:17:03.031752 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 08:17:03 crc kubenswrapper[5004]: I1201 08:17:03.031781 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 08:17:03 crc kubenswrapper[5004]: I1201 08:17:03.031821 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 01 08:17:03 crc kubenswrapper[5004]: I1201 08:17:03.031849 5004 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 08:17:03 crc kubenswrapper[5004]: I1201 08:17:03.031894 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 01 08:17:03 crc kubenswrapper[5004]: I1201 08:17:03.031900 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 01 08:17:03 crc kubenswrapper[5004]: I1201 08:17:03.031924 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 01 08:17:03 crc kubenswrapper[5004]: I1201 08:17:03.031953 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 08:17:03 crc kubenswrapper[5004]: I1201 08:17:03.031955 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod 
\"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 08:17:03 crc kubenswrapper[5004]: I1201 08:17:03.032001 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 01 08:17:03 crc kubenswrapper[5004]: I1201 08:17:03.031902 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 08:17:03 crc kubenswrapper[5004]: I1201 08:17:03.032054 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 08:17:03 crc kubenswrapper[5004]: I1201 08:17:03.032059 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 08:17:03 crc kubenswrapper[5004]: I1201 08:17:03.032062 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 08:17:03 crc kubenswrapper[5004]: I1201 08:17:03.032021 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 08:17:03 crc kubenswrapper[5004]: I1201 08:17:03.032113 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 01 08:17:03 crc kubenswrapper[5004]: I1201 08:17:03.032116 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 08:17:03 crc kubenswrapper[5004]: I1201 08:17:03.032139 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 08:17:03 crc kubenswrapper[5004]: I1201 08:17:03.032148 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 01 08:17:03 crc kubenswrapper[5004]: I1201 08:17:03.031980 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 08:17:03 crc 
kubenswrapper[5004]: I1201 08:17:03.032204 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 08:17:03 crc kubenswrapper[5004]: I1201 08:17:03.032235 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 08:17:03 crc kubenswrapper[5004]: I1201 08:17:03.032263 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 08:17:03 crc kubenswrapper[5004]: I1201 08:17:03.032286 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 08:17:03 crc kubenswrapper[5004]: I1201 08:17:03.032340 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 08:17:03 crc kubenswrapper[5004]: I1201 08:17:03.032447 5004 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 08:17:03 crc kubenswrapper[5004]: I1201 08:17:03.153891 5004 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 08:17:03 crc kubenswrapper[5004]: I1201 08:17:03.155827 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:03 crc kubenswrapper[5004]: I1201 08:17:03.155892 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:03 crc kubenswrapper[5004]: I1201 08:17:03.155910 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:03 crc kubenswrapper[5004]: I1201 08:17:03.155946 5004 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 01 08:17:03 crc kubenswrapper[5004]: E1201 08:17:03.156527 5004 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.75:6443: connect: connection refused" node="crc" Dec 01 08:17:03 crc kubenswrapper[5004]: I1201 08:17:03.188235 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 08:17:03 crc kubenswrapper[5004]: I1201 08:17:03.195743 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 01 08:17:03 crc kubenswrapper[5004]: I1201 08:17:03.218034 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 01 08:17:03 crc kubenswrapper[5004]: W1201 08:17:03.228432 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-25554a28a2b3fe7fc1630d587fed35622db256c5eb372011f66b6242c43c3c7c WatchSource:0}: Error finding container 25554a28a2b3fe7fc1630d587fed35622db256c5eb372011f66b6242c43c3c7c: Status 404 returned error can't find the container with id 25554a28a2b3fe7fc1630d587fed35622db256c5eb372011f66b6242c43c3c7c Dec 01 08:17:03 crc kubenswrapper[5004]: W1201 08:17:03.230084 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-23502cda4071ecc360c745f61d01782c359b5f8e152cfb66b3ad4c33e04b1261 WatchSource:0}: Error finding container 23502cda4071ecc360c745f61d01782c359b5f8e152cfb66b3ad4c33e04b1261: Status 404 returned error can't find the container with id 23502cda4071ecc360c745f61d01782c359b5f8e152cfb66b3ad4c33e04b1261 Dec 01 08:17:03 crc kubenswrapper[5004]: I1201 08:17:03.231314 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Dec 01 08:17:03 crc kubenswrapper[5004]: I1201 08:17:03.235996 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 08:17:03 crc kubenswrapper[5004]: W1201 08:17:03.251121 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-291d18dabba46dea3de5e2142b38a607a3467851178dba565a8d5f79966e7bc7 WatchSource:0}: Error finding container 291d18dabba46dea3de5e2142b38a607a3467851178dba565a8d5f79966e7bc7: Status 404 returned error can't find the container with id 291d18dabba46dea3de5e2142b38a607a3467851178dba565a8d5f79966e7bc7 Dec 01 08:17:03 crc kubenswrapper[5004]: W1201 08:17:03.258177 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-fd24d4cde675646121ffdd0d94ca713188f3139f16bc1fe8ae22c18ef24f3f67 WatchSource:0}: Error finding container fd24d4cde675646121ffdd0d94ca713188f3139f16bc1fe8ae22c18ef24f3f67: Status 404 returned error can't find the container with id fd24d4cde675646121ffdd0d94ca713188f3139f16bc1fe8ae22c18ef24f3f67 Dec 01 08:17:03 crc kubenswrapper[5004]: E1201 08:17:03.294911 5004 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.75:6443: connect: connection refused" interval="800ms" Dec 01 08:17:03 crc kubenswrapper[5004]: I1201 08:17:03.557685 5004 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 08:17:03 crc kubenswrapper[5004]: I1201 08:17:03.559248 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:03 crc kubenswrapper[5004]: I1201 08:17:03.559293 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 
08:17:03 crc kubenswrapper[5004]: I1201 08:17:03.559309 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:03 crc kubenswrapper[5004]: I1201 08:17:03.559345 5004 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 01 08:17:03 crc kubenswrapper[5004]: E1201 08:17:03.559837 5004 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.75:6443: connect: connection refused" node="crc" Dec 01 08:17:03 crc kubenswrapper[5004]: W1201 08:17:03.583434 5004 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.75:6443: connect: connection refused Dec 01 08:17:03 crc kubenswrapper[5004]: E1201 08:17:03.583606 5004 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.75:6443: connect: connection refused" logger="UnhandledError" Dec 01 08:17:03 crc kubenswrapper[5004]: I1201 08:17:03.688690 5004 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.75:6443: connect: connection refused Dec 01 08:17:03 crc kubenswrapper[5004]: I1201 08:17:03.765129 5004 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="2276a185fefa99f4e9fe63b7acce901ce2eb168f71a65f5fa97c1892aebce0e0" exitCode=0 Dec 01 08:17:03 crc kubenswrapper[5004]: I1201 08:17:03.765250 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"2276a185fefa99f4e9fe63b7acce901ce2eb168f71a65f5fa97c1892aebce0e0"} Dec 01 08:17:03 crc kubenswrapper[5004]: I1201 08:17:03.765446 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"291d18dabba46dea3de5e2142b38a607a3467851178dba565a8d5f79966e7bc7"} Dec 01 08:17:03 crc kubenswrapper[5004]: I1201 08:17:03.765665 5004 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 08:17:03 crc kubenswrapper[5004]: I1201 08:17:03.767352 5004 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="488ed5075af94d953143c09a2bac601fe1af57a34197f91965f22a1028ebd2b4" exitCode=0 Dec 01 08:17:03 crc kubenswrapper[5004]: I1201 08:17:03.767384 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"488ed5075af94d953143c09a2bac601fe1af57a34197f91965f22a1028ebd2b4"} Dec 01 08:17:03 crc kubenswrapper[5004]: I1201 08:17:03.767436 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"d64e2d6d517220ad121c130cf7506ada0283a9c795760a379a3d1e81add8ee93"} Dec 01 08:17:03 crc kubenswrapper[5004]: I1201 08:17:03.767547 5004 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 08:17:03 crc kubenswrapper[5004]: I1201 08:17:03.767889 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:03 crc kubenswrapper[5004]: I1201 08:17:03.767930 5004 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:03 crc kubenswrapper[5004]: I1201 08:17:03.767946 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:03 crc kubenswrapper[5004]: I1201 08:17:03.769002 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:03 crc kubenswrapper[5004]: I1201 08:17:03.769061 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:03 crc kubenswrapper[5004]: I1201 08:17:03.769085 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:03 crc kubenswrapper[5004]: I1201 08:17:03.769209 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"9b65b9bc444101900c62fd90c7d45789d186a89148ac0fd9c7f0c6048c3f55ad"} Dec 01 08:17:03 crc kubenswrapper[5004]: I1201 08:17:03.769226 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"23502cda4071ecc360c745f61d01782c359b5f8e152cfb66b3ad4c33e04b1261"} Dec 01 08:17:03 crc kubenswrapper[5004]: I1201 08:17:03.770169 5004 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 08:17:03 crc kubenswrapper[5004]: I1201 08:17:03.770890 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:03 crc kubenswrapper[5004]: I1201 08:17:03.770943 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:03 crc 
kubenswrapper[5004]: I1201 08:17:03.770966 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:03 crc kubenswrapper[5004]: I1201 08:17:03.771511 5004 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="f3930a1c88c6014beb768ed20a37359ade08e77496a86b913f68c6196134f13a" exitCode=0 Dec 01 08:17:03 crc kubenswrapper[5004]: I1201 08:17:03.771554 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"f3930a1c88c6014beb768ed20a37359ade08e77496a86b913f68c6196134f13a"} Dec 01 08:17:03 crc kubenswrapper[5004]: I1201 08:17:03.771587 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"25554a28a2b3fe7fc1630d587fed35622db256c5eb372011f66b6242c43c3c7c"} Dec 01 08:17:03 crc kubenswrapper[5004]: I1201 08:17:03.771641 5004 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 08:17:03 crc kubenswrapper[5004]: I1201 08:17:03.772255 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:03 crc kubenswrapper[5004]: I1201 08:17:03.772307 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:03 crc kubenswrapper[5004]: I1201 08:17:03.772331 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:03 crc kubenswrapper[5004]: I1201 08:17:03.774011 5004 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="00e163cd374befc8b8c6cdfe3916235cd63280474c2a09eeebf108fb75d378a6" exitCode=0 Dec 01 
08:17:03 crc kubenswrapper[5004]: I1201 08:17:03.774059 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"00e163cd374befc8b8c6cdfe3916235cd63280474c2a09eeebf108fb75d378a6"} Dec 01 08:17:03 crc kubenswrapper[5004]: I1201 08:17:03.774094 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"fd24d4cde675646121ffdd0d94ca713188f3139f16bc1fe8ae22c18ef24f3f67"} Dec 01 08:17:03 crc kubenswrapper[5004]: I1201 08:17:03.774262 5004 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 08:17:03 crc kubenswrapper[5004]: I1201 08:17:03.775630 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:03 crc kubenswrapper[5004]: I1201 08:17:03.775691 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:03 crc kubenswrapper[5004]: I1201 08:17:03.775715 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:03 crc kubenswrapper[5004]: W1201 08:17:03.798779 5004 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.75:6443: connect: connection refused Dec 01 08:17:03 crc kubenswrapper[5004]: E1201 08:17:03.798915 5004 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.75:6443: connect: connection refused" 
logger="UnhandledError" Dec 01 08:17:04 crc kubenswrapper[5004]: E1201 08:17:04.097458 5004 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.75:6443: connect: connection refused" interval="1.6s" Dec 01 08:17:04 crc kubenswrapper[5004]: W1201 08:17:04.173726 5004 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.75:6443: connect: connection refused Dec 01 08:17:04 crc kubenswrapper[5004]: E1201 08:17:04.173797 5004 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.75:6443: connect: connection refused" logger="UnhandledError" Dec 01 08:17:04 crc kubenswrapper[5004]: W1201 08:17:04.239706 5004 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.75:6443: connect: connection refused Dec 01 08:17:04 crc kubenswrapper[5004]: E1201 08:17:04.239787 5004 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.75:6443: connect: connection refused" logger="UnhandledError" Dec 01 08:17:04 crc kubenswrapper[5004]: I1201 08:17:04.362003 5004 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 08:17:04 crc kubenswrapper[5004]: 
I1201 08:17:04.364401 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:04 crc kubenswrapper[5004]: I1201 08:17:04.364431 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:04 crc kubenswrapper[5004]: I1201 08:17:04.364441 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:04 crc kubenswrapper[5004]: I1201 08:17:04.364463 5004 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 01 08:17:04 crc kubenswrapper[5004]: I1201 08:17:04.673337 5004 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Dec 01 08:17:04 crc kubenswrapper[5004]: I1201 08:17:04.779068 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"1ca757be16a55ba8df0e9629f7cc2653e2804a5ee5a2151ee3b07d2c30fe5b07"} Dec 01 08:17:04 crc kubenswrapper[5004]: I1201 08:17:04.779114 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"0f873da9c594c885720113b0d0fc01552050d030e802074043e9ede174fd9b16"} Dec 01 08:17:04 crc kubenswrapper[5004]: I1201 08:17:04.779129 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"37438df65f4dab6700f193e84f81d8ed41b3c208c3f6150bd5836c0642190572"} Dec 01 08:17:04 crc kubenswrapper[5004]: I1201 08:17:04.779259 5004 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 08:17:04 crc kubenswrapper[5004]: I1201 08:17:04.780260 5004 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:04 crc kubenswrapper[5004]: I1201 08:17:04.780317 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:04 crc kubenswrapper[5004]: I1201 08:17:04.780335 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:04 crc kubenswrapper[5004]: I1201 08:17:04.784282 5004 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="37376e093170ad4138e102e3a1e2c2b45b3944bf36b0d77dce126bd493d60803" exitCode=0 Dec 01 08:17:04 crc kubenswrapper[5004]: I1201 08:17:04.784378 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"37376e093170ad4138e102e3a1e2c2b45b3944bf36b0d77dce126bd493d60803"} Dec 01 08:17:04 crc kubenswrapper[5004]: I1201 08:17:04.784581 5004 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 08:17:04 crc kubenswrapper[5004]: I1201 08:17:04.786422 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:04 crc kubenswrapper[5004]: I1201 08:17:04.786454 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:04 crc kubenswrapper[5004]: I1201 08:17:04.786466 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:04 crc kubenswrapper[5004]: I1201 08:17:04.790358 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"2caeb2ebaf68110cb8728e60aae6db1b1fc48efae6018f66070766a4f42caa69"} Dec 01 
08:17:04 crc kubenswrapper[5004]: I1201 08:17:04.790392 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"d69a370af9f8f6c575f99da6bb01cee0f660dccb7e85f16f5d8874d0e263e1fd"} Dec 01 08:17:04 crc kubenswrapper[5004]: I1201 08:17:04.790402 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"c82baee0fd931fd7b41cd5a73ca7e653ef13dfa3d573991cbe3238d6b95f6fac"} Dec 01 08:17:04 crc kubenswrapper[5004]: I1201 08:17:04.790412 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"45a0d7a12e7c114de1740a151e04f7f39844b709740cab958f156ac918af3228"} Dec 01 08:17:04 crc kubenswrapper[5004]: I1201 08:17:04.792813 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"8561c15ad57f27642507b1cf97c865989aea032f1d31998d18efb066baa9c283"} Dec 01 08:17:04 crc kubenswrapper[5004]: I1201 08:17:04.792932 5004 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 08:17:04 crc kubenswrapper[5004]: I1201 08:17:04.794007 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:04 crc kubenswrapper[5004]: I1201 08:17:04.794050 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:04 crc kubenswrapper[5004]: I1201 08:17:04.794067 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:04 crc kubenswrapper[5004]: I1201 
08:17:04.796099 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"4563974ed1467a87aab104756828e63853c282a61c4fcab1e757eb4da9afaa40"} Dec 01 08:17:04 crc kubenswrapper[5004]: I1201 08:17:04.796138 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"edbd3e682d207b8c18d6bcbc26aa2adae2a35645502d31b327ffddd51991e842"} Dec 01 08:17:04 crc kubenswrapper[5004]: I1201 08:17:04.796158 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"b520771cf1ad6f9360064c5f304243c5878a2f1025c87d1001c97b2d5f95cb9a"} Dec 01 08:17:04 crc kubenswrapper[5004]: I1201 08:17:04.796167 5004 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 08:17:04 crc kubenswrapper[5004]: I1201 08:17:04.803664 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:04 crc kubenswrapper[5004]: I1201 08:17:04.803693 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:04 crc kubenswrapper[5004]: I1201 08:17:04.803700 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:05 crc kubenswrapper[5004]: I1201 08:17:05.802940 5004 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="e4db8d673550f14f3614e47dc0bf81e0920753be13fb01c7ef0d6bd3716611f3" exitCode=0 Dec 01 08:17:05 crc kubenswrapper[5004]: I1201 08:17:05.803031 5004 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"e4db8d673550f14f3614e47dc0bf81e0920753be13fb01c7ef0d6bd3716611f3"} Dec 01 08:17:05 crc kubenswrapper[5004]: I1201 08:17:05.803742 5004 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 08:17:05 crc kubenswrapper[5004]: I1201 08:17:05.804816 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:05 crc kubenswrapper[5004]: I1201 08:17:05.804860 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:05 crc kubenswrapper[5004]: I1201 08:17:05.804877 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:05 crc kubenswrapper[5004]: I1201 08:17:05.810463 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"ad581564fc519328ae429bce757797f6d87d8648e862b008637a245866ec4cbe"} Dec 01 08:17:05 crc kubenswrapper[5004]: I1201 08:17:05.810486 5004 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 08:17:05 crc kubenswrapper[5004]: I1201 08:17:05.810530 5004 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 08:17:05 crc kubenswrapper[5004]: I1201 08:17:05.812218 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:05 crc kubenswrapper[5004]: I1201 08:17:05.812258 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:05 crc kubenswrapper[5004]: I1201 08:17:05.812274 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Dec 01 08:17:05 crc kubenswrapper[5004]: I1201 08:17:05.813695 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:05 crc kubenswrapper[5004]: I1201 08:17:05.813739 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:05 crc kubenswrapper[5004]: I1201 08:17:05.813755 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:06 crc kubenswrapper[5004]: I1201 08:17:06.177594 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 08:17:06 crc kubenswrapper[5004]: I1201 08:17:06.817540 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"d610c3a799d4a7df9ddfe2a28f2ced2e5ac4e96c67c93f8e6d3a0eee60d9ac48"} Dec 01 08:17:06 crc kubenswrapper[5004]: I1201 08:17:06.817648 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"2b358f641ff09bc8f4a452b0d4a8cf5f9790182f76779be31426d48d0278b0fe"} Dec 01 08:17:06 crc kubenswrapper[5004]: I1201 08:17:06.817674 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"3c24fbeedbc7c40b6079b7ad89d8564a4f2cdaad2cf996e66682b082382da190"} Dec 01 08:17:06 crc kubenswrapper[5004]: I1201 08:17:06.817702 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 08:17:06 crc kubenswrapper[5004]: I1201 08:17:06.817589 5004 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 08:17:06 crc 
kubenswrapper[5004]: I1201 08:17:06.818832 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:06 crc kubenswrapper[5004]: I1201 08:17:06.818877 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:06 crc kubenswrapper[5004]: I1201 08:17:06.818892 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:07 crc kubenswrapper[5004]: I1201 08:17:07.010307 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 08:17:07 crc kubenswrapper[5004]: I1201 08:17:07.010545 5004 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 08:17:07 crc kubenswrapper[5004]: I1201 08:17:07.013344 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:07 crc kubenswrapper[5004]: I1201 08:17:07.013424 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:07 crc kubenswrapper[5004]: I1201 08:17:07.013449 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:07 crc kubenswrapper[5004]: I1201 08:17:07.826946 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"38e3cc2eca559417c71beed561569c4dc5b45ffb8407976e0695fc1b93219d37"} Dec 01 08:17:07 crc kubenswrapper[5004]: I1201 08:17:07.827015 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"43f9f72af3273da055eeb1e13ad6258a1b3b6b205402861beee9f87f02230297"} Dec 01 08:17:07 crc 
kubenswrapper[5004]: I1201 08:17:07.827030 5004 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 08:17:07 crc kubenswrapper[5004]: I1201 08:17:07.827106 5004 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 08:17:07 crc kubenswrapper[5004]: I1201 08:17:07.828277 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:07 crc kubenswrapper[5004]: I1201 08:17:07.828315 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:07 crc kubenswrapper[5004]: I1201 08:17:07.828332 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:07 crc kubenswrapper[5004]: I1201 08:17:07.828408 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:07 crc kubenswrapper[5004]: I1201 08:17:07.828456 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:07 crc kubenswrapper[5004]: I1201 08:17:07.828472 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:08 crc kubenswrapper[5004]: I1201 08:17:08.526227 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 01 08:17:08 crc kubenswrapper[5004]: I1201 08:17:08.526439 5004 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 08:17:08 crc kubenswrapper[5004]: I1201 08:17:08.527993 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:08 crc kubenswrapper[5004]: I1201 08:17:08.528052 5004 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:08 crc kubenswrapper[5004]: I1201 08:17:08.528070 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:08 crc kubenswrapper[5004]: I1201 08:17:08.829463 5004 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 08:17:08 crc kubenswrapper[5004]: I1201 08:17:08.830771 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:08 crc kubenswrapper[5004]: I1201 08:17:08.830832 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:08 crc kubenswrapper[5004]: I1201 08:17:08.830852 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:10 crc kubenswrapper[5004]: I1201 08:17:10.525171 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 08:17:10 crc kubenswrapper[5004]: I1201 08:17:10.525542 5004 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 08:17:10 crc kubenswrapper[5004]: I1201 08:17:10.526903 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:10 crc kubenswrapper[5004]: I1201 08:17:10.526956 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:10 crc kubenswrapper[5004]: I1201 08:17:10.526974 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:10 crc kubenswrapper[5004]: I1201 08:17:10.963506 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 08:17:10 crc 
kubenswrapper[5004]: I1201 08:17:10.963825 5004 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 08:17:10 crc kubenswrapper[5004]: I1201 08:17:10.965830 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:10 crc kubenswrapper[5004]: I1201 08:17:10.965891 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:10 crc kubenswrapper[5004]: I1201 08:17:10.965911 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:10 crc kubenswrapper[5004]: I1201 08:17:10.970904 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 08:17:11 crc kubenswrapper[5004]: I1201 08:17:11.352984 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 08:17:11 crc kubenswrapper[5004]: I1201 08:17:11.734636 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Dec 01 08:17:11 crc kubenswrapper[5004]: I1201 08:17:11.734873 5004 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 08:17:11 crc kubenswrapper[5004]: I1201 08:17:11.736194 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:11 crc kubenswrapper[5004]: I1201 08:17:11.736231 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:11 crc kubenswrapper[5004]: I1201 08:17:11.736241 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:11 crc kubenswrapper[5004]: I1201 08:17:11.766119 5004 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Dec 01 08:17:11 crc kubenswrapper[5004]: I1201 08:17:11.837035 5004 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 08:17:11 crc kubenswrapper[5004]: I1201 08:17:11.837387 5004 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 08:17:11 crc kubenswrapper[5004]: I1201 08:17:11.838698 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:11 crc kubenswrapper[5004]: I1201 08:17:11.838752 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:11 crc kubenswrapper[5004]: I1201 08:17:11.838770 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:11 crc kubenswrapper[5004]: I1201 08:17:11.839046 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:11 crc kubenswrapper[5004]: I1201 08:17:11.839495 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:11 crc kubenswrapper[5004]: I1201 08:17:11.839699 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:12 crc kubenswrapper[5004]: I1201 08:17:12.839474 5004 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 08:17:12 crc kubenswrapper[5004]: I1201 08:17:12.841257 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:12 crc kubenswrapper[5004]: I1201 08:17:12.841305 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:12 crc 
kubenswrapper[5004]: I1201 08:17:12.841319 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:12 crc kubenswrapper[5004]: I1201 08:17:12.842756 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 08:17:12 crc kubenswrapper[5004]: E1201 08:17:12.863544 5004 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Dec 01 08:17:13 crc kubenswrapper[5004]: I1201 08:17:13.840984 5004 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 08:17:13 crc kubenswrapper[5004]: I1201 08:17:13.844810 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:13 crc kubenswrapper[5004]: I1201 08:17:13.844850 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:13 crc kubenswrapper[5004]: I1201 08:17:13.844865 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:13 crc kubenswrapper[5004]: I1201 08:17:13.846113 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 08:17:14 crc kubenswrapper[5004]: E1201 08:17:14.365339 5004 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": net/http: TLS handshake timeout" node="crc" Dec 01 08:17:14 crc kubenswrapper[5004]: E1201 08:17:14.674745 5004 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post 
\"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": net/http: TLS handshake timeout" logger="UnhandledError" Dec 01 08:17:14 crc kubenswrapper[5004]: I1201 08:17:14.689148 5004 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Dec 01 08:17:14 crc kubenswrapper[5004]: I1201 08:17:14.843723 5004 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 08:17:14 crc kubenswrapper[5004]: I1201 08:17:14.845456 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:14 crc kubenswrapper[5004]: I1201 08:17:14.845514 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:14 crc kubenswrapper[5004]: I1201 08:17:14.845527 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:15 crc kubenswrapper[5004]: I1201 08:17:15.152999 5004 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Dec 01 08:17:15 crc kubenswrapper[5004]: I1201 08:17:15.153103 5004 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Dec 01 08:17:15 crc kubenswrapper[5004]: E1201 08:17:15.698682 5004 controller.go:145] "Failed to ensure lease exists, will retry" 
err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" interval="3.2s" Dec 01 08:17:15 crc kubenswrapper[5004]: I1201 08:17:15.788452 5004 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Dec 01 08:17:15 crc kubenswrapper[5004]: I1201 08:17:15.790135 5004 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Dec 01 08:17:15 crc kubenswrapper[5004]: I1201 08:17:15.803209 5004 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Dec 01 08:17:15 crc kubenswrapper[5004]: I1201 08:17:15.803317 5004 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Dec 01 08:17:15 crc kubenswrapper[5004]: I1201 08:17:15.843233 5004 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller 
namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 01 08:17:15 crc kubenswrapper[5004]: I1201 08:17:15.843367 5004 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 01 08:17:15 crc kubenswrapper[5004]: I1201 08:17:15.966165 5004 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 08:17:15 crc kubenswrapper[5004]: I1201 08:17:15.967638 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:15 crc kubenswrapper[5004]: I1201 08:17:15.967676 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:15 crc kubenswrapper[5004]: I1201 08:17:15.967689 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:15 crc kubenswrapper[5004]: I1201 08:17:15.967715 5004 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 01 08:17:18 crc kubenswrapper[5004]: I1201 08:17:18.709794 5004 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Dec 01 08:17:18 crc kubenswrapper[5004]: I1201 08:17:18.721805 5004 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Dec 01 08:17:19 crc kubenswrapper[5004]: I1201 08:17:19.438682 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 08:17:19 crc kubenswrapper[5004]: I1201 08:17:19.438820 5004 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 08:17:19 crc kubenswrapper[5004]: I1201 08:17:19.443595 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:19 crc kubenswrapper[5004]: I1201 08:17:19.443625 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:19 crc kubenswrapper[5004]: I1201 08:17:19.443634 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:19 crc kubenswrapper[5004]: I1201 08:17:19.445455 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 08:17:19 crc kubenswrapper[5004]: I1201 08:17:19.858785 5004 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 08:17:19 crc kubenswrapper[5004]: I1201 08:17:19.860081 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:19 crc kubenswrapper[5004]: I1201 08:17:19.860290 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:19 crc kubenswrapper[5004]: I1201 08:17:19.860325 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:20 crc kubenswrapper[5004]: I1201 08:17:20.704456 5004 csr.go:261] certificate signing request csr-l2zk9 is approved, waiting to be issued Dec 01 08:17:20 crc kubenswrapper[5004]: I1201 08:17:20.725274 5004 csr.go:257] certificate signing request csr-l2zk9 is issued Dec 01 08:17:20 crc kubenswrapper[5004]: I1201 08:17:20.796517 5004 trace.go:236] Trace[935136012]: "Reflector 
ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (01-Dec-2025 08:17:07.190) (total time: 13606ms): Dec 01 08:17:20 crc kubenswrapper[5004]: Trace[935136012]: ---"Objects listed" error: 13606ms (08:17:20.796) Dec 01 08:17:20 crc kubenswrapper[5004]: Trace[935136012]: [13.606349186s] [13.606349186s] END Dec 01 08:17:20 crc kubenswrapper[5004]: I1201 08:17:20.796547 5004 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 01 08:17:20 crc kubenswrapper[5004]: I1201 08:17:20.796746 5004 trace.go:236] Trace[1835603793]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (01-Dec-2025 08:17:07.144) (total time: 13651ms): Dec 01 08:17:20 crc kubenswrapper[5004]: Trace[1835603793]: ---"Objects listed" error: 13651ms (08:17:20.796) Dec 01 08:17:20 crc kubenswrapper[5004]: Trace[1835603793]: [13.651905916s] [13.651905916s] END Dec 01 08:17:20 crc kubenswrapper[5004]: I1201 08:17:20.796757 5004 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 01 08:17:20 crc kubenswrapper[5004]: I1201 08:17:20.797052 5004 trace.go:236] Trace[18248839]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (01-Dec-2025 08:17:06.248) (total time: 14548ms): Dec 01 08:17:20 crc kubenswrapper[5004]: Trace[18248839]: ---"Objects listed" error: 14548ms (08:17:20.797) Dec 01 08:17:20 crc kubenswrapper[5004]: Trace[18248839]: [14.548539903s] [14.548539903s] END Dec 01 08:17:20 crc kubenswrapper[5004]: I1201 08:17:20.797064 5004 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 01 08:17:20 crc kubenswrapper[5004]: I1201 08:17:20.803693 5004 trace.go:236] Trace[668175155]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (01-Dec-2025 08:17:05.997) (total time: 14805ms): Dec 01 08:17:20 crc kubenswrapper[5004]: Trace[668175155]: ---"Objects listed" error: 14805ms (08:17:20.803) 
Dec 01 08:17:20 crc kubenswrapper[5004]: Trace[668175155]: [14.805768601s] [14.805768601s] END Dec 01 08:17:20 crc kubenswrapper[5004]: I1201 08:17:20.803740 5004 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 01 08:17:20 crc kubenswrapper[5004]: I1201 08:17:20.804275 5004 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Dec 01 08:17:20 crc kubenswrapper[5004]: I1201 08:17:20.876190 5004 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:59050->192.168.126.11:17697: read: connection reset by peer" start-of-body= Dec 01 08:17:20 crc kubenswrapper[5004]: I1201 08:17:20.876247 5004 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:59050->192.168.126.11:17697: read: connection reset by peer" Dec 01 08:17:20 crc kubenswrapper[5004]: I1201 08:17:20.876209 5004 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:59054->192.168.126.11:17697: read: connection reset by peer" start-of-body= Dec 01 08:17:20 crc kubenswrapper[5004]: I1201 08:17:20.876304 5004 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:59054->192.168.126.11:17697: read: connection reset by 
peer" Dec 01 08:17:20 crc kubenswrapper[5004]: I1201 08:17:20.876548 5004 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Dec 01 08:17:20 crc kubenswrapper[5004]: I1201 08:17:20.876590 5004 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Dec 01 08:17:20 crc kubenswrapper[5004]: E1201 08:17:20.972483 5004 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.692703 5004 apiserver.go:52] "Watching apiserver" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.695440 5004 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.695725 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf"] Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.696123 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.696182 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.696127 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.696313 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 01 08:17:21 crc kubenswrapper[5004]: E1201 08:17:21.696364 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.696303 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.696327 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 01 08:17:21 crc kubenswrapper[5004]: E1201 08:17:21.696495 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 08:17:21 crc kubenswrapper[5004]: E1201 08:17:21.696612 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.698219 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.698391 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.699307 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.699347 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.699439 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.700127 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.700714 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.701683 5004 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.702346 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.726500 5004 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-12-01 08:12:20 +0000 UTC, rotation deadline is 2026-09-01 19:30:18.822029586 +0000 UTC Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.726529 5004 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6587h12m57.095502545s for next certificate rotation Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.742309 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.767496 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.775964 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.794208 5004 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.797356 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.802796 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.811274 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.811321 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.811346 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.811368 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.811390 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.811414 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: 
\"6731426b-95fe-49ff-bb5f-40441049fde2\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.811435 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.811459 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.811480 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.811500 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.811524 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.811546 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.811587 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.811609 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.811635 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.811660 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.811683 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: 
\"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.811704 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.811725 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.811731 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.811748 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.811771 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.811793 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.811817 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.811838 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.811859 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: 
\"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.811911 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.811933 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.811957 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.811985 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.812009 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 
08:17:21.812054 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.812076 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.812102 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.812125 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.812146 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.812174 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" 
(UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.812197 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.812224 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.812246 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.812269 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.812293 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.812315 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.812340 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.812361 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.812387 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.812412 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.812435 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 01 
08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.812461 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.812511 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.812538 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.812580 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.812607 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.812675 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.812696 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.812717 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.812738 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.812760 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.812780 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 01 08:17:21 crc 
kubenswrapper[5004]: I1201 08:17:21.812801 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.812821 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.812847 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.812869 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.812889 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.812910 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.812934 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.812959 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.813011 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.813036 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.813060 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: 
\"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.813086 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.813108 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.813130 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.813151 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.813172 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.813193 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.813214 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.813247 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.813269 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.813290 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.813340 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: 
\"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.813361 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.813382 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.813403 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.813427 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.813447 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.813469 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.813496 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.813521 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.813544 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.813585 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.813609 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Dec 01 
08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.813628 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.813650 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.813670 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.813712 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.813733 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.813757 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: 
\"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.813780 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.813803 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.813827 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.813848 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.813870 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: 
\"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.813891 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.813911 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.813933 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.813956 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.813976 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.813997 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.814018 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.814040 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.814062 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.814082 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.814104 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: 
\"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.814125 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.814148 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.814178 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.814201 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.814223 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.814245 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: 
\"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.814270 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.814291 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.814314 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.814335 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.814357 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 01 08:17:21 crc 
kubenswrapper[5004]: I1201 08:17:21.814381 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.814402 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.814422 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.814444 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.814468 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.814492 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: 
\"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.814513 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.814535 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.814556 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.814598 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.814619 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 01 08:17:21 crc 
kubenswrapper[5004]: I1201 08:17:21.814638 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.814658 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.814685 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.814704 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.814724 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.814750 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.814769 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.814790 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.814811 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.814831 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.814850 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.814870 5004 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.814949 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.814974 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.814996 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.815019 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.815041 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: 
\"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.815063 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.815084 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.815106 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.815128 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.815151 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 
01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.815172 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.815195 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.815217 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.815239 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.815266 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.815287 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.815309 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.815330 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.815352 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.815373 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.815396 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 08:17:21 crc 
kubenswrapper[5004]: I1201 08:17:21.815421 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.815442 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.815463 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.815485 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.815510 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.815533 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod 
\"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.815555 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.815596 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.815619 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.815640 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.815662 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.815685 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.815721 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.815742 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.815763 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.815784 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.815804 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: 
\"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.815826 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.815848 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.815870 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.815895 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.815915 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.815937 5004 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.815961 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.815984 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.816007 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.816057 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.816108 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod 
\"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.816138 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.816164 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.816209 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.816231 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.816255 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: 
\"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.816277 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.816300 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.816328 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.816373 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.816396 5004 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.816426 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.816451 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.816477 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.816536 5004 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.811956 5004 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.812140 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.812380 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.812397 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.812437 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.812595 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.812651 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.812822 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.812873 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.812938 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.813022 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.813074 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.813171 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.813207 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.813264 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.813881 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.814102 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.814726 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.816009 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.816756 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.817105 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.817458 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.817735 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.818134 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.818186 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.818381 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.818773 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.819311 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.819732 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.820156 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.825661 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.826046 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.826322 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.826685 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.827220 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.827257 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.827687 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.828246 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.827885 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.828352 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.828298 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.828684 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.828710 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.829098 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.829554 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.829544 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.830202 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.832248 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.832705 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.832723 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.833135 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.833495 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.833919 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.835659 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.835896 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.836135 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.836348 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.836494 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.836625 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.836858 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.836903 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.837179 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.837429 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.837901 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.838282 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.839020 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.844232 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.844814 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.845173 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.845498 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.846954 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.847145 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.847151 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.847329 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.847504 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.852046 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.852393 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.853269 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.855502 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.857306 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.857377 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.857354 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.857691 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.857734 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.857798 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.845620 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.858007 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.858012 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.858044 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.858077 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.858284 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.858682 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.858850 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.858990 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.859243 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.859385 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.861285 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.861626 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.861993 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.862752 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: E1201 08:17:21.862870 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 08:17:22.362850825 +0000 UTC m=+19.927842807 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.863044 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.863222 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.863612 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.853288 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.864267 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.864398 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.864637 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.864883 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.864981 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.865057 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.865062 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.865543 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.865975 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.865991 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.873304 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.875302 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.875638 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.875929 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.876253 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.876725 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.877059 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.877369 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.877437 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.877971 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.878295 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.878348 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.878484 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.878526 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.878744 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.878834 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.879037 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.879139 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.879159 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.879336 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.879658 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.879844 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.875025 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.879957 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.879978 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.879977 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.879998 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.878249 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.880123 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.880272 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.880400 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.880518 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.881002 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.881730 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.882095 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.882222 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.882232 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: E1201 08:17:21.882338 5004 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 08:17:21 crc kubenswrapper[5004]: E1201 08:17:21.882356 5004 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 08:17:21 crc kubenswrapper[5004]: E1201 08:17:21.882370 5004 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.882463 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.882788 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.883671 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.882467 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.884403 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: E1201 08:17:21.884773 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-01 08:17:22.382414159 +0000 UTC m=+19.947406141 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.880366 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.884916 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.884928 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.882448 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.885060 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.885275 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.885473 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.885742 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.885751 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.885932 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.886098 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.886384 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.886528 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.887458 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.887597 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.888112 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.888272 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.888387 5004 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.888552 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.889246 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). 
InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.889483 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.889601 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.890180 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.890529 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.890972 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.891011 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: E1201 08:17:21.891188 5004 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 08:17:21 crc kubenswrapper[5004]: E1201 08:17:21.891239 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 08:17:22.391226802 +0000 UTC m=+19.956218784 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 08:17:21 crc kubenswrapper[5004]: E1201 08:17:21.891929 5004 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 08:17:21 crc kubenswrapper[5004]: E1201 08:17:21.891969 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 08:17:22.391959241 +0000 UTC m=+19.956951223 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.892142 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.882975 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod 
"7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.892403 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.892422 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.892428 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.894849 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.895301 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod 
"b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.895378 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.895988 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.901568 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: E1201 08:17:21.901953 5004 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 08:17:21 crc kubenswrapper[5004]: E1201 08:17:21.901977 5004 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 08:17:21 crc kubenswrapper[5004]: E1201 08:17:21.901989 5004 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 08:17:21 crc kubenswrapper[5004]: E1201 08:17:21.902038 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-01 08:17:22.402024175 +0000 UTC m=+19.967016157 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.904719 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.905683 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.905701 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.905769 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.906104 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.905317 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.906884 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.908277 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.910138 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.910964 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.911725 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.912473 5004 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="ad581564fc519328ae429bce757797f6d87d8648e862b008637a245866ec4cbe" exitCode=255 Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.912940 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"ad581564fc519328ae429bce757797f6d87d8648e862b008637a245866ec4cbe"} Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.913185 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.916942 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.916988 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.917036 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.917046 5004 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.917056 5004 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.917066 5004 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 01 
08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.917078 5004 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.917089 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.917119 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.917155 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.917163 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.917175 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 
01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.917249 5004 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.917263 5004 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.917267 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.917273 5004 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.917312 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.917327 5004 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.917339 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: 
\"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.917355 5004 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.917363 5004 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.917371 5004 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.917380 5004 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.917388 5004 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.917396 5004 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.917404 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 
08:17:21.917413 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.917421 5004 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.917430 5004 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.917439 5004 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.917447 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.917457 5004 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.917466 5004 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.917475 5004 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: 
\"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.917483 5004 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.917492 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.917502 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.917510 5004 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.917518 5004 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.917526 5004 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.917535 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 
01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.917543 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.917551 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.917579 5004 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.917611 5004 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.917620 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.917631 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.917642 5004 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.917652 
5004 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.917660 5004 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.917668 5004 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.917676 5004 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.917685 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.917692 5004 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.917700 5004 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.917710 5004 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.917719 5004 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.917727 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.917736 5004 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.917744 5004 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.917753 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.917761 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.917770 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: 
\"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.917779 5004 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.917787 5004 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.917795 5004 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.917803 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.917814 5004 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.917823 5004 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.917831 5004 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: 
\"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.917838 5004 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.917846 5004 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.917874 5004 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.917893 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.917902 5004 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.917910 5004 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.917918 5004 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc 
kubenswrapper[5004]: I1201 08:17:21.917926 5004 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.917934 5004 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.917941 5004 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.917949 5004 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.917958 5004 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.917966 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.917975 5004 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.917983 5004 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.917991 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.917999 5004 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.918017 5004 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.918025 5004 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.918033 5004 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.918041 5004 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.918066 5004 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.918074 5004 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.918081 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.918089 5004 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.918099 5004 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.918106 5004 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.918115 5004 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.918123 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: 
\"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.918130 5004 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.918138 5004 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.918146 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.918154 5004 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.918161 5004 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.918169 5004 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.918177 5004 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.918184 5004 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.918197 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.918205 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.918212 5004 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.918220 5004 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.918228 5004 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.918236 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: 
\"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.918244 5004 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.918252 5004 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.918260 5004 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.918276 5004 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.918291 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.918298 5004 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.918307 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: 
\"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.918314 5004 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.918322 5004 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.918330 5004 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.918338 5004 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.918353 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.918361 5004 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.918368 5004 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc 
kubenswrapper[5004]: I1201 08:17:21.918376 5004 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.918384 5004 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.918392 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.918400 5004 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.918408 5004 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.918420 5004 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.918427 5004 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.918435 5004 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.918443 5004 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.918451 5004 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.918458 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.918466 5004 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.918474 5004 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.918485 5004 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.918493 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" 
DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.918502 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.918510 5004 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.918521 5004 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.918530 5004 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.918538 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.918546 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.918555 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.918747 
5004 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.918756 5004 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.918765 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.918774 5004 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.918781 5004 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.918789 5004 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.918798 5004 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.918806 5004 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: 
\"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.918814 5004 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.918822 5004 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.918831 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.918839 5004 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.918863 5004 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.918871 5004 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.918888 5004 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.918896 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.918904 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.918912 5004 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.918919 5004 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.918927 5004 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.918936 5004 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.918943 5004 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on 
node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.918951 5004 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.918959 5004 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.918985 5004 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.918992 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.919009 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.919016 5004 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.919033 5004 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: 
I1201 08:17:21.919040 5004 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.919048 5004 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.919058 5004 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.919068 5004 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.919079 5004 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.919090 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.919101 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.919111 5004 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.919121 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.919132 5004 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.919142 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.919153 5004 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.919162 5004 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.919169 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.919180 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: 
\"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.919189 5004 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.919199 5004 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.919210 5004 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.919221 5004 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.919232 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.919062 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.925090 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.925388 5004 scope.go:117] "RemoveContainer" containerID="ad581564fc519328ae429bce757797f6d87d8648e862b008637a245866ec4cbe" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.926614 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.942238 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.949496 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.962592 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.973664 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.985784 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 08:17:21 crc kubenswrapper[5004]: I1201 08:17:21.998792 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 08:17:22 crc kubenswrapper[5004]: I1201 08:17:22.015143 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 01 08:17:22 crc kubenswrapper[5004]: I1201 08:17:22.016184 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43c9f762-d403-49b9-8294-d66976aca4dd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b358f641ff09bc8f4a452b0d4a8cf5f9790182f76779be31426d48d0278b0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-c
erts\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d610c3a799d4a7df9ddfe2a28f2ced2e5ac4e96c67c93f8e6d3a0eee60d9ac48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43f9f72af3273da055eeb1e13ad6258a1b3b6b205402861beee9f87f02230297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38e3cc2eca559417c71beed561569c4dc5b45ffb8407976e0695fc1b93219d37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"
,\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c24fbeedbc7c40b6079b7ad89d8564a4f2cdaad2cf996e66682b082382da190\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00e163cd374befc8b8c6cdfe3916235cd63280474c2a09eeebf108fb75d378a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00e163cd374befc8b8c6cdfe3916235cd63280474c2a09eeebf108fb75d378a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37376e093170ad4138e102e3a1e2c2b45b3944bf36b0d77dce126bd493d60803\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37376e093170ad4138e102e3a1e2c2b45b3944bf36b0d77dce126bd493d60803\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e4db8d673550f14f3614e47dc0bf81e0920753be13fb01c7ef0d6bd3716611f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4db8d673550f14f3614e47dc0bf81e0920753be13fb01c7ef0d6bd3716611f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:05Z\\\",\
\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 08:17:22 crc kubenswrapper[5004]: I1201 08:17:22.020362 5004 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 01 08:17:22 crc kubenswrapper[5004]: W1201 08:17:22.028482 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-747671fe19b1b5f27d25667844b7f01cb87f1bedf8d181fca54f82470dd22fa3 WatchSource:0}: Error finding container 747671fe19b1b5f27d25667844b7f01cb87f1bedf8d181fca54f82470dd22fa3: Status 404 returned error can't find the container with id 747671fe19b1b5f27d25667844b7f01cb87f1bedf8d181fca54f82470dd22fa3 Dec 01 08:17:22 crc kubenswrapper[5004]: I1201 08:17:22.030238 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0ed2b6d-0c61-4639-bc3b-1c8effc4815d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45a0d7a12e7c114de1740a151e04f7f39844b709740cab958f156ac918af3228\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d69a370af9f8f6c575f99da6bb01cee0f660dccb7e85f16f5d8874d0e263e1fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c82baee0fd931fd7b41cd5a73ca7e653ef13dfa3d573991cbe3238d6b95f6fac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad581564fc519328ae429bce757797f6d87d8648e862b008637a245866ec4cbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad581564fc519328ae429bce757797f6d87d8648e862b008637a245866ec4cbe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T08:17:20Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 08:17:15.185125 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 08:17:15.186793 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4071633747/tls.crt::/tmp/serving-cert-4071633747/tls.key\\\\\\\"\\\\nI1201 08:17:20.827614 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 08:17:20.834528 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 08:17:20.834549 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 08:17:20.837611 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 08:17:20.837630 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 08:17:20.848940 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 08:17:20.848957 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 08:17:20.848961 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 08:17:20.848965 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 08:17:20.848968 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 08:17:20.848970 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 08:17:20.848973 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 08:17:20.849115 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 08:17:20.854990 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2caeb2ebaf68110cb8728e60aae6db1b1fc48efae6018f66070766a4f42caa69\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2276a185fefa99f4e9fe63b7acce901ce2eb168f71a65f5fa97c1892aebce0e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"la
stState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2276a185fefa99f4e9fe63b7acce901ce2eb168f71a65f5fa97c1892aebce0e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 08:17:22 crc kubenswrapper[5004]: I1201 08:17:22.034529 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 01 08:17:22 crc kubenswrapper[5004]: I1201 08:17:22.040670 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 08:17:22 crc kubenswrapper[5004]: I1201 08:17:22.053083 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 01 08:17:22 crc kubenswrapper[5004]: W1201 08:17:22.077737 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-55e695ffcda1d11c647076925626669664cbcee522f7b5468ef9ca2255f82b70 WatchSource:0}: Error finding container 55e695ffcda1d11c647076925626669664cbcee522f7b5468ef9ca2255f82b70: Status 404 returned error can't find the container with id 55e695ffcda1d11c647076925626669664cbcee522f7b5468ef9ca2255f82b70 Dec 01 08:17:22 crc kubenswrapper[5004]: I1201 08:17:22.423536 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 08:17:22 crc kubenswrapper[5004]: E1201 08:17:22.423658 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 08:17:23.423638724 +0000 UTC m=+20.988630726 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 08:17:22 crc kubenswrapper[5004]: I1201 08:17:22.423758 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 08:17:22 crc kubenswrapper[5004]: I1201 08:17:22.423792 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 08:17:22 crc kubenswrapper[5004]: I1201 08:17:22.423836 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 08:17:22 crc kubenswrapper[5004]: I1201 08:17:22.423863 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 08:17:22 crc kubenswrapper[5004]: E1201 08:17:22.423971 5004 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 08:17:22 crc kubenswrapper[5004]: E1201 08:17:22.423988 5004 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 08:17:22 crc kubenswrapper[5004]: E1201 08:17:22.424001 5004 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 08:17:22 crc kubenswrapper[5004]: E1201 08:17:22.424036 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-01 08:17:23.424026095 +0000 UTC m=+20.989018077 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 08:17:22 crc kubenswrapper[5004]: E1201 08:17:22.424079 5004 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 08:17:22 crc kubenswrapper[5004]: E1201 08:17:22.424109 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 08:17:23.424100097 +0000 UTC m=+20.989092079 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 08:17:22 crc kubenswrapper[5004]: E1201 08:17:22.424161 5004 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 08:17:22 crc kubenswrapper[5004]: E1201 08:17:22.424171 5004 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 08:17:22 crc kubenswrapper[5004]: E1201 08:17:22.424179 5004 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 08:17:22 crc kubenswrapper[5004]: E1201 08:17:22.424202 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-01 08:17:23.424195249 +0000 UTC m=+20.989187231 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 08:17:22 crc kubenswrapper[5004]: E1201 08:17:22.424250 5004 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 08:17:22 crc kubenswrapper[5004]: E1201 08:17:22.424273 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 08:17:23.424266351 +0000 UTC m=+20.989258333 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 08:17:22 crc kubenswrapper[5004]: I1201 08:17:22.563977 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-jjms6"] Dec 01 08:17:22 crc kubenswrapper[5004]: I1201 08:17:22.564607 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-jjms6" Dec 01 08:17:22 crc kubenswrapper[5004]: I1201 08:17:22.566922 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 01 08:17:22 crc kubenswrapper[5004]: I1201 08:17:22.567271 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 01 08:17:22 crc kubenswrapper[5004]: I1201 08:17:22.567437 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 01 08:17:22 crc kubenswrapper[5004]: I1201 08:17:22.591812 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:22Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:22 crc kubenswrapper[5004]: I1201 08:17:22.602421 5004 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Dec 01 08:17:22 crc kubenswrapper[5004]: W1201 08:17:22.602985 5004 reflector.go:484] k8s.io/client-go/informers/factory.go:160: watch of *v1.Service ended with: an error on the server ("unable to decode an event from the watch stream: http2: client connection force closed via ClientConn.Close") has prevented the request from succeeding Dec 01 08:17:22 crc kubenswrapper[5004]: W1201 08:17:22.603010 5004 reflector.go:484] k8s.io/client-go/informers/factory.go:160: watch of *v1.CSIDriver ended with: an error on the server ("unable to decode an event from the watch stream: http2: client connection force closed via ClientConn.Close") has prevented the 
request from succeeding Dec 01 08:17:22 crc kubenswrapper[5004]: W1201 08:17:22.603026 5004 reflector.go:484] object-"openshift-network-node-identity"/"env-overrides": watch of *v1.ConfigMap ended with: an error on the server ("unable to decode an event from the watch stream: http2: client connection force closed via ClientConn.Close") has prevented the request from succeeding Dec 01 08:17:22 crc kubenswrapper[5004]: W1201 08:17:22.603030 5004 reflector.go:484] object-"openshift-network-operator"/"openshift-service-ca.crt": watch of *v1.ConfigMap ended with: an error on the server ("unable to decode an event from the watch stream: http2: client connection force closed via ClientConn.Close") has prevented the request from succeeding Dec 01 08:17:22 crc kubenswrapper[5004]: W1201 08:17:22.603051 5004 reflector.go:484] object-"openshift-network-node-identity"/"network-node-identity-cert": watch of *v1.Secret ended with: an error on the server ("unable to decode an event from the watch stream: http2: client connection force closed via ClientConn.Close") has prevented the request from succeeding Dec 01 08:17:22 crc kubenswrapper[5004]: W1201 08:17:22.603076 5004 reflector.go:484] object-"openshift-network-node-identity"/"ovnkube-identity-cm": watch of *v1.ConfigMap ended with: an error on the server ("unable to decode an event from the watch stream: http2: client connection force closed via ClientConn.Close") has prevented the request from succeeding Dec 01 08:17:22 crc kubenswrapper[5004]: W1201 08:17:22.603078 5004 reflector.go:484] object-"openshift-network-operator"/"kube-root-ca.crt": watch of *v1.ConfigMap ended with: an error on the server ("unable to decode an event from the watch stream: http2: client connection force closed via ClientConn.Close") has prevented the request from succeeding Dec 01 08:17:22 crc kubenswrapper[5004]: E1201 08:17:22.603041 5004 event.go:368] "Unable to write event (may retry after sleeping)" err="Post 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": write tcp 38.102.83.75:47240->38.102.83.75:6443: use of closed network connection" event="&Event{ObjectMeta:{kube-apiserver-crc.187d09742d755427 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-01 08:17:03.582622759 +0000 UTC m=+1.147614751,LastTimestamp:2025-12-01 08:17:03.582622759 +0000 UTC m=+1.147614751,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 01 08:17:22 crc kubenswrapper[5004]: W1201 08:17:22.603124 5004 reflector.go:484] k8s.io/client-go/informers/factory.go:160: watch of *v1.Node ended with: an error on the server ("unable to decode an event from the watch stream: http2: client connection force closed via ClientConn.Close") has prevented the request from succeeding Dec 01 08:17:22 crc kubenswrapper[5004]: W1201 08:17:22.603140 5004 reflector.go:484] object-"openshift-network-operator"/"metrics-tls": watch of *v1.Secret ended with: an error on the server ("unable to decode an event from the watch stream: http2: client connection force closed via ClientConn.Close") has prevented the request from succeeding Dec 01 08:17:22 crc kubenswrapper[5004]: I1201 08:17:22.603155 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Patch \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-target-xd92c/status\": http2: client connection force closed via ClientConn.Close" Dec 01 08:17:22 crc kubenswrapper[5004]: W1201 08:17:22.603176 5004 reflector.go:484] object-"openshift-dns"/"node-resolver-dockercfg-kz9s7": watch of *v1.Secret ended with: an error on the server ("unable to decode an event from the watch stream: http2: client connection force closed via ClientConn.Close") has prevented the request from succeeding Dec 01 08:17:22 crc kubenswrapper[5004]: W1201 08:17:22.603186 5004 reflector.go:484] pkg/kubelet/config/apiserver.go:66: watch of *v1.Pod ended with: an error on the server ("unable to decode an event from the watch stream: http2: client connection force closed via ClientConn.Close") has prevented the request from succeeding Dec 01 08:17:22 crc kubenswrapper[5004]: W1201 08:17:22.603199 5004 reflector.go:484] object-"openshift-dns"/"openshift-service-ca.crt": watch of *v1.ConfigMap ended with: an error on the server ("unable to decode an event from the watch stream: http2: client connection force closed via ClientConn.Close") has prevented the request from succeeding Dec 01 08:17:22 crc kubenswrapper[5004]: W1201 08:17:22.603213 5004 reflector.go:484] object-"openshift-network-node-identity"/"kube-root-ca.crt": watch of 
*v1.ConfigMap ended with: an error on the server ("unable to decode an event from the watch stream: http2: client connection force closed via ClientConn.Close") has prevented the request from succeeding Dec 01 08:17:22 crc kubenswrapper[5004]: W1201 08:17:22.603101 5004 reflector.go:484] object-"openshift-dns"/"kube-root-ca.crt": watch of *v1.ConfigMap ended with: an error on the server ("unable to decode an event from the watch stream: http2: client connection force closed via ClientConn.Close") has prevented the request from succeeding Dec 01 08:17:22 crc kubenswrapper[5004]: W1201 08:17:22.603152 5004 reflector.go:484] k8s.io/client-go/informers/factory.go:160: watch of *v1.RuntimeClass ended with: an error on the server ("unable to decode an event from the watch stream: http2: client connection force closed via ClientConn.Close") has prevented the request from succeeding Dec 01 08:17:22 crc kubenswrapper[5004]: W1201 08:17:22.602993 5004 reflector.go:484] object-"openshift-network-operator"/"iptables-alerter-script": watch of *v1.ConfigMap ended with: an error on the server ("unable to decode an event from the watch stream: http2: client connection force closed via ClientConn.Close") has prevented the request from succeeding Dec 01 08:17:22 crc kubenswrapper[5004]: W1201 08:17:22.603238 5004 reflector.go:484] object-"openshift-network-node-identity"/"openshift-service-ca.crt": watch of *v1.ConfigMap ended with: an error on the server ("unable to decode an event from the watch stream: http2: client connection force closed via ClientConn.Close") has prevented the request from succeeding Dec 01 08:17:22 crc kubenswrapper[5004]: I1201 08:17:22.625997 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/8828af41-beeb-47dd-96cf-3dbcb5175893-hosts-file\") pod \"node-resolver-jjms6\" (UID: \"8828af41-beeb-47dd-96cf-3dbcb5175893\") " 
pod="openshift-dns/node-resolver-jjms6" Dec 01 08:17:22 crc kubenswrapper[5004]: I1201 08:17:22.626420 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvsch\" (UniqueName: \"kubernetes.io/projected/8828af41-beeb-47dd-96cf-3dbcb5175893-kube-api-access-mvsch\") pod \"node-resolver-jjms6\" (UID: \"8828af41-beeb-47dd-96cf-3dbcb5175893\") " pod="openshift-dns/node-resolver-jjms6" Dec 01 08:17:22 crc kubenswrapper[5004]: I1201 08:17:22.630092 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:22Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:22 crc kubenswrapper[5004]: I1201 08:17:22.648362 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:22Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:22 crc kubenswrapper[5004]: I1201 08:17:22.668014 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:22Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:22 crc kubenswrapper[5004]: I1201 08:17:22.693237 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43c9f762-d403-49b9-8294-d66976aca4dd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b358f641ff09bc8f4a452b0d4a8cf5f9790182f76779be31426d48d0278b0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d610c3a799d4a7df9ddfe2a28f2ced2e5ac4e96c67c93f8e6d3a0eee60d9ac48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43f9f72af3273da055eeb1e13ad6258a1b3b6b205402861beee9f87f02230297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38e3cc2eca559417c71beed561569c4dc5b45ffb8407976e0695fc1b93219d37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f429
28e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c24fbeedbc7c40b6079b7ad89d8564a4f2cdaad2cf996e66682b082382da190\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00e163cd374befc8b8c6cdfe3916235cd63280474c2a09eeebf108fb75d378a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092
272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00e163cd374befc8b8c6cdfe3916235cd63280474c2a09eeebf108fb75d378a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37376e093170ad4138e102e3a1e2c2b45b3944bf36b0d77dce126bd493d60803\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37376e093170ad4138e102e3a1e2c2b45b3944bf36b0d77dce126bd493d60803\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e4db8d673550f14f3614e47dc0bf81e0920753be13fb01c7ef0d6bd3716611f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4db8d673550f14f3614e47dc0bf81e0920753be13fb01c7ef0d6bd3716611f3\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-12-01T08:17:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:22Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:22 crc kubenswrapper[5004]: I1201 08:17:22.704611 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0ed2b6d-0c61-4639-bc3b-1c8effc4815d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45a0d7a12e7c114de1740a151e04f7f39844b709740cab958f156ac918af3228\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d69a370af9f8f6c575f99da6bb01cee0f660dccb7e85f16f5d8874d0e263e1fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c82baee0fd931fd7b41cd5a73ca7e653ef13dfa3d573991cbe3238d6b95f6fac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad581564fc519328ae429bce757797f6d87d8648e862b008637a245866ec4cbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad581564fc519328ae429bce757797f6d87d8648e862b008637a245866ec4cbe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T08:17:20Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 08:17:15.185125 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 08:17:15.186793 1 dynamic_serving_content.go:116] 
\\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4071633747/tls.crt::/tmp/serving-cert-4071633747/tls.key\\\\\\\"\\\\nI1201 08:17:20.827614 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 08:17:20.834528 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 08:17:20.834549 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 08:17:20.837611 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 08:17:20.837630 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 08:17:20.848940 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 08:17:20.848957 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 08:17:20.848961 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 08:17:20.848965 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 08:17:20.848968 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 08:17:20.848970 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 08:17:20.848973 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 08:17:20.849115 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 08:17:20.854990 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2caeb2ebaf68110cb8728e60aae6db1b1fc48efae6018f66070766a4f42caa69\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2276a185fefa99f4e9fe63b7acce901ce2eb168f71a65f5fa97c1892aebce0e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2276a185fefa99f4e9fe63b7acce901ce2eb168f71a65f5fa97c1892aebce0e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:22Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:22 crc kubenswrapper[5004]: I1201 08:17:22.715543 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:22Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:22 crc kubenswrapper[5004]: I1201 08:17:22.724501 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jjms6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8828af41-beeb-47dd-96cf-3dbcb5175893\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvsch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jjms6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:22Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:22 crc kubenswrapper[5004]: I1201 08:17:22.727718 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvsch\" (UniqueName: \"kubernetes.io/projected/8828af41-beeb-47dd-96cf-3dbcb5175893-kube-api-access-mvsch\") pod \"node-resolver-jjms6\" (UID: \"8828af41-beeb-47dd-96cf-3dbcb5175893\") " pod="openshift-dns/node-resolver-jjms6" Dec 01 08:17:22 crc kubenswrapper[5004]: I1201 08:17:22.727834 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: 
\"kubernetes.io/host-path/8828af41-beeb-47dd-96cf-3dbcb5175893-hosts-file\") pod \"node-resolver-jjms6\" (UID: \"8828af41-beeb-47dd-96cf-3dbcb5175893\") " pod="openshift-dns/node-resolver-jjms6" Dec 01 08:17:22 crc kubenswrapper[5004]: I1201 08:17:22.728000 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/8828af41-beeb-47dd-96cf-3dbcb5175893-hosts-file\") pod \"node-resolver-jjms6\" (UID: \"8828af41-beeb-47dd-96cf-3dbcb5175893\") " pod="openshift-dns/node-resolver-jjms6" Dec 01 08:17:22 crc kubenswrapper[5004]: I1201 08:17:22.745683 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvsch\" (UniqueName: \"kubernetes.io/projected/8828af41-beeb-47dd-96cf-3dbcb5175893-kube-api-access-mvsch\") pod \"node-resolver-jjms6\" (UID: \"8828af41-beeb-47dd-96cf-3dbcb5175893\") " pod="openshift-dns/node-resolver-jjms6" Dec 01 08:17:22 crc kubenswrapper[5004]: I1201 08:17:22.762286 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Dec 01 08:17:22 crc kubenswrapper[5004]: I1201 08:17:22.762778 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Dec 01 08:17:22 crc kubenswrapper[5004]: I1201 08:17:22.763539 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Dec 01 08:17:22 crc kubenswrapper[5004]: I1201 08:17:22.764129 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Dec 01 08:17:22 crc kubenswrapper[5004]: I1201 08:17:22.764734 
5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Dec 01 08:17:22 crc kubenswrapper[5004]: I1201 08:17:22.765203 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Dec 01 08:17:22 crc kubenswrapper[5004]: I1201 08:17:22.765765 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Dec 01 08:17:22 crc kubenswrapper[5004]: I1201 08:17:22.766257 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Dec 01 08:17:22 crc kubenswrapper[5004]: I1201 08:17:22.768196 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Dec 01 08:17:22 crc kubenswrapper[5004]: I1201 08:17:22.768732 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Dec 01 08:17:22 crc kubenswrapper[5004]: I1201 08:17:22.769583 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Dec 01 08:17:22 crc kubenswrapper[5004]: I1201 08:17:22.770245 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Dec 01 08:17:22 crc kubenswrapper[5004]: I1201 08:17:22.771052 
5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Dec 01 08:17:22 crc kubenswrapper[5004]: I1201 08:17:22.771524 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Dec 01 08:17:22 crc kubenswrapper[5004]: I1201 08:17:22.772025 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Dec 01 08:17:22 crc kubenswrapper[5004]: I1201 08:17:22.772874 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Dec 01 08:17:22 crc kubenswrapper[5004]: I1201 08:17:22.773374 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Dec 01 08:17:22 crc kubenswrapper[5004]: I1201 08:17:22.774126 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Dec 01 08:17:22 crc kubenswrapper[5004]: I1201 08:17:22.774659 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Dec 01 08:17:22 crc kubenswrapper[5004]: I1201 08:17:22.775225 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Dec 01 08:17:22 crc kubenswrapper[5004]: I1201 08:17:22.777866 
5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Dec 01 08:17:22 crc kubenswrapper[5004]: I1201 08:17:22.778432 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Dec 01 08:17:22 crc kubenswrapper[5004]: I1201 08:17:22.778890 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Dec 01 08:17:22 crc kubenswrapper[5004]: I1201 08:17:22.779861 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Dec 01 08:17:22 crc kubenswrapper[5004]: I1201 08:17:22.780233 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Dec 01 08:17:22 crc kubenswrapper[5004]: I1201 08:17:22.781230 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Dec 01 08:17:22 crc kubenswrapper[5004]: I1201 08:17:22.781871 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Dec 01 08:17:22 crc kubenswrapper[5004]: I1201 08:17:22.782673 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Dec 01 08:17:22 crc kubenswrapper[5004]: I1201 08:17:22.783217 
5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Dec 01 08:17:22 crc kubenswrapper[5004]: I1201 08:17:22.784010 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Dec 01 08:17:22 crc kubenswrapper[5004]: I1201 08:17:22.784474 5004 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Dec 01 08:17:22 crc kubenswrapper[5004]: I1201 08:17:22.784597 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Dec 01 08:17:22 crc kubenswrapper[5004]: I1201 08:17:22.786716 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Dec 01 08:17:22 crc kubenswrapper[5004]: I1201 08:17:22.787176 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Dec 01 08:17:22 crc kubenswrapper[5004]: I1201 08:17:22.787610 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Dec 01 08:17:22 crc kubenswrapper[5004]: I1201 08:17:22.788835 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:22Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:22 crc kubenswrapper[5004]: I1201 08:17:22.789158 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Dec 01 08:17:22 crc kubenswrapper[5004]: I1201 08:17:22.790117 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Dec 01 08:17:22 crc kubenswrapper[5004]: I1201 08:17:22.791089 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Dec 01 08:17:22 crc kubenswrapper[5004]: I1201 08:17:22.792147 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Dec 01 08:17:22 crc kubenswrapper[5004]: I1201 08:17:22.792799 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Dec 01 08:17:22 crc kubenswrapper[5004]: I1201 08:17:22.793781 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Dec 01 08:17:22 crc kubenswrapper[5004]: I1201 08:17:22.794355 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Dec 01 08:17:22 crc kubenswrapper[5004]: I1201 08:17:22.795280 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Dec 01 08:17:22 crc kubenswrapper[5004]: I1201 08:17:22.796198 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Dec 01 08:17:22 crc kubenswrapper[5004]: I1201 08:17:22.796695 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Dec 01 08:17:22 crc kubenswrapper[5004]: I1201 08:17:22.797391 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Dec 01 08:17:22 crc kubenswrapper[5004]: I1201 08:17:22.798663 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Dec 01 08:17:22 crc kubenswrapper[5004]: I1201 08:17:22.799352 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Dec 01 08:17:22 crc kubenswrapper[5004]: I1201 08:17:22.800236 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:22Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:22 crc kubenswrapper[5004]: I1201 08:17:22.800382 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Dec 01 08:17:22 crc kubenswrapper[5004]: I1201 08:17:22.800918 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Dec 01 08:17:22 crc kubenswrapper[5004]: I1201 08:17:22.802419 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Dec 01 08:17:22 crc kubenswrapper[5004]: I1201 08:17:22.802934 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Dec 01 08:17:22 crc kubenswrapper[5004]: I1201 
08:17:22.803471 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Dec 01 08:17:22 crc kubenswrapper[5004]: I1201 08:17:22.804279 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Dec 01 08:17:22 crc kubenswrapper[5004]: I1201 08:17:22.811957 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:22Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:22 crc kubenswrapper[5004]: I1201 08:17:22.820673 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:22Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:22 crc kubenswrapper[5004]: I1201 08:17:22.831750 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:22Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:22 crc kubenswrapper[5004]: I1201 08:17:22.847385 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43c9f762-d403-49b9-8294-d66976aca4dd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b358f641ff09bc8f4a452b0d4a8cf5f9790182f76779be31426d48d0278b0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d610c3a799d4a7df9ddfe2a28f2ced2e5ac4e96c67c93f8e6d3a0eee60d9ac48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43f9f72af3273da055eeb1e13ad6258a1b3b6b205402861beee9f87f02230297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38e3cc2eca559417c71beed561569c4dc5b45ffb8407976e0695fc1b93219d37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f429
28e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c24fbeedbc7c40b6079b7ad89d8564a4f2cdaad2cf996e66682b082382da190\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00e163cd374befc8b8c6cdfe3916235cd63280474c2a09eeebf108fb75d378a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092
272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00e163cd374befc8b8c6cdfe3916235cd63280474c2a09eeebf108fb75d378a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37376e093170ad4138e102e3a1e2c2b45b3944bf36b0d77dce126bd493d60803\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37376e093170ad4138e102e3a1e2c2b45b3944bf36b0d77dce126bd493d60803\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e4db8d673550f14f3614e47dc0bf81e0920753be13fb01c7ef0d6bd3716611f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4db8d673550f14f3614e47dc0bf81e0920753be13fb01c7ef0d6bd3716611f3\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-12-01T08:17:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:22Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:22 crc kubenswrapper[5004]: I1201 08:17:22.849099 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 08:17:22 crc kubenswrapper[5004]: I1201 08:17:22.858215 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 08:17:22 crc kubenswrapper[5004]: I1201 08:17:22.874762 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0ed2b6d-0c61-4639-bc3b-1c8effc4815d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45a0d7a12e7c114de1740a151e04f7f39844b709740cab958f156ac918af3228\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d69a370af9f8f6c575f99da6bb01cee0f660dccb7e85f16f5d8874d0e263e1fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c82baee0fd931fd7b41cd5a73ca7e653ef13dfa3d573991cbe3238d6b95f6fac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad581564fc519328ae429bce757797f6d87d8648e862b008637a245866ec4cbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad581564fc519328ae429bce757797f6d87d8648e862b008637a245866ec4cbe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T08:17:20Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 08:17:15.185125 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 08:17:15.186793 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4071633747/tls.crt::/tmp/serving-cert-4071633747/tls.key\\\\\\\"\\\\nI1201 08:17:20.827614 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 08:17:20.834528 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 08:17:20.834549 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 08:17:20.837611 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 08:17:20.837630 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 08:17:20.848940 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 08:17:20.848957 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 08:17:20.848961 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 08:17:20.848965 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 08:17:20.848968 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 08:17:20.848970 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 08:17:20.848973 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 08:17:20.849115 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 08:17:20.854990 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2caeb2ebaf68110cb8728e60aae6db1b1fc48efae6018f66070766a4f42caa69\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2276a185fefa99f4e9fe63b7acce901ce2eb168f71a65f5fa97c1892aebce0e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"la
stState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2276a185fefa99f4e9fe63b7acce901ce2eb168f71a65f5fa97c1892aebce0e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:22Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:22 crc kubenswrapper[5004]: I1201 08:17:22.875013 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-jjms6" Dec 01 08:17:22 crc kubenswrapper[5004]: I1201 08:17:22.915964 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:22Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:22 crc kubenswrapper[5004]: I1201 08:17:22.938275 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jjms6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8828af41-beeb-47dd-96cf-3dbcb5175893\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvsch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jjms6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:22Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:22 crc kubenswrapper[5004]: I1201 08:17:22.941373 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"225cff2ed9ff2311087a58ccdc2401a213c895083e9d1b79d7c8c932f4bdbb70"} Dec 01 08:17:22 crc kubenswrapper[5004]: I1201 08:17:22.956919 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:22Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:22 crc kubenswrapper[5004]: I1201 08:17:22.957417 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"327477e7db10129d47ddf9078f32d0df2ffd4b8e59fc7bb77c17c60c63e9a936"} Dec 01 08:17:22 crc kubenswrapper[5004]: I1201 08:17:22.957459 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"747671fe19b1b5f27d25667844b7f01cb87f1bedf8d181fca54f82470dd22fa3"} Dec 01 08:17:22 crc kubenswrapper[5004]: I1201 08:17:22.962253 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-jjms6" 
event={"ID":"8828af41-beeb-47dd-96cf-3dbcb5175893","Type":"ContainerStarted","Data":"acd2b017a7557bab6d6a89257f010e18e4053bd4dd38e3711d51b8637481494b"} Dec 01 08:17:22 crc kubenswrapper[5004]: I1201 08:17:22.964081 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"3fd2f2b5d7dc3d7b8e1bb06fdf0eec3addfec3c230e044b6395d6cf535630ebd"} Dec 01 08:17:22 crc kubenswrapper[5004]: I1201 08:17:22.964105 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"a0b2250161e400b201f0528e33231c9007342002a37cd037a1b1e011f2aaaa22"} Dec 01 08:17:22 crc kubenswrapper[5004]: I1201 08:17:22.964115 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"55e695ffcda1d11c647076925626669664cbcee522f7b5468ef9ca2255f82b70"} Dec 01 08:17:22 crc kubenswrapper[5004]: I1201 08:17:22.967264 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 01 08:17:22 crc kubenswrapper[5004]: I1201 08:17:22.968259 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"aa5f341e99100e985f65c24647d976a24782dfae7f52e752b774f41f69af27fd"} Dec 01 08:17:22 crc kubenswrapper[5004]: I1201 08:17:22.968806 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 08:17:22 crc kubenswrapper[5004]: I1201 08:17:22.974326 5004 status_manager.go:875] "Failed 
to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:22Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:22 crc kubenswrapper[5004]: I1201 08:17:22.991814 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:22Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.008506 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:23Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.018984 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:23Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.043746 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43c9f762-d403-49b9-8294-d66976aca4dd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b358f641ff09bc8f4a452b0d4a8cf5f9790182f76779be31426d48d0278b0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d610c3a799d4a7df9ddfe2a28f2ced2e5ac4e96c67c93f8e6d3a0eee60d9ac48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43f9f72af3273da055eeb1e13ad6258a1b3b6b205402861beee9f87f02230297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38e3cc2eca559417c71beed561569c4dc5b45ffb8407976e0695fc1b93219d37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c24fbeedbc7c40b6079b7ad89d8564a4f2cdaad2cf996e66682b082382da190\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00e163cd374befc8b8c6cdfe3916235cd63280474c2a09eeebf108fb75d378a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00e163cd374befc8b8c6cdfe3916235cd63280474c2a09eeebf108fb75d378a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-01T08:17:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37376e093170ad4138e102e3a1e2c2b45b3944bf36b0d77dce126bd493d60803\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37376e093170ad4138e102e3a1e2c2b45b3944bf36b0d77dce126bd493d60803\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e4db8d673550f14f3614e47dc0bf81e0920753be13fb01c7ef0d6bd3716611f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4db8d673550f14f3614e47dc0bf81e0920753be13fb01c7ef0d6bd3716611f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:23Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.062734 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0ed2b6d-0c61-4639-bc3b-1c8effc4815d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45a0d7a12e7c114de1740a151e04f7f39844b709740cab958f156ac918af3228\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d69a370af9f8f6c575f99da6bb01cee0f660dccb7e85f16f5d8874d0e263e1fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c82baee0fd931fd7b41cd5a73ca7e653ef13dfa3d573991cbe3238d6b95f6fac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad581564fc519328ae429bce757797f6d87d8648e862b008637a245866ec4cbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad581564fc519328ae429bce757797f6d87d8648e862b008637a245866ec4cbe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T08:17:20Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 08:17:15.185125 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 08:17:15.186793 1 dynamic_serving_content.go:116] 
\\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4071633747/tls.crt::/tmp/serving-cert-4071633747/tls.key\\\\\\\"\\\\nI1201 08:17:20.827614 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 08:17:20.834528 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 08:17:20.834549 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 08:17:20.837611 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 08:17:20.837630 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 08:17:20.848940 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 08:17:20.848957 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 08:17:20.848961 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 08:17:20.848965 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 08:17:20.848968 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 08:17:20.848970 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 08:17:20.848973 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 08:17:20.849115 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 08:17:20.854990 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2caeb2ebaf68110cb8728e60aae6db1b1fc48efae6018f66070766a4f42caa69\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2276a185fefa99f4e9fe63b7acce901ce2eb168f71a65f5fa97c1892aebce0e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2276a185fefa99f4e9fe63b7acce901ce2eb168f71a65f5fa97c1892aebce0e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:23Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.073740 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:23Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.082370 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jjms6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8828af41-beeb-47dd-96cf-3dbcb5175893\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvsch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jjms6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:23Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.092457 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:23Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.105795 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fd2f2b5d7dc3d7b8e1bb06fdf0eec3addfec3c230e044b6395d6cf535630ebd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b2250161e400b201f0528e33231c9007342002a37cd037a1b1e011f2aaaa22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:23Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.119336 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://327477e7db10129d47ddf9078f32d0df2ffd4b8e59fc7bb77c17c60c63e9a936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:23Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.131187 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:23Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.142834 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:23Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.166678 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43c9f762-d403-49b9-8294-d66976aca4dd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b358f641ff09bc8f4a452b0d4a8cf5f9790182f76779be31426d48d0278b0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d610c3a799d4a7df9ddfe2a28f2ced2e5ac4e96c67c93f8e6d3a0eee60d9ac48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43f9f72af3273da055eeb1e13ad6258a1b3b6b205402861beee9f87f02230297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38e3cc2eca559417c71beed561569c4dc5b45ffb8407976e0695fc1b93219d37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c24fbeedbc7c40b6079b7ad89d8564a4f2cdaad2cf996e66682b082382da190\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00e163cd374befc8b8c6cdfe3916235cd63280474c2a09eeebf108fb75d378a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00e163cd374befc8b8c6cdfe3916235cd63280474c2a09eeebf108fb75d378a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-01T08:17:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37376e093170ad4138e102e3a1e2c2b45b3944bf36b0d77dce126bd493d60803\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37376e093170ad4138e102e3a1e2c2b45b3944bf36b0d77dce126bd493d60803\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e4db8d673550f14f3614e47dc0bf81e0920753be13fb01c7ef0d6bd3716611f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4db8d673550f14f3614e47dc0bf81e0920753be13fb01c7ef0d6bd3716611f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:23Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.186258 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0ed2b6d-0c61-4639-bc3b-1c8effc4815d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45a0d7a12e7c114de1740a151e04f7f39844b709740cab958f156ac918af3228\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d69a370af9f8f6c575f99da6bb01cee0f660dccb7e85f16f5d8874d0e263e1fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c82baee0fd931fd7b41cd5a73ca7e653ef13dfa3d573991cbe3238d6b95f6fac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa5f341e99100e985f65c24647d976a24782dfae7f52e752b774f41f69af27fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad581564fc519328ae429bce757797f6d87d8648e862b008637a245866ec4cbe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T08:17:20Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 08:17:15.185125 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 08:17:15.186793 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-4071633747/tls.crt::/tmp/serving-cert-4071633747/tls.key\\\\\\\"\\\\nI1201 08:17:20.827614 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 08:17:20.834528 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 08:17:20.834549 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 08:17:20.837611 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 08:17:20.837630 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 08:17:20.848940 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 08:17:20.848957 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 08:17:20.848961 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 08:17:20.848965 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 08:17:20.848968 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 08:17:20.848970 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 08:17:20.848973 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 08:17:20.849115 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 08:17:20.854990 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2caeb2ebaf68110cb8728e60aae6db1b1fc48efae6018f66070766a4f42caa69\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2276a185fefa99f4e9fe63b7acce901ce2eb168f71a65f5fa97c1892aebce0e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2276a185fefa99f4e9fe63b7acce901ce2eb168f71a65f5fa97c1892aebce0e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:23Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.197446 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:23Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.215977 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jjms6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8828af41-beeb-47dd-96cf-3dbcb5175893\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvsch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jjms6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:23Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.431231 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.433621 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 08:17:23 crc 
kubenswrapper[5004]: I1201 08:17:23.433685 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.433710 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.433730 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.433749 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 08:17:23 crc kubenswrapper[5004]: E1201 08:17:23.433788 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b 
nodeName:}" failed. No retries permitted until 2025-12-01 08:17:25.433761638 +0000 UTC m=+22.998753620 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 08:17:23 crc kubenswrapper[5004]: E1201 08:17:23.433829 5004 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 08:17:23 crc kubenswrapper[5004]: E1201 08:17:23.433862 5004 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 08:17:23 crc kubenswrapper[5004]: E1201 08:17:23.433873 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 08:17:25.43385971 +0000 UTC m=+22.998851692 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 08:17:23 crc kubenswrapper[5004]: E1201 08:17:23.433897 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 08:17:25.433890531 +0000 UTC m=+22.998882513 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 08:17:23 crc kubenswrapper[5004]: E1201 08:17:23.433929 5004 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 08:17:23 crc kubenswrapper[5004]: E1201 08:17:23.433941 5004 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 08:17:23 crc kubenswrapper[5004]: E1201 08:17:23.433950 5004 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 08:17:23 crc kubenswrapper[5004]: E1201 08:17:23.433971 5004 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-01 08:17:25.433965343 +0000 UTC m=+22.998957325 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 08:17:23 crc kubenswrapper[5004]: E1201 08:17:23.433974 5004 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 08:17:23 crc kubenswrapper[5004]: E1201 08:17:23.433987 5004 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 08:17:23 crc kubenswrapper[5004]: E1201 08:17:23.434000 5004 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 08:17:23 crc kubenswrapper[5004]: E1201 08:17:23.434023 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-01 08:17:25.434015864 +0000 UTC m=+22.999007846 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.503302 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.517312 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.609639 5004 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.610509 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc","openshift-machine-config-operator/machine-config-daemon-fvdgt","openshift-multus/multus-additional-cni-plugins-dpkxw","openshift-multus/multus-zjksw"] Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.611523 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-zjksw" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.611887 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.612399 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-dpkxw" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.614267 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.614476 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.614843 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.614987 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.615121 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.615708 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.615788 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.616708 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.616967 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.617117 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.617614 5004 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.617639 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.634735 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:23Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.635855 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/70e79009-93be-49c4-a6b3-e8a06bcea7f4-multus-socket-dir-parent\") pod \"multus-zjksw\" (UID: \"70e79009-93be-49c4-a6b3-e8a06bcea7f4\") " pod="openshift-multus/multus-zjksw" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.635876 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/70e79009-93be-49c4-a6b3-e8a06bcea7f4-etc-kubernetes\") pod \"multus-zjksw\" (UID: \"70e79009-93be-49c4-a6b3-e8a06bcea7f4\") " pod="openshift-multus/multus-zjksw" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.635898 5004 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/70e79009-93be-49c4-a6b3-e8a06bcea7f4-host-var-lib-kubelet\") pod \"multus-zjksw\" (UID: \"70e79009-93be-49c4-a6b3-e8a06bcea7f4\") " pod="openshift-multus/multus-zjksw" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.635923 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/70e79009-93be-49c4-a6b3-e8a06bcea7f4-host-run-netns\") pod \"multus-zjksw\" (UID: \"70e79009-93be-49c4-a6b3-e8a06bcea7f4\") " pod="openshift-multus/multus-zjksw" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.635943 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/70e79009-93be-49c4-a6b3-e8a06bcea7f4-host-var-lib-cni-bin\") pod \"multus-zjksw\" (UID: \"70e79009-93be-49c4-a6b3-e8a06bcea7f4\") " pod="openshift-multus/multus-zjksw" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.636043 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/70e79009-93be-49c4-a6b3-e8a06bcea7f4-hostroot\") pod \"multus-zjksw\" (UID: \"70e79009-93be-49c4-a6b3-e8a06bcea7f4\") " pod="openshift-multus/multus-zjksw" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.636122 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ddcaf111-708a-45b3-a342-effd3061ab17-system-cni-dir\") pod \"multus-additional-cni-plugins-dpkxw\" (UID: \"ddcaf111-708a-45b3-a342-effd3061ab17\") " pod="openshift-multus/multus-additional-cni-plugins-dpkxw" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.636156 5004 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/70e79009-93be-49c4-a6b3-e8a06bcea7f4-multus-daemon-config\") pod \"multus-zjksw\" (UID: \"70e79009-93be-49c4-a6b3-e8a06bcea7f4\") " pod="openshift-multus/multus-zjksw" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.636271 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/70e79009-93be-49c4-a6b3-e8a06bcea7f4-cnibin\") pod \"multus-zjksw\" (UID: \"70e79009-93be-49c4-a6b3-e8a06bcea7f4\") " pod="openshift-multus/multus-zjksw" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.636340 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/70e79009-93be-49c4-a6b3-e8a06bcea7f4-multus-cni-dir\") pod \"multus-zjksw\" (UID: \"70e79009-93be-49c4-a6b3-e8a06bcea7f4\") " pod="openshift-multus/multus-zjksw" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.636415 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bd8m6\" (UniqueName: \"kubernetes.io/projected/70e79009-93be-49c4-a6b3-e8a06bcea7f4-kube-api-access-bd8m6\") pod \"multus-zjksw\" (UID: \"70e79009-93be-49c4-a6b3-e8a06bcea7f4\") " pod="openshift-multus/multus-zjksw" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.636454 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ddcaf111-708a-45b3-a342-effd3061ab17-tuning-conf-dir\") pod \"multus-additional-cni-plugins-dpkxw\" (UID: \"ddcaf111-708a-45b3-a342-effd3061ab17\") " pod="openshift-multus/multus-additional-cni-plugins-dpkxw" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.636499 5004 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/ddcaf111-708a-45b3-a342-effd3061ab17-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-dpkxw\" (UID: \"ddcaf111-708a-45b3-a342-effd3061ab17\") " pod="openshift-multus/multus-additional-cni-plugins-dpkxw" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.636570 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f9977ebb-82de-4e96-8763-0b5a84f8d4ce-mcd-auth-proxy-config\") pod \"machine-config-daemon-fvdgt\" (UID: \"f9977ebb-82de-4e96-8763-0b5a84f8d4ce\") " pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.636610 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/70e79009-93be-49c4-a6b3-e8a06bcea7f4-system-cni-dir\") pod \"multus-zjksw\" (UID: \"70e79009-93be-49c4-a6b3-e8a06bcea7f4\") " pod="openshift-multus/multus-zjksw" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.636632 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ddcaf111-708a-45b3-a342-effd3061ab17-os-release\") pod \"multus-additional-cni-plugins-dpkxw\" (UID: \"ddcaf111-708a-45b3-a342-effd3061ab17\") " pod="openshift-multus/multus-additional-cni-plugins-dpkxw" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.636670 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/70e79009-93be-49c4-a6b3-e8a06bcea7f4-host-var-lib-cni-multus\") pod \"multus-zjksw\" (UID: \"70e79009-93be-49c4-a6b3-e8a06bcea7f4\") " pod="openshift-multus/multus-zjksw" Dec 
01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.636691 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/70e79009-93be-49c4-a6b3-e8a06bcea7f4-cni-binary-copy\") pod \"multus-zjksw\" (UID: \"70e79009-93be-49c4-a6b3-e8a06bcea7f4\") " pod="openshift-multus/multus-zjksw" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.636706 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/70e79009-93be-49c4-a6b3-e8a06bcea7f4-host-run-k8s-cni-cncf-io\") pod \"multus-zjksw\" (UID: \"70e79009-93be-49c4-a6b3-e8a06bcea7f4\") " pod="openshift-multus/multus-zjksw" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.636723 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/f9977ebb-82de-4e96-8763-0b5a84f8d4ce-rootfs\") pod \"machine-config-daemon-fvdgt\" (UID: \"f9977ebb-82de-4e96-8763-0b5a84f8d4ce\") " pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.636739 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwgxq\" (UniqueName: \"kubernetes.io/projected/f9977ebb-82de-4e96-8763-0b5a84f8d4ce-kube-api-access-pwgxq\") pod \"machine-config-daemon-fvdgt\" (UID: \"f9977ebb-82de-4e96-8763-0b5a84f8d4ce\") " pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.636807 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/70e79009-93be-49c4-a6b3-e8a06bcea7f4-multus-conf-dir\") pod \"multus-zjksw\" (UID: \"70e79009-93be-49c4-a6b3-e8a06bcea7f4\") " 
pod="openshift-multus/multus-zjksw" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.636856 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xb29s\" (UniqueName: \"kubernetes.io/projected/ddcaf111-708a-45b3-a342-effd3061ab17-kube-api-access-xb29s\") pod \"multus-additional-cni-plugins-dpkxw\" (UID: \"ddcaf111-708a-45b3-a342-effd3061ab17\") " pod="openshift-multus/multus-additional-cni-plugins-dpkxw" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.636894 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f9977ebb-82de-4e96-8763-0b5a84f8d4ce-proxy-tls\") pod \"machine-config-daemon-fvdgt\" (UID: \"f9977ebb-82de-4e96-8763-0b5a84f8d4ce\") " pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.636926 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/70e79009-93be-49c4-a6b3-e8a06bcea7f4-os-release\") pod \"multus-zjksw\" (UID: \"70e79009-93be-49c4-a6b3-e8a06bcea7f4\") " pod="openshift-multus/multus-zjksw" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.636991 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ddcaf111-708a-45b3-a342-effd3061ab17-cni-binary-copy\") pod \"multus-additional-cni-plugins-dpkxw\" (UID: \"ddcaf111-708a-45b3-a342-effd3061ab17\") " pod="openshift-multus/multus-additional-cni-plugins-dpkxw" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.637016 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ddcaf111-708a-45b3-a342-effd3061ab17-cnibin\") pod 
\"multus-additional-cni-plugins-dpkxw\" (UID: \"ddcaf111-708a-45b3-a342-effd3061ab17\") " pod="openshift-multus/multus-additional-cni-plugins-dpkxw" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.637033 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/70e79009-93be-49c4-a6b3-e8a06bcea7f4-host-run-multus-certs\") pod \"multus-zjksw\" (UID: \"70e79009-93be-49c4-a6b3-e8a06bcea7f4\") " pod="openshift-multus/multus-zjksw" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.653766 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fd2f2b5d7dc3d7b8e1bb06fdf0eec3addfec3c230e044b6395d6cf535630ebd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\
"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b2250161e400b201f0528e33231c9007342002a37cd037a1b1e011f2aaaa22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:23Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.669507 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zjksw" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70e79009-93be-49c4-a6b3-e8a06bcea7f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd8m6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zjksw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:23Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.679643 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.688791 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://327477e7db10129d47ddf9078f32d0df2ffd4b8e59fc7bb77c17c60c63e9a936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:23Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.705472 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:23Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.722493 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:23Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.738400 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ddcaf111-708a-45b3-a342-effd3061ab17-system-cni-dir\") pod \"multus-additional-cni-plugins-dpkxw\" (UID: \"ddcaf111-708a-45b3-a342-effd3061ab17\") " pod="openshift-multus/multus-additional-cni-plugins-dpkxw" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.738437 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/70e79009-93be-49c4-a6b3-e8a06bcea7f4-multus-daemon-config\") pod \"multus-zjksw\" (UID: \"70e79009-93be-49c4-a6b3-e8a06bcea7f4\") " pod="openshift-multus/multus-zjksw" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.738454 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/70e79009-93be-49c4-a6b3-e8a06bcea7f4-multus-cni-dir\") pod \"multus-zjksw\" (UID: \"70e79009-93be-49c4-a6b3-e8a06bcea7f4\") " pod="openshift-multus/multus-zjksw" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.738469 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/70e79009-93be-49c4-a6b3-e8a06bcea7f4-cnibin\") pod \"multus-zjksw\" (UID: \"70e79009-93be-49c4-a6b3-e8a06bcea7f4\") " pod="openshift-multus/multus-zjksw" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.738488 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bd8m6\" (UniqueName: \"kubernetes.io/projected/70e79009-93be-49c4-a6b3-e8a06bcea7f4-kube-api-access-bd8m6\") pod \"multus-zjksw\" (UID: \"70e79009-93be-49c4-a6b3-e8a06bcea7f4\") " pod="openshift-multus/multus-zjksw" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.738514 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ddcaf111-708a-45b3-a342-effd3061ab17-tuning-conf-dir\") pod \"multus-additional-cni-plugins-dpkxw\" (UID: \"ddcaf111-708a-45b3-a342-effd3061ab17\") " pod="openshift-multus/multus-additional-cni-plugins-dpkxw" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.738530 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/ddcaf111-708a-45b3-a342-effd3061ab17-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-dpkxw\" (UID: \"ddcaf111-708a-45b3-a342-effd3061ab17\") " pod="openshift-multus/multus-additional-cni-plugins-dpkxw" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.738549 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f9977ebb-82de-4e96-8763-0b5a84f8d4ce-mcd-auth-proxy-config\") pod \"machine-config-daemon-fvdgt\" (UID: \"f9977ebb-82de-4e96-8763-0b5a84f8d4ce\") " pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.738541 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ddcaf111-708a-45b3-a342-effd3061ab17-system-cni-dir\") pod \"multus-additional-cni-plugins-dpkxw\" (UID: \"ddcaf111-708a-45b3-a342-effd3061ab17\") " pod="openshift-multus/multus-additional-cni-plugins-dpkxw" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.738586 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/70e79009-93be-49c4-a6b3-e8a06bcea7f4-system-cni-dir\") pod \"multus-zjksw\" (UID: \"70e79009-93be-49c4-a6b3-e8a06bcea7f4\") " pod="openshift-multus/multus-zjksw" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.738673 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ddcaf111-708a-45b3-a342-effd3061ab17-os-release\") pod \"multus-additional-cni-plugins-dpkxw\" (UID: \"ddcaf111-708a-45b3-a342-effd3061ab17\") " pod="openshift-multus/multus-additional-cni-plugins-dpkxw" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.738705 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" 
(UniqueName: \"kubernetes.io/host-path/70e79009-93be-49c4-a6b3-e8a06bcea7f4-system-cni-dir\") pod \"multus-zjksw\" (UID: \"70e79009-93be-49c4-a6b3-e8a06bcea7f4\") " pod="openshift-multus/multus-zjksw" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.738681 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/70e79009-93be-49c4-a6b3-e8a06bcea7f4-cnibin\") pod \"multus-zjksw\" (UID: \"70e79009-93be-49c4-a6b3-e8a06bcea7f4\") " pod="openshift-multus/multus-zjksw" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.738737 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/70e79009-93be-49c4-a6b3-e8a06bcea7f4-host-var-lib-cni-multus\") pod \"multus-zjksw\" (UID: \"70e79009-93be-49c4-a6b3-e8a06bcea7f4\") " pod="openshift-multus/multus-zjksw" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.738775 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/f9977ebb-82de-4e96-8763-0b5a84f8d4ce-rootfs\") pod \"machine-config-daemon-fvdgt\" (UID: \"f9977ebb-82de-4e96-8763-0b5a84f8d4ce\") " pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.738805 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/70e79009-93be-49c4-a6b3-e8a06bcea7f4-cni-binary-copy\") pod \"multus-zjksw\" (UID: \"70e79009-93be-49c4-a6b3-e8a06bcea7f4\") " pod="openshift-multus/multus-zjksw" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.738827 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/70e79009-93be-49c4-a6b3-e8a06bcea7f4-host-var-lib-cni-multus\") pod \"multus-zjksw\" (UID: 
\"70e79009-93be-49c4-a6b3-e8a06bcea7f4\") " pod="openshift-multus/multus-zjksw" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.738837 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/70e79009-93be-49c4-a6b3-e8a06bcea7f4-host-run-k8s-cni-cncf-io\") pod \"multus-zjksw\" (UID: \"70e79009-93be-49c4-a6b3-e8a06bcea7f4\") " pod="openshift-multus/multus-zjksw" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.738854 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/f9977ebb-82de-4e96-8763-0b5a84f8d4ce-rootfs\") pod \"machine-config-daemon-fvdgt\" (UID: \"f9977ebb-82de-4e96-8763-0b5a84f8d4ce\") " pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.738873 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwgxq\" (UniqueName: \"kubernetes.io/projected/f9977ebb-82de-4e96-8763-0b5a84f8d4ce-kube-api-access-pwgxq\") pod \"machine-config-daemon-fvdgt\" (UID: \"f9977ebb-82de-4e96-8763-0b5a84f8d4ce\") " pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.738889 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/70e79009-93be-49c4-a6b3-e8a06bcea7f4-host-run-k8s-cni-cncf-io\") pod \"multus-zjksw\" (UID: \"70e79009-93be-49c4-a6b3-e8a06bcea7f4\") " pod="openshift-multus/multus-zjksw" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.738907 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f9977ebb-82de-4e96-8763-0b5a84f8d4ce-proxy-tls\") pod \"machine-config-daemon-fvdgt\" (UID: \"f9977ebb-82de-4e96-8763-0b5a84f8d4ce\") " 
pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.738937 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/70e79009-93be-49c4-a6b3-e8a06bcea7f4-os-release\") pod \"multus-zjksw\" (UID: \"70e79009-93be-49c4-a6b3-e8a06bcea7f4\") " pod="openshift-multus/multus-zjksw" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.738966 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/70e79009-93be-49c4-a6b3-e8a06bcea7f4-multus-conf-dir\") pod \"multus-zjksw\" (UID: \"70e79009-93be-49c4-a6b3-e8a06bcea7f4\") " pod="openshift-multus/multus-zjksw" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.738997 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xb29s\" (UniqueName: \"kubernetes.io/projected/ddcaf111-708a-45b3-a342-effd3061ab17-kube-api-access-xb29s\") pod \"multus-additional-cni-plugins-dpkxw\" (UID: \"ddcaf111-708a-45b3-a342-effd3061ab17\") " pod="openshift-multus/multus-additional-cni-plugins-dpkxw" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.739031 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ddcaf111-708a-45b3-a342-effd3061ab17-cni-binary-copy\") pod \"multus-additional-cni-plugins-dpkxw\" (UID: \"ddcaf111-708a-45b3-a342-effd3061ab17\") " pod="openshift-multus/multus-additional-cni-plugins-dpkxw" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.739051 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/70e79009-93be-49c4-a6b3-e8a06bcea7f4-multus-conf-dir\") pod \"multus-zjksw\" (UID: \"70e79009-93be-49c4-a6b3-e8a06bcea7f4\") " pod="openshift-multus/multus-zjksw" Dec 01 08:17:23 
crc kubenswrapper[5004]: I1201 08:17:23.739061 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ddcaf111-708a-45b3-a342-effd3061ab17-os-release\") pod \"multus-additional-cni-plugins-dpkxw\" (UID: \"ddcaf111-708a-45b3-a342-effd3061ab17\") " pod="openshift-multus/multus-additional-cni-plugins-dpkxw" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.739100 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ddcaf111-708a-45b3-a342-effd3061ab17-cnibin\") pod \"multus-additional-cni-plugins-dpkxw\" (UID: \"ddcaf111-708a-45b3-a342-effd3061ab17\") " pod="openshift-multus/multus-additional-cni-plugins-dpkxw" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.739096 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/70e79009-93be-49c4-a6b3-e8a06bcea7f4-os-release\") pod \"multus-zjksw\" (UID: \"70e79009-93be-49c4-a6b3-e8a06bcea7f4\") " pod="openshift-multus/multus-zjksw" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.739065 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ddcaf111-708a-45b3-a342-effd3061ab17-cnibin\") pod \"multus-additional-cni-plugins-dpkxw\" (UID: \"ddcaf111-708a-45b3-a342-effd3061ab17\") " pod="openshift-multus/multus-additional-cni-plugins-dpkxw" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.739136 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ddcaf111-708a-45b3-a342-effd3061ab17-tuning-conf-dir\") pod \"multus-additional-cni-plugins-dpkxw\" (UID: \"ddcaf111-708a-45b3-a342-effd3061ab17\") " pod="openshift-multus/multus-additional-cni-plugins-dpkxw" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.739209 5004 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/70e79009-93be-49c4-a6b3-e8a06bcea7f4-host-run-multus-certs\") pod \"multus-zjksw\" (UID: \"70e79009-93be-49c4-a6b3-e8a06bcea7f4\") " pod="openshift-multus/multus-zjksw" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.739263 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/70e79009-93be-49c4-a6b3-e8a06bcea7f4-host-run-multus-certs\") pod \"multus-zjksw\" (UID: \"70e79009-93be-49c4-a6b3-e8a06bcea7f4\") " pod="openshift-multus/multus-zjksw" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.739316 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/70e79009-93be-49c4-a6b3-e8a06bcea7f4-etc-kubernetes\") pod \"multus-zjksw\" (UID: \"70e79009-93be-49c4-a6b3-e8a06bcea7f4\") " pod="openshift-multus/multus-zjksw" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.739389 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/70e79009-93be-49c4-a6b3-e8a06bcea7f4-multus-socket-dir-parent\") pod \"multus-zjksw\" (UID: \"70e79009-93be-49c4-a6b3-e8a06bcea7f4\") " pod="openshift-multus/multus-zjksw" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.739426 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/70e79009-93be-49c4-a6b3-e8a06bcea7f4-etc-kubernetes\") pod \"multus-zjksw\" (UID: \"70e79009-93be-49c4-a6b3-e8a06bcea7f4\") " pod="openshift-multus/multus-zjksw" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.739428 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/70e79009-93be-49c4-a6b3-e8a06bcea7f4-host-var-lib-kubelet\") pod \"multus-zjksw\" (UID: \"70e79009-93be-49c4-a6b3-e8a06bcea7f4\") " pod="openshift-multus/multus-zjksw" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.739477 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/70e79009-93be-49c4-a6b3-e8a06bcea7f4-host-var-lib-kubelet\") pod \"multus-zjksw\" (UID: \"70e79009-93be-49c4-a6b3-e8a06bcea7f4\") " pod="openshift-multus/multus-zjksw" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.739507 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/70e79009-93be-49c4-a6b3-e8a06bcea7f4-host-run-netns\") pod \"multus-zjksw\" (UID: \"70e79009-93be-49c4-a6b3-e8a06bcea7f4\") " pod="openshift-multus/multus-zjksw" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.739541 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/70e79009-93be-49c4-a6b3-e8a06bcea7f4-host-var-lib-cni-bin\") pod \"multus-zjksw\" (UID: \"70e79009-93be-49c4-a6b3-e8a06bcea7f4\") " pod="openshift-multus/multus-zjksw" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.739475 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/70e79009-93be-49c4-a6b3-e8a06bcea7f4-multus-socket-dir-parent\") pod \"multus-zjksw\" (UID: \"70e79009-93be-49c4-a6b3-e8a06bcea7f4\") " pod="openshift-multus/multus-zjksw" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.739601 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/70e79009-93be-49c4-a6b3-e8a06bcea7f4-hostroot\") pod \"multus-zjksw\" (UID: \"70e79009-93be-49c4-a6b3-e8a06bcea7f4\") " 
pod="openshift-multus/multus-zjksw" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.739602 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f9977ebb-82de-4e96-8763-0b5a84f8d4ce-mcd-auth-proxy-config\") pod \"machine-config-daemon-fvdgt\" (UID: \"f9977ebb-82de-4e96-8763-0b5a84f8d4ce\") " pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.739630 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/70e79009-93be-49c4-a6b3-e8a06bcea7f4-host-var-lib-cni-bin\") pod \"multus-zjksw\" (UID: \"70e79009-93be-49c4-a6b3-e8a06bcea7f4\") " pod="openshift-multus/multus-zjksw" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.739553 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/70e79009-93be-49c4-a6b3-e8a06bcea7f4-host-run-netns\") pod \"multus-zjksw\" (UID: \"70e79009-93be-49c4-a6b3-e8a06bcea7f4\") " pod="openshift-multus/multus-zjksw" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.739667 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/70e79009-93be-49c4-a6b3-e8a06bcea7f4-hostroot\") pod \"multus-zjksw\" (UID: \"70e79009-93be-49c4-a6b3-e8a06bcea7f4\") " pod="openshift-multus/multus-zjksw" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.739754 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ddcaf111-708a-45b3-a342-effd3061ab17-cni-binary-copy\") pod \"multus-additional-cni-plugins-dpkxw\" (UID: \"ddcaf111-708a-45b3-a342-effd3061ab17\") " pod="openshift-multus/multus-additional-cni-plugins-dpkxw" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.740164 5004 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/70e79009-93be-49c4-a6b3-e8a06bcea7f4-cni-binary-copy\") pod \"multus-zjksw\" (UID: \"70e79009-93be-49c4-a6b3-e8a06bcea7f4\") " pod="openshift-multus/multus-zjksw" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.740237 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/ddcaf111-708a-45b3-a342-effd3061ab17-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-dpkxw\" (UID: \"ddcaf111-708a-45b3-a342-effd3061ab17\") " pod="openshift-multus/multus-additional-cni-plugins-dpkxw" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.740659 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/70e79009-93be-49c4-a6b3-e8a06bcea7f4-multus-cni-dir\") pod \"multus-zjksw\" (UID: \"70e79009-93be-49c4-a6b3-e8a06bcea7f4\") " pod="openshift-multus/multus-zjksw" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.741324 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/70e79009-93be-49c4-a6b3-e8a06bcea7f4-multus-daemon-config\") pod \"multus-zjksw\" (UID: \"70e79009-93be-49c4-a6b3-e8a06bcea7f4\") " pod="openshift-multus/multus-zjksw" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.743406 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f9977ebb-82de-4e96-8763-0b5a84f8d4ce-proxy-tls\") pod \"machine-config-daemon-fvdgt\" (UID: \"f9977ebb-82de-4e96-8763-0b5a84f8d4ce\") " pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.755896 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43c9f762-d403-49b9-8294-d66976aca4dd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b358f641ff09bc8f4a452b0d4a8cf5f9790182f76779be31426d48d0278b0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d610c3a799d4a7df9ddfe2a28f2ced2e5ac4e96c67c93f8e6d3a0eee60d9ac48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43f9f72af3273da055eeb1e13ad6258a1b3b6b205402861beee9f87f02230297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38e3cc2eca559417c71beed561569c4dc5b45ffb8407976e0695fc1b93219d37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\
\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c24fbeedbc7c40b6079b7ad89d8564a4f2cdaad2cf996e66682b082382da190\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00e163cd374befc8b8c6cdfe3916235cd63280474c2a09eeebf108fb75d378a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00e163cd374befc8b8c6cdfe3916235cd63280474c2a09eeebf108fb75d378a6\\\",\\\"exitCode\\\":0,\\\"fini
shedAt\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37376e093170ad4138e102e3a1e2c2b45b3944bf36b0d77dce126bd493d60803\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37376e093170ad4138e102e3a1e2c2b45b3944bf36b0d77dce126bd493d60803\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e4db8d673550f14f3614e47dc0bf81e0920753be13fb01c7ef0d6bd3716611f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4db8d673550f14f3614e47dc0bf81e0920753be13fb01c7ef0d6bd3716611f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"na
me\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:23Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.759647 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.759816 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xb29s\" (UniqueName: \"kubernetes.io/projected/ddcaf111-708a-45b3-a342-effd3061ab17-kube-api-access-xb29s\") pod \"multus-additional-cni-plugins-dpkxw\" (UID: \"ddcaf111-708a-45b3-a342-effd3061ab17\") " pod="openshift-multus/multus-additional-cni-plugins-dpkxw" Dec 01 08:17:23 crc kubenswrapper[5004]: E1201 08:17:23.759834 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.760240 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 08:17:23 crc kubenswrapper[5004]: E1201 08:17:23.760397 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.760527 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 08:17:23 crc kubenswrapper[5004]: E1201 08:17:23.760712 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.762158 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.762315 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.762156 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bd8m6\" (UniqueName: \"kubernetes.io/projected/70e79009-93be-49c4-a6b3-e8a06bcea7f4-kube-api-access-bd8m6\") pod \"multus-zjksw\" (UID: \"70e79009-93be-49c4-a6b3-e8a06bcea7f4\") " pod="openshift-multus/multus-zjksw" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.765418 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwgxq\" (UniqueName: \"kubernetes.io/projected/f9977ebb-82de-4e96-8763-0b5a84f8d4ce-kube-api-access-pwgxq\") pod \"machine-config-daemon-fvdgt\" (UID: \"f9977ebb-82de-4e96-8763-0b5a84f8d4ce\") " pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.766529 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-knmdv"] Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.770408 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.774127 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.774150 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.774270 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.774398 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.777496 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.777501 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.777769 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.779664 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:23Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.781717 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.793280 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jjms6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8828af41-beeb-47dd-96cf-3dbcb5175893\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvsch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jjms6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:23Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.806763 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9977ebb-82de-4e96-8763-0b5a84f8d4ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pwgxq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pwgxq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fvdgt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:23Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.812374 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.824532 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2fa377e-a3a6-46ce-af23-e677799f0115\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b520771cf1ad6f9360064c5f304243c5878a2f1025c87d1001c97b2d5f95cb9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,
\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b65b9bc444101900c62fd90c7d45789d186a89148ac0fd9c7f0c6048c3f55ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edbd3e682d207b8c18d6bcbc26aa2adae2a35645502d31b327ffddd51991e842\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cr
i-o://4563974ed1467a87aab104756828e63853c282a61c4fcab1e757eb4da9afaa40\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:23Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.839554 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0ed2b6d-0c61-4639-bc3b-1c8effc4815d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45a0d7a12e7c114de1740a151e04f7f39844b709740cab958f156ac918af3228\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d69a370af9f8f6c575f99da6bb01cee0f660dccb7e85f16f5d8874d0e263e1fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c82baee0fd931fd7b41cd5a73ca7e653ef13dfa3d573991cbe3238d6b95f6fac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa5f341e99100e985f65c24647d976a24782dfae7f52e752b774f41f69af27fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad581564fc519328ae429bce757797f6d87d8648e862b008637a245866ec4cbe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T08:17:20Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 08:17:15.185125 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 08:17:15.186793 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4071633747/tls.crt::/tmp/serving-cert-4071633747/tls.key\\\\\\\"\\\\nI1201 08:17:20.827614 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 08:17:20.834528 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 08:17:20.834549 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 08:17:20.837611 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 08:17:20.837630 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 08:17:20.848940 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 08:17:20.848957 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 08:17:20.848961 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 08:17:20.848965 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 08:17:20.848968 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 08:17:20.848970 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 08:17:20.848973 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 08:17:20.849115 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 08:17:20.854990 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2caeb2ebaf68110cb8728e60aae6db1b1fc48efae6018f66070766a4f42caa69\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2276a185fefa99f4e9fe63b7acce901ce2eb168f71a65f5fa97c1892aebce0e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2276a185fefa99f4e9fe63b7acce901ce2eb168f71a65f5fa97c1892aebce0e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:23Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.840630 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/15cdec0a-5925-4966-a30b-f60c503f633e-systemd-units\") pod \"ovnkube-node-knmdv\" (UID: \"15cdec0a-5925-4966-a30b-f60c503f633e\") " pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.840672 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/15cdec0a-5925-4966-a30b-f60c503f633e-host-slash\") pod \"ovnkube-node-knmdv\" (UID: \"15cdec0a-5925-4966-a30b-f60c503f633e\") " pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.840698 5004 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/15cdec0a-5925-4966-a30b-f60c503f633e-host-kubelet\") pod \"ovnkube-node-knmdv\" (UID: \"15cdec0a-5925-4966-a30b-f60c503f633e\") " pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.840749 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/15cdec0a-5925-4966-a30b-f60c503f633e-run-systemd\") pod \"ovnkube-node-knmdv\" (UID: \"15cdec0a-5925-4966-a30b-f60c503f633e\") " pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.840771 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/15cdec0a-5925-4966-a30b-f60c503f633e-run-ovn\") pod \"ovnkube-node-knmdv\" (UID: \"15cdec0a-5925-4966-a30b-f60c503f633e\") " pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.840791 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/15cdec0a-5925-4966-a30b-f60c503f633e-node-log\") pod \"ovnkube-node-knmdv\" (UID: \"15cdec0a-5925-4966-a30b-f60c503f633e\") " pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.840986 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/15cdec0a-5925-4966-a30b-f60c503f633e-ovnkube-config\") pod \"ovnkube-node-knmdv\" (UID: \"15cdec0a-5925-4966-a30b-f60c503f633e\") " pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.841052 5004 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/15cdec0a-5925-4966-a30b-f60c503f633e-ovn-node-metrics-cert\") pod \"ovnkube-node-knmdv\" (UID: \"15cdec0a-5925-4966-a30b-f60c503f633e\") " pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.841087 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/15cdec0a-5925-4966-a30b-f60c503f633e-host-cni-bin\") pod \"ovnkube-node-knmdv\" (UID: \"15cdec0a-5925-4966-a30b-f60c503f633e\") " pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.841148 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/15cdec0a-5925-4966-a30b-f60c503f633e-log-socket\") pod \"ovnkube-node-knmdv\" (UID: \"15cdec0a-5925-4966-a30b-f60c503f633e\") " pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.841180 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/15cdec0a-5925-4966-a30b-f60c503f633e-host-run-ovn-kubernetes\") pod \"ovnkube-node-knmdv\" (UID: \"15cdec0a-5925-4966-a30b-f60c503f633e\") " pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.841318 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/15cdec0a-5925-4966-a30b-f60c503f633e-ovnkube-script-lib\") pod \"ovnkube-node-knmdv\" (UID: \"15cdec0a-5925-4966-a30b-f60c503f633e\") " pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" Dec 01 08:17:23 crc 
kubenswrapper[5004]: I1201 08:17:23.841404 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/15cdec0a-5925-4966-a30b-f60c503f633e-etc-openvswitch\") pod \"ovnkube-node-knmdv\" (UID: \"15cdec0a-5925-4966-a30b-f60c503f633e\") " pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.841439 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f472x\" (UniqueName: \"kubernetes.io/projected/15cdec0a-5925-4966-a30b-f60c503f633e-kube-api-access-f472x\") pod \"ovnkube-node-knmdv\" (UID: \"15cdec0a-5925-4966-a30b-f60c503f633e\") " pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.841528 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/15cdec0a-5925-4966-a30b-f60c503f633e-host-run-netns\") pod \"ovnkube-node-knmdv\" (UID: \"15cdec0a-5925-4966-a30b-f60c503f633e\") " pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.841655 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/15cdec0a-5925-4966-a30b-f60c503f633e-var-lib-openvswitch\") pod \"ovnkube-node-knmdv\" (UID: \"15cdec0a-5925-4966-a30b-f60c503f633e\") " pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.841686 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/15cdec0a-5925-4966-a30b-f60c503f633e-host-cni-netd\") pod \"ovnkube-node-knmdv\" (UID: \"15cdec0a-5925-4966-a30b-f60c503f633e\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.841823 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/15cdec0a-5925-4966-a30b-f60c503f633e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-knmdv\" (UID: \"15cdec0a-5925-4966-a30b-f60c503f633e\") " pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.841857 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/15cdec0a-5925-4966-a30b-f60c503f633e-env-overrides\") pod \"ovnkube-node-knmdv\" (UID: \"15cdec0a-5925-4966-a30b-f60c503f633e\") " pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.841923 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/15cdec0a-5925-4966-a30b-f60c503f633e-run-openvswitch\") pod \"ovnkube-node-knmdv\" (UID: \"15cdec0a-5925-4966-a30b-f60c503f633e\") " pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.849785 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.859130 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0ed2b6d-0c61-4639-bc3b-1c8effc4815d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45a0d7a12e7c114de1740a151e04f7f39844b709740cab958f156ac918af3228\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d69a370af9f8f6c575f99da6bb01cee0f660dccb7e85f16f5d8874d0e263e1fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c82baee0fd931fd7b41cd5a73ca7e653ef13dfa3d573991cbe3238d6b95f6fac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa5f341e99100e985f65c24647d976a24782dfae7f52e752b774f41f69af27fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad581564fc519328ae429bce757797f6d87d8648e862b008637a245866ec4cbe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T08:17:20Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 08:17:15.185125 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 08:17:15.186793 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4071633747/tls.crt::/tmp/serving-cert-4071633747/tls.key\\\\\\\"\\\\nI1201 08:17:20.827614 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 08:17:20.834528 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 08:17:20.834549 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 08:17:20.837611 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 08:17:20.837630 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 08:17:20.848940 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 08:17:20.848957 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 08:17:20.848961 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 08:17:20.848965 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 08:17:20.848968 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 08:17:20.848970 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 08:17:20.848973 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 08:17:20.849115 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 08:17:20.854990 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2caeb2ebaf68110cb8728e60aae6db1b1fc48efae6018f66070766a4f42caa69\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2276a185fefa99f4e9fe63b7acce901ce2eb168f71a65f5fa97c1892aebce0e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2276a185fefa99f4e9fe63b7acce901ce2eb168f71a65f5fa97c1892aebce0e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:23Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.874150 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zjksw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"70e79009-93be-49c4-a6b3-e8a06bcea7f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd8m6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zjksw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:23Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.903843 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15cdec0a-5925-4966-a30b-f60c503f633e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-knmdv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:23Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.922003 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://327477e7db10129d47ddf9078f32d0df2ffd4b8e59fc7bb77c17c60c63e9a936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:23Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.925246 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-zjksw" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.936221 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" Dec 01 08:17:23 crc kubenswrapper[5004]: W1201 08:17:23.939434 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod70e79009_93be_49c4_a6b3_e8a06bcea7f4.slice/crio-8480696114bb43f410cdf6960f9bcce144182a0a738e7dfcd77ba7aa44e1566a WatchSource:0}: Error finding container 8480696114bb43f410cdf6960f9bcce144182a0a738e7dfcd77ba7aa44e1566a: Status 404 returned error can't find the container with id 8480696114bb43f410cdf6960f9bcce144182a0a738e7dfcd77ba7aa44e1566a Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.942370 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/15cdec0a-5925-4966-a30b-f60c503f633e-systemd-units\") pod \"ovnkube-node-knmdv\" (UID: \"15cdec0a-5925-4966-a30b-f60c503f633e\") " pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.942421 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/15cdec0a-5925-4966-a30b-f60c503f633e-host-slash\") pod \"ovnkube-node-knmdv\" (UID: \"15cdec0a-5925-4966-a30b-f60c503f633e\") " pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.942517 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/15cdec0a-5925-4966-a30b-f60c503f633e-host-kubelet\") pod \"ovnkube-node-knmdv\" (UID: \"15cdec0a-5925-4966-a30b-f60c503f633e\") " pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.942633 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/15cdec0a-5925-4966-a30b-f60c503f633e-systemd-units\") pod \"ovnkube-node-knmdv\" (UID: \"15cdec0a-5925-4966-a30b-f60c503f633e\") " pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.942696 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/15cdec0a-5925-4966-a30b-f60c503f633e-host-slash\") pod \"ovnkube-node-knmdv\" (UID: \"15cdec0a-5925-4966-a30b-f60c503f633e\") " pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.942806 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-dpkxw" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.942907 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/15cdec0a-5925-4966-a30b-f60c503f633e-host-kubelet\") pod \"ovnkube-node-knmdv\" (UID: \"15cdec0a-5925-4966-a30b-f60c503f633e\") " pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.945409 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/15cdec0a-5925-4966-a30b-f60c503f633e-node-log\") pod \"ovnkube-node-knmdv\" (UID: \"15cdec0a-5925-4966-a30b-f60c503f633e\") " pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.945506 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/15cdec0a-5925-4966-a30b-f60c503f633e-node-log\") pod \"ovnkube-node-knmdv\" (UID: \"15cdec0a-5925-4966-a30b-f60c503f633e\") " pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.945693 5004 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/15cdec0a-5925-4966-a30b-f60c503f633e-run-systemd\") pod \"ovnkube-node-knmdv\" (UID: \"15cdec0a-5925-4966-a30b-f60c503f633e\") " pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.945781 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/15cdec0a-5925-4966-a30b-f60c503f633e-run-systemd\") pod \"ovnkube-node-knmdv\" (UID: \"15cdec0a-5925-4966-a30b-f60c503f633e\") " pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.945827 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/15cdec0a-5925-4966-a30b-f60c503f633e-run-ovn\") pod \"ovnkube-node-knmdv\" (UID: \"15cdec0a-5925-4966-a30b-f60c503f633e\") " pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.945898 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/15cdec0a-5925-4966-a30b-f60c503f633e-ovnkube-config\") pod \"ovnkube-node-knmdv\" (UID: \"15cdec0a-5925-4966-a30b-f60c503f633e\") " pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.945965 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/15cdec0a-5925-4966-a30b-f60c503f633e-run-ovn\") pod \"ovnkube-node-knmdv\" (UID: \"15cdec0a-5925-4966-a30b-f60c503f633e\") " pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.947634 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/15cdec0a-5925-4966-a30b-f60c503f633e-ovnkube-config\") pod \"ovnkube-node-knmdv\" (UID: \"15cdec0a-5925-4966-a30b-f60c503f633e\") " pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.945929 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/15cdec0a-5925-4966-a30b-f60c503f633e-ovn-node-metrics-cert\") pod \"ovnkube-node-knmdv\" (UID: \"15cdec0a-5925-4966-a30b-f60c503f633e\") " pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.948805 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/15cdec0a-5925-4966-a30b-f60c503f633e-host-cni-bin\") pod \"ovnkube-node-knmdv\" (UID: \"15cdec0a-5925-4966-a30b-f60c503f633e\") " pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.948861 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/15cdec0a-5925-4966-a30b-f60c503f633e-log-socket\") pod \"ovnkube-node-knmdv\" (UID: \"15cdec0a-5925-4966-a30b-f60c503f633e\") " pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.948921 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/15cdec0a-5925-4966-a30b-f60c503f633e-host-cni-bin\") pod \"ovnkube-node-knmdv\" (UID: \"15cdec0a-5925-4966-a30b-f60c503f633e\") " pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.949042 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/15cdec0a-5925-4966-a30b-f60c503f633e-log-socket\") pod \"ovnkube-node-knmdv\" 
(UID: \"15cdec0a-5925-4966-a30b-f60c503f633e\") " pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.949099 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/15cdec0a-5925-4966-a30b-f60c503f633e-host-run-ovn-kubernetes\") pod \"ovnkube-node-knmdv\" (UID: \"15cdec0a-5925-4966-a30b-f60c503f633e\") " pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.949153 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/15cdec0a-5925-4966-a30b-f60c503f633e-ovnkube-script-lib\") pod \"ovnkube-node-knmdv\" (UID: \"15cdec0a-5925-4966-a30b-f60c503f633e\") " pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.949200 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/15cdec0a-5925-4966-a30b-f60c503f633e-etc-openvswitch\") pod \"ovnkube-node-knmdv\" (UID: \"15cdec0a-5925-4966-a30b-f60c503f633e\") " pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.949234 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f472x\" (UniqueName: \"kubernetes.io/projected/15cdec0a-5925-4966-a30b-f60c503f633e-kube-api-access-f472x\") pod \"ovnkube-node-knmdv\" (UID: \"15cdec0a-5925-4966-a30b-f60c503f633e\") " pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.949266 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/15cdec0a-5925-4966-a30b-f60c503f633e-host-cni-netd\") pod \"ovnkube-node-knmdv\" (UID: 
\"15cdec0a-5925-4966-a30b-f60c503f633e\") " pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.949316 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/15cdec0a-5925-4966-a30b-f60c503f633e-host-run-netns\") pod \"ovnkube-node-knmdv\" (UID: \"15cdec0a-5925-4966-a30b-f60c503f633e\") " pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.949350 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/15cdec0a-5925-4966-a30b-f60c503f633e-var-lib-openvswitch\") pod \"ovnkube-node-knmdv\" (UID: \"15cdec0a-5925-4966-a30b-f60c503f633e\") " pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.949382 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/15cdec0a-5925-4966-a30b-f60c503f633e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-knmdv\" (UID: \"15cdec0a-5925-4966-a30b-f60c503f633e\") " pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.949414 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/15cdec0a-5925-4966-a30b-f60c503f633e-env-overrides\") pod \"ovnkube-node-knmdv\" (UID: \"15cdec0a-5925-4966-a30b-f60c503f633e\") " pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.949447 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/15cdec0a-5925-4966-a30b-f60c503f633e-run-openvswitch\") pod \"ovnkube-node-knmdv\" (UID: 
\"15cdec0a-5925-4966-a30b-f60c503f633e\") " pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.949525 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/15cdec0a-5925-4966-a30b-f60c503f633e-run-openvswitch\") pod \"ovnkube-node-knmdv\" (UID: \"15cdec0a-5925-4966-a30b-f60c503f633e\") " pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.949605 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/15cdec0a-5925-4966-a30b-f60c503f633e-host-run-ovn-kubernetes\") pod \"ovnkube-node-knmdv\" (UID: \"15cdec0a-5925-4966-a30b-f60c503f633e\") " pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.950513 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/15cdec0a-5925-4966-a30b-f60c503f633e-host-run-netns\") pod \"ovnkube-node-knmdv\" (UID: \"15cdec0a-5925-4966-a30b-f60c503f633e\") " pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.950666 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/15cdec0a-5925-4966-a30b-f60c503f633e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-knmdv\" (UID: \"15cdec0a-5925-4966-a30b-f60c503f633e\") " pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" Dec 01 08:17:23 crc kubenswrapper[5004]: W1201 08:17:23.950697 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf9977ebb_82de_4e96_8763_0b5a84f8d4ce.slice/crio-14031c61f455c3d3ae01848bda8619cb5cae105edd3cdfb788825646c6ac16f9 WatchSource:0}: 
Error finding container 14031c61f455c3d3ae01848bda8619cb5cae105edd3cdfb788825646c6ac16f9: Status 404 returned error can't find the container with id 14031c61f455c3d3ae01848bda8619cb5cae105edd3cdfb788825646c6ac16f9 Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.950741 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/15cdec0a-5925-4966-a30b-f60c503f633e-host-cni-netd\") pod \"ovnkube-node-knmdv\" (UID: \"15cdec0a-5925-4966-a30b-f60c503f633e\") " pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.950879 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/15cdec0a-5925-4966-a30b-f60c503f633e-var-lib-openvswitch\") pod \"ovnkube-node-knmdv\" (UID: \"15cdec0a-5925-4966-a30b-f60c503f633e\") " pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.950921 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:23Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.951773 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/15cdec0a-5925-4966-a30b-f60c503f633e-env-overrides\") pod \"ovnkube-node-knmdv\" (UID: \"15cdec0a-5925-4966-a30b-f60c503f633e\") " pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.952365 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/15cdec0a-5925-4966-a30b-f60c503f633e-etc-openvswitch\") pod \"ovnkube-node-knmdv\" (UID: \"15cdec0a-5925-4966-a30b-f60c503f633e\") " pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.953916 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/15cdec0a-5925-4966-a30b-f60c503f633e-ovn-node-metrics-cert\") pod \"ovnkube-node-knmdv\" (UID: \"15cdec0a-5925-4966-a30b-f60c503f633e\") " pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.954460 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/15cdec0a-5925-4966-a30b-f60c503f633e-ovnkube-script-lib\") pod \"ovnkube-node-knmdv\" (UID: \"15cdec0a-5925-4966-a30b-f60c503f633e\") " pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.972633 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-jjms6" event={"ID":"8828af41-beeb-47dd-96cf-3dbcb5175893","Type":"ContainerStarted","Data":"7b22220849cdb91592a44a67f9f6a09586146009483f46a0505a2dea3b02bccc"} Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.974424 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" event={"ID":"f9977ebb-82de-4e96-8763-0b5a84f8d4ce","Type":"ContainerStarted","Data":"14031c61f455c3d3ae01848bda8619cb5cae105edd3cdfb788825646c6ac16f9"} Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.977804 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dpkxw" event={"ID":"ddcaf111-708a-45b3-a342-effd3061ab17","Type":"ContainerStarted","Data":"124fa83a43595b80d90735b72973082ca1070eb536f4d31e3e5f3ed407f25c59"} Dec 01 08:17:23 crc kubenswrapper[5004]: 
I1201 08:17:23.979706 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:23Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.980800 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zjksw" event={"ID":"70e79009-93be-49c4-a6b3-e8a06bcea7f4","Type":"ContainerStarted","Data":"8480696114bb43f410cdf6960f9bcce144182a0a738e7dfcd77ba7aa44e1566a"} Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.981042 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f472x\" (UniqueName: \"kubernetes.io/projected/15cdec0a-5925-4966-a30b-f60c503f633e-kube-api-access-f472x\") pod \"ovnkube-node-knmdv\" (UID: \"15cdec0a-5925-4966-a30b-f60c503f633e\") " pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" Dec 01 08:17:23 crc kubenswrapper[5004]: E1201 08:17:23.991331 5004 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-crc\" already exists" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 08:17:23 crc kubenswrapper[5004]: I1201 08:17:23.999484 5004 status_manager.go:875] "Failed to update status for 
pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:23Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:24 crc kubenswrapper[5004]: I1201 08:17:24.020856 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fd2f2b5d7dc3d7b8e1bb06fdf0eec3addfec3c230e044b6395d6cf535630ebd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b2250161e400b201f0528e33231c9007342002a37cd037a1b1e011f2aaaa22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:24Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:24 crc kubenswrapper[5004]: I1201 08:17:24.046970 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43c9f762-d403-49b9-8294-d66976aca4dd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b358f641ff09bc8f4a452b0d4a8cf5f9790182f76779be31426d48d0278b0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d610c3a799d4a7df9ddfe2a28f2ced2e5ac4e96c67c93f8e6d3a0eee60d9ac48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43f9f72af3273da055eeb1e13ad6258a1b3b6b205402861beee9f87f02230297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38e3cc2eca559417c71beed561569c4dc5b45ffb8407976e0695fc1b93219d37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c24fbeedbc7c40b6079b7ad89d8564a4f2cdaad2cf996e66682b082382da190\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00e163cd374befc8b8c6cdfe3916235cd63280474c2a09eeebf108fb75d378a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00e163cd374befc8b8c6cdfe3916235cd63280474c2a09eeebf108fb75d378a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-01T08:17:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37376e093170ad4138e102e3a1e2c2b45b3944bf36b0d77dce126bd493d60803\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37376e093170ad4138e102e3a1e2c2b45b3944bf36b0d77dce126bd493d60803\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e4db8d673550f14f3614e47dc0bf81e0920753be13fb01c7ef0d6bd3716611f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4db8d673550f14f3614e47dc0bf81e0920753be13fb01c7ef0d6bd3716611f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:24Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:24 crc kubenswrapper[5004]: I1201 08:17:24.061341 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:24Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:24 crc kubenswrapper[5004]: I1201 08:17:24.072609 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jjms6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8828af41-beeb-47dd-96cf-3dbcb5175893\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvsch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jjms6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:24Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:24 crc kubenswrapper[5004]: I1201 08:17:24.089329 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" Dec 01 08:17:24 crc kubenswrapper[5004]: I1201 08:17:24.090148 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9977ebb-82de-4e96-8763-0b5a84f8d4ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pwgxq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pwgxq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fvdgt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:24Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:24 crc kubenswrapper[5004]: I1201 08:17:24.109149 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2fa377e-a3a6-46ce-af23-e677799f0115\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b520771cf1ad6f9360064c5f304243c5878a2f1025c87d1001c97b2d5f95cb9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\
\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b65b9bc444101900c62fd90c7d45789d186a89148ac0fd9c7f0c6048c3f55ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edbd3e682d207b8c18d6bcbc26aa2adae2a35645502d31b327ffddd51991e842\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4563974ed1467a87aab104756828e63853c282a61c4fcab1e757eb4da9afaa40\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578
bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:24Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:24 crc kubenswrapper[5004]: W1201 08:17:24.122968 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod15cdec0a_5925_4966_a30b_f60c503f633e.slice/crio-d5212583abfb5ca3c1010406306c1284a177ea12c285e6a737cc7318e7f3ffb8 WatchSource:0}: Error finding container d5212583abfb5ca3c1010406306c1284a177ea12c285e6a737cc7318e7f3ffb8: Status 404 returned error can't find the container with id d5212583abfb5ca3c1010406306c1284a177ea12c285e6a737cc7318e7f3ffb8 Dec 01 08:17:24 crc kubenswrapper[5004]: I1201 08:17:24.169536 5004 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 01 08:17:24 crc kubenswrapper[5004]: I1201 08:17:24.169777 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 01 08:17:24 crc kubenswrapper[5004]: I1201 08:17:24.170404 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dpkxw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddcaf111-708a-45b3-a342-effd3061ab17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dpkxw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:24Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:24 crc kubenswrapper[5004]: I1201 08:17:24.172721 5004 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 08:17:24 crc kubenswrapper[5004]: I1201 08:17:24.176323 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:24 crc kubenswrapper[5004]: I1201 08:17:24.176358 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:24 crc kubenswrapper[5004]: I1201 08:17:24.176366 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:24 crc kubenswrapper[5004]: I1201 08:17:24.176456 5004 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 01 08:17:24 crc kubenswrapper[5004]: I1201 08:17:24.183789 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jjms6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8828af41-beeb-47dd-96cf-3dbcb5175893\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b22220849cdb91592a44a67f9f6a09586146009483f46a0505a2dea3b02bccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvsch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jjms6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:24Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:24 crc kubenswrapper[5004]: I1201 08:17:24.183900 5004 kubelet_node_status.go:115] "Node was previously registered" node="crc" Dec 01 08:17:24 crc kubenswrapper[5004]: I1201 08:17:24.184175 5004 kubelet_node_status.go:79] "Successfully registered node" node="crc" Dec 01 08:17:24 crc kubenswrapper[5004]: I1201 08:17:24.185262 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:24 crc kubenswrapper[5004]: I1201 08:17:24.185296 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:24 crc kubenswrapper[5004]: I1201 08:17:24.185304 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:24 crc kubenswrapper[5004]: I1201 08:17:24.185320 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:24 crc kubenswrapper[5004]: I1201 08:17:24.185388 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:24Z","lastTransitionTime":"2025-12-01T08:17:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:24 crc kubenswrapper[5004]: I1201 08:17:24.194975 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 01 08:17:24 crc kubenswrapper[5004]: I1201 08:17:24.204038 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9977ebb-82de-4e96-8763-0b5a84f8d4ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pwgxq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pwgxq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fvdgt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:24Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:24 crc kubenswrapper[5004]: E1201 08:17:24.204486 5004 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c8341267-df98-484f-a5e7-cf024fab437c\\\",\\\"systemUUID\\\":\\\"77547a65-c048-47ee-89b1-8422fc81b7aa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:24Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:24 crc kubenswrapper[5004]: I1201 08:17:24.208680 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:24 crc kubenswrapper[5004]: I1201 08:17:24.208719 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:24 crc kubenswrapper[5004]: I1201 08:17:24.208731 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:24 crc kubenswrapper[5004]: I1201 08:17:24.208749 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:24 crc kubenswrapper[5004]: I1201 08:17:24.208764 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:24Z","lastTransitionTime":"2025-12-01T08:17:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:24 crc kubenswrapper[5004]: E1201 08:17:24.226397 5004 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c8341267-df98-484f-a5e7-cf024fab437c\\\",\\\"systemUUID\\\":\\\"77547a65-c048-47ee-89b1-8422fc81b7aa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:24Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:24 crc kubenswrapper[5004]: I1201 08:17:24.229200 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:24 crc kubenswrapper[5004]: I1201 08:17:24.229228 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:24 crc kubenswrapper[5004]: I1201 08:17:24.229239 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:24 crc kubenswrapper[5004]: I1201 08:17:24.229256 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:24 crc kubenswrapper[5004]: I1201 08:17:24.229268 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:24Z","lastTransitionTime":"2025-12-01T08:17:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:24 crc kubenswrapper[5004]: I1201 08:17:24.231107 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43c9f762-d403-49b9-8294-d66976aca4dd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b358f641ff09bc8f4a452b0d4a8cf5f9790182f76779be31426d48d0278b0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d610c3a799d4a7df9ddfe2a28f2ced2e5ac4e96c67c93f8e6d3a0eee60d9ac48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43f9f72af3273da055eeb1e13ad6258a1b3b6b205402861beee9f87f02230297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38e3cc2eca559417c71beed561569c4dc5b45ffb8407976e0695fc1b93219d37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c24fbeedbc7c40b6079b7ad89d8564a4f2cdaad2cf996e66682b082382da190\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00e163cd374befc8b8c6cdfe3916235cd63280474c2a09eeebf108fb75d378a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00e163cd374befc8b8c6cdfe3916235cd63280474c2a09eeebf108fb75d378a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37376e093170ad4138e102e3a1e2c2b45b3944bf36b0d77dce126bd493d60803\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37376e093170ad4138e102e3a1e2c2b45b3944bf36b0d77dce126bd493d60803\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e4db8d673550f14f3614e47dc0bf81e0920753be13fb01c7ef0d6bd3716611f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4db8d673550f14f3614e47dc0bf81e0920753be13fb01c7ef0d6bd3716611f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-01T08:17:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:24Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:24 crc kubenswrapper[5004]: E1201 08:17:24.241126 5004 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c8341267-df98-484f-a5e7-cf024fab437c\\\",\\\"systemUUID\\\":\\\"77547a65-c048-47ee-89b1-8422fc81b7aa\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:24Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:24 crc kubenswrapper[5004]: I1201 08:17:24.244073 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:24 crc kubenswrapper[5004]: I1201 08:17:24.244109 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:24 crc kubenswrapper[5004]: I1201 08:17:24.244131 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:24 crc kubenswrapper[5004]: I1201 08:17:24.244148 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:24 crc kubenswrapper[5004]: I1201 08:17:24.244160 5004 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:24Z","lastTransitionTime":"2025-12-01T08:17:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:17:24 crc kubenswrapper[5004]: I1201 08:17:24.250029 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:24Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:24 crc kubenswrapper[5004]: E1201 08:17:24.256832 5004 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c8341267-df98-484f-a5e7-cf024fab437c\\\",\\\"systemUUID\\\":\\\"77547a65-c048-47ee-89b1-8422fc81b7aa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:24Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:24 crc kubenswrapper[5004]: I1201 08:17:24.260227 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:24 crc kubenswrapper[5004]: I1201 08:17:24.260289 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:24 crc kubenswrapper[5004]: I1201 08:17:24.260305 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:24 crc kubenswrapper[5004]: I1201 08:17:24.260327 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:24 crc kubenswrapper[5004]: I1201 08:17:24.260342 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:24Z","lastTransitionTime":"2025-12-01T08:17:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:24 crc kubenswrapper[5004]: I1201 08:17:24.264681 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2fa377e-a3a6-46ce-af23-e677799f0115\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b520771cf1ad6f9360064c5f304243c5878a2f1025c87d1001c97b2d5f95cb9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b65b9bc444
101900c62fd90c7d45789d186a89148ac0fd9c7f0c6048c3f55ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edbd3e682d207b8c18d6bcbc26aa2adae2a35645502d31b327ffddd51991e842\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4563974ed1467a87aab104756828e63853c282a61c4fcab1e757eb4da9afaa40\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:24Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:24 crc kubenswrapper[5004]: E1201 08:17:24.273616 5004 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c8341267-df98-484f-a5e7-cf024fab437c\\\",\\\"systemUUID\\\":\\\"77547a65-c048-47ee-89b1-8422fc81b7aa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:24Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:24 crc kubenswrapper[5004]: E1201 08:17:24.273736 5004 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 01 08:17:24 crc kubenswrapper[5004]: I1201 08:17:24.276657 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:24 crc kubenswrapper[5004]: I1201 08:17:24.276934 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:24 crc kubenswrapper[5004]: I1201 08:17:24.277058 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:24 crc kubenswrapper[5004]: I1201 08:17:24.277196 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:24 crc kubenswrapper[5004]: I1201 08:17:24.277291 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:24Z","lastTransitionTime":"2025-12-01T08:17:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:24 crc kubenswrapper[5004]: I1201 08:17:24.286500 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dpkxw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddcaf111-708a-45b3-a342-effd3061ab17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dpkxw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:24Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:24 crc kubenswrapper[5004]: I1201 08:17:24.333733 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0ed2b6d-0c61-4639-bc3b-1c8effc4815d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45a0d7a12e7c114de1740a151e04f7f39844b709740cab958f156ac918af3228\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d69a370af9f8f6c575f99da6bb01cee0f660dccb7e85f16f5d8874d0e263e1fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c82baee0fd931fd7b41cd5a73ca7e653ef13dfa3d573991cbe3238d6b95f6fac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa5f341e99100e985f65c24647d976a24782dfae7f52e752b774f41f69af27fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad581564fc519328ae429bce757797f6d87d8648e862b008637a245866ec4cbe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T08:17:20Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 08:17:15.185125 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 08:17:15.186793 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-4071633747/tls.crt::/tmp/serving-cert-4071633747/tls.key\\\\\\\"\\\\nI1201 08:17:20.827614 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 08:17:20.834528 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 08:17:20.834549 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 08:17:20.837611 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 08:17:20.837630 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 08:17:20.848940 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 08:17:20.848957 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 08:17:20.848961 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 08:17:20.848965 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 08:17:20.848968 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 08:17:20.848970 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 08:17:20.848973 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 08:17:20.849115 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 08:17:20.854990 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2caeb2ebaf68110cb8728e60aae6db1b1fc48efae6018f66070766a4f42caa69\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2276a185fefa99f4e9fe63b7acce901ce2eb168f71a65f5fa97c1892aebce0e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2276a185fefa99f4e9fe63b7acce901ce2eb168f71a65f5fa97c1892aebce0e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:24Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:24 crc kubenswrapper[5004]: I1201 08:17:24.367879 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:24Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:24 crc kubenswrapper[5004]: I1201 08:17:24.379646 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:24 crc kubenswrapper[5004]: I1201 08:17:24.379699 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:24 crc kubenswrapper[5004]: I1201 08:17:24.379714 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:24 crc 
kubenswrapper[5004]: I1201 08:17:24.379734 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:24 crc kubenswrapper[5004]: I1201 08:17:24.379746 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:24Z","lastTransitionTime":"2025-12-01T08:17:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:17:24 crc kubenswrapper[5004]: I1201 08:17:24.405584 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:24Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:24 crc kubenswrapper[5004]: I1201 08:17:24.443622 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:24Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:24 crc kubenswrapper[5004]: I1201 08:17:24.487658 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fd2f2b5d7dc3d7b8e1bb06fdf0eec3addfec3c230e044b6395d6cf535630ebd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b2250161e400b201f0528e33231c9007342002a37cd037a1b1e011f2aaaa22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:24Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:24 crc kubenswrapper[5004]: I1201 08:17:24.488643 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:24 crc kubenswrapper[5004]: I1201 08:17:24.488705 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:24 crc kubenswrapper[5004]: I1201 08:17:24.488716 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:24 crc kubenswrapper[5004]: I1201 08:17:24.488736 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:24 crc kubenswrapper[5004]: I1201 08:17:24.488748 5004 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:24Z","lastTransitionTime":"2025-12-01T08:17:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:17:24 crc kubenswrapper[5004]: I1201 08:17:24.524664 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zjksw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70e79009-93be-49c4-a6b3-e8a06bcea7f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd8m6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zjksw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:24Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:24 crc kubenswrapper[5004]: I1201 08:17:24.567708 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15cdec0a-5925-4966-a30b-f60c503f633e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-knmdv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:24Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:24 crc kubenswrapper[5004]: I1201 08:17:24.592903 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:24 crc kubenswrapper[5004]: I1201 08:17:24.592961 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:24 crc kubenswrapper[5004]: I1201 08:17:24.592976 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:24 crc kubenswrapper[5004]: I1201 08:17:24.592997 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:24 crc kubenswrapper[5004]: I1201 08:17:24.593011 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:24Z","lastTransitionTime":"2025-12-01T08:17:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:24 crc kubenswrapper[5004]: I1201 08:17:24.602728 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://327477e7db10129d47ddf9078f32d0df2ffd4b8e59fc7bb77c17c60c63e9a936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:24Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:24 crc kubenswrapper[5004]: I1201 08:17:24.695665 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:24 crc kubenswrapper[5004]: I1201 08:17:24.695723 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:24 crc kubenswrapper[5004]: I1201 08:17:24.695739 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:24 crc kubenswrapper[5004]: I1201 08:17:24.695763 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:24 crc kubenswrapper[5004]: I1201 08:17:24.695780 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:24Z","lastTransitionTime":"2025-12-01T08:17:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:24 crc kubenswrapper[5004]: I1201 08:17:24.798729 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:24 crc kubenswrapper[5004]: I1201 08:17:24.799171 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:24 crc kubenswrapper[5004]: I1201 08:17:24.799243 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:24 crc kubenswrapper[5004]: I1201 08:17:24.799325 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:24 crc kubenswrapper[5004]: I1201 08:17:24.799399 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:24Z","lastTransitionTime":"2025-12-01T08:17:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:24 crc kubenswrapper[5004]: I1201 08:17:24.903709 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:24 crc kubenswrapper[5004]: I1201 08:17:24.903761 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:24 crc kubenswrapper[5004]: I1201 08:17:24.903779 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:24 crc kubenswrapper[5004]: I1201 08:17:24.903801 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:24 crc kubenswrapper[5004]: I1201 08:17:24.903818 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:24Z","lastTransitionTime":"2025-12-01T08:17:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:24 crc kubenswrapper[5004]: I1201 08:17:24.986720 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zjksw" event={"ID":"70e79009-93be-49c4-a6b3-e8a06bcea7f4","Type":"ContainerStarted","Data":"ad3ea204ad731500a340e639f0e36abe28ba50464f1b2ff9b009e39c234cf708"} Dec 01 08:17:24 crc kubenswrapper[5004]: I1201 08:17:24.988492 5004 generic.go:334] "Generic (PLEG): container finished" podID="15cdec0a-5925-4966-a30b-f60c503f633e" containerID="97c9fab31c2655b5d579a2ed9b837229ee678973643c6a526c1241e3c84e315c" exitCode=0 Dec 01 08:17:24 crc kubenswrapper[5004]: I1201 08:17:24.988528 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" event={"ID":"15cdec0a-5925-4966-a30b-f60c503f633e","Type":"ContainerDied","Data":"97c9fab31c2655b5d579a2ed9b837229ee678973643c6a526c1241e3c84e315c"} Dec 01 08:17:24 crc kubenswrapper[5004]: I1201 08:17:24.988629 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" event={"ID":"15cdec0a-5925-4966-a30b-f60c503f633e","Type":"ContainerStarted","Data":"d5212583abfb5ca3c1010406306c1284a177ea12c285e6a737cc7318e7f3ffb8"} Dec 01 08:17:24 crc kubenswrapper[5004]: I1201 08:17:24.993079 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" event={"ID":"f9977ebb-82de-4e96-8763-0b5a84f8d4ce","Type":"ContainerStarted","Data":"baf84c2eea7167eed95a4195d37bf784ffef310bc2079d8d06e1f47cb45a7864"} Dec 01 08:17:24 crc kubenswrapper[5004]: I1201 08:17:24.993130 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" event={"ID":"f9977ebb-82de-4e96-8763-0b5a84f8d4ce","Type":"ContainerStarted","Data":"6c1e51fa92aec80e93d0fce11c743b8c4fde23fc23d8626e8d5b3e25ab42350d"} Dec 01 08:17:24 crc kubenswrapper[5004]: I1201 08:17:24.995194 5004 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"03c470edd984db3b756e031e33f64a9bc5ca6b9c156ab479e5cd647745655a5e"} Dec 01 08:17:24 crc kubenswrapper[5004]: I1201 08:17:24.997667 5004 generic.go:334] "Generic (PLEG): container finished" podID="ddcaf111-708a-45b3-a342-effd3061ab17" containerID="cd2f03a8020818d1cc98b6eeaf64a728e627a3401400c3af53654f8771ef02eb" exitCode=0 Dec 01 08:17:24 crc kubenswrapper[5004]: I1201 08:17:24.997749 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dpkxw" event={"ID":"ddcaf111-708a-45b3-a342-effd3061ab17","Type":"ContainerDied","Data":"cd2f03a8020818d1cc98b6eeaf64a728e627a3401400c3af53654f8771ef02eb"} Dec 01 08:17:25 crc kubenswrapper[5004]: I1201 08:17:25.009753 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:25Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:25 crc kubenswrapper[5004]: I1201 08:17:25.010513 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:25 crc kubenswrapper[5004]: I1201 08:17:25.010666 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:25 crc kubenswrapper[5004]: I1201 
08:17:25.010683 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:25 crc kubenswrapper[5004]: I1201 08:17:25.017038 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:25 crc kubenswrapper[5004]: I1201 08:17:25.017059 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:25Z","lastTransitionTime":"2025-12-01T08:17:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:17:25 crc kubenswrapper[5004]: I1201 08:17:25.042763 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fd2f2b5d7dc3d7b8e1bb06fdf0eec3addfec3c230e044b6395d6cf535630ebd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b2250161e400b201f0528e33231c9007342002a37cd037a1b1e011f2aaaa22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:25Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:25 crc kubenswrapper[5004]: I1201 08:17:25.065960 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zjksw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"70e79009-93be-49c4-a6b3-e8a06bcea7f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad3ea204ad731500a340e639f0e36abe28ba50464f1b2ff9b009e39c234cf708\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd8m6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zjksw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:25Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:25 crc kubenswrapper[5004]: I1201 08:17:25.097088 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15cdec0a-5925-4966-a30b-f60c503f633e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-knmdv\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:25Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:25 crc kubenswrapper[5004]: I1201 08:17:25.120120 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:25 crc kubenswrapper[5004]: I1201 08:17:25.120286 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:25 crc kubenswrapper[5004]: I1201 08:17:25.120340 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:25 crc kubenswrapper[5004]: I1201 08:17:25.120431 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:25 crc kubenswrapper[5004]: I1201 08:17:25.120485 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:25Z","lastTransitionTime":"2025-12-01T08:17:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:25 crc kubenswrapper[5004]: I1201 08:17:25.135624 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://327477e7db10129d47ddf9078f32d0df2ffd4b8e59fc7bb77c17c60c63e9a936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:25Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:25 crc kubenswrapper[5004]: I1201 08:17:25.173962 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:25Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:25 crc kubenswrapper[5004]: I1201 08:17:25.190861 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:25Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:25 crc kubenswrapper[5004]: I1201 08:17:25.223124 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:25 crc kubenswrapper[5004]: I1201 08:17:25.223153 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:25 crc kubenswrapper[5004]: I1201 08:17:25.223161 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:25 crc kubenswrapper[5004]: I1201 
08:17:25.223174 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:25 crc kubenswrapper[5004]: I1201 08:17:25.223183 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:25Z","lastTransitionTime":"2025-12-01T08:17:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:17:25 crc kubenswrapper[5004]: I1201 08:17:25.223630 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43c9f762-d403-49b9-8294-d66976aca4dd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b358f641ff09bc8f4a452b0d4a8cf5f9790182f76779be31426d48d0278b0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d610c3a799d4a7df9ddfe2a28f2ced2e5ac4e96c67c93f8e6d3a0eee60d9ac48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43f9f72af3273da055eeb1e13ad6258a1b3b6b205402861beee9f87f02230297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\
"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38e3cc2eca559417c71beed561569c4dc5b45ffb8407976e0695fc1b93219d37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c24fbeedbc7c40b6079b7ad89d8564a4f2cdaad2cf996e66682b082382da190\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-
dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00e163cd374befc8b8c6cdfe3916235cd63280474c2a09eeebf108fb75d378a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00e163cd374befc8b8c6cdfe3916235cd63280474c2a09eeebf108fb75d378a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37376e093170ad4138e102e3a1e2c2b45b3944bf36b0d77dce126bd493d60803\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37376e093170ad4138e102e3a1e2c2b45b3944bf36b0d77dce126bd493d60803\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e4db8d673550f14f3614e47dc0bf81e0920753be13fb01c7ef0d6bd3716611f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ec
d6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4db8d673550f14f3614e47dc0bf81e0920753be13fb01c7ef0d6bd3716611f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:25Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:25 crc kubenswrapper[5004]: I1201 08:17:25.233775 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:25Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:25 crc kubenswrapper[5004]: I1201 08:17:25.242061 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jjms6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8828af41-beeb-47dd-96cf-3dbcb5175893\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b22220849cdb91592a44a67f9f6a09586146009483f46a0505a2dea3b02bccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvsch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jjms6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:25Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:25 crc kubenswrapper[5004]: I1201 08:17:25.250616 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9977ebb-82de-4e96-8763-0b5a84f8d4ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pwgxq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pwgxq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fvdgt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:25Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:25 crc kubenswrapper[5004]: I1201 08:17:25.260992 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2fa377e-a3a6-46ce-af23-e677799f0115\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b520771cf1ad6f9360064c5f304243c5878a2f1025c87d1001c97b2d5f95cb9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\
\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b65b9bc444101900c62fd90c7d45789d186a89148ac0fd9c7f0c6048c3f55ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edbd3e682d207b8c18d6bcbc26aa2adae2a35645502d31b327ffddd51991e842\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4563974ed1467a87aab104756828e63853c282a61c4fcab1e757eb4da9afaa40\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578
bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:25Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:25 crc kubenswrapper[5004]: I1201 08:17:25.271760 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dpkxw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddcaf111-708a-45b3-a342-effd3061ab17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dpkxw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:25Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:25 crc kubenswrapper[5004]: I1201 08:17:25.290312 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0ed2b6d-0c61-4639-bc3b-1c8effc4815d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45a0d7a12e7c114de1740a151e04f7f39844b709740cab958f156ac918af3228\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d69a370af9f8f6c575f99da6bb01cee0f660dccb7e85f16f5d8874d0e263e1fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c82baee0fd931fd7b41cd5a73ca7e653ef13dfa3d573991cbe3238d6b95f6fac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa5f341e99100e985f65c24647d976a24782dfae7f52e752b774f41f69af27fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad581564fc519328ae429bce757797f6d87d8648e862b008637a245866ec4cbe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T08:17:20Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 08:17:15.185125 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 08:17:15.186793 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-4071633747/tls.crt::/tmp/serving-cert-4071633747/tls.key\\\\\\\"\\\\nI1201 08:17:20.827614 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 08:17:20.834528 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 08:17:20.834549 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 08:17:20.837611 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 08:17:20.837630 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 08:17:20.848940 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 08:17:20.848957 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 08:17:20.848961 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 08:17:20.848965 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 08:17:20.848968 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 08:17:20.848970 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 08:17:20.848973 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 08:17:20.849115 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 08:17:20.854990 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2caeb2ebaf68110cb8728e60aae6db1b1fc48efae6018f66070766a4f42caa69\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2276a185fefa99f4e9fe63b7acce901ce2eb168f71a65f5fa97c1892aebce0e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2276a185fefa99f4e9fe63b7acce901ce2eb168f71a65f5fa97c1892aebce0e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:25Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:25 crc kubenswrapper[5004]: I1201 08:17:25.309135 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0ed2b6d-0c61-4639-bc3b-1c8effc4815d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45a0d7a12e7c114de1740a151e04f7f39844b709740cab958f156ac918af3228\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d69a370af9f8f6c575f99da6bb01cee0f660dccb7e85f16f5d8874d0e263e1fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c82baee0fd931fd7b41cd5a73ca7e653ef13dfa3d573991cbe3238d6b95f6fac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa5f341e99100e985f65c24647d976a24782dfae7f52e752b774f41f69af27fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad581564fc519328ae429bce757797f6d87d8648e862b008637a245866ec4cbe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T08:17:20Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 08:17:15.185125 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 08:17:15.186793 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-4071633747/tls.crt::/tmp/serving-cert-4071633747/tls.key\\\\\\\"\\\\nI1201 08:17:20.827614 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 08:17:20.834528 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 08:17:20.834549 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 08:17:20.837611 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 08:17:20.837630 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 08:17:20.848940 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 08:17:20.848957 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 08:17:20.848961 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 08:17:20.848965 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 08:17:20.848968 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 08:17:20.848970 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 08:17:20.848973 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 08:17:20.849115 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 08:17:20.854990 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2caeb2ebaf68110cb8728e60aae6db1b1fc48efae6018f66070766a4f42caa69\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2276a185fefa99f4e9fe63b7acce901ce2eb168f71a65f5fa97c1892aebce0e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2276a185fefa99f4e9fe63b7acce901ce2eb168f71a65f5fa97c1892aebce0e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:25Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:25 crc kubenswrapper[5004]: I1201 08:17:25.322484 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fd2f2b5d7dc3d7b8e1bb06fdf0eec3addfec3c230e044b6395d6cf535630ebd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b2250161e400b201f0528e33231c9007342002a37cd037a1b1e011f2aaaa22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:25Z is after 2025-08-24T17:21:41Z" Dec 01 
08:17:25 crc kubenswrapper[5004]: I1201 08:17:25.325161 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:25 crc kubenswrapper[5004]: I1201 08:17:25.325463 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:25 crc kubenswrapper[5004]: I1201 08:17:25.325477 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:25 crc kubenswrapper[5004]: I1201 08:17:25.325495 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:25 crc kubenswrapper[5004]: I1201 08:17:25.325507 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:25Z","lastTransitionTime":"2025-12-01T08:17:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:25 crc kubenswrapper[5004]: I1201 08:17:25.334866 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zjksw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70e79009-93be-49c4-a6b3-e8a06bcea7f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad3ea204ad731500a340e639f0e36abe28ba50464f1b2ff9b009e39c234cf708\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd8m6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zjksw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:25Z 
is after 2025-08-24T17:21:41Z" Dec 01 08:17:25 crc kubenswrapper[5004]: I1201 08:17:25.359036 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15cdec0a-5925-4966-a30b-f60c503f633e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97c9fab31c2655b5d579a2ed9b837229ee678973643c6a526c1241e3c84e315c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97c9fab31c2655b5d579a2ed9b837229ee678973643c6a526c1241e3c84e315c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-knmdv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:25Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:25 crc kubenswrapper[5004]: I1201 08:17:25.372546 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://327477e7db10129d47ddf9078f32d0df2ffd4b8e59fc7bb77c17c60c63e9a936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:25Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:25 crc kubenswrapper[5004]: I1201 08:17:25.406398 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:25Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:25 crc kubenswrapper[5004]: I1201 08:17:25.428048 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:25 crc kubenswrapper[5004]: I1201 08:17:25.428087 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:25 crc kubenswrapper[5004]: I1201 08:17:25.428099 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:25 crc 
kubenswrapper[5004]: I1201 08:17:25.428116 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:25 crc kubenswrapper[5004]: I1201 08:17:25.428129 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:25Z","lastTransitionTime":"2025-12-01T08:17:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:17:25 crc kubenswrapper[5004]: I1201 08:17:25.447388 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:25Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:25 crc kubenswrapper[5004]: I1201 08:17:25.468254 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 08:17:25 crc kubenswrapper[5004]: I1201 08:17:25.468397 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" 
(UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 08:17:25 crc kubenswrapper[5004]: I1201 08:17:25.468450 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 08:17:25 crc kubenswrapper[5004]: I1201 08:17:25.468492 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 08:17:25 crc kubenswrapper[5004]: I1201 08:17:25.468551 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 08:17:25 crc kubenswrapper[5004]: E1201 08:17:25.468715 5004 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 08:17:25 crc kubenswrapper[5004]: E1201 08:17:25.468785 5004 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 08:17:29.468764356 +0000 UTC m=+27.033756368 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 08:17:25 crc kubenswrapper[5004]: E1201 08:17:25.468871 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 08:17:29.468859359 +0000 UTC m=+27.033851371 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 08:17:25 crc kubenswrapper[5004]: E1201 08:17:25.468964 5004 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 08:17:25 crc kubenswrapper[5004]: E1201 08:17:25.468990 5004 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 08:17:25 crc kubenswrapper[5004]: E1201 08:17:25.469008 5004 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 08:17:25 crc kubenswrapper[5004]: E1201 08:17:25.469046 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-01 08:17:29.469034433 +0000 UTC m=+27.034026455 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 08:17:25 crc kubenswrapper[5004]: E1201 08:17:25.469097 5004 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 08:17:25 crc kubenswrapper[5004]: E1201 08:17:25.469133 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 08:17:29.469122306 +0000 UTC m=+27.034114328 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 08:17:25 crc kubenswrapper[5004]: E1201 08:17:25.469202 5004 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 08:17:25 crc kubenswrapper[5004]: E1201 08:17:25.469227 5004 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 08:17:25 crc kubenswrapper[5004]: E1201 08:17:25.469241 5004 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 08:17:25 crc kubenswrapper[5004]: E1201 08:17:25.469277 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-01 08:17:29.46926522 +0000 UTC m=+27.034257242 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 08:17:25 crc kubenswrapper[5004]: I1201 08:17:25.485752 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03c470edd984db3b756e031e33f64a9bc5ca6b9c156ab479e5cd647745655a5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\
\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:25Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:25 crc kubenswrapper[5004]: I1201 08:17:25.529605 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:25 crc kubenswrapper[5004]: I1201 08:17:25.529642 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:25 crc kubenswrapper[5004]: I1201 08:17:25.529653 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:25 crc kubenswrapper[5004]: I1201 08:17:25.529667 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:25 crc kubenswrapper[5004]: I1201 08:17:25.529678 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:25Z","lastTransitionTime":"2025-12-01T08:17:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:25 crc kubenswrapper[5004]: I1201 08:17:25.537336 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43c9f762-d403-49b9-8294-d66976aca4dd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b358f641ff09bc8f4a452b0d4a8cf5f9790182f76779be31426d48d0278b0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d610c3a799d4a7df9ddfe2a28f2ced2e5ac4e96c67c93f8e6d3a0eee60d9ac48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43f9f72af3273da055eeb1e13ad6258a1b3b6b205402861beee9f87f02230297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38e3cc2eca559417c71beed561569c4dc5b45ffb8407976e0695fc1b93219d37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c24fbeedbc7c40b6079b7ad89d8564a4f2cdaad2cf996e66682b082382da190\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00e163cd374befc8b8c6cdfe3916235cd63280474c2a09eeebf108fb75d378a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00e163cd374befc8b8c6cdfe3916235cd63280474c2a09eeebf108fb75d378a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37376e093170ad4138e102e3a1e2c2b45b3944bf36b0d77dce126bd493d60803\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37376e093170ad4138e102e3a1e2c2b45b3944bf36b0d77dce126bd493d60803\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e4db8d673550f14f3614e47dc0bf81e0920753be13fb01c7ef0d6bd3716611f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4db8d673550f14f3614e47dc0bf81e0920753be13fb01c7ef0d6bd3716611f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-01T08:17:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:25Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:25 crc kubenswrapper[5004]: I1201 08:17:25.566255 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:25Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:25 crc kubenswrapper[5004]: I1201 08:17:25.604331 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jjms6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8828af41-beeb-47dd-96cf-3dbcb5175893\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b22220849cdb91592a44a67f9f6a09586146009483f46a0505a2dea3b02bccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvsch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jjms6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:25Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:25 crc kubenswrapper[5004]: I1201 08:17:25.631922 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:25 crc kubenswrapper[5004]: I1201 08:17:25.631961 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:25 crc kubenswrapper[5004]: I1201 08:17:25.631972 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:25 crc kubenswrapper[5004]: I1201 08:17:25.631987 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:25 crc kubenswrapper[5004]: I1201 08:17:25.631999 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:25Z","lastTransitionTime":"2025-12-01T08:17:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:25 crc kubenswrapper[5004]: I1201 08:17:25.643438 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9977ebb-82de-4e96-8763-0b5a84f8d4ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://baf84c2eea7167eed95a4195d37bf784ffef310bc2079d8d06e1f47cb45a7864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pwgxq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c1e51fa92aec80e93d0fce11c743b8c4fde23fc23d8626e8d5b3e25ab42350d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pwgxq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fvdgt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:25Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:25 crc kubenswrapper[5004]: I1201 08:17:25.666706 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-ww6lq"] Dec 01 08:17:25 crc kubenswrapper[5004]: I1201 08:17:25.667051 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-ww6lq" Dec 01 08:17:25 crc kubenswrapper[5004]: I1201 08:17:25.692053 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2fa377e-a3a6-46ce-af23-e677799f0115\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b520771cf1ad6f9360064c5f304243c5878a2f1025c87d1001c97b2d5f95cb9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"contai
nerID\\\":\\\"cri-o://9b65b9bc444101900c62fd90c7d45789d186a89148ac0fd9c7f0c6048c3f55ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edbd3e682d207b8c18d6bcbc26aa2adae2a35645502d31b327ffddd51991e842\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4563974ed1467a87aab104756828e63853c282a61c4fcab1e757eb4da9afaa40\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-contr
oller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:25Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:25 crc kubenswrapper[5004]: I1201 08:17:25.694625 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 01 08:17:25 crc kubenswrapper[5004]: I1201 08:17:25.714519 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 01 08:17:25 crc kubenswrapper[5004]: I1201 08:17:25.735907 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:25 crc kubenswrapper[5004]: I1201 08:17:25.735968 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:25 crc kubenswrapper[5004]: I1201 08:17:25.735985 5004 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:25 crc kubenswrapper[5004]: I1201 08:17:25.736009 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:25 crc kubenswrapper[5004]: I1201 08:17:25.736026 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:25Z","lastTransitionTime":"2025-12-01T08:17:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:17:25 crc kubenswrapper[5004]: I1201 08:17:25.736144 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 01 08:17:25 crc kubenswrapper[5004]: I1201 08:17:25.755065 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 01 08:17:25 crc kubenswrapper[5004]: I1201 08:17:25.758394 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 08:17:25 crc kubenswrapper[5004]: I1201 08:17:25.758395 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 08:17:25 crc kubenswrapper[5004]: E1201 08:17:25.758552 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 08:17:25 crc kubenswrapper[5004]: I1201 08:17:25.758410 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 08:17:25 crc kubenswrapper[5004]: E1201 08:17:25.758741 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 08:17:25 crc kubenswrapper[5004]: E1201 08:17:25.758820 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 08:17:25 crc kubenswrapper[5004]: I1201 08:17:25.771884 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/72500ffd-4ca3-4614-a3a2-bbdc5a7506c5-host\") pod \"node-ca-ww6lq\" (UID: \"72500ffd-4ca3-4614-a3a2-bbdc5a7506c5\") " pod="openshift-image-registry/node-ca-ww6lq" Dec 01 08:17:25 crc kubenswrapper[5004]: I1201 08:17:25.771941 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmsvt\" (UniqueName: \"kubernetes.io/projected/72500ffd-4ca3-4614-a3a2-bbdc5a7506c5-kube-api-access-gmsvt\") pod \"node-ca-ww6lq\" (UID: \"72500ffd-4ca3-4614-a3a2-bbdc5a7506c5\") " pod="openshift-image-registry/node-ca-ww6lq" Dec 01 08:17:25 crc kubenswrapper[5004]: I1201 08:17:25.772022 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/72500ffd-4ca3-4614-a3a2-bbdc5a7506c5-serviceca\") pod \"node-ca-ww6lq\" (UID: \"72500ffd-4ca3-4614-a3a2-bbdc5a7506c5\") " pod="openshift-image-registry/node-ca-ww6lq" Dec 01 08:17:25 crc kubenswrapper[5004]: I1201 08:17:25.808369 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dpkxw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddcaf111-708a-45b3-a342-effd3061ab17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd2f03a8020818d1cc98b6eeaf64a728e627a3401400c3af53654f8771ef02eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd2f03a8020818d1cc98b6eeaf64a728e627a3401400c3af53654f8771ef02eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dpkxw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:25Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:25 crc kubenswrapper[5004]: I1201 08:17:25.838339 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:25 crc kubenswrapper[5004]: I1201 08:17:25.838381 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:25 crc kubenswrapper[5004]: I1201 08:17:25.838394 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:25 crc kubenswrapper[5004]: I1201 08:17:25.838414 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:25 crc kubenswrapper[5004]: I1201 08:17:25.838426 5004 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:25Z","lastTransitionTime":"2025-12-01T08:17:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:17:25 crc kubenswrapper[5004]: I1201 08:17:25.847080 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zjksw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70e79009-93be-49c4-a6b3-e8a06bcea7f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad3ea204ad731500a340e639f0e36abe28ba50464f1b2ff9b009e39c234cf708\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd8m6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:
17:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zjksw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:25Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:25 crc kubenswrapper[5004]: I1201 08:17:25.872617 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/72500ffd-4ca3-4614-a3a2-bbdc5a7506c5-host\") pod \"node-ca-ww6lq\" (UID: \"72500ffd-4ca3-4614-a3a2-bbdc5a7506c5\") " pod="openshift-image-registry/node-ca-ww6lq" Dec 01 08:17:25 crc kubenswrapper[5004]: I1201 08:17:25.872671 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gmsvt\" (UniqueName: \"kubernetes.io/projected/72500ffd-4ca3-4614-a3a2-bbdc5a7506c5-kube-api-access-gmsvt\") pod \"node-ca-ww6lq\" (UID: \"72500ffd-4ca3-4614-a3a2-bbdc5a7506c5\") " pod="openshift-image-registry/node-ca-ww6lq" Dec 01 08:17:25 crc kubenswrapper[5004]: I1201 08:17:25.872693 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/72500ffd-4ca3-4614-a3a2-bbdc5a7506c5-serviceca\") pod \"node-ca-ww6lq\" (UID: \"72500ffd-4ca3-4614-a3a2-bbdc5a7506c5\") " pod="openshift-image-registry/node-ca-ww6lq" Dec 01 08:17:25 crc kubenswrapper[5004]: I1201 08:17:25.872891 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/72500ffd-4ca3-4614-a3a2-bbdc5a7506c5-host\") pod \"node-ca-ww6lq\" (UID: \"72500ffd-4ca3-4614-a3a2-bbdc5a7506c5\") " pod="openshift-image-registry/node-ca-ww6lq" Dec 01 08:17:25 crc kubenswrapper[5004]: I1201 08:17:25.873528 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/72500ffd-4ca3-4614-a3a2-bbdc5a7506c5-serviceca\") pod \"node-ca-ww6lq\" (UID: \"72500ffd-4ca3-4614-a3a2-bbdc5a7506c5\") " pod="openshift-image-registry/node-ca-ww6lq" Dec 01 08:17:25 crc kubenswrapper[5004]: I1201 08:17:25.899766 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15cdec0a-5925-4966-a30b-f60c503f633e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97c9fab31c2655b5d579a2ed9b837229ee678973643c6a526c1241e3c84e315c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97c9fab31c2655b5d579a2ed9b837229ee678973643c6a526c1241e3c84e315c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-knmdv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:25Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:25 crc kubenswrapper[5004]: I1201 08:17:25.917880 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmsvt\" (UniqueName: \"kubernetes.io/projected/72500ffd-4ca3-4614-a3a2-bbdc5a7506c5-kube-api-access-gmsvt\") pod \"node-ca-ww6lq\" (UID: \"72500ffd-4ca3-4614-a3a2-bbdc5a7506c5\") " pod="openshift-image-registry/node-ca-ww6lq" Dec 01 08:17:25 crc kubenswrapper[5004]: I1201 08:17:25.940397 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:25 crc kubenswrapper[5004]: I1201 08:17:25.940430 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:25 crc kubenswrapper[5004]: I1201 08:17:25.940443 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:25 crc kubenswrapper[5004]: I1201 08:17:25.940460 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:25 crc kubenswrapper[5004]: I1201 08:17:25.940473 5004 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:25Z","lastTransitionTime":"2025-12-01T08:17:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:17:25 crc kubenswrapper[5004]: I1201 08:17:25.945661 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://327477e7db10129d47ddf9078f32d0df2ffd4b8e59fc7bb77c17c60c63e9a936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:25Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:25 crc kubenswrapper[5004]: I1201 08:17:25.980550 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-ww6lq" Dec 01 08:17:25 crc kubenswrapper[5004]: I1201 08:17:25.985801 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:25Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:25 crc kubenswrapper[5004]: W1201 08:17:25.993557 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod72500ffd_4ca3_4614_a3a2_bbdc5a7506c5.slice/crio-964cb03f40ae6be63437d5187c9f14083ed378b2740bb01a9419a59725cd46e3 WatchSource:0}: Error finding container 964cb03f40ae6be63437d5187c9f14083ed378b2740bb01a9419a59725cd46e3: Status 404 returned error can't find the container with id 964cb03f40ae6be63437d5187c9f14083ed378b2740bb01a9419a59725cd46e3 Dec 01 
08:17:26 crc kubenswrapper[5004]: I1201 08:17:26.009963 5004 generic.go:334] "Generic (PLEG): container finished" podID="ddcaf111-708a-45b3-a342-effd3061ab17" containerID="b8dd31c10eaf6bc673dec377a884040524418dd1891a96b2eb6e6e983979512a" exitCode=0 Dec 01 08:17:26 crc kubenswrapper[5004]: I1201 08:17:26.010076 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dpkxw" event={"ID":"ddcaf111-708a-45b3-a342-effd3061ab17","Type":"ContainerDied","Data":"b8dd31c10eaf6bc673dec377a884040524418dd1891a96b2eb6e6e983979512a"} Dec 01 08:17:26 crc kubenswrapper[5004]: I1201 08:17:26.016423 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" event={"ID":"15cdec0a-5925-4966-a30b-f60c503f633e","Type":"ContainerStarted","Data":"e7c5c043438b515e37a6d6d5c47b92479bbb7322fe35c3a1bc57ee7b3a746544"} Dec 01 08:17:26 crc kubenswrapper[5004]: I1201 08:17:26.016472 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" event={"ID":"15cdec0a-5925-4966-a30b-f60c503f633e","Type":"ContainerStarted","Data":"23df33e45cc27e4d343ae6ec3fb88f2dfc125ba7da424515d4bee5ec19281832"} Dec 01 08:17:26 crc kubenswrapper[5004]: I1201 08:17:26.016493 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" event={"ID":"15cdec0a-5925-4966-a30b-f60c503f633e","Type":"ContainerStarted","Data":"f001b70a83da28b1ea78a94f1cd70dfc63b3de5a21b041932d33c2ac970c986b"} Dec 01 08:17:26 crc kubenswrapper[5004]: I1201 08:17:26.016507 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" event={"ID":"15cdec0a-5925-4966-a30b-f60c503f633e","Type":"ContainerStarted","Data":"f528b94ff12c7a804930dca4b93c23334a9ae3a53317750cd34b5695f08b4582"} Dec 01 08:17:26 crc kubenswrapper[5004]: I1201 08:17:26.016519 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" event={"ID":"15cdec0a-5925-4966-a30b-f60c503f633e","Type":"ContainerStarted","Data":"d4a7cab8e1594814a687b1433a92642a19eff7c2eb12f96fa06f08a2e270733d"} Dec 01 08:17:26 crc kubenswrapper[5004]: I1201 08:17:26.016532 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" event={"ID":"15cdec0a-5925-4966-a30b-f60c503f633e","Type":"ContainerStarted","Data":"1844a18a6109c25a25b9c8cb3ff65e69cc657f66be03dd97e883d24a774b7638"} Dec 01 08:17:26 crc kubenswrapper[5004]: I1201 08:17:26.018028 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-ww6lq" event={"ID":"72500ffd-4ca3-4614-a3a2-bbdc5a7506c5","Type":"ContainerStarted","Data":"964cb03f40ae6be63437d5187c9f14083ed378b2740bb01a9419a59725cd46e3"} Dec 01 08:17:26 crc kubenswrapper[5004]: I1201 08:17:26.024320 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:26Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:26 crc kubenswrapper[5004]: I1201 08:17:26.043900 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:26 crc kubenswrapper[5004]: I1201 08:17:26.043975 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:26 crc kubenswrapper[5004]: I1201 08:17:26.043999 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:26 crc kubenswrapper[5004]: I1201 
08:17:26.044028 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:26 crc kubenswrapper[5004]: I1201 08:17:26.044053 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:26Z","lastTransitionTime":"2025-12-01T08:17:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:17:26 crc kubenswrapper[5004]: I1201 08:17:26.063372 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03c470edd984db3b756e031e33f64a9bc5ca6b9c156ab479e5cd647745655a5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\
\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:26Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:26 crc kubenswrapper[5004]: I1201 08:17:26.107366 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fd2f2b5d7dc3d7b8e1bb06fdf0eec3addfec3c230e044b6395d6cf535630ebd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b2250161e400b201f0528e33231c9007342002a37cd037a1b1e011f2aaaa22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:26Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:26 crc kubenswrapper[5004]: I1201 08:17:26.146246 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:26 crc kubenswrapper[5004]: I1201 08:17:26.146296 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:26 crc kubenswrapper[5004]: I1201 08:17:26.146312 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:26 crc kubenswrapper[5004]: I1201 08:17:26.146335 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:26 crc kubenswrapper[5004]: I1201 08:17:26.146352 5004 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:26Z","lastTransitionTime":"2025-12-01T08:17:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:17:26 crc kubenswrapper[5004]: I1201 08:17:26.159143 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43c9f762-d403-49b9-8294-d66976aca4dd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b358f641ff09bc8f4a452b0d4a8cf5f9790182f76779be31426d48d0278b0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d610c3a799d4a7df9ddfe2a28f2ced2e5ac4e96c67c93f8e6d3a0eee60d9ac48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43f9f72af3273da055eeb1e13ad6258a1b3b6b205402861beee9f87f02230297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38e3cc2eca559417c71beed561569c4dc5b45ffb8407976e0695fc1b93219d37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c24fbeedbc7c40b6079b7ad89d8564a4f2cdaad2cf996e66682b082382da190\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00e163cd374befc8b8c6cdfe3916235cd63280474c2a09eeebf108fb75d378a6\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00e163cd374befc8b8c6cdfe3916235cd63280474c2a09eeebf108fb75d378a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37376e093170ad4138e102e3a1e2c2b45b3944bf36b0d77dce126bd493d60803\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37376e093170ad4138e102e3a1e2c2b45b3944bf36b0d77dce126bd493d60803\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e4db8d673550f14f3614e47dc0bf81e0920753be13fb01c7ef0d6bd3716611f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4db8d673550f14f3614e47dc0bf81e0920753be13fb01c7ef0d6bd3716611f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:26Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:26 crc kubenswrapper[5004]: I1201 08:17:26.182893 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:26Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:26 crc kubenswrapper[5004]: I1201 08:17:26.222352 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jjms6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8828af41-beeb-47dd-96cf-3dbcb5175893\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b22220849cdb91592a44a67f9f6a09586146009483f46a0505a2dea3b02bccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvsch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jjms6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:26Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:26 crc kubenswrapper[5004]: I1201 08:17:26.248120 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:26 crc kubenswrapper[5004]: I1201 08:17:26.248149 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:26 crc kubenswrapper[5004]: I1201 08:17:26.248158 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:26 crc kubenswrapper[5004]: I1201 08:17:26.248172 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:26 crc kubenswrapper[5004]: I1201 08:17:26.248181 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:26Z","lastTransitionTime":"2025-12-01T08:17:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:26 crc kubenswrapper[5004]: I1201 08:17:26.263682 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9977ebb-82de-4e96-8763-0b5a84f8d4ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://baf84c2eea7167eed95a4195d37bf784ffef310bc2079d8d06e1f47cb45a7864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pwgxq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c1e51fa92aec80e93d0fce11c743b8c4fde23fc23d8626e8d5b3e25ab42350d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pwgxq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fvdgt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:26Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:26 crc kubenswrapper[5004]: I1201 08:17:26.303720 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2fa377e-a3a6-46ce-af23-e677799f0115\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b520771cf1ad6f9360064c5f304243c5878a2f1025c87d1001c97b2d5f95cb9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b65b9bc444101900c62fd90c7d45789d186a89148ac0fd9c7f0c6048c3f55ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edbd3e682d207b8c18d6bcbc26aa2adae2a35645502d31b327ffddd51991e842\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4563974ed1467a87aab104756828e63853c282a61c4fcab1e757eb4da9afaa40\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:26Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:26 crc kubenswrapper[5004]: I1201 08:17:26.346391 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dpkxw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddcaf111-708a-45b3-a342-effd3061ab17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd2f03a8020818d1cc98b6eeaf64a728e627a3401400c3af53654f8771ef02eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://cd2f03a8020818d1cc98b6eeaf64a728e627a3401400c3af53654f8771ef02eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"rea
dy\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host
/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dpkxw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:26Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:26 crc kubenswrapper[5004]: I1201 08:17:26.351657 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:26 crc kubenswrapper[5004]: I1201 08:17:26.351700 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:26 crc 
kubenswrapper[5004]: I1201 08:17:26.351712 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:26 crc kubenswrapper[5004]: I1201 08:17:26.351732 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:26 crc kubenswrapper[5004]: I1201 08:17:26.351747 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:26Z","lastTransitionTime":"2025-12-01T08:17:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:17:26 crc kubenswrapper[5004]: I1201 08:17:26.383173 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ww6lq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72500ffd-4ca3-4614-a3a2-bbdc5a7506c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:25Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmsvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ww6lq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:26Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:26 crc kubenswrapper[5004]: I1201 08:17:26.430355 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0ed2b6d-0c61-4639-bc3b-1c8effc4815d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45a0d7a12e7c114de1740a151e04f7f39844b709740cab958f156ac918af3228\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d69a370af9f8f6c575f99da6bb01cee0f660dccb7e85f16f5d8874d0e263e1fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c82baee0fd931fd7b41cd5a73ca7e653ef13dfa3d573991cbe3238d6b95f6fac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa5f341e99100e985f65c24647d976a24782dfae7f52e752b774f41f69af27fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad581564fc519328ae429bce757797f6d87d8648e862b008637a245866ec4cbe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T08:17:20Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 08:17:15.185125 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 08:17:15.186793 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4071633747/tls.crt::/tmp/serving-cert-4071633747/tls.key\\\\\\\"\\\\nI1201 08:17:20.827614 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 08:17:20.834528 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 08:17:20.834549 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 08:17:20.837611 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 08:17:20.837630 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 08:17:20.848940 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 08:17:20.848957 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 08:17:20.848961 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 08:17:20.848965 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 08:17:20.848968 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 08:17:20.848970 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 08:17:20.848973 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 08:17:20.849115 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 08:17:20.854990 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2caeb2ebaf68110cb8728e60aae6db1b1fc48efae6018f66070766a4f42caa69\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2276a185fefa99f4e9fe63b7acce901ce2eb168f71a65f5fa97c1892aebce0e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2276a185fefa99f4e9fe63b7acce901ce2eb168f71a65f5fa97c1892aebce0e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:26Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:26 crc kubenswrapper[5004]: I1201 08:17:26.455180 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:26 crc kubenswrapper[5004]: I1201 08:17:26.455231 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:26 crc kubenswrapper[5004]: I1201 08:17:26.455244 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:26 crc kubenswrapper[5004]: I1201 08:17:26.455265 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:26 crc kubenswrapper[5004]: I1201 08:17:26.455289 5004 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:26Z","lastTransitionTime":"2025-12-01T08:17:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:17:26 crc kubenswrapper[5004]: I1201 08:17:26.465037 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2fa377e-a3a6-46ce-af23-e677799f0115\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b520771cf1ad6f9360064c5f304243c5878a2f1025c87d1001c97b2d5f95cb9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b65b9bc444101900c62fd90c7d45789d186a89148ac0fd9c7f0c6048c3f55ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edbd3e682d207b8c18d6bcbc26aa2adae2a35645502d31b327ffddd51991e842\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\
\":\\\"cri-o://4563974ed1467a87aab104756828e63853c282a61c4fcab1e757eb4da9afaa40\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:26Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:26 crc kubenswrapper[5004]: I1201 08:17:26.507435 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dpkxw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddcaf111-708a-45b3-a342-effd3061ab17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd2f03a8020818d1cc98b6eeaf64a728e627a3401400c3af53654f8771ef02eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd2f03a8020818d1cc98b6eeaf64a728e627a3401400c3af53654f8771ef02eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8dd31c10eaf6bc673dec377a884040524418dd1891a96b2eb6e6e983979512a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8dd31c10eaf6bc673dec377a884040524418dd1891a96b2eb6e6e983979512a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dpkxw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:26Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:26 crc kubenswrapper[5004]: I1201 08:17:26.545891 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ww6lq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72500ffd-4ca3-4614-a3a2-bbdc5a7506c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:25Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmsvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ww6lq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:26Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:26 crc kubenswrapper[5004]: I1201 08:17:26.558053 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:26 crc kubenswrapper[5004]: I1201 08:17:26.558124 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:26 crc kubenswrapper[5004]: I1201 08:17:26.558144 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 
08:17:26 crc kubenswrapper[5004]: I1201 08:17:26.558172 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:26 crc kubenswrapper[5004]: I1201 08:17:26.558206 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:26Z","lastTransitionTime":"2025-12-01T08:17:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:17:26 crc kubenswrapper[5004]: I1201 08:17:26.595845 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0ed2b6d-0c61-4639-bc3b-1c8effc4815d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45a0d7a12e7c114de1740a151e04f7f39844b709740cab958f156ac918af3228\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d69a370af9f8f6c575f99da6bb01cee0f660dccb7e85f16f5d8874d0e263e1fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c82baee0fd931fd7b41cd5a73ca7e653ef13dfa3d573991cbe3238d6b95f6fac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa5f341e99100e985f65c24647d976a24782dfae7f52e752b774f41f69af27fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad581564fc519328ae429bce757797f6d87d8648e862b008637a245866ec4cbe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T08:17:20Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 08:17:15.185125 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 08:17:15.186793 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-4071633747/tls.crt::/tmp/serving-cert-4071633747/tls.key\\\\\\\"\\\\nI1201 08:17:20.827614 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 08:17:20.834528 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 08:17:20.834549 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 08:17:20.837611 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 08:17:20.837630 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 08:17:20.848940 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 08:17:20.848957 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 08:17:20.848961 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 08:17:20.848965 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 08:17:20.848968 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 08:17:20.848970 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 08:17:20.848973 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 08:17:20.849115 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 08:17:20.854990 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2caeb2ebaf68110cb8728e60aae6db1b1fc48efae6018f66070766a4f42caa69\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2276a185fefa99f4e9fe63b7acce901ce2eb168f71a65f5fa97c1892aebce0e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2276a185fefa99f4e9fe63b7acce901ce2eb168f71a65f5fa97c1892aebce0e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:26Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:26 crc kubenswrapper[5004]: I1201 08:17:26.633117 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zjksw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70e79009-93be-49c4-a6b3-e8a06bcea7f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad3ea204ad731500a340e639f0e36abe28ba50464f1b2ff9b009e39c234cf708\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd8m6\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zjksw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:26Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:26 crc kubenswrapper[5004]: I1201 08:17:26.661107 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:26 crc kubenswrapper[5004]: I1201 08:17:26.661187 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:26 crc kubenswrapper[5004]: I1201 08:17:26.661210 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:26 crc kubenswrapper[5004]: I1201 08:17:26.661239 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:26 crc kubenswrapper[5004]: I1201 08:17:26.661263 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:26Z","lastTransitionTime":"2025-12-01T08:17:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:26 crc kubenswrapper[5004]: I1201 08:17:26.681751 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15cdec0a-5925-4966-a30b-f60c503f633e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97c9fab31c2655b5d579a2ed9b837229ee678973643c6a526c1241e3c84e315c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97c9fab31c2655b5d579a2ed9b837229ee678973643c6a526c1241e3c84e315c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-knmdv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:26Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:26 crc kubenswrapper[5004]: I1201 08:17:26.713986 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://327477e7db10129d47ddf9078f32d0df2ffd4b8e59fc7bb77c17c60c63e9a936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:26Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:26 crc kubenswrapper[5004]: I1201 08:17:26.749611 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:26Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:26 crc kubenswrapper[5004]: I1201 08:17:26.764629 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:26 crc kubenswrapper[5004]: I1201 08:17:26.764687 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:26 crc kubenswrapper[5004]: I1201 08:17:26.764700 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:26 crc 
kubenswrapper[5004]: I1201 08:17:26.764725 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:26 crc kubenswrapper[5004]: I1201 08:17:26.764737 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:26Z","lastTransitionTime":"2025-12-01T08:17:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:17:26 crc kubenswrapper[5004]: I1201 08:17:26.789423 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:26Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:26 crc kubenswrapper[5004]: I1201 08:17:26.823391 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03c470edd984db3b756e031e33f64a9bc5ca6b9c156ab479e5cd647745655a5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T08:17:26Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:26 crc kubenswrapper[5004]: I1201 08:17:26.867787 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:26 crc kubenswrapper[5004]: I1201 08:17:26.867821 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:26 crc kubenswrapper[5004]: I1201 08:17:26.867831 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:26 crc kubenswrapper[5004]: I1201 08:17:26.867848 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:26 crc kubenswrapper[5004]: I1201 08:17:26.867859 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:26Z","lastTransitionTime":"2025-12-01T08:17:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:26 crc kubenswrapper[5004]: I1201 08:17:26.871318 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fd2f2b5d7dc3d7b8e1bb06fdf0eec3addfec3c230e044b6395d6cf535630ebd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b2250161e400b201f0528e33231c9007342002a37cd037a1b1e011f2aaaa22\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:26Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:26 crc kubenswrapper[5004]: I1201 08:17:26.924769 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43c9f762-d403-49b9-8294-d66976aca4dd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b358f641ff09bc8f4a452b0d4a8cf5f9790182f76779be31426d48d0278b0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d610c3a799d4a7df9ddfe2a28f2ced2e5ac4e96c67c93f8e6d3a0eee60d9ac48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43f9f72af3273da055eeb1e13ad6258a1b3b6b205402861beee9f87f02230297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38e3cc2eca559417c71beed561569c4dc5b45ffb8407976e0695fc1b93219d37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c24fbeedbc7c40b6079b7ad89d8564a4f2cdaad2cf996e66682b082382da190\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00e163cd374befc8b8c6cdfe3916235cd63280474c2a09eeebf108fb75d378a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00e163cd374befc8b8c6cdfe3916235cd63280474c2a09eeebf108fb75d378a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-01T08:17:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37376e093170ad4138e102e3a1e2c2b45b3944bf36b0d77dce126bd493d60803\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37376e093170ad4138e102e3a1e2c2b45b3944bf36b0d77dce126bd493d60803\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e4db8d673550f14f3614e47dc0bf81e0920753be13fb01c7ef0d6bd3716611f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4db8d673550f14f3614e47dc0bf81e0920753be13fb01c7ef0d6bd3716611f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:26Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:26 crc kubenswrapper[5004]: I1201 08:17:26.949320 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:26Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:26 crc kubenswrapper[5004]: I1201 08:17:26.971815 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:26 crc kubenswrapper[5004]: I1201 08:17:26.971876 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:26 crc kubenswrapper[5004]: I1201 08:17:26.971894 5004 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:26 crc kubenswrapper[5004]: I1201 08:17:26.971921 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:26 crc kubenswrapper[5004]: I1201 08:17:26.971942 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:26Z","lastTransitionTime":"2025-12-01T08:17:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:17:26 crc kubenswrapper[5004]: I1201 08:17:26.984702 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jjms6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8828af41-beeb-47dd-96cf-3dbcb5175893\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b22220849cdb91592a44a67f9f6a0958614600
9483f46a0505a2dea3b02bccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvsch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jjms6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:26Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:27 crc kubenswrapper[5004]: I1201 08:17:27.025377 5004 generic.go:334] "Generic (PLEG): container finished" podID="ddcaf111-708a-45b3-a342-effd3061ab17" containerID="9a7a8ef66a9486ca3d13826df7aa39db2083cff2c386b4d0d858e0526e0f094d" exitCode=0 Dec 01 08:17:27 crc kubenswrapper[5004]: I1201 08:17:27.025483 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dpkxw" 
event={"ID":"ddcaf111-708a-45b3-a342-effd3061ab17","Type":"ContainerDied","Data":"9a7a8ef66a9486ca3d13826df7aa39db2083cff2c386b4d0d858e0526e0f094d"} Dec 01 08:17:27 crc kubenswrapper[5004]: I1201 08:17:27.030392 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-ww6lq" event={"ID":"72500ffd-4ca3-4614-a3a2-bbdc5a7506c5","Type":"ContainerStarted","Data":"668cfaa2af20e2bb082fc47e1702cfd9f704c4fdf56a4d27cf25d6915e7cd18a"} Dec 01 08:17:27 crc kubenswrapper[5004]: I1201 08:17:27.039064 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9977ebb-82de-4e96-8763-0b5a84f8d4ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://baf84c2eea7167eed95a4195d37bf784ffef310bc2079d8d06e1f47cb45a7864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d79
3426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pwgxq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c1e51fa92aec80e93d0fce11c743b8c4fde23fc23d8626e8d5b3e25ab42350d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pwgxq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fvdgt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:27Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:27 crc kubenswrapper[5004]: I1201 08:17:27.067223 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:27Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:27 crc kubenswrapper[5004]: I1201 08:17:27.074802 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:27 crc kubenswrapper[5004]: I1201 08:17:27.074861 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:27 crc kubenswrapper[5004]: I1201 08:17:27.074883 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:27 crc kubenswrapper[5004]: I1201 08:17:27.074909 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:27 crc kubenswrapper[5004]: I1201 08:17:27.074926 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:27Z","lastTransitionTime":"2025-12-01T08:17:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:17:27 crc kubenswrapper[5004]: I1201 08:17:27.110952 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:27Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:27 crc kubenswrapper[5004]: I1201 08:17:27.144611 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03c470edd984db3b756e031e33f64a9bc5ca6b9c156ab479e5cd647745655a5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T08:17:27Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:27 crc kubenswrapper[5004]: I1201 08:17:27.177786 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:27 crc kubenswrapper[5004]: I1201 08:17:27.177819 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:27 crc kubenswrapper[5004]: I1201 08:17:27.177831 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:27 crc kubenswrapper[5004]: I1201 08:17:27.177847 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:27 crc kubenswrapper[5004]: I1201 08:17:27.177857 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:27Z","lastTransitionTime":"2025-12-01T08:17:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:27 crc kubenswrapper[5004]: I1201 08:17:27.187149 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fd2f2b5d7dc3d7b8e1bb06fdf0eec3addfec3c230e044b6395d6cf535630ebd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b2250161e400b201f0528e33231c9007342002a37cd037a1b1e011f2aaaa22\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:27Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:27 crc kubenswrapper[5004]: I1201 08:17:27.228460 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zjksw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"70e79009-93be-49c4-a6b3-e8a06bcea7f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad3ea204ad731500a340e639f0e36abe28ba50464f1b2ff9b009e39c234cf708\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd8m6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zjksw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:27Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:27 crc kubenswrapper[5004]: I1201 08:17:27.272905 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15cdec0a-5925-4966-a30b-f60c503f633e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97c9fab31c2655b5d579a2ed9b837229ee678973643c6a526c1241e3c84e315c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97c9fab31c2655b5d579a2ed9b837229ee678973643c6a526c1241e3c84e315c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-knmdv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:27Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:27 crc kubenswrapper[5004]: I1201 08:17:27.280311 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:27 crc kubenswrapper[5004]: I1201 08:17:27.280365 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:27 crc kubenswrapper[5004]: I1201 08:17:27.280384 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:27 crc kubenswrapper[5004]: I1201 08:17:27.280409 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:27 crc kubenswrapper[5004]: I1201 08:17:27.280428 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:27Z","lastTransitionTime":"2025-12-01T08:17:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:27 crc kubenswrapper[5004]: I1201 08:17:27.309957 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://327477e7db10129d47ddf9078f32d0df2ffd4b8e59fc7bb77c17c60c63e9a936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:27Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:27 crc kubenswrapper[5004]: I1201 08:17:27.343984 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jjms6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8828af41-beeb-47dd-96cf-3dbcb5175893\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b22220849cdb91592a44a67f9f6a09586146009483f46a0505a2dea3b02bccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvsch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jjms6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:27Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:27 crc kubenswrapper[5004]: I1201 08:17:27.383895 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:27 crc kubenswrapper[5004]: I1201 08:17:27.383944 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:27 crc kubenswrapper[5004]: I1201 08:17:27.383962 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:27 crc kubenswrapper[5004]: I1201 08:17:27.383985 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:27 crc kubenswrapper[5004]: I1201 08:17:27.384002 5004 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:27Z","lastTransitionTime":"2025-12-01T08:17:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:17:27 crc kubenswrapper[5004]: I1201 08:17:27.385345 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9977ebb-82de-4e96-8763-0b5a84f8d4ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://baf84c2eea7167eed95a4195d37bf784ffef310bc2079d8d06e1f47cb45a7864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pwgxq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c1e51fa92aec80e93d0fce11c743b8c4fde23fc23d8626e8d5b3e25ab42350d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pwgxq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fvdgt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-12-01T08:17:27Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:27 crc kubenswrapper[5004]: I1201 08:17:27.440235 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43c9f762-d403-49b9-8294-d66976aca4dd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b358f641ff09bc8f4a452b0d4a8cf5f9790182f76779be31426d48d0278b0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\
\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d610c3a799d4a7df9ddfe2a28f2ced2e5ac4e96c67c93f8e6d3a0eee60d9ac48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43f9f72af3273da055eeb1e13ad6258a1b3b6b205402861beee9f87f02230297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38e3cc2eca559417c71beed561569c4dc5b45ffb8407976e0695fc1b93219d37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c24fbeedbc7c40b6079b7ad89d8564a4f2cdaad2cf996e66682b082382da190\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00e163cd374befc8b8c6cdfe3916235cd63280474c2a09eeebf108fb75d378a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setu
p\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00e163cd374befc8b8c6cdfe3916235cd63280474c2a09eeebf108fb75d378a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37376e093170ad4138e102e3a1e2c2b45b3944bf36b0d77dce126bd493d60803\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37376e093170ad4138e102e3a1e2c2b45b3944bf36b0d77dce126bd493d60803\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e4db8d673550f14f3614e47dc0bf81e0920753be13fb01c7ef0d6bd3716611f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4db8d673550f14f3614e47dc0bf81e0920753be13fb01c7ef0d6bd3716611f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\
"startedAt\\\":\\\"2025-12-01T08:17:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:27Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:27 crc kubenswrapper[5004]: I1201 08:17:27.469758 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:27Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:27 crc kubenswrapper[5004]: I1201 08:17:27.487350 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:27 crc kubenswrapper[5004]: I1201 08:17:27.487884 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:27 crc kubenswrapper[5004]: I1201 08:17:27.488035 5004 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:27 crc kubenswrapper[5004]: I1201 08:17:27.488188 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:27 crc kubenswrapper[5004]: I1201 08:17:27.488323 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:27Z","lastTransitionTime":"2025-12-01T08:17:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:17:27 crc kubenswrapper[5004]: I1201 08:17:27.506536 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ww6lq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72500ffd-4ca3-4614-a3a2-bbdc5a7506c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://668cfaa2af20e2bb082fc47e1702cfd9f7
04c4fdf56a4d27cf25d6915e7cd18a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmsvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ww6lq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:27Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:27 crc kubenswrapper[5004]: I1201 08:17:27.566641 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2fa377e-a3a6-46ce-af23-e677799f0115\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b520771cf1ad6f9360064c5f304243c5878a2f1025c87d1001c97b2d5f95cb9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b65b9bc444101900c62fd90c7d45789d186a89148ac0fd9c7f0c6048c3f55ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edbd3e682d207b8c18d6bcbc26aa2adae2a35645502d31b327ffddd51991e842\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4563974ed1467a87aab104756828e63853c282a61c4fcab1e757eb4da9afaa40\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:27Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:27 crc kubenswrapper[5004]: I1201 08:17:27.591161 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:27 crc kubenswrapper[5004]: I1201 08:17:27.591206 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:27 crc kubenswrapper[5004]: I1201 08:17:27.591220 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:27 crc kubenswrapper[5004]: I1201 08:17:27.591239 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:27 crc kubenswrapper[5004]: I1201 08:17:27.591253 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:27Z","lastTransitionTime":"2025-12-01T08:17:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:17:27 crc kubenswrapper[5004]: I1201 08:17:27.596496 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dpkxw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddcaf111-708a-45b3-a342-effd3061ab17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd2f03a8020818d1cc98b6eeaf64a728e627a3401400c3af53654f8771ef02eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd2f03a8020818d1cc98b6eeaf64a728e627a3401400c3af53654f8771ef02eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8dd31c10eaf6bc673dec377a884040524418dd1891a96b2eb6e6e983979512a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8dd31c10eaf6bc673dec377a884040524418dd1891a96b2eb6e6e983979512a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a7a8ef66a9486ca3d13826df7aa39db2083cff2c386b4d0d858e0526e0f094d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a7a8ef66a9486ca3d13826df7aa39db2083cff2c386b4d0d858e0526e0f094d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dpkxw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:27Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:27 crc kubenswrapper[5004]: I1201 
08:17:27.627742 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0ed2b6d-0c61-4639-bc3b-1c8effc4815d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45a0d7a12e7c114de1740a151e04f7f39844b709740cab958f156ac918af3228\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"reso
urce-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d69a370af9f8f6c575f99da6bb01cee0f660dccb7e85f16f5d8874d0e263e1fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c82baee0fd931fd7b41cd5a73ca7e653ef13dfa3d573991cbe3238d6b95f6fac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa5f341e99100e985f65c24647d976a24782dfae7f52e752b774f41f69af27fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e2775
3fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad581564fc519328ae429bce757797f6d87d8648e862b008637a245866ec4cbe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T08:17:20Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 08:17:15.185125 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 08:17:15.186793 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4071633747/tls.crt::/tmp/serving-cert-4071633747/tls.key\\\\\\\"\\\\nI1201 08:17:20.827614 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 08:17:20.834528 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 08:17:20.834549 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 08:17:20.837611 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 08:17:20.837630 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 08:17:20.848940 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 08:17:20.848957 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 08:17:20.848961 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 08:17:20.848965 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 
08:17:20.848968 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 08:17:20.848970 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 08:17:20.848973 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 08:17:20.849115 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 08:17:20.854990 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2caeb2ebaf68110cb8728e60aae6db1b1fc48efae6018f66070766a4f42caa69\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2276a185fefa99f4e9fe63b7acce901ce2eb168f71a65f5fa97c1892aebce0e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06b
c35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2276a185fefa99f4e9fe63b7acce901ce2eb168f71a65f5fa97c1892aebce0e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:27Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:27 crc kubenswrapper[5004]: I1201 08:17:27.647509 5004 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 01 08:17:27 crc kubenswrapper[5004]: I1201 08:17:27.694150 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:27 crc kubenswrapper[5004]: I1201 08:17:27.694205 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:27 crc kubenswrapper[5004]: I1201 08:17:27.694223 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:27 crc kubenswrapper[5004]: I1201 
08:17:27.694245 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:27 crc kubenswrapper[5004]: I1201 08:17:27.694269 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:27Z","lastTransitionTime":"2025-12-01T08:17:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:17:27 crc kubenswrapper[5004]: I1201 08:17:27.749685 5004 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 01 08:17:27 crc kubenswrapper[5004]: I1201 08:17:27.758757 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 08:17:27 crc kubenswrapper[5004]: I1201 08:17:27.758757 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 08:17:27 crc kubenswrapper[5004]: E1201 08:17:27.758912 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 08:17:27 crc kubenswrapper[5004]: I1201 08:17:27.758992 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 08:17:27 crc kubenswrapper[5004]: E1201 08:17:27.759087 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 08:17:27 crc kubenswrapper[5004]: E1201 08:17:27.759233 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 08:17:27 crc kubenswrapper[5004]: I1201 08:17:27.797470 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:27 crc kubenswrapper[5004]: I1201 08:17:27.797524 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:27 crc kubenswrapper[5004]: I1201 08:17:27.797540 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:27 crc kubenswrapper[5004]: I1201 08:17:27.797593 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:27 crc kubenswrapper[5004]: I1201 08:17:27.797612 5004 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:27Z","lastTransitionTime":"2025-12-01T08:17:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:17:27 crc kubenswrapper[5004]: I1201 08:17:27.872910 5004 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 01 08:17:27 crc kubenswrapper[5004]: I1201 08:17:27.899530 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:27 crc kubenswrapper[5004]: I1201 08:17:27.899592 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:27 crc kubenswrapper[5004]: I1201 08:17:27.899607 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:27 crc kubenswrapper[5004]: I1201 08:17:27.899624 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:27 crc kubenswrapper[5004]: I1201 08:17:27.899636 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:27Z","lastTransitionTime":"2025-12-01T08:17:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:27 crc kubenswrapper[5004]: I1201 08:17:27.963194 5004 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 01 08:17:28 crc kubenswrapper[5004]: I1201 08:17:28.002727 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:28 crc kubenswrapper[5004]: I1201 08:17:28.002774 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:28 crc kubenswrapper[5004]: I1201 08:17:28.002787 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:28 crc kubenswrapper[5004]: I1201 08:17:28.002807 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:28 crc kubenswrapper[5004]: I1201 08:17:28.002820 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:28Z","lastTransitionTime":"2025-12-01T08:17:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:28 crc kubenswrapper[5004]: I1201 08:17:28.036836 5004 generic.go:334] "Generic (PLEG): container finished" podID="ddcaf111-708a-45b3-a342-effd3061ab17" containerID="cd1c47c976c17bf17e4d660288ce30472372b6a7f33f6ffa6d96c1a4436275d0" exitCode=0 Dec 01 08:17:28 crc kubenswrapper[5004]: I1201 08:17:28.036921 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dpkxw" event={"ID":"ddcaf111-708a-45b3-a342-effd3061ab17","Type":"ContainerDied","Data":"cd1c47c976c17bf17e4d660288ce30472372b6a7f33f6ffa6d96c1a4436275d0"} Dec 01 08:17:28 crc kubenswrapper[5004]: I1201 08:17:28.043854 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" event={"ID":"15cdec0a-5925-4966-a30b-f60c503f633e","Type":"ContainerStarted","Data":"1f231e0092e1a85cc5d6e00fb8d3bd982223b670191ac3dc21d81ff1fab462a0"} Dec 01 08:17:28 crc kubenswrapper[5004]: I1201 08:17:28.071840 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0ed2b6d-0c61-4639-bc3b-1c8effc4815d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45a0d7a12e7c114de1740a151e04f7f39844b709740cab958f156ac918af3228\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d69a370af9f8f6c575f99da6bb01cee0f660dccb7e85f16f5d8874d0e263e1fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c82baee0fd931fd7b41cd5a73ca7e653ef13dfa3d573991cbe3238d6b95f6fac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa5f341e99100e985f65c24647d976a24782dfae7f52e752b774f41f69af27fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad581564fc519328ae429bce757797f6d87d8648e862b008637a245866ec4cbe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T08:17:20Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 08:17:15.185125 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 08:17:15.186793 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-4071633747/tls.crt::/tmp/serving-cert-4071633747/tls.key\\\\\\\"\\\\nI1201 08:17:20.827614 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 08:17:20.834528 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 08:17:20.834549 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 08:17:20.837611 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 08:17:20.837630 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 08:17:20.848940 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 08:17:20.848957 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 08:17:20.848961 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 08:17:20.848965 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 08:17:20.848968 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 08:17:20.848970 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 08:17:20.848973 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 08:17:20.849115 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 08:17:20.854990 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2caeb2ebaf68110cb8728e60aae6db1b1fc48efae6018f66070766a4f42caa69\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2276a185fefa99f4e9fe63b7acce901ce2eb168f71a65f5fa97c1892aebce0e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2276a185fefa99f4e9fe63b7acce901ce2eb168f71a65f5fa97c1892aebce0e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:28Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:28 crc kubenswrapper[5004]: I1201 08:17:28.094942 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:28Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:28 crc kubenswrapper[5004]: I1201 08:17:28.107096 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:28 crc kubenswrapper[5004]: I1201 08:17:28.107163 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:28 crc kubenswrapper[5004]: I1201 08:17:28.107184 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:28 crc 
kubenswrapper[5004]: I1201 08:17:28.107216 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:28 crc kubenswrapper[5004]: I1201 08:17:28.107237 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:28Z","lastTransitionTime":"2025-12-01T08:17:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:17:28 crc kubenswrapper[5004]: I1201 08:17:28.115334 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:28Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:28 crc kubenswrapper[5004]: I1201 08:17:28.137450 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03c470edd984db3b756e031e33f64a9bc5ca6b9c156ab479e5cd647745655a5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T08:17:28Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:28 crc kubenswrapper[5004]: I1201 08:17:28.157628 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fd2f2b5d7dc3d7b8e1bb06fdf0eec3addfec3c230e044b6395d6cf535630ebd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b2250161e400b
201f0528e33231c9007342002a37cd037a1b1e011f2aaaa22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:28Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:28 crc kubenswrapper[5004]: I1201 08:17:28.177393 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zjksw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"70e79009-93be-49c4-a6b3-e8a06bcea7f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad3ea204ad731500a340e639f0e36abe28ba50464f1b2ff9b009e39c234cf708\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd8m6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zjksw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:28Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:28 crc kubenswrapper[5004]: I1201 08:17:28.200461 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15cdec0a-5925-4966-a30b-f60c503f633e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97c9fab31c2655b5d579a2ed9b837229ee678973643c6a526c1241e3c84e315c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97c9fab31c2655b5d579a2ed9b837229ee678973643c6a526c1241e3c84e315c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-knmdv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:28Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:28 crc kubenswrapper[5004]: I1201 08:17:28.209905 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:28 crc kubenswrapper[5004]: I1201 08:17:28.209957 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:28 crc kubenswrapper[5004]: I1201 08:17:28.209970 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:28 crc kubenswrapper[5004]: I1201 08:17:28.209989 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:28 crc kubenswrapper[5004]: I1201 08:17:28.210001 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:28Z","lastTransitionTime":"2025-12-01T08:17:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:28 crc kubenswrapper[5004]: I1201 08:17:28.217082 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://327477e7db10129d47ddf9078f32d0df2ffd4b8e59fc7bb77c17c60c63e9a936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:28Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:28 crc kubenswrapper[5004]: I1201 08:17:28.231055 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jjms6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8828af41-beeb-47dd-96cf-3dbcb5175893\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b22220849cdb91592a44a67f9f6a09586146009483f46a0505a2dea3b02bccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvsch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jjms6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:28Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:28 crc kubenswrapper[5004]: I1201 08:17:28.245069 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9977ebb-82de-4e96-8763-0b5a84f8d4ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://baf84c2eea7167eed95a4195d37bf784ffef310bc2079d8d06e1f47cb45a7864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pwgxq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c1e51fa92aec80e93d0fce11c743b8c4fde23fc
23d8626e8d5b3e25ab42350d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pwgxq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fvdgt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:28Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:28 crc kubenswrapper[5004]: I1201 08:17:28.279541 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43c9f762-d403-49b9-8294-d66976aca4dd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b358f641ff09bc8f4a452b0d4a8cf5f9790182f76779be31426d48d0278b0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d610c3a799d4a7df9ddfe2a28f2ced2e5ac4e96c67c93f8e6d3a0eee60d9ac48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43f9f72af3273da055eeb1e13ad6258a1b3b6b205402861beee9f87f02230297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38e3cc2eca559417c71beed561569c4dc5b45ffb8407976e0695fc1b93219d37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c24fbeedbc7c40b6079b7ad89d8564a4f2cdaad2cf996e66682b082382da190\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00e163cd374befc8b8c6cdfe3916235cd63280474c2a09eeebf108fb75d378a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00e163cd374befc8b8c6cdfe3916235cd63280474c2a09eeebf108fb75d378a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-01T08:17:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37376e093170ad4138e102e3a1e2c2b45b3944bf36b0d77dce126bd493d60803\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37376e093170ad4138e102e3a1e2c2b45b3944bf36b0d77dce126bd493d60803\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e4db8d673550f14f3614e47dc0bf81e0920753be13fb01c7ef0d6bd3716611f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4db8d673550f14f3614e47dc0bf81e0920753be13fb01c7ef0d6bd3716611f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:28Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:28 crc kubenswrapper[5004]: I1201 08:17:28.296296 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:28Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:28 crc kubenswrapper[5004]: I1201 08:17:28.307366 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ww6lq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72500ffd-4ca3-4614-a3a2-bbdc5a7506c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://668cfaa2af20e2bb082fc47e1702cfd9f704c4fdf56a4d27cf25d6915e7cd18a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmsvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ww6lq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:28Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:28 crc kubenswrapper[5004]: I1201 08:17:28.311844 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:28 crc kubenswrapper[5004]: I1201 08:17:28.311891 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:28 crc kubenswrapper[5004]: I1201 08:17:28.311910 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:28 crc kubenswrapper[5004]: I1201 08:17:28.311928 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:28 crc kubenswrapper[5004]: I1201 08:17:28.311939 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:28Z","lastTransitionTime":"2025-12-01T08:17:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:28 crc kubenswrapper[5004]: I1201 08:17:28.318119 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2fa377e-a3a6-46ce-af23-e677799f0115\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b520771cf1ad6f9360064c5f304243c5878a2f1025c87d1001c97b2d5f95cb9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b65b9bc444
101900c62fd90c7d45789d186a89148ac0fd9c7f0c6048c3f55ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edbd3e682d207b8c18d6bcbc26aa2adae2a35645502d31b327ffddd51991e842\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4563974ed1467a87aab104756828e63853c282a61c4fcab1e757eb4da9afaa40\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:28Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:28 crc kubenswrapper[5004]: I1201 08:17:28.333421 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dpkxw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddcaf111-708a-45b3-a342-effd3061ab17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"message\\\":\\\"containers 
with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd2f03a8020818d1cc98b6eeaf64a728e627a3401400c3af53654f8771ef02eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false
,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd2f03a8020818d1cc98b6eeaf64a728e627a3401400c3af53654f8771ef02eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8dd31c10eaf6bc673dec377a884040524418dd1891a96b2eb6e6e983979512a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8dd31c10eaf6bc673dec377a884040524418dd1891a96b2eb6e6e983979512a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\
\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a7a8ef66a9486ca3d13826df7aa39db2083cff2c386b4d0d858e0526e0f094d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a7a8ef66a9486ca3d13826df7aa39db2083cff2c386b4d0d858e0526e0f094d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd1c47c976c17bf17e4d660288ce30472372b6a7f33f6ffa6d96c1a4436275d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd1c47c976c17bf17e4d660288ce30472372b6a7f33f6ffa6d96c1a4436275d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dpkxw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:28Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:28 crc kubenswrapper[5004]: I1201 08:17:28.414874 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:28 crc kubenswrapper[5004]: I1201 08:17:28.415080 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:28 crc kubenswrapper[5004]: I1201 08:17:28.415092 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:28 crc kubenswrapper[5004]: I1201 08:17:28.415112 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:28 crc kubenswrapper[5004]: I1201 08:17:28.415125 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:28Z","lastTransitionTime":"2025-12-01T08:17:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:17:28 crc kubenswrapper[5004]: I1201 08:17:28.517662 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:28 crc kubenswrapper[5004]: I1201 08:17:28.517714 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:28 crc kubenswrapper[5004]: I1201 08:17:28.517733 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:28 crc kubenswrapper[5004]: I1201 08:17:28.517760 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:28 crc kubenswrapper[5004]: I1201 08:17:28.517780 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:28Z","lastTransitionTime":"2025-12-01T08:17:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:28 crc kubenswrapper[5004]: I1201 08:17:28.620156 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:28 crc kubenswrapper[5004]: I1201 08:17:28.620212 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:28 crc kubenswrapper[5004]: I1201 08:17:28.620231 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:28 crc kubenswrapper[5004]: I1201 08:17:28.620255 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:28 crc kubenswrapper[5004]: I1201 08:17:28.620273 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:28Z","lastTransitionTime":"2025-12-01T08:17:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:28 crc kubenswrapper[5004]: I1201 08:17:28.723092 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:28 crc kubenswrapper[5004]: I1201 08:17:28.723135 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:28 crc kubenswrapper[5004]: I1201 08:17:28.723151 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:28 crc kubenswrapper[5004]: I1201 08:17:28.723173 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:28 crc kubenswrapper[5004]: I1201 08:17:28.723190 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:28Z","lastTransitionTime":"2025-12-01T08:17:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:28 crc kubenswrapper[5004]: I1201 08:17:28.825070 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:28 crc kubenswrapper[5004]: I1201 08:17:28.825104 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:28 crc kubenswrapper[5004]: I1201 08:17:28.825113 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:28 crc kubenswrapper[5004]: I1201 08:17:28.825127 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:28 crc kubenswrapper[5004]: I1201 08:17:28.825136 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:28Z","lastTransitionTime":"2025-12-01T08:17:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:28 crc kubenswrapper[5004]: I1201 08:17:28.928167 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:28 crc kubenswrapper[5004]: I1201 08:17:28.928192 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:28 crc kubenswrapper[5004]: I1201 08:17:28.928200 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:28 crc kubenswrapper[5004]: I1201 08:17:28.928211 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:28 crc kubenswrapper[5004]: I1201 08:17:28.928219 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:28Z","lastTransitionTime":"2025-12-01T08:17:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:29 crc kubenswrapper[5004]: I1201 08:17:29.031419 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:29 crc kubenswrapper[5004]: I1201 08:17:29.031482 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:29 crc kubenswrapper[5004]: I1201 08:17:29.031536 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:29 crc kubenswrapper[5004]: I1201 08:17:29.031592 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:29 crc kubenswrapper[5004]: I1201 08:17:29.031610 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:29Z","lastTransitionTime":"2025-12-01T08:17:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:29 crc kubenswrapper[5004]: I1201 08:17:29.050856 5004 generic.go:334] "Generic (PLEG): container finished" podID="ddcaf111-708a-45b3-a342-effd3061ab17" containerID="9b7c8e9648e5514ab1fcb63dc72822dd276a2d74987e2fb4f4800b26ec176be7" exitCode=0 Dec 01 08:17:29 crc kubenswrapper[5004]: I1201 08:17:29.050912 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dpkxw" event={"ID":"ddcaf111-708a-45b3-a342-effd3061ab17","Type":"ContainerDied","Data":"9b7c8e9648e5514ab1fcb63dc72822dd276a2d74987e2fb4f4800b26ec176be7"} Dec 01 08:17:29 crc kubenswrapper[5004]: I1201 08:17:29.067793 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ww6lq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72500ffd-4ca3-4614-a3a2-bbdc5a7506c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://668cfaa2af20e2bb082fc47e1702cfd9f704c4fdf56a4d27cf25d6915e7cd18a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a4
5dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmsvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ww6lq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:29Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:29 crc kubenswrapper[5004]: I1201 08:17:29.091594 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2fa377e-a3a6-46ce-af23-e677799f0115\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b520771cf1ad6f9360064c5f304243c5878a2f1025c87d1001c97b2d5f95cb9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b65b9bc444101900c62fd90c7d45789d186a89148ac0fd9c7f0c6048c3f55ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edbd3e682d207b8c18d6bcbc26aa2adae2a35645502d31b327ffddd51991e842\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4563974ed1467a87aab104756828e63853c282a61c4fcab1e757eb4da9afaa40\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:29Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:29 crc kubenswrapper[5004]: I1201 08:17:29.129519 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dpkxw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddcaf111-708a-45b3-a342-effd3061ab17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd2f03a8020818d1cc98b6eeaf64a728e627a3401400c3af53654f8771ef02eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containe
rID\\\":\\\"cri-o://cd2f03a8020818d1cc98b6eeaf64a728e627a3401400c3af53654f8771ef02eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8dd31c10eaf6bc673dec377a884040524418dd1891a96b2eb6e6e983979512a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8dd31c10eaf6bc673dec377a884040524418dd1891a96b2eb6e6e983979512a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-
allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a7a8ef66a9486ca3d13826df7aa39db2083cff2c386b4d0d858e0526e0f094d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a7a8ef66a9486ca3d13826df7aa39db2083cff2c386b4d0d858e0526e0f094d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd1c47c976c17bf17e4d660288ce30472372b6a7f33f6ffa6d96c1a4436275d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd1c47c976c17bf17e4d660288ce30472372b6a7f33f6ffa6d96c1a4436275d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b7c8e9648e5514ab1fcb63dc72822dd276a2d74987e2fb4f4800b26ec176be7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b7c8e9648e5514ab1fcb63dc72822dd276a2d74987e2fb4f4800b26ec176be7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dpkxw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:29Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:29 crc kubenswrapper[5004]: I1201 08:17:29.135191 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:29 crc kubenswrapper[5004]: I1201 08:17:29.135477 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:29 crc kubenswrapper[5004]: I1201 08:17:29.135641 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:29 crc kubenswrapper[5004]: I1201 08:17:29.135769 5004 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Dec 01 08:17:29 crc kubenswrapper[5004]: I1201 08:17:29.135885 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:29Z","lastTransitionTime":"2025-12-01T08:17:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:17:29 crc kubenswrapper[5004]: I1201 08:17:29.156956 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0ed2b6d-0c61-4639-bc3b-1c8effc4815d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45a0d7a12e7c114de1740a151e04f7f39844b709740cab958f156ac918af3228\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d69a370af9f8f6c575f99da6bb01cee0f660dccb7e85f16f5d8874d0e263e1fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c82baee0fd931fd7b41cd5a73ca7e653ef13dfa3d573991cbe3238d6b95f6fac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa5f341e99100e985f65c24647d976a24782dfae7f52e752b774f41f69af27fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad581564fc519328ae429bce757797f6d87d8648e862b008637a245866ec4cbe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T08:17:20Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 08:17:15.185125 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 08:17:15.186793 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-4071633747/tls.crt::/tmp/serving-cert-4071633747/tls.key\\\\\\\"\\\\nI1201 08:17:20.827614 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 08:17:20.834528 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 08:17:20.834549 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 08:17:20.837611 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 08:17:20.837630 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 08:17:20.848940 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 08:17:20.848957 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 08:17:20.848961 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 08:17:20.848965 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 08:17:20.848968 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 08:17:20.848970 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 08:17:20.848973 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 08:17:20.849115 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 08:17:20.854990 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2caeb2ebaf68110cb8728e60aae6db1b1fc48efae6018f66070766a4f42caa69\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2276a185fefa99f4e9fe63b7acce901ce2eb168f71a65f5fa97c1892aebce0e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2276a185fefa99f4e9fe63b7acce901ce2eb168f71a65f5fa97c1892aebce0e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:29Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:29 crc kubenswrapper[5004]: I1201 08:17:29.172658 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:29Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:29 crc kubenswrapper[5004]: I1201 08:17:29.190449 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:29Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:29 crc kubenswrapper[5004]: I1201 08:17:29.202797 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03c470edd984db3b756e031e33f64a9bc5ca6b9c156ab479e5cd647745655a5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T08:17:29Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:29 crc kubenswrapper[5004]: I1201 08:17:29.221613 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fd2f2b5d7dc3d7b8e1bb06fdf0eec3addfec3c230e044b6395d6cf535630ebd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b2250161e400b
201f0528e33231c9007342002a37cd037a1b1e011f2aaaa22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:29Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:29 crc kubenswrapper[5004]: I1201 08:17:29.237846 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:29 crc kubenswrapper[5004]: I1201 08:17:29.237902 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:29 crc kubenswrapper[5004]: I1201 08:17:29.237920 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:29 crc kubenswrapper[5004]: I1201 08:17:29.237945 5004 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:29 crc kubenswrapper[5004]: I1201 08:17:29.237967 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:29Z","lastTransitionTime":"2025-12-01T08:17:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:17:29 crc kubenswrapper[5004]: I1201 08:17:29.241182 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zjksw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70e79009-93be-49c4-a6b3-e8a06bcea7f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad3ea204ad731500a340e639f0e36abe28ba50464f1b2ff9b009e39c234cf708\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/open
shift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd8m6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip
\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zjksw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:29Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:29 crc kubenswrapper[5004]: I1201 08:17:29.263045 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15cdec0a-5925-4966-a30b-f60c503f633e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb 
sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitiali
zing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\
\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97c9fab31c2655b5d579a2ed9b837229ee678973643c6a526c1241e3c84e315c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97c9fab31c2655b5d579a2ed9b837229ee678973643c6a526c1241e3c84e315c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\
\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-knmdv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:29Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:29 crc kubenswrapper[5004]: I1201 08:17:29.278719 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://327477e7db10129d47ddf9078f32d0df2ffd4b8e59fc7bb77c17c60c63e9a936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastStat
e\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:29Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:29 crc kubenswrapper[5004]: I1201 08:17:29.290460 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jjms6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8828af41-beeb-47dd-96cf-3dbcb5175893\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b22220849cdb91592a44a67f9f6a09586146009483f46a0505a2dea3b02bccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvsch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jjms6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:29Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:29 crc kubenswrapper[5004]: I1201 08:17:29.304196 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9977ebb-82de-4e96-8763-0b5a84f8d4ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://baf84c2eea7167eed95a4195d37bf784ffef310bc2079d8d06e1f47cb45a7864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pwgxq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c1e51fa92aec80e93d0fce11c743b8c4fde23fc23d8626e8d5b3e25ab42350d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pwgxq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fvdgt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:29Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:29 crc kubenswrapper[5004]: I1201 08:17:29.324324 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43c9f762-d403-49b9-8294-d66976aca4dd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b358f641ff09bc8f4a452b0d4a8cf5f9790182f76779be31426d48d0278b0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mo
untPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d610c3a799d4a7df9ddfe2a28f2ced2e5ac4e96c67c93f8e6d3a0eee60d9ac48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43f9f72af3273da055eeb1e13ad6258a1b3b6b205402861beee9f87f02230297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38e3cc2eca559417c71beed561569c4dc5b45ffb8407976e0695fc1b93219d37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26
702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c24fbeedbc7c40b6079b7ad89d8564a4f2cdaad2cf996e66682b082382da190\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00e163cd374befc8b8c6cdfe3916235cd63280474c2a09eeebf108fb75d378a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be
8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00e163cd374befc8b8c6cdfe3916235cd63280474c2a09eeebf108fb75d378a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37376e093170ad4138e102e3a1e2c2b45b3944bf36b0d77dce126bd493d60803\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37376e093170ad4138e102e3a1e2c2b45b3944bf36b0d77dce126bd493d60803\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e4db8d673550f14f3614e47dc0bf81e0920753be13fb01c7ef0d6bd3716611f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4db8d673550f14f3614e47dc0bf81e0920753be13fb01c7ef0d6bd3716611f3\\\",\\\"exitCode\\\":0,\\\"f
inishedAt\\\":\\\"2025-12-01T08:17:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:29Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:29 crc kubenswrapper[5004]: I1201 08:17:29.337312 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:29Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:29 crc kubenswrapper[5004]: I1201 08:17:29.339963 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:29 crc kubenswrapper[5004]: I1201 08:17:29.340004 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:29 crc kubenswrapper[5004]: I1201 08:17:29.340015 5004 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:29 crc kubenswrapper[5004]: I1201 08:17:29.340033 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:29 crc kubenswrapper[5004]: I1201 08:17:29.340046 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:29Z","lastTransitionTime":"2025-12-01T08:17:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:17:29 crc kubenswrapper[5004]: I1201 08:17:29.443058 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:29 crc kubenswrapper[5004]: I1201 08:17:29.443100 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:29 crc kubenswrapper[5004]: I1201 08:17:29.443110 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:29 crc kubenswrapper[5004]: I1201 08:17:29.443124 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:29 crc kubenswrapper[5004]: I1201 08:17:29.443135 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:29Z","lastTransitionTime":"2025-12-01T08:17:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:29 crc kubenswrapper[5004]: I1201 08:17:29.512847 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 08:17:29 crc kubenswrapper[5004]: I1201 08:17:29.512993 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 08:17:29 crc kubenswrapper[5004]: I1201 08:17:29.513031 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 08:17:29 crc kubenswrapper[5004]: I1201 08:17:29.513065 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 08:17:29 crc kubenswrapper[5004]: I1201 08:17:29.513264 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 08:17:29 crc kubenswrapper[5004]: E1201 08:17:29.513334 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 08:17:37.513299401 +0000 UTC m=+35.078291393 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 08:17:29 crc kubenswrapper[5004]: E1201 08:17:29.513413 5004 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 08:17:29 crc kubenswrapper[5004]: E1201 08:17:29.513436 5004 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 08:17:29 crc kubenswrapper[5004]: E1201 08:17:29.513452 5004 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 08:17:29 crc kubenswrapper[5004]: E1201 08:17:29.513462 5004 
projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 08:17:29 crc kubenswrapper[5004]: E1201 08:17:29.513502 5004 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 08:17:29 crc kubenswrapper[5004]: E1201 08:17:29.513510 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-01 08:17:37.513489656 +0000 UTC m=+35.078481728 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 08:17:29 crc kubenswrapper[5004]: E1201 08:17:29.513524 5004 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 08:17:29 crc kubenswrapper[5004]: E1201 08:17:29.513541 5004 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 08:17:29 crc kubenswrapper[5004]: E1201 08:17:29.513470 5004 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object 
"openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 08:17:29 crc kubenswrapper[5004]: E1201 08:17:29.513621 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-01 08:17:37.513597789 +0000 UTC m=+35.078589861 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 08:17:29 crc kubenswrapper[5004]: E1201 08:17:29.513644 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 08:17:37.51363273 +0000 UTC m=+35.078624872 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 08:17:29 crc kubenswrapper[5004]: E1201 08:17:29.513664 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 08:17:37.51365446 +0000 UTC m=+35.078646602 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 08:17:29 crc kubenswrapper[5004]: I1201 08:17:29.546242 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:29 crc kubenswrapper[5004]: I1201 08:17:29.546281 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:29 crc kubenswrapper[5004]: I1201 08:17:29.546295 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:29 crc kubenswrapper[5004]: I1201 08:17:29.546312 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:29 crc kubenswrapper[5004]: I1201 08:17:29.546325 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:29Z","lastTransitionTime":"2025-12-01T08:17:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:29 crc kubenswrapper[5004]: I1201 08:17:29.649456 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:29 crc kubenswrapper[5004]: I1201 08:17:29.649502 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:29 crc kubenswrapper[5004]: I1201 08:17:29.649514 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:29 crc kubenswrapper[5004]: I1201 08:17:29.649531 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:29 crc kubenswrapper[5004]: I1201 08:17:29.649544 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:29Z","lastTransitionTime":"2025-12-01T08:17:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:29 crc kubenswrapper[5004]: I1201 08:17:29.753546 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:29 crc kubenswrapper[5004]: I1201 08:17:29.753640 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:29 crc kubenswrapper[5004]: I1201 08:17:29.753657 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:29 crc kubenswrapper[5004]: I1201 08:17:29.753683 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:29 crc kubenswrapper[5004]: I1201 08:17:29.753701 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:29Z","lastTransitionTime":"2025-12-01T08:17:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:17:29 crc kubenswrapper[5004]: I1201 08:17:29.758201 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 08:17:29 crc kubenswrapper[5004]: I1201 08:17:29.758240 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 08:17:29 crc kubenswrapper[5004]: I1201 08:17:29.758283 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 08:17:29 crc kubenswrapper[5004]: E1201 08:17:29.758359 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 08:17:29 crc kubenswrapper[5004]: E1201 08:17:29.758476 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 08:17:29 crc kubenswrapper[5004]: E1201 08:17:29.758656 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 08:17:29 crc kubenswrapper[5004]: I1201 08:17:29.856878 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:29 crc kubenswrapper[5004]: I1201 08:17:29.856928 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:29 crc kubenswrapper[5004]: I1201 08:17:29.856941 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:29 crc kubenswrapper[5004]: I1201 08:17:29.856963 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:29 crc kubenswrapper[5004]: I1201 08:17:29.856978 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:29Z","lastTransitionTime":"2025-12-01T08:17:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:29 crc kubenswrapper[5004]: I1201 08:17:29.960208 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:29 crc kubenswrapper[5004]: I1201 08:17:29.960266 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:29 crc kubenswrapper[5004]: I1201 08:17:29.960284 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:29 crc kubenswrapper[5004]: I1201 08:17:29.960308 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:29 crc kubenswrapper[5004]: I1201 08:17:29.960326 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:29Z","lastTransitionTime":"2025-12-01T08:17:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:30 crc kubenswrapper[5004]: I1201 08:17:30.059137 5004 generic.go:334] "Generic (PLEG): container finished" podID="ddcaf111-708a-45b3-a342-effd3061ab17" containerID="1d7115db7fe248437f4b8918f06c9aa48b141fcd3f31add80faee81b4347e47b" exitCode=0 Dec 01 08:17:30 crc kubenswrapper[5004]: I1201 08:17:30.059197 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dpkxw" event={"ID":"ddcaf111-708a-45b3-a342-effd3061ab17","Type":"ContainerDied","Data":"1d7115db7fe248437f4b8918f06c9aa48b141fcd3f31add80faee81b4347e47b"} Dec 01 08:17:30 crc kubenswrapper[5004]: I1201 08:17:30.063211 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:30 crc kubenswrapper[5004]: I1201 08:17:30.063246 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:30 crc kubenswrapper[5004]: I1201 08:17:30.063263 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:30 crc kubenswrapper[5004]: I1201 08:17:30.063283 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:30 crc kubenswrapper[5004]: I1201 08:17:30.063300 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:30Z","lastTransitionTime":"2025-12-01T08:17:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:30 crc kubenswrapper[5004]: I1201 08:17:30.084284 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43c9f762-d403-49b9-8294-d66976aca4dd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b358f641ff09bc8f4a452b0d4a8cf5f9790182f76779be31426d48d0278b0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d610c3a799d4a7df9ddfe2a28f2ced2e5ac4e96c67c93f8e6d3a0eee60d9ac48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43f9f72af3273da055eeb1e13ad6258a1b3b6b205402861beee9f87f02230297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38e3cc2eca559417c71beed561569c4dc5b45ffb8407976e0695fc1b93219d37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c24fbeedbc7c40b6079b7ad89d8564a4f2cdaad2cf996e66682b082382da190\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00e163cd374befc8b8c6cdfe3916235cd63280474c2a09eeebf108fb75d378a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00e163cd374befc8b8c6cdfe3916235cd63280474c2a09eeebf108fb75d378a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37376e093170ad4138e102e3a1e2c2b45b3944bf36b0d77dce126bd493d60803\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37376e093170ad4138e102e3a1e2c2b45b3944bf36b0d77dce126bd493d60803\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e4db8d673550f14f3614e47dc0bf81e0920753be13fb01c7ef0d6bd3716611f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4db8d673550f14f3614e47dc0bf81e0920753be13fb01c7ef0d6bd3716611f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-01T08:17:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:30Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:30 crc kubenswrapper[5004]: I1201 08:17:30.102859 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:30Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:30 crc kubenswrapper[5004]: I1201 08:17:30.119807 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jjms6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8828af41-beeb-47dd-96cf-3dbcb5175893\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b22220849cdb91592a44a67f9f6a09586146009483f46a0505a2dea3b02bccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvsch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jjms6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:30Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:30 crc kubenswrapper[5004]: I1201 08:17:30.137504 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9977ebb-82de-4e96-8763-0b5a84f8d4ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://baf84c2eea7167eed95a4195d37bf784ffef310bc2079d8d06e1f47cb45a7864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pwgxq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c1e51fa92aec80e93d0fce11c743b8c4fde23fc23d8626e8d5b3e25ab42350d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pwgxq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fvdgt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:30Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:30 crc kubenswrapper[5004]: I1201 08:17:30.158019 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2fa377e-a3a6-46ce-af23-e677799f0115\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b520771cf1ad6f9360064c5f304243c5878a2f1025c87d1001c97b2d5f95cb9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes
/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b65b9bc444101900c62fd90c7d45789d186a89148ac0fd9c7f0c6048c3f55ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edbd3e682d207b8c18d6bcbc26aa2adae2a35645502d31b327ffddd51991e842\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4563974ed1467a87aab104756828e63853c282a61c4fcab1e757eb4da9afaa40\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\
\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:30Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:30 crc kubenswrapper[5004]: I1201 08:17:30.166511 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:30 crc kubenswrapper[5004]: I1201 08:17:30.167018 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:30 crc kubenswrapper[5004]: I1201 08:17:30.167246 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:30 crc kubenswrapper[5004]: I1201 08:17:30.167478 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:30 crc kubenswrapper[5004]: I1201 
08:17:30.167718 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:30Z","lastTransitionTime":"2025-12-01T08:17:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:17:30 crc kubenswrapper[5004]: I1201 08:17:30.177131 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dpkxw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddcaf111-708a-45b3-a342-effd3061ab17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd2f03a8020818d1cc98b6eeaf64a728e627a3401400c3af53654f8771ef02eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd2f03a8020818d1cc98b6eeaf64a728e627a3401400c3af53654f8771ef02eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8dd31c10eaf6bc673dec377a884040524418dd1891a96b2eb6e6e983979512a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8dd31c10eaf6bc673dec377a884040524418dd1891a96b2eb6e6e983979512a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a7a8ef66a9486ca3d13826df7aa39db2083cff2c386b4d0d858e0526e0f094d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a7a8ef66a9486ca3d13826df7aa39db2083cff2c386b4d0d858e0526e0f094d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd1c47c976c17bf17e4d660288ce30472372b6a7f33f6ffa6d96c1a4436275d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd1c47c976c17bf17e4d660288ce30472372b6a7f33f6ffa6d96c1a4436275d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b7c8e9648e5514ab1fcb63dc72822dd276a2d74987e2fb4f4800b26ec176be7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b7c8e9648e5514ab1fcb63dc72822dd276a2d74987e2fb4f4800b26ec176be7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d7115db7fe248437f4b8918f06c9aa48b141fcd3f31add80faee81b4347e47b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d7115db7fe248437f4b8918f06c9aa48b141fcd3f31add80faee81b4347e47b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dpkxw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:30Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:30 crc kubenswrapper[5004]: I1201 08:17:30.191608 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ww6lq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72500ffd-4ca3-4614-a3a2-bbdc5a7506c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://668cfaa2af20e2bb082fc47e1702cfd9f704c4fdf56a4d27cf25d6915e7cd18a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmsvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ww6lq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:30Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:30 crc kubenswrapper[5004]: I1201 08:17:30.210653 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0ed2b6d-0c61-4639-bc3b-1c8effc4815d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45a0d7a12e7c114de1740a151e04f7f39844b709740cab958f156ac918af3228\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d69a370af9f8f6c575f99da6bb01cee0f660dccb7e85f16f5d8874d0e263e1fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c82baee0fd931fd7b41cd5a73ca7e653ef13dfa3d573991cbe3238d6b95f6fac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa5f341e99100e985f65c24647d976a24782dfae7f52e752b774f41f69af27fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad581564fc519328ae429bce757797f6d87d8648e862b008637a245866ec4cbe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T08:17:20Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 08:17:15.185125 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 08:17:15.186793 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-4071633747/tls.crt::/tmp/serving-cert-4071633747/tls.key\\\\\\\"\\\\nI1201 08:17:20.827614 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 08:17:20.834528 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 08:17:20.834549 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 08:17:20.837611 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 08:17:20.837630 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 08:17:20.848940 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 08:17:20.848957 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 08:17:20.848961 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 08:17:20.848965 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 08:17:20.848968 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 08:17:20.848970 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 08:17:20.848973 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 08:17:20.849115 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 08:17:20.854990 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2caeb2ebaf68110cb8728e60aae6db1b1fc48efae6018f66070766a4f42caa69\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2276a185fefa99f4e9fe63b7acce901ce2eb168f71a65f5fa97c1892aebce0e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2276a185fefa99f4e9fe63b7acce901ce2eb168f71a65f5fa97c1892aebce0e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:30Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:30 crc kubenswrapper[5004]: I1201 08:17:30.228536 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zjksw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70e79009-93be-49c4-a6b3-e8a06bcea7f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad3ea204ad731500a340e639f0e36abe28ba50464f1b2ff9b009e39c234cf708\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd8m6\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zjksw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:30Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:30 crc kubenswrapper[5004]: I1201 08:17:30.258912 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15cdec0a-5925-4966-a30b-f60c503f633e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97c9fab31c2655b5d579a2ed9b837229ee678973643c6a526c1241e3c84e315c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://97c9fab31c2655b5d579a2ed9b837229ee678973643c6a526c1241e3c84e315c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-knmdv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:30Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:30 crc kubenswrapper[5004]: I1201 08:17:30.271937 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:30 crc kubenswrapper[5004]: I1201 08:17:30.271998 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:30 crc kubenswrapper[5004]: I1201 08:17:30.272021 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:30 crc kubenswrapper[5004]: I1201 08:17:30.272051 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:30 crc kubenswrapper[5004]: I1201 08:17:30.272074 5004 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:30Z","lastTransitionTime":"2025-12-01T08:17:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:17:30 crc kubenswrapper[5004]: I1201 08:17:30.279782 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://327477e7db10129d47ddf9078f32d0df2ffd4b8e59fc7bb77c17c60c63e9a936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:30Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:30 crc kubenswrapper[5004]: I1201 08:17:30.296932 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:30Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:30 crc kubenswrapper[5004]: I1201 08:17:30.313832 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:30Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:30 crc kubenswrapper[5004]: I1201 08:17:30.330859 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03c470edd984db3b756e031e33f64a9bc5ca6b9c156ab479e5cd647745655a5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T08:17:30Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:30 crc kubenswrapper[5004]: I1201 08:17:30.347320 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fd2f2b5d7dc3d7b8e1bb06fdf0eec3addfec3c230e044b6395d6cf535630ebd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b2250161e400b
201f0528e33231c9007342002a37cd037a1b1e011f2aaaa22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:30Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:30 crc kubenswrapper[5004]: I1201 08:17:30.374606 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:30 crc kubenswrapper[5004]: I1201 08:17:30.374670 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:30 crc kubenswrapper[5004]: I1201 08:17:30.374690 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:30 crc kubenswrapper[5004]: I1201 08:17:30.374715 5004 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:30 crc kubenswrapper[5004]: I1201 08:17:30.374735 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:30Z","lastTransitionTime":"2025-12-01T08:17:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:17:30 crc kubenswrapper[5004]: I1201 08:17:30.477224 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:30 crc kubenswrapper[5004]: I1201 08:17:30.477489 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:30 crc kubenswrapper[5004]: I1201 08:17:30.477598 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:30 crc kubenswrapper[5004]: I1201 08:17:30.477691 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:30 crc kubenswrapper[5004]: I1201 08:17:30.477775 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:30Z","lastTransitionTime":"2025-12-01T08:17:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:30 crc kubenswrapper[5004]: I1201 08:17:30.580951 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:30 crc kubenswrapper[5004]: I1201 08:17:30.581008 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:30 crc kubenswrapper[5004]: I1201 08:17:30.581026 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:30 crc kubenswrapper[5004]: I1201 08:17:30.581049 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:30 crc kubenswrapper[5004]: I1201 08:17:30.581066 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:30Z","lastTransitionTime":"2025-12-01T08:17:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:30 crc kubenswrapper[5004]: I1201 08:17:30.684477 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:30 crc kubenswrapper[5004]: I1201 08:17:30.684707 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:30 crc kubenswrapper[5004]: I1201 08:17:30.684796 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:30 crc kubenswrapper[5004]: I1201 08:17:30.684863 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:30 crc kubenswrapper[5004]: I1201 08:17:30.684917 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:30Z","lastTransitionTime":"2025-12-01T08:17:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:30 crc kubenswrapper[5004]: I1201 08:17:30.787552 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:30 crc kubenswrapper[5004]: I1201 08:17:30.787816 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:30 crc kubenswrapper[5004]: I1201 08:17:30.787875 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:30 crc kubenswrapper[5004]: I1201 08:17:30.787942 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:30 crc kubenswrapper[5004]: I1201 08:17:30.788000 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:30Z","lastTransitionTime":"2025-12-01T08:17:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:30 crc kubenswrapper[5004]: I1201 08:17:30.891316 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:30 crc kubenswrapper[5004]: I1201 08:17:30.891384 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:30 crc kubenswrapper[5004]: I1201 08:17:30.891401 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:30 crc kubenswrapper[5004]: I1201 08:17:30.891431 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:30 crc kubenswrapper[5004]: I1201 08:17:30.891487 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:30Z","lastTransitionTime":"2025-12-01T08:17:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:30 crc kubenswrapper[5004]: I1201 08:17:30.994496 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:30 crc kubenswrapper[5004]: I1201 08:17:30.994587 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:30 crc kubenswrapper[5004]: I1201 08:17:30.994610 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:30 crc kubenswrapper[5004]: I1201 08:17:30.994638 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:30 crc kubenswrapper[5004]: I1201 08:17:30.994659 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:30Z","lastTransitionTime":"2025-12-01T08:17:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:31 crc kubenswrapper[5004]: I1201 08:17:31.073798 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" event={"ID":"15cdec0a-5925-4966-a30b-f60c503f633e","Type":"ContainerStarted","Data":"d0292bc1ff208619ec4cb469a8fbcddf2915fccba3a6ebc07d1ecc8bf1092007"} Dec 01 08:17:31 crc kubenswrapper[5004]: I1201 08:17:31.073933 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" Dec 01 08:17:31 crc kubenswrapper[5004]: I1201 08:17:31.073962 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" Dec 01 08:17:31 crc kubenswrapper[5004]: I1201 08:17:31.073984 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" Dec 01 08:17:31 crc kubenswrapper[5004]: I1201 08:17:31.080782 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dpkxw" event={"ID":"ddcaf111-708a-45b3-a342-effd3061ab17","Type":"ContainerStarted","Data":"68ecba515f05ca83fdd0cdda10e3e5925a146aadb70ae17859586c12daf55dac"} Dec 01 08:17:31 crc kubenswrapper[5004]: I1201 08:17:31.092125 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2fa377e-a3a6-46ce-af23-e677799f0115\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b520771cf1ad6f9360064c5f304243c5878a2f1025c87d1001c97b2d5f95cb9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b65b9bc444101900c62fd90c7d45789d186a89148ac0fd9c7f0c6048c3f55ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edbd3e682d207b8c18d6bcbc26aa2adae2a35645502d31b327ffddd51991e842\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4563974ed1467a87aab104756828e63853c282a61c4fcab1e757eb4da9afaa40\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:31Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:31 crc kubenswrapper[5004]: I1201 08:17:31.097538 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:31 crc kubenswrapper[5004]: I1201 08:17:31.097588 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:31 crc kubenswrapper[5004]: I1201 08:17:31.097600 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:31 crc kubenswrapper[5004]: I1201 08:17:31.097619 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:31 crc kubenswrapper[5004]: I1201 08:17:31.097632 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:31Z","lastTransitionTime":"2025-12-01T08:17:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:17:31 crc kubenswrapper[5004]: I1201 08:17:31.113501 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" Dec 01 08:17:31 crc kubenswrapper[5004]: I1201 08:17:31.115996 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" Dec 01 08:17:31 crc kubenswrapper[5004]: I1201 08:17:31.117433 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dpkxw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddcaf111-708a-45b3-a342-effd3061ab17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd2f03a8020818d1cc98b6eeaf64a728e627a3401400c3af53654f8771ef02eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd2f03a8020818d1cc98b6eeaf64a728e627a3401400c3af53654f8771ef02eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8dd31c10eaf6bc673dec377a884040524418dd1891a96b2eb6e6e983979512a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8dd31c10eaf6bc673dec377a884040524418dd1891a96b2eb6e6e983979512a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a7a8ef66a9486ca3d13826df7aa39db2083cff2c386b4d0d858e0526e0f094d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a7a8ef66a9486ca3d13826df7aa39db2083cff2c386b4d0d858e0526e0f094d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd1c47c976c17bf17e4d660288ce30472372b6a7f33f6ffa6d96c1a4436275d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd1c47c976c17bf17e4d660288ce30472372b6a7f33f6ffa6d96c1a4436275d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b7c8e9648e5514ab1fcb63dc72822dd276a2d74987e2fb4f4800b26ec176be7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b7c8e9648e5514ab1fcb63dc72822dd276a2d74987e2fb4f4800b26ec176be7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d7115db7fe248437f4b8918f06c9aa48b141fcd3f31add80faee81b4347e47b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d7115db7fe248437f4b8918f06c9aa48b141fcd3f31add80faee81b4347e47b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dpkxw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:31Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:31 crc kubenswrapper[5004]: I1201 08:17:31.130926 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ww6lq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72500ffd-4ca3-4614-a3a2-bbdc5a7506c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://668cfaa2af20e2bb082fc47e1702cfd9f704c4fdf56a4d27cf25d6915e7cd18a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmsvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ww6lq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:31Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:31 crc kubenswrapper[5004]: I1201 08:17:31.145759 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0ed2b6d-0c61-4639-bc3b-1c8effc4815d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45a0d7a12e7c114de1740a151e04f7f39844b709740cab958f156ac918af3228\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d69a370af9f8f6c575f99da6bb01cee0f660dccb7e85f16f5d8874d0e263e1fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c82baee0fd931fd7b41cd5a73ca7e653ef13dfa3d573991cbe3238d6b95f6fac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa5f341e99100e985f65c24647d976a24782dfae7f52e752b774f41f69af27fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad581564fc519328ae429bce757797f6d87d8648e862b008637a245866ec4cbe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T08:17:20Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 08:17:15.185125 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 08:17:15.186793 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-4071633747/tls.crt::/tmp/serving-cert-4071633747/tls.key\\\\\\\"\\\\nI1201 08:17:20.827614 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 08:17:20.834528 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 08:17:20.834549 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 08:17:20.837611 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 08:17:20.837630 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 08:17:20.848940 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 08:17:20.848957 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 08:17:20.848961 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 08:17:20.848965 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 08:17:20.848968 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 08:17:20.848970 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 08:17:20.848973 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 08:17:20.849115 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 08:17:20.854990 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2caeb2ebaf68110cb8728e60aae6db1b1fc48efae6018f66070766a4f42caa69\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2276a185fefa99f4e9fe63b7acce901ce2eb168f71a65f5fa97c1892aebce0e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2276a185fefa99f4e9fe63b7acce901ce2eb168f71a65f5fa97c1892aebce0e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:31Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:31 crc kubenswrapper[5004]: I1201 08:17:31.160156 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:31Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:31 crc kubenswrapper[5004]: I1201 08:17:31.173520 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03c470edd984db3b756e031e33f64a9bc5ca6b9c156ab479e5cd647745655a5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T08:17:31Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:31 crc kubenswrapper[5004]: I1201 08:17:31.190479 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fd2f2b5d7dc3d7b8e1bb06fdf0eec3addfec3c230e044b6395d6cf535630ebd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b2250161e400b
201f0528e33231c9007342002a37cd037a1b1e011f2aaaa22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:31Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:31 crc kubenswrapper[5004]: I1201 08:17:31.199877 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:31 crc kubenswrapper[5004]: I1201 08:17:31.199922 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:31 crc kubenswrapper[5004]: I1201 08:17:31.199930 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:31 crc kubenswrapper[5004]: I1201 08:17:31.199942 5004 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:31 crc kubenswrapper[5004]: I1201 08:17:31.199951 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:31Z","lastTransitionTime":"2025-12-01T08:17:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:17:31 crc kubenswrapper[5004]: I1201 08:17:31.208471 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zjksw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70e79009-93be-49c4-a6b3-e8a06bcea7f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad3ea204ad731500a340e639f0e36abe28ba50464f1b2ff9b009e39c234cf708\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/open
shift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd8m6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip
\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zjksw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:31Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:31 crc kubenswrapper[5004]: I1201 08:17:31.234729 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15cdec0a-5925-4966-a30b-f60c503f633e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f528b94ff12c7a804930dca4b93c23334a9ae3a53317750cd34b5695f08b4582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f001b70a83da28b1ea78a94f1cd70dfc63b3de5a21b041932d33c2ac970c986b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7c5c043438b515e37a6d6d5c47b92479bbb7322fe35c3a1bc57ee7b3a746544\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23df33e45cc27e4d343ae6ec3fb88f2dfc125ba7da424515d4bee5ec19281832\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4a7cab8e1594814a687b1433a92642a19eff7c2eb12f96fa06f08a2e270733d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1844a18a6109c25a25b9c8cb3ff65e69cc657f66be03dd97e883d24a774b7638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0292bc1ff208619ec4cb469a8fbcddf2915fccba3a6ebc07d1ecc8bf1092007\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f231e0092e1a85cc5d6e00fb8d3bd982223b670191ac3dc21d81ff1fab462a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97c9fab31c2655b5d579a2ed9b837229ee678973643c6a526c1241e3c84e315c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97c9fab31c2655b5d579a2ed9b837229ee678973643c6a526c1241e3c84e315c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-knmdv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:31Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:31 crc kubenswrapper[5004]: I1201 08:17:31.261782 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://327477e7db10129d47ddf9078f32d0df2ffd4b8e59fc7bb77c17c60c63e9a936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-
01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:31Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:31 crc kubenswrapper[5004]: I1201 08:17:31.278488 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:31Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:31 crc kubenswrapper[5004]: I1201 08:17:31.295635 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9977ebb-82de-4e96-8763-0b5a84f8d4ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://baf84c2eea7167eed95a4195d37bf784ffef310bc2079d8d06e1f47cb45a7864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pwgxq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c1e51fa92aec80e93d0fce11c743b8c4fde23fc
23d8626e8d5b3e25ab42350d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pwgxq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fvdgt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:31Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:31 crc kubenswrapper[5004]: I1201 08:17:31.303508 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:31 crc kubenswrapper[5004]: I1201 08:17:31.303582 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:31 crc kubenswrapper[5004]: I1201 08:17:31.303601 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:31 crc 
kubenswrapper[5004]: I1201 08:17:31.303625 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:31 crc kubenswrapper[5004]: I1201 08:17:31.303645 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:31Z","lastTransitionTime":"2025-12-01T08:17:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:17:31 crc kubenswrapper[5004]: I1201 08:17:31.331014 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43c9f762-d403-49b9-8294-d66976aca4dd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b358f641ff09bc8f4a452b0d4a8cf5f9790182f76779be31426d48d0278b0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d610c3a799d4a7df9ddfe2a28f2ced2e5ac4e96c67c93f8e6d3a0eee60d9ac48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43f9f72af3273da055eeb1e13ad6258a1b3b6b205402861beee9f87f02230297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true
,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38e3cc2eca559417c71beed561569c4dc5b45ffb8407976e0695fc1b93219d37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c24fbeedbc7c40b6079b7ad89d8564a4f2cdaad2cf996e66682b082382da190\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/
\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00e163cd374befc8b8c6cdfe3916235cd63280474c2a09eeebf108fb75d378a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00e163cd374befc8b8c6cdfe3916235cd63280474c2a09eeebf108fb75d378a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37376e093170ad4138e102e3a1e2c2b45b3944bf36b0d77dce126bd493d60803\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37376e093170ad4138e102e3a1e2c2b45b3944bf36b0d77dce126bd493d60803\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e4db8d673550f14f3614e47dc0bf81e0920753be13fb01c7ef0d6bd3716611f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4db8d673550f14f3614e47dc0bf81e0920753be13fb01c7ef0d6bd3716611f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:31Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:31 crc kubenswrapper[5004]: I1201 08:17:31.352516 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:31Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:31 crc kubenswrapper[5004]: I1201 08:17:31.366539 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jjms6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8828af41-beeb-47dd-96cf-3dbcb5175893\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b22220849cdb91592a44a67f9f6a09586146009483f46a0505a2dea3b02bccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvsch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jjms6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:31Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:31 crc kubenswrapper[5004]: I1201 08:17:31.383609 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0ed2b6d-0c61-4639-bc3b-1c8effc4815d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45a0d7a12e7c114de1740a151e04f7f39844b709740cab958f156ac918af3228\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d69a370af9f8f6c575f99da6bb01cee0f660dccb7e85f16f5d8874d0e263e1fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c82baee0fd931fd7b41cd5a73ca7e653ef13dfa3d573991cbe3238d6b95f6fac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa5f341e99100e985f65c24647d976a24782dfae7f52e752b774f41f69af27fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad581564fc519328ae429bce757797f6d87d8648e862b008637a245866ec4cbe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T08:17:20Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 08:17:15.185125 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 08:17:15.186793 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-4071633747/tls.crt::/tmp/serving-cert-4071633747/tls.key\\\\\\\"\\\\nI1201 08:17:20.827614 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 08:17:20.834528 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 08:17:20.834549 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 08:17:20.837611 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 08:17:20.837630 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 08:17:20.848940 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 08:17:20.848957 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 08:17:20.848961 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 08:17:20.848965 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 08:17:20.848968 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 08:17:20.848970 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 08:17:20.848973 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 08:17:20.849115 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 08:17:20.854990 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2caeb2ebaf68110cb8728e60aae6db1b1fc48efae6018f66070766a4f42caa69\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2276a185fefa99f4e9fe63b7acce901ce2eb168f71a65f5fa97c1892aebce0e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2276a185fefa99f4e9fe63b7acce901ce2eb168f71a65f5fa97c1892aebce0e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:31Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:31 crc kubenswrapper[5004]: I1201 08:17:31.403965 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://327477e7db10129d47ddf9078f32d0df2ffd4b8e59fc7bb77c17c60c63e9a936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:31Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:31 crc kubenswrapper[5004]: I1201 08:17:31.405875 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:31 crc kubenswrapper[5004]: I1201 08:17:31.405941 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:31 crc kubenswrapper[5004]: I1201 08:17:31.405959 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:31 crc kubenswrapper[5004]: I1201 08:17:31.405984 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:31 crc kubenswrapper[5004]: I1201 08:17:31.406002 5004 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:31Z","lastTransitionTime":"2025-12-01T08:17:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:17:31 crc kubenswrapper[5004]: I1201 08:17:31.423655 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:31Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:31 crc kubenswrapper[5004]: I1201 08:17:31.443074 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:31Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:31 crc kubenswrapper[5004]: I1201 08:17:31.458308 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03c470edd984db3b756e031e33f64a9bc5ca6b9c156ab479e5cd647745655a5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T08:17:31Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:31 crc kubenswrapper[5004]: I1201 08:17:31.478174 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fd2f2b5d7dc3d7b8e1bb06fdf0eec3addfec3c230e044b6395d6cf535630ebd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b2250161e400b
201f0528e33231c9007342002a37cd037a1b1e011f2aaaa22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:31Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:31 crc kubenswrapper[5004]: I1201 08:17:31.495929 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zjksw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"70e79009-93be-49c4-a6b3-e8a06bcea7f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad3ea204ad731500a340e639f0e36abe28ba50464f1b2ff9b009e39c234cf708\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd8m6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zjksw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:31Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:31 crc kubenswrapper[5004]: I1201 08:17:31.508847 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:31 crc 
kubenswrapper[5004]: I1201 08:17:31.508929 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:31 crc kubenswrapper[5004]: I1201 08:17:31.508952 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:31 crc kubenswrapper[5004]: I1201 08:17:31.508986 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:31 crc kubenswrapper[5004]: I1201 08:17:31.509010 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:31Z","lastTransitionTime":"2025-12-01T08:17:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:17:31 crc kubenswrapper[5004]: I1201 08:17:31.525953 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15cdec0a-5925-4966-a30b-f60c503f633e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f528b94ff12c7a804930dca4b93c23334a9ae3a53317750cd34b5695f08b4582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f001b70a83da28b1ea78a94f1cd70dfc63b3de5a21b041932d33c2ac970c986b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7c5c043438b515e37a6d6d5c47b92479bbb7322fe35c3a1bc57ee7b3a746544\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23df33e45cc27e4d343ae6ec3fb88f2dfc125ba7da424515d4bee5ec19281832\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4a7cab8e1594814a687b1433a92642a19eff7c2eb12f96fa06f08a2e270733d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1844a18a6109c25a25b9c8cb3ff65e69cc657f66be03dd97e883d24a774b7638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0292bc1ff208619ec4cb469a8fbcddf2915fccba3a6ebc07d1ecc8bf1092007\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mou
ntPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f231e0092e1a85cc5d6e00fb8d3bd982223b670191ac3dc21d81ff1fab462a0\\\",\
\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97c9fab31c2655b5d579a2ed9b837229ee678973643c6a526c1241e3c84e315c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97c9fab31c2655b5d579a2ed9b837229ee678973643c6a526c1241e3c84e315c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\
\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-knmdv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:31Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:31 crc kubenswrapper[5004]: I1201 08:17:31.546992 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:31Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:31 crc kubenswrapper[5004]: I1201 08:17:31.565340 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jjms6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8828af41-beeb-47dd-96cf-3dbcb5175893\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b22220849cdb91592a44a67f9f6a09586146009483f46a0505a2dea3b02bccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvsch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jjms6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:31Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:31 crc kubenswrapper[5004]: I1201 08:17:31.583315 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9977ebb-82de-4e96-8763-0b5a84f8d4ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://baf84c2eea7167eed95a4195d37bf784ffef310bc2079d8d06e1f47cb45a7864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pwgxq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c1e51fa92aec80e93d0fce11c743b8c4fde23fc23d8626e8d5b3e25ab42350d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pwgxq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fvdgt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:31Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:31 crc kubenswrapper[5004]: I1201 08:17:31.611971 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:31 crc kubenswrapper[5004]: I1201 08:17:31.612012 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:31 crc kubenswrapper[5004]: I1201 08:17:31.612029 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:31 crc kubenswrapper[5004]: I1201 08:17:31.612052 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:31 crc kubenswrapper[5004]: I1201 08:17:31.612067 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:31Z","lastTransitionTime":"2025-12-01T08:17:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:31 crc kubenswrapper[5004]: I1201 08:17:31.615904 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43c9f762-d403-49b9-8294-d66976aca4dd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b358f641ff09bc8f4a452b0d4a8cf5f9790182f76779be31426d48d0278b0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d610c3a799d4a7df9ddfe2a28f2ced2e5ac4e96c67c93f8e6d3a0eee60d9ac48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43f9f72af3273da055eeb1e13ad6258a1b3b6b205402861beee9f87f02230297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38e3cc2eca559417c71beed561569c4dc5b45ffb8407976e0695fc1b93219d37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c24fbeedbc7c40b6079b7ad89d8564a4f2cdaad2cf996e66682b082382da190\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00e163cd374befc8b8c6cdfe3916235cd63280474c2a09eeebf108fb75d378a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00e163cd374befc8b8c6cdfe3916235cd63280474c2a09eeebf108fb75d378a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37376e093170ad4138e102e3a1e2c2b45b3944bf36b0d77dce126bd493d60803\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37376e093170ad4138e102e3a1e2c2b45b3944bf36b0d77dce126bd493d60803\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e4db8d673550f14f3614e47dc0bf81e0920753be13fb01c7ef0d6bd3716611f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4db8d673550f14f3614e47dc0bf81e0920753be13fb01c7ef0d6bd3716611f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-01T08:17:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:31Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:31 crc kubenswrapper[5004]: I1201 08:17:31.676290 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dpkxw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddcaf111-708a-45b3-a342-effd3061ab17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68ecba515f05ca83fdd0cdda10e3e5925a146aadb70ae17859586c12daf55dac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd2f03a8020818d1cc98b6eeaf64a728e627a3401400c3af53654f8771ef02eb\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd2f03a8020818d1cc98b6eeaf64a728e627a3401400c3af53654f8771ef02eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8dd31c10eaf6bc673dec377a884040524418dd1891a96b2eb6e6e983979512a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8dd31c10eaf6bc673dec377a884040524418dd1891a96b2eb6e6e983979512a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a7a8ef66a9486ca3d13826df7aa39db2083cff2c386b4d0d858e0526e0f094d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a7a8ef66a9486ca3d13826df7aa39db2083cff2c386b4d0d858e0526e0f094d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd1c4
7c976c17bf17e4d660288ce30472372b6a7f33f6ffa6d96c1a4436275d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd1c47c976c17bf17e4d660288ce30472372b6a7f33f6ffa6d96c1a4436275d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b7c8e9648e5514ab1fcb63dc72822dd276a2d74987e2fb4f4800b26ec176be7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b7c8e9648e5514ab1fcb63dc72822dd276a2d74987e2fb4f4800b26ec176be7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:28Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d7115db7fe248437f4b8918f06c9aa48b141fcd3f31add80faee81b4347e47b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d7115db7fe248437f4b8918f06c9aa48b141fcd3f31add80faee81b4347e47b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dpkxw\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:31Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:31 crc kubenswrapper[5004]: I1201 08:17:31.687786 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ww6lq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72500ffd-4ca3-4614-a3a2-bbdc5a7506c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://668cfaa2af20e2bb082fc47e1702cfd9f704c4fdf56a4d27cf25d6915e7cd18a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2025-12-01T08:17:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmsvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ww6lq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:31Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:31 crc kubenswrapper[5004]: I1201 08:17:31.707020 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2fa377e-a3a6-46ce-af23-e677799f0115\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b520771cf1ad6f9360064c5f304243c5878a2f1025c87d1001c97b2d5f95cb9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b65b9bc444101900c62fd90c7d45789d186a89148ac0fd9c7f0c6048c3f55ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edbd3e682d207b8c18d6bcbc26aa2adae2a35645502d31b327ffddd51991e842\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4563974ed1467a87aab104756828e63853c282a61c4fcab1e757eb4da9afaa40\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:31Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:31 crc kubenswrapper[5004]: I1201 08:17:31.714239 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:31 crc kubenswrapper[5004]: I1201 08:17:31.714270 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:31 crc kubenswrapper[5004]: I1201 08:17:31.714282 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:31 crc kubenswrapper[5004]: I1201 08:17:31.714298 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:31 crc kubenswrapper[5004]: I1201 08:17:31.714311 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:31Z","lastTransitionTime":"2025-12-01T08:17:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:17:31 crc kubenswrapper[5004]: I1201 08:17:31.757925 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 08:17:31 crc kubenswrapper[5004]: E1201 08:17:31.758101 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 08:17:31 crc kubenswrapper[5004]: I1201 08:17:31.758201 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 08:17:31 crc kubenswrapper[5004]: E1201 08:17:31.758290 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 08:17:31 crc kubenswrapper[5004]: I1201 08:17:31.758363 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 08:17:31 crc kubenswrapper[5004]: E1201 08:17:31.758437 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 08:17:31 crc kubenswrapper[5004]: I1201 08:17:31.817138 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:31 crc kubenswrapper[5004]: I1201 08:17:31.817197 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:31 crc kubenswrapper[5004]: I1201 08:17:31.817217 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:31 crc kubenswrapper[5004]: I1201 08:17:31.817243 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:31 crc kubenswrapper[5004]: I1201 08:17:31.817261 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:31Z","lastTransitionTime":"2025-12-01T08:17:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:31 crc kubenswrapper[5004]: I1201 08:17:31.920631 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:31 crc kubenswrapper[5004]: I1201 08:17:31.920692 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:31 crc kubenswrapper[5004]: I1201 08:17:31.920710 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:31 crc kubenswrapper[5004]: I1201 08:17:31.920736 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:31 crc kubenswrapper[5004]: I1201 08:17:31.920752 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:31Z","lastTransitionTime":"2025-12-01T08:17:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:32 crc kubenswrapper[5004]: I1201 08:17:32.023509 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:32 crc kubenswrapper[5004]: I1201 08:17:32.023605 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:32 crc kubenswrapper[5004]: I1201 08:17:32.023630 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:32 crc kubenswrapper[5004]: I1201 08:17:32.023658 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:32 crc kubenswrapper[5004]: I1201 08:17:32.023681 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:32Z","lastTransitionTime":"2025-12-01T08:17:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:32 crc kubenswrapper[5004]: I1201 08:17:32.126124 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:32 crc kubenswrapper[5004]: I1201 08:17:32.126180 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:32 crc kubenswrapper[5004]: I1201 08:17:32.126197 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:32 crc kubenswrapper[5004]: I1201 08:17:32.126221 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:32 crc kubenswrapper[5004]: I1201 08:17:32.126271 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:32Z","lastTransitionTime":"2025-12-01T08:17:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:32 crc kubenswrapper[5004]: I1201 08:17:32.229878 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:32 crc kubenswrapper[5004]: I1201 08:17:32.229964 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:32 crc kubenswrapper[5004]: I1201 08:17:32.229984 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:32 crc kubenswrapper[5004]: I1201 08:17:32.230009 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:32 crc kubenswrapper[5004]: I1201 08:17:32.230028 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:32Z","lastTransitionTime":"2025-12-01T08:17:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:32 crc kubenswrapper[5004]: I1201 08:17:32.333552 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:32 crc kubenswrapper[5004]: I1201 08:17:32.333632 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:32 crc kubenswrapper[5004]: I1201 08:17:32.333649 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:32 crc kubenswrapper[5004]: I1201 08:17:32.333674 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:32 crc kubenswrapper[5004]: I1201 08:17:32.333695 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:32Z","lastTransitionTime":"2025-12-01T08:17:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:32 crc kubenswrapper[5004]: I1201 08:17:32.435743 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:32 crc kubenswrapper[5004]: I1201 08:17:32.435800 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:32 crc kubenswrapper[5004]: I1201 08:17:32.435823 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:32 crc kubenswrapper[5004]: I1201 08:17:32.435857 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:32 crc kubenswrapper[5004]: I1201 08:17:32.435875 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:32Z","lastTransitionTime":"2025-12-01T08:17:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:32 crc kubenswrapper[5004]: I1201 08:17:32.539137 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:32 crc kubenswrapper[5004]: I1201 08:17:32.539196 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:32 crc kubenswrapper[5004]: I1201 08:17:32.539213 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:32 crc kubenswrapper[5004]: I1201 08:17:32.539236 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:32 crc kubenswrapper[5004]: I1201 08:17:32.539252 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:32Z","lastTransitionTime":"2025-12-01T08:17:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:32 crc kubenswrapper[5004]: I1201 08:17:32.642047 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:32 crc kubenswrapper[5004]: I1201 08:17:32.642088 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:32 crc kubenswrapper[5004]: I1201 08:17:32.642100 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:32 crc kubenswrapper[5004]: I1201 08:17:32.642117 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:32 crc kubenswrapper[5004]: I1201 08:17:32.642129 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:32Z","lastTransitionTime":"2025-12-01T08:17:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:32 crc kubenswrapper[5004]: I1201 08:17:32.744394 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:32 crc kubenswrapper[5004]: I1201 08:17:32.744434 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:32 crc kubenswrapper[5004]: I1201 08:17:32.744446 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:32 crc kubenswrapper[5004]: I1201 08:17:32.744463 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:32 crc kubenswrapper[5004]: I1201 08:17:32.744475 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:32Z","lastTransitionTime":"2025-12-01T08:17:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:32 crc kubenswrapper[5004]: I1201 08:17:32.773599 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0ed2b6d-0c61-4639-bc3b-1c8effc4815d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45a0d7a12e7c114de1740a151e04f7f39844b709740cab958f156ac918af3228\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d69a370af9f8f6c575f99da6bb01cee0f660dccb7e85f16f5d8874d0e263e1fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c82baee0fd931fd7b41cd5a73ca7e653ef13dfa3d573991cbe3238d6b95f6fac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa5f341e99100e985f65c24647d976a24782dfae7f52e752b774f41f69af27fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad581564fc519328ae429bce757797f6d87d8648e862b008637a245866ec4cbe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T08:17:20Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 08:17:15.185125 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 08:17:15.186793 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-4071633747/tls.crt::/tmp/serving-cert-4071633747/tls.key\\\\\\\"\\\\nI1201 08:17:20.827614 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 08:17:20.834528 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 08:17:20.834549 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 08:17:20.837611 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 08:17:20.837630 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 08:17:20.848940 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 08:17:20.848957 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 08:17:20.848961 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 08:17:20.848965 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 08:17:20.848968 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 08:17:20.848970 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 08:17:20.848973 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 08:17:20.849115 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 08:17:20.854990 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2caeb2ebaf68110cb8728e60aae6db1b1fc48efae6018f66070766a4f42caa69\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2276a185fefa99f4e9fe63b7acce901ce2eb168f71a65f5fa97c1892aebce0e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2276a185fefa99f4e9fe63b7acce901ce2eb168f71a65f5fa97c1892aebce0e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:32Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:32 crc kubenswrapper[5004]: I1201 08:17:32.785722 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03c470edd984db3b756e031e33f64a9bc5ca6b9c156ab479e5cd647745655a5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:32Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:32 crc kubenswrapper[5004]: I1201 08:17:32.798903 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fd2f2b5d7dc3d7b8e1bb06fdf0eec3addfec3c230e044b6395d6cf535630ebd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b2250161e400b201f0528e33231c9007342002a37cd037a1b1e011f2aaaa22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:32Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:32 crc kubenswrapper[5004]: I1201 08:17:32.816711 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zjksw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"70e79009-93be-49c4-a6b3-e8a06bcea7f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad3ea204ad731500a340e639f0e36abe28ba50464f1b2ff9b009e39c234cf708\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd8m6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zjksw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:32Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:32 crc kubenswrapper[5004]: I1201 08:17:32.837400 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15cdec0a-5925-4966-a30b-f60c503f633e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f528b94ff12c7a804930dca4b93c23334a9ae3a53317750cd34b5695f08b4582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f001b70a83da28b1ea78a94f1cd70dfc63b3de5a21b041932d33c2ac970c986b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7c5c043438b515e37a6d6d5c47b92479bbb7322fe35c3a1bc57ee7b3a746544\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23df33e45cc27e4d343ae6ec3fb88f2dfc125ba7da424515d4bee5ec19281832\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4a7cab8e1594814a687b1433a92642a19eff7c2eb12f96fa06f08a2e270733d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1844a18a6109c25a25b9c8cb3ff65e69cc657f66be03dd97e883d24a774b7638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0292bc1ff208619ec4cb469a8fbcddf2915fccba3a6ebc07d1ecc8bf1092007\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f231e0092e1a85cc5d6e00fb8d3bd982223b670191ac3dc21d81ff1fab462a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97c9fab31c2655b5d579a2ed9b837229ee678973643c6a526c1241e3c84e315c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97c9fab31c2655b5d579a2ed9b837229ee678973643c6a526c1241e3c84e315c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-knmdv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:32Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:32 crc kubenswrapper[5004]: I1201 08:17:32.846520 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:32 crc kubenswrapper[5004]: I1201 08:17:32.846578 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:32 crc kubenswrapper[5004]: I1201 08:17:32.846593 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:32 crc kubenswrapper[5004]: I1201 08:17:32.846609 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:32 crc kubenswrapper[5004]: I1201 08:17:32.846620 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:32Z","lastTransitionTime":"2025-12-01T08:17:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:32 crc kubenswrapper[5004]: I1201 08:17:32.851392 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://327477e7db10129d47ddf9078f32d0df2ffd4b8e59fc7bb77c17c60c63e9a936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:32Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:32 crc kubenswrapper[5004]: I1201 08:17:32.867417 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:32Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:32 crc kubenswrapper[5004]: I1201 08:17:32.883516 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:32Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:32 crc kubenswrapper[5004]: I1201 08:17:32.908018 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43c9f762-d403-49b9-8294-d66976aca4dd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b358f641ff09bc8f4a452b0d4a8cf5f9790182f76779be31426d48d0278b0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d610c3a799d4a7df9ddfe2a28f2ced2e5ac4e96c67c93f8e6d3a0eee60d9ac48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43f9f72af3273da055eeb1e13ad6258a1b3b6b205402861beee9f87f02230297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38e3cc2eca559417c71beed561569c4dc5b45ffb8407976e0695fc1b93219d37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c24fbeedbc7c40b6079b7ad89d8564a4f2cdaad2cf996e66682b082382da190\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00e163cd374befc8b8c6cdfe3916235cd63280474c2a09eeebf108fb75d378a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00e163cd374befc8b8c6cdfe3916235cd63280474c2a09eeebf108fb75d378a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-01T08:17:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37376e093170ad4138e102e3a1e2c2b45b3944bf36b0d77dce126bd493d60803\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37376e093170ad4138e102e3a1e2c2b45b3944bf36b0d77dce126bd493d60803\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e4db8d673550f14f3614e47dc0bf81e0920753be13fb01c7ef0d6bd3716611f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4db8d673550f14f3614e47dc0bf81e0920753be13fb01c7ef0d6bd3716611f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:32Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:32 crc kubenswrapper[5004]: I1201 08:17:32.920174 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:32Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:32 crc kubenswrapper[5004]: I1201 08:17:32.934846 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jjms6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8828af41-beeb-47dd-96cf-3dbcb5175893\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b22220849cdb91592a44a67f9f6a09586146009483f46a0505a2dea3b02bccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvsch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jjms6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:32Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:32 crc kubenswrapper[5004]: I1201 08:17:32.948271 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:32 crc kubenswrapper[5004]: I1201 08:17:32.948317 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:32 crc kubenswrapper[5004]: I1201 08:17:32.948328 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:32 crc kubenswrapper[5004]: I1201 08:17:32.948347 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:32 crc kubenswrapper[5004]: I1201 08:17:32.948360 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:32Z","lastTransitionTime":"2025-12-01T08:17:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:32 crc kubenswrapper[5004]: I1201 08:17:32.948715 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9977ebb-82de-4e96-8763-0b5a84f8d4ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://baf84c2eea7167eed95a4195d37bf784ffef310bc2079d8d06e1f47cb45a7864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pwgxq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c1e51fa92aec80e93d0fce11c743b8c4fde23fc23d8626e8d5b3e25ab42350d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pwgxq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fvdgt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:32Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:32 crc kubenswrapper[5004]: I1201 08:17:32.962850 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2fa377e-a3a6-46ce-af23-e677799f0115\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b520771cf1ad6f9360064c5f304243c5878a2f1025c87d1001c97b2d5f95cb9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b65b9bc444101900c62fd90c7d45789d186a89148ac0fd9c7f0c6048c3f55ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edbd3e682d207b8c18d6bcbc26aa2adae2a35645502d31b327ffddd51991e842\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4563974ed1467a87aab104756828e63853c282a61c4fcab1e757eb4da9afaa40\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:32Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:32 crc kubenswrapper[5004]: I1201 08:17:32.984021 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dpkxw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddcaf111-708a-45b3-a342-effd3061ab17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68ecba515f05ca83fdd0cdda10e3e5925a146aadb70ae17859586c12daf55dac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd2f03a8020818d1cc98b6eeaf64a728e627a3401400c3af53654f8771ef02eb\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd2f03a8020818d1cc98b6eeaf64a728e627a3401400c3af53654f8771ef02eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8dd31c10eaf6bc673dec377a884040524418dd1891a96b2eb6e6e983979512a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8dd31c10eaf6bc673dec377a884040524418dd1891a96b2eb6e6e983979512a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a7a8ef66a9486ca3d13826df7aa39db2083cff2c386b4d0d858e0526e0f094d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a7a8ef66a9486ca3d13826df7aa39db2083cff2c386b4d0d858e0526e0f094d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd1c4
7c976c17bf17e4d660288ce30472372b6a7f33f6ffa6d96c1a4436275d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd1c47c976c17bf17e4d660288ce30472372b6a7f33f6ffa6d96c1a4436275d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b7c8e9648e5514ab1fcb63dc72822dd276a2d74987e2fb4f4800b26ec176be7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b7c8e9648e5514ab1fcb63dc72822dd276a2d74987e2fb4f4800b26ec176be7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:28Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d7115db7fe248437f4b8918f06c9aa48b141fcd3f31add80faee81b4347e47b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d7115db7fe248437f4b8918f06c9aa48b141fcd3f31add80faee81b4347e47b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dpkxw\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:32Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:33 crc kubenswrapper[5004]: I1201 08:17:33.003295 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ww6lq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72500ffd-4ca3-4614-a3a2-bbdc5a7506c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://668cfaa2af20e2bb082fc47e1702cfd9f704c4fdf56a4d27cf25d6915e7cd18a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2025-12-01T08:17:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmsvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ww6lq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:33Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:33 crc kubenswrapper[5004]: I1201 08:17:33.050883 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:33 crc kubenswrapper[5004]: I1201 08:17:33.050942 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:33 crc kubenswrapper[5004]: I1201 08:17:33.050967 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:33 crc kubenswrapper[5004]: I1201 08:17:33.050998 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:33 crc kubenswrapper[5004]: I1201 08:17:33.051020 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:33Z","lastTransitionTime":"2025-12-01T08:17:33Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:17:33 crc kubenswrapper[5004]: I1201 08:17:33.153482 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:33 crc kubenswrapper[5004]: I1201 08:17:33.153848 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:33 crc kubenswrapper[5004]: I1201 08:17:33.153866 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:33 crc kubenswrapper[5004]: I1201 08:17:33.153888 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:33 crc kubenswrapper[5004]: I1201 08:17:33.153903 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:33Z","lastTransitionTime":"2025-12-01T08:17:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:33 crc kubenswrapper[5004]: I1201 08:17:33.256321 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:33 crc kubenswrapper[5004]: I1201 08:17:33.256370 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:33 crc kubenswrapper[5004]: I1201 08:17:33.256386 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:33 crc kubenswrapper[5004]: I1201 08:17:33.256408 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:33 crc kubenswrapper[5004]: I1201 08:17:33.256422 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:33Z","lastTransitionTime":"2025-12-01T08:17:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:33 crc kubenswrapper[5004]: I1201 08:17:33.359055 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:33 crc kubenswrapper[5004]: I1201 08:17:33.359096 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:33 crc kubenswrapper[5004]: I1201 08:17:33.359107 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:33 crc kubenswrapper[5004]: I1201 08:17:33.359123 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:33 crc kubenswrapper[5004]: I1201 08:17:33.359133 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:33Z","lastTransitionTime":"2025-12-01T08:17:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:33 crc kubenswrapper[5004]: I1201 08:17:33.461535 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:33 crc kubenswrapper[5004]: I1201 08:17:33.461613 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:33 crc kubenswrapper[5004]: I1201 08:17:33.461629 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:33 crc kubenswrapper[5004]: I1201 08:17:33.461651 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:33 crc kubenswrapper[5004]: I1201 08:17:33.461669 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:33Z","lastTransitionTime":"2025-12-01T08:17:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:33 crc kubenswrapper[5004]: I1201 08:17:33.565150 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:33 crc kubenswrapper[5004]: I1201 08:17:33.565205 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:33 crc kubenswrapper[5004]: I1201 08:17:33.565222 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:33 crc kubenswrapper[5004]: I1201 08:17:33.565247 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:33 crc kubenswrapper[5004]: I1201 08:17:33.565265 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:33Z","lastTransitionTime":"2025-12-01T08:17:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:33 crc kubenswrapper[5004]: I1201 08:17:33.667924 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:33 crc kubenswrapper[5004]: I1201 08:17:33.668290 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:33 crc kubenswrapper[5004]: I1201 08:17:33.668550 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:33 crc kubenswrapper[5004]: I1201 08:17:33.668813 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:33 crc kubenswrapper[5004]: I1201 08:17:33.668966 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:33Z","lastTransitionTime":"2025-12-01T08:17:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:17:33 crc kubenswrapper[5004]: I1201 08:17:33.758478 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 08:17:33 crc kubenswrapper[5004]: I1201 08:17:33.758542 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 08:17:33 crc kubenswrapper[5004]: I1201 08:17:33.758643 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 08:17:33 crc kubenswrapper[5004]: E1201 08:17:33.758816 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 08:17:33 crc kubenswrapper[5004]: E1201 08:17:33.759006 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 08:17:33 crc kubenswrapper[5004]: E1201 08:17:33.759156 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 08:17:33 crc kubenswrapper[5004]: I1201 08:17:33.772585 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:33 crc kubenswrapper[5004]: I1201 08:17:33.772620 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:33 crc kubenswrapper[5004]: I1201 08:17:33.772630 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:33 crc kubenswrapper[5004]: I1201 08:17:33.772674 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:33 crc kubenswrapper[5004]: I1201 08:17:33.772686 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:33Z","lastTransitionTime":"2025-12-01T08:17:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:33 crc kubenswrapper[5004]: I1201 08:17:33.875616 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:33 crc kubenswrapper[5004]: I1201 08:17:33.875662 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:33 crc kubenswrapper[5004]: I1201 08:17:33.875687 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:33 crc kubenswrapper[5004]: I1201 08:17:33.875705 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:33 crc kubenswrapper[5004]: I1201 08:17:33.875716 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:33Z","lastTransitionTime":"2025-12-01T08:17:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:33 crc kubenswrapper[5004]: I1201 08:17:33.978523 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:33 crc kubenswrapper[5004]: I1201 08:17:33.978598 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:33 crc kubenswrapper[5004]: I1201 08:17:33.978613 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:33 crc kubenswrapper[5004]: I1201 08:17:33.978633 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:33 crc kubenswrapper[5004]: I1201 08:17:33.978648 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:33Z","lastTransitionTime":"2025-12-01T08:17:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:34 crc kubenswrapper[5004]: I1201 08:17:34.081684 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:34 crc kubenswrapper[5004]: I1201 08:17:34.081745 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:34 crc kubenswrapper[5004]: I1201 08:17:34.081762 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:34 crc kubenswrapper[5004]: I1201 08:17:34.081786 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:34 crc kubenswrapper[5004]: I1201 08:17:34.081804 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:34Z","lastTransitionTime":"2025-12-01T08:17:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:34 crc kubenswrapper[5004]: I1201 08:17:34.092774 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-knmdv_15cdec0a-5925-4966-a30b-f60c503f633e/ovnkube-controller/0.log" Dec 01 08:17:34 crc kubenswrapper[5004]: I1201 08:17:34.096023 5004 generic.go:334] "Generic (PLEG): container finished" podID="15cdec0a-5925-4966-a30b-f60c503f633e" containerID="d0292bc1ff208619ec4cb469a8fbcddf2915fccba3a6ebc07d1ecc8bf1092007" exitCode=1 Dec 01 08:17:34 crc kubenswrapper[5004]: I1201 08:17:34.096065 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" event={"ID":"15cdec0a-5925-4966-a30b-f60c503f633e","Type":"ContainerDied","Data":"d0292bc1ff208619ec4cb469a8fbcddf2915fccba3a6ebc07d1ecc8bf1092007"} Dec 01 08:17:34 crc kubenswrapper[5004]: I1201 08:17:34.096787 5004 scope.go:117] "RemoveContainer" containerID="d0292bc1ff208619ec4cb469a8fbcddf2915fccba3a6ebc07d1ecc8bf1092007" Dec 01 08:17:34 crc kubenswrapper[5004]: I1201 08:17:34.116224 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2fa377e-a3a6-46ce-af23-e677799f0115\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b520771cf1ad6f9360064c5f304243c5878a2f1025c87d1001c97b2d5f95cb9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b65b9bc444101900c62fd90c7d45789d186a89148ac0fd9c7f0c6048c3f55ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edbd3e682d207b8c18d6bcbc26aa2adae2a35645502d31b327ffddd51991e842\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4563974ed1467a87aab104756828e63853c282a61c4fcab1e757eb4da9afaa40\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:34Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:34 crc kubenswrapper[5004]: I1201 08:17:34.184802 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:34 crc kubenswrapper[5004]: I1201 08:17:34.184833 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:34 crc kubenswrapper[5004]: I1201 08:17:34.184841 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:34 crc kubenswrapper[5004]: I1201 08:17:34.184859 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:34 crc kubenswrapper[5004]: I1201 08:17:34.184868 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:34Z","lastTransitionTime":"2025-12-01T08:17:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:17:34 crc kubenswrapper[5004]: I1201 08:17:34.288238 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:34 crc kubenswrapper[5004]: I1201 08:17:34.288282 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:34 crc kubenswrapper[5004]: I1201 08:17:34.288301 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:34 crc kubenswrapper[5004]: I1201 08:17:34.288323 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:34 crc kubenswrapper[5004]: I1201 08:17:34.288345 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:34Z","lastTransitionTime":"2025-12-01T08:17:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:34 crc kubenswrapper[5004]: I1201 08:17:34.314629 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dpkxw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddcaf111-708a-45b3-a342-effd3061ab17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68ecba515f05ca83fdd0cdda10e3e5925a146aadb70ae17859586c12daf55dac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd2f03a8020818d1cc98b6eeaf64a728e627a3401400c3af53654f8771ef02eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd2f03a8020818d1cc98b6eeaf64a728e627a3401400c3af53654f8771ef02eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8dd31c10eaf6bc673dec377a884040524418dd1891a96b2eb6e6e983979512a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://b8dd31c10eaf6bc673dec377a884040524418dd1891a96b2eb6e6e983979512a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a7a8ef66a9486ca3d13826df7aa39db2083cff2c386b4d0d858e0526e0f094d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a7a8ef66a9486ca3d13826df7aa39db2083cff2c386b4d0d858e0526e0f094d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd1c47c976c17bf17e4d660288ce30472372b6a7f33f6ffa6d96c1a4436275d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd1c47c976c17bf17e4d660288ce30472372b6a7f33f6ffa6d96c1a4436275d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b7c8e9648e5514ab1fcb63dc72822dd276a2d74987e2fb4f4800b26ec176be7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b7c8e9648e5514ab1fcb63dc72822dd276a2d74987e2fb4f4800b26ec176be7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d7115db7fe248437f4b8918f06c9aa48b141fcd3f31add80faee81b4347e47b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d7115db7fe248437f4b8918f06c9aa48b141fcd3f31add80faee81b4347e47b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dpkxw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:34Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:34 crc kubenswrapper[5004]: I1201 08:17:34.327699 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ww6lq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72500ffd-4ca3-4614-a3a2-bbdc5a7506c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://668cfaa2af20e2bb082fc47e1702cfd9f704c4fdf56a4d27cf25d6915e7cd18a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev
@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmsvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ww6lq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:34Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:34 crc kubenswrapper[5004]: I1201 08:17:34.354957 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0ed2b6d-0c61-4639-bc3b-1c8effc4815d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"message\\\":\\\"containers 
with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45a0d7a12e7c114de1740a151e04f7f39844b709740cab958f156ac918af3228\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d69a370af9f8f6c575f99da6bb01cee0f660dccb7e85f16f5d8874d0e263e1fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c82baee0fd931fd7b41cd5a73ca7e653ef13dfa3d573991cbe3238d6b95f6fac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa5f341e99100e985f65c24647d976a24782dfae7f52e752b774f41f69af27fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad581564fc519328ae429bce757797f6d87d8648e862b008637a245866ec4cbe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T08:17:20Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 08:17:15.185125 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 08:17:15.186793 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4071633747/tls.crt::/tmp/serving-cert-4071633747/tls.key\\\\\\\"\\\\nI1201 08:17:20.827614 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 08:17:20.834528 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 08:17:20.834549 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 08:17:20.837611 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 08:17:20.837630 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 08:17:20.848940 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 08:17:20.848957 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 08:17:20.848961 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 08:17:20.848965 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 08:17:20.848968 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 08:17:20.848970 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 08:17:20.848973 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 08:17:20.849115 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 08:17:20.854990 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2caeb2ebaf68110cb8728e60aae6db1b1fc48efae6018f66070766a4f42caa69\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2276a185fefa99f4e9fe63b7acce901ce2eb168f71a65f5fa97c1892aebce0e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2276a185fefa99f4e9fe63b7acce901ce2eb168f71a65f5fa97c1892aebce0e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:34Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:34 crc kubenswrapper[5004]: I1201 08:17:34.370974 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:34 crc kubenswrapper[5004]: I1201 08:17:34.371006 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:34 crc kubenswrapper[5004]: I1201 08:17:34.371019 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:34 crc kubenswrapper[5004]: I1201 08:17:34.371034 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:34 crc kubenswrapper[5004]: I1201 08:17:34.371044 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:34Z","lastTransitionTime":"2025-12-01T08:17:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:34 crc kubenswrapper[5004]: I1201 08:17:34.376910 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:34Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:34 crc kubenswrapper[5004]: E1201 08:17:34.384340 5004 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:17:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:17:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:34Z\\\",\\\"message\\\":\\\"kubelet 
has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:17:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:17:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800
f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\
":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc300
5909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c8341267-df98-484f-a5e7-cf024fab437c\\\",\\\"systemUUID\\\":\\\"77547a65-c048-47ee-89b1-8422fc81b7aa\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:34Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:34 crc kubenswrapper[5004]: I1201 08:17:34.389310 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:34 crc kubenswrapper[5004]: I1201 08:17:34.389341 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:34 crc kubenswrapper[5004]: I1201 08:17:34.389350 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:34 crc kubenswrapper[5004]: I1201 08:17:34.389364 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:34 crc kubenswrapper[5004]: I1201 08:17:34.389374 5004 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:34Z","lastTransitionTime":"2025-12-01T08:17:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:17:34 crc kubenswrapper[5004]: I1201 08:17:34.393828 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03c470edd984db3b756e031e33f64a9bc5ca6b9c156ab479e5cd647745655a5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:34Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:34 crc kubenswrapper[5004]: E1201 08:17:34.402859 5004 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:17:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:17:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:34Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:17:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:17:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c8341267-df98-484f-a5e7-cf024fab437c\\\",\\\"systemUUID\\\":\\\"77547a65-c048-47ee-89b1-8422fc81b7aa\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:34Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:34 crc kubenswrapper[5004]: I1201 08:17:34.406749 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:34 crc kubenswrapper[5004]: I1201 08:17:34.406857 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:34 crc kubenswrapper[5004]: I1201 08:17:34.406877 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:34 crc kubenswrapper[5004]: I1201 08:17:34.406901 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:34 crc kubenswrapper[5004]: I1201 08:17:34.406921 5004 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:34Z","lastTransitionTime":"2025-12-01T08:17:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:17:34 crc kubenswrapper[5004]: I1201 08:17:34.409670 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fd2f2b5d7dc3d7b8e1bb06fdf0eec3addfec3c230e044b6395d6cf535630ebd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b2250161e400b201f0528e33231c9007342002a37cd037a1b1e011f2aaaa22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:34Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:34 crc kubenswrapper[5004]: E1201 08:17:34.421582 5004 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:17:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:17:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:17:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:17:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c8341267-df98-484f-a5e7-cf024fab437c\\\",\\\"systemUUID\\\":\\\"77547a65-c048-47ee-89b1-8422fc81b7aa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:34Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:34 crc kubenswrapper[5004]: I1201 08:17:34.423479 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zjksw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70e79009-93be-49c4-a6b3-e8a06bcea7f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad3ea204ad731500a340e639f0e36abe28ba50464f1b2ff9b009e39c234cf708\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\
\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd8m6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-zjksw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:34Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:34 crc kubenswrapper[5004]: I1201 08:17:34.426380 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:34 crc kubenswrapper[5004]: I1201 08:17:34.426411 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:34 crc kubenswrapper[5004]: I1201 08:17:34.426419 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:34 crc kubenswrapper[5004]: I1201 08:17:34.426434 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:34 crc kubenswrapper[5004]: I1201 08:17:34.426443 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:34Z","lastTransitionTime":"2025-12-01T08:17:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:34 crc kubenswrapper[5004]: E1201 08:17:34.447072 5004 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:17:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:17:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:17:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:17:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c8341267-df98-484f-a5e7-cf024fab437c\\\",\\\"systemUUID\\\":\\\"77547a65-c048-47ee-89b1-8422fc81b7aa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:34Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:34 crc kubenswrapper[5004]: I1201 08:17:34.451328 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:34 crc kubenswrapper[5004]: I1201 08:17:34.451401 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:34 crc kubenswrapper[5004]: I1201 08:17:34.451423 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:34 crc kubenswrapper[5004]: I1201 08:17:34.451453 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:34 crc kubenswrapper[5004]: I1201 08:17:34.451473 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:34Z","lastTransitionTime":"2025-12-01T08:17:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:34 crc kubenswrapper[5004]: I1201 08:17:34.452333 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15cdec0a-5925-4966-a30b-f60c503f633e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f528b94ff12c7a804930dca4b93c23334a9ae3a53317750cd34b5695f08b4582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f001b70a83da28b1ea78a94f1cd70dfc63b3de5a21b041932d33c2ac970c986b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7c5c043438b515e37a6d6d5c47b92479bbb7322fe35c3a1bc57ee7b3a746544\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23df33e45cc27e4d343ae6ec3fb88f2dfc125ba7da424515d4bee5ec19281832\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4a7cab8e1594814a687b1433a92642a19eff7c2eb12f96fa06f08a2e270733d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1844a18a6109c25a25b9c8cb3ff65e69cc657f66be03dd97e883d24a774b7638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0292bc1ff208619ec4cb469a8fbcddf2915fccba3a6ebc07d1ecc8bf1092007\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0292bc1ff208619ec4cb469a8fbcddf2915fccba3a6ebc07d1ecc8bf1092007\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T08:17:33Z\\\",\\\"message\\\":\\\"al\\\\nI1201 08:17:33.210759 6283 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1201 08:17:33.210788 6283 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1201 08:17:33.211067 6283 handler.go:208] Removed 
*v1.NetworkPolicy event handler 4\\\\nI1201 08:17:33.211080 6283 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1201 08:17:33.211080 6283 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1201 08:17:33.211177 6283 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1201 08:17:33.211209 6283 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1201 08:17:33.211241 6283 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1201 08:17:33.211253 6283 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1201 08:17:33.211264 6283 handler.go:208] Removed *v1.Node event handler 2\\\\nI1201 08:17:33.211283 6283 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1201 08:17:33.211300 6283 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1201 08:17:33.211310 6283 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1201 08:17:33.211316 6283 factory.go:656] Stopping watch factory\\\\nI1201 08:17:33.211312 6283 handler.go:208] Removed *v1.Node event handler 7\\\\nI1201 08:17:33.211333 6283 ovnkube.go:599] Stopped ovnkube\\\\nI1201 
08:17:3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":
\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f231e0092e1a85cc5d6e00fb8d3bd982223b670191ac3dc21d81ff1fab462a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97c9fab31c2655b5d579a2ed9b837229ee678973643c6a526c1241e3c84e315c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97c9fab31c2655b5d579a2ed9b837229ee678973643c6a526c1241e3c84e
315c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-knmdv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:34Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:34 crc kubenswrapper[5004]: E1201 08:17:34.465525 5004 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:17:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:17:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:17:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:17:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c8341267-df98-484f-a5e7-cf024fab437c\\\",\\\"systemUUID\\\":\\\"77547a65-c048-47ee-89b1-8422fc81b7aa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:34Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:34 crc kubenswrapper[5004]: E1201 08:17:34.465915 5004 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 01 08:17:34 crc kubenswrapper[5004]: I1201 08:17:34.467772 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:34 crc kubenswrapper[5004]: I1201 08:17:34.467958 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:34 crc kubenswrapper[5004]: I1201 08:17:34.468056 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:34 crc kubenswrapper[5004]: I1201 08:17:34.468151 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:34 crc kubenswrapper[5004]: I1201 08:17:34.468239 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:34Z","lastTransitionTime":"2025-12-01T08:17:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:34 crc kubenswrapper[5004]: I1201 08:17:34.471022 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://327477e7db10129d47ddf9078f32d0df2ffd4b8e59fc7bb77c17c60c63e9a936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:34Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:34 crc kubenswrapper[5004]: I1201 08:17:34.484877 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:34Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:34 crc kubenswrapper[5004]: I1201 08:17:34.497254 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9977ebb-82de-4e96-8763-0b5a84f8d4ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://baf84c2eea7167eed95a4195d37bf784ffef310bc2079d8d06e1f47cb45a7864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pwgxq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c1e51fa92aec80e93d0fce11c743b8c4fde23fc
23d8626e8d5b3e25ab42350d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pwgxq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fvdgt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:34Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:34 crc kubenswrapper[5004]: I1201 08:17:34.523971 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43c9f762-d403-49b9-8294-d66976aca4dd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b358f641ff09bc8f4a452b0d4a8cf5f9790182f76779be31426d48d0278b0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d610c3a799d4a7df9ddfe2a28f2ced2e5ac4e96c67c93f8e6d3a0eee60d9ac48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43f9f72af3273da055eeb1e13ad6258a1b3b6b205402861beee9f87f02230297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38e3cc2eca559417c71beed561569c4dc5b45ffb8407976e0695fc1b93219d37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c24fbeedbc7c40b6079b7ad89d8564a4f2cdaad2cf996e66682b082382da190\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00e163cd374befc8b8c6cdfe3916235cd63280474c2a09eeebf108fb75d378a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00e163cd374befc8b8c6cdfe3916235cd63280474c2a09eeebf108fb75d378a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-01T08:17:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37376e093170ad4138e102e3a1e2c2b45b3944bf36b0d77dce126bd493d60803\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37376e093170ad4138e102e3a1e2c2b45b3944bf36b0d77dce126bd493d60803\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e4db8d673550f14f3614e47dc0bf81e0920753be13fb01c7ef0d6bd3716611f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4db8d673550f14f3614e47dc0bf81e0920753be13fb01c7ef0d6bd3716611f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:34Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:34 crc kubenswrapper[5004]: I1201 08:17:34.538214 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:34Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:34 crc kubenswrapper[5004]: I1201 08:17:34.548614 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jjms6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8828af41-beeb-47dd-96cf-3dbcb5175893\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b22220849cdb91592a44a67f9f6a09586146009483f46a0505a2dea3b02bccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvsch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jjms6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:34Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:34 crc kubenswrapper[5004]: I1201 08:17:34.570878 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:34 crc kubenswrapper[5004]: I1201 08:17:34.571241 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:34 crc kubenswrapper[5004]: I1201 08:17:34.571379 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:34 crc kubenswrapper[5004]: I1201 08:17:34.571499 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:34 crc kubenswrapper[5004]: I1201 08:17:34.571707 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:34Z","lastTransitionTime":"2025-12-01T08:17:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:34 crc kubenswrapper[5004]: I1201 08:17:34.674635 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:34 crc kubenswrapper[5004]: I1201 08:17:34.674867 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:34 crc kubenswrapper[5004]: I1201 08:17:34.675046 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:34 crc kubenswrapper[5004]: I1201 08:17:34.675303 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:34 crc kubenswrapper[5004]: I1201 08:17:34.675523 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:34Z","lastTransitionTime":"2025-12-01T08:17:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:34 crc kubenswrapper[5004]: I1201 08:17:34.778026 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:34 crc kubenswrapper[5004]: I1201 08:17:34.778066 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:34 crc kubenswrapper[5004]: I1201 08:17:34.778082 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:34 crc kubenswrapper[5004]: I1201 08:17:34.778103 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:34 crc kubenswrapper[5004]: I1201 08:17:34.778120 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:34Z","lastTransitionTime":"2025-12-01T08:17:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:34 crc kubenswrapper[5004]: I1201 08:17:34.879946 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:34 crc kubenswrapper[5004]: I1201 08:17:34.880002 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:34 crc kubenswrapper[5004]: I1201 08:17:34.880064 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:34 crc kubenswrapper[5004]: I1201 08:17:34.880091 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:34 crc kubenswrapper[5004]: I1201 08:17:34.880107 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:34Z","lastTransitionTime":"2025-12-01T08:17:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:34 crc kubenswrapper[5004]: I1201 08:17:34.982255 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:34 crc kubenswrapper[5004]: I1201 08:17:34.982300 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:34 crc kubenswrapper[5004]: I1201 08:17:34.982311 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:34 crc kubenswrapper[5004]: I1201 08:17:34.982327 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:34 crc kubenswrapper[5004]: I1201 08:17:34.982339 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:34Z","lastTransitionTime":"2025-12-01T08:17:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:35 crc kubenswrapper[5004]: I1201 08:17:35.085352 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:35 crc kubenswrapper[5004]: I1201 08:17:35.085389 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:35 crc kubenswrapper[5004]: I1201 08:17:35.085398 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:35 crc kubenswrapper[5004]: I1201 08:17:35.085412 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:35 crc kubenswrapper[5004]: I1201 08:17:35.085422 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:35Z","lastTransitionTime":"2025-12-01T08:17:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:35 crc kubenswrapper[5004]: I1201 08:17:35.112343 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-knmdv_15cdec0a-5925-4966-a30b-f60c503f633e/ovnkube-controller/0.log" Dec 01 08:17:35 crc kubenswrapper[5004]: I1201 08:17:35.115806 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" event={"ID":"15cdec0a-5925-4966-a30b-f60c503f633e","Type":"ContainerStarted","Data":"404ed4cf3ada652b914fc9fd3295d149810f22a6cc4ea044f934ba8ee2595209"} Dec 01 08:17:35 crc kubenswrapper[5004]: I1201 08:17:35.116374 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" Dec 01 08:17:35 crc kubenswrapper[5004]: I1201 08:17:35.129280 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mzsvz"] Dec 01 08:17:35 crc kubenswrapper[5004]: I1201 08:17:35.129713 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mzsvz" Dec 01 08:17:35 crc kubenswrapper[5004]: I1201 08:17:35.129874 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2fa377e-a3a6-46ce-af23-e677799f0115\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b520771cf1ad6f9360064c5f304243c5878a2f1025c87d1001c97b2d5f95cb9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"ce
rt-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b65b9bc444101900c62fd90c7d45789d186a89148ac0fd9c7f0c6048c3f55ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edbd3e682d207b8c18d6bcbc26aa2adae2a35645502d31b327ffddd51991e842\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4563974ed1467a87aab104756828e63853c282a61c4fcab1e757eb4da9afaa40\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshi
ft-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:35Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:35 crc kubenswrapper[5004]: I1201 08:17:35.131813 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 01 08:17:35 crc kubenswrapper[5004]: I1201 08:17:35.132333 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 01 08:17:35 crc kubenswrapper[5004]: I1201 08:17:35.149454 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dpkxw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddcaf111-708a-45b3-a342-effd3061ab17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68ecba515f05ca83fdd0cdda10e3e5925a146aadb70ae17859586c12daf55dac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd2f03a8020818d1cc98b6eeaf64a728e627a3401400c3af53654f8771ef02eb\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd2f03a8020818d1cc98b6eeaf64a728e627a3401400c3af53654f8771ef02eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8dd31c10eaf6bc673dec377a884040524418dd1891a96b2eb6e6e983979512a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8dd31c10eaf6bc673dec377a884040524418dd1891a96b2eb6e6e983979512a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a7a8ef66a9486ca3d13826df7aa39db2083cff2c386b4d0d858e0526e0f094d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a7a8ef66a9486ca3d13826df7aa39db2083cff2c386b4d0d858e0526e0f094d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd1c4
7c976c17bf17e4d660288ce30472372b6a7f33f6ffa6d96c1a4436275d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd1c47c976c17bf17e4d660288ce30472372b6a7f33f6ffa6d96c1a4436275d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b7c8e9648e5514ab1fcb63dc72822dd276a2d74987e2fb4f4800b26ec176be7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b7c8e9648e5514ab1fcb63dc72822dd276a2d74987e2fb4f4800b26ec176be7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:28Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d7115db7fe248437f4b8918f06c9aa48b141fcd3f31add80faee81b4347e47b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d7115db7fe248437f4b8918f06c9aa48b141fcd3f31add80faee81b4347e47b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dpkxw\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:35Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:35 crc kubenswrapper[5004]: I1201 08:17:35.157295 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 08:17:35 crc kubenswrapper[5004]: I1201 08:17:35.161353 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ww6lq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72500ffd-4ca3-4614-a3a2-bbdc5a7506c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://668cfaa2af20e2bb082fc47e1702cfd9f704c4fdf56a4d27cf25d6915e7cd18a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc0
86a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmsvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ww6lq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:35Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:35 crc kubenswrapper[5004]: I1201 08:17:35.176400 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0ed2b6d-0c61-4639-bc3b-1c8effc4815d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45a0d7a12e7c114de1740a151e04f7f39844b709740cab958f156ac918af3228\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d69a370af9f8f6c575f99da6bb01cee0f660dccb7e85f16f5d8874d0e263e1fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c82baee0fd931fd7b41cd5a73ca7e653ef13dfa3d573991cbe3238d6b95f6fac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa5f341e99100e985f65c24647d976a24782dfae7f52e752b774f41f69af27fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad581564fc519328ae429bce757797f6d87d8648e862b008637a245866ec4cbe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T08:17:20Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 08:17:15.185125 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 08:17:15.186793 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4071633747/tls.crt::/tmp/serving-cert-4071633747/tls.key\\\\\\\"\\\\nI1201 08:17:20.827614 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 08:17:20.834528 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 08:17:20.834549 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 08:17:20.837611 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 08:17:20.837630 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 08:17:20.848940 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 08:17:20.848957 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 08:17:20.848961 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 08:17:20.848965 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 08:17:20.848968 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 08:17:20.848970 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 08:17:20.848973 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 08:17:20.849115 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 08:17:20.854990 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2caeb2ebaf68110cb8728e60aae6db1b1fc48efae6018f66070766a4f42caa69\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2276a185fefa99f4e9fe63b7acce901ce2eb168f71a65f5fa97c1892aebce0e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2276a185fefa99f4e9fe63b7acce901ce2eb168f71a65f5fa97c1892aebce0e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:35Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:35 crc kubenswrapper[5004]: I1201 08:17:35.188491 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:35 crc kubenswrapper[5004]: I1201 08:17:35.188541 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:35 crc kubenswrapper[5004]: I1201 08:17:35.188556 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:35 crc kubenswrapper[5004]: I1201 08:17:35.188593 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:35 crc kubenswrapper[5004]: I1201 08:17:35.188607 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:35Z","lastTransitionTime":"2025-12-01T08:17:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:35 crc kubenswrapper[5004]: I1201 08:17:35.189685 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://327477e7db10129d47ddf9078f32d0df2ffd4b8e59fc7bb77c17c60c63e9a936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:35Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:35 crc kubenswrapper[5004]: I1201 08:17:35.200831 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:35Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:35 crc kubenswrapper[5004]: I1201 08:17:35.210897 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/397b51b7-934a-41d1-a593-500a64161bd9-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-mzsvz\" (UID: \"397b51b7-934a-41d1-a593-500a64161bd9\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mzsvz" Dec 01 08:17:35 crc kubenswrapper[5004]: I1201 08:17:35.211090 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4h49g\" (UniqueName: \"kubernetes.io/projected/397b51b7-934a-41d1-a593-500a64161bd9-kube-api-access-4h49g\") pod \"ovnkube-control-plane-749d76644c-mzsvz\" (UID: \"397b51b7-934a-41d1-a593-500a64161bd9\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mzsvz" Dec 01 08:17:35 crc kubenswrapper[5004]: I1201 08:17:35.211165 5004 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/397b51b7-934a-41d1-a593-500a64161bd9-env-overrides\") pod \"ovnkube-control-plane-749d76644c-mzsvz\" (UID: \"397b51b7-934a-41d1-a593-500a64161bd9\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mzsvz" Dec 01 08:17:35 crc kubenswrapper[5004]: I1201 08:17:35.211278 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/397b51b7-934a-41d1-a593-500a64161bd9-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-mzsvz\" (UID: \"397b51b7-934a-41d1-a593-500a64161bd9\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mzsvz" Dec 01 08:17:35 crc kubenswrapper[5004]: I1201 08:17:35.213106 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:35Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:35 crc kubenswrapper[5004]: I1201 08:17:35.225200 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03c470edd984db3b756e031e33f64a9bc5ca6b9c156ab479e5cd647745655a5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T08:17:35Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:35 crc kubenswrapper[5004]: I1201 08:17:35.235937 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fd2f2b5d7dc3d7b8e1bb06fdf0eec3addfec3c230e044b6395d6cf535630ebd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b2250161e400b
201f0528e33231c9007342002a37cd037a1b1e011f2aaaa22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:35Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:35 crc kubenswrapper[5004]: I1201 08:17:35.253887 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zjksw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"70e79009-93be-49c4-a6b3-e8a06bcea7f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad3ea204ad731500a340e639f0e36abe28ba50464f1b2ff9b009e39c234cf708\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd8m6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zjksw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:35Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:35 crc kubenswrapper[5004]: I1201 08:17:35.279441 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15cdec0a-5925-4966-a30b-f60c503f633e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f528b94ff12c7a804930dca4b93c23334a9ae3a53317750cd34b5695f08b4582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f001b70a83da28b1ea78a94f1cd70dfc63b3de5a21b041932d33c2ac970c986b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7c5c043438b515e37a6d6d5c47b92479bbb7322fe35c3a1bc57ee7b3a746544\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23df33e45cc27e4d343ae6ec3fb88f2dfc125ba7da424515d4bee5ec19281832\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4a7cab8e1594814a687b1433a92642a19eff7c2eb12f96fa06f08a2e270733d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1844a18a6109c25a25b9c8cb3ff65e69cc657f66be03dd97e883d24a774b7638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://404ed4cf3ada652b914fc9fd3295d149810f22a6cc4ea044f934ba8ee2595209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0292bc1ff208619ec4cb469a8fbcddf2915fccba3a6ebc07d1ecc8bf1092007\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T08:17:33Z\\\",\\\"message\\\":\\\"al\\\\nI1201 08:17:33.210759 6283 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1201 08:17:33.210788 6283 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1201 08:17:33.211067 6283 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1201 08:17:33.211080 6283 handler.go:208] Removed *v1.Pod event handler 
6\\\\nI1201 08:17:33.211080 6283 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1201 08:17:33.211177 6283 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1201 08:17:33.211209 6283 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1201 08:17:33.211241 6283 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1201 08:17:33.211253 6283 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1201 08:17:33.211264 6283 handler.go:208] Removed *v1.Node event handler 2\\\\nI1201 08:17:33.211283 6283 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1201 08:17:33.211300 6283 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1201 08:17:33.211310 6283 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1201 08:17:33.211316 6283 factory.go:656] Stopping watch factory\\\\nI1201 08:17:33.211312 6283 handler.go:208] Removed *v1.Node event handler 7\\\\nI1201 08:17:33.211333 6283 ovnkube.go:599] Stopped ovnkube\\\\nI1201 
08:17:3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:30Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"na
me\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f231e0092e1a85cc5d6e00fb8d3bd982223b670191ac3dc21d81ff1fab462a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97c9fab31c2655b5d579a2ed9b837229ee678973643c6a526c1241e3c84e315c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97c9fab31c2655b5d579a2ed9b837229ee678973643c6a526c1241e3c84e315c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-knmdv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:35Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:35 crc kubenswrapper[5004]: I1201 08:17:35.290629 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:35 crc kubenswrapper[5004]: I1201 08:17:35.290686 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:35 crc kubenswrapper[5004]: I1201 08:17:35.290702 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:35 crc kubenswrapper[5004]: I1201 08:17:35.290723 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:35 crc kubenswrapper[5004]: I1201 08:17:35.290738 5004 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:35Z","lastTransitionTime":"2025-12-01T08:17:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:17:35 crc kubenswrapper[5004]: I1201 08:17:35.301131 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43c9f762-d403-49b9-8294-d66976aca4dd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b358f641ff09bc8f4a452b0d4a8cf5f9790182f76779be31426d48d0278b0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d610c3a799d4a7df9ddfe2a28f2ced2e5ac4e96c67c93f8e6d3a0eee60d9ac48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43f9f72af3273da055eeb1e13ad6258a1b3b6b205402861beee9f87f02230297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38e3cc2eca559417c71beed561569c4dc5b45ffb8407976e0695fc1b93219d37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c24fbeedbc7c40b6079b7ad89d8564a4f2cdaad2cf996e66682b082382da190\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00e163cd374befc8b8c6cdfe3916235cd63280474c2a09eeebf108fb75d378a6\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00e163cd374befc8b8c6cdfe3916235cd63280474c2a09eeebf108fb75d378a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37376e093170ad4138e102e3a1e2c2b45b3944bf36b0d77dce126bd493d60803\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37376e093170ad4138e102e3a1e2c2b45b3944bf36b0d77dce126bd493d60803\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e4db8d673550f14f3614e47dc0bf81e0920753be13fb01c7ef0d6bd3716611f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4db8d673550f14f3614e47dc0bf81e0920753be13fb01c7ef0d6bd3716611f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:35Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:35 crc kubenswrapper[5004]: I1201 08:17:35.312335 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4h49g\" (UniqueName: \"kubernetes.io/projected/397b51b7-934a-41d1-a593-500a64161bd9-kube-api-access-4h49g\") pod \"ovnkube-control-plane-749d76644c-mzsvz\" (UID: \"397b51b7-934a-41d1-a593-500a64161bd9\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mzsvz" Dec 01 08:17:35 crc kubenswrapper[5004]: I1201 08:17:35.312379 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/397b51b7-934a-41d1-a593-500a64161bd9-env-overrides\") pod \"ovnkube-control-plane-749d76644c-mzsvz\" (UID: \"397b51b7-934a-41d1-a593-500a64161bd9\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mzsvz" Dec 01 08:17:35 crc kubenswrapper[5004]: I1201 08:17:35.312423 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/397b51b7-934a-41d1-a593-500a64161bd9-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-mzsvz\" (UID: \"397b51b7-934a-41d1-a593-500a64161bd9\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mzsvz" Dec 01 08:17:35 crc kubenswrapper[5004]: I1201 08:17:35.312457 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/397b51b7-934a-41d1-a593-500a64161bd9-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-mzsvz\" (UID: \"397b51b7-934a-41d1-a593-500a64161bd9\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mzsvz" Dec 01 08:17:35 crc kubenswrapper[5004]: I1201 08:17:35.313125 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/397b51b7-934a-41d1-a593-500a64161bd9-env-overrides\") pod \"ovnkube-control-plane-749d76644c-mzsvz\" (UID: \"397b51b7-934a-41d1-a593-500a64161bd9\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mzsvz" Dec 01 08:17:35 crc kubenswrapper[5004]: I1201 08:17:35.313154 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/397b51b7-934a-41d1-a593-500a64161bd9-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-mzsvz\" (UID: \"397b51b7-934a-41d1-a593-500a64161bd9\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mzsvz" Dec 01 08:17:35 crc kubenswrapper[5004]: I1201 08:17:35.314743 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:35Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:35 crc kubenswrapper[5004]: I1201 08:17:35.319496 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/397b51b7-934a-41d1-a593-500a64161bd9-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-mzsvz\" (UID: \"397b51b7-934a-41d1-a593-500a64161bd9\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mzsvz" Dec 01 08:17:35 crc kubenswrapper[5004]: I1201 08:17:35.331124 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4h49g\" (UniqueName: \"kubernetes.io/projected/397b51b7-934a-41d1-a593-500a64161bd9-kube-api-access-4h49g\") pod \"ovnkube-control-plane-749d76644c-mzsvz\" (UID: \"397b51b7-934a-41d1-a593-500a64161bd9\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mzsvz" Dec 01 08:17:35 crc kubenswrapper[5004]: I1201 
08:17:35.336041 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jjms6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8828af41-beeb-47dd-96cf-3dbcb5175893\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b22220849cdb91592a44a67f9f6a09586146009483f46a0505a2dea3b02bccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvsch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"19
2.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jjms6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:35Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:35 crc kubenswrapper[5004]: I1201 08:17:35.350438 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9977ebb-82de-4e96-8763-0b5a84f8d4ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://baf84c2eea7167eed95a4195d37bf784ffef310bc2079d8d06e1f47cb45a7864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d
793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pwgxq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c1e51fa92aec80e93d0fce11c743b8c4fde23fc23d8626e8d5b3e25ab42350d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pwgxq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fvdgt\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:35Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:35 crc kubenswrapper[5004]: I1201 08:17:35.420545 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:35 crc kubenswrapper[5004]: I1201 08:17:35.420627 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:35 crc kubenswrapper[5004]: I1201 08:17:35.420643 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:35 crc kubenswrapper[5004]: I1201 08:17:35.420664 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:35 crc kubenswrapper[5004]: I1201 08:17:35.420678 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:35Z","lastTransitionTime":"2025-12-01T08:17:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:35 crc kubenswrapper[5004]: I1201 08:17:35.438310 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2fa377e-a3a6-46ce-af23-e677799f0115\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b520771cf1ad6f9360064c5f304243c5878a2f1025c87d1001c97b2d5f95cb9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b65b9bc444
101900c62fd90c7d45789d186a89148ac0fd9c7f0c6048c3f55ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edbd3e682d207b8c18d6bcbc26aa2adae2a35645502d31b327ffddd51991e842\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4563974ed1467a87aab104756828e63853c282a61c4fcab1e757eb4da9afaa40\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:35Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:35 crc kubenswrapper[5004]: I1201 08:17:35.440454 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mzsvz" Dec 01 08:17:35 crc kubenswrapper[5004]: W1201 08:17:35.454021 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod397b51b7_934a_41d1_a593_500a64161bd9.slice/crio-39b9dc9b3afbd0113c15082df71c9033f42ab3443e2551215ace7fbbda8c1549 WatchSource:0}: Error finding container 39b9dc9b3afbd0113c15082df71c9033f42ab3443e2551215ace7fbbda8c1549: Status 404 returned error can't find the container with id 39b9dc9b3afbd0113c15082df71c9033f42ab3443e2551215ace7fbbda8c1549 Dec 01 08:17:35 crc kubenswrapper[5004]: I1201 08:17:35.463373 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dpkxw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddcaf111-708a-45b3-a342-effd3061ab17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68ecba515f05ca83fdd0cdda10e3e5925a146aadb70ae17859586c12daf55dac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b8
19eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd2f03a8020818d1cc98b6eeaf64a728e627a3401400c3af53654f8771ef02eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd2f03a8020818d1cc98b6eeaf64a728e627a3401400c3af53654f8771ef02eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\
\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8dd31c10eaf6bc673dec377a884040524418dd1891a96b2eb6e6e983979512a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8dd31c10eaf6bc673dec377a884040524418dd1891a96b2eb6e6e983979512a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a7a8ef66a9486ca3d13826df7aa39db2083cff2c386b4d0d858e0526e0f094d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\
\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a7a8ef66a9486ca3d13826df7aa39db2083cff2c386b4d0d858e0526e0f094d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd1c47c976c17bf17e4d660288ce30472372b6a7f33f6ffa6d96c1a4436275d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd1c47c976c17bf17e4d660288ce30472372b6a7f33f6ffa6d96c1a4436275d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"n
ame\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b7c8e9648e5514ab1fcb63dc72822dd276a2d74987e2fb4f4800b26ec176be7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b7c8e9648e5514ab1fcb63dc72822dd276a2d74987e2fb4f4800b26ec176be7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d7115db7fe248437f4b8918f06c9aa48b141fcd3f31add80faee81b4347e47b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d7115db7fe2484
37f4b8918f06c9aa48b141fcd3f31add80faee81b4347e47b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dpkxw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:35Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:35 crc kubenswrapper[5004]: I1201 08:17:35.479903 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ww6lq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72500ffd-4ca3-4614-a3a2-bbdc5a7506c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://668cfaa2af20e2bb082fc47e1702cfd9f704c4fdf56a4d27cf25d6915e7cd18a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmsvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ww6lq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:35Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:35 crc kubenswrapper[5004]: I1201 08:17:35.495876 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0ed2b6d-0c61-4639-bc3b-1c8effc4815d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45a0d7a12e7c114de1740a151e04f7f39844b709740cab958f156ac918af3228\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791f
d90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d69a370af9f8f6c575f99da6bb01cee0f660dccb7e85f16f5d8874d0e263e1fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c82baee0fd931fd7b41cd5a73ca7e653ef13dfa3d573991cbe3238d6b95f6fac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\"
:\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa5f341e99100e985f65c24647d976a24782dfae7f52e752b774f41f69af27fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad581564fc519328ae429bce757797f6d87d8648e862b008637a245866ec4cbe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T08:17:20Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 08:17:15.185125 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 08:17:15.186793 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4071633747/tls.crt::/tmp/serving-cert-4071633747/tls.key\\\\\\\"\\\\nI1201 08:17:20.827614 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 08:17:20.834528 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 08:17:20.834549 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 08:17:20.837611 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 08:17:20.837630 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 08:17:20.848940 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 08:17:20.848957 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 08:17:20.848961 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 08:17:20.848965 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 08:17:20.848968 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 08:17:20.848970 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 08:17:20.848973 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 08:17:20.849115 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 08:17:20.854990 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2caeb2ebaf68110cb8728e60aae6db1b1fc48efae6018f66070766a4f42caa69\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T0
8:17:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2276a185fefa99f4e9fe63b7acce901ce2eb168f71a65f5fa97c1892aebce0e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2276a185fefa99f4e9fe63b7acce901ce2eb168f71a65f5fa97c1892aebce0e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:35Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:35 crc kubenswrapper[5004]: I1201 08:17:35.513090 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03c470edd984db3b756e031e33f64a9bc5ca6b9c156ab479e5cd647745655a5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T08:17:35Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:35 crc kubenswrapper[5004]: I1201 08:17:35.523328 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:35 crc kubenswrapper[5004]: I1201 08:17:35.523373 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:35 crc kubenswrapper[5004]: I1201 08:17:35.523402 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:35 crc kubenswrapper[5004]: I1201 08:17:35.523420 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:35 crc kubenswrapper[5004]: I1201 08:17:35.523433 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:35Z","lastTransitionTime":"2025-12-01T08:17:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:35 crc kubenswrapper[5004]: I1201 08:17:35.527907 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fd2f2b5d7dc3d7b8e1bb06fdf0eec3addfec3c230e044b6395d6cf535630ebd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b2250161e400b201f0528e33231c9007342002a37cd037a1b1e011f2aaaa22\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:35Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:35 crc kubenswrapper[5004]: I1201 08:17:35.548300 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zjksw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"70e79009-93be-49c4-a6b3-e8a06bcea7f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad3ea204ad731500a340e639f0e36abe28ba50464f1b2ff9b009e39c234cf708\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd8m6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zjksw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:35Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:35 crc kubenswrapper[5004]: I1201 08:17:35.568621 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15cdec0a-5925-4966-a30b-f60c503f633e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f528b94ff12c7a804930dca4b93c23334a9ae3a53317750cd34b5695f08b4582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f001b70a83da28b1ea78a94f1cd70dfc63b3de5a21b041932d33c2ac970c986b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7c5c043438b515e37a6d6d5c47b92479bbb7322fe35c3a1bc57ee7b3a746544\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23df33e45cc27e4d343ae6ec3fb88f2dfc125ba7da424515d4bee5ec19281832\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4a7cab8e1594814a687b1433a92642a19eff7c2eb12f96fa06f08a2e270733d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1844a18a6109c25a25b9c8cb3ff65e69cc657f66be03dd97e883d24a774b7638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://404ed4cf3ada652b914fc9fd3295d149810f22a6cc4ea044f934ba8ee2595209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0292bc1ff208619ec4cb469a8fbcddf2915fccba3a6ebc07d1ecc8bf1092007\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T08:17:33Z\\\",\\\"message\\\":\\\"al\\\\nI1201 08:17:33.210759 6283 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1201 08:17:33.210788 6283 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1201 08:17:33.211067 6283 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1201 08:17:33.211080 6283 handler.go:208] Removed *v1.Pod event handler 
6\\\\nI1201 08:17:33.211080 6283 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1201 08:17:33.211177 6283 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1201 08:17:33.211209 6283 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1201 08:17:33.211241 6283 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1201 08:17:33.211253 6283 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1201 08:17:33.211264 6283 handler.go:208] Removed *v1.Node event handler 2\\\\nI1201 08:17:33.211283 6283 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1201 08:17:33.211300 6283 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1201 08:17:33.211310 6283 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1201 08:17:33.211316 6283 factory.go:656] Stopping watch factory\\\\nI1201 08:17:33.211312 6283 handler.go:208] Removed *v1.Node event handler 7\\\\nI1201 08:17:33.211333 6283 ovnkube.go:599] Stopped ovnkube\\\\nI1201 
08:17:3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:30Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"na
me\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f231e0092e1a85cc5d6e00fb8d3bd982223b670191ac3dc21d81ff1fab462a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97c9fab31c2655b5d579a2ed9b837229ee678973643c6a526c1241e3c84e315c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97c9fab31c2655b5d579a2ed9b837229ee678973643c6a526c1241e3c84e315c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-knmdv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:35Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:35 crc kubenswrapper[5004]: I1201 08:17:35.582913 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://327477e7db10129d47ddf9078f32d0df2ffd4b8e59fc7bb77c17c60c63e9a936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:35Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:35 crc kubenswrapper[5004]: I1201 08:17:35.596552 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:35Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:35 crc kubenswrapper[5004]: I1201 08:17:35.611359 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:35Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:35 crc kubenswrapper[5004]: I1201 08:17:35.623024 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mzsvz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"397b51b7-934a-41d1-a593-500a64161bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4h49g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4h49g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mzsvz\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:35Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:35 crc kubenswrapper[5004]: I1201 08:17:35.627224 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:35 crc kubenswrapper[5004]: I1201 08:17:35.627276 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:35 crc kubenswrapper[5004]: I1201 08:17:35.627303 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:35 crc kubenswrapper[5004]: I1201 08:17:35.627325 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:35 crc kubenswrapper[5004]: I1201 08:17:35.627338 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:35Z","lastTransitionTime":"2025-12-01T08:17:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:35 crc kubenswrapper[5004]: I1201 08:17:35.654825 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43c9f762-d403-49b9-8294-d66976aca4dd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b358f641ff09bc8f4a452b0d4a8cf5f9790182f76779be31426d48d0278b0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d610c3a799d4a7df9ddfe2a28f2ced2e5ac4e96c67c93f8e6d3a0eee60d9ac48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43f9f72af3273da055eeb1e13ad6258a1b3b6b205402861beee9f87f02230297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38e3cc2eca559417c71beed561569c4dc5b45ffb8407976e0695fc1b93219d37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c24fbeedbc7c40b6079b7ad89d8564a4f2cdaad2cf996e66682b082382da190\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00e163cd374befc8b8c6cdfe3916235cd63280474c2a09eeebf108fb75d378a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00e163cd374befc8b8c6cdfe3916235cd63280474c2a09eeebf108fb75d378a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37376e093170ad4138e102e3a1e2c2b45b3944bf36b0d77dce126bd493d60803\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37376e093170ad4138e102e3a1e2c2b45b3944bf36b0d77dce126bd493d60803\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e4db8d673550f14f3614e47dc0bf81e0920753be13fb01c7ef0d6bd3716611f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4db8d673550f14f3614e47dc0bf81e0920753be13fb01c7ef0d6bd3716611f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-01T08:17:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:35Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:35 crc kubenswrapper[5004]: I1201 08:17:35.668852 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:35Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:35 crc kubenswrapper[5004]: I1201 08:17:35.680029 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jjms6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8828af41-beeb-47dd-96cf-3dbcb5175893\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b22220849cdb91592a44a67f9f6a09586146009483f46a0505a2dea3b02bccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvsch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jjms6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:35Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:35 crc kubenswrapper[5004]: I1201 08:17:35.692045 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9977ebb-82de-4e96-8763-0b5a84f8d4ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://baf84c2eea7167eed95a4195d37bf784ffef310bc2079d8d06e1f47cb45a7864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pwgxq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c1e51fa92aec80e93d0fce11c743b8c4fde23fc23d8626e8d5b3e25ab42350d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pwgxq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fvdgt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:35Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:35 crc kubenswrapper[5004]: I1201 08:17:35.730190 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:35 crc kubenswrapper[5004]: I1201 08:17:35.730243 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:35 crc kubenswrapper[5004]: I1201 08:17:35.730252 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:35 crc kubenswrapper[5004]: I1201 08:17:35.730267 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:35 crc kubenswrapper[5004]: I1201 08:17:35.730320 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:35Z","lastTransitionTime":"2025-12-01T08:17:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:17:35 crc kubenswrapper[5004]: I1201 08:17:35.758085 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 08:17:35 crc kubenswrapper[5004]: I1201 08:17:35.758091 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 08:17:35 crc kubenswrapper[5004]: E1201 08:17:35.758311 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 08:17:35 crc kubenswrapper[5004]: I1201 08:17:35.758418 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 08:17:35 crc kubenswrapper[5004]: E1201 08:17:35.758542 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 08:17:35 crc kubenswrapper[5004]: E1201 08:17:35.758788 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 08:17:35 crc kubenswrapper[5004]: I1201 08:17:35.832977 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:35 crc kubenswrapper[5004]: I1201 08:17:35.833039 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:35 crc kubenswrapper[5004]: I1201 08:17:35.833056 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:35 crc kubenswrapper[5004]: I1201 08:17:35.833080 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:35 crc kubenswrapper[5004]: I1201 08:17:35.833098 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:35Z","lastTransitionTime":"2025-12-01T08:17:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:35 crc kubenswrapper[5004]: I1201 08:17:35.936173 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:35 crc kubenswrapper[5004]: I1201 08:17:35.936240 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:35 crc kubenswrapper[5004]: I1201 08:17:35.936264 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:35 crc kubenswrapper[5004]: I1201 08:17:35.936293 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:35 crc kubenswrapper[5004]: I1201 08:17:35.936315 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:35Z","lastTransitionTime":"2025-12-01T08:17:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:36 crc kubenswrapper[5004]: I1201 08:17:36.039397 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:36 crc kubenswrapper[5004]: I1201 08:17:36.039463 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:36 crc kubenswrapper[5004]: I1201 08:17:36.039481 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:36 crc kubenswrapper[5004]: I1201 08:17:36.039507 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:36 crc kubenswrapper[5004]: I1201 08:17:36.039533 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:36Z","lastTransitionTime":"2025-12-01T08:17:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:36 crc kubenswrapper[5004]: I1201 08:17:36.120692 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mzsvz" event={"ID":"397b51b7-934a-41d1-a593-500a64161bd9","Type":"ContainerStarted","Data":"39b9dc9b3afbd0113c15082df71c9033f42ab3443e2551215ace7fbbda8c1549"} Dec 01 08:17:36 crc kubenswrapper[5004]: I1201 08:17:36.142256 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:36 crc kubenswrapper[5004]: I1201 08:17:36.142317 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:36 crc kubenswrapper[5004]: I1201 08:17:36.142332 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:36 crc kubenswrapper[5004]: I1201 08:17:36.142356 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:36 crc kubenswrapper[5004]: I1201 08:17:36.142374 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:36Z","lastTransitionTime":"2025-12-01T08:17:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:36 crc kubenswrapper[5004]: I1201 08:17:36.245365 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:36 crc kubenswrapper[5004]: I1201 08:17:36.245414 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:36 crc kubenswrapper[5004]: I1201 08:17:36.245429 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:36 crc kubenswrapper[5004]: I1201 08:17:36.245453 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:36 crc kubenswrapper[5004]: I1201 08:17:36.245468 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:36Z","lastTransitionTime":"2025-12-01T08:17:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:36 crc kubenswrapper[5004]: I1201 08:17:36.348970 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:36 crc kubenswrapper[5004]: I1201 08:17:36.349010 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:36 crc kubenswrapper[5004]: I1201 08:17:36.349022 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:36 crc kubenswrapper[5004]: I1201 08:17:36.349039 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:36 crc kubenswrapper[5004]: I1201 08:17:36.349053 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:36Z","lastTransitionTime":"2025-12-01T08:17:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:36 crc kubenswrapper[5004]: I1201 08:17:36.451387 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:36 crc kubenswrapper[5004]: I1201 08:17:36.451468 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:36 crc kubenswrapper[5004]: I1201 08:17:36.451478 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:36 crc kubenswrapper[5004]: I1201 08:17:36.451512 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:36 crc kubenswrapper[5004]: I1201 08:17:36.451522 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:36Z","lastTransitionTime":"2025-12-01T08:17:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:36 crc kubenswrapper[5004]: I1201 08:17:36.554964 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:36 crc kubenswrapper[5004]: I1201 08:17:36.555028 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:36 crc kubenswrapper[5004]: I1201 08:17:36.555048 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:36 crc kubenswrapper[5004]: I1201 08:17:36.555070 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:36 crc kubenswrapper[5004]: I1201 08:17:36.555086 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:36Z","lastTransitionTime":"2025-12-01T08:17:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:36 crc kubenswrapper[5004]: I1201 08:17:36.657514 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:36 crc kubenswrapper[5004]: I1201 08:17:36.657624 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:36 crc kubenswrapper[5004]: I1201 08:17:36.657642 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:36 crc kubenswrapper[5004]: I1201 08:17:36.657674 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:36 crc kubenswrapper[5004]: I1201 08:17:36.657692 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:36Z","lastTransitionTime":"2025-12-01T08:17:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:36 crc kubenswrapper[5004]: I1201 08:17:36.760672 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:36 crc kubenswrapper[5004]: I1201 08:17:36.760722 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:36 crc kubenswrapper[5004]: I1201 08:17:36.760733 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:36 crc kubenswrapper[5004]: I1201 08:17:36.760756 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:36 crc kubenswrapper[5004]: I1201 08:17:36.760768 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:36Z","lastTransitionTime":"2025-12-01T08:17:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:36 crc kubenswrapper[5004]: I1201 08:17:36.863288 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:36 crc kubenswrapper[5004]: I1201 08:17:36.863369 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:36 crc kubenswrapper[5004]: I1201 08:17:36.863397 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:36 crc kubenswrapper[5004]: I1201 08:17:36.863427 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:36 crc kubenswrapper[5004]: I1201 08:17:36.863450 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:36Z","lastTransitionTime":"2025-12-01T08:17:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:36 crc kubenswrapper[5004]: I1201 08:17:36.966930 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:36 crc kubenswrapper[5004]: I1201 08:17:36.967001 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:36 crc kubenswrapper[5004]: I1201 08:17:36.967025 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:36 crc kubenswrapper[5004]: I1201 08:17:36.967054 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:36 crc kubenswrapper[5004]: I1201 08:17:36.967077 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:36Z","lastTransitionTime":"2025-12-01T08:17:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:17:37 crc kubenswrapper[5004]: I1201 08:17:37.038656 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-7cl5l"] Dec 01 08:17:37 crc kubenswrapper[5004]: I1201 08:17:37.039270 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7cl5l" Dec 01 08:17:37 crc kubenswrapper[5004]: E1201 08:17:37.039354 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7cl5l" podUID="b488f4f3-d385-4d40-bdee-96d8fe2d42a1" Dec 01 08:17:37 crc kubenswrapper[5004]: I1201 08:17:37.062253 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zjksw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70e79009-93be-49c4-a6b3-e8a06bcea7f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad3ea204ad731500a340e639f0e36abe28ba50464f1b2ff9b009e39c234cf708\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"
os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd8m6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zjksw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T08:17:37Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:37 crc kubenswrapper[5004]: I1201 08:17:37.070214 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:37 crc kubenswrapper[5004]: I1201 08:17:37.070268 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:37 crc kubenswrapper[5004]: I1201 08:17:37.070286 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:37 crc kubenswrapper[5004]: I1201 08:17:37.070310 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:37 crc kubenswrapper[5004]: I1201 08:17:37.070326 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:37Z","lastTransitionTime":"2025-12-01T08:17:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:37 crc kubenswrapper[5004]: I1201 08:17:37.093939 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15cdec0a-5925-4966-a30b-f60c503f633e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f528b94ff12c7a804930dca4b93c23334a9ae3a53317750cd34b5695f08b4582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f001b70a83da28b1ea78a94f1cd70dfc63b3de5a21b041932d33c2ac970c986b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7c5c043438b515e37a6d6d5c47b92479bbb7322fe35c3a1bc57ee7b3a746544\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23df33e45cc27e4d343ae6ec3fb88f2dfc125ba7da424515d4bee5ec19281832\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4a7cab8e1594814a687b1433a92642a19eff7c2eb12f96fa06f08a2e270733d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1844a18a6109c25a25b9c8cb3ff65e69cc657f66be03dd97e883d24a774b7638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://404ed4cf3ada652b914fc9fd3295d149810f22a6cc4ea044f934ba8ee2595209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0292bc1ff208619ec4cb469a8fbcddf2915fccba3a6ebc07d1ecc8bf1092007\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T08:17:33Z\\\",\\\"message\\\":\\\"al\\\\nI1201 08:17:33.210759 6283 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1201 08:17:33.210788 6283 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1201 08:17:33.211067 6283 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1201 08:17:33.211080 6283 handler.go:208] Removed *v1.Pod event handler 
6\\\\nI1201 08:17:33.211080 6283 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1201 08:17:33.211177 6283 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1201 08:17:33.211209 6283 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1201 08:17:33.211241 6283 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1201 08:17:33.211253 6283 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1201 08:17:33.211264 6283 handler.go:208] Removed *v1.Node event handler 2\\\\nI1201 08:17:33.211283 6283 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1201 08:17:33.211300 6283 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1201 08:17:33.211310 6283 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1201 08:17:33.211316 6283 factory.go:656] Stopping watch factory\\\\nI1201 08:17:33.211312 6283 handler.go:208] Removed *v1.Node event handler 7\\\\nI1201 08:17:33.211333 6283 ovnkube.go:599] Stopped ovnkube\\\\nI1201 
08:17:3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:30Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"na
me\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f231e0092e1a85cc5d6e00fb8d3bd982223b670191ac3dc21d81ff1fab462a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97c9fab31c2655b5d579a2ed9b837229ee678973643c6a526c1241e3c84e315c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97c9fab31c2655b5d579a2ed9b837229ee678973643c6a526c1241e3c84e315c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-knmdv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:37Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:37 crc kubenswrapper[5004]: I1201 08:17:37.116357 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://327477e7db10129d47ddf9078f32d0df2ffd4b8e59fc7bb77c17c60c63e9a936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:37Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:37 crc kubenswrapper[5004]: I1201 08:17:37.137952 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-knmdv_15cdec0a-5925-4966-a30b-f60c503f633e/ovnkube-controller/1.log" Dec 01 08:17:37 crc kubenswrapper[5004]: I1201 08:17:37.139211 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-knmdv_15cdec0a-5925-4966-a30b-f60c503f633e/ovnkube-controller/0.log" Dec 01 08:17:37 crc kubenswrapper[5004]: I1201 08:17:37.139722 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-glbhl\" (UniqueName: \"kubernetes.io/projected/b488f4f3-d385-4d40-bdee-96d8fe2d42a1-kube-api-access-glbhl\") pod \"network-metrics-daemon-7cl5l\" (UID: \"b488f4f3-d385-4d40-bdee-96d8fe2d42a1\") " pod="openshift-multus/network-metrics-daemon-7cl5l" Dec 01 08:17:37 crc kubenswrapper[5004]: I1201 08:17:37.139851 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b488f4f3-d385-4d40-bdee-96d8fe2d42a1-metrics-certs\") pod \"network-metrics-daemon-7cl5l\" (UID: \"b488f4f3-d385-4d40-bdee-96d8fe2d42a1\") " pod="openshift-multus/network-metrics-daemon-7cl5l" Dec 01 08:17:37 crc kubenswrapper[5004]: I1201 08:17:37.143459 5004 generic.go:334] "Generic (PLEG): container finished" podID="15cdec0a-5925-4966-a30b-f60c503f633e" containerID="404ed4cf3ada652b914fc9fd3295d149810f22a6cc4ea044f934ba8ee2595209" exitCode=1 Dec 01 08:17:37 crc kubenswrapper[5004]: I1201 08:17:37.143555 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" event={"ID":"15cdec0a-5925-4966-a30b-f60c503f633e","Type":"ContainerDied","Data":"404ed4cf3ada652b914fc9fd3295d149810f22a6cc4ea044f934ba8ee2595209"} Dec 
01 08:17:37 crc kubenswrapper[5004]: I1201 08:17:37.143635 5004 scope.go:117] "RemoveContainer" containerID="d0292bc1ff208619ec4cb469a8fbcddf2915fccba3a6ebc07d1ecc8bf1092007" Dec 01 08:17:37 crc kubenswrapper[5004]: I1201 08:17:37.145442 5004 scope.go:117] "RemoveContainer" containerID="404ed4cf3ada652b914fc9fd3295d149810f22a6cc4ea044f934ba8ee2595209" Dec 01 08:17:37 crc kubenswrapper[5004]: E1201 08:17:37.146139 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-knmdv_openshift-ovn-kubernetes(15cdec0a-5925-4966-a30b-f60c503f633e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" podUID="15cdec0a-5925-4966-a30b-f60c503f633e" Dec 01 08:17:37 crc kubenswrapper[5004]: I1201 08:17:37.149334 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mzsvz" event={"ID":"397b51b7-934a-41d1-a593-500a64161bd9","Type":"ContainerStarted","Data":"d83844478080e4829975fb6c8e0444d9fdebd44b08afcf45e7d0b04fc534a844"} Dec 01 08:17:37 crc kubenswrapper[5004]: I1201 08:17:37.149398 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mzsvz" event={"ID":"397b51b7-934a-41d1-a593-500a64161bd9","Type":"ContainerStarted","Data":"4adac5633dfe89777bf019818bab9ee3a208cad9d929c96cf2cb86b18c2d4264"} Dec 01 08:17:37 crc kubenswrapper[5004]: I1201 08:17:37.149849 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:37Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:37 crc kubenswrapper[5004]: I1201 08:17:37.170048 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:37Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:37 crc kubenswrapper[5004]: I1201 08:17:37.173610 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:37 crc kubenswrapper[5004]: I1201 08:17:37.173672 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:37 crc kubenswrapper[5004]: I1201 08:17:37.173690 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:37 crc kubenswrapper[5004]: I1201 
08:17:37.173720 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:37 crc kubenswrapper[5004]: I1201 08:17:37.173738 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:37Z","lastTransitionTime":"2025-12-01T08:17:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:17:37 crc kubenswrapper[5004]: I1201 08:17:37.189703 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03c470edd984db3b756e031e33f64a9bc5ca6b9c156ab479e5cd647745655a5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\
\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:37Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:37 crc kubenswrapper[5004]: I1201 08:17:37.214959 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fd2f2b5d7dc3d7b8e1bb06fdf0eec3addfec3c230e044b6395d6cf535630ebd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b2250161e400b201f0528e33231c9007342002a37cd037a1b1e011f2aaaa22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:37Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:37 crc kubenswrapper[5004]: I1201 08:17:37.232691 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mzsvz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"397b51b7-934a-41d1-a593-500a64161bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4h49g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4h49g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mzsvz\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:37Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:37 crc kubenswrapper[5004]: I1201 08:17:37.240625 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b488f4f3-d385-4d40-bdee-96d8fe2d42a1-metrics-certs\") pod \"network-metrics-daemon-7cl5l\" (UID: \"b488f4f3-d385-4d40-bdee-96d8fe2d42a1\") " pod="openshift-multus/network-metrics-daemon-7cl5l" Dec 01 08:17:37 crc kubenswrapper[5004]: E1201 08:17:37.240966 5004 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 08:17:37 crc kubenswrapper[5004]: E1201 08:17:37.241096 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b488f4f3-d385-4d40-bdee-96d8fe2d42a1-metrics-certs podName:b488f4f3-d385-4d40-bdee-96d8fe2d42a1 nodeName:}" failed. No retries permitted until 2025-12-01 08:17:37.741067242 +0000 UTC m=+35.306059254 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b488f4f3-d385-4d40-bdee-96d8fe2d42a1-metrics-certs") pod "network-metrics-daemon-7cl5l" (UID: "b488f4f3-d385-4d40-bdee-96d8fe2d42a1") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 08:17:37 crc kubenswrapper[5004]: I1201 08:17:37.242042 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-glbhl\" (UniqueName: \"kubernetes.io/projected/b488f4f3-d385-4d40-bdee-96d8fe2d42a1-kube-api-access-glbhl\") pod \"network-metrics-daemon-7cl5l\" (UID: \"b488f4f3-d385-4d40-bdee-96d8fe2d42a1\") " pod="openshift-multus/network-metrics-daemon-7cl5l" Dec 01 08:17:37 crc kubenswrapper[5004]: I1201 08:17:37.265727 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43c9f762-d403-49b9-8294-d66976aca4dd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b358f641ff09bc8f4a452b0d4a8cf5f9790182f76779be31426d48d0278b0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b
4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d610c3a799d4a7df9ddfe2a28f2ced2e5ac4e96c67c93f8e6d3a0eee60d9ac48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43f9f72af3273da055eeb1e13ad6258a1b3b6b205402861beee9f87f02230297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38e3cc2eca559417c71beed561569c4dc5b45ffb8407976e0695fc1b93219d37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c24fbeedbc7c40b6079b7ad89d8564a4f2cdaad2cf996e66682b082382da190\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\
\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00e163cd374befc8b8c6cdfe3916235cd63280474c2a09eeebf108fb75d378a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00e163cd374befc8b8c6cdfe3916235cd63280474c2a09eeebf108fb75d378a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37376e093170ad4138e102e3a1e2c2b45b3944bf36b0d77dce126bd493d60803\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37376e093170ad4138e102e3a1e2c2b45b3944bf36b0d77dce126bd493d60803\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e4db8d673550f14f3614e47dc0bf81e0920753be13fb01c7ef0d6bd3716611f3\\\",\
\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4db8d673550f14f3614e47dc0bf81e0920753be13fb01c7ef0d6bd3716611f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:37Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:37 crc kubenswrapper[5004]: I1201 08:17:37.270464 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-glbhl\" (UniqueName: \"kubernetes.io/projected/b488f4f3-d385-4d40-bdee-96d8fe2d42a1-kube-api-access-glbhl\") pod \"network-metrics-daemon-7cl5l\" (UID: \"b488f4f3-d385-4d40-bdee-96d8fe2d42a1\") " pod="openshift-multus/network-metrics-daemon-7cl5l" Dec 01 08:17:37 crc kubenswrapper[5004]: I1201 08:17:37.276697 5004 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:37 crc kubenswrapper[5004]: I1201 08:17:37.276753 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:37 crc kubenswrapper[5004]: I1201 08:17:37.276770 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:37 crc kubenswrapper[5004]: I1201 08:17:37.277034 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:37 crc kubenswrapper[5004]: I1201 08:17:37.277061 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:37Z","lastTransitionTime":"2025-12-01T08:17:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:37 crc kubenswrapper[5004]: I1201 08:17:37.286259 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:37Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:37 crc kubenswrapper[5004]: I1201 08:17:37.302636 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jjms6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8828af41-beeb-47dd-96cf-3dbcb5175893\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b22220849cdb91592a44a67f9f6a09586146009483f46a0505a2dea3b02bccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvsch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jjms6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:37Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:37 crc kubenswrapper[5004]: I1201 08:17:37.321666 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9977ebb-82de-4e96-8763-0b5a84f8d4ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://baf84c2eea7167eed95a4195d37bf784ffef310bc2079d8d06e1f47cb45a7864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pwgxq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c1e51fa92aec80e93d0fce11c743b8c4fde23fc23d8626e8d5b3e25ab42350d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pwgxq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fvdgt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:37Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:37 crc kubenswrapper[5004]: I1201 08:17:37.339298 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7cl5l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b488f4f3-d385-4d40-bdee-96d8fe2d42a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-glbhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-glbhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:37Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7cl5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:37Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:37 crc 
kubenswrapper[5004]: I1201 08:17:37.358405 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2fa377e-a3a6-46ce-af23-e677799f0115\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b520771cf1ad6f9360064c5f304243c5878a2f1025c87d1001c97b2d5f95cb9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b65b9bc444101900c62fd90c7d45789d186a89148ac0fd9c7f0c6048c3f55ad\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edbd3e682d207b8c18d6bcbc26aa2adae2a35645502d31b327ffddd51991e842\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4563974ed1467a87aab104756828e63853c282a61c4fcab1e757eb4da9afaa40\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:37Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:37 crc kubenswrapper[5004]: I1201 08:17:37.381166 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:37 crc kubenswrapper[5004]: I1201 08:17:37.381246 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:37 crc kubenswrapper[5004]: I1201 08:17:37.381268 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:37 crc kubenswrapper[5004]: I1201 08:17:37.381296 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:37 crc kubenswrapper[5004]: I1201 08:17:37.381316 5004 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:37Z","lastTransitionTime":"2025-12-01T08:17:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:17:37 crc kubenswrapper[5004]: I1201 08:17:37.381313 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dpkxw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddcaf111-708a-45b3-a342-effd3061ab17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68ecba515f05ca83fdd0cdda10e3e5925a146aadb70ae17859586c12daf55dac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd2f03a8020818d1cc98b6eeaf64a728e627a3401400c3af53654f8771ef02eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd2f03a8020818d1cc98b6eeaf64a728e627a3401400c3af53654f8771ef02eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8dd31c10eaf6bc673dec377a884040524418dd1891a96b2eb6e6e983979512a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8dd31c10eaf6bc673dec377a884040524418dd1891a96b2eb6e6e983979512a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a7a8ef66a9486ca3d13826df7aa39db2083cff2c386b4d0d858e0526e0f094d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a7a8ef66a9486ca3d13826df7aa39db2083cff2c386b4d0d858e0526e0f094d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:26Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd1c47c976c17bf17e4d660288ce30472372b6a7f33f6ffa6d96c1a4436275d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd1c47c976c17bf17e4d660288ce30472372b6a7f33f6ffa6d96c1a4436275d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b7c8e9648e5514ab1fcb63dc72822dd276a2d74987e2fb4f4800b26ec176be7\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b7c8e9648e5514ab1fcb63dc72822dd276a2d74987e2fb4f4800b26ec176be7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d7115db7fe248437f4b8918f06c9aa48b141fcd3f31add80faee81b4347e47b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d7115db7fe248437f4b8918f06c9aa48b141fcd3f31add80faee81b4347e47b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dpkxw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:37Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:37 crc kubenswrapper[5004]: I1201 08:17:37.398542 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ww6lq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72500ffd-4ca3-4614-a3a2-bbdc5a7506c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://668cfaa2af20e2bb082fc47e1702cfd9f704c4fdf56a4d27cf25d6915e7cd18a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmsvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ww6lq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:37Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:37 crc kubenswrapper[5004]: I1201 08:17:37.420981 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0ed2b6d-0c61-4639-bc3b-1c8effc4815d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45a0d7a12e7c114de1740a151e04f7f39844b709740cab958f156ac918af3228\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791f
d90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d69a370af9f8f6c575f99da6bb01cee0f660dccb7e85f16f5d8874d0e263e1fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c82baee0fd931fd7b41cd5a73ca7e653ef13dfa3d573991cbe3238d6b95f6fac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\"
:\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa5f341e99100e985f65c24647d976a24782dfae7f52e752b774f41f69af27fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad581564fc519328ae429bce757797f6d87d8648e862b008637a245866ec4cbe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T08:17:20Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 08:17:15.185125 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 08:17:15.186793 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4071633747/tls.crt::/tmp/serving-cert-4071633747/tls.key\\\\\\\"\\\\nI1201 08:17:20.827614 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 08:17:20.834528 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 08:17:20.834549 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 08:17:20.837611 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 08:17:20.837630 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 08:17:20.848940 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 08:17:20.848957 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 08:17:20.848961 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 08:17:20.848965 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 08:17:20.848968 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 08:17:20.848970 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 08:17:20.848973 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 08:17:20.849115 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 08:17:20.854990 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2caeb2ebaf68110cb8728e60aae6db1b1fc48efae6018f66070766a4f42caa69\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T0
8:17:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2276a185fefa99f4e9fe63b7acce901ce2eb168f71a65f5fa97c1892aebce0e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2276a185fefa99f4e9fe63b7acce901ce2eb168f71a65f5fa97c1892aebce0e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:37Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:37 crc kubenswrapper[5004]: I1201 08:17:37.444328 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0ed2b6d-0c61-4639-bc3b-1c8effc4815d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45a0d7a12e7c114de1740a151e04f7f39844b709740cab958f156ac918af3228\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d69a370af9f8f6c575f99da6bb01cee0f660dccb7e85f16f5d8874d0e263e1fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c82baee0fd931fd7b41cd5a73ca7e653ef13dfa3d573991cbe3238d6b95f6fac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa5f341e99100e985f65c24647d976a24782dfae7f52e752b774f41f69af27fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad581564fc519328ae429bce757797f6d87d8648e862b008637a245866ec4cbe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T08:17:20Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 08:17:15.185125 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 08:17:15.186793 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4071633747/tls.crt::/tmp/serving-cert-4071633747/tls.key\\\\\\\"\\\\nI1201 08:17:20.827614 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 08:17:20.834528 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 08:17:20.834549 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 08:17:20.837611 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 08:17:20.837630 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 08:17:20.848940 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 08:17:20.848957 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 08:17:20.848961 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 08:17:20.848965 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 08:17:20.848968 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 08:17:20.848970 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 08:17:20.848973 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 08:17:20.849115 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1201 08:17:20.854990 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2caeb2ebaf68110cb8728e60aae6db1b1fc48efae6018f66070766a4f42caa69\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2276a185fefa99f4e9fe63b7acce901ce2eb168f71a65f5fa97c1892aebce0e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2276a185fefa99f4e9fe63b7acce901ce
2eb168f71a65f5fa97c1892aebce0e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:37Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:37 crc kubenswrapper[5004]: I1201 08:17:37.466540 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://327477e7db10129d47ddf9078f32d0df2ffd4b8e59fc7bb77c17c60c63e9a936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:37Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:37 crc kubenswrapper[5004]: I1201 08:17:37.484291 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:37 crc kubenswrapper[5004]: I1201 08:17:37.484343 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:37 crc kubenswrapper[5004]: I1201 08:17:37.484359 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:37 crc kubenswrapper[5004]: I1201 08:17:37.484381 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:37 crc kubenswrapper[5004]: I1201 08:17:37.484398 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:37Z","lastTransitionTime":"2025-12-01T08:17:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:37 crc kubenswrapper[5004]: I1201 08:17:37.486677 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:37Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:37 crc kubenswrapper[5004]: I1201 08:17:37.506247 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:37Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:37 crc kubenswrapper[5004]: I1201 08:17:37.523593 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03c470edd984db3b756e031e33f64a9bc5ca6b9c156ab479e5cd647745655a5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T08:17:37Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:37 crc kubenswrapper[5004]: I1201 08:17:37.543666 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fd2f2b5d7dc3d7b8e1bb06fdf0eec3addfec3c230e044b6395d6cf535630ebd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b2250161e400b
201f0528e33231c9007342002a37cd037a1b1e011f2aaaa22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:37Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:37 crc kubenswrapper[5004]: I1201 08:17:37.546102 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 08:17:37 crc kubenswrapper[5004]: I1201 08:17:37.546266 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 08:17:37 crc kubenswrapper[5004]: E1201 08:17:37.546467 5004 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 08:17:37 crc kubenswrapper[5004]: E1201 08:17:37.547266 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 08:17:53.546349405 +0000 UTC m=+51.111341417 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 08:17:37 crc kubenswrapper[5004]: E1201 08:17:37.547330 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 08:17:53.54731189 +0000 UTC m=+51.112303902 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 08:17:37 crc kubenswrapper[5004]: I1201 08:17:37.547395 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 08:17:37 crc kubenswrapper[5004]: I1201 08:17:37.547482 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 08:17:37 crc kubenswrapper[5004]: I1201 08:17:37.547548 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 08:17:37 crc kubenswrapper[5004]: E1201 08:17:37.547721 5004 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 08:17:37 crc kubenswrapper[5004]: E1201 08:17:37.547764 5004 projected.go:288] Couldn't get configMap 
openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 08:17:37 crc kubenswrapper[5004]: E1201 08:17:37.547783 5004 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 08:17:37 crc kubenswrapper[5004]: E1201 08:17:37.547788 5004 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 08:17:37 crc kubenswrapper[5004]: E1201 08:17:37.547809 5004 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 08:17:37 crc kubenswrapper[5004]: E1201 08:17:37.547799 5004 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 08:17:37 crc kubenswrapper[5004]: E1201 08:17:37.547829 5004 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 08:17:37 crc kubenswrapper[5004]: E1201 08:17:37.547895 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-01 08:17:53.547866953 +0000 UTC m=+51.112858975 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 08:17:37 crc kubenswrapper[5004]: E1201 08:17:37.547958 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 08:17:53.547914545 +0000 UTC m=+51.112906617 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 08:17:37 crc kubenswrapper[5004]: E1201 08:17:37.548000 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-01 08:17:53.547981266 +0000 UTC m=+51.112973458 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 08:17:37 crc kubenswrapper[5004]: I1201 08:17:37.567865 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zjksw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70e79009-93be-49c4-a6b3-e8a06bcea7f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad3ea204ad731500a340e639f0e36abe28ba50464f1b2ff9b009e39c234cf708\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd8m6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\"
:\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zjksw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:37Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:37 crc kubenswrapper[5004]: I1201 08:17:37.587890 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:37 crc kubenswrapper[5004]: I1201 08:17:37.587946 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:37 crc kubenswrapper[5004]: I1201 08:17:37.587964 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:37 crc kubenswrapper[5004]: I1201 08:17:37.587989 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:37 crc kubenswrapper[5004]: I1201 08:17:37.588006 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:37Z","lastTransitionTime":"2025-12-01T08:17:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:37 crc kubenswrapper[5004]: I1201 08:17:37.597205 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15cdec0a-5925-4966-a30b-f60c503f633e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f528b94ff12c7a804930dca4b93c23334a9ae3a53317750cd34b5695f08b4582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f001b70a83da28b1ea78a94f1cd70dfc63b3de5a21b041932d33c2ac970c986b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7c5c043438b515e37a6d6d5c47b92479bbb7322fe35c3a1bc57ee7b3a746544\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23df33e45cc27e4d343ae6ec3fb88f2dfc125ba7da424515d4bee5ec19281832\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4a7cab8e1594814a687b1433a92642a19eff7c2eb12f96fa06f08a2e270733d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1844a18a6109c25a25b9c8cb3ff65e69cc657f66be03dd97e883d24a774b7638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://404ed4cf3ada652b914fc9fd3295d149810f22a6cc4ea044f934ba8ee2595209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0292bc1ff208619ec4cb469a8fbcddf2915fccba3a6ebc07d1ecc8bf1092007\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T08:17:33Z\\\",\\\"message\\\":\\\"al\\\\nI1201 08:17:33.210759 6283 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1201 08:17:33.210788 6283 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1201 08:17:33.211067 6283 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1201 08:17:33.211080 6283 handler.go:208] Removed *v1.Pod event handler 
6\\\\nI1201 08:17:33.211080 6283 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1201 08:17:33.211177 6283 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1201 08:17:33.211209 6283 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1201 08:17:33.211241 6283 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1201 08:17:33.211253 6283 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1201 08:17:33.211264 6283 handler.go:208] Removed *v1.Node event handler 2\\\\nI1201 08:17:33.211283 6283 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1201 08:17:33.211300 6283 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1201 08:17:33.211310 6283 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1201 08:17:33.211316 6283 factory.go:656] Stopping watch factory\\\\nI1201 08:17:33.211312 6283 handler.go:208] Removed *v1.Node event handler 7\\\\nI1201 08:17:33.211333 6283 ovnkube.go:599] Stopped ovnkube\\\\nI1201 08:17:3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:30Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://404ed4cf3ada652b914fc9fd3295d149810f22a6cc4ea044f934ba8ee2595209\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T08:17:36Z\\\",\\\"message\\\":\\\"ping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 08:17:35.520098 6428 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1201 08:17:35.520395 6428 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 08:17:35.520708 6428 reflector.go:311] Stopping reflector *v1.Namespace (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI1201 08:17:35.520930 6428 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 08:17:35.521362 6428 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1201 08:17:35.521431 6428 factory.go:656] Stopping watch factory\\\\nI1201 08:17:35.521454 6428 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1201 08:17:35.523766 6428 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI1201 08:17:35.523786 6428 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI1201 08:17:35.523846 6428 ovnkube.go:599] Stopped ovnkube\\\\nI1201 08:17:35.523873 6428 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1201 08:17:35.523949 6428 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name
\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f231e0092e1a85cc5d6e00fb8d3bd982223b670191ac3dc21d81ff1fab462a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceac
count\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97c9fab31c2655b5d579a2ed9b837229ee678973643c6a526c1241e3c84e315c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97c9fab31c2655b5d579a2ed9b837229ee678973643c6a526c1241e3c84e315c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-knmdv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:37Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:37 crc kubenswrapper[5004]: I1201 08:17:37.614524 5004 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mzsvz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"397b51b7-934a-41d1-a593-500a64161bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4adac5633dfe89777bf019818bab9ee3a208cad9d929c96cf2cb86b18c2d4264\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4h49g\\\",\\\"readOnly\\\":true,\\\"recursiveReadO
nly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d83844478080e4829975fb6c8e0444d9fdebd44b08afcf45e7d0b04fc534a844\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4h49g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mzsvz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:37Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:37 crc kubenswrapper[5004]: I1201 08:17:37.631683 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:37Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:37 crc kubenswrapper[5004]: I1201 08:17:37.646409 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jjms6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8828af41-beeb-47dd-96cf-3dbcb5175893\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b22220849cdb91592a44a67f9f6a09586146009483f46a0505a2dea3b02bccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvsch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jjms6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:37Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:37 crc kubenswrapper[5004]: I1201 08:17:37.663875 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9977ebb-82de-4e96-8763-0b5a84f8d4ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://baf84c2eea7167eed95a4195d37bf784ffef310bc2079d8d06e1f47cb45a7864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pwgxq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c1e51fa92aec80e93d0fce11c743b8c4fde23fc23d8626e8d5b3e25ab42350d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pwgxq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fvdgt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:37Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:37 crc kubenswrapper[5004]: I1201 08:17:37.681476 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7cl5l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b488f4f3-d385-4d40-bdee-96d8fe2d42a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-glbhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-glbhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:37Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7cl5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:37Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:37 crc 
kubenswrapper[5004]: I1201 08:17:37.693060 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:37 crc kubenswrapper[5004]: I1201 08:17:37.693142 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:37 crc kubenswrapper[5004]: I1201 08:17:37.693183 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:37 crc kubenswrapper[5004]: I1201 08:17:37.693218 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:37 crc kubenswrapper[5004]: I1201 08:17:37.693244 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:37Z","lastTransitionTime":"2025-12-01T08:17:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:37 crc kubenswrapper[5004]: I1201 08:17:37.720886 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43c9f762-d403-49b9-8294-d66976aca4dd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b358f641ff09bc8f4a452b0d4a8cf5f9790182f76779be31426d48d0278b0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d610c3a799d4a7df9ddfe2a28f2ced2e5ac4e96c67c93f8e6d3a0eee60d9ac48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43f9f72af3273da055eeb1e13ad6258a1b3b6b205402861beee9f87f02230297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38e3cc2eca559417c71beed561569c4dc5b45ffb8407976e0695fc1b93219d37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c24fbeedbc7c40b6079b7ad89d8564a4f2cdaad2cf996e66682b082382da190\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00e163cd374befc8b8c6cdfe3916235cd63280474c2a09eeebf108fb75d378a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00e163cd374befc8b8c6cdfe3916235cd63280474c2a09eeebf108fb75d378a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37376e093170ad4138e102e3a1e2c2b45b3944bf36b0d77dce126bd493d60803\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37376e093170ad4138e102e3a1e2c2b45b3944bf36b0d77dce126bd493d60803\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e4db8d673550f14f3614e47dc0bf81e0920753be13fb01c7ef0d6bd3716611f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4db8d673550f14f3614e47dc0bf81e0920753be13fb01c7ef0d6bd3716611f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-01T08:17:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:37Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:37 crc kubenswrapper[5004]: I1201 08:17:37.744256 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dpkxw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddcaf111-708a-45b3-a342-effd3061ab17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68ecba515f05ca83fdd0cdda10e3e5925a146aadb70ae17859586c12daf55dac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd2f03a8020818d1cc98b6eeaf64a728e627a3401400c3af53654f8771ef02eb\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd2f03a8020818d1cc98b6eeaf64a728e627a3401400c3af53654f8771ef02eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8dd31c10eaf6bc673dec377a884040524418dd1891a96b2eb6e6e983979512a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8dd31c10eaf6bc673dec377a884040524418dd1891a96b2eb6e6e983979512a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a7a8ef66a9486ca3d13826df7aa39db2083cff2c386b4d0d858e0526e0f094d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a7a8ef66a9486ca3d13826df7aa39db2083cff2c386b4d0d858e0526e0f094d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd1c4
7c976c17bf17e4d660288ce30472372b6a7f33f6ffa6d96c1a4436275d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd1c47c976c17bf17e4d660288ce30472372b6a7f33f6ffa6d96c1a4436275d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b7c8e9648e5514ab1fcb63dc72822dd276a2d74987e2fb4f4800b26ec176be7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b7c8e9648e5514ab1fcb63dc72822dd276a2d74987e2fb4f4800b26ec176be7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:28Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d7115db7fe248437f4b8918f06c9aa48b141fcd3f31add80faee81b4347e47b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d7115db7fe248437f4b8918f06c9aa48b141fcd3f31add80faee81b4347e47b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dpkxw\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:37Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:37 crc kubenswrapper[5004]: I1201 08:17:37.750198 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b488f4f3-d385-4d40-bdee-96d8fe2d42a1-metrics-certs\") pod \"network-metrics-daemon-7cl5l\" (UID: \"b488f4f3-d385-4d40-bdee-96d8fe2d42a1\") " pod="openshift-multus/network-metrics-daemon-7cl5l" Dec 01 08:17:37 crc kubenswrapper[5004]: E1201 08:17:37.750346 5004 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 08:17:37 crc kubenswrapper[5004]: E1201 08:17:37.750418 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b488f4f3-d385-4d40-bdee-96d8fe2d42a1-metrics-certs podName:b488f4f3-d385-4d40-bdee-96d8fe2d42a1 nodeName:}" failed. No retries permitted until 2025-12-01 08:17:38.750397281 +0000 UTC m=+36.315389293 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b488f4f3-d385-4d40-bdee-96d8fe2d42a1-metrics-certs") pod "network-metrics-daemon-7cl5l" (UID: "b488f4f3-d385-4d40-bdee-96d8fe2d42a1") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 08:17:37 crc kubenswrapper[5004]: I1201 08:17:37.758154 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 08:17:37 crc kubenswrapper[5004]: I1201 08:17:37.758227 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 08:17:37 crc kubenswrapper[5004]: I1201 08:17:37.758270 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 08:17:37 crc kubenswrapper[5004]: E1201 08:17:37.758304 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 08:17:37 crc kubenswrapper[5004]: I1201 08:17:37.758210 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ww6lq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72500ffd-4ca3-4614-a3a2-bbdc5a7506c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://668cfaa2af20e2bb082f
c47e1702cfd9f704c4fdf56a4d27cf25d6915e7cd18a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmsvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ww6lq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:37Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:37 crc kubenswrapper[5004]: E1201 08:17:37.758458 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 08:17:37 crc kubenswrapper[5004]: E1201 08:17:37.758624 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 08:17:37 crc kubenswrapper[5004]: I1201 08:17:37.781480 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2fa377e-a3a6-46ce-af23-e677799f0115\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b520771cf1ad6f9360064c5f304243c5878a2f1025c87d1001c97b2d5f95cb9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b65b9bc444101900c62fd90c7d45789d186a89148ac0fd9c7f0c6048c3f55ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edbd3e682d207b8c18d6bcbc26aa2adae2a35645502d31b327ffddd51991e842\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMoun
ts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4563974ed1467a87aab104756828e63853c282a61c4fcab1e757eb4da9afaa40\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:37Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:37 crc kubenswrapper[5004]: I1201 08:17:37.797009 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:37 crc kubenswrapper[5004]: I1201 08:17:37.797040 5004 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:37 crc kubenswrapper[5004]: I1201 08:17:37.797050 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:37 crc kubenswrapper[5004]: I1201 08:17:37.797065 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:37 crc kubenswrapper[5004]: I1201 08:17:37.797077 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:37Z","lastTransitionTime":"2025-12-01T08:17:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:17:37 crc kubenswrapper[5004]: I1201 08:17:37.899746 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:37 crc kubenswrapper[5004]: I1201 08:17:37.899816 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:37 crc kubenswrapper[5004]: I1201 08:17:37.899833 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:37 crc kubenswrapper[5004]: I1201 08:17:37.899856 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:37 crc kubenswrapper[5004]: I1201 08:17:37.899874 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:37Z","lastTransitionTime":"2025-12-01T08:17:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:17:38 crc kubenswrapper[5004]: I1201 08:17:38.002926 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:38 crc kubenswrapper[5004]: I1201 08:17:38.002982 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:38 crc kubenswrapper[5004]: I1201 08:17:38.002998 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:38 crc kubenswrapper[5004]: I1201 08:17:38.003020 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:38 crc kubenswrapper[5004]: I1201 08:17:38.003037 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:38Z","lastTransitionTime":"2025-12-01T08:17:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:38 crc kubenswrapper[5004]: I1201 08:17:38.106050 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:38 crc kubenswrapper[5004]: I1201 08:17:38.106106 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:38 crc kubenswrapper[5004]: I1201 08:17:38.106127 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:38 crc kubenswrapper[5004]: I1201 08:17:38.106149 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:38 crc kubenswrapper[5004]: I1201 08:17:38.106166 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:38Z","lastTransitionTime":"2025-12-01T08:17:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:38 crc kubenswrapper[5004]: I1201 08:17:38.155545 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-knmdv_15cdec0a-5925-4966-a30b-f60c503f633e/ovnkube-controller/1.log" Dec 01 08:17:38 crc kubenswrapper[5004]: I1201 08:17:38.209265 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:38 crc kubenswrapper[5004]: I1201 08:17:38.209343 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:38 crc kubenswrapper[5004]: I1201 08:17:38.209364 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:38 crc kubenswrapper[5004]: I1201 08:17:38.209389 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:38 crc kubenswrapper[5004]: I1201 08:17:38.209407 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:38Z","lastTransitionTime":"2025-12-01T08:17:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:38 crc kubenswrapper[5004]: I1201 08:17:38.312821 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:38 crc kubenswrapper[5004]: I1201 08:17:38.312880 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:38 crc kubenswrapper[5004]: I1201 08:17:38.312904 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:38 crc kubenswrapper[5004]: I1201 08:17:38.312929 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:38 crc kubenswrapper[5004]: I1201 08:17:38.312951 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:38Z","lastTransitionTime":"2025-12-01T08:17:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:38 crc kubenswrapper[5004]: I1201 08:17:38.415759 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:38 crc kubenswrapper[5004]: I1201 08:17:38.415821 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:38 crc kubenswrapper[5004]: I1201 08:17:38.415838 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:38 crc kubenswrapper[5004]: I1201 08:17:38.415861 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:38 crc kubenswrapper[5004]: I1201 08:17:38.415877 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:38Z","lastTransitionTime":"2025-12-01T08:17:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:38 crc kubenswrapper[5004]: I1201 08:17:38.518310 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:38 crc kubenswrapper[5004]: I1201 08:17:38.518375 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:38 crc kubenswrapper[5004]: I1201 08:17:38.518395 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:38 crc kubenswrapper[5004]: I1201 08:17:38.518419 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:38 crc kubenswrapper[5004]: I1201 08:17:38.518456 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:38Z","lastTransitionTime":"2025-12-01T08:17:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:38 crc kubenswrapper[5004]: I1201 08:17:38.620965 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:38 crc kubenswrapper[5004]: I1201 08:17:38.621026 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:38 crc kubenswrapper[5004]: I1201 08:17:38.621047 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:38 crc kubenswrapper[5004]: I1201 08:17:38.621074 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:38 crc kubenswrapper[5004]: I1201 08:17:38.621093 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:38Z","lastTransitionTime":"2025-12-01T08:17:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:38 crc kubenswrapper[5004]: I1201 08:17:38.723893 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:38 crc kubenswrapper[5004]: I1201 08:17:38.723978 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:38 crc kubenswrapper[5004]: I1201 08:17:38.723991 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:38 crc kubenswrapper[5004]: I1201 08:17:38.724009 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:38 crc kubenswrapper[5004]: I1201 08:17:38.724022 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:38Z","lastTransitionTime":"2025-12-01T08:17:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:17:38 crc kubenswrapper[5004]: I1201 08:17:38.758957 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7cl5l" Dec 01 08:17:38 crc kubenswrapper[5004]: E1201 08:17:38.759154 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7cl5l" podUID="b488f4f3-d385-4d40-bdee-96d8fe2d42a1" Dec 01 08:17:38 crc kubenswrapper[5004]: I1201 08:17:38.763315 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b488f4f3-d385-4d40-bdee-96d8fe2d42a1-metrics-certs\") pod \"network-metrics-daemon-7cl5l\" (UID: \"b488f4f3-d385-4d40-bdee-96d8fe2d42a1\") " pod="openshift-multus/network-metrics-daemon-7cl5l" Dec 01 08:17:38 crc kubenswrapper[5004]: E1201 08:17:38.763475 5004 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 08:17:38 crc kubenswrapper[5004]: E1201 08:17:38.763640 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b488f4f3-d385-4d40-bdee-96d8fe2d42a1-metrics-certs podName:b488f4f3-d385-4d40-bdee-96d8fe2d42a1 nodeName:}" failed. No retries permitted until 2025-12-01 08:17:40.763612183 +0000 UTC m=+38.328604205 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b488f4f3-d385-4d40-bdee-96d8fe2d42a1-metrics-certs") pod "network-metrics-daemon-7cl5l" (UID: "b488f4f3-d385-4d40-bdee-96d8fe2d42a1") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 08:17:38 crc kubenswrapper[5004]: I1201 08:17:38.827083 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:38 crc kubenswrapper[5004]: I1201 08:17:38.827144 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:38 crc kubenswrapper[5004]: I1201 08:17:38.827164 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:38 crc kubenswrapper[5004]: I1201 08:17:38.827190 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:38 crc kubenswrapper[5004]: I1201 08:17:38.827211 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:38Z","lastTransitionTime":"2025-12-01T08:17:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:38 crc kubenswrapper[5004]: I1201 08:17:38.930633 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:38 crc kubenswrapper[5004]: I1201 08:17:38.930684 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:38 crc kubenswrapper[5004]: I1201 08:17:38.930696 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:38 crc kubenswrapper[5004]: I1201 08:17:38.930712 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:38 crc kubenswrapper[5004]: I1201 08:17:38.930726 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:38Z","lastTransitionTime":"2025-12-01T08:17:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:39 crc kubenswrapper[5004]: I1201 08:17:39.033164 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:39 crc kubenswrapper[5004]: I1201 08:17:39.033277 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:39 crc kubenswrapper[5004]: I1201 08:17:39.033294 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:39 crc kubenswrapper[5004]: I1201 08:17:39.033318 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:39 crc kubenswrapper[5004]: I1201 08:17:39.033336 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:39Z","lastTransitionTime":"2025-12-01T08:17:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:39 crc kubenswrapper[5004]: I1201 08:17:39.136439 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:39 crc kubenswrapper[5004]: I1201 08:17:39.136507 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:39 crc kubenswrapper[5004]: I1201 08:17:39.136525 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:39 crc kubenswrapper[5004]: I1201 08:17:39.136548 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:39 crc kubenswrapper[5004]: I1201 08:17:39.136593 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:39Z","lastTransitionTime":"2025-12-01T08:17:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:39 crc kubenswrapper[5004]: I1201 08:17:39.239619 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:39 crc kubenswrapper[5004]: I1201 08:17:39.239678 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:39 crc kubenswrapper[5004]: I1201 08:17:39.239718 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:39 crc kubenswrapper[5004]: I1201 08:17:39.239746 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:39 crc kubenswrapper[5004]: I1201 08:17:39.239763 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:39Z","lastTransitionTime":"2025-12-01T08:17:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:39 crc kubenswrapper[5004]: I1201 08:17:39.342879 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:39 crc kubenswrapper[5004]: I1201 08:17:39.342931 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:39 crc kubenswrapper[5004]: I1201 08:17:39.342948 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:39 crc kubenswrapper[5004]: I1201 08:17:39.342975 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:39 crc kubenswrapper[5004]: I1201 08:17:39.342997 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:39Z","lastTransitionTime":"2025-12-01T08:17:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:39 crc kubenswrapper[5004]: I1201 08:17:39.446404 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:39 crc kubenswrapper[5004]: I1201 08:17:39.446461 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:39 crc kubenswrapper[5004]: I1201 08:17:39.446497 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:39 crc kubenswrapper[5004]: I1201 08:17:39.446525 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:39 crc kubenswrapper[5004]: I1201 08:17:39.446543 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:39Z","lastTransitionTime":"2025-12-01T08:17:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:39 crc kubenswrapper[5004]: I1201 08:17:39.550015 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:39 crc kubenswrapper[5004]: I1201 08:17:39.550082 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:39 crc kubenswrapper[5004]: I1201 08:17:39.550105 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:39 crc kubenswrapper[5004]: I1201 08:17:39.550136 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:39 crc kubenswrapper[5004]: I1201 08:17:39.550160 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:39Z","lastTransitionTime":"2025-12-01T08:17:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:39 crc kubenswrapper[5004]: I1201 08:17:39.653302 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:39 crc kubenswrapper[5004]: I1201 08:17:39.653363 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:39 crc kubenswrapper[5004]: I1201 08:17:39.653380 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:39 crc kubenswrapper[5004]: I1201 08:17:39.653403 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:39 crc kubenswrapper[5004]: I1201 08:17:39.653421 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:39Z","lastTransitionTime":"2025-12-01T08:17:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:39 crc kubenswrapper[5004]: I1201 08:17:39.757271 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:39 crc kubenswrapper[5004]: I1201 08:17:39.757316 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:39 crc kubenswrapper[5004]: I1201 08:17:39.757329 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:39 crc kubenswrapper[5004]: I1201 08:17:39.757344 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:39 crc kubenswrapper[5004]: I1201 08:17:39.757356 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:39Z","lastTransitionTime":"2025-12-01T08:17:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:17:39 crc kubenswrapper[5004]: I1201 08:17:39.757932 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 08:17:39 crc kubenswrapper[5004]: E1201 08:17:39.758023 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 08:17:39 crc kubenswrapper[5004]: I1201 08:17:39.758365 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 08:17:39 crc kubenswrapper[5004]: E1201 08:17:39.758440 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 08:17:39 crc kubenswrapper[5004]: I1201 08:17:39.758369 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 08:17:39 crc kubenswrapper[5004]: E1201 08:17:39.758850 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 08:17:39 crc kubenswrapper[5004]: I1201 08:17:39.860222 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:39 crc kubenswrapper[5004]: I1201 08:17:39.860282 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:39 crc kubenswrapper[5004]: I1201 08:17:39.860300 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:39 crc kubenswrapper[5004]: I1201 08:17:39.860324 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:39 crc kubenswrapper[5004]: I1201 08:17:39.860341 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:39Z","lastTransitionTime":"2025-12-01T08:17:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:39 crc kubenswrapper[5004]: I1201 08:17:39.963699 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:39 crc kubenswrapper[5004]: I1201 08:17:39.963768 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:39 crc kubenswrapper[5004]: I1201 08:17:39.963795 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:39 crc kubenswrapper[5004]: I1201 08:17:39.963826 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:39 crc kubenswrapper[5004]: I1201 08:17:39.963846 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:39Z","lastTransitionTime":"2025-12-01T08:17:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:40 crc kubenswrapper[5004]: I1201 08:17:40.066521 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:40 crc kubenswrapper[5004]: I1201 08:17:40.066613 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:40 crc kubenswrapper[5004]: I1201 08:17:40.066631 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:40 crc kubenswrapper[5004]: I1201 08:17:40.066655 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:40 crc kubenswrapper[5004]: I1201 08:17:40.066672 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:40Z","lastTransitionTime":"2025-12-01T08:17:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:40 crc kubenswrapper[5004]: I1201 08:17:40.169386 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:40 crc kubenswrapper[5004]: I1201 08:17:40.169494 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:40 crc kubenswrapper[5004]: I1201 08:17:40.169517 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:40 crc kubenswrapper[5004]: I1201 08:17:40.169540 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:40 crc kubenswrapper[5004]: I1201 08:17:40.169595 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:40Z","lastTransitionTime":"2025-12-01T08:17:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:40 crc kubenswrapper[5004]: I1201 08:17:40.273255 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:40 crc kubenswrapper[5004]: I1201 08:17:40.273321 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:40 crc kubenswrapper[5004]: I1201 08:17:40.273338 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:40 crc kubenswrapper[5004]: I1201 08:17:40.273364 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:40 crc kubenswrapper[5004]: I1201 08:17:40.273381 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:40Z","lastTransitionTime":"2025-12-01T08:17:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:40 crc kubenswrapper[5004]: I1201 08:17:40.375847 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:40 crc kubenswrapper[5004]: I1201 08:17:40.375907 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:40 crc kubenswrapper[5004]: I1201 08:17:40.375924 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:40 crc kubenswrapper[5004]: I1201 08:17:40.375948 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:40 crc kubenswrapper[5004]: I1201 08:17:40.375965 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:40Z","lastTransitionTime":"2025-12-01T08:17:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:40 crc kubenswrapper[5004]: I1201 08:17:40.478350 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:40 crc kubenswrapper[5004]: I1201 08:17:40.478414 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:40 crc kubenswrapper[5004]: I1201 08:17:40.478431 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:40 crc kubenswrapper[5004]: I1201 08:17:40.478453 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:40 crc kubenswrapper[5004]: I1201 08:17:40.478471 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:40Z","lastTransitionTime":"2025-12-01T08:17:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:40 crc kubenswrapper[5004]: I1201 08:17:40.581357 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:40 crc kubenswrapper[5004]: I1201 08:17:40.581393 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:40 crc kubenswrapper[5004]: I1201 08:17:40.581401 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:40 crc kubenswrapper[5004]: I1201 08:17:40.581418 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:40 crc kubenswrapper[5004]: I1201 08:17:40.581427 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:40Z","lastTransitionTime":"2025-12-01T08:17:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:40 crc kubenswrapper[5004]: I1201 08:17:40.684021 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:40 crc kubenswrapper[5004]: I1201 08:17:40.684076 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:40 crc kubenswrapper[5004]: I1201 08:17:40.684093 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:40 crc kubenswrapper[5004]: I1201 08:17:40.684116 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:40 crc kubenswrapper[5004]: I1201 08:17:40.684133 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:40Z","lastTransitionTime":"2025-12-01T08:17:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:17:40 crc kubenswrapper[5004]: I1201 08:17:40.758887 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7cl5l" Dec 01 08:17:40 crc kubenswrapper[5004]: E1201 08:17:40.759101 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7cl5l" podUID="b488f4f3-d385-4d40-bdee-96d8fe2d42a1" Dec 01 08:17:40 crc kubenswrapper[5004]: I1201 08:17:40.786408 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:40 crc kubenswrapper[5004]: I1201 08:17:40.786471 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:40 crc kubenswrapper[5004]: I1201 08:17:40.786488 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:40 crc kubenswrapper[5004]: I1201 08:17:40.786515 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:40 crc kubenswrapper[5004]: I1201 08:17:40.786532 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:40Z","lastTransitionTime":"2025-12-01T08:17:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:40 crc kubenswrapper[5004]: I1201 08:17:40.787711 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b488f4f3-d385-4d40-bdee-96d8fe2d42a1-metrics-certs\") pod \"network-metrics-daemon-7cl5l\" (UID: \"b488f4f3-d385-4d40-bdee-96d8fe2d42a1\") " pod="openshift-multus/network-metrics-daemon-7cl5l" Dec 01 08:17:40 crc kubenswrapper[5004]: E1201 08:17:40.787895 5004 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 08:17:40 crc kubenswrapper[5004]: E1201 08:17:40.787985 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b488f4f3-d385-4d40-bdee-96d8fe2d42a1-metrics-certs podName:b488f4f3-d385-4d40-bdee-96d8fe2d42a1 nodeName:}" failed. No retries permitted until 2025-12-01 08:17:44.787957832 +0000 UTC m=+42.352949854 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b488f4f3-d385-4d40-bdee-96d8fe2d42a1-metrics-certs") pod "network-metrics-daemon-7cl5l" (UID: "b488f4f3-d385-4d40-bdee-96d8fe2d42a1") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 08:17:40 crc kubenswrapper[5004]: I1201 08:17:40.888726 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:40 crc kubenswrapper[5004]: I1201 08:17:40.888789 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:40 crc kubenswrapper[5004]: I1201 08:17:40.888807 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:40 crc kubenswrapper[5004]: I1201 08:17:40.888834 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:40 crc kubenswrapper[5004]: I1201 08:17:40.888853 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:40Z","lastTransitionTime":"2025-12-01T08:17:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:40 crc kubenswrapper[5004]: I1201 08:17:40.992448 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:40 crc kubenswrapper[5004]: I1201 08:17:40.992504 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:40 crc kubenswrapper[5004]: I1201 08:17:40.992520 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:40 crc kubenswrapper[5004]: I1201 08:17:40.992542 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:40 crc kubenswrapper[5004]: I1201 08:17:40.992592 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:40Z","lastTransitionTime":"2025-12-01T08:17:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:41 crc kubenswrapper[5004]: I1201 08:17:41.095876 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:41 crc kubenswrapper[5004]: I1201 08:17:41.095931 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:41 crc kubenswrapper[5004]: I1201 08:17:41.095950 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:41 crc kubenswrapper[5004]: I1201 08:17:41.095978 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:41 crc kubenswrapper[5004]: I1201 08:17:41.095995 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:41Z","lastTransitionTime":"2025-12-01T08:17:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:41 crc kubenswrapper[5004]: I1201 08:17:41.198140 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:41 crc kubenswrapper[5004]: I1201 08:17:41.198202 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:41 crc kubenswrapper[5004]: I1201 08:17:41.198219 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:41 crc kubenswrapper[5004]: I1201 08:17:41.198243 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:41 crc kubenswrapper[5004]: I1201 08:17:41.198262 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:41Z","lastTransitionTime":"2025-12-01T08:17:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:41 crc kubenswrapper[5004]: I1201 08:17:41.301444 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:41 crc kubenswrapper[5004]: I1201 08:17:41.301505 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:41 crc kubenswrapper[5004]: I1201 08:17:41.301525 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:41 crc kubenswrapper[5004]: I1201 08:17:41.301549 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:41 crc kubenswrapper[5004]: I1201 08:17:41.301593 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:41Z","lastTransitionTime":"2025-12-01T08:17:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:41 crc kubenswrapper[5004]: I1201 08:17:41.403853 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:41 crc kubenswrapper[5004]: I1201 08:17:41.403914 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:41 crc kubenswrapper[5004]: I1201 08:17:41.403933 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:41 crc kubenswrapper[5004]: I1201 08:17:41.403957 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:41 crc kubenswrapper[5004]: I1201 08:17:41.403972 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:41Z","lastTransitionTime":"2025-12-01T08:17:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:41 crc kubenswrapper[5004]: I1201 08:17:41.506736 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:41 crc kubenswrapper[5004]: I1201 08:17:41.506794 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:41 crc kubenswrapper[5004]: I1201 08:17:41.506811 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:41 crc kubenswrapper[5004]: I1201 08:17:41.506871 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:41 crc kubenswrapper[5004]: I1201 08:17:41.506899 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:41Z","lastTransitionTime":"2025-12-01T08:17:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:41 crc kubenswrapper[5004]: I1201 08:17:41.609245 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:41 crc kubenswrapper[5004]: I1201 08:17:41.609318 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:41 crc kubenswrapper[5004]: I1201 08:17:41.609336 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:41 crc kubenswrapper[5004]: I1201 08:17:41.609361 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:41 crc kubenswrapper[5004]: I1201 08:17:41.609378 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:41Z","lastTransitionTime":"2025-12-01T08:17:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:41 crc kubenswrapper[5004]: I1201 08:17:41.713904 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:41 crc kubenswrapper[5004]: I1201 08:17:41.713972 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:41 crc kubenswrapper[5004]: I1201 08:17:41.713983 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:41 crc kubenswrapper[5004]: I1201 08:17:41.713999 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:41 crc kubenswrapper[5004]: I1201 08:17:41.714013 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:41Z","lastTransitionTime":"2025-12-01T08:17:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:17:41 crc kubenswrapper[5004]: I1201 08:17:41.758700 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 08:17:41 crc kubenswrapper[5004]: I1201 08:17:41.758749 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 08:17:41 crc kubenswrapper[5004]: I1201 08:17:41.758769 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 08:17:41 crc kubenswrapper[5004]: E1201 08:17:41.758876 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 08:17:41 crc kubenswrapper[5004]: E1201 08:17:41.758990 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 08:17:41 crc kubenswrapper[5004]: E1201 08:17:41.759111 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 08:17:41 crc kubenswrapper[5004]: I1201 08:17:41.817666 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:41 crc kubenswrapper[5004]: I1201 08:17:41.817770 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:41 crc kubenswrapper[5004]: I1201 08:17:41.817792 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:41 crc kubenswrapper[5004]: I1201 08:17:41.817815 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:41 crc kubenswrapper[5004]: I1201 08:17:41.817830 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:41Z","lastTransitionTime":"2025-12-01T08:17:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:41 crc kubenswrapper[5004]: I1201 08:17:41.920310 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:41 crc kubenswrapper[5004]: I1201 08:17:41.920361 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:41 crc kubenswrapper[5004]: I1201 08:17:41.920378 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:41 crc kubenswrapper[5004]: I1201 08:17:41.920401 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:41 crc kubenswrapper[5004]: I1201 08:17:41.920418 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:41Z","lastTransitionTime":"2025-12-01T08:17:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:42 crc kubenswrapper[5004]: I1201 08:17:42.023588 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:42 crc kubenswrapper[5004]: I1201 08:17:42.023633 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:42 crc kubenswrapper[5004]: I1201 08:17:42.023645 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:42 crc kubenswrapper[5004]: I1201 08:17:42.023661 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:42 crc kubenswrapper[5004]: I1201 08:17:42.023672 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:42Z","lastTransitionTime":"2025-12-01T08:17:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:42 crc kubenswrapper[5004]: I1201 08:17:42.126325 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:42 crc kubenswrapper[5004]: I1201 08:17:42.126385 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:42 crc kubenswrapper[5004]: I1201 08:17:42.126402 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:42 crc kubenswrapper[5004]: I1201 08:17:42.126427 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:42 crc kubenswrapper[5004]: I1201 08:17:42.126444 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:42Z","lastTransitionTime":"2025-12-01T08:17:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:42 crc kubenswrapper[5004]: I1201 08:17:42.229409 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:42 crc kubenswrapper[5004]: I1201 08:17:42.229454 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:42 crc kubenswrapper[5004]: I1201 08:17:42.229466 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:42 crc kubenswrapper[5004]: I1201 08:17:42.229483 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:42 crc kubenswrapper[5004]: I1201 08:17:42.229496 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:42Z","lastTransitionTime":"2025-12-01T08:17:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:42 crc kubenswrapper[5004]: I1201 08:17:42.332838 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:42 crc kubenswrapper[5004]: I1201 08:17:42.332889 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:42 crc kubenswrapper[5004]: I1201 08:17:42.332901 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:42 crc kubenswrapper[5004]: I1201 08:17:42.332923 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:42 crc kubenswrapper[5004]: I1201 08:17:42.332937 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:42Z","lastTransitionTime":"2025-12-01T08:17:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:42 crc kubenswrapper[5004]: I1201 08:17:42.435549 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:42 crc kubenswrapper[5004]: I1201 08:17:42.435666 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:42 crc kubenswrapper[5004]: I1201 08:17:42.435685 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:42 crc kubenswrapper[5004]: I1201 08:17:42.435713 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:42 crc kubenswrapper[5004]: I1201 08:17:42.435732 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:42Z","lastTransitionTime":"2025-12-01T08:17:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:42 crc kubenswrapper[5004]: I1201 08:17:42.538873 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:42 crc kubenswrapper[5004]: I1201 08:17:42.538936 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:42 crc kubenswrapper[5004]: I1201 08:17:42.538952 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:42 crc kubenswrapper[5004]: I1201 08:17:42.538981 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:42 crc kubenswrapper[5004]: I1201 08:17:42.538998 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:42Z","lastTransitionTime":"2025-12-01T08:17:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:42 crc kubenswrapper[5004]: I1201 08:17:42.641808 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:42 crc kubenswrapper[5004]: I1201 08:17:42.641855 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:42 crc kubenswrapper[5004]: I1201 08:17:42.641873 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:42 crc kubenswrapper[5004]: I1201 08:17:42.641896 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:42 crc kubenswrapper[5004]: I1201 08:17:42.641913 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:42Z","lastTransitionTime":"2025-12-01T08:17:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:42 crc kubenswrapper[5004]: I1201 08:17:42.744280 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:42 crc kubenswrapper[5004]: I1201 08:17:42.744382 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:42 crc kubenswrapper[5004]: I1201 08:17:42.744402 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:42 crc kubenswrapper[5004]: I1201 08:17:42.744464 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:42 crc kubenswrapper[5004]: I1201 08:17:42.744494 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:42Z","lastTransitionTime":"2025-12-01T08:17:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:17:42 crc kubenswrapper[5004]: I1201 08:17:42.758714 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7cl5l" Dec 01 08:17:42 crc kubenswrapper[5004]: E1201 08:17:42.758902 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7cl5l" podUID="b488f4f3-d385-4d40-bdee-96d8fe2d42a1" Dec 01 08:17:42 crc kubenswrapper[5004]: I1201 08:17:42.791601 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43c9f762-d403-49b9-8294-d66976aca4dd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b358f641ff09bc8f4a452b0d4a8cf5f9790182f76779be31426d48d0278b0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static
-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d610c3a799d4a7df9ddfe2a28f2ced2e5ac4e96c67c93f8e6d3a0eee60d9ac48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43f9f72af3273da055eeb1e13ad6258a1b3b6b205402861beee9f87f02230297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38e3cc2eca559417c71beed561569c4dc5b45ffb8407976e0695fc1b93219d37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a
93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c24fbeedbc7c40b6079b7ad89d8564a4f2cdaad2cf996e66682b082382da190\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00e163cd374befc8b8c6cdfe3916235cd63280474c2a09eeebf108fb75d378a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00e163cd374befc8b8c6cdfe3916235cd63280474c2a09eeebf108fb75d378a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37376e093170ad4138e102e3a1e2c2b45b3944bf36b0d77dce126bd493d60803\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37376e093170ad4138e102e3a1e2c2b45b3944bf36b0d77dce126bd493d60803\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e4db8d673550f14f3614e47dc0bf81e0920753be13fb01c7ef0d6bd3716611f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4db8d673550f14f3614e47dc0bf81e0920753be13fb01c7ef0d6bd3716611f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:05Z
\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:42Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:42 crc kubenswrapper[5004]: I1201 08:17:42.809028 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:42Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:42 crc kubenswrapper[5004]: I1201 08:17:42.819248 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jjms6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8828af41-beeb-47dd-96cf-3dbcb5175893\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b22220849cdb91592a44a67f9f6a09586146009483f46a0505a2dea3b02bccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvsch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jjms6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:42Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:42 crc kubenswrapper[5004]: I1201 08:17:42.833193 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9977ebb-82de-4e96-8763-0b5a84f8d4ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://baf84c2eea7167eed95a4195d37bf784ffef310bc2079d8d06e1f47cb45a7864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pwgxq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c1e51fa92aec80e93d0fce11c743b8c4fde23fc23d8626e8d5b3e25ab42350d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pwgxq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fvdgt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:42Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:42 crc kubenswrapper[5004]: I1201 08:17:42.845335 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7cl5l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b488f4f3-d385-4d40-bdee-96d8fe2d42a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-glbhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-glbhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:37Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7cl5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:42Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:42 crc 
kubenswrapper[5004]: I1201 08:17:42.847446 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:42 crc kubenswrapper[5004]: I1201 08:17:42.847499 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:42 crc kubenswrapper[5004]: I1201 08:17:42.847515 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:42 crc kubenswrapper[5004]: I1201 08:17:42.847540 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:42 crc kubenswrapper[5004]: I1201 08:17:42.847557 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:42Z","lastTransitionTime":"2025-12-01T08:17:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:42 crc kubenswrapper[5004]: I1201 08:17:42.858371 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2fa377e-a3a6-46ce-af23-e677799f0115\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b520771cf1ad6f9360064c5f304243c5878a2f1025c87d1001c97b2d5f95cb9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b65b9bc444
101900c62fd90c7d45789d186a89148ac0fd9c7f0c6048c3f55ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edbd3e682d207b8c18d6bcbc26aa2adae2a35645502d31b327ffddd51991e842\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4563974ed1467a87aab104756828e63853c282a61c4fcab1e757eb4da9afaa40\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:42Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:42 crc kubenswrapper[5004]: I1201 08:17:42.878794 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dpkxw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddcaf111-708a-45b3-a342-effd3061ab17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68ecba515f05ca83fdd0cdda10e3e5925a146aadb70ae17859586c12daf55dac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd2f03a8020818d1cc98b6eeaf64a728e627a3401400c3af53654f8771ef02eb\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd2f03a8020818d1cc98b6eeaf64a728e627a3401400c3af53654f8771ef02eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8dd31c10eaf6bc673dec377a884040524418dd1891a96b2eb6e6e983979512a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8dd31c10eaf6bc673dec377a884040524418dd1891a96b2eb6e6e983979512a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a7a8ef66a9486ca3d13826df7aa39db2083cff2c386b4d0d858e0526e0f094d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a7a8ef66a9486ca3d13826df7aa39db2083cff2c386b4d0d858e0526e0f094d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd1c4
7c976c17bf17e4d660288ce30472372b6a7f33f6ffa6d96c1a4436275d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd1c47c976c17bf17e4d660288ce30472372b6a7f33f6ffa6d96c1a4436275d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b7c8e9648e5514ab1fcb63dc72822dd276a2d74987e2fb4f4800b26ec176be7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b7c8e9648e5514ab1fcb63dc72822dd276a2d74987e2fb4f4800b26ec176be7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:28Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d7115db7fe248437f4b8918f06c9aa48b141fcd3f31add80faee81b4347e47b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d7115db7fe248437f4b8918f06c9aa48b141fcd3f31add80faee81b4347e47b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dpkxw\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:42Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:42 crc kubenswrapper[5004]: I1201 08:17:42.892805 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ww6lq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72500ffd-4ca3-4614-a3a2-bbdc5a7506c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://668cfaa2af20e2bb082fc47e1702cfd9f704c4fdf56a4d27cf25d6915e7cd18a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2025-12-01T08:17:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmsvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ww6lq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:42Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:42 crc kubenswrapper[5004]: I1201 08:17:42.911820 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0ed2b6d-0c61-4639-bc3b-1c8effc4815d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45a0d7a12e7c114de1740a151e04f7f39844b709740cab958f156ac918af3228\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d69a370af9f8f6c575f99da6bb01cee0f660dccb7e85f16f5d8874d0e263e1fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c82baee0fd931fd7b41cd5a73ca7e653ef13dfa3d573991cbe3238d6b95f6fac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa5f341e99100e985f65c24647d976a24782dfae7f52e752b774f41f69af27fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad581564fc519328ae429bce757797f6d87d8648e862b008637a245866ec4cbe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T08:17:20Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 08:17:15.185125 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 08:17:15.186793 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4071633747/tls.crt::/tmp/serving-cert-4071633747/tls.key\\\\\\\"\\\\nI1201 08:17:20.827614 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 08:17:20.834528 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 08:17:20.834549 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 08:17:20.837611 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 08:17:20.837630 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 08:17:20.848940 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 08:17:20.848957 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 08:17:20.848961 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 08:17:20.848965 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 08:17:20.848968 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 08:17:20.848970 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 08:17:20.848973 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 08:17:20.849115 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1201 08:17:20.854990 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2caeb2ebaf68110cb8728e60aae6db1b1fc48efae6018f66070766a4f42caa69\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2276a185fefa99f4e9fe63b7acce901ce2eb168f71a65f5fa97c1892aebce0e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2276a185fefa99f4e9fe63b7acce901ce
2eb168f71a65f5fa97c1892aebce0e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:42Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:42 crc kubenswrapper[5004]: I1201 08:17:42.933186 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zjksw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"70e79009-93be-49c4-a6b3-e8a06bcea7f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad3ea204ad731500a340e639f0e36abe28ba50464f1b2ff9b009e39c234cf708\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd8m6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zjksw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:42Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:42 crc kubenswrapper[5004]: I1201 08:17:42.950376 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:42 crc 
kubenswrapper[5004]: I1201 08:17:42.950656 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:42 crc kubenswrapper[5004]: I1201 08:17:42.950681 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:42 crc kubenswrapper[5004]: I1201 08:17:42.950712 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:42 crc kubenswrapper[5004]: I1201 08:17:42.950735 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:42Z","lastTransitionTime":"2025-12-01T08:17:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:17:42 crc kubenswrapper[5004]: I1201 08:17:42.964968 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15cdec0a-5925-4966-a30b-f60c503f633e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f528b94ff12c7a804930dca4b93c23334a9ae3a53317750cd34b5695f08b4582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f001b70a83da28b1ea78a94f1cd70dfc63b3de5a21b041932d33c2ac970c986b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7c5c043438b515e37a6d6d5c47b92479bbb7322fe35c3a1bc57ee7b3a746544\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23df33e45cc27e4d343ae6ec3fb88f2dfc125ba7da424515d4bee5ec19281832\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4a7cab8e1594814a687b1433a92642a19eff7c2eb12f96fa06f08a2e270733d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1844a18a6109c25a25b9c8cb3ff65e69cc657f66be03dd97e883d24a774b7638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://404ed4cf3ada652b914fc9fd3295d149810f22a6cc4ea044f934ba8ee2595209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0292bc1ff208619ec4cb469a8fbcddf2915fccba3a6ebc07d1ecc8bf1092007\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T08:17:33Z\\\",\\\"message\\\":\\\"al\\\\nI1201 08:17:33.210759 6283 handler.go:190] Sending *v1.Pod event handler 3 for 
removal\\\\nI1201 08:17:33.210788 6283 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1201 08:17:33.211067 6283 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1201 08:17:33.211080 6283 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1201 08:17:33.211080 6283 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1201 08:17:33.211177 6283 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1201 08:17:33.211209 6283 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1201 08:17:33.211241 6283 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1201 08:17:33.211253 6283 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1201 08:17:33.211264 6283 handler.go:208] Removed *v1.Node event handler 2\\\\nI1201 08:17:33.211283 6283 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1201 08:17:33.211300 6283 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1201 08:17:33.211310 6283 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1201 08:17:33.211316 6283 factory.go:656] Stopping watch factory\\\\nI1201 08:17:33.211312 6283 handler.go:208] Removed *v1.Node event handler 7\\\\nI1201 08:17:33.211333 6283 ovnkube.go:599] Stopped ovnkube\\\\nI1201 08:17:3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:30Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://404ed4cf3ada652b914fc9fd3295d149810f22a6cc4ea044f934ba8ee2595209\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T08:17:36Z\\\",\\\"message\\\":\\\"ping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 08:17:35.520098 6428 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1201 08:17:35.520395 6428 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 08:17:35.520708 6428 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 08:17:35.520930 6428 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 08:17:35.521362 6428 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1201 08:17:35.521431 6428 factory.go:656] Stopping watch factory\\\\nI1201 08:17:35.521454 6428 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1201 08:17:35.523766 6428 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI1201 08:17:35.523786 6428 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI1201 08:17:35.523846 6428 ovnkube.go:599] Stopped ovnkube\\\\nI1201 08:17:35.523873 6428 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1201 08:17:35.523949 6428 ovnkube.go:137] failed to run 
ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f231e0092e1a85cc5d6e00fb8d3bd982223b670191ac3dc21d81ff1fab462a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97c9fab31c2655b5d579a2ed9b837229ee678973643c6a526c1241e3c84e315c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97c9fab31c2655b5d579a2ed9b837229ee678973643c6a526c1241e3c84e315c\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-knmdv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:42Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:42 crc kubenswrapper[5004]: I1201 08:17:42.987654 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://327477e7db10129d47ddf9078f32d0df2ffd4b8e59fc7bb77c17c60c63e9a936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:42Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:43 crc kubenswrapper[5004]: I1201 08:17:43.006662 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:43Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:43 crc kubenswrapper[5004]: I1201 08:17:43.029506 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:43Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:43 crc kubenswrapper[5004]: I1201 08:17:43.050426 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03c470edd984db3b756e031e33f64a9bc5ca6b9c156ab479e5cd647745655a5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T08:17:43Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:43 crc kubenswrapper[5004]: I1201 08:17:43.053082 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:43 crc kubenswrapper[5004]: I1201 08:17:43.053118 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:43 crc kubenswrapper[5004]: I1201 08:17:43.053130 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:43 crc kubenswrapper[5004]: I1201 08:17:43.053148 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:43 crc kubenswrapper[5004]: I1201 08:17:43.053162 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:43Z","lastTransitionTime":"2025-12-01T08:17:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:43 crc kubenswrapper[5004]: I1201 08:17:43.073831 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fd2f2b5d7dc3d7b8e1bb06fdf0eec3addfec3c230e044b6395d6cf535630ebd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b2250161e400b201f0528e33231c9007342002a37cd037a1b1e011f2aaaa22\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:43Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:43 crc kubenswrapper[5004]: I1201 08:17:43.091110 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mzsvz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"397b51b7-934a-41d1-a593-500a64161bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4adac5633dfe89777bf019818bab9ee3a208cad9d929c96cf2cb86b18c2d4264\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4h49g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d83844478080e4829975fb6c8e0444d9fdebd
44b08afcf45e7d0b04fc534a844\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4h49g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mzsvz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:43Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:43 crc kubenswrapper[5004]: I1201 08:17:43.156120 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:43 crc kubenswrapper[5004]: I1201 08:17:43.156180 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:43 crc kubenswrapper[5004]: I1201 08:17:43.156198 5004 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:43 crc kubenswrapper[5004]: I1201 08:17:43.156222 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:43 crc kubenswrapper[5004]: I1201 08:17:43.156239 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:43Z","lastTransitionTime":"2025-12-01T08:17:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:17:43 crc kubenswrapper[5004]: I1201 08:17:43.258901 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:43 crc kubenswrapper[5004]: I1201 08:17:43.259194 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:43 crc kubenswrapper[5004]: I1201 08:17:43.259375 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:43 crc kubenswrapper[5004]: I1201 08:17:43.259545 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:43 crc kubenswrapper[5004]: I1201 08:17:43.259745 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:43Z","lastTransitionTime":"2025-12-01T08:17:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:43 crc kubenswrapper[5004]: I1201 08:17:43.361513 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:43 crc kubenswrapper[5004]: I1201 08:17:43.361544 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:43 crc kubenswrapper[5004]: I1201 08:17:43.361553 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:43 crc kubenswrapper[5004]: I1201 08:17:43.361592 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:43 crc kubenswrapper[5004]: I1201 08:17:43.361605 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:43Z","lastTransitionTime":"2025-12-01T08:17:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:43 crc kubenswrapper[5004]: I1201 08:17:43.464270 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:43 crc kubenswrapper[5004]: I1201 08:17:43.464807 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:43 crc kubenswrapper[5004]: I1201 08:17:43.465045 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:43 crc kubenswrapper[5004]: I1201 08:17:43.465254 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:43 crc kubenswrapper[5004]: I1201 08:17:43.465423 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:43Z","lastTransitionTime":"2025-12-01T08:17:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:43 crc kubenswrapper[5004]: I1201 08:17:43.568343 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:43 crc kubenswrapper[5004]: I1201 08:17:43.568421 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:43 crc kubenswrapper[5004]: I1201 08:17:43.568442 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:43 crc kubenswrapper[5004]: I1201 08:17:43.568470 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:43 crc kubenswrapper[5004]: I1201 08:17:43.568492 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:43Z","lastTransitionTime":"2025-12-01T08:17:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:43 crc kubenswrapper[5004]: I1201 08:17:43.671288 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:43 crc kubenswrapper[5004]: I1201 08:17:43.671352 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:43 crc kubenswrapper[5004]: I1201 08:17:43.671370 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:43 crc kubenswrapper[5004]: I1201 08:17:43.671398 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:43 crc kubenswrapper[5004]: I1201 08:17:43.671415 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:43Z","lastTransitionTime":"2025-12-01T08:17:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:17:43 crc kubenswrapper[5004]: I1201 08:17:43.758665 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 08:17:43 crc kubenswrapper[5004]: I1201 08:17:43.758800 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 08:17:43 crc kubenswrapper[5004]: E1201 08:17:43.759007 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 08:17:43 crc kubenswrapper[5004]: E1201 08:17:43.759126 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 08:17:43 crc kubenswrapper[5004]: I1201 08:17:43.759407 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 08:17:43 crc kubenswrapper[5004]: E1201 08:17:43.759823 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 08:17:43 crc kubenswrapper[5004]: I1201 08:17:43.775119 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:43 crc kubenswrapper[5004]: I1201 08:17:43.775371 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:43 crc kubenswrapper[5004]: I1201 08:17:43.775542 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:43 crc kubenswrapper[5004]: I1201 08:17:43.775744 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:43 crc kubenswrapper[5004]: I1201 08:17:43.775913 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:43Z","lastTransitionTime":"2025-12-01T08:17:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:43 crc kubenswrapper[5004]: I1201 08:17:43.878873 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:43 crc kubenswrapper[5004]: I1201 08:17:43.879216 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:43 crc kubenswrapper[5004]: I1201 08:17:43.879647 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:43 crc kubenswrapper[5004]: I1201 08:17:43.879991 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:43 crc kubenswrapper[5004]: I1201 08:17:43.880297 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:43Z","lastTransitionTime":"2025-12-01T08:17:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:43 crc kubenswrapper[5004]: I1201 08:17:43.983306 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:43 crc kubenswrapper[5004]: I1201 08:17:43.983395 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:43 crc kubenswrapper[5004]: I1201 08:17:43.983413 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:43 crc kubenswrapper[5004]: I1201 08:17:43.983440 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:43 crc kubenswrapper[5004]: I1201 08:17:43.983458 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:43Z","lastTransitionTime":"2025-12-01T08:17:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:44 crc kubenswrapper[5004]: I1201 08:17:44.086241 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:44 crc kubenswrapper[5004]: I1201 08:17:44.086313 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:44 crc kubenswrapper[5004]: I1201 08:17:44.086337 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:44 crc kubenswrapper[5004]: I1201 08:17:44.086362 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:44 crc kubenswrapper[5004]: I1201 08:17:44.086378 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:44Z","lastTransitionTime":"2025-12-01T08:17:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:44 crc kubenswrapper[5004]: I1201 08:17:44.190385 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:44 crc kubenswrapper[5004]: I1201 08:17:44.190469 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:44 crc kubenswrapper[5004]: I1201 08:17:44.190509 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:44 crc kubenswrapper[5004]: I1201 08:17:44.190544 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:44 crc kubenswrapper[5004]: I1201 08:17:44.190627 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:44Z","lastTransitionTime":"2025-12-01T08:17:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:44 crc kubenswrapper[5004]: I1201 08:17:44.294048 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:44 crc kubenswrapper[5004]: I1201 08:17:44.294085 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:44 crc kubenswrapper[5004]: I1201 08:17:44.294099 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:44 crc kubenswrapper[5004]: I1201 08:17:44.294116 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:44 crc kubenswrapper[5004]: I1201 08:17:44.294128 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:44Z","lastTransitionTime":"2025-12-01T08:17:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:44 crc kubenswrapper[5004]: I1201 08:17:44.397111 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:44 crc kubenswrapper[5004]: I1201 08:17:44.397171 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:44 crc kubenswrapper[5004]: I1201 08:17:44.397190 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:44 crc kubenswrapper[5004]: I1201 08:17:44.397219 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:44 crc kubenswrapper[5004]: I1201 08:17:44.397240 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:44Z","lastTransitionTime":"2025-12-01T08:17:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:44 crc kubenswrapper[5004]: I1201 08:17:44.499747 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:44 crc kubenswrapper[5004]: I1201 08:17:44.499813 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:44 crc kubenswrapper[5004]: I1201 08:17:44.499831 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:44 crc kubenswrapper[5004]: I1201 08:17:44.499855 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:44 crc kubenswrapper[5004]: I1201 08:17:44.499872 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:44Z","lastTransitionTime":"2025-12-01T08:17:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:44 crc kubenswrapper[5004]: I1201 08:17:44.602943 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:44 crc kubenswrapper[5004]: I1201 08:17:44.603005 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:44 crc kubenswrapper[5004]: I1201 08:17:44.603021 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:44 crc kubenswrapper[5004]: I1201 08:17:44.603043 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:44 crc kubenswrapper[5004]: I1201 08:17:44.603060 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:44Z","lastTransitionTime":"2025-12-01T08:17:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:44 crc kubenswrapper[5004]: I1201 08:17:44.705942 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:44 crc kubenswrapper[5004]: I1201 08:17:44.706004 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:44 crc kubenswrapper[5004]: I1201 08:17:44.706014 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:44 crc kubenswrapper[5004]: I1201 08:17:44.706028 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:44 crc kubenswrapper[5004]: I1201 08:17:44.706039 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:44Z","lastTransitionTime":"2025-12-01T08:17:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:17:44 crc kubenswrapper[5004]: I1201 08:17:44.759032 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7cl5l" Dec 01 08:17:44 crc kubenswrapper[5004]: E1201 08:17:44.759217 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7cl5l" podUID="b488f4f3-d385-4d40-bdee-96d8fe2d42a1" Dec 01 08:17:44 crc kubenswrapper[5004]: I1201 08:17:44.808260 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:44 crc kubenswrapper[5004]: I1201 08:17:44.808317 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:44 crc kubenswrapper[5004]: I1201 08:17:44.808334 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:44 crc kubenswrapper[5004]: I1201 08:17:44.808355 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:44 crc kubenswrapper[5004]: I1201 08:17:44.808373 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:44Z","lastTransitionTime":"2025-12-01T08:17:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:44 crc kubenswrapper[5004]: I1201 08:17:44.854515 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:44 crc kubenswrapper[5004]: I1201 08:17:44.854612 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:44 crc kubenswrapper[5004]: I1201 08:17:44.854629 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:44 crc kubenswrapper[5004]: I1201 08:17:44.854653 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:44 crc kubenswrapper[5004]: I1201 08:17:44.854672 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:44Z","lastTransitionTime":"2025-12-01T08:17:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:44 crc kubenswrapper[5004]: I1201 08:17:44.860042 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b488f4f3-d385-4d40-bdee-96d8fe2d42a1-metrics-certs\") pod \"network-metrics-daemon-7cl5l\" (UID: \"b488f4f3-d385-4d40-bdee-96d8fe2d42a1\") " pod="openshift-multus/network-metrics-daemon-7cl5l" Dec 01 08:17:44 crc kubenswrapper[5004]: E1201 08:17:44.860215 5004 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 08:17:44 crc kubenswrapper[5004]: E1201 08:17:44.860332 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b488f4f3-d385-4d40-bdee-96d8fe2d42a1-metrics-certs podName:b488f4f3-d385-4d40-bdee-96d8fe2d42a1 nodeName:}" failed. No retries permitted until 2025-12-01 08:17:52.86030168 +0000 UTC m=+50.425293702 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b488f4f3-d385-4d40-bdee-96d8fe2d42a1-metrics-certs") pod "network-metrics-daemon-7cl5l" (UID: "b488f4f3-d385-4d40-bdee-96d8fe2d42a1") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 08:17:44 crc kubenswrapper[5004]: E1201 08:17:44.870437 5004 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:17:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:17:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:17:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:17:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb4
9c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\"
:[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d4
6c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\
\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c8341267-df98-484f-a5e7-c
f024fab437c\\\",\\\"systemUUID\\\":\\\"77547a65-c048-47ee-89b1-8422fc81b7aa\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:44Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:44 crc kubenswrapper[5004]: I1201 08:17:44.874977 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:44 crc kubenswrapper[5004]: I1201 08:17:44.875043 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:44 crc kubenswrapper[5004]: I1201 08:17:44.875061 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:44 crc kubenswrapper[5004]: I1201 08:17:44.875085 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:44 crc kubenswrapper[5004]: I1201 08:17:44.875101 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:44Z","lastTransitionTime":"2025-12-01T08:17:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:44 crc kubenswrapper[5004]: E1201 08:17:44.895653 5004 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:17:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:17:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:17:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:17:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c8341267-df98-484f-a5e7-cf024fab437c\\\",\\\"systemUUID\\\":\\\"77547a65-c048-47ee-89b1-8422fc81b7aa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:44Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:44 crc kubenswrapper[5004]: I1201 08:17:44.899617 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:44 crc kubenswrapper[5004]: I1201 08:17:44.899652 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:44 crc kubenswrapper[5004]: I1201 08:17:44.899662 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:44 crc kubenswrapper[5004]: I1201 08:17:44.899678 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:44 crc kubenswrapper[5004]: I1201 08:17:44.899690 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:44Z","lastTransitionTime":"2025-12-01T08:17:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:44 crc kubenswrapper[5004]: E1201 08:17:44.919079 5004 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:17:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:17:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:17:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:17:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c8341267-df98-484f-a5e7-cf024fab437c\\\",\\\"systemUUID\\\":\\\"77547a65-c048-47ee-89b1-8422fc81b7aa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:44Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:44 crc kubenswrapper[5004]: I1201 08:17:44.923310 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:44 crc kubenswrapper[5004]: I1201 08:17:44.923378 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:44 crc kubenswrapper[5004]: I1201 08:17:44.923395 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:44 crc kubenswrapper[5004]: I1201 08:17:44.923418 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:44 crc kubenswrapper[5004]: I1201 08:17:44.923437 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:44Z","lastTransitionTime":"2025-12-01T08:17:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:44 crc kubenswrapper[5004]: E1201 08:17:44.938362 5004 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:17:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:17:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:17:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:17:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c8341267-df98-484f-a5e7-cf024fab437c\\\",\\\"systemUUID\\\":\\\"77547a65-c048-47ee-89b1-8422fc81b7aa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:44Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:44 crc kubenswrapper[5004]: I1201 08:17:44.943257 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:44 crc kubenswrapper[5004]: I1201 08:17:44.943312 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:44 crc kubenswrapper[5004]: I1201 08:17:44.943329 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:44 crc kubenswrapper[5004]: I1201 08:17:44.943354 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:44 crc kubenswrapper[5004]: I1201 08:17:44.943371 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:44Z","lastTransitionTime":"2025-12-01T08:17:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:44 crc kubenswrapper[5004]: E1201 08:17:44.962164 5004 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:17:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:17:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:17:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:17:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c8341267-df98-484f-a5e7-cf024fab437c\\\",\\\"systemUUID\\\":\\\"77547a65-c048-47ee-89b1-8422fc81b7aa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:44Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:44 crc kubenswrapper[5004]: E1201 08:17:44.962389 5004 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 01 08:17:44 crc kubenswrapper[5004]: I1201 08:17:44.964183 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:44 crc kubenswrapper[5004]: I1201 08:17:44.964217 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:44 crc kubenswrapper[5004]: I1201 08:17:44.964231 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:44 crc kubenswrapper[5004]: I1201 08:17:44.964246 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:44 crc kubenswrapper[5004]: I1201 08:17:44.964258 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:44Z","lastTransitionTime":"2025-12-01T08:17:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:45 crc kubenswrapper[5004]: I1201 08:17:45.067238 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:45 crc kubenswrapper[5004]: I1201 08:17:45.067287 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:45 crc kubenswrapper[5004]: I1201 08:17:45.067306 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:45 crc kubenswrapper[5004]: I1201 08:17:45.067327 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:45 crc kubenswrapper[5004]: I1201 08:17:45.067341 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:45Z","lastTransitionTime":"2025-12-01T08:17:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:45 crc kubenswrapper[5004]: I1201 08:17:45.169381 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:45 crc kubenswrapper[5004]: I1201 08:17:45.169450 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:45 crc kubenswrapper[5004]: I1201 08:17:45.169466 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:45 crc kubenswrapper[5004]: I1201 08:17:45.169492 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:45 crc kubenswrapper[5004]: I1201 08:17:45.169509 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:45Z","lastTransitionTime":"2025-12-01T08:17:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:45 crc kubenswrapper[5004]: I1201 08:17:45.272292 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:45 crc kubenswrapper[5004]: I1201 08:17:45.272398 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:45 crc kubenswrapper[5004]: I1201 08:17:45.272408 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:45 crc kubenswrapper[5004]: I1201 08:17:45.272421 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:45 crc kubenswrapper[5004]: I1201 08:17:45.272429 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:45Z","lastTransitionTime":"2025-12-01T08:17:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:45 crc kubenswrapper[5004]: I1201 08:17:45.374999 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:45 crc kubenswrapper[5004]: I1201 08:17:45.375048 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:45 crc kubenswrapper[5004]: I1201 08:17:45.375064 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:45 crc kubenswrapper[5004]: I1201 08:17:45.375089 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:45 crc kubenswrapper[5004]: I1201 08:17:45.375106 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:45Z","lastTransitionTime":"2025-12-01T08:17:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:45 crc kubenswrapper[5004]: I1201 08:17:45.477615 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:45 crc kubenswrapper[5004]: I1201 08:17:45.477680 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:45 crc kubenswrapper[5004]: I1201 08:17:45.477697 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:45 crc kubenswrapper[5004]: I1201 08:17:45.477720 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:45 crc kubenswrapper[5004]: I1201 08:17:45.477737 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:45Z","lastTransitionTime":"2025-12-01T08:17:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:45 crc kubenswrapper[5004]: I1201 08:17:45.580547 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:45 crc kubenswrapper[5004]: I1201 08:17:45.580644 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:45 crc kubenswrapper[5004]: I1201 08:17:45.580663 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:45 crc kubenswrapper[5004]: I1201 08:17:45.580686 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:45 crc kubenswrapper[5004]: I1201 08:17:45.580703 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:45Z","lastTransitionTime":"2025-12-01T08:17:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:45 crc kubenswrapper[5004]: I1201 08:17:45.684102 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:45 crc kubenswrapper[5004]: I1201 08:17:45.684175 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:45 crc kubenswrapper[5004]: I1201 08:17:45.684206 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:45 crc kubenswrapper[5004]: I1201 08:17:45.684242 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:45 crc kubenswrapper[5004]: I1201 08:17:45.684272 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:45Z","lastTransitionTime":"2025-12-01T08:17:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:17:45 crc kubenswrapper[5004]: I1201 08:17:45.758734 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 08:17:45 crc kubenswrapper[5004]: I1201 08:17:45.758779 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 08:17:45 crc kubenswrapper[5004]: I1201 08:17:45.758786 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 08:17:45 crc kubenswrapper[5004]: E1201 08:17:45.758903 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 08:17:45 crc kubenswrapper[5004]: E1201 08:17:45.759042 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 08:17:45 crc kubenswrapper[5004]: E1201 08:17:45.759136 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 08:17:45 crc kubenswrapper[5004]: I1201 08:17:45.787653 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:45 crc kubenswrapper[5004]: I1201 08:17:45.787725 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:45 crc kubenswrapper[5004]: I1201 08:17:45.787751 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:45 crc kubenswrapper[5004]: I1201 08:17:45.787780 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:45 crc kubenswrapper[5004]: I1201 08:17:45.787802 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:45Z","lastTransitionTime":"2025-12-01T08:17:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:45 crc kubenswrapper[5004]: I1201 08:17:45.891336 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:45 crc kubenswrapper[5004]: I1201 08:17:45.891385 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:45 crc kubenswrapper[5004]: I1201 08:17:45.891401 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:45 crc kubenswrapper[5004]: I1201 08:17:45.891420 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:45 crc kubenswrapper[5004]: I1201 08:17:45.891434 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:45Z","lastTransitionTime":"2025-12-01T08:17:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:45 crc kubenswrapper[5004]: I1201 08:17:45.994990 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:45 crc kubenswrapper[5004]: I1201 08:17:45.995077 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:45 crc kubenswrapper[5004]: I1201 08:17:45.995094 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:45 crc kubenswrapper[5004]: I1201 08:17:45.995123 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:45 crc kubenswrapper[5004]: I1201 08:17:45.995139 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:45Z","lastTransitionTime":"2025-12-01T08:17:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:46 crc kubenswrapper[5004]: I1201 08:17:46.097146 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:46 crc kubenswrapper[5004]: I1201 08:17:46.097212 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:46 crc kubenswrapper[5004]: I1201 08:17:46.097228 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:46 crc kubenswrapper[5004]: I1201 08:17:46.097255 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:46 crc kubenswrapper[5004]: I1201 08:17:46.097273 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:46Z","lastTransitionTime":"2025-12-01T08:17:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:46 crc kubenswrapper[5004]: I1201 08:17:46.200167 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:46 crc kubenswrapper[5004]: I1201 08:17:46.200207 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:46 crc kubenswrapper[5004]: I1201 08:17:46.200217 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:46 crc kubenswrapper[5004]: I1201 08:17:46.200234 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:46 crc kubenswrapper[5004]: I1201 08:17:46.200245 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:46Z","lastTransitionTime":"2025-12-01T08:17:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:46 crc kubenswrapper[5004]: I1201 08:17:46.302540 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:46 crc kubenswrapper[5004]: I1201 08:17:46.302608 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:46 crc kubenswrapper[5004]: I1201 08:17:46.302616 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:46 crc kubenswrapper[5004]: I1201 08:17:46.302630 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:46 crc kubenswrapper[5004]: I1201 08:17:46.302640 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:46Z","lastTransitionTime":"2025-12-01T08:17:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:46 crc kubenswrapper[5004]: I1201 08:17:46.405606 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:46 crc kubenswrapper[5004]: I1201 08:17:46.405649 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:46 crc kubenswrapper[5004]: I1201 08:17:46.405660 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:46 crc kubenswrapper[5004]: I1201 08:17:46.405676 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:46 crc kubenswrapper[5004]: I1201 08:17:46.405686 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:46Z","lastTransitionTime":"2025-12-01T08:17:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:46 crc kubenswrapper[5004]: I1201 08:17:46.508240 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:46 crc kubenswrapper[5004]: I1201 08:17:46.508300 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:46 crc kubenswrapper[5004]: I1201 08:17:46.508317 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:46 crc kubenswrapper[5004]: I1201 08:17:46.508341 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:46 crc kubenswrapper[5004]: I1201 08:17:46.508360 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:46Z","lastTransitionTime":"2025-12-01T08:17:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:46 crc kubenswrapper[5004]: I1201 08:17:46.611209 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:46 crc kubenswrapper[5004]: I1201 08:17:46.611421 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:46 crc kubenswrapper[5004]: I1201 08:17:46.611540 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:46 crc kubenswrapper[5004]: I1201 08:17:46.611678 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:46 crc kubenswrapper[5004]: I1201 08:17:46.611762 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:46Z","lastTransitionTime":"2025-12-01T08:17:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:46 crc kubenswrapper[5004]: I1201 08:17:46.714142 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:46 crc kubenswrapper[5004]: I1201 08:17:46.714170 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:46 crc kubenswrapper[5004]: I1201 08:17:46.714180 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:46 crc kubenswrapper[5004]: I1201 08:17:46.714195 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:46 crc kubenswrapper[5004]: I1201 08:17:46.714205 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:46Z","lastTransitionTime":"2025-12-01T08:17:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:17:46 crc kubenswrapper[5004]: I1201 08:17:46.768934 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7cl5l" Dec 01 08:17:46 crc kubenswrapper[5004]: E1201 08:17:46.769156 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7cl5l" podUID="b488f4f3-d385-4d40-bdee-96d8fe2d42a1" Dec 01 08:17:46 crc kubenswrapper[5004]: I1201 08:17:46.816513 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:46 crc kubenswrapper[5004]: I1201 08:17:46.816586 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:46 crc kubenswrapper[5004]: I1201 08:17:46.816603 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:46 crc kubenswrapper[5004]: I1201 08:17:46.816623 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:46 crc kubenswrapper[5004]: I1201 08:17:46.816638 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:46Z","lastTransitionTime":"2025-12-01T08:17:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:46 crc kubenswrapper[5004]: I1201 08:17:46.919215 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:46 crc kubenswrapper[5004]: I1201 08:17:46.919306 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:46 crc kubenswrapper[5004]: I1201 08:17:46.919325 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:46 crc kubenswrapper[5004]: I1201 08:17:46.919352 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:46 crc kubenswrapper[5004]: I1201 08:17:46.919369 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:46Z","lastTransitionTime":"2025-12-01T08:17:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:47 crc kubenswrapper[5004]: I1201 08:17:47.023013 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:47 crc kubenswrapper[5004]: I1201 08:17:47.023076 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:47 crc kubenswrapper[5004]: I1201 08:17:47.023093 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:47 crc kubenswrapper[5004]: I1201 08:17:47.023117 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:47 crc kubenswrapper[5004]: I1201 08:17:47.023137 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:47Z","lastTransitionTime":"2025-12-01T08:17:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:47 crc kubenswrapper[5004]: I1201 08:17:47.126332 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:47 crc kubenswrapper[5004]: I1201 08:17:47.126391 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:47 crc kubenswrapper[5004]: I1201 08:17:47.126413 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:47 crc kubenswrapper[5004]: I1201 08:17:47.126439 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:47 crc kubenswrapper[5004]: I1201 08:17:47.126456 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:47Z","lastTransitionTime":"2025-12-01T08:17:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:47 crc kubenswrapper[5004]: I1201 08:17:47.229373 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:47 crc kubenswrapper[5004]: I1201 08:17:47.229417 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:47 crc kubenswrapper[5004]: I1201 08:17:47.229431 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:47 crc kubenswrapper[5004]: I1201 08:17:47.229447 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:47 crc kubenswrapper[5004]: I1201 08:17:47.229461 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:47Z","lastTransitionTime":"2025-12-01T08:17:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:47 crc kubenswrapper[5004]: I1201 08:17:47.332224 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:47 crc kubenswrapper[5004]: I1201 08:17:47.332300 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:47 crc kubenswrapper[5004]: I1201 08:17:47.332325 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:47 crc kubenswrapper[5004]: I1201 08:17:47.332356 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:47 crc kubenswrapper[5004]: I1201 08:17:47.332378 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:47Z","lastTransitionTime":"2025-12-01T08:17:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:47 crc kubenswrapper[5004]: I1201 08:17:47.434838 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:47 crc kubenswrapper[5004]: I1201 08:17:47.434903 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:47 crc kubenswrapper[5004]: I1201 08:17:47.434921 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:47 crc kubenswrapper[5004]: I1201 08:17:47.434947 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:47 crc kubenswrapper[5004]: I1201 08:17:47.434964 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:47Z","lastTransitionTime":"2025-12-01T08:17:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:47 crc kubenswrapper[5004]: I1201 08:17:47.537760 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:47 crc kubenswrapper[5004]: I1201 08:17:47.537822 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:47 crc kubenswrapper[5004]: I1201 08:17:47.537840 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:47 crc kubenswrapper[5004]: I1201 08:17:47.537863 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:47 crc kubenswrapper[5004]: I1201 08:17:47.537889 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:47Z","lastTransitionTime":"2025-12-01T08:17:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:47 crc kubenswrapper[5004]: I1201 08:17:47.640637 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:47 crc kubenswrapper[5004]: I1201 08:17:47.640730 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:47 crc kubenswrapper[5004]: I1201 08:17:47.640748 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:47 crc kubenswrapper[5004]: I1201 08:17:47.640770 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:47 crc kubenswrapper[5004]: I1201 08:17:47.640787 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:47Z","lastTransitionTime":"2025-12-01T08:17:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:47 crc kubenswrapper[5004]: I1201 08:17:47.744501 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:47 crc kubenswrapper[5004]: I1201 08:17:47.744597 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:47 crc kubenswrapper[5004]: I1201 08:17:47.744615 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:47 crc kubenswrapper[5004]: I1201 08:17:47.744639 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:47 crc kubenswrapper[5004]: I1201 08:17:47.744656 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:47Z","lastTransitionTime":"2025-12-01T08:17:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:17:47 crc kubenswrapper[5004]: I1201 08:17:47.758738 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 08:17:47 crc kubenswrapper[5004]: I1201 08:17:47.758741 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 08:17:47 crc kubenswrapper[5004]: E1201 08:17:47.759155 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 08:17:47 crc kubenswrapper[5004]: E1201 08:17:47.759224 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 08:17:47 crc kubenswrapper[5004]: I1201 08:17:47.758781 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 08:17:47 crc kubenswrapper[5004]: E1201 08:17:47.759380 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 08:17:47 crc kubenswrapper[5004]: I1201 08:17:47.847957 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:47 crc kubenswrapper[5004]: I1201 08:17:47.848027 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:47 crc kubenswrapper[5004]: I1201 08:17:47.848048 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:47 crc kubenswrapper[5004]: I1201 08:17:47.848077 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:47 crc kubenswrapper[5004]: I1201 08:17:47.848096 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:47Z","lastTransitionTime":"2025-12-01T08:17:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:47 crc kubenswrapper[5004]: I1201 08:17:47.950792 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:47 crc kubenswrapper[5004]: I1201 08:17:47.950841 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:47 crc kubenswrapper[5004]: I1201 08:17:47.950854 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:47 crc kubenswrapper[5004]: I1201 08:17:47.950870 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:47 crc kubenswrapper[5004]: I1201 08:17:47.950884 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:47Z","lastTransitionTime":"2025-12-01T08:17:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:48 crc kubenswrapper[5004]: I1201 08:17:48.054003 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:48 crc kubenswrapper[5004]: I1201 08:17:48.054069 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:48 crc kubenswrapper[5004]: I1201 08:17:48.054086 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:48 crc kubenswrapper[5004]: I1201 08:17:48.054114 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:48 crc kubenswrapper[5004]: I1201 08:17:48.054131 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:48Z","lastTransitionTime":"2025-12-01T08:17:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:48 crc kubenswrapper[5004]: I1201 08:17:48.156691 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:48 crc kubenswrapper[5004]: I1201 08:17:48.156767 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:48 crc kubenswrapper[5004]: I1201 08:17:48.156789 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:48 crc kubenswrapper[5004]: I1201 08:17:48.156818 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:48 crc kubenswrapper[5004]: I1201 08:17:48.156844 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:48Z","lastTransitionTime":"2025-12-01T08:17:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:48 crc kubenswrapper[5004]: I1201 08:17:48.260145 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:48 crc kubenswrapper[5004]: I1201 08:17:48.260243 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:48 crc kubenswrapper[5004]: I1201 08:17:48.260262 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:48 crc kubenswrapper[5004]: I1201 08:17:48.260321 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:48 crc kubenswrapper[5004]: I1201 08:17:48.260344 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:48Z","lastTransitionTime":"2025-12-01T08:17:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:48 crc kubenswrapper[5004]: I1201 08:17:48.362866 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:48 crc kubenswrapper[5004]: I1201 08:17:48.362929 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:48 crc kubenswrapper[5004]: I1201 08:17:48.362947 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:48 crc kubenswrapper[5004]: I1201 08:17:48.362973 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:48 crc kubenswrapper[5004]: I1201 08:17:48.362990 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:48Z","lastTransitionTime":"2025-12-01T08:17:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:48 crc kubenswrapper[5004]: I1201 08:17:48.465901 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:48 crc kubenswrapper[5004]: I1201 08:17:48.465961 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:48 crc kubenswrapper[5004]: I1201 08:17:48.465980 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:48 crc kubenswrapper[5004]: I1201 08:17:48.466004 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:48 crc kubenswrapper[5004]: I1201 08:17:48.466022 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:48Z","lastTransitionTime":"2025-12-01T08:17:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:48 crc kubenswrapper[5004]: I1201 08:17:48.569013 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:48 crc kubenswrapper[5004]: I1201 08:17:48.569074 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:48 crc kubenswrapper[5004]: I1201 08:17:48.569090 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:48 crc kubenswrapper[5004]: I1201 08:17:48.569120 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:48 crc kubenswrapper[5004]: I1201 08:17:48.569137 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:48Z","lastTransitionTime":"2025-12-01T08:17:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:48 crc kubenswrapper[5004]: I1201 08:17:48.672602 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:48 crc kubenswrapper[5004]: I1201 08:17:48.672653 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:48 crc kubenswrapper[5004]: I1201 08:17:48.672669 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:48 crc kubenswrapper[5004]: I1201 08:17:48.672691 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:48 crc kubenswrapper[5004]: I1201 08:17:48.672708 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:48Z","lastTransitionTime":"2025-12-01T08:17:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:17:48 crc kubenswrapper[5004]: I1201 08:17:48.757937 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7cl5l" Dec 01 08:17:48 crc kubenswrapper[5004]: E1201 08:17:48.758085 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7cl5l" podUID="b488f4f3-d385-4d40-bdee-96d8fe2d42a1" Dec 01 08:17:48 crc kubenswrapper[5004]: I1201 08:17:48.774722 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:48 crc kubenswrapper[5004]: I1201 08:17:48.774782 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:48 crc kubenswrapper[5004]: I1201 08:17:48.774799 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:48 crc kubenswrapper[5004]: I1201 08:17:48.774821 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:48 crc kubenswrapper[5004]: I1201 08:17:48.774839 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:48Z","lastTransitionTime":"2025-12-01T08:17:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:48 crc kubenswrapper[5004]: I1201 08:17:48.877848 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:48 crc kubenswrapper[5004]: I1201 08:17:48.877891 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:48 crc kubenswrapper[5004]: I1201 08:17:48.877902 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:48 crc kubenswrapper[5004]: I1201 08:17:48.877917 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:48 crc kubenswrapper[5004]: I1201 08:17:48.877928 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:48Z","lastTransitionTime":"2025-12-01T08:17:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:48 crc kubenswrapper[5004]: I1201 08:17:48.981832 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:48 crc kubenswrapper[5004]: I1201 08:17:48.981882 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:48 crc kubenswrapper[5004]: I1201 08:17:48.981902 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:48 crc kubenswrapper[5004]: I1201 08:17:48.981926 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:48 crc kubenswrapper[5004]: I1201 08:17:48.981943 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:48Z","lastTransitionTime":"2025-12-01T08:17:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:49 crc kubenswrapper[5004]: I1201 08:17:49.085241 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:49 crc kubenswrapper[5004]: I1201 08:17:49.085294 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:49 crc kubenswrapper[5004]: I1201 08:17:49.085311 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:49 crc kubenswrapper[5004]: I1201 08:17:49.085336 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:49 crc kubenswrapper[5004]: I1201 08:17:49.085353 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:49Z","lastTransitionTime":"2025-12-01T08:17:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:49 crc kubenswrapper[5004]: I1201 08:17:49.188026 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:49 crc kubenswrapper[5004]: I1201 08:17:49.188062 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:49 crc kubenswrapper[5004]: I1201 08:17:49.188074 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:49 crc kubenswrapper[5004]: I1201 08:17:49.188093 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:49 crc kubenswrapper[5004]: I1201 08:17:49.188108 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:49Z","lastTransitionTime":"2025-12-01T08:17:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:49 crc kubenswrapper[5004]: I1201 08:17:49.290901 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:49 crc kubenswrapper[5004]: I1201 08:17:49.290963 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:49 crc kubenswrapper[5004]: I1201 08:17:49.290980 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:49 crc kubenswrapper[5004]: I1201 08:17:49.291005 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:49 crc kubenswrapper[5004]: I1201 08:17:49.291022 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:49Z","lastTransitionTime":"2025-12-01T08:17:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:49 crc kubenswrapper[5004]: I1201 08:17:49.394685 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:49 crc kubenswrapper[5004]: I1201 08:17:49.394750 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:49 crc kubenswrapper[5004]: I1201 08:17:49.394767 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:49 crc kubenswrapper[5004]: I1201 08:17:49.394790 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:49 crc kubenswrapper[5004]: I1201 08:17:49.394808 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:49Z","lastTransitionTime":"2025-12-01T08:17:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:49 crc kubenswrapper[5004]: I1201 08:17:49.497203 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:49 crc kubenswrapper[5004]: I1201 08:17:49.497277 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:49 crc kubenswrapper[5004]: I1201 08:17:49.497299 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:49 crc kubenswrapper[5004]: I1201 08:17:49.497328 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:49 crc kubenswrapper[5004]: I1201 08:17:49.497352 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:49Z","lastTransitionTime":"2025-12-01T08:17:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:49 crc kubenswrapper[5004]: I1201 08:17:49.599487 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:49 crc kubenswrapper[5004]: I1201 08:17:49.599538 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:49 crc kubenswrapper[5004]: I1201 08:17:49.599547 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:49 crc kubenswrapper[5004]: I1201 08:17:49.599594 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:49 crc kubenswrapper[5004]: I1201 08:17:49.599606 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:49Z","lastTransitionTime":"2025-12-01T08:17:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:49 crc kubenswrapper[5004]: I1201 08:17:49.701866 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:49 crc kubenswrapper[5004]: I1201 08:17:49.701923 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:49 crc kubenswrapper[5004]: I1201 08:17:49.701941 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:49 crc kubenswrapper[5004]: I1201 08:17:49.701964 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:49 crc kubenswrapper[5004]: I1201 08:17:49.701980 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:49Z","lastTransitionTime":"2025-12-01T08:17:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:17:49 crc kubenswrapper[5004]: I1201 08:17:49.758714 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 08:17:49 crc kubenswrapper[5004]: I1201 08:17:49.758787 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 08:17:49 crc kubenswrapper[5004]: I1201 08:17:49.758734 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 08:17:49 crc kubenswrapper[5004]: E1201 08:17:49.758938 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 08:17:49 crc kubenswrapper[5004]: E1201 08:17:49.759081 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 08:17:49 crc kubenswrapper[5004]: E1201 08:17:49.759197 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 08:17:49 crc kubenswrapper[5004]: I1201 08:17:49.804524 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:49 crc kubenswrapper[5004]: I1201 08:17:49.804889 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:49 crc kubenswrapper[5004]: I1201 08:17:49.804918 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:49 crc kubenswrapper[5004]: I1201 08:17:49.804949 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:49 crc kubenswrapper[5004]: I1201 08:17:49.804971 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:49Z","lastTransitionTime":"2025-12-01T08:17:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:49 crc kubenswrapper[5004]: I1201 08:17:49.908270 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:49 crc kubenswrapper[5004]: I1201 08:17:49.908334 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:49 crc kubenswrapper[5004]: I1201 08:17:49.908356 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:49 crc kubenswrapper[5004]: I1201 08:17:49.908384 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:49 crc kubenswrapper[5004]: I1201 08:17:49.908404 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:49Z","lastTransitionTime":"2025-12-01T08:17:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:50 crc kubenswrapper[5004]: I1201 08:17:50.010600 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:50 crc kubenswrapper[5004]: I1201 08:17:50.010654 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:50 crc kubenswrapper[5004]: I1201 08:17:50.010673 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:50 crc kubenswrapper[5004]: I1201 08:17:50.010696 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:50 crc kubenswrapper[5004]: I1201 08:17:50.010713 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:50Z","lastTransitionTime":"2025-12-01T08:17:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:50 crc kubenswrapper[5004]: I1201 08:17:50.113640 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:50 crc kubenswrapper[5004]: I1201 08:17:50.113701 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:50 crc kubenswrapper[5004]: I1201 08:17:50.113718 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:50 crc kubenswrapper[5004]: I1201 08:17:50.113746 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:50 crc kubenswrapper[5004]: I1201 08:17:50.113765 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:50Z","lastTransitionTime":"2025-12-01T08:17:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:50 crc kubenswrapper[5004]: I1201 08:17:50.215977 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:50 crc kubenswrapper[5004]: I1201 08:17:50.216050 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:50 crc kubenswrapper[5004]: I1201 08:17:50.216066 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:50 crc kubenswrapper[5004]: I1201 08:17:50.216089 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:50 crc kubenswrapper[5004]: I1201 08:17:50.216106 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:50Z","lastTransitionTime":"2025-12-01T08:17:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:50 crc kubenswrapper[5004]: I1201 08:17:50.318025 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:50 crc kubenswrapper[5004]: I1201 08:17:50.318059 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:50 crc kubenswrapper[5004]: I1201 08:17:50.318066 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:50 crc kubenswrapper[5004]: I1201 08:17:50.318079 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:50 crc kubenswrapper[5004]: I1201 08:17:50.318088 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:50Z","lastTransitionTime":"2025-12-01T08:17:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:50 crc kubenswrapper[5004]: I1201 08:17:50.421175 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:50 crc kubenswrapper[5004]: I1201 08:17:50.421220 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:50 crc kubenswrapper[5004]: I1201 08:17:50.421231 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:50 crc kubenswrapper[5004]: I1201 08:17:50.421248 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:50 crc kubenswrapper[5004]: I1201 08:17:50.421260 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:50Z","lastTransitionTime":"2025-12-01T08:17:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:50 crc kubenswrapper[5004]: I1201 08:17:50.523222 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:50 crc kubenswrapper[5004]: I1201 08:17:50.523268 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:50 crc kubenswrapper[5004]: I1201 08:17:50.523280 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:50 crc kubenswrapper[5004]: I1201 08:17:50.523298 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:50 crc kubenswrapper[5004]: I1201 08:17:50.523309 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:50Z","lastTransitionTime":"2025-12-01T08:17:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:50 crc kubenswrapper[5004]: I1201 08:17:50.625608 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:50 crc kubenswrapper[5004]: I1201 08:17:50.625649 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:50 crc kubenswrapper[5004]: I1201 08:17:50.625661 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:50 crc kubenswrapper[5004]: I1201 08:17:50.625677 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:50 crc kubenswrapper[5004]: I1201 08:17:50.625689 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:50Z","lastTransitionTime":"2025-12-01T08:17:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:50 crc kubenswrapper[5004]: I1201 08:17:50.727963 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:50 crc kubenswrapper[5004]: I1201 08:17:50.728011 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:50 crc kubenswrapper[5004]: I1201 08:17:50.728021 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:50 crc kubenswrapper[5004]: I1201 08:17:50.728037 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:50 crc kubenswrapper[5004]: I1201 08:17:50.728048 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:50Z","lastTransitionTime":"2025-12-01T08:17:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:17:50 crc kubenswrapper[5004]: I1201 08:17:50.759031 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7cl5l" Dec 01 08:17:50 crc kubenswrapper[5004]: E1201 08:17:50.759178 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7cl5l" podUID="b488f4f3-d385-4d40-bdee-96d8fe2d42a1" Dec 01 08:17:50 crc kubenswrapper[5004]: I1201 08:17:50.830914 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:50 crc kubenswrapper[5004]: I1201 08:17:50.830959 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:50 crc kubenswrapper[5004]: I1201 08:17:50.830970 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:50 crc kubenswrapper[5004]: I1201 08:17:50.830987 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:50 crc kubenswrapper[5004]: I1201 08:17:50.830997 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:50Z","lastTransitionTime":"2025-12-01T08:17:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:50 crc kubenswrapper[5004]: I1201 08:17:50.933482 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:50 crc kubenswrapper[5004]: I1201 08:17:50.933544 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:50 crc kubenswrapper[5004]: I1201 08:17:50.933592 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:50 crc kubenswrapper[5004]: I1201 08:17:50.933619 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:50 crc kubenswrapper[5004]: I1201 08:17:50.933637 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:50Z","lastTransitionTime":"2025-12-01T08:17:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:51 crc kubenswrapper[5004]: I1201 08:17:51.036545 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:51 crc kubenswrapper[5004]: I1201 08:17:51.036682 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:51 crc kubenswrapper[5004]: I1201 08:17:51.036704 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:51 crc kubenswrapper[5004]: I1201 08:17:51.036734 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:51 crc kubenswrapper[5004]: I1201 08:17:51.036757 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:51Z","lastTransitionTime":"2025-12-01T08:17:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:51 crc kubenswrapper[5004]: I1201 08:17:51.139674 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:51 crc kubenswrapper[5004]: I1201 08:17:51.139715 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:51 crc kubenswrapper[5004]: I1201 08:17:51.139725 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:51 crc kubenswrapper[5004]: I1201 08:17:51.139739 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:51 crc kubenswrapper[5004]: I1201 08:17:51.139750 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:51Z","lastTransitionTime":"2025-12-01T08:17:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:51 crc kubenswrapper[5004]: I1201 08:17:51.241824 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:51 crc kubenswrapper[5004]: I1201 08:17:51.241876 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:51 crc kubenswrapper[5004]: I1201 08:17:51.241890 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:51 crc kubenswrapper[5004]: I1201 08:17:51.241910 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:51 crc kubenswrapper[5004]: I1201 08:17:51.241923 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:51Z","lastTransitionTime":"2025-12-01T08:17:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:51 crc kubenswrapper[5004]: I1201 08:17:51.344777 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:51 crc kubenswrapper[5004]: I1201 08:17:51.344853 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:51 crc kubenswrapper[5004]: I1201 08:17:51.344875 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:51 crc kubenswrapper[5004]: I1201 08:17:51.344906 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:51 crc kubenswrapper[5004]: I1201 08:17:51.344928 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:51Z","lastTransitionTime":"2025-12-01T08:17:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:51 crc kubenswrapper[5004]: I1201 08:17:51.448188 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:51 crc kubenswrapper[5004]: I1201 08:17:51.448259 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:51 crc kubenswrapper[5004]: I1201 08:17:51.448282 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:51 crc kubenswrapper[5004]: I1201 08:17:51.448312 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:51 crc kubenswrapper[5004]: I1201 08:17:51.448333 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:51Z","lastTransitionTime":"2025-12-01T08:17:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:51 crc kubenswrapper[5004]: I1201 08:17:51.551083 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:51 crc kubenswrapper[5004]: I1201 08:17:51.551137 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:51 crc kubenswrapper[5004]: I1201 08:17:51.551153 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:51 crc kubenswrapper[5004]: I1201 08:17:51.551176 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:51 crc kubenswrapper[5004]: I1201 08:17:51.551194 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:51Z","lastTransitionTime":"2025-12-01T08:17:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:51 crc kubenswrapper[5004]: I1201 08:17:51.654007 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:51 crc kubenswrapper[5004]: I1201 08:17:51.654065 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:51 crc kubenswrapper[5004]: I1201 08:17:51.654082 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:51 crc kubenswrapper[5004]: I1201 08:17:51.654105 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:51 crc kubenswrapper[5004]: I1201 08:17:51.654124 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:51Z","lastTransitionTime":"2025-12-01T08:17:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:51 crc kubenswrapper[5004]: I1201 08:17:51.756472 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:51 crc kubenswrapper[5004]: I1201 08:17:51.756528 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:51 crc kubenswrapper[5004]: I1201 08:17:51.756545 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:51 crc kubenswrapper[5004]: I1201 08:17:51.756593 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:51 crc kubenswrapper[5004]: I1201 08:17:51.756614 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:51Z","lastTransitionTime":"2025-12-01T08:17:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:17:51 crc kubenswrapper[5004]: I1201 08:17:51.757877 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 08:17:51 crc kubenswrapper[5004]: I1201 08:17:51.757942 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 08:17:51 crc kubenswrapper[5004]: E1201 08:17:51.758059 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 08:17:51 crc kubenswrapper[5004]: I1201 08:17:51.758164 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 08:17:51 crc kubenswrapper[5004]: E1201 08:17:51.758224 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 08:17:51 crc kubenswrapper[5004]: E1201 08:17:51.758387 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 08:17:51 crc kubenswrapper[5004]: I1201 08:17:51.759324 5004 scope.go:117] "RemoveContainer" containerID="404ed4cf3ada652b914fc9fd3295d149810f22a6cc4ea044f934ba8ee2595209" Dec 01 08:17:51 crc kubenswrapper[5004]: I1201 08:17:51.829418 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0ed2b6d-0c61-4639-bc3b-1c8effc4815d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45a0d7a12e7c114de1740a151e04f7f39844b709740cab958f156ac918af3228\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\
\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d69a370af9f8f6c575f99da6bb01cee0f660dccb7e85f16f5d8874d0e263e1fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c82baee0fd931fd7b41cd5a73ca7e653ef13dfa3d573991cbe3238d6b95f6fac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa5f341e99100e985f65c24647d976a24782dfae7f52e752b774f41f69af27fd\\\",\\\"image\\\":\\\
"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad581564fc519328ae429bce757797f6d87d8648e862b008637a245866ec4cbe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T08:17:20Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 08:17:15.185125 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 08:17:15.186793 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4071633747/tls.crt::/tmp/serving-cert-4071633747/tls.key\\\\\\\"\\\\nI1201 08:17:20.827614 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 08:17:20.834528 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 08:17:20.834549 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 08:17:20.837611 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 08:17:20.837630 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 08:17:20.848940 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 08:17:20.848957 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 08:17:20.848961 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 08:17:20.848965 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 08:17:20.848968 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 08:17:20.848970 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 08:17:20.848973 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 08:17:20.849115 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 08:17:20.854990 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2caeb2ebaf68110cb8728e60aae6db1b1fc48efae6018f66070766a4f42caa69\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2276a185fefa99f4e9fe63b7acce901ce2eb168f71a65f
5fa97c1892aebce0e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2276a185fefa99f4e9fe63b7acce901ce2eb168f71a65f5fa97c1892aebce0e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:51Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:51 crc kubenswrapper[5004]: I1201 08:17:51.845527 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://327477e7db10129d47ddf9078f32d0df2ffd4b8e59fc7bb77c17c60c63e9a936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:51Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:51 crc kubenswrapper[5004]: I1201 08:17:51.860240 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:51Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:51 crc kubenswrapper[5004]: I1201 08:17:51.861781 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:51 crc kubenswrapper[5004]: I1201 08:17:51.861818 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:51 crc kubenswrapper[5004]: I1201 08:17:51.861828 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:51 crc kubenswrapper[5004]: I1201 08:17:51.861845 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:51 crc kubenswrapper[5004]: I1201 08:17:51.861856 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:51Z","lastTransitionTime":"2025-12-01T08:17:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:17:51 crc kubenswrapper[5004]: I1201 08:17:51.876381 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:51Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:51 crc kubenswrapper[5004]: I1201 08:17:51.891675 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03c470edd984db3b756e031e33f64a9bc5ca6b9c156ab479e5cd647745655a5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T08:17:51Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:51 crc kubenswrapper[5004]: I1201 08:17:51.912521 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fd2f2b5d7dc3d7b8e1bb06fdf0eec3addfec3c230e044b6395d6cf535630ebd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b2250161e400b
201f0528e33231c9007342002a37cd037a1b1e011f2aaaa22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:51Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:51 crc kubenswrapper[5004]: I1201 08:17:51.932220 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zjksw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"70e79009-93be-49c4-a6b3-e8a06bcea7f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad3ea204ad731500a340e639f0e36abe28ba50464f1b2ff9b009e39c234cf708\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd8m6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zjksw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:51Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:51 crc kubenswrapper[5004]: I1201 08:17:51.960544 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15cdec0a-5925-4966-a30b-f60c503f633e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f528b94ff12c7a804930dca4b93c23334a9ae3a53317750cd34b5695f08b4582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f001b70a83da28b1ea78a94f1cd70dfc63b3de5a21b041932d33c2ac970c986b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7c5c043438b515e37a6d6d5c47b92479bbb7322fe35c3a1bc57ee7b3a746544\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23df33e45cc27e4d343ae6ec3fb88f2dfc125ba7da424515d4bee5ec19281832\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4a7cab8e1594814a687b1433a92642a19eff7c2eb12f96fa06f08a2e270733d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1844a18a6109c25a25b9c8cb3ff65e69cc657f66be03dd97e883d24a774b7638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://404ed4cf3ada652b914fc9fd3295d149810f22a6cc4ea044f934ba8ee2595209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://404ed4cf3ada652b914fc9fd3295d149810f22a6cc4ea044f934ba8ee2595209\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T08:17:36Z\\\",\\\"message\\\":\\\"ping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 08:17:35.520098 6428 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1201 08:17:35.520395 6428 reflector.go:311] Stopping 
reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 08:17:35.520708 6428 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 08:17:35.520930 6428 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 08:17:35.521362 6428 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1201 08:17:35.521431 6428 factory.go:656] Stopping watch factory\\\\nI1201 08:17:35.521454 6428 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1201 08:17:35.523766 6428 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI1201 08:17:35.523786 6428 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI1201 08:17:35.523846 6428 ovnkube.go:599] Stopped ovnkube\\\\nI1201 08:17:35.523873 6428 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1201 08:17:35.523949 6428 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:34Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-knmdv_openshift-ovn-kubernetes(15cdec0a-5925-4966-a30b-f60c503f633e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f231e0092e1a85cc5d6e00fb8d3bd982223b670191ac3dc21d81ff1fab462a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97c9fab31c2655b5d579a2ed9b837229ee678973643c6a526c1241e3c84e315c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97c9fab31c2655b5d5
79a2ed9b837229ee678973643c6a526c1241e3c84e315c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-knmdv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:51Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:51 crc kubenswrapper[5004]: I1201 08:17:51.964840 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:51 crc kubenswrapper[5004]: I1201 08:17:51.964900 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:51 crc kubenswrapper[5004]: I1201 08:17:51.964915 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:51 crc kubenswrapper[5004]: I1201 08:17:51.964942 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:51 crc kubenswrapper[5004]: I1201 08:17:51.964958 5004 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:51Z","lastTransitionTime":"2025-12-01T08:17:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:17:51 crc kubenswrapper[5004]: I1201 08:17:51.976747 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mzsvz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"397b51b7-934a-41d1-a593-500a64161bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4adac5633dfe89777bf019818bab9ee3a208cad9d929c96cf2cb86b18c2d4264\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kub
e-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4h49g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d83844478080e4829975fb6c8e0444d9fdebd44b08afcf45e7d0b04fc534a844\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4h49g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mzsvz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:51Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:51 crc kubenswrapper[5004]: I1201 08:17:51.995364 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:51Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:52 crc kubenswrapper[5004]: I1201 08:17:52.008067 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jjms6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8828af41-beeb-47dd-96cf-3dbcb5175893\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b22220849cdb91592a44a67f9f6a09586146009483f46a0505a2dea3b02bccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvsch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jjms6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:52Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:52 crc kubenswrapper[5004]: I1201 08:17:52.021197 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9977ebb-82de-4e96-8763-0b5a84f8d4ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://baf84c2eea7167eed95a4195d37bf784ffef310bc2079d8d06e1f47cb45a7864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pwgxq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c1e51fa92aec80e93d0fce11c743b8c4fde23fc23d8626e8d5b3e25ab42350d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pwgxq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fvdgt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:52Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:52 crc kubenswrapper[5004]: I1201 08:17:52.038890 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7cl5l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b488f4f3-d385-4d40-bdee-96d8fe2d42a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-glbhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-glbhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:37Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7cl5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:52Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:52 crc 
kubenswrapper[5004]: I1201 08:17:52.068040 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:52 crc kubenswrapper[5004]: I1201 08:17:52.068061 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:52 crc kubenswrapper[5004]: I1201 08:17:52.068069 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:52 crc kubenswrapper[5004]: I1201 08:17:52.068081 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:52 crc kubenswrapper[5004]: I1201 08:17:52.068093 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:52Z","lastTransitionTime":"2025-12-01T08:17:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:52 crc kubenswrapper[5004]: I1201 08:17:52.076651 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43c9f762-d403-49b9-8294-d66976aca4dd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b358f641ff09bc8f4a452b0d4a8cf5f9790182f76779be31426d48d0278b0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d610c3a799d4a7df9ddfe2a28f2ced2e5ac4e96c67c93f8e6d3a0eee60d9ac48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43f9f72af3273da055eeb1e13ad6258a1b3b6b205402861beee9f87f02230297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38e3cc2eca559417c71beed561569c4dc5b45ffb8407976e0695fc1b93219d37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c24fbeedbc7c40b6079b7ad89d8564a4f2cdaad2cf996e66682b082382da190\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00e163cd374befc8b8c6cdfe3916235cd63280474c2a09eeebf108fb75d378a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00e163cd374befc8b8c6cdfe3916235cd63280474c2a09eeebf108fb75d378a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37376e093170ad4138e102e3a1e2c2b45b3944bf36b0d77dce126bd493d60803\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37376e093170ad4138e102e3a1e2c2b45b3944bf36b0d77dce126bd493d60803\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e4db8d673550f14f3614e47dc0bf81e0920753be13fb01c7ef0d6bd3716611f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4db8d673550f14f3614e47dc0bf81e0920753be13fb01c7ef0d6bd3716611f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-01T08:17:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:52Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:52 crc kubenswrapper[5004]: I1201 08:17:52.100266 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dpkxw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddcaf111-708a-45b3-a342-effd3061ab17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68ecba515f05ca83fdd0cdda10e3e5925a146aadb70ae17859586c12daf55dac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd2f03a8020818d1cc98b6eeaf64a728e627a3401400c3af53654f8771ef02eb\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd2f03a8020818d1cc98b6eeaf64a728e627a3401400c3af53654f8771ef02eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8dd31c10eaf6bc673dec377a884040524418dd1891a96b2eb6e6e983979512a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8dd31c10eaf6bc673dec377a884040524418dd1891a96b2eb6e6e983979512a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a7a8ef66a9486ca3d13826df7aa39db2083cff2c386b4d0d858e0526e0f094d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a7a8ef66a9486ca3d13826df7aa39db2083cff2c386b4d0d858e0526e0f094d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd1c4
7c976c17bf17e4d660288ce30472372b6a7f33f6ffa6d96c1a4436275d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd1c47c976c17bf17e4d660288ce30472372b6a7f33f6ffa6d96c1a4436275d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b7c8e9648e5514ab1fcb63dc72822dd276a2d74987e2fb4f4800b26ec176be7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b7c8e9648e5514ab1fcb63dc72822dd276a2d74987e2fb4f4800b26ec176be7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:28Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d7115db7fe248437f4b8918f06c9aa48b141fcd3f31add80faee81b4347e47b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d7115db7fe248437f4b8918f06c9aa48b141fcd3f31add80faee81b4347e47b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dpkxw\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:52Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:52 crc kubenswrapper[5004]: I1201 08:17:52.120174 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ww6lq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72500ffd-4ca3-4614-a3a2-bbdc5a7506c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://668cfaa2af20e2bb082fc47e1702cfd9f704c4fdf56a4d27cf25d6915e7cd18a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2025-12-01T08:17:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmsvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ww6lq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:52Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:52 crc kubenswrapper[5004]: I1201 08:17:52.138063 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2fa377e-a3a6-46ce-af23-e677799f0115\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b520771cf1ad6f9360064c5f304243c5878a2f1025c87d1001c97b2d5f95cb9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b65b9bc444101900c62fd90c7d45789d186a89148ac0fd9c7f0c6048c3f55ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edbd3e682d207b8c18d6bcbc26aa2adae2a35645502d31b327ffddd51991e842\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4563974ed1467a87aab104756828e63853c282a61c4fcab1e757eb4da9afaa40\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:52Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:52 crc kubenswrapper[5004]: I1201 08:17:52.171254 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:52 crc kubenswrapper[5004]: I1201 08:17:52.171302 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:52 crc kubenswrapper[5004]: I1201 08:17:52.171314 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:52 crc kubenswrapper[5004]: I1201 08:17:52.171332 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:52 crc kubenswrapper[5004]: I1201 08:17:52.171343 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:52Z","lastTransitionTime":"2025-12-01T08:17:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:17:52 crc kubenswrapper[5004]: I1201 08:17:52.209852 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-knmdv_15cdec0a-5925-4966-a30b-f60c503f633e/ovnkube-controller/1.log" Dec 01 08:17:52 crc kubenswrapper[5004]: I1201 08:17:52.213352 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" event={"ID":"15cdec0a-5925-4966-a30b-f60c503f633e","Type":"ContainerStarted","Data":"76c127e5ff47d030a709c79b7c7f2c2d3b32009052ff95ff522d90ecd86cf993"} Dec 01 08:17:52 crc kubenswrapper[5004]: I1201 08:17:52.213792 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" Dec 01 08:17:52 crc kubenswrapper[5004]: I1201 08:17:52.239326 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://327477e7db10129d47ddf9078f32d0df2ffd4b8e59fc7bb77c17c60c63e9a936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:52Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:52 crc kubenswrapper[5004]: I1201 08:17:52.260104 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:52Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:52 crc kubenswrapper[5004]: I1201 08:17:52.274148 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:52 crc kubenswrapper[5004]: I1201 08:17:52.274188 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:52 crc kubenswrapper[5004]: I1201 08:17:52.274211 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:52 crc kubenswrapper[5004]: I1201 08:17:52.274243 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:52 crc kubenswrapper[5004]: I1201 08:17:52.274264 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:52Z","lastTransitionTime":"2025-12-01T08:17:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:17:52 crc kubenswrapper[5004]: I1201 08:17:52.285465 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:52Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:52 crc kubenswrapper[5004]: I1201 08:17:52.312719 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03c470edd984db3b756e031e33f64a9bc5ca6b9c156ab479e5cd647745655a5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T08:17:52Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:52 crc kubenswrapper[5004]: I1201 08:17:52.340068 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fd2f2b5d7dc3d7b8e1bb06fdf0eec3addfec3c230e044b6395d6cf535630ebd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b2250161e400b
201f0528e33231c9007342002a37cd037a1b1e011f2aaaa22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:52Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:52 crc kubenswrapper[5004]: I1201 08:17:52.357477 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zjksw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"70e79009-93be-49c4-a6b3-e8a06bcea7f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad3ea204ad731500a340e639f0e36abe28ba50464f1b2ff9b009e39c234cf708\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd8m6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zjksw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:52Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:52 crc kubenswrapper[5004]: I1201 08:17:52.377184 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:52 crc 
kubenswrapper[5004]: I1201 08:17:52.377239 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:52 crc kubenswrapper[5004]: I1201 08:17:52.377252 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:52 crc kubenswrapper[5004]: I1201 08:17:52.377272 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:52 crc kubenswrapper[5004]: I1201 08:17:52.377285 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:52Z","lastTransitionTime":"2025-12-01T08:17:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:17:52 crc kubenswrapper[5004]: I1201 08:17:52.392300 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15cdec0a-5925-4966-a30b-f60c503f633e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f528b94ff12c7a804930dca4b93c23334a9ae3a53317750cd34b5695f08b4582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f001b70a83da28b1ea78a94f1cd70dfc63b3de5a21b041932d33c2ac970c986b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7c5c043438b515e37a6d6d5c47b92479bbb7322fe35c3a1bc57ee7b3a746544\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23df33e45cc27e4d343ae6ec3fb88f2dfc125ba7da424515d4bee5ec19281832\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4a7cab8e1594814a687b1433a92642a19eff7c2eb12f96fa06f08a2e270733d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1844a18a6109c25a25b9c8cb3ff65e69cc657f66be03dd97e883d24a774b7638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76c127e5ff47d030a709c79b7c7f2c2d3b32009052ff95ff522d90ecd86cf993\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://404ed4cf3ada652b914fc9fd3295d149810f22a6cc4ea044f934ba8ee2595209\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T08:17:36Z\\\",\\\"message\\\":\\\"ping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 
08:17:35.520098 6428 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1201 08:17:35.520395 6428 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 08:17:35.520708 6428 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 08:17:35.520930 6428 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 08:17:35.521362 6428 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1201 08:17:35.521431 6428 factory.go:656] Stopping watch factory\\\\nI1201 08:17:35.521454 6428 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1201 08:17:35.523766 6428 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI1201 08:17:35.523786 6428 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI1201 08:17:35.523846 6428 ovnkube.go:599] Stopped ovnkube\\\\nI1201 08:17:35.523873 6428 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1201 08:17:35.523949 6428 ovnkube.go:137] failed to run 
ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:34Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f231e0092e1a85cc5d6e00fb8d3bd982223b670191ac3dc21d81ff1fab462a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97c9fab31c2655b5d579a2ed9b837229ee678973643c6a526c1241e3c84e315c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97c9fab31c2655b5d579a2ed9b837229ee678973643c6a526c1241e3c84e315c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-knmdv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:52Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:52 crc kubenswrapper[5004]: I1201 08:17:52.404972 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mzsvz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"397b51b7-934a-41d1-a593-500a64161bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4adac5633dfe89777bf019818bab9ee3a208cad9d929c96cf2cb86b18c2d4264\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4h49g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d83844478080e4829975fb6c8e0444d9fdebd
44b08afcf45e7d0b04fc534a844\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4h49g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mzsvz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:52Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:52 crc kubenswrapper[5004]: I1201 08:17:52.424829 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43c9f762-d403-49b9-8294-d66976aca4dd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b358f641ff09bc8f4a452b0d4a8cf5f9790182f76779be31426d48d0278b0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d610c3a799d4a7df9ddfe2a28f2ced2e5ac4e96c67c93f8e6d3a0eee60d9ac48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43f9f72af3273da055eeb1e13ad6258a1b3b6b205402861beee9f87f02230297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38e3cc2eca559417c71beed561569c4dc5b45ffb8407976e0695fc1b93219d37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c24fbeedbc7c40b6079b7ad89d8564a4f2cdaad2cf996e66682b082382da190\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00e163cd374befc8b8c6cdfe3916235cd63280474c2a09eeebf108fb75d378a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00e163cd374befc8b8c6cdfe3916235cd63280474c2a09eeebf108fb75d378a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-01T08:17:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37376e093170ad4138e102e3a1e2c2b45b3944bf36b0d77dce126bd493d60803\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37376e093170ad4138e102e3a1e2c2b45b3944bf36b0d77dce126bd493d60803\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e4db8d673550f14f3614e47dc0bf81e0920753be13fb01c7ef0d6bd3716611f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4db8d673550f14f3614e47dc0bf81e0920753be13fb01c7ef0d6bd3716611f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:52Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:52 crc kubenswrapper[5004]: I1201 08:17:52.439196 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:52Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:52 crc kubenswrapper[5004]: I1201 08:17:52.449974 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jjms6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8828af41-beeb-47dd-96cf-3dbcb5175893\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b22220849cdb91592a44a67f9f6a09586146009483f46a0505a2dea3b02bccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvsch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jjms6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:52Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:52 crc kubenswrapper[5004]: I1201 08:17:52.465335 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9977ebb-82de-4e96-8763-0b5a84f8d4ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://baf84c2eea7167eed95a4195d37bf784ffef310bc2079d8d06e1f47cb45a7864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pwgxq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c1e51fa92aec80e93d0fce11c743b8c4fde23fc23d8626e8d5b3e25ab42350d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pwgxq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fvdgt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:52Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:52 crc kubenswrapper[5004]: I1201 08:17:52.477050 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7cl5l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b488f4f3-d385-4d40-bdee-96d8fe2d42a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-glbhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-glbhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:37Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7cl5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:52Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:52 crc 
kubenswrapper[5004]: I1201 08:17:52.479476 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:52 crc kubenswrapper[5004]: I1201 08:17:52.479534 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:52 crc kubenswrapper[5004]: I1201 08:17:52.479546 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:52 crc kubenswrapper[5004]: I1201 08:17:52.479583 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:52 crc kubenswrapper[5004]: I1201 08:17:52.479598 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:52Z","lastTransitionTime":"2025-12-01T08:17:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:52 crc kubenswrapper[5004]: I1201 08:17:52.494524 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2fa377e-a3a6-46ce-af23-e677799f0115\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b520771cf1ad6f9360064c5f304243c5878a2f1025c87d1001c97b2d5f95cb9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b65b9bc444
101900c62fd90c7d45789d186a89148ac0fd9c7f0c6048c3f55ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edbd3e682d207b8c18d6bcbc26aa2adae2a35645502d31b327ffddd51991e842\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4563974ed1467a87aab104756828e63853c282a61c4fcab1e757eb4da9afaa40\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:52Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:52 crc kubenswrapper[5004]: I1201 08:17:52.515940 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dpkxw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddcaf111-708a-45b3-a342-effd3061ab17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68ecba515f05ca83fdd0cdda10e3e5925a146aadb70ae17859586c12daf55dac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd2f03a8020818d1cc98b6eeaf64a728e627a3401400c3af53654f8771ef02eb\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd2f03a8020818d1cc98b6eeaf64a728e627a3401400c3af53654f8771ef02eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8dd31c10eaf6bc673dec377a884040524418dd1891a96b2eb6e6e983979512a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8dd31c10eaf6bc673dec377a884040524418dd1891a96b2eb6e6e983979512a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a7a8ef66a9486ca3d13826df7aa39db2083cff2c386b4d0d858e0526e0f094d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a7a8ef66a9486ca3d13826df7aa39db2083cff2c386b4d0d858e0526e0f094d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd1c4
7c976c17bf17e4d660288ce30472372b6a7f33f6ffa6d96c1a4436275d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd1c47c976c17bf17e4d660288ce30472372b6a7f33f6ffa6d96c1a4436275d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b7c8e9648e5514ab1fcb63dc72822dd276a2d74987e2fb4f4800b26ec176be7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b7c8e9648e5514ab1fcb63dc72822dd276a2d74987e2fb4f4800b26ec176be7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:28Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d7115db7fe248437f4b8918f06c9aa48b141fcd3f31add80faee81b4347e47b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d7115db7fe248437f4b8918f06c9aa48b141fcd3f31add80faee81b4347e47b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dpkxw\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:52Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:52 crc kubenswrapper[5004]: I1201 08:17:52.527180 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ww6lq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72500ffd-4ca3-4614-a3a2-bbdc5a7506c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://668cfaa2af20e2bb082fc47e1702cfd9f704c4fdf56a4d27cf25d6915e7cd18a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2025-12-01T08:17:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmsvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ww6lq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:52Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:52 crc kubenswrapper[5004]: I1201 08:17:52.545932 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0ed2b6d-0c61-4639-bc3b-1c8effc4815d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45a0d7a12e7c114de1740a151e04f7f39844b709740cab958f156ac918af3228\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d69a370af9f8f6c575f99da6bb01cee0f660dccb7e85f16f5d8874d0e263e1fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c82baee0fd931fd7b41cd5a73ca7e653ef13dfa3d573991cbe3238d6b95f6fac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa5f341e99100e985f65c24647d976a24782dfae7f52e752b774f41f69af27fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad581564fc519328ae429bce757797f6d87d8648e862b008637a245866ec4cbe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T08:17:20Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 08:17:15.185125 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 08:17:15.186793 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4071633747/tls.crt::/tmp/serving-cert-4071633747/tls.key\\\\\\\"\\\\nI1201 08:17:20.827614 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 08:17:20.834528 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 08:17:20.834549 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 08:17:20.837611 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 08:17:20.837630 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 08:17:20.848940 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 08:17:20.848957 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 08:17:20.848961 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 08:17:20.848965 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 08:17:20.848968 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 08:17:20.848970 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 08:17:20.848973 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 08:17:20.849115 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1201 08:17:20.854990 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2caeb2ebaf68110cb8728e60aae6db1b1fc48efae6018f66070766a4f42caa69\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2276a185fefa99f4e9fe63b7acce901ce2eb168f71a65f5fa97c1892aebce0e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2276a185fefa99f4e9fe63b7acce901ce
2eb168f71a65f5fa97c1892aebce0e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:52Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:52 crc kubenswrapper[5004]: I1201 08:17:52.581937 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:52 crc kubenswrapper[5004]: I1201 08:17:52.581972 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:52 crc kubenswrapper[5004]: I1201 08:17:52.581983 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:52 crc kubenswrapper[5004]: I1201 08:17:52.582001 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:52 crc kubenswrapper[5004]: I1201 08:17:52.582013 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:52Z","lastTransitionTime":"2025-12-01T08:17:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:52 crc kubenswrapper[5004]: I1201 08:17:52.684590 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:52 crc kubenswrapper[5004]: I1201 08:17:52.684657 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:52 crc kubenswrapper[5004]: I1201 08:17:52.684677 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:52 crc kubenswrapper[5004]: I1201 08:17:52.684703 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:52 crc kubenswrapper[5004]: I1201 08:17:52.684722 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:52Z","lastTransitionTime":"2025-12-01T08:17:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:17:52 crc kubenswrapper[5004]: I1201 08:17:52.758834 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7cl5l" Dec 01 08:17:52 crc kubenswrapper[5004]: E1201 08:17:52.759026 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7cl5l" podUID="b488f4f3-d385-4d40-bdee-96d8fe2d42a1" Dec 01 08:17:52 crc kubenswrapper[5004]: I1201 08:17:52.781776 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dpkxw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddcaf111-708a-45b3-a342-effd3061ab17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68ecba515f05ca83fdd0cdda10e3e5925a146aadb70ae17859586c12daf55dac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd2f03a8020818d1cc98b6eeaf64a728e627a3401400c3af53654f8771ef02eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd2f03a8020818d1cc98b6eeaf64a728e627a3401400c3af53654f8771ef02eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8dd31c10eaf6bc673dec377a884040524418dd1891a96b2eb6e6e983979512a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,
\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8dd31c10eaf6bc673dec377a884040524418dd1891a96b2eb6e6e983979512a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a7a8ef66a9486ca3d13826df7aa39db2083cff2c386b4d0d858e0526e0f094d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a7a8ef66a9486ca3d13826df7aa39db2083cff2c386b4d0d858e0526e0f094d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\
\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd1c47c976c17bf17e4d660288ce30472372b6a7f33f6ffa6d96c1a4436275d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd1c47c976c17bf17e4d660288ce30472372b6a7f33f6ffa6d96c1a4436275d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b7c8e9648e5514ab1fcb63dc72822dd276a2d74987e2fb4f4800b26ec176be7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wherea
bouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b7c8e9648e5514ab1fcb63dc72822dd276a2d74987e2fb4f4800b26ec176be7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d7115db7fe248437f4b8918f06c9aa48b141fcd3f31add80faee81b4347e47b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d7115db7fe248437f4b8918f06c9aa48b141fcd3f31add80faee81b4347e47b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveRe
adOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dpkxw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:52Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:52 crc kubenswrapper[5004]: I1201 08:17:52.786783 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:52 crc kubenswrapper[5004]: I1201 08:17:52.786839 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:52 crc kubenswrapper[5004]: I1201 08:17:52.786857 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:52 crc kubenswrapper[5004]: I1201 08:17:52.786884 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:52 crc kubenswrapper[5004]: I1201 08:17:52.786902 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:52Z","lastTransitionTime":"2025-12-01T08:17:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:52 crc kubenswrapper[5004]: I1201 08:17:52.798884 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ww6lq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72500ffd-4ca3-4614-a3a2-bbdc5a7506c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://668cfaa2af20e2bb082fc47e1702cfd9f704c4fdf56a4d27cf25d6915e7cd18a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmsvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ww6lq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:52Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:52 crc kubenswrapper[5004]: I1201 08:17:52.818615 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2fa377e-a3a6-46ce-af23-e677799f0115\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b520771cf1ad6f9360064c5f304243c5878a2f1025c87d1001c97b2d5f95cb9a\\\",\\\"image\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b65b9bc444101900c62fd90c7d45789d186a89148ac0fd9c7f0c6048c3f55ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edbd3e682d207b8c18d6bcbc26aa2adae2a35645502d31b327ffddd51991e842\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-sy
ncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4563974ed1467a87aab104756828e63853c282a61c4fcab1e757eb4da9afaa40\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:52Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:52 crc kubenswrapper[5004]: I1201 08:17:52.838983 5004 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0ed2b6d-0c61-4639-bc3b-1c8effc4815d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45a0d7a12e7c114de1740a151e04f7f39844b709740cab958f156ac918af3228\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d69a370af9f8f6c575f99da6bb01cee0f660dccb7e85f16f5d8874d0e263e1fd\\\",\\\"image\\\":\\
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c82baee0fd931fd7b41cd5a73ca7e653ef13dfa3d573991cbe3238d6b95f6fac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa5f341e99100e985f65c24647d976a24782dfae7f52e752b774f41f69af27fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad581564fc519328ae429bce7
57797f6d87d8648e862b008637a245866ec4cbe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T08:17:20Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 08:17:15.185125 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 08:17:15.186793 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4071633747/tls.crt::/tmp/serving-cert-4071633747/tls.key\\\\\\\"\\\\nI1201 08:17:20.827614 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 08:17:20.834528 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 08:17:20.834549 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 08:17:20.837611 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 08:17:20.837630 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 08:17:20.848940 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 08:17:20.848957 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 08:17:20.848961 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 08:17:20.848965 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 08:17:20.848968 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 08:17:20.848970 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 08:17:20.848973 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 08:17:20.849115 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 08:17:20.854990 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2caeb2ebaf68110cb8728e60aae6db1b1fc48efae6018f66070766a4f42caa69\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2276a185fefa99f4e9fe63b7acce901ce2eb168f71a65f5fa97c1892aebce0e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\
":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2276a185fefa99f4e9fe63b7acce901ce2eb168f71a65f5fa97c1892aebce0e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:52Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:52 crc kubenswrapper[5004]: I1201 08:17:52.854719 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://327477e7db10129d47ddf9078f32d0df2ffd4b8e59fc7bb77c17c60c63e9a936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:52Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:52 crc kubenswrapper[5004]: I1201 08:17:52.876381 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:52Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:52 crc kubenswrapper[5004]: I1201 08:17:52.889274 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:52 crc kubenswrapper[5004]: I1201 08:17:52.889330 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:52 crc kubenswrapper[5004]: I1201 08:17:52.889346 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:52 crc kubenswrapper[5004]: I1201 08:17:52.889371 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:52 crc kubenswrapper[5004]: I1201 08:17:52.889387 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:52Z","lastTransitionTime":"2025-12-01T08:17:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:17:52 crc kubenswrapper[5004]: I1201 08:17:52.895989 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:52Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:52 crc kubenswrapper[5004]: I1201 08:17:52.913339 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03c470edd984db3b756e031e33f64a9bc5ca6b9c156ab479e5cd647745655a5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T08:17:52Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:52 crc kubenswrapper[5004]: I1201 08:17:52.928782 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fd2f2b5d7dc3d7b8e1bb06fdf0eec3addfec3c230e044b6395d6cf535630ebd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b2250161e400b
201f0528e33231c9007342002a37cd037a1b1e011f2aaaa22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:52Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:52 crc kubenswrapper[5004]: I1201 08:17:52.936892 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b488f4f3-d385-4d40-bdee-96d8fe2d42a1-metrics-certs\") pod \"network-metrics-daemon-7cl5l\" (UID: \"b488f4f3-d385-4d40-bdee-96d8fe2d42a1\") " pod="openshift-multus/network-metrics-daemon-7cl5l" Dec 01 08:17:52 crc kubenswrapper[5004]: E1201 08:17:52.937088 5004 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 
08:17:52 crc kubenswrapper[5004]: E1201 08:17:52.937208 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b488f4f3-d385-4d40-bdee-96d8fe2d42a1-metrics-certs podName:b488f4f3-d385-4d40-bdee-96d8fe2d42a1 nodeName:}" failed. No retries permitted until 2025-12-01 08:18:08.937181631 +0000 UTC m=+66.502173643 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b488f4f3-d385-4d40-bdee-96d8fe2d42a1-metrics-certs") pod "network-metrics-daemon-7cl5l" (UID: "b488f4f3-d385-4d40-bdee-96d8fe2d42a1") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 08:17:52 crc kubenswrapper[5004]: I1201 08:17:52.949699 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zjksw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70e79009-93be-49c4-a6b3-e8a06bcea7f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad3ea204ad731500a340e639f0e36abe28ba50464f1b2ff9b009e39c234cf708\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2
808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd8m6\\\",\\\"readOnly\\\":true,\\\"recursiveReadO
nly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zjksw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:52Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:52 crc kubenswrapper[5004]: I1201 08:17:52.976085 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15cdec0a-5925-4966-a30b-f60c503f633e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f528b94ff12c7a804930dca4b93c23334a9ae3a53317750cd34b5695f08b4582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f001b70a83da28b1ea78a94f1cd70dfc63b3de5a21b041932d33c2ac970c986b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7c5c043438b515e37a6d6d5c47b92479bbb7322fe35c3a1bc57ee7b3a746544\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23df33e45cc27e4d343ae6ec3fb88f2dfc125ba7da424515d4bee5ec19281832\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4a7cab8e1594814a687b1433a92642a19eff7c2eb12f96fa06f08a2e270733d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1844a18a6109c25a25b9c8cb3ff65e69cc657f66be03dd97e883d24a774b7638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76c127e5ff47d030a709c79b7c7f2c2d3b32009052ff95ff522d90ecd86cf993\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://404ed4cf3ada652b914fc9fd3295d149810f22a6cc4ea044f934ba8ee2595209\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T08:17:36Z\\\",\\\"message\\\":\\\"ping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 08:17:35.520098 6428 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1201 08:17:35.520395 6428 reflector.go:311] Stopping 
reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 08:17:35.520708 6428 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 08:17:35.520930 6428 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 08:17:35.521362 6428 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1201 08:17:35.521431 6428 factory.go:656] Stopping watch factory\\\\nI1201 08:17:35.521454 6428 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1201 08:17:35.523766 6428 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI1201 08:17:35.523786 6428 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI1201 08:17:35.523846 6428 ovnkube.go:599] Stopped ovnkube\\\\nI1201 08:17:35.523873 6428 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1201 08:17:35.523949 6428 ovnkube.go:137] failed to run 
ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:34Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f231e0092e1a85cc5d6e00fb8d3bd982223b670191ac3dc21d81ff1fab462a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97c9fab31c2655b5d579a2ed9b837229ee678973643c6a526c1241e3c84e315c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97c9fab31c2655b5d579a2ed9b837229ee678973643c6a526c1241e3c84e315c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-knmdv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:52Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:52 crc kubenswrapper[5004]: I1201 08:17:52.993216 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:52 crc kubenswrapper[5004]: I1201 08:17:52.993284 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:52 crc kubenswrapper[5004]: I1201 08:17:52.993306 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:52 crc kubenswrapper[5004]: I1201 08:17:52.993334 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:52 crc kubenswrapper[5004]: I1201 08:17:52.993360 5004 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:52Z","lastTransitionTime":"2025-12-01T08:17:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:17:52 crc kubenswrapper[5004]: I1201 08:17:52.993626 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mzsvz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"397b51b7-934a-41d1-a593-500a64161bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4adac5633dfe89777bf019818bab9ee3a208cad9d929c96cf2cb86b18c2d4264\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kub
e-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4h49g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d83844478080e4829975fb6c8e0444d9fdebd44b08afcf45e7d0b04fc534a844\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4h49g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mzsvz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:52Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:53 crc kubenswrapper[5004]: I1201 08:17:53.007218 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:53Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:53 crc kubenswrapper[5004]: I1201 08:17:53.017807 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jjms6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8828af41-beeb-47dd-96cf-3dbcb5175893\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b22220849cdb91592a44a67f9f6a09586146009483f46a0505a2dea3b02bccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvsch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jjms6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:53Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:53 crc kubenswrapper[5004]: I1201 08:17:53.034460 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9977ebb-82de-4e96-8763-0b5a84f8d4ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://baf84c2eea7167eed95a4195d37bf784ffef310bc2079d8d06e1f47cb45a7864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pwgxq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c1e51fa92aec80e93d0fce11c743b8c4fde23fc23d8626e8d5b3e25ab42350d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pwgxq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fvdgt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:53Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:53 crc kubenswrapper[5004]: I1201 08:17:53.047344 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7cl5l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b488f4f3-d385-4d40-bdee-96d8fe2d42a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-glbhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-glbhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:37Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7cl5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:53Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:53 crc 
kubenswrapper[5004]: I1201 08:17:53.068302 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43c9f762-d403-49b9-8294-d66976aca4dd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b358f641ff09bc8f4a452b0d4a8cf5f9790182f76779be31426d48d0278b0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://d610c3a799d4a7df9ddfe2a28f2ced2e5ac4e96c67c93f8e6d3a0eee60d9ac48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43f9f72af3273da055eeb1e13ad6258a1b3b6b205402861beee9f87f02230297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38e3cc2eca559417c71beed561569c4dc5b45ffb8407976e0695fc1b93219d37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c24fbeedbc7c40b6079b7ad89d8564a4f2cdaad2cf996e66682b082382da190\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00e163cd374befc8b8c6cdfe3916235cd63280474c2a09eeebf108fb75d378a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00e163cd374befc8b8c6cdfe3916235cd63280474c2a09eeebf108fb75d378a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37376e093170ad4138e102e3a1e2c2b45b3944bf36b0d77dce126bd493d60803\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37376e093170ad4138e102e3a1e2c2b45b3944bf36b0d77dce126bd493d60803\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e4db8d673550f14f3614e47dc0bf81e0920753be13fb01c7ef0d6bd3716611f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4db8d673550f14f3614e47dc0bf81e0920753be13fb01c7ef0d6bd3716611f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:53Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:53 crc kubenswrapper[5004]: I1201 08:17:53.095414 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:53 crc kubenswrapper[5004]: I1201 08:17:53.095625 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:53 crc kubenswrapper[5004]: I1201 08:17:53.095747 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:53 crc kubenswrapper[5004]: I1201 08:17:53.095850 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:53 crc kubenswrapper[5004]: I1201 08:17:53.095949 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:53Z","lastTransitionTime":"2025-12-01T08:17:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:53 crc kubenswrapper[5004]: I1201 08:17:53.198961 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:53 crc kubenswrapper[5004]: I1201 08:17:53.199014 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:53 crc kubenswrapper[5004]: I1201 08:17:53.199031 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:53 crc kubenswrapper[5004]: I1201 08:17:53.199052 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:53 crc kubenswrapper[5004]: I1201 08:17:53.199067 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:53Z","lastTransitionTime":"2025-12-01T08:17:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:53 crc kubenswrapper[5004]: I1201 08:17:53.219128 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-knmdv_15cdec0a-5925-4966-a30b-f60c503f633e/ovnkube-controller/2.log" Dec 01 08:17:53 crc kubenswrapper[5004]: I1201 08:17:53.220481 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-knmdv_15cdec0a-5925-4966-a30b-f60c503f633e/ovnkube-controller/1.log" Dec 01 08:17:53 crc kubenswrapper[5004]: I1201 08:17:53.223958 5004 generic.go:334] "Generic (PLEG): container finished" podID="15cdec0a-5925-4966-a30b-f60c503f633e" containerID="76c127e5ff47d030a709c79b7c7f2c2d3b32009052ff95ff522d90ecd86cf993" exitCode=1 Dec 01 08:17:53 crc kubenswrapper[5004]: I1201 08:17:53.224009 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" event={"ID":"15cdec0a-5925-4966-a30b-f60c503f633e","Type":"ContainerDied","Data":"76c127e5ff47d030a709c79b7c7f2c2d3b32009052ff95ff522d90ecd86cf993"} Dec 01 08:17:53 crc kubenswrapper[5004]: I1201 08:17:53.224047 5004 scope.go:117] "RemoveContainer" containerID="404ed4cf3ada652b914fc9fd3295d149810f22a6cc4ea044f934ba8ee2595209" Dec 01 08:17:53 crc kubenswrapper[5004]: I1201 08:17:53.226710 5004 scope.go:117] "RemoveContainer" containerID="76c127e5ff47d030a709c79b7c7f2c2d3b32009052ff95ff522d90ecd86cf993" Dec 01 08:17:53 crc kubenswrapper[5004]: E1201 08:17:53.227323 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-knmdv_openshift-ovn-kubernetes(15cdec0a-5925-4966-a30b-f60c503f633e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" podUID="15cdec0a-5925-4966-a30b-f60c503f633e" Dec 01 08:17:53 crc kubenswrapper[5004]: I1201 08:17:53.251520 5004 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0ed2b6d-0c61-4639-bc3b-1c8effc4815d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45a0d7a12e7c114de1740a151e04f7f39844b709740cab958f156ac918af3228\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d69a370af9f8f6c575f99da6bb01cee0f660dccb7e85f16f5d8874d0e263e1fd\\\",\\\"image\\\":\\\"quay.io/crcont/o
penshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c82baee0fd931fd7b41cd5a73ca7e653ef13dfa3d573991cbe3238d6b95f6fac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa5f341e99100e985f65c24647d976a24782dfae7f52e752b774f41f69af27fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad581564fc519328ae429bce757797f6d87d8648e86
2b008637a245866ec4cbe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T08:17:20Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 08:17:15.185125 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 08:17:15.186793 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4071633747/tls.crt::/tmp/serving-cert-4071633747/tls.key\\\\\\\"\\\\nI1201 08:17:20.827614 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 08:17:20.834528 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 08:17:20.834549 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 08:17:20.837611 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 08:17:20.837630 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 08:17:20.848940 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 08:17:20.848957 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 08:17:20.848961 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 08:17:20.848965 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 08:17:20.848968 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 08:17:20.848970 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 08:17:20.848973 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' 
detected.\\\\nI1201 08:17:20.849115 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 08:17:20.854990 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2caeb2ebaf68110cb8728e60aae6db1b1fc48efae6018f66070766a4f42caa69\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2276a185fefa99f4e9fe63b7acce901ce2eb168f71a65f5fa97c1892aebce0e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"s
tate\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2276a185fefa99f4e9fe63b7acce901ce2eb168f71a65f5fa97c1892aebce0e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:53Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:53 crc kubenswrapper[5004]: I1201 08:17:53.282129 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15cdec0a-5925-4966-a30b-f60c503f633e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f528b94ff12c7a804930dca4b93c23334a9ae3a53317750cd34b5695f08b4582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f001b70a83da28b1ea78a94f1cd70dfc63b3de5a21b041932d33c2ac970c986b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7c5c043438b515e37a6d6d5c47b92479bbb7322fe35c3a1bc57ee7b3a746544\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23df33e45cc27e4d343ae6ec3fb88f2dfc125ba7da424515d4bee5ec19281832\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4a7cab8e1594814a687b1433a92642a19eff7c2eb12f96fa06f08a2e270733d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1844a18a6109c25a25b9c8cb3ff65e69cc657f66be03dd97e883d24a774b7638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76c127e5ff47d030a709c79b7c7f2c2d3b32009052ff95ff522d90ecd86cf993\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://404ed4cf3ada652b914fc9fd3295d149810f22a6cc4ea044f934ba8ee2595209\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T08:17:36Z\\\",\\\"message\\\":\\\"ping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 
08:17:35.520098 6428 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1201 08:17:35.520395 6428 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 08:17:35.520708 6428 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 08:17:35.520930 6428 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 08:17:35.521362 6428 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1201 08:17:35.521431 6428 factory.go:656] Stopping watch factory\\\\nI1201 08:17:35.521454 6428 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1201 08:17:35.523766 6428 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI1201 08:17:35.523786 6428 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI1201 08:17:35.523846 6428 ovnkube.go:599] Stopped ovnkube\\\\nI1201 08:17:35.523873 6428 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1201 08:17:35.523949 6428 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:34Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76c127e5ff47d030a709c79b7c7f2c2d3b32009052ff95ff522d90ecd86cf993\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T08:17:52Z\\\",\\\"message\\\":\\\"}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1201 08:17:52.744847 6628 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer 
Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-marketplace/community-operators]} name:Service_openshift-marketplace/community-operators_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.189:50051:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d389393c-7ba9-422c-b3f5-06e391d537d2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1201 08:17:52.746408 6628 model_client.go:382] Update operations generated as: [{Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.59 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {dce28c51-c9f1-478b-97c8-7e209d6e7cbe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1201 08:17:52.746269 6628 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f231e0092e1a85cc5d6e00fb8d3bd982223b670191ac3dc21d81ff1fab462a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97c9fab31c2655b5d579a2ed9b837229ee678973643c6a526c1241e3c84e315c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97c9fab31c2655b5d579a2ed9b837229ee678973643c6a526c1241e3c84e3
15c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-knmdv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:53Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:53 crc kubenswrapper[5004]: I1201 08:17:53.302828 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:53 crc kubenswrapper[5004]: I1201 08:17:53.302906 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:53 crc kubenswrapper[5004]: I1201 08:17:53.302933 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:53 crc kubenswrapper[5004]: I1201 08:17:53.302966 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:53 crc kubenswrapper[5004]: I1201 08:17:53.302989 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:53Z","lastTransitionTime":"2025-12-01T08:17:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:17:53 crc kubenswrapper[5004]: I1201 08:17:53.303694 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://327477e7db10129d47ddf9078f32d0df2ffd4b8e59fc7bb77c17c60c63e9a936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\"
,\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:53Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:53 crc kubenswrapper[5004]: I1201 08:17:53.321949 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:53Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:53 crc kubenswrapper[5004]: I1201 08:17:53.337745 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:53Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:53 crc kubenswrapper[5004]: I1201 08:17:53.353186 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03c470edd984db3b756e031e33f64a9bc5ca6b9c156ab479e5cd647745655a5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T08:17:53Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:53 crc kubenswrapper[5004]: I1201 08:17:53.374078 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fd2f2b5d7dc3d7b8e1bb06fdf0eec3addfec3c230e044b6395d6cf535630ebd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b2250161e400b
201f0528e33231c9007342002a37cd037a1b1e011f2aaaa22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:53Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:53 crc kubenswrapper[5004]: I1201 08:17:53.390025 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zjksw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"70e79009-93be-49c4-a6b3-e8a06bcea7f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad3ea204ad731500a340e639f0e36abe28ba50464f1b2ff9b009e39c234cf708\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd8m6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zjksw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:53Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:53 crc kubenswrapper[5004]: I1201 08:17:53.405175 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mzsvz" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"397b51b7-934a-41d1-a593-500a64161bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4adac5633dfe89777bf019818bab9ee3a208cad9d929c96cf2cb86b18c2d4264\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4h49g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d83844478
080e4829975fb6c8e0444d9fdebd44b08afcf45e7d0b04fc534a844\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4h49g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mzsvz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:53Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:53 crc kubenswrapper[5004]: I1201 08:17:53.406767 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:53 crc kubenswrapper[5004]: I1201 08:17:53.406802 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:53 crc kubenswrapper[5004]: I1201 08:17:53.406815 5004 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:53 crc kubenswrapper[5004]: I1201 08:17:53.406833 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:53 crc kubenswrapper[5004]: I1201 08:17:53.406845 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:53Z","lastTransitionTime":"2025-12-01T08:17:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:17:53 crc kubenswrapper[5004]: I1201 08:17:53.433972 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43c9f762-d403-49b9-8294-d66976aca4dd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b358f641ff09bc8f4a452b0d4a8cf5f9790182f76779be31426d48d0278b0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d610c3a799d4a7df9ddfe2a28f2ced2e5ac4e96c67c93f8e6d3a0eee60d9ac48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43f9f72af3273da055eeb1e13ad6258a1b3b6b205402861beee9f87f02230297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1
847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38e3cc2eca559417c71beed561569c4dc5b45ffb8407976e0695fc1b93219d37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c24fbeedbc7c40b6079b7ad89d8564a4f2cdaad2cf996e66682b082382da190\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"
mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00e163cd374befc8b8c6cdfe3916235cd63280474c2a09eeebf108fb75d378a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00e163cd374befc8b8c6cdfe3916235cd63280474c2a09eeebf108fb75d378a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37376e093170ad4138e102e3a1e2c2b45b3944bf36b0d77dce126bd493d60803\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37376e093170ad4138e102e3a1e2c2b45b3944bf36b0d77dce126bd493d60803\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e4db8
d673550f14f3614e47dc0bf81e0920753be13fb01c7ef0d6bd3716611f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4db8d673550f14f3614e47dc0bf81e0920753be13fb01c7ef0d6bd3716611f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:53Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:53 crc kubenswrapper[5004]: I1201 08:17:53.447670 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:53Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:53 crc kubenswrapper[5004]: I1201 08:17:53.459754 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jjms6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8828af41-beeb-47dd-96cf-3dbcb5175893\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b22220849cdb91592a44a67f9f6a09586146009483f46a0505a2dea3b02bccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvsch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jjms6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:53Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:53 crc kubenswrapper[5004]: I1201 08:17:53.474688 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9977ebb-82de-4e96-8763-0b5a84f8d4ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://baf84c2eea7167eed95a4195d37bf784ffef310bc2079d8d06e1f47cb45a7864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pwgxq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c1e51fa92aec80e93d0fce11c743b8c4fde23fc23d8626e8d5b3e25ab42350d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pwgxq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fvdgt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:53Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:53 crc kubenswrapper[5004]: I1201 08:17:53.488803 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7cl5l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b488f4f3-d385-4d40-bdee-96d8fe2d42a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-glbhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-glbhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:37Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7cl5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:53Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:53 crc 
kubenswrapper[5004]: I1201 08:17:53.502199 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2fa377e-a3a6-46ce-af23-e677799f0115\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b520771cf1ad6f9360064c5f304243c5878a2f1025c87d1001c97b2d5f95cb9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b65b9bc444101900c62fd90c7d45789d186a89148ac0fd9c7f0c6048c3f55ad\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edbd3e682d207b8c18d6bcbc26aa2adae2a35645502d31b327ffddd51991e842\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4563974ed1467a87aab104756828e63853c282a61c4fcab1e757eb4da9afaa40\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:53Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:53 crc kubenswrapper[5004]: I1201 08:17:53.509618 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:53 crc kubenswrapper[5004]: I1201 08:17:53.509652 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:53 crc kubenswrapper[5004]: I1201 08:17:53.509665 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:53 crc kubenswrapper[5004]: I1201 08:17:53.509684 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:53 crc kubenswrapper[5004]: I1201 08:17:53.509708 5004 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:53Z","lastTransitionTime":"2025-12-01T08:17:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:17:53 crc kubenswrapper[5004]: I1201 08:17:53.518791 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dpkxw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddcaf111-708a-45b3-a342-effd3061ab17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68ecba515f05ca83fdd0cdda10e3e5925a146aadb70ae17859586c12daf55dac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd2f03a8020818d1cc98b6eeaf64a728e627a3401400c3af53654f8771ef02eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd2f03a8020818d1cc98b6eeaf64a728e627a3401400c3af53654f8771ef02eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8dd31c10eaf6bc673dec377a884040524418dd1891a96b2eb6e6e983979512a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8dd31c10eaf6bc673dec377a884040524418dd1891a96b2eb6e6e983979512a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a7a8ef66a9486ca3d13826df7aa39db2083cff2c386b4d0d858e0526e0f094d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a7a8ef66a9486ca3d13826df7aa39db2083cff2c386b4d0d858e0526e0f094d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:26Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd1c47c976c17bf17e4d660288ce30472372b6a7f33f6ffa6d96c1a4436275d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd1c47c976c17bf17e4d660288ce30472372b6a7f33f6ffa6d96c1a4436275d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b7c8e9648e5514ab1fcb63dc72822dd276a2d74987e2fb4f4800b26ec176be7\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b7c8e9648e5514ab1fcb63dc72822dd276a2d74987e2fb4f4800b26ec176be7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d7115db7fe248437f4b8918f06c9aa48b141fcd3f31add80faee81b4347e47b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d7115db7fe248437f4b8918f06c9aa48b141fcd3f31add80faee81b4347e47b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dpkxw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:53Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:53 crc kubenswrapper[5004]: I1201 08:17:53.532840 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ww6lq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72500ffd-4ca3-4614-a3a2-bbdc5a7506c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://668cfaa2af20e2bb082fc47e1702cfd9f704c4fdf56a4d27cf25d6915e7cd18a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmsvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ww6lq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:53Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:53 crc kubenswrapper[5004]: I1201 08:17:53.550534 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 08:17:53 crc kubenswrapper[5004]: E1201 08:17:53.550736 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 08:18:25.550705513 +0000 UTC m=+83.115697505 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 08:17:53 crc kubenswrapper[5004]: I1201 08:17:53.550825 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 08:17:53 crc kubenswrapper[5004]: I1201 08:17:53.550905 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 08:17:53 crc kubenswrapper[5004]: I1201 08:17:53.550935 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 08:17:53 crc kubenswrapper[5004]: I1201 08:17:53.550971 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 08:17:53 crc kubenswrapper[5004]: E1201 08:17:53.551056 5004 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 08:17:53 crc kubenswrapper[5004]: E1201 08:17:53.551087 5004 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 08:17:53 crc kubenswrapper[5004]: E1201 08:17:53.551107 5004 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 08:17:53 crc kubenswrapper[5004]: E1201 08:17:53.551145 5004 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 08:17:53 crc kubenswrapper[5004]: E1201 08:17:53.551139 5004 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 08:17:53 crc kubenswrapper[5004]: E1201 08:17:53.551163 5004 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 08:17:53 crc kubenswrapper[5004]: E1201 08:17:53.551192 5004 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod 
openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 08:17:53 crc kubenswrapper[5004]: E1201 08:17:53.551170 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-01 08:18:25.551148234 +0000 UTC m=+83.116140256 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 08:17:53 crc kubenswrapper[5004]: E1201 08:17:53.551062 5004 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 08:17:53 crc kubenswrapper[5004]: E1201 08:17:53.551242 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 08:18:25.551231306 +0000 UTC m=+83.116223298 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 08:17:53 crc kubenswrapper[5004]: E1201 08:17:53.551263 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-01 08:18:25.551256177 +0000 UTC m=+83.116248169 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 08:17:53 crc kubenswrapper[5004]: E1201 08:17:53.551303 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 08:18:25.551276907 +0000 UTC m=+83.116268999 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 08:17:53 crc kubenswrapper[5004]: I1201 08:17:53.612305 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:53 crc kubenswrapper[5004]: I1201 08:17:53.612384 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:53 crc kubenswrapper[5004]: I1201 08:17:53.612400 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:53 crc kubenswrapper[5004]: I1201 08:17:53.612424 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:53 crc kubenswrapper[5004]: I1201 08:17:53.612443 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:53Z","lastTransitionTime":"2025-12-01T08:17:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:53 crc kubenswrapper[5004]: I1201 08:17:53.715195 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:53 crc kubenswrapper[5004]: I1201 08:17:53.715263 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:53 crc kubenswrapper[5004]: I1201 08:17:53.715288 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:53 crc kubenswrapper[5004]: I1201 08:17:53.715320 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:53 crc kubenswrapper[5004]: I1201 08:17:53.715343 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:53Z","lastTransitionTime":"2025-12-01T08:17:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:17:53 crc kubenswrapper[5004]: I1201 08:17:53.758473 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 08:17:53 crc kubenswrapper[5004]: I1201 08:17:53.758612 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 08:17:53 crc kubenswrapper[5004]: E1201 08:17:53.758663 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 08:17:53 crc kubenswrapper[5004]: I1201 08:17:53.758609 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 08:17:53 crc kubenswrapper[5004]: E1201 08:17:53.758817 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 08:17:53 crc kubenswrapper[5004]: E1201 08:17:53.758966 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 08:17:53 crc kubenswrapper[5004]: I1201 08:17:53.817294 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:53 crc kubenswrapper[5004]: I1201 08:17:53.817321 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:53 crc kubenswrapper[5004]: I1201 08:17:53.817331 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:53 crc kubenswrapper[5004]: I1201 08:17:53.817345 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:53 crc kubenswrapper[5004]: I1201 08:17:53.817356 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:53Z","lastTransitionTime":"2025-12-01T08:17:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:53 crc kubenswrapper[5004]: I1201 08:17:53.920237 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:53 crc kubenswrapper[5004]: I1201 08:17:53.920751 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:53 crc kubenswrapper[5004]: I1201 08:17:53.920771 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:53 crc kubenswrapper[5004]: I1201 08:17:53.920795 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:53 crc kubenswrapper[5004]: I1201 08:17:53.920813 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:53Z","lastTransitionTime":"2025-12-01T08:17:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:54 crc kubenswrapper[5004]: I1201 08:17:54.024117 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:54 crc kubenswrapper[5004]: I1201 08:17:54.024189 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:54 crc kubenswrapper[5004]: I1201 08:17:54.024205 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:54 crc kubenswrapper[5004]: I1201 08:17:54.024233 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:54 crc kubenswrapper[5004]: I1201 08:17:54.024252 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:54Z","lastTransitionTime":"2025-12-01T08:17:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:54 crc kubenswrapper[5004]: I1201 08:17:54.127687 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:54 crc kubenswrapper[5004]: I1201 08:17:54.127736 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:54 crc kubenswrapper[5004]: I1201 08:17:54.127754 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:54 crc kubenswrapper[5004]: I1201 08:17:54.127775 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:54 crc kubenswrapper[5004]: I1201 08:17:54.127792 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:54Z","lastTransitionTime":"2025-12-01T08:17:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:54 crc kubenswrapper[5004]: I1201 08:17:54.229793 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:54 crc kubenswrapper[5004]: I1201 08:17:54.229860 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:54 crc kubenswrapper[5004]: I1201 08:17:54.229883 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:54 crc kubenswrapper[5004]: I1201 08:17:54.229915 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:54 crc kubenswrapper[5004]: I1201 08:17:54.229937 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:54Z","lastTransitionTime":"2025-12-01T08:17:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:54 crc kubenswrapper[5004]: I1201 08:17:54.230310 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-knmdv_15cdec0a-5925-4966-a30b-f60c503f633e/ovnkube-controller/2.log" Dec 01 08:17:54 crc kubenswrapper[5004]: I1201 08:17:54.235406 5004 scope.go:117] "RemoveContainer" containerID="76c127e5ff47d030a709c79b7c7f2c2d3b32009052ff95ff522d90ecd86cf993" Dec 01 08:17:54 crc kubenswrapper[5004]: E1201 08:17:54.235675 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-knmdv_openshift-ovn-kubernetes(15cdec0a-5925-4966-a30b-f60c503f633e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" podUID="15cdec0a-5925-4966-a30b-f60c503f633e" Dec 01 08:17:54 crc kubenswrapper[5004]: I1201 08:17:54.259389 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2fa377e-a3a6-46ce-af23-e677799f0115\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b520771cf1ad6f9360064c5f304243c5878a2f1025c87d1001c97b2d5f95cb9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b65b9bc444101900c62fd90c7d45789d186a89148ac0fd9c7f0c6048c3f55ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edbd3e682d207b8c18d6bcbc26aa2adae2a35645502d31b327ffddd51991e842\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4563974ed1467a87aab104756828e63853c282a61c4fcab1e757eb4da9afaa40\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:54Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:54 crc kubenswrapper[5004]: I1201 08:17:54.287786 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dpkxw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddcaf111-708a-45b3-a342-effd3061ab17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68ecba515f05ca83fdd0cdda10e3e5925a146aadb70ae17859586c12daf55dac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd2f03a8020818d1cc98b6eeaf64a728e627a3401400c3af53654f8771ef02eb\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd2f03a8020818d1cc98b6eeaf64a728e627a3401400c3af53654f8771ef02eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8dd31c10eaf6bc673dec377a884040524418dd1891a96b2eb6e6e983979512a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8dd31c10eaf6bc673dec377a884040524418dd1891a96b2eb6e6e983979512a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a7a8ef66a9486ca3d13826df7aa39db2083cff2c386b4d0d858e0526e0f094d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a7a8ef66a9486ca3d13826df7aa39db2083cff2c386b4d0d858e0526e0f094d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd1c4
7c976c17bf17e4d660288ce30472372b6a7f33f6ffa6d96c1a4436275d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd1c47c976c17bf17e4d660288ce30472372b6a7f33f6ffa6d96c1a4436275d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b7c8e9648e5514ab1fcb63dc72822dd276a2d74987e2fb4f4800b26ec176be7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b7c8e9648e5514ab1fcb63dc72822dd276a2d74987e2fb4f4800b26ec176be7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:28Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d7115db7fe248437f4b8918f06c9aa48b141fcd3f31add80faee81b4347e47b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d7115db7fe248437f4b8918f06c9aa48b141fcd3f31add80faee81b4347e47b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dpkxw\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:54Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:54 crc kubenswrapper[5004]: I1201 08:17:54.303933 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ww6lq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72500ffd-4ca3-4614-a3a2-bbdc5a7506c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://668cfaa2af20e2bb082fc47e1702cfd9f704c4fdf56a4d27cf25d6915e7cd18a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2025-12-01T08:17:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmsvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ww6lq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:54Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:54 crc kubenswrapper[5004]: I1201 08:17:54.325377 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0ed2b6d-0c61-4639-bc3b-1c8effc4815d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45a0d7a12e7c114de1740a151e04f7f39844b709740cab958f156ac918af3228\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d69a370af9f8f6c575f99da6bb01cee0f660dccb7e85f16f5d8874d0e263e1fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c82baee0fd931fd7b41cd5a73ca7e653ef13dfa3d573991cbe3238d6b95f6fac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa5f341e99100e985f65c24647d976a24782dfae7f52e752b774f41f69af27fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad581564fc519328ae429bce757797f6d87d8648e862b008637a245866ec4cbe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T08:17:20Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 08:17:15.185125 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 08:17:15.186793 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4071633747/tls.crt::/tmp/serving-cert-4071633747/tls.key\\\\\\\"\\\\nI1201 08:17:20.827614 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 08:17:20.834528 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 08:17:20.834549 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 08:17:20.837611 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 08:17:20.837630 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 08:17:20.848940 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 08:17:20.848957 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 08:17:20.848961 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 08:17:20.848965 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 08:17:20.848968 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 08:17:20.848970 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 08:17:20.848973 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 08:17:20.849115 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1201 08:17:20.854990 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2caeb2ebaf68110cb8728e60aae6db1b1fc48efae6018f66070766a4f42caa69\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2276a185fefa99f4e9fe63b7acce901ce2eb168f71a65f5fa97c1892aebce0e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2276a185fefa99f4e9fe63b7acce901ce
2eb168f71a65f5fa97c1892aebce0e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:54Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:54 crc kubenswrapper[5004]: I1201 08:17:54.332014 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:54 crc kubenswrapper[5004]: I1201 08:17:54.332095 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:54 crc kubenswrapper[5004]: I1201 08:17:54.332120 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:54 crc kubenswrapper[5004]: I1201 08:17:54.332151 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:54 crc kubenswrapper[5004]: I1201 08:17:54.332177 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:54Z","lastTransitionTime":"2025-12-01T08:17:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:54 crc kubenswrapper[5004]: I1201 08:17:54.356890 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15cdec0a-5925-4966-a30b-f60c503f633e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f528b94ff12c7a804930dca4b93c23334a9ae3a53317750cd34b5695f08b4582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f001b70a83da28b1ea78a94f1cd70dfc63b3de5a21b041932d33c2ac970c986b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7c5c043438b515e37a6d6d5c47b92479bbb7322fe35c3a1bc57ee7b3a746544\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23df33e45cc27e4d343ae6ec3fb88f2dfc125ba7da424515d4bee5ec19281832\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4a7cab8e1594814a687b1433a92642a19eff7c2eb12f96fa06f08a2e270733d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1844a18a6109c25a25b9c8cb3ff65e69cc657f66be03dd97e883d24a774b7638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76c127e5ff47d030a709c79b7c7f2c2d3b32009052ff95ff522d90ecd86cf993\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76c127e5ff47d030a709c79b7c7f2c2d3b32009052ff95ff522d90ecd86cf993\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T08:17:52Z\\\",\\\"message\\\":\\\"}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1201 08:17:52.744847 6628 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service 
k8s.ovn.org/owner:openshift-marketplace/community-operators]} name:Service_openshift-marketplace/community-operators_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.189:50051:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d389393c-7ba9-422c-b3f5-06e391d537d2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1201 08:17:52.746408 6628 model_client.go:382] Update operations generated as: [{Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.59 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {dce28c51-c9f1-478b-97c8-7e209d6e7cbe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1201 08:17:52.746269 6628 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:51Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-knmdv_openshift-ovn-kubernetes(15cdec0a-5925-4966-a30b-f60c503f633e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f231e0092e1a85cc5d6e00fb8d3bd982223b670191ac3dc21d81ff1fab462a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97c9fab31c2655b5d579a2ed9b837229ee678973643c6a526c1241e3c84e315c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97c9fab31c2655b5d5
79a2ed9b837229ee678973643c6a526c1241e3c84e315c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-knmdv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:54Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:54 crc kubenswrapper[5004]: I1201 08:17:54.380056 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://327477e7db10129d47ddf9078f32d0df2ffd4b8e59fc7bb77c17c60c63e9a936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:54Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:54 crc kubenswrapper[5004]: I1201 08:17:54.396330 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:54Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:54 crc kubenswrapper[5004]: I1201 08:17:54.415311 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:54Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:54 crc kubenswrapper[5004]: I1201 08:17:54.434796 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:54 crc kubenswrapper[5004]: I1201 08:17:54.434861 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:54 crc kubenswrapper[5004]: I1201 08:17:54.434884 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:54 crc kubenswrapper[5004]: I1201 
08:17:54.434912 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:54 crc kubenswrapper[5004]: I1201 08:17:54.434936 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:54Z","lastTransitionTime":"2025-12-01T08:17:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:17:54 crc kubenswrapper[5004]: I1201 08:17:54.440556 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03c470edd984db3b756e031e33f64a9bc5ca6b9c156ab479e5cd647745655a5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\
\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:54Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:54 crc kubenswrapper[5004]: I1201 08:17:54.462162 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fd2f2b5d7dc3d7b8e1bb06fdf0eec3addfec3c230e044b6395d6cf535630ebd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b2250161e400b201f0528e33231c9007342002a37cd037a1b1e011f2aaaa22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:54Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:54 crc kubenswrapper[5004]: I1201 08:17:54.481306 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zjksw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"70e79009-93be-49c4-a6b3-e8a06bcea7f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad3ea204ad731500a340e639f0e36abe28ba50464f1b2ff9b009e39c234cf708\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd8m6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zjksw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:54Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:54 crc kubenswrapper[5004]: I1201 08:17:54.498406 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mzsvz" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"397b51b7-934a-41d1-a593-500a64161bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4adac5633dfe89777bf019818bab9ee3a208cad9d929c96cf2cb86b18c2d4264\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4h49g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d83844478
080e4829975fb6c8e0444d9fdebd44b08afcf45e7d0b04fc534a844\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4h49g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mzsvz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:54Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:54 crc kubenswrapper[5004]: I1201 08:17:54.532946 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43c9f762-d403-49b9-8294-d66976aca4dd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b358f641ff09bc8f4a452b0d4a8cf5f9790182f76779be31426d48d0278b0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d610c3a799d4a7df9ddfe2a28f2ced2e5ac4e96c67c93f8e6d3a0eee60d9ac48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43f9f72af3273da055eeb1e13ad6258a1b3b6b205402861beee9f87f02230297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38e3cc2eca559417c71beed561569c4dc5b45ffb8407976e0695fc1b93219d37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c24fbeedbc7c40b6079b7ad89d8564a4f2cdaad2cf996e66682b082382da190\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00e163cd374befc8b8c6cdfe3916235cd63280474c2a09eeebf108fb75d378a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00e163cd374befc8b8c6cdfe3916235cd63280474c2a09eeebf108fb75d378a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-01T08:17:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37376e093170ad4138e102e3a1e2c2b45b3944bf36b0d77dce126bd493d60803\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37376e093170ad4138e102e3a1e2c2b45b3944bf36b0d77dce126bd493d60803\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e4db8d673550f14f3614e47dc0bf81e0920753be13fb01c7ef0d6bd3716611f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4db8d673550f14f3614e47dc0bf81e0920753be13fb01c7ef0d6bd3716611f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:54Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:54 crc kubenswrapper[5004]: I1201 08:17:54.537650 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:54 crc kubenswrapper[5004]: I1201 08:17:54.537704 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:54 crc kubenswrapper[5004]: I1201 08:17:54.537720 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:54 crc kubenswrapper[5004]: I1201 08:17:54.537745 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:54 crc kubenswrapper[5004]: I1201 08:17:54.537763 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:54Z","lastTransitionTime":"2025-12-01T08:17:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:54 crc kubenswrapper[5004]: I1201 08:17:54.551308 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:54Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:54 crc kubenswrapper[5004]: I1201 08:17:54.568359 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jjms6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8828af41-beeb-47dd-96cf-3dbcb5175893\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b22220849cdb91592a44a67f9f6a09586146009483f46a0505a2dea3b02bccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvsch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jjms6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:54Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:54 crc kubenswrapper[5004]: I1201 08:17:54.586690 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9977ebb-82de-4e96-8763-0b5a84f8d4ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://baf84c2eea7167eed95a4195d37bf784ffef310bc2079d8d06e1f47cb45a7864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pwgxq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c1e51fa92aec80e93d0fce11c743b8c4fde23fc23d8626e8d5b3e25ab42350d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pwgxq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fvdgt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:54Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:54 crc kubenswrapper[5004]: I1201 08:17:54.603034 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7cl5l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b488f4f3-d385-4d40-bdee-96d8fe2d42a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-glbhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-glbhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:37Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7cl5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:54Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:54 crc 
kubenswrapper[5004]: I1201 08:17:54.640864 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:54 crc kubenswrapper[5004]: I1201 08:17:54.640904 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:54 crc kubenswrapper[5004]: I1201 08:17:54.640913 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:54 crc kubenswrapper[5004]: I1201 08:17:54.640927 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:54 crc kubenswrapper[5004]: I1201 08:17:54.640936 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:54Z","lastTransitionTime":"2025-12-01T08:17:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:54 crc kubenswrapper[5004]: I1201 08:17:54.743238 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:54 crc kubenswrapper[5004]: I1201 08:17:54.743273 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:54 crc kubenswrapper[5004]: I1201 08:17:54.743281 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:54 crc kubenswrapper[5004]: I1201 08:17:54.743296 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:54 crc kubenswrapper[5004]: I1201 08:17:54.743305 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:54Z","lastTransitionTime":"2025-12-01T08:17:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:17:54 crc kubenswrapper[5004]: I1201 08:17:54.758970 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7cl5l" Dec 01 08:17:54 crc kubenswrapper[5004]: E1201 08:17:54.759147 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7cl5l" podUID="b488f4f3-d385-4d40-bdee-96d8fe2d42a1" Dec 01 08:17:54 crc kubenswrapper[5004]: I1201 08:17:54.845483 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:54 crc kubenswrapper[5004]: I1201 08:17:54.845530 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:54 crc kubenswrapper[5004]: I1201 08:17:54.845546 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:54 crc kubenswrapper[5004]: I1201 08:17:54.845586 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:54 crc kubenswrapper[5004]: I1201 08:17:54.845603 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:54Z","lastTransitionTime":"2025-12-01T08:17:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:54 crc kubenswrapper[5004]: I1201 08:17:54.947760 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:54 crc kubenswrapper[5004]: I1201 08:17:54.947790 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:54 crc kubenswrapper[5004]: I1201 08:17:54.947799 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:54 crc kubenswrapper[5004]: I1201 08:17:54.947813 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:54 crc kubenswrapper[5004]: I1201 08:17:54.947821 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:54Z","lastTransitionTime":"2025-12-01T08:17:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:55 crc kubenswrapper[5004]: I1201 08:17:55.050627 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:55 crc kubenswrapper[5004]: I1201 08:17:55.050694 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:55 crc kubenswrapper[5004]: I1201 08:17:55.050717 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:55 crc kubenswrapper[5004]: I1201 08:17:55.050747 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:55 crc kubenswrapper[5004]: I1201 08:17:55.050769 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:55Z","lastTransitionTime":"2025-12-01T08:17:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:55 crc kubenswrapper[5004]: I1201 08:17:55.153354 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:55 crc kubenswrapper[5004]: I1201 08:17:55.153466 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:55 crc kubenswrapper[5004]: I1201 08:17:55.153523 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:55 crc kubenswrapper[5004]: I1201 08:17:55.153550 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:55 crc kubenswrapper[5004]: I1201 08:17:55.153654 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:55Z","lastTransitionTime":"2025-12-01T08:17:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:55 crc kubenswrapper[5004]: I1201 08:17:55.158724 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:55 crc kubenswrapper[5004]: I1201 08:17:55.158764 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:55 crc kubenswrapper[5004]: I1201 08:17:55.158776 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:55 crc kubenswrapper[5004]: I1201 08:17:55.158823 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:55 crc kubenswrapper[5004]: I1201 08:17:55.158836 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:55Z","lastTransitionTime":"2025-12-01T08:17:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:55 crc kubenswrapper[5004]: E1201 08:17:55.175740 5004 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:17:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:17:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:17:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:17:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c8341267-df98-484f-a5e7-cf024fab437c\\\",\\\"systemUUID\\\":\\\"77547a65-c048-47ee-89b1-8422fc81b7aa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:55Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:55 crc kubenswrapper[5004]: I1201 08:17:55.181305 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:55 crc kubenswrapper[5004]: I1201 08:17:55.181369 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:55 crc kubenswrapper[5004]: I1201 08:17:55.181387 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:55 crc kubenswrapper[5004]: I1201 08:17:55.181414 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:55 crc kubenswrapper[5004]: I1201 08:17:55.181433 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:55Z","lastTransitionTime":"2025-12-01T08:17:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:55 crc kubenswrapper[5004]: E1201 08:17:55.200509 5004 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:17:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:17:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:17:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:17:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c8341267-df98-484f-a5e7-cf024fab437c\\\",\\\"systemUUID\\\":\\\"77547a65-c048-47ee-89b1-8422fc81b7aa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:55Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:55 crc kubenswrapper[5004]: I1201 08:17:55.205842 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:55 crc kubenswrapper[5004]: I1201 08:17:55.205901 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:55 crc kubenswrapper[5004]: I1201 08:17:55.205922 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:55 crc kubenswrapper[5004]: I1201 08:17:55.205946 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:55 crc kubenswrapper[5004]: I1201 08:17:55.205963 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:55Z","lastTransitionTime":"2025-12-01T08:17:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:55 crc kubenswrapper[5004]: E1201 08:17:55.224695 5004 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:17:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:17:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:17:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:17:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c8341267-df98-484f-a5e7-cf024fab437c\\\",\\\"systemUUID\\\":\\\"77547a65-c048-47ee-89b1-8422fc81b7aa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:55Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:55 crc kubenswrapper[5004]: I1201 08:17:55.229486 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:55 crc kubenswrapper[5004]: I1201 08:17:55.229597 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:55 crc kubenswrapper[5004]: I1201 08:17:55.229625 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:55 crc kubenswrapper[5004]: I1201 08:17:55.229655 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:55 crc kubenswrapper[5004]: I1201 08:17:55.229674 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:55Z","lastTransitionTime":"2025-12-01T08:17:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:55 crc kubenswrapper[5004]: E1201 08:17:55.249533 5004 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:17:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:17:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:17:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:17:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c8341267-df98-484f-a5e7-cf024fab437c\\\",\\\"systemUUID\\\":\\\"77547a65-c048-47ee-89b1-8422fc81b7aa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:55Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:55 crc kubenswrapper[5004]: I1201 08:17:55.253713 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:55 crc kubenswrapper[5004]: I1201 08:17:55.253767 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:55 crc kubenswrapper[5004]: I1201 08:17:55.253786 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:55 crc kubenswrapper[5004]: I1201 08:17:55.253809 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:55 crc kubenswrapper[5004]: I1201 08:17:55.253828 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:55Z","lastTransitionTime":"2025-12-01T08:17:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:55 crc kubenswrapper[5004]: E1201 08:17:55.271915 5004 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:17:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:17:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:17:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:17:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c8341267-df98-484f-a5e7-cf024fab437c\\\",\\\"systemUUID\\\":\\\"77547a65-c048-47ee-89b1-8422fc81b7aa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:55Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:55 crc kubenswrapper[5004]: E1201 08:17:55.272054 5004 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 01 08:17:55 crc kubenswrapper[5004]: I1201 08:17:55.273663 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:55 crc kubenswrapper[5004]: I1201 08:17:55.273714 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:55 crc kubenswrapper[5004]: I1201 08:17:55.273733 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:55 crc kubenswrapper[5004]: I1201 08:17:55.273756 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:55 crc kubenswrapper[5004]: I1201 08:17:55.273773 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:55Z","lastTransitionTime":"2025-12-01T08:17:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:55 crc kubenswrapper[5004]: I1201 08:17:55.376529 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:55 crc kubenswrapper[5004]: I1201 08:17:55.376640 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:55 crc kubenswrapper[5004]: I1201 08:17:55.376667 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:55 crc kubenswrapper[5004]: I1201 08:17:55.376705 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:55 crc kubenswrapper[5004]: I1201 08:17:55.376730 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:55Z","lastTransitionTime":"2025-12-01T08:17:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:55 crc kubenswrapper[5004]: I1201 08:17:55.480325 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:55 crc kubenswrapper[5004]: I1201 08:17:55.480411 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:55 crc kubenswrapper[5004]: I1201 08:17:55.480436 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:55 crc kubenswrapper[5004]: I1201 08:17:55.480466 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:55 crc kubenswrapper[5004]: I1201 08:17:55.480489 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:55Z","lastTransitionTime":"2025-12-01T08:17:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:55 crc kubenswrapper[5004]: I1201 08:17:55.583165 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:55 crc kubenswrapper[5004]: I1201 08:17:55.583208 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:55 crc kubenswrapper[5004]: I1201 08:17:55.583220 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:55 crc kubenswrapper[5004]: I1201 08:17:55.583237 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:55 crc kubenswrapper[5004]: I1201 08:17:55.583248 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:55Z","lastTransitionTime":"2025-12-01T08:17:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:55 crc kubenswrapper[5004]: I1201 08:17:55.686430 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:55 crc kubenswrapper[5004]: I1201 08:17:55.686492 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:55 crc kubenswrapper[5004]: I1201 08:17:55.686514 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:55 crc kubenswrapper[5004]: I1201 08:17:55.686545 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:55 crc kubenswrapper[5004]: I1201 08:17:55.686604 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:55Z","lastTransitionTime":"2025-12-01T08:17:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:17:55 crc kubenswrapper[5004]: I1201 08:17:55.758227 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 08:17:55 crc kubenswrapper[5004]: I1201 08:17:55.758263 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 08:17:55 crc kubenswrapper[5004]: I1201 08:17:55.758271 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 08:17:55 crc kubenswrapper[5004]: E1201 08:17:55.758403 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 08:17:55 crc kubenswrapper[5004]: E1201 08:17:55.758646 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 08:17:55 crc kubenswrapper[5004]: E1201 08:17:55.758835 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 08:17:55 crc kubenswrapper[5004]: I1201 08:17:55.789728 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:55 crc kubenswrapper[5004]: I1201 08:17:55.789791 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:55 crc kubenswrapper[5004]: I1201 08:17:55.789816 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:55 crc kubenswrapper[5004]: I1201 08:17:55.789846 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:55 crc kubenswrapper[5004]: I1201 08:17:55.789871 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:55Z","lastTransitionTime":"2025-12-01T08:17:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:55 crc kubenswrapper[5004]: I1201 08:17:55.892700 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:55 crc kubenswrapper[5004]: I1201 08:17:55.892765 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:55 crc kubenswrapper[5004]: I1201 08:17:55.892781 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:55 crc kubenswrapper[5004]: I1201 08:17:55.892809 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:55 crc kubenswrapper[5004]: I1201 08:17:55.892827 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:55Z","lastTransitionTime":"2025-12-01T08:17:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:55 crc kubenswrapper[5004]: I1201 08:17:55.995540 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:55 crc kubenswrapper[5004]: I1201 08:17:55.995596 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:55 crc kubenswrapper[5004]: I1201 08:17:55.995606 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:55 crc kubenswrapper[5004]: I1201 08:17:55.995622 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:55 crc kubenswrapper[5004]: I1201 08:17:55.995635 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:55Z","lastTransitionTime":"2025-12-01T08:17:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:56 crc kubenswrapper[5004]: I1201 08:17:56.098416 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:56 crc kubenswrapper[5004]: I1201 08:17:56.098473 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:56 crc kubenswrapper[5004]: I1201 08:17:56.098555 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:56 crc kubenswrapper[5004]: I1201 08:17:56.098614 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:56 crc kubenswrapper[5004]: I1201 08:17:56.098632 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:56Z","lastTransitionTime":"2025-12-01T08:17:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:56 crc kubenswrapper[5004]: I1201 08:17:56.201653 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:56 crc kubenswrapper[5004]: I1201 08:17:56.201713 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:56 crc kubenswrapper[5004]: I1201 08:17:56.201728 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:56 crc kubenswrapper[5004]: I1201 08:17:56.201752 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:56 crc kubenswrapper[5004]: I1201 08:17:56.201769 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:56Z","lastTransitionTime":"2025-12-01T08:17:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:56 crc kubenswrapper[5004]: I1201 08:17:56.304870 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:56 crc kubenswrapper[5004]: I1201 08:17:56.304933 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:56 crc kubenswrapper[5004]: I1201 08:17:56.304955 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:56 crc kubenswrapper[5004]: I1201 08:17:56.304987 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:56 crc kubenswrapper[5004]: I1201 08:17:56.305011 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:56Z","lastTransitionTime":"2025-12-01T08:17:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:56 crc kubenswrapper[5004]: I1201 08:17:56.408033 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:56 crc kubenswrapper[5004]: I1201 08:17:56.408130 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:56 crc kubenswrapper[5004]: I1201 08:17:56.408142 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:56 crc kubenswrapper[5004]: I1201 08:17:56.408162 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:56 crc kubenswrapper[5004]: I1201 08:17:56.408174 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:56Z","lastTransitionTime":"2025-12-01T08:17:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:56 crc kubenswrapper[5004]: I1201 08:17:56.511131 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:56 crc kubenswrapper[5004]: I1201 08:17:56.511167 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:56 crc kubenswrapper[5004]: I1201 08:17:56.511176 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:56 crc kubenswrapper[5004]: I1201 08:17:56.511192 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:56 crc kubenswrapper[5004]: I1201 08:17:56.511202 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:56Z","lastTransitionTime":"2025-12-01T08:17:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:56 crc kubenswrapper[5004]: I1201 08:17:56.614102 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:56 crc kubenswrapper[5004]: I1201 08:17:56.614155 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:56 crc kubenswrapper[5004]: I1201 08:17:56.614164 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:56 crc kubenswrapper[5004]: I1201 08:17:56.614181 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:56 crc kubenswrapper[5004]: I1201 08:17:56.614191 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:56Z","lastTransitionTime":"2025-12-01T08:17:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:56 crc kubenswrapper[5004]: I1201 08:17:56.717099 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:56 crc kubenswrapper[5004]: I1201 08:17:56.717155 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:56 crc kubenswrapper[5004]: I1201 08:17:56.717172 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:56 crc kubenswrapper[5004]: I1201 08:17:56.717194 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:56 crc kubenswrapper[5004]: I1201 08:17:56.717211 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:56Z","lastTransitionTime":"2025-12-01T08:17:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:17:56 crc kubenswrapper[5004]: I1201 08:17:56.760097 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7cl5l" Dec 01 08:17:56 crc kubenswrapper[5004]: E1201 08:17:56.760510 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7cl5l" podUID="b488f4f3-d385-4d40-bdee-96d8fe2d42a1" Dec 01 08:17:56 crc kubenswrapper[5004]: I1201 08:17:56.820188 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:56 crc kubenswrapper[5004]: I1201 08:17:56.820237 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:56 crc kubenswrapper[5004]: I1201 08:17:56.820246 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:56 crc kubenswrapper[5004]: I1201 08:17:56.820264 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:56 crc kubenswrapper[5004]: I1201 08:17:56.820274 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:56Z","lastTransitionTime":"2025-12-01T08:17:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:56 crc kubenswrapper[5004]: I1201 08:17:56.923593 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:56 crc kubenswrapper[5004]: I1201 08:17:56.923633 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:56 crc kubenswrapper[5004]: I1201 08:17:56.923651 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:56 crc kubenswrapper[5004]: I1201 08:17:56.923669 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:56 crc kubenswrapper[5004]: I1201 08:17:56.923680 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:56Z","lastTransitionTime":"2025-12-01T08:17:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:57 crc kubenswrapper[5004]: I1201 08:17:57.026645 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:57 crc kubenswrapper[5004]: I1201 08:17:57.026715 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:57 crc kubenswrapper[5004]: I1201 08:17:57.026733 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:57 crc kubenswrapper[5004]: I1201 08:17:57.026759 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:57 crc kubenswrapper[5004]: I1201 08:17:57.026781 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:57Z","lastTransitionTime":"2025-12-01T08:17:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:57 crc kubenswrapper[5004]: I1201 08:17:57.130274 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:57 crc kubenswrapper[5004]: I1201 08:17:57.130318 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:57 crc kubenswrapper[5004]: I1201 08:17:57.130327 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:57 crc kubenswrapper[5004]: I1201 08:17:57.130348 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:57 crc kubenswrapper[5004]: I1201 08:17:57.130359 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:57Z","lastTransitionTime":"2025-12-01T08:17:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:57 crc kubenswrapper[5004]: I1201 08:17:57.234032 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:57 crc kubenswrapper[5004]: I1201 08:17:57.234090 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:57 crc kubenswrapper[5004]: I1201 08:17:57.234109 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:57 crc kubenswrapper[5004]: I1201 08:17:57.234132 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:57 crc kubenswrapper[5004]: I1201 08:17:57.234148 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:57Z","lastTransitionTime":"2025-12-01T08:17:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:57 crc kubenswrapper[5004]: I1201 08:17:57.337678 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:57 crc kubenswrapper[5004]: I1201 08:17:57.337720 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:57 crc kubenswrapper[5004]: I1201 08:17:57.337736 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:57 crc kubenswrapper[5004]: I1201 08:17:57.337759 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:57 crc kubenswrapper[5004]: I1201 08:17:57.337775 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:57Z","lastTransitionTime":"2025-12-01T08:17:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:57 crc kubenswrapper[5004]: I1201 08:17:57.440729 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:57 crc kubenswrapper[5004]: I1201 08:17:57.440776 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:57 crc kubenswrapper[5004]: I1201 08:17:57.440792 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:57 crc kubenswrapper[5004]: I1201 08:17:57.440815 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:57 crc kubenswrapper[5004]: I1201 08:17:57.440832 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:57Z","lastTransitionTime":"2025-12-01T08:17:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:57 crc kubenswrapper[5004]: I1201 08:17:57.544205 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:57 crc kubenswrapper[5004]: I1201 08:17:57.544255 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:57 crc kubenswrapper[5004]: I1201 08:17:57.544271 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:57 crc kubenswrapper[5004]: I1201 08:17:57.544296 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:57 crc kubenswrapper[5004]: I1201 08:17:57.544313 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:57Z","lastTransitionTime":"2025-12-01T08:17:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:57 crc kubenswrapper[5004]: I1201 08:17:57.647522 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:57 crc kubenswrapper[5004]: I1201 08:17:57.647683 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:57 crc kubenswrapper[5004]: I1201 08:17:57.647702 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:57 crc kubenswrapper[5004]: I1201 08:17:57.647728 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:57 crc kubenswrapper[5004]: I1201 08:17:57.647778 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:57Z","lastTransitionTime":"2025-12-01T08:17:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:57 crc kubenswrapper[5004]: I1201 08:17:57.749766 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:57 crc kubenswrapper[5004]: I1201 08:17:57.749823 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:57 crc kubenswrapper[5004]: I1201 08:17:57.749840 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:57 crc kubenswrapper[5004]: I1201 08:17:57.749866 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:57 crc kubenswrapper[5004]: I1201 08:17:57.749883 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:57Z","lastTransitionTime":"2025-12-01T08:17:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:17:57 crc kubenswrapper[5004]: I1201 08:17:57.758116 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 08:17:57 crc kubenswrapper[5004]: I1201 08:17:57.758199 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 08:17:57 crc kubenswrapper[5004]: I1201 08:17:57.758203 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 08:17:57 crc kubenswrapper[5004]: E1201 08:17:57.758291 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 08:17:57 crc kubenswrapper[5004]: E1201 08:17:57.758488 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 08:17:57 crc kubenswrapper[5004]: E1201 08:17:57.758556 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 08:17:57 crc kubenswrapper[5004]: I1201 08:17:57.852890 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:57 crc kubenswrapper[5004]: I1201 08:17:57.852949 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:57 crc kubenswrapper[5004]: I1201 08:17:57.852962 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:57 crc kubenswrapper[5004]: I1201 08:17:57.852982 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:57 crc kubenswrapper[5004]: I1201 08:17:57.852995 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:57Z","lastTransitionTime":"2025-12-01T08:17:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:57 crc kubenswrapper[5004]: I1201 08:17:57.955983 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:57 crc kubenswrapper[5004]: I1201 08:17:57.956042 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:57 crc kubenswrapper[5004]: I1201 08:17:57.956060 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:57 crc kubenswrapper[5004]: I1201 08:17:57.956084 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:57 crc kubenswrapper[5004]: I1201 08:17:57.956102 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:57Z","lastTransitionTime":"2025-12-01T08:17:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:58 crc kubenswrapper[5004]: I1201 08:17:58.058814 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:58 crc kubenswrapper[5004]: I1201 08:17:58.058877 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:58 crc kubenswrapper[5004]: I1201 08:17:58.058903 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:58 crc kubenswrapper[5004]: I1201 08:17:58.058933 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:58 crc kubenswrapper[5004]: I1201 08:17:58.058953 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:58Z","lastTransitionTime":"2025-12-01T08:17:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:58 crc kubenswrapper[5004]: I1201 08:17:58.161384 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:58 crc kubenswrapper[5004]: I1201 08:17:58.161431 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:58 crc kubenswrapper[5004]: I1201 08:17:58.161444 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:58 crc kubenswrapper[5004]: I1201 08:17:58.161461 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:58 crc kubenswrapper[5004]: I1201 08:17:58.161474 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:58Z","lastTransitionTime":"2025-12-01T08:17:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:58 crc kubenswrapper[5004]: I1201 08:17:58.264167 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:58 crc kubenswrapper[5004]: I1201 08:17:58.264229 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:58 crc kubenswrapper[5004]: I1201 08:17:58.264248 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:58 crc kubenswrapper[5004]: I1201 08:17:58.264271 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:58 crc kubenswrapper[5004]: I1201 08:17:58.264288 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:58Z","lastTransitionTime":"2025-12-01T08:17:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:58 crc kubenswrapper[5004]: I1201 08:17:58.367201 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:58 crc kubenswrapper[5004]: I1201 08:17:58.367255 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:58 crc kubenswrapper[5004]: I1201 08:17:58.367271 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:58 crc kubenswrapper[5004]: I1201 08:17:58.367294 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:58 crc kubenswrapper[5004]: I1201 08:17:58.367311 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:58Z","lastTransitionTime":"2025-12-01T08:17:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:58 crc kubenswrapper[5004]: I1201 08:17:58.469803 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:58 crc kubenswrapper[5004]: I1201 08:17:58.469853 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:58 crc kubenswrapper[5004]: I1201 08:17:58.469871 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:58 crc kubenswrapper[5004]: I1201 08:17:58.469895 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:58 crc kubenswrapper[5004]: I1201 08:17:58.469912 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:58Z","lastTransitionTime":"2025-12-01T08:17:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:58 crc kubenswrapper[5004]: I1201 08:17:58.532884 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 01 08:17:58 crc kubenswrapper[5004]: I1201 08:17:58.544651 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Dec 01 08:17:58 crc kubenswrapper[5004]: I1201 08:17:58.555197 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0ed2b6d-0c61-4639-bc3b-1c8effc4815d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45a0d7a12e7c114de1740a151e04f7f39844b709740cab958f156ac918af3228\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":
true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d69a370af9f8f6c575f99da6bb01cee0f660dccb7e85f16f5d8874d0e263e1fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c82baee0fd931fd7b41cd5a73ca7e653ef13dfa3d573991cbe3238d6b95f6fac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"cont
ainerID\\\":\\\"cri-o://aa5f341e99100e985f65c24647d976a24782dfae7f52e752b774f41f69af27fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad581564fc519328ae429bce757797f6d87d8648e862b008637a245866ec4cbe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T08:17:20Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 08:17:15.185125 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 08:17:15.186793 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4071633747/tls.crt::/tmp/serving-cert-4071633747/tls.key\\\\\\\"\\\\nI1201 08:17:20.827614 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 08:17:20.834528 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 08:17:20.834549 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 08:17:20.837611 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 08:17:20.837630 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 08:17:20.848940 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 08:17:20.848957 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 08:17:20.848961 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 08:17:20.848965 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 08:17:20.848968 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 08:17:20.848970 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 08:17:20.848973 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 08:17:20.849115 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 08:17:20.854990 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2caeb2ebaf68110cb8728e60aae6db1b1fc48efae6018f66070766a4f42caa69\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContaine
rStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2276a185fefa99f4e9fe63b7acce901ce2eb168f71a65f5fa97c1892aebce0e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2276a185fefa99f4e9fe63b7acce901ce2eb168f71a65f5fa97c1892aebce0e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:58Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:58 crc kubenswrapper[5004]: I1201 08:17:58.573246 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:58 crc kubenswrapper[5004]: I1201 08:17:58.573306 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:58 crc kubenswrapper[5004]: I1201 08:17:58.573322 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:58 crc 
kubenswrapper[5004]: I1201 08:17:58.573350 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:58 crc kubenswrapper[5004]: I1201 08:17:58.573370 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:58Z","lastTransitionTime":"2025-12-01T08:17:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:17:58 crc kubenswrapper[5004]: I1201 08:17:58.574877 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fd2f2b5d7dc3d7b8e1bb06fdf0eec3addfec3c230e044b6395d6cf535630ebd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b2250161e400b201f0528e33231c9007342002a37cd037a1b1e011f2aaaa22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:58Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:58 crc kubenswrapper[5004]: I1201 
08:17:58.598301 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zjksw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70e79009-93be-49c4-a6b3-e8a06bcea7f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad3ea204ad731500a340e639f0e36abe28ba50464f1b2ff9b009e39c234cf708\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/n
et.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd8m6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zjksw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:58Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:58 crc kubenswrapper[5004]: I1201 
08:17:58.630162 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15cdec0a-5925-4966-a30b-f60c503f633e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f528b94ff12c7a804930dca4b93c23334a9ae3a53317750cd34b5695f08b4582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f001b70a83da28b1ea78a94f1cd70dfc63b3de5a21b041932d33c2ac970c986b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7c5c043438b515e37a6d6d5c47b92479bbb7322fe35c3a1bc57ee7b3a746544\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23df33e45cc27e4d343ae6ec3fb88f2dfc125ba7da424515d4bee5ec19281832\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4a7cab8e1594814a687b1433a92642a19eff7c2eb12f96fa06f08a2e270733d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1844a18a6109c25a25b9c8cb3ff65e69cc657f66be03dd97e883d24a774b7638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76c127e5ff47d030a709c79b7c7f2c2d3b32009052ff95ff522d90ecd86cf993\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76c127e5ff47d030a709c79b7c7f2c2d3b32009052ff95ff522d90ecd86cf993\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T08:17:52Z\\\",\\\"message\\\":\\\"}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1201 08:17:52.744847 6628 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service 
k8s.ovn.org/owner:openshift-marketplace/community-operators]} name:Service_openshift-marketplace/community-operators_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.189:50051:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d389393c-7ba9-422c-b3f5-06e391d537d2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1201 08:17:52.746408 6628 model_client.go:382] Update operations generated as: [{Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.59 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {dce28c51-c9f1-478b-97c8-7e209d6e7cbe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1201 08:17:52.746269 6628 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:51Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-knmdv_openshift-ovn-kubernetes(15cdec0a-5925-4966-a30b-f60c503f633e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f231e0092e1a85cc5d6e00fb8d3bd982223b670191ac3dc21d81ff1fab462a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97c9fab31c2655b5d579a2ed9b837229ee678973643c6a526c1241e3c84e315c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97c9fab31c2655b5d5
79a2ed9b837229ee678973643c6a526c1241e3c84e315c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-knmdv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:58Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:58 crc kubenswrapper[5004]: I1201 08:17:58.658248 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://327477e7db10129d47ddf9078f32d0df2ffd4b8e59fc7bb77c17c60c63e9a936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:58Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:58 crc kubenswrapper[5004]: I1201 08:17:58.677089 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:58 crc kubenswrapper[5004]: I1201 08:17:58.677145 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:58 crc kubenswrapper[5004]: I1201 08:17:58.677163 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:58 crc kubenswrapper[5004]: I1201 08:17:58.677185 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:58 crc kubenswrapper[5004]: I1201 08:17:58.677202 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:58Z","lastTransitionTime":"2025-12-01T08:17:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:58 crc kubenswrapper[5004]: I1201 08:17:58.679692 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:58Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:58 crc kubenswrapper[5004]: I1201 08:17:58.697003 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:58Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:58 crc kubenswrapper[5004]: I1201 08:17:58.716841 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03c470edd984db3b756e031e33f64a9bc5ca6b9c156ab479e5cd647745655a5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T08:17:58Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:58 crc kubenswrapper[5004]: I1201 08:17:58.734902 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mzsvz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"397b51b7-934a-41d1-a593-500a64161bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4adac5633dfe89777bf019818bab9ee3a208cad9d929c96cf2cb86b18c2d4264\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4h49g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d83844478080e4829975fb6c8e0444d9fdebd44b08afcf45e7d0b04fc534a844\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4h49g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mzsvz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:58Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:58 crc kubenswrapper[5004]: I1201 08:17:58.758794 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-7cl5l" Dec 01 08:17:58 crc kubenswrapper[5004]: E1201 08:17:58.759041 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7cl5l" podUID="b488f4f3-d385-4d40-bdee-96d8fe2d42a1" Dec 01 08:17:58 crc kubenswrapper[5004]: I1201 08:17:58.759749 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43c9f762-d403-49b9-8294-d66976aca4dd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b358f641ff09bc8f4a452b0d4a8cf5f9790182f76779be31426d48d0278b0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"last
State\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d610c3a799d4a7df9ddfe2a28f2ced2e5ac4e96c67c93f8e6d3a0eee60d9ac48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43f9f72af3273da055eeb1e13ad6258a1b3b6b205402861beee9f87f02230297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T
08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38e3cc2eca559417c71beed561569c4dc5b45ffb8407976e0695fc1b93219d37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c24fbeedbc7c40b6079b7ad89d8564a4f2cdaad2cf996e66682b082382da190\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"
192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00e163cd374befc8b8c6cdfe3916235cd63280474c2a09eeebf108fb75d378a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00e163cd374befc8b8c6cdfe3916235cd63280474c2a09eeebf108fb75d378a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37376e093170ad4138e102e3a1e2c2b45b3944bf36b0d77dce126bd493d60803\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37376e093170ad4138e102e3a1e2c2b45b3944bf36b0d77dce126bd493d60803\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e4db8d673550f14f3614e47dc0bf81e0920753be13fb01c7ef0d6bd3716611f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4db8d673550f14f3614e47dc0bf81e0920753be13fb01c7ef0d6bd3716611f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:58Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:58 crc kubenswrapper[5004]: I1201 08:17:58.778555 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:58Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:58 crc kubenswrapper[5004]: I1201 08:17:58.779847 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:58 crc kubenswrapper[5004]: I1201 08:17:58.779881 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:58 crc kubenswrapper[5004]: I1201 08:17:58.779893 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:58 crc kubenswrapper[5004]: I1201 08:17:58.779909 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:58 crc kubenswrapper[5004]: I1201 08:17:58.779923 5004 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:58Z","lastTransitionTime":"2025-12-01T08:17:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:17:58 crc kubenswrapper[5004]: I1201 08:17:58.793152 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jjms6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8828af41-beeb-47dd-96cf-3dbcb5175893\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b22220849cdb91592a44a67f9f6a09586146009483f46a0505a2dea3b02bccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvsch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jjms6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:58Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:58 crc kubenswrapper[5004]: I1201 08:17:58.807213 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9977ebb-82de-4e96-8763-0b5a84f8d4ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://baf84c2eea7167eed95a4195d37bf784ffef310bc2079d8d06e1f47cb45a7864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pwgxq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c1e51fa92aec80e93d0fce11c743b8c4fde23fc
23d8626e8d5b3e25ab42350d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pwgxq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fvdgt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:58Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:58 crc kubenswrapper[5004]: I1201 08:17:58.823129 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7cl5l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b488f4f3-d385-4d40-bdee-96d8fe2d42a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-glbhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-glbhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:37Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7cl5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:58Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:58 crc 
kubenswrapper[5004]: I1201 08:17:58.837629 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2fa377e-a3a6-46ce-af23-e677799f0115\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b520771cf1ad6f9360064c5f304243c5878a2f1025c87d1001c97b2d5f95cb9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b65b9bc444101900c62fd90c7d45789d186a89148ac0fd9c7f0c6048c3f55ad\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edbd3e682d207b8c18d6bcbc26aa2adae2a35645502d31b327ffddd51991e842\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4563974ed1467a87aab104756828e63853c282a61c4fcab1e757eb4da9afaa40\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:58Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:58 crc kubenswrapper[5004]: I1201 08:17:58.852365 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dpkxw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddcaf111-708a-45b3-a342-effd3061ab17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68ecba515f05ca83fdd0cdda10e3e5925a146aadb70ae17859586c12daf55dac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd2f03a8020818d1cc98b6eeaf64a728e627a3401400c3af53654f8771ef02eb\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd2f03a8020818d1cc98b6eeaf64a728e627a3401400c3af53654f8771ef02eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8dd31c10eaf6bc673dec377a884040524418dd1891a96b2eb6e6e983979512a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8dd31c10eaf6bc673dec377a884040524418dd1891a96b2eb6e6e983979512a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a7a8ef66a9486ca3d13826df7aa39db2083cff2c386b4d0d858e0526e0f094d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a7a8ef66a9486ca3d13826df7aa39db2083cff2c386b4d0d858e0526e0f094d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd1c4
7c976c17bf17e4d660288ce30472372b6a7f33f6ffa6d96c1a4436275d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd1c47c976c17bf17e4d660288ce30472372b6a7f33f6ffa6d96c1a4436275d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b7c8e9648e5514ab1fcb63dc72822dd276a2d74987e2fb4f4800b26ec176be7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b7c8e9648e5514ab1fcb63dc72822dd276a2d74987e2fb4f4800b26ec176be7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:28Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d7115db7fe248437f4b8918f06c9aa48b141fcd3f31add80faee81b4347e47b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d7115db7fe248437f4b8918f06c9aa48b141fcd3f31add80faee81b4347e47b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dpkxw\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:58Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:58 crc kubenswrapper[5004]: I1201 08:17:58.861692 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ww6lq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72500ffd-4ca3-4614-a3a2-bbdc5a7506c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://668cfaa2af20e2bb082fc47e1702cfd9f704c4fdf56a4d27cf25d6915e7cd18a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2025-12-01T08:17:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmsvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ww6lq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:17:58Z is after 2025-08-24T17:21:41Z" Dec 01 08:17:58 crc kubenswrapper[5004]: I1201 08:17:58.882876 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:58 crc kubenswrapper[5004]: I1201 08:17:58.882915 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:58 crc kubenswrapper[5004]: I1201 08:17:58.882933 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:58 crc kubenswrapper[5004]: I1201 08:17:58.882955 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:58 crc kubenswrapper[5004]: I1201 08:17:58.882976 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:58Z","lastTransitionTime":"2025-12-01T08:17:58Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:17:58 crc kubenswrapper[5004]: I1201 08:17:58.986375 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:58 crc kubenswrapper[5004]: I1201 08:17:58.986444 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:58 crc kubenswrapper[5004]: I1201 08:17:58.986461 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:58 crc kubenswrapper[5004]: I1201 08:17:58.986485 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:58 crc kubenswrapper[5004]: I1201 08:17:58.986502 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:58Z","lastTransitionTime":"2025-12-01T08:17:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:59 crc kubenswrapper[5004]: I1201 08:17:59.089511 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:59 crc kubenswrapper[5004]: I1201 08:17:59.089611 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:59 crc kubenswrapper[5004]: I1201 08:17:59.089631 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:59 crc kubenswrapper[5004]: I1201 08:17:59.089656 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:59 crc kubenswrapper[5004]: I1201 08:17:59.089675 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:59Z","lastTransitionTime":"2025-12-01T08:17:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:59 crc kubenswrapper[5004]: I1201 08:17:59.191812 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:59 crc kubenswrapper[5004]: I1201 08:17:59.191863 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:59 crc kubenswrapper[5004]: I1201 08:17:59.191875 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:59 crc kubenswrapper[5004]: I1201 08:17:59.191894 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:59 crc kubenswrapper[5004]: I1201 08:17:59.191908 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:59Z","lastTransitionTime":"2025-12-01T08:17:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:59 crc kubenswrapper[5004]: I1201 08:17:59.294216 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:59 crc kubenswrapper[5004]: I1201 08:17:59.294264 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:59 crc kubenswrapper[5004]: I1201 08:17:59.294276 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:59 crc kubenswrapper[5004]: I1201 08:17:59.294333 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:59 crc kubenswrapper[5004]: I1201 08:17:59.294352 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:59Z","lastTransitionTime":"2025-12-01T08:17:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:59 crc kubenswrapper[5004]: I1201 08:17:59.404770 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:59 crc kubenswrapper[5004]: I1201 08:17:59.404809 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:59 crc kubenswrapper[5004]: I1201 08:17:59.404818 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:59 crc kubenswrapper[5004]: I1201 08:17:59.404834 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:59 crc kubenswrapper[5004]: I1201 08:17:59.404843 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:59Z","lastTransitionTime":"2025-12-01T08:17:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:59 crc kubenswrapper[5004]: I1201 08:17:59.508018 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:59 crc kubenswrapper[5004]: I1201 08:17:59.508063 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:59 crc kubenswrapper[5004]: I1201 08:17:59.508075 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:59 crc kubenswrapper[5004]: I1201 08:17:59.508094 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:59 crc kubenswrapper[5004]: I1201 08:17:59.508107 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:59Z","lastTransitionTime":"2025-12-01T08:17:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:59 crc kubenswrapper[5004]: I1201 08:17:59.610778 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:59 crc kubenswrapper[5004]: I1201 08:17:59.610829 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:59 crc kubenswrapper[5004]: I1201 08:17:59.610851 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:59 crc kubenswrapper[5004]: I1201 08:17:59.610875 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:59 crc kubenswrapper[5004]: I1201 08:17:59.610893 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:59Z","lastTransitionTime":"2025-12-01T08:17:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:59 crc kubenswrapper[5004]: I1201 08:17:59.714245 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:59 crc kubenswrapper[5004]: I1201 08:17:59.714336 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:59 crc kubenswrapper[5004]: I1201 08:17:59.714358 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:59 crc kubenswrapper[5004]: I1201 08:17:59.714386 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:59 crc kubenswrapper[5004]: I1201 08:17:59.714407 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:59Z","lastTransitionTime":"2025-12-01T08:17:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:17:59 crc kubenswrapper[5004]: I1201 08:17:59.758242 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 08:17:59 crc kubenswrapper[5004]: I1201 08:17:59.758234 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 08:17:59 crc kubenswrapper[5004]: I1201 08:17:59.758400 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 08:17:59 crc kubenswrapper[5004]: E1201 08:17:59.758624 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 08:17:59 crc kubenswrapper[5004]: E1201 08:17:59.758884 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 08:17:59 crc kubenswrapper[5004]: E1201 08:17:59.758958 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 08:17:59 crc kubenswrapper[5004]: I1201 08:17:59.817183 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:59 crc kubenswrapper[5004]: I1201 08:17:59.817271 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:59 crc kubenswrapper[5004]: I1201 08:17:59.817293 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:59 crc kubenswrapper[5004]: I1201 08:17:59.817321 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:59 crc kubenswrapper[5004]: I1201 08:17:59.817345 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:59Z","lastTransitionTime":"2025-12-01T08:17:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:17:59 crc kubenswrapper[5004]: I1201 08:17:59.920005 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:17:59 crc kubenswrapper[5004]: I1201 08:17:59.920060 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:17:59 crc kubenswrapper[5004]: I1201 08:17:59.920082 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:17:59 crc kubenswrapper[5004]: I1201 08:17:59.920105 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:17:59 crc kubenswrapper[5004]: I1201 08:17:59.920122 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:17:59Z","lastTransitionTime":"2025-12-01T08:17:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:00 crc kubenswrapper[5004]: I1201 08:18:00.022767 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:00 crc kubenswrapper[5004]: I1201 08:18:00.022834 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:00 crc kubenswrapper[5004]: I1201 08:18:00.022852 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:00 crc kubenswrapper[5004]: I1201 08:18:00.022876 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:00 crc kubenswrapper[5004]: I1201 08:18:00.022894 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:00Z","lastTransitionTime":"2025-12-01T08:18:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:00 crc kubenswrapper[5004]: I1201 08:18:00.125337 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:00 crc kubenswrapper[5004]: I1201 08:18:00.125397 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:00 crc kubenswrapper[5004]: I1201 08:18:00.125413 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:00 crc kubenswrapper[5004]: I1201 08:18:00.125435 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:00 crc kubenswrapper[5004]: I1201 08:18:00.125451 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:00Z","lastTransitionTime":"2025-12-01T08:18:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:00 crc kubenswrapper[5004]: I1201 08:18:00.227853 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:00 crc kubenswrapper[5004]: I1201 08:18:00.227896 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:00 crc kubenswrapper[5004]: I1201 08:18:00.227904 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:00 crc kubenswrapper[5004]: I1201 08:18:00.227920 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:00 crc kubenswrapper[5004]: I1201 08:18:00.227929 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:00Z","lastTransitionTime":"2025-12-01T08:18:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:00 crc kubenswrapper[5004]: I1201 08:18:00.330290 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:00 crc kubenswrapper[5004]: I1201 08:18:00.330317 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:00 crc kubenswrapper[5004]: I1201 08:18:00.330324 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:00 crc kubenswrapper[5004]: I1201 08:18:00.330336 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:00 crc kubenswrapper[5004]: I1201 08:18:00.330345 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:00Z","lastTransitionTime":"2025-12-01T08:18:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:00 crc kubenswrapper[5004]: I1201 08:18:00.433452 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:00 crc kubenswrapper[5004]: I1201 08:18:00.433495 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:00 crc kubenswrapper[5004]: I1201 08:18:00.433508 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:00 crc kubenswrapper[5004]: I1201 08:18:00.433525 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:00 crc kubenswrapper[5004]: I1201 08:18:00.433536 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:00Z","lastTransitionTime":"2025-12-01T08:18:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:00 crc kubenswrapper[5004]: I1201 08:18:00.535195 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:00 crc kubenswrapper[5004]: I1201 08:18:00.535222 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:00 crc kubenswrapper[5004]: I1201 08:18:00.535229 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:00 crc kubenswrapper[5004]: I1201 08:18:00.535251 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:00 crc kubenswrapper[5004]: I1201 08:18:00.535259 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:00Z","lastTransitionTime":"2025-12-01T08:18:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:00 crc kubenswrapper[5004]: I1201 08:18:00.637388 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:00 crc kubenswrapper[5004]: I1201 08:18:00.637433 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:00 crc kubenswrapper[5004]: I1201 08:18:00.637441 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:00 crc kubenswrapper[5004]: I1201 08:18:00.637459 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:00 crc kubenswrapper[5004]: I1201 08:18:00.637468 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:00Z","lastTransitionTime":"2025-12-01T08:18:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:00 crc kubenswrapper[5004]: I1201 08:18:00.740315 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:00 crc kubenswrapper[5004]: I1201 08:18:00.740378 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:00 crc kubenswrapper[5004]: I1201 08:18:00.740395 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:00 crc kubenswrapper[5004]: I1201 08:18:00.740418 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:00 crc kubenswrapper[5004]: I1201 08:18:00.740435 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:00Z","lastTransitionTime":"2025-12-01T08:18:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:18:00 crc kubenswrapper[5004]: I1201 08:18:00.758883 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7cl5l" Dec 01 08:18:00 crc kubenswrapper[5004]: E1201 08:18:00.759048 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7cl5l" podUID="b488f4f3-d385-4d40-bdee-96d8fe2d42a1" Dec 01 08:18:00 crc kubenswrapper[5004]: I1201 08:18:00.843519 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:00 crc kubenswrapper[5004]: I1201 08:18:00.843733 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:00 crc kubenswrapper[5004]: I1201 08:18:00.843757 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:00 crc kubenswrapper[5004]: I1201 08:18:00.843781 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:00 crc kubenswrapper[5004]: I1201 08:18:00.843798 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:00Z","lastTransitionTime":"2025-12-01T08:18:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:00 crc kubenswrapper[5004]: I1201 08:18:00.946290 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:00 crc kubenswrapper[5004]: I1201 08:18:00.946336 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:00 crc kubenswrapper[5004]: I1201 08:18:00.946352 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:00 crc kubenswrapper[5004]: I1201 08:18:00.946377 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:00 crc kubenswrapper[5004]: I1201 08:18:00.946393 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:00Z","lastTransitionTime":"2025-12-01T08:18:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:01 crc kubenswrapper[5004]: I1201 08:18:01.049836 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:01 crc kubenswrapper[5004]: I1201 08:18:01.049907 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:01 crc kubenswrapper[5004]: I1201 08:18:01.049931 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:01 crc kubenswrapper[5004]: I1201 08:18:01.049963 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:01 crc kubenswrapper[5004]: I1201 08:18:01.049985 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:01Z","lastTransitionTime":"2025-12-01T08:18:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:01 crc kubenswrapper[5004]: I1201 08:18:01.152475 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:01 crc kubenswrapper[5004]: I1201 08:18:01.152537 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:01 crc kubenswrapper[5004]: I1201 08:18:01.152556 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:01 crc kubenswrapper[5004]: I1201 08:18:01.152628 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:01 crc kubenswrapper[5004]: I1201 08:18:01.152649 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:01Z","lastTransitionTime":"2025-12-01T08:18:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:01 crc kubenswrapper[5004]: I1201 08:18:01.259148 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:01 crc kubenswrapper[5004]: I1201 08:18:01.259215 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:01 crc kubenswrapper[5004]: I1201 08:18:01.259239 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:01 crc kubenswrapper[5004]: I1201 08:18:01.259266 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:01 crc kubenswrapper[5004]: I1201 08:18:01.259287 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:01Z","lastTransitionTime":"2025-12-01T08:18:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:01 crc kubenswrapper[5004]: I1201 08:18:01.361962 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:01 crc kubenswrapper[5004]: I1201 08:18:01.362223 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:01 crc kubenswrapper[5004]: I1201 08:18:01.362253 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:01 crc kubenswrapper[5004]: I1201 08:18:01.362287 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:01 crc kubenswrapper[5004]: I1201 08:18:01.362309 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:01Z","lastTransitionTime":"2025-12-01T08:18:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:01 crc kubenswrapper[5004]: I1201 08:18:01.465326 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:01 crc kubenswrapper[5004]: I1201 08:18:01.465379 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:01 crc kubenswrapper[5004]: I1201 08:18:01.465395 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:01 crc kubenswrapper[5004]: I1201 08:18:01.465425 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:01 crc kubenswrapper[5004]: I1201 08:18:01.465444 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:01Z","lastTransitionTime":"2025-12-01T08:18:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:01 crc kubenswrapper[5004]: I1201 08:18:01.568996 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:01 crc kubenswrapper[5004]: I1201 08:18:01.569057 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:01 crc kubenswrapper[5004]: I1201 08:18:01.569074 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:01 crc kubenswrapper[5004]: I1201 08:18:01.569100 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:01 crc kubenswrapper[5004]: I1201 08:18:01.569118 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:01Z","lastTransitionTime":"2025-12-01T08:18:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:01 crc kubenswrapper[5004]: I1201 08:18:01.672293 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:01 crc kubenswrapper[5004]: I1201 08:18:01.672357 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:01 crc kubenswrapper[5004]: I1201 08:18:01.672383 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:01 crc kubenswrapper[5004]: I1201 08:18:01.672413 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:01 crc kubenswrapper[5004]: I1201 08:18:01.672438 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:01Z","lastTransitionTime":"2025-12-01T08:18:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:18:01 crc kubenswrapper[5004]: I1201 08:18:01.758890 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 08:18:01 crc kubenswrapper[5004]: I1201 08:18:01.758959 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 08:18:01 crc kubenswrapper[5004]: I1201 08:18:01.758969 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 08:18:01 crc kubenswrapper[5004]: E1201 08:18:01.759059 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 08:18:01 crc kubenswrapper[5004]: E1201 08:18:01.759259 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 08:18:01 crc kubenswrapper[5004]: E1201 08:18:01.759373 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 08:18:01 crc kubenswrapper[5004]: I1201 08:18:01.775347 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:01 crc kubenswrapper[5004]: I1201 08:18:01.775414 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:01 crc kubenswrapper[5004]: I1201 08:18:01.775441 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:01 crc kubenswrapper[5004]: I1201 08:18:01.775470 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:01 crc kubenswrapper[5004]: I1201 08:18:01.775491 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:01Z","lastTransitionTime":"2025-12-01T08:18:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:01 crc kubenswrapper[5004]: I1201 08:18:01.878069 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:01 crc kubenswrapper[5004]: I1201 08:18:01.878145 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:01 crc kubenswrapper[5004]: I1201 08:18:01.878168 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:01 crc kubenswrapper[5004]: I1201 08:18:01.878198 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:01 crc kubenswrapper[5004]: I1201 08:18:01.878221 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:01Z","lastTransitionTime":"2025-12-01T08:18:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:01 crc kubenswrapper[5004]: I1201 08:18:01.981433 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:01 crc kubenswrapper[5004]: I1201 08:18:01.981494 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:01 crc kubenswrapper[5004]: I1201 08:18:01.981513 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:01 crc kubenswrapper[5004]: I1201 08:18:01.981537 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:01 crc kubenswrapper[5004]: I1201 08:18:01.981556 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:01Z","lastTransitionTime":"2025-12-01T08:18:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:02 crc kubenswrapper[5004]: I1201 08:18:02.084360 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:02 crc kubenswrapper[5004]: I1201 08:18:02.084421 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:02 crc kubenswrapper[5004]: I1201 08:18:02.084438 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:02 crc kubenswrapper[5004]: I1201 08:18:02.084465 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:02 crc kubenswrapper[5004]: I1201 08:18:02.084482 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:02Z","lastTransitionTime":"2025-12-01T08:18:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:02 crc kubenswrapper[5004]: I1201 08:18:02.187674 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:02 crc kubenswrapper[5004]: I1201 08:18:02.187741 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:02 crc kubenswrapper[5004]: I1201 08:18:02.187777 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:02 crc kubenswrapper[5004]: I1201 08:18:02.187806 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:02 crc kubenswrapper[5004]: I1201 08:18:02.187826 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:02Z","lastTransitionTime":"2025-12-01T08:18:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:02 crc kubenswrapper[5004]: I1201 08:18:02.290541 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:02 crc kubenswrapper[5004]: I1201 08:18:02.290638 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:02 crc kubenswrapper[5004]: I1201 08:18:02.290656 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:02 crc kubenswrapper[5004]: I1201 08:18:02.290686 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:02 crc kubenswrapper[5004]: I1201 08:18:02.290704 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:02Z","lastTransitionTime":"2025-12-01T08:18:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:02 crc kubenswrapper[5004]: I1201 08:18:02.394846 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:02 crc kubenswrapper[5004]: I1201 08:18:02.394898 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:02 crc kubenswrapper[5004]: I1201 08:18:02.394916 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:02 crc kubenswrapper[5004]: I1201 08:18:02.394941 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:02 crc kubenswrapper[5004]: I1201 08:18:02.394960 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:02Z","lastTransitionTime":"2025-12-01T08:18:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:02 crc kubenswrapper[5004]: I1201 08:18:02.497991 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:02 crc kubenswrapper[5004]: I1201 08:18:02.498044 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:02 crc kubenswrapper[5004]: I1201 08:18:02.498063 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:02 crc kubenswrapper[5004]: I1201 08:18:02.498087 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:02 crc kubenswrapper[5004]: I1201 08:18:02.498103 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:02Z","lastTransitionTime":"2025-12-01T08:18:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:02 crc kubenswrapper[5004]: I1201 08:18:02.600177 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:02 crc kubenswrapper[5004]: I1201 08:18:02.600241 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:02 crc kubenswrapper[5004]: I1201 08:18:02.600256 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:02 crc kubenswrapper[5004]: I1201 08:18:02.600282 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:02 crc kubenswrapper[5004]: I1201 08:18:02.600299 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:02Z","lastTransitionTime":"2025-12-01T08:18:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:02 crc kubenswrapper[5004]: I1201 08:18:02.703278 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:02 crc kubenswrapper[5004]: I1201 08:18:02.703402 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:02 crc kubenswrapper[5004]: I1201 08:18:02.703426 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:02 crc kubenswrapper[5004]: I1201 08:18:02.703527 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:02 crc kubenswrapper[5004]: I1201 08:18:02.703632 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:02Z","lastTransitionTime":"2025-12-01T08:18:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:18:02 crc kubenswrapper[5004]: I1201 08:18:02.758833 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7cl5l" Dec 01 08:18:02 crc kubenswrapper[5004]: E1201 08:18:02.759029 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7cl5l" podUID="b488f4f3-d385-4d40-bdee-96d8fe2d42a1" Dec 01 08:18:02 crc kubenswrapper[5004]: I1201 08:18:02.786236 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dpkxw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddcaf111-708a-45b3-a342-effd3061ab17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68ecba515f05ca83fdd0cdda10e3e5925a146aadb70ae17859586c12daf55dac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd2f03a8020818d1cc98b6eeaf64a728e627a3401400c3af53654f8771ef02eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd2f03a8020818d1cc98b6eeaf64a728e627a3401400c3af53654f8771ef02eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8dd31c10eaf6bc673dec377a884040524418dd1891a96b2eb6e6e983979512a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,
\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8dd31c10eaf6bc673dec377a884040524418dd1891a96b2eb6e6e983979512a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a7a8ef66a9486ca3d13826df7aa39db2083cff2c386b4d0d858e0526e0f094d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a7a8ef66a9486ca3d13826df7aa39db2083cff2c386b4d0d858e0526e0f094d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\
\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd1c47c976c17bf17e4d660288ce30472372b6a7f33f6ffa6d96c1a4436275d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd1c47c976c17bf17e4d660288ce30472372b6a7f33f6ffa6d96c1a4436275d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b7c8e9648e5514ab1fcb63dc72822dd276a2d74987e2fb4f4800b26ec176be7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wherea
bouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b7c8e9648e5514ab1fcb63dc72822dd276a2d74987e2fb4f4800b26ec176be7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d7115db7fe248437f4b8918f06c9aa48b141fcd3f31add80faee81b4347e47b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d7115db7fe248437f4b8918f06c9aa48b141fcd3f31add80faee81b4347e47b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveRe
adOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dpkxw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:02Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:02 crc kubenswrapper[5004]: I1201 08:18:02.803269 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ww6lq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72500ffd-4ca3-4614-a3a2-bbdc5a7506c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://668cfaa2af20e2bb082fc47e1702cfd9f704c4fdf56a4d27cf25d6915e7cd18a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"i
mageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmsvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ww6lq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:02Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:02 crc kubenswrapper[5004]: I1201 08:18:02.807098 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:02 crc kubenswrapper[5004]: I1201 08:18:02.807128 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:02 crc kubenswrapper[5004]: I1201 08:18:02.807140 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:02 crc kubenswrapper[5004]: I1201 08:18:02.807157 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Dec 01 08:18:02 crc kubenswrapper[5004]: I1201 08:18:02.807169 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:02Z","lastTransitionTime":"2025-12-01T08:18:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:18:02 crc kubenswrapper[5004]: I1201 08:18:02.821881 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2fa377e-a3a6-46ce-af23-e677799f0115\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b520771cf1ad6f9360064c5f304243c5878a2f1025c87d1001c97b2d5f95cb9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b0
84652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b65b9bc444101900c62fd90c7d45789d186a89148ac0fd9c7f0c6048c3f55ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edbd3e682d207b8c18d6bcbc26aa2adae2a35645502d31b327ffddd51991e842\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"na
me\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4563974ed1467a87aab104756828e63853c282a61c4fcab1e757eb4da9afaa40\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:02Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:02 crc kubenswrapper[5004]: I1201 08:18:02.842140 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0ed2b6d-0c61-4639-bc3b-1c8effc4815d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45a0d7a12e7c114de1740a151e04f7f39844b709740cab958f156ac918af3228\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d69a370af9f8f6c575f99da6bb01cee0f660dccb7e85f16f5d8874d0e263e1fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c82baee0fd931fd7b41cd5a73ca7e653ef13dfa3d573991cbe3238d6b95f6fac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa5f341e99100e985f65c24647d976a24782dfae7f52e752b774f41f69af27fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad581564fc519328ae429bce757797f6d87d8648e862b008637a245866ec4cbe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T08:17:20Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 08:17:15.185125 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 08:17:15.186793 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4071633747/tls.crt::/tmp/serving-cert-4071633747/tls.key\\\\\\\"\\\\nI1201 08:17:20.827614 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 08:17:20.834528 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 08:17:20.834549 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 08:17:20.837611 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 08:17:20.837630 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 08:17:20.848940 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 08:17:20.848957 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 08:17:20.848961 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 08:17:20.848965 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 08:17:20.848968 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 08:17:20.848970 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 08:17:20.848973 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 08:17:20.849115 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1201 08:17:20.854990 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2caeb2ebaf68110cb8728e60aae6db1b1fc48efae6018f66070766a4f42caa69\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2276a185fefa99f4e9fe63b7acce901ce2eb168f71a65f5fa97c1892aebce0e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2276a185fefa99f4e9fe63b7acce901ce
2eb168f71a65f5fa97c1892aebce0e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:02Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:02 crc kubenswrapper[5004]: I1201 08:18:02.859981 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4ee478f-9254-4e56-96be-f5a83ff5d77c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37438df65f4dab6700f193e84f81d8ed41b3c208c3f6150bd5836c0642190572\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f873da9c594c885720113b0d0fc01552050d030e802074043e9ede174fd9b16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ca757be16a55ba8df0e9629f7cc2653e2804a5ee5a2151ee3b07d2c30fe5b07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3930a1c88c6014beb768ed20a37359ade08e77496a86b913f68c6196134f13a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://f3930a1c88c6014beb768ed20a37359ade08e77496a86b913f68c6196134f13a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:02Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:02Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:02 crc kubenswrapper[5004]: I1201 08:18:02.880807 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://327477e7db10129d47ddf9078f32d0df2ffd4b8e59fc7bb77c17c60c63e9a936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:02Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:02 crc kubenswrapper[5004]: I1201 08:18:02.898046 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:02Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:02 crc kubenswrapper[5004]: I1201 08:18:02.910863 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:02 crc kubenswrapper[5004]: I1201 08:18:02.910922 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:02 crc kubenswrapper[5004]: I1201 08:18:02.910945 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:02 crc kubenswrapper[5004]: I1201 08:18:02.910974 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:02 crc kubenswrapper[5004]: I1201 08:18:02.910994 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:02Z","lastTransitionTime":"2025-12-01T08:18:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:18:02 crc kubenswrapper[5004]: I1201 08:18:02.915931 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:02Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:02 crc kubenswrapper[5004]: I1201 08:18:02.930636 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03c470edd984db3b756e031e33f64a9bc5ca6b9c156ab479e5cd647745655a5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T08:18:02Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:02 crc kubenswrapper[5004]: I1201 08:18:02.946368 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fd2f2b5d7dc3d7b8e1bb06fdf0eec3addfec3c230e044b6395d6cf535630ebd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b2250161e400b
201f0528e33231c9007342002a37cd037a1b1e011f2aaaa22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:02Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:02 crc kubenswrapper[5004]: I1201 08:18:02.965685 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zjksw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"70e79009-93be-49c4-a6b3-e8a06bcea7f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad3ea204ad731500a340e639f0e36abe28ba50464f1b2ff9b009e39c234cf708\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd8m6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zjksw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:02Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:02 crc kubenswrapper[5004]: I1201 08:18:02.995164 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15cdec0a-5925-4966-a30b-f60c503f633e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f528b94ff12c7a804930dca4b93c23334a9ae3a53317750cd34b5695f08b4582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f001b70a83da28b1ea78a94f1cd70dfc63b3de5a21b041932d33c2ac970c986b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7c5c043438b515e37a6d6d5c47b92479bbb7322fe35c3a1bc57ee7b3a746544\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23df33e45cc27e4d343ae6ec3fb88f2dfc125ba7da424515d4bee5ec19281832\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4a7cab8e1594814a687b1433a92642a19eff7c2eb12f96fa06f08a2e270733d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1844a18a6109c25a25b9c8cb3ff65e69cc657f66be03dd97e883d24a774b7638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76c127e5ff47d030a709c79b7c7f2c2d3b32009052ff95ff522d90ecd86cf993\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76c127e5ff47d030a709c79b7c7f2c2d3b32009052ff95ff522d90ecd86cf993\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T08:17:52Z\\\",\\\"message\\\":\\\"}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1201 08:17:52.744847 6628 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service 
k8s.ovn.org/owner:openshift-marketplace/community-operators]} name:Service_openshift-marketplace/community-operators_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.189:50051:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d389393c-7ba9-422c-b3f5-06e391d537d2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1201 08:17:52.746408 6628 model_client.go:382] Update operations generated as: [{Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.59 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {dce28c51-c9f1-478b-97c8-7e209d6e7cbe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1201 08:17:52.746269 6628 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:51Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-knmdv_openshift-ovn-kubernetes(15cdec0a-5925-4966-a30b-f60c503f633e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f231e0092e1a85cc5d6e00fb8d3bd982223b670191ac3dc21d81ff1fab462a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97c9fab31c2655b5d579a2ed9b837229ee678973643c6a526c1241e3c84e315c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97c9fab31c2655b5d5
79a2ed9b837229ee678973643c6a526c1241e3c84e315c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-knmdv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:02Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:03 crc kubenswrapper[5004]: I1201 08:18:03.013187 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mzsvz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"397b51b7-934a-41d1-a593-500a64161bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4adac5633dfe89777bf019818bab9ee3a208cad9d929c96cf2cb86b18c2d4264\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4h49g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d83844478080e4829975fb6c8e0444d9fdebd
44b08afcf45e7d0b04fc534a844\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4h49g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mzsvz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:03Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:03 crc kubenswrapper[5004]: I1201 08:18:03.013586 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:03 crc kubenswrapper[5004]: I1201 08:18:03.013615 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:03 crc kubenswrapper[5004]: I1201 08:18:03.013623 5004 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:03 crc kubenswrapper[5004]: I1201 08:18:03.013635 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:03 crc kubenswrapper[5004]: I1201 08:18:03.013646 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:03Z","lastTransitionTime":"2025-12-01T08:18:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:18:03 crc kubenswrapper[5004]: I1201 08:18:03.032099 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:03Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:03 crc kubenswrapper[5004]: I1201 08:18:03.049222 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jjms6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8828af41-beeb-47dd-96cf-3dbcb5175893\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b22220849cdb91592a44a67f9f6a09586146009483f46a0505a2dea3b02bccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvsch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jjms6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:03Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:03 crc kubenswrapper[5004]: I1201 08:18:03.068068 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9977ebb-82de-4e96-8763-0b5a84f8d4ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://baf84c2eea7167eed95a4195d37bf784ffef310bc2079d8d06e1f47cb45a7864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pwgxq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c1e51fa92aec80e93d0fce11c743b8c4fde23fc23d8626e8d5b3e25ab42350d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pwgxq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fvdgt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:03Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:03 crc kubenswrapper[5004]: I1201 08:18:03.083647 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7cl5l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b488f4f3-d385-4d40-bdee-96d8fe2d42a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-glbhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-glbhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:37Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7cl5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:03Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:03 crc 
kubenswrapper[5004]: I1201 08:18:03.113166 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43c9f762-d403-49b9-8294-d66976aca4dd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b358f641ff09bc8f4a452b0d4a8cf5f9790182f76779be31426d48d0278b0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://d610c3a799d4a7df9ddfe2a28f2ced2e5ac4e96c67c93f8e6d3a0eee60d9ac48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43f9f72af3273da055eeb1e13ad6258a1b3b6b205402861beee9f87f02230297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38e3cc2eca559417c71beed561569c4dc5b45ffb8407976e0695fc1b93219d37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c24fbeedbc7c40b6079b7ad89d8564a4f2cdaad2cf996e66682b082382da190\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00e163cd374befc8b8c6cdfe3916235cd63280474c2a09eeebf108fb75d378a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00e163cd374befc8b8c6cdfe3916235cd63280474c2a09eeebf108fb75d378a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37376e093170ad4138e102e3a1e2c2b45b3944bf36b0d77dce126bd493d60803\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37376e093170ad4138e102e3a1e2c2b45b3944bf36b0d77dce126bd493d60803\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e4db8d673550f14f3614e47dc0bf81e0920753be13fb01c7ef0d6bd3716611f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4db8d673550f14f3614e47dc0bf81e0920753be13fb01c7ef0d6bd3716611f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:03Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:03 crc kubenswrapper[5004]: I1201 08:18:03.116928 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:03 crc kubenswrapper[5004]: I1201 08:18:03.116976 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:03 crc kubenswrapper[5004]: I1201 08:18:03.116998 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:03 crc kubenswrapper[5004]: I1201 08:18:03.117030 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:03 crc kubenswrapper[5004]: I1201 08:18:03.117053 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:03Z","lastTransitionTime":"2025-12-01T08:18:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:03 crc kubenswrapper[5004]: I1201 08:18:03.220271 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:03 crc kubenswrapper[5004]: I1201 08:18:03.220334 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:03 crc kubenswrapper[5004]: I1201 08:18:03.220352 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:03 crc kubenswrapper[5004]: I1201 08:18:03.220376 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:03 crc kubenswrapper[5004]: I1201 08:18:03.220394 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:03Z","lastTransitionTime":"2025-12-01T08:18:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:03 crc kubenswrapper[5004]: I1201 08:18:03.324515 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:03 crc kubenswrapper[5004]: I1201 08:18:03.324603 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:03 crc kubenswrapper[5004]: I1201 08:18:03.324617 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:03 crc kubenswrapper[5004]: I1201 08:18:03.324635 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:03 crc kubenswrapper[5004]: I1201 08:18:03.324646 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:03Z","lastTransitionTime":"2025-12-01T08:18:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:03 crc kubenswrapper[5004]: I1201 08:18:03.428106 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:03 crc kubenswrapper[5004]: I1201 08:18:03.428176 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:03 crc kubenswrapper[5004]: I1201 08:18:03.428195 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:03 crc kubenswrapper[5004]: I1201 08:18:03.428220 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:03 crc kubenswrapper[5004]: I1201 08:18:03.428237 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:03Z","lastTransitionTime":"2025-12-01T08:18:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:03 crc kubenswrapper[5004]: I1201 08:18:03.530763 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:03 crc kubenswrapper[5004]: I1201 08:18:03.530811 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:03 crc kubenswrapper[5004]: I1201 08:18:03.530823 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:03 crc kubenswrapper[5004]: I1201 08:18:03.530840 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:03 crc kubenswrapper[5004]: I1201 08:18:03.530850 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:03Z","lastTransitionTime":"2025-12-01T08:18:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:03 crc kubenswrapper[5004]: I1201 08:18:03.634186 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:03 crc kubenswrapper[5004]: I1201 08:18:03.634230 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:03 crc kubenswrapper[5004]: I1201 08:18:03.634244 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:03 crc kubenswrapper[5004]: I1201 08:18:03.634264 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:03 crc kubenswrapper[5004]: I1201 08:18:03.634277 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:03Z","lastTransitionTime":"2025-12-01T08:18:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:03 crc kubenswrapper[5004]: I1201 08:18:03.737124 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:03 crc kubenswrapper[5004]: I1201 08:18:03.737203 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:03 crc kubenswrapper[5004]: I1201 08:18:03.737228 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:03 crc kubenswrapper[5004]: I1201 08:18:03.737261 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:03 crc kubenswrapper[5004]: I1201 08:18:03.737284 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:03Z","lastTransitionTime":"2025-12-01T08:18:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:18:03 crc kubenswrapper[5004]: I1201 08:18:03.758828 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 08:18:03 crc kubenswrapper[5004]: I1201 08:18:03.758924 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 08:18:03 crc kubenswrapper[5004]: E1201 08:18:03.759043 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 08:18:03 crc kubenswrapper[5004]: E1201 08:18:03.759223 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 08:18:03 crc kubenswrapper[5004]: I1201 08:18:03.759360 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 08:18:03 crc kubenswrapper[5004]: E1201 08:18:03.759513 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 08:18:03 crc kubenswrapper[5004]: I1201 08:18:03.840897 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:03 crc kubenswrapper[5004]: I1201 08:18:03.841628 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:03 crc kubenswrapper[5004]: I1201 08:18:03.841777 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:03 crc kubenswrapper[5004]: I1201 08:18:03.841933 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:03 crc kubenswrapper[5004]: I1201 08:18:03.842072 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:03Z","lastTransitionTime":"2025-12-01T08:18:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:03 crc kubenswrapper[5004]: I1201 08:18:03.945744 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:03 crc kubenswrapper[5004]: I1201 08:18:03.945791 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:03 crc kubenswrapper[5004]: I1201 08:18:03.945806 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:03 crc kubenswrapper[5004]: I1201 08:18:03.945825 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:03 crc kubenswrapper[5004]: I1201 08:18:03.945840 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:03Z","lastTransitionTime":"2025-12-01T08:18:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:04 crc kubenswrapper[5004]: I1201 08:18:04.048407 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:04 crc kubenswrapper[5004]: I1201 08:18:04.048455 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:04 crc kubenswrapper[5004]: I1201 08:18:04.048472 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:04 crc kubenswrapper[5004]: I1201 08:18:04.048495 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:04 crc kubenswrapper[5004]: I1201 08:18:04.048513 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:04Z","lastTransitionTime":"2025-12-01T08:18:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:04 crc kubenswrapper[5004]: I1201 08:18:04.151240 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:04 crc kubenswrapper[5004]: I1201 08:18:04.151696 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:04 crc kubenswrapper[5004]: I1201 08:18:04.151808 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:04 crc kubenswrapper[5004]: I1201 08:18:04.151953 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:04 crc kubenswrapper[5004]: I1201 08:18:04.152099 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:04Z","lastTransitionTime":"2025-12-01T08:18:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:04 crc kubenswrapper[5004]: I1201 08:18:04.255052 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:04 crc kubenswrapper[5004]: I1201 08:18:04.255136 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:04 crc kubenswrapper[5004]: I1201 08:18:04.255160 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:04 crc kubenswrapper[5004]: I1201 08:18:04.255192 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:04 crc kubenswrapper[5004]: I1201 08:18:04.255214 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:04Z","lastTransitionTime":"2025-12-01T08:18:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:04 crc kubenswrapper[5004]: I1201 08:18:04.358081 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:04 crc kubenswrapper[5004]: I1201 08:18:04.358118 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:04 crc kubenswrapper[5004]: I1201 08:18:04.358128 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:04 crc kubenswrapper[5004]: I1201 08:18:04.358144 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:04 crc kubenswrapper[5004]: I1201 08:18:04.358155 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:04Z","lastTransitionTime":"2025-12-01T08:18:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:04 crc kubenswrapper[5004]: I1201 08:18:04.461385 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:04 crc kubenswrapper[5004]: I1201 08:18:04.461434 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:04 crc kubenswrapper[5004]: I1201 08:18:04.461450 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:04 crc kubenswrapper[5004]: I1201 08:18:04.461472 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:04 crc kubenswrapper[5004]: I1201 08:18:04.461488 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:04Z","lastTransitionTime":"2025-12-01T08:18:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:04 crc kubenswrapper[5004]: I1201 08:18:04.564434 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:04 crc kubenswrapper[5004]: I1201 08:18:04.564479 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:04 crc kubenswrapper[5004]: I1201 08:18:04.564495 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:04 crc kubenswrapper[5004]: I1201 08:18:04.564516 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:04 crc kubenswrapper[5004]: I1201 08:18:04.564533 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:04Z","lastTransitionTime":"2025-12-01T08:18:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:04 crc kubenswrapper[5004]: I1201 08:18:04.667266 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:04 crc kubenswrapper[5004]: I1201 08:18:04.667321 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:04 crc kubenswrapper[5004]: I1201 08:18:04.667365 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:04 crc kubenswrapper[5004]: I1201 08:18:04.667394 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:04 crc kubenswrapper[5004]: I1201 08:18:04.667445 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:04Z","lastTransitionTime":"2025-12-01T08:18:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:18:04 crc kubenswrapper[5004]: I1201 08:18:04.762666 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7cl5l" Dec 01 08:18:04 crc kubenswrapper[5004]: E1201 08:18:04.762873 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7cl5l" podUID="b488f4f3-d385-4d40-bdee-96d8fe2d42a1" Dec 01 08:18:04 crc kubenswrapper[5004]: I1201 08:18:04.771402 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:04 crc kubenswrapper[5004]: I1201 08:18:04.771456 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:04 crc kubenswrapper[5004]: I1201 08:18:04.771493 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:04 crc kubenswrapper[5004]: I1201 08:18:04.771520 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:04 crc kubenswrapper[5004]: I1201 08:18:04.771541 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:04Z","lastTransitionTime":"2025-12-01T08:18:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:04 crc kubenswrapper[5004]: I1201 08:18:04.874423 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:04 crc kubenswrapper[5004]: I1201 08:18:04.874476 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:04 crc kubenswrapper[5004]: I1201 08:18:04.874492 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:04 crc kubenswrapper[5004]: I1201 08:18:04.874514 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:04 crc kubenswrapper[5004]: I1201 08:18:04.874530 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:04Z","lastTransitionTime":"2025-12-01T08:18:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:04 crc kubenswrapper[5004]: I1201 08:18:04.977975 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:04 crc kubenswrapper[5004]: I1201 08:18:04.978036 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:04 crc kubenswrapper[5004]: I1201 08:18:04.978057 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:04 crc kubenswrapper[5004]: I1201 08:18:04.978083 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:04 crc kubenswrapper[5004]: I1201 08:18:04.978099 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:04Z","lastTransitionTime":"2025-12-01T08:18:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:05 crc kubenswrapper[5004]: I1201 08:18:05.081018 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:05 crc kubenswrapper[5004]: I1201 08:18:05.081064 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:05 crc kubenswrapper[5004]: I1201 08:18:05.081080 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:05 crc kubenswrapper[5004]: I1201 08:18:05.081100 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:05 crc kubenswrapper[5004]: I1201 08:18:05.081116 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:05Z","lastTransitionTime":"2025-12-01T08:18:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:05 crc kubenswrapper[5004]: I1201 08:18:05.182988 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:05 crc kubenswrapper[5004]: I1201 08:18:05.183023 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:05 crc kubenswrapper[5004]: I1201 08:18:05.183044 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:05 crc kubenswrapper[5004]: I1201 08:18:05.183059 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:05 crc kubenswrapper[5004]: I1201 08:18:05.183067 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:05Z","lastTransitionTime":"2025-12-01T08:18:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:05 crc kubenswrapper[5004]: I1201 08:18:05.285489 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:05 crc kubenswrapper[5004]: I1201 08:18:05.285847 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:05 crc kubenswrapper[5004]: I1201 08:18:05.285998 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:05 crc kubenswrapper[5004]: I1201 08:18:05.286154 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:05 crc kubenswrapper[5004]: I1201 08:18:05.286333 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:05Z","lastTransitionTime":"2025-12-01T08:18:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:05 crc kubenswrapper[5004]: I1201 08:18:05.389914 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:05 crc kubenswrapper[5004]: I1201 08:18:05.390841 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:05 crc kubenswrapper[5004]: I1201 08:18:05.391057 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:05 crc kubenswrapper[5004]: I1201 08:18:05.391250 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:05 crc kubenswrapper[5004]: I1201 08:18:05.391662 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:05Z","lastTransitionTime":"2025-12-01T08:18:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:05 crc kubenswrapper[5004]: I1201 08:18:05.487260 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:05 crc kubenswrapper[5004]: I1201 08:18:05.487651 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:05 crc kubenswrapper[5004]: I1201 08:18:05.487789 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:05 crc kubenswrapper[5004]: I1201 08:18:05.487936 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:05 crc kubenswrapper[5004]: I1201 08:18:05.488084 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:05Z","lastTransitionTime":"2025-12-01T08:18:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:05 crc kubenswrapper[5004]: E1201 08:18:05.509426 5004 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:18:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:18:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:18:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:18:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:18:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:18:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:18:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:18:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c8341267-df98-484f-a5e7-cf024fab437c\\\",\\\"systemUUID\\\":\\\"77547a65-c048-47ee-89b1-8422fc81b7aa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:05Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:05 crc kubenswrapper[5004]: I1201 08:18:05.515031 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:05 crc kubenswrapper[5004]: I1201 08:18:05.515087 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:05 crc kubenswrapper[5004]: I1201 08:18:05.515110 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:05 crc kubenswrapper[5004]: I1201 08:18:05.515139 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:05 crc kubenswrapper[5004]: I1201 08:18:05.515162 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:05Z","lastTransitionTime":"2025-12-01T08:18:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:05 crc kubenswrapper[5004]: E1201 08:18:05.539190 5004 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:18:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:18:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:18:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:18:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:18:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:18:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:18:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:18:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c8341267-df98-484f-a5e7-cf024fab437c\\\",\\\"systemUUID\\\":\\\"77547a65-c048-47ee-89b1-8422fc81b7aa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:05Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:05 crc kubenswrapper[5004]: I1201 08:18:05.544834 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:05 crc kubenswrapper[5004]: I1201 08:18:05.544950 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:05 crc kubenswrapper[5004]: I1201 08:18:05.545030 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:05 crc kubenswrapper[5004]: I1201 08:18:05.545133 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:05 crc kubenswrapper[5004]: I1201 08:18:05.545217 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:05Z","lastTransitionTime":"2025-12-01T08:18:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:05 crc kubenswrapper[5004]: E1201 08:18:05.561397 5004 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:18:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:18:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:18:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:18:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:18:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:18:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:18:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:18:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c8341267-df98-484f-a5e7-cf024fab437c\\\",\\\"systemUUID\\\":\\\"77547a65-c048-47ee-89b1-8422fc81b7aa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:05Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:05 crc kubenswrapper[5004]: I1201 08:18:05.570206 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:05 crc kubenswrapper[5004]: I1201 08:18:05.570253 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:05 crc kubenswrapper[5004]: I1201 08:18:05.570268 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:05 crc kubenswrapper[5004]: I1201 08:18:05.570287 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:05 crc kubenswrapper[5004]: I1201 08:18:05.570304 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:05Z","lastTransitionTime":"2025-12-01T08:18:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:05 crc kubenswrapper[5004]: E1201 08:18:05.593910 5004 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:18:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:18:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:18:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:18:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:18:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:18:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:18:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:18:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c8341267-df98-484f-a5e7-cf024fab437c\\\",\\\"systemUUID\\\":\\\"77547a65-c048-47ee-89b1-8422fc81b7aa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:05Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:05 crc kubenswrapper[5004]: I1201 08:18:05.599023 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:05 crc kubenswrapper[5004]: I1201 08:18:05.599078 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:05 crc kubenswrapper[5004]: I1201 08:18:05.599096 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:05 crc kubenswrapper[5004]: I1201 08:18:05.599119 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:05 crc kubenswrapper[5004]: I1201 08:18:05.599139 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:05Z","lastTransitionTime":"2025-12-01T08:18:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:05 crc kubenswrapper[5004]: E1201 08:18:05.618297 5004 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:18:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:18:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:18:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:18:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:18:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:18:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:18:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:18:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c8341267-df98-484f-a5e7-cf024fab437c\\\",\\\"systemUUID\\\":\\\"77547a65-c048-47ee-89b1-8422fc81b7aa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:05Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:05 crc kubenswrapper[5004]: E1201 08:18:05.618531 5004 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 01 08:18:05 crc kubenswrapper[5004]: I1201 08:18:05.621170 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:05 crc kubenswrapper[5004]: I1201 08:18:05.621222 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:05 crc kubenswrapper[5004]: I1201 08:18:05.621241 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:05 crc kubenswrapper[5004]: I1201 08:18:05.621267 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:05 crc kubenswrapper[5004]: I1201 08:18:05.621286 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:05Z","lastTransitionTime":"2025-12-01T08:18:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:05 crc kubenswrapper[5004]: I1201 08:18:05.723581 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:05 crc kubenswrapper[5004]: I1201 08:18:05.723623 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:05 crc kubenswrapper[5004]: I1201 08:18:05.723635 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:05 crc kubenswrapper[5004]: I1201 08:18:05.723655 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:05 crc kubenswrapper[5004]: I1201 08:18:05.723668 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:05Z","lastTransitionTime":"2025-12-01T08:18:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:18:05 crc kubenswrapper[5004]: I1201 08:18:05.758322 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 08:18:05 crc kubenswrapper[5004]: I1201 08:18:05.758369 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 08:18:05 crc kubenswrapper[5004]: E1201 08:18:05.758441 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 08:18:05 crc kubenswrapper[5004]: E1201 08:18:05.758538 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 08:18:05 crc kubenswrapper[5004]: I1201 08:18:05.758676 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 08:18:05 crc kubenswrapper[5004]: E1201 08:18:05.758788 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 08:18:05 crc kubenswrapper[5004]: I1201 08:18:05.826375 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:05 crc kubenswrapper[5004]: I1201 08:18:05.826406 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:05 crc kubenswrapper[5004]: I1201 08:18:05.826415 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:05 crc kubenswrapper[5004]: I1201 08:18:05.826426 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:05 crc kubenswrapper[5004]: I1201 08:18:05.826435 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:05Z","lastTransitionTime":"2025-12-01T08:18:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:05 crc kubenswrapper[5004]: I1201 08:18:05.929356 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:05 crc kubenswrapper[5004]: I1201 08:18:05.929387 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:05 crc kubenswrapper[5004]: I1201 08:18:05.929395 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:05 crc kubenswrapper[5004]: I1201 08:18:05.929408 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:05 crc kubenswrapper[5004]: I1201 08:18:05.929417 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:05Z","lastTransitionTime":"2025-12-01T08:18:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:06 crc kubenswrapper[5004]: I1201 08:18:06.032183 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:06 crc kubenswrapper[5004]: I1201 08:18:06.032210 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:06 crc kubenswrapper[5004]: I1201 08:18:06.032219 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:06 crc kubenswrapper[5004]: I1201 08:18:06.032231 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:06 crc kubenswrapper[5004]: I1201 08:18:06.032241 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:06Z","lastTransitionTime":"2025-12-01T08:18:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:06 crc kubenswrapper[5004]: I1201 08:18:06.134773 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:06 crc kubenswrapper[5004]: I1201 08:18:06.134816 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:06 crc kubenswrapper[5004]: I1201 08:18:06.134828 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:06 crc kubenswrapper[5004]: I1201 08:18:06.134844 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:06 crc kubenswrapper[5004]: I1201 08:18:06.134857 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:06Z","lastTransitionTime":"2025-12-01T08:18:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:06 crc kubenswrapper[5004]: I1201 08:18:06.237900 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:06 crc kubenswrapper[5004]: I1201 08:18:06.237960 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:06 crc kubenswrapper[5004]: I1201 08:18:06.237979 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:06 crc kubenswrapper[5004]: I1201 08:18:06.238003 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:06 crc kubenswrapper[5004]: I1201 08:18:06.238022 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:06Z","lastTransitionTime":"2025-12-01T08:18:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:06 crc kubenswrapper[5004]: I1201 08:18:06.340952 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:06 crc kubenswrapper[5004]: I1201 08:18:06.340994 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:06 crc kubenswrapper[5004]: I1201 08:18:06.341006 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:06 crc kubenswrapper[5004]: I1201 08:18:06.341022 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:06 crc kubenswrapper[5004]: I1201 08:18:06.341033 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:06Z","lastTransitionTime":"2025-12-01T08:18:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:06 crc kubenswrapper[5004]: I1201 08:18:06.443450 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:06 crc kubenswrapper[5004]: I1201 08:18:06.443503 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:06 crc kubenswrapper[5004]: I1201 08:18:06.443522 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:06 crc kubenswrapper[5004]: I1201 08:18:06.443544 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:06 crc kubenswrapper[5004]: I1201 08:18:06.443556 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:06Z","lastTransitionTime":"2025-12-01T08:18:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:06 crc kubenswrapper[5004]: I1201 08:18:06.545728 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:06 crc kubenswrapper[5004]: I1201 08:18:06.545778 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:06 crc kubenswrapper[5004]: I1201 08:18:06.545789 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:06 crc kubenswrapper[5004]: I1201 08:18:06.545804 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:06 crc kubenswrapper[5004]: I1201 08:18:06.545814 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:06Z","lastTransitionTime":"2025-12-01T08:18:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:06 crc kubenswrapper[5004]: I1201 08:18:06.649190 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:06 crc kubenswrapper[5004]: I1201 08:18:06.649248 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:06 crc kubenswrapper[5004]: I1201 08:18:06.649264 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:06 crc kubenswrapper[5004]: I1201 08:18:06.649291 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:06 crc kubenswrapper[5004]: I1201 08:18:06.649313 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:06Z","lastTransitionTime":"2025-12-01T08:18:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:06 crc kubenswrapper[5004]: I1201 08:18:06.752066 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:06 crc kubenswrapper[5004]: I1201 08:18:06.752138 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:06 crc kubenswrapper[5004]: I1201 08:18:06.752156 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:06 crc kubenswrapper[5004]: I1201 08:18:06.752637 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:06 crc kubenswrapper[5004]: I1201 08:18:06.752702 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:06Z","lastTransitionTime":"2025-12-01T08:18:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:18:06 crc kubenswrapper[5004]: I1201 08:18:06.759017 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7cl5l" Dec 01 08:18:06 crc kubenswrapper[5004]: E1201 08:18:06.759187 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7cl5l" podUID="b488f4f3-d385-4d40-bdee-96d8fe2d42a1" Dec 01 08:18:06 crc kubenswrapper[5004]: I1201 08:18:06.759844 5004 scope.go:117] "RemoveContainer" containerID="76c127e5ff47d030a709c79b7c7f2c2d3b32009052ff95ff522d90ecd86cf993" Dec 01 08:18:06 crc kubenswrapper[5004]: E1201 08:18:06.760115 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-knmdv_openshift-ovn-kubernetes(15cdec0a-5925-4966-a30b-f60c503f633e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" podUID="15cdec0a-5925-4966-a30b-f60c503f633e" Dec 01 08:18:06 crc kubenswrapper[5004]: I1201 08:18:06.855595 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:06 crc kubenswrapper[5004]: I1201 08:18:06.855651 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:06 crc kubenswrapper[5004]: I1201 08:18:06.855668 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:06 crc kubenswrapper[5004]: I1201 08:18:06.855689 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:06 crc kubenswrapper[5004]: I1201 08:18:06.855706 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:06Z","lastTransitionTime":"2025-12-01T08:18:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:06 crc kubenswrapper[5004]: I1201 08:18:06.960764 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:06 crc kubenswrapper[5004]: I1201 08:18:06.960823 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:06 crc kubenswrapper[5004]: I1201 08:18:06.960841 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:06 crc kubenswrapper[5004]: I1201 08:18:06.960865 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:06 crc kubenswrapper[5004]: I1201 08:18:06.960883 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:06Z","lastTransitionTime":"2025-12-01T08:18:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:07 crc kubenswrapper[5004]: I1201 08:18:07.063814 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:07 crc kubenswrapper[5004]: I1201 08:18:07.063881 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:07 crc kubenswrapper[5004]: I1201 08:18:07.063899 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:07 crc kubenswrapper[5004]: I1201 08:18:07.063922 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:07 crc kubenswrapper[5004]: I1201 08:18:07.063940 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:07Z","lastTransitionTime":"2025-12-01T08:18:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:07 crc kubenswrapper[5004]: I1201 08:18:07.166770 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:07 crc kubenswrapper[5004]: I1201 08:18:07.167168 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:07 crc kubenswrapper[5004]: I1201 08:18:07.167311 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:07 crc kubenswrapper[5004]: I1201 08:18:07.167470 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:07 crc kubenswrapper[5004]: I1201 08:18:07.167643 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:07Z","lastTransitionTime":"2025-12-01T08:18:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:07 crc kubenswrapper[5004]: I1201 08:18:07.270849 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:07 crc kubenswrapper[5004]: I1201 08:18:07.270912 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:07 crc kubenswrapper[5004]: I1201 08:18:07.270926 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:07 crc kubenswrapper[5004]: I1201 08:18:07.270950 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:07 crc kubenswrapper[5004]: I1201 08:18:07.270968 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:07Z","lastTransitionTime":"2025-12-01T08:18:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:07 crc kubenswrapper[5004]: I1201 08:18:07.373639 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:07 crc kubenswrapper[5004]: I1201 08:18:07.373703 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:07 crc kubenswrapper[5004]: I1201 08:18:07.373716 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:07 crc kubenswrapper[5004]: I1201 08:18:07.373742 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:07 crc kubenswrapper[5004]: I1201 08:18:07.373784 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:07Z","lastTransitionTime":"2025-12-01T08:18:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:07 crc kubenswrapper[5004]: I1201 08:18:07.476675 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:07 crc kubenswrapper[5004]: I1201 08:18:07.476730 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:07 crc kubenswrapper[5004]: I1201 08:18:07.476740 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:07 crc kubenswrapper[5004]: I1201 08:18:07.476756 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:07 crc kubenswrapper[5004]: I1201 08:18:07.476768 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:07Z","lastTransitionTime":"2025-12-01T08:18:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:07 crc kubenswrapper[5004]: I1201 08:18:07.579003 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:07 crc kubenswrapper[5004]: I1201 08:18:07.579050 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:07 crc kubenswrapper[5004]: I1201 08:18:07.579060 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:07 crc kubenswrapper[5004]: I1201 08:18:07.579074 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:07 crc kubenswrapper[5004]: I1201 08:18:07.579084 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:07Z","lastTransitionTime":"2025-12-01T08:18:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:07 crc kubenswrapper[5004]: I1201 08:18:07.682144 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:07 crc kubenswrapper[5004]: I1201 08:18:07.682213 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:07 crc kubenswrapper[5004]: I1201 08:18:07.682231 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:07 crc kubenswrapper[5004]: I1201 08:18:07.682258 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:07 crc kubenswrapper[5004]: I1201 08:18:07.682275 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:07Z","lastTransitionTime":"2025-12-01T08:18:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:18:07 crc kubenswrapper[5004]: I1201 08:18:07.758189 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 08:18:07 crc kubenswrapper[5004]: I1201 08:18:07.758189 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 08:18:07 crc kubenswrapper[5004]: E1201 08:18:07.758431 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 08:18:07 crc kubenswrapper[5004]: I1201 08:18:07.758451 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 08:18:07 crc kubenswrapper[5004]: E1201 08:18:07.758595 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 08:18:07 crc kubenswrapper[5004]: E1201 08:18:07.758869 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 08:18:07 crc kubenswrapper[5004]: I1201 08:18:07.785111 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:07 crc kubenswrapper[5004]: I1201 08:18:07.785151 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:07 crc kubenswrapper[5004]: I1201 08:18:07.785163 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:07 crc kubenswrapper[5004]: I1201 08:18:07.785177 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:07 crc kubenswrapper[5004]: I1201 08:18:07.785189 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:07Z","lastTransitionTime":"2025-12-01T08:18:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:07 crc kubenswrapper[5004]: I1201 08:18:07.887571 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:07 crc kubenswrapper[5004]: I1201 08:18:07.887615 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:07 crc kubenswrapper[5004]: I1201 08:18:07.887626 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:07 crc kubenswrapper[5004]: I1201 08:18:07.887642 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:07 crc kubenswrapper[5004]: I1201 08:18:07.887654 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:07Z","lastTransitionTime":"2025-12-01T08:18:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:07 crc kubenswrapper[5004]: I1201 08:18:07.990078 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:07 crc kubenswrapper[5004]: I1201 08:18:07.990126 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:07 crc kubenswrapper[5004]: I1201 08:18:07.990139 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:07 crc kubenswrapper[5004]: I1201 08:18:07.990155 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:07 crc kubenswrapper[5004]: I1201 08:18:07.990167 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:07Z","lastTransitionTime":"2025-12-01T08:18:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:08 crc kubenswrapper[5004]: I1201 08:18:08.092576 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:08 crc kubenswrapper[5004]: I1201 08:18:08.092609 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:08 crc kubenswrapper[5004]: I1201 08:18:08.092618 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:08 crc kubenswrapper[5004]: I1201 08:18:08.092633 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:08 crc kubenswrapper[5004]: I1201 08:18:08.092642 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:08Z","lastTransitionTime":"2025-12-01T08:18:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:08 crc kubenswrapper[5004]: I1201 08:18:08.194984 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:08 crc kubenswrapper[5004]: I1201 08:18:08.195044 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:08 crc kubenswrapper[5004]: I1201 08:18:08.195056 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:08 crc kubenswrapper[5004]: I1201 08:18:08.195074 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:08 crc kubenswrapper[5004]: I1201 08:18:08.195085 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:08Z","lastTransitionTime":"2025-12-01T08:18:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:08 crc kubenswrapper[5004]: I1201 08:18:08.296553 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:08 crc kubenswrapper[5004]: I1201 08:18:08.296612 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:08 crc kubenswrapper[5004]: I1201 08:18:08.296623 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:08 crc kubenswrapper[5004]: I1201 08:18:08.296640 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:08 crc kubenswrapper[5004]: I1201 08:18:08.296652 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:08Z","lastTransitionTime":"2025-12-01T08:18:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:08 crc kubenswrapper[5004]: I1201 08:18:08.399683 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:08 crc kubenswrapper[5004]: I1201 08:18:08.399724 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:08 crc kubenswrapper[5004]: I1201 08:18:08.399733 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:08 crc kubenswrapper[5004]: I1201 08:18:08.399748 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:08 crc kubenswrapper[5004]: I1201 08:18:08.399758 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:08Z","lastTransitionTime":"2025-12-01T08:18:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:08 crc kubenswrapper[5004]: I1201 08:18:08.502013 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:08 crc kubenswrapper[5004]: I1201 08:18:08.502052 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:08 crc kubenswrapper[5004]: I1201 08:18:08.502060 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:08 crc kubenswrapper[5004]: I1201 08:18:08.502074 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:08 crc kubenswrapper[5004]: I1201 08:18:08.502084 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:08Z","lastTransitionTime":"2025-12-01T08:18:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:08 crc kubenswrapper[5004]: I1201 08:18:08.605302 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:08 crc kubenswrapper[5004]: I1201 08:18:08.605362 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:08 crc kubenswrapper[5004]: I1201 08:18:08.605382 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:08 crc kubenswrapper[5004]: I1201 08:18:08.605405 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:08 crc kubenswrapper[5004]: I1201 08:18:08.605424 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:08Z","lastTransitionTime":"2025-12-01T08:18:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:08 crc kubenswrapper[5004]: I1201 08:18:08.708601 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:08 crc kubenswrapper[5004]: I1201 08:18:08.708657 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:08 crc kubenswrapper[5004]: I1201 08:18:08.708678 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:08 crc kubenswrapper[5004]: I1201 08:18:08.708705 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:08 crc kubenswrapper[5004]: I1201 08:18:08.708726 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:08Z","lastTransitionTime":"2025-12-01T08:18:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:18:08 crc kubenswrapper[5004]: I1201 08:18:08.758257 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7cl5l" Dec 01 08:18:08 crc kubenswrapper[5004]: E1201 08:18:08.758416 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7cl5l" podUID="b488f4f3-d385-4d40-bdee-96d8fe2d42a1" Dec 01 08:18:08 crc kubenswrapper[5004]: I1201 08:18:08.810987 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:08 crc kubenswrapper[5004]: I1201 08:18:08.811019 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:08 crc kubenswrapper[5004]: I1201 08:18:08.811030 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:08 crc kubenswrapper[5004]: I1201 08:18:08.811044 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:08 crc kubenswrapper[5004]: I1201 08:18:08.811054 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:08Z","lastTransitionTime":"2025-12-01T08:18:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:08 crc kubenswrapper[5004]: I1201 08:18:08.913502 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:08 crc kubenswrapper[5004]: I1201 08:18:08.913554 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:08 crc kubenswrapper[5004]: I1201 08:18:08.913599 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:08 crc kubenswrapper[5004]: I1201 08:18:08.913622 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:08 crc kubenswrapper[5004]: I1201 08:18:08.913641 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:08Z","lastTransitionTime":"2025-12-01T08:18:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:09 crc kubenswrapper[5004]: I1201 08:18:09.017045 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:09 crc kubenswrapper[5004]: I1201 08:18:09.017092 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:09 crc kubenswrapper[5004]: I1201 08:18:09.017103 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:09 crc kubenswrapper[5004]: I1201 08:18:09.017124 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:09 crc kubenswrapper[5004]: I1201 08:18:09.017137 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:09Z","lastTransitionTime":"2025-12-01T08:18:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:09 crc kubenswrapper[5004]: I1201 08:18:09.029664 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b488f4f3-d385-4d40-bdee-96d8fe2d42a1-metrics-certs\") pod \"network-metrics-daemon-7cl5l\" (UID: \"b488f4f3-d385-4d40-bdee-96d8fe2d42a1\") " pod="openshift-multus/network-metrics-daemon-7cl5l" Dec 01 08:18:09 crc kubenswrapper[5004]: E1201 08:18:09.029826 5004 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 08:18:09 crc kubenswrapper[5004]: E1201 08:18:09.029885 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b488f4f3-d385-4d40-bdee-96d8fe2d42a1-metrics-certs podName:b488f4f3-d385-4d40-bdee-96d8fe2d42a1 nodeName:}" failed. No retries permitted until 2025-12-01 08:18:41.0298664 +0000 UTC m=+98.594858382 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b488f4f3-d385-4d40-bdee-96d8fe2d42a1-metrics-certs") pod "network-metrics-daemon-7cl5l" (UID: "b488f4f3-d385-4d40-bdee-96d8fe2d42a1") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 08:18:09 crc kubenswrapper[5004]: I1201 08:18:09.119199 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:09 crc kubenswrapper[5004]: I1201 08:18:09.119258 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:09 crc kubenswrapper[5004]: I1201 08:18:09.119275 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:09 crc kubenswrapper[5004]: I1201 08:18:09.119299 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:09 crc kubenswrapper[5004]: I1201 08:18:09.119318 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:09Z","lastTransitionTime":"2025-12-01T08:18:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:09 crc kubenswrapper[5004]: I1201 08:18:09.222076 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:09 crc kubenswrapper[5004]: I1201 08:18:09.222124 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:09 crc kubenswrapper[5004]: I1201 08:18:09.222140 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:09 crc kubenswrapper[5004]: I1201 08:18:09.222165 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:09 crc kubenswrapper[5004]: I1201 08:18:09.222183 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:09Z","lastTransitionTime":"2025-12-01T08:18:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:09 crc kubenswrapper[5004]: I1201 08:18:09.323913 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:09 crc kubenswrapper[5004]: I1201 08:18:09.323963 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:09 crc kubenswrapper[5004]: I1201 08:18:09.323974 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:09 crc kubenswrapper[5004]: I1201 08:18:09.323995 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:09 crc kubenswrapper[5004]: I1201 08:18:09.324009 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:09Z","lastTransitionTime":"2025-12-01T08:18:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:09 crc kubenswrapper[5004]: I1201 08:18:09.426255 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:09 crc kubenswrapper[5004]: I1201 08:18:09.426291 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:09 crc kubenswrapper[5004]: I1201 08:18:09.426301 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:09 crc kubenswrapper[5004]: I1201 08:18:09.426318 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:09 crc kubenswrapper[5004]: I1201 08:18:09.426327 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:09Z","lastTransitionTime":"2025-12-01T08:18:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:09 crc kubenswrapper[5004]: I1201 08:18:09.529279 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:09 crc kubenswrapper[5004]: I1201 08:18:09.529310 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:09 crc kubenswrapper[5004]: I1201 08:18:09.529322 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:09 crc kubenswrapper[5004]: I1201 08:18:09.529337 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:09 crc kubenswrapper[5004]: I1201 08:18:09.529348 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:09Z","lastTransitionTime":"2025-12-01T08:18:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:09 crc kubenswrapper[5004]: I1201 08:18:09.632219 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:09 crc kubenswrapper[5004]: I1201 08:18:09.632256 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:09 crc kubenswrapper[5004]: I1201 08:18:09.632271 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:09 crc kubenswrapper[5004]: I1201 08:18:09.632285 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:09 crc kubenswrapper[5004]: I1201 08:18:09.632297 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:09Z","lastTransitionTime":"2025-12-01T08:18:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:09 crc kubenswrapper[5004]: I1201 08:18:09.734304 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:09 crc kubenswrapper[5004]: I1201 08:18:09.734332 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:09 crc kubenswrapper[5004]: I1201 08:18:09.734340 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:09 crc kubenswrapper[5004]: I1201 08:18:09.734355 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:09 crc kubenswrapper[5004]: I1201 08:18:09.734365 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:09Z","lastTransitionTime":"2025-12-01T08:18:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:18:09 crc kubenswrapper[5004]: I1201 08:18:09.757891 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 08:18:09 crc kubenswrapper[5004]: I1201 08:18:09.757937 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 08:18:09 crc kubenswrapper[5004]: E1201 08:18:09.757985 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 08:18:09 crc kubenswrapper[5004]: I1201 08:18:09.758021 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 08:18:09 crc kubenswrapper[5004]: E1201 08:18:09.758165 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 08:18:09 crc kubenswrapper[5004]: E1201 08:18:09.758413 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 08:18:09 crc kubenswrapper[5004]: I1201 08:18:09.837930 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:09 crc kubenswrapper[5004]: I1201 08:18:09.837974 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:09 crc kubenswrapper[5004]: I1201 08:18:09.837986 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:09 crc kubenswrapper[5004]: I1201 08:18:09.838002 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:09 crc kubenswrapper[5004]: I1201 08:18:09.838012 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:09Z","lastTransitionTime":"2025-12-01T08:18:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:09 crc kubenswrapper[5004]: I1201 08:18:09.940073 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:09 crc kubenswrapper[5004]: I1201 08:18:09.940106 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:09 crc kubenswrapper[5004]: I1201 08:18:09.940118 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:09 crc kubenswrapper[5004]: I1201 08:18:09.940133 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:09 crc kubenswrapper[5004]: I1201 08:18:09.940143 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:09Z","lastTransitionTime":"2025-12-01T08:18:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:10 crc kubenswrapper[5004]: I1201 08:18:10.046922 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:10 crc kubenswrapper[5004]: I1201 08:18:10.047016 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:10 crc kubenswrapper[5004]: I1201 08:18:10.047047 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:10 crc kubenswrapper[5004]: I1201 08:18:10.047082 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:10 crc kubenswrapper[5004]: I1201 08:18:10.047118 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:10Z","lastTransitionTime":"2025-12-01T08:18:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:10 crc kubenswrapper[5004]: I1201 08:18:10.149534 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:10 crc kubenswrapper[5004]: I1201 08:18:10.149592 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:10 crc kubenswrapper[5004]: I1201 08:18:10.149603 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:10 crc kubenswrapper[5004]: I1201 08:18:10.149616 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:10 crc kubenswrapper[5004]: I1201 08:18:10.149624 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:10Z","lastTransitionTime":"2025-12-01T08:18:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:10 crc kubenswrapper[5004]: I1201 08:18:10.251838 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:10 crc kubenswrapper[5004]: I1201 08:18:10.251862 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:10 crc kubenswrapper[5004]: I1201 08:18:10.251870 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:10 crc kubenswrapper[5004]: I1201 08:18:10.251882 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:10 crc kubenswrapper[5004]: I1201 08:18:10.251891 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:10Z","lastTransitionTime":"2025-12-01T08:18:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:10 crc kubenswrapper[5004]: I1201 08:18:10.287151 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zjksw_70e79009-93be-49c4-a6b3-e8a06bcea7f4/kube-multus/0.log" Dec 01 08:18:10 crc kubenswrapper[5004]: I1201 08:18:10.287199 5004 generic.go:334] "Generic (PLEG): container finished" podID="70e79009-93be-49c4-a6b3-e8a06bcea7f4" containerID="ad3ea204ad731500a340e639f0e36abe28ba50464f1b2ff9b009e39c234cf708" exitCode=1 Dec 01 08:18:10 crc kubenswrapper[5004]: I1201 08:18:10.287224 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zjksw" event={"ID":"70e79009-93be-49c4-a6b3-e8a06bcea7f4","Type":"ContainerDied","Data":"ad3ea204ad731500a340e639f0e36abe28ba50464f1b2ff9b009e39c234cf708"} Dec 01 08:18:10 crc kubenswrapper[5004]: I1201 08:18:10.287570 5004 scope.go:117] "RemoveContainer" containerID="ad3ea204ad731500a340e639f0e36abe28ba50464f1b2ff9b009e39c234cf708" Dec 01 08:18:10 crc kubenswrapper[5004]: I1201 08:18:10.305556 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:10Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:10 crc kubenswrapper[5004]: I1201 08:18:10.323746 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:10Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:10 crc kubenswrapper[5004]: I1201 08:18:10.342875 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03c470edd984db3b756e031e33f64a9bc5ca6b9c156ab479e5cd647745655a5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T08:18:10Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:10 crc kubenswrapper[5004]: I1201 08:18:10.354856 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:10 crc kubenswrapper[5004]: I1201 08:18:10.354898 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:10 crc kubenswrapper[5004]: I1201 08:18:10.354910 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:10 crc kubenswrapper[5004]: I1201 08:18:10.354927 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:10 crc kubenswrapper[5004]: I1201 08:18:10.354938 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:10Z","lastTransitionTime":"2025-12-01T08:18:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:10 crc kubenswrapper[5004]: I1201 08:18:10.365350 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fd2f2b5d7dc3d7b8e1bb06fdf0eec3addfec3c230e044b6395d6cf535630ebd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b2250161e400b201f0528e33231c9007342002a37cd037a1b1e011f2aaaa22\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:10Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:10 crc kubenswrapper[5004]: I1201 08:18:10.381714 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zjksw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"70e79009-93be-49c4-a6b3-e8a06bcea7f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:18:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:18:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad3ea204ad731500a340e639f0e36abe28ba50464f1b2ff9b009e39c234cf708\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad3ea204ad731500a340e639f0e36abe28ba50464f1b2ff9b009e39c234cf708\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T08:18:10Z\\\",\\\"message\\\":\\\"2025-12-01T08:17:24+00:00 [cnibincopy] Successfully copied files in 
/usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_50c6dd0c-b888-41cb-a4f6-d1199cf62aeb\\\\n2025-12-01T08:17:24+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_50c6dd0c-b888-41cb-a4f6-d1199cf62aeb to /host/opt/cni/bin/\\\\n2025-12-01T08:17:25Z [verbose] multus-daemon started\\\\n2025-12-01T08:17:25Z [verbose] Readiness Indicator file check\\\\n2025-12-01T08:18:10Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/c
ni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd8m6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zjksw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:10Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:10 crc kubenswrapper[5004]: I1201 08:18:10.409110 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15cdec0a-5925-4966-a30b-f60c503f633e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f528b94ff12c7a804930dca4b93c23334a9ae3a53317750cd34b5695f08b4582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f001b70a83da28b1ea78a94f1cd70dfc63b3de5a21b041932d33c2ac970c986b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7c5c043438b515e37a6d6d5c47b92479bbb7322fe35c3a1bc57ee7b3a746544\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23df33e45cc27e4d343ae6ec3fb88f2dfc125ba7da424515d4bee5ec19281832\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4a7cab8e1594814a687b1433a92642a19eff7c2eb12f96fa06f08a2e270733d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1844a18a6109c25a25b9c8cb3ff65e69cc657f66be03dd97e883d24a774b7638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76c127e5ff47d030a709c79b7c7f2c2d3b32009052ff95ff522d90ecd86cf993\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76c127e5ff47d030a709c79b7c7f2c2d3b32009052ff95ff522d90ecd86cf993\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T08:17:52Z\\\",\\\"message\\\":\\\"}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, 
Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1201 08:17:52.744847 6628 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-marketplace/community-operators]} name:Service_openshift-marketplace/community-operators_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.189:50051:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d389393c-7ba9-422c-b3f5-06e391d537d2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1201 08:17:52.746408 6628 model_client.go:382] Update operations generated as: [{Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.59 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {dce28c51-c9f1-478b-97c8-7e209d6e7cbe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1201 08:17:52.746269 6628 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:51Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-knmdv_openshift-ovn-kubernetes(15cdec0a-5925-4966-a30b-f60c503f633e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f231e0092e1a85cc5d6e00fb8d3bd982223b670191ac3dc21d81ff1fab462a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97c9fab31c2655b5d579a2ed9b837229ee678973643c6a526c1241e3c84e315c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97c9fab31c2655b5d5
79a2ed9b837229ee678973643c6a526c1241e3c84e315c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-knmdv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:10Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:10 crc kubenswrapper[5004]: I1201 08:18:10.426053 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://327477e7db10129d47ddf9078f32d0df2ffd4b8e59fc7bb77c17c60c63e9a936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:10Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:10 crc kubenswrapper[5004]: I1201 08:18:10.442009 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mzsvz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"397b51b7-934a-41d1-a593-500a64161bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4adac5633dfe89777bf019818bab9ee3a208cad9d929c96cf2cb86b18c2d4264\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plan
e-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4h49g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d83844478080e4829975fb6c8e0444d9fdebd44b08afcf45e7d0b04fc534a844\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4h49g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mzsvz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:10Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:10 crc kubenswrapper[5004]: I1201 08:18:10.453846 5004 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-dns/node-resolver-jjms6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8828af41-beeb-47dd-96cf-3dbcb5175893\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b22220849cdb91592a44a67f9f6a09586146009483f46a0505a2dea3b02bccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvsch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.
11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jjms6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:10Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:10 crc kubenswrapper[5004]: I1201 08:18:10.457423 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:10 crc kubenswrapper[5004]: I1201 08:18:10.457473 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:10 crc kubenswrapper[5004]: I1201 08:18:10.457485 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:10 crc kubenswrapper[5004]: I1201 08:18:10.457507 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:10 crc kubenswrapper[5004]: I1201 08:18:10.457525 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:10Z","lastTransitionTime":"2025-12-01T08:18:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:10 crc kubenswrapper[5004]: I1201 08:18:10.467758 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9977ebb-82de-4e96-8763-0b5a84f8d4ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://baf84c2eea7167eed95a4195d37bf784ffef310bc2079d8d06e1f47cb45a7864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pwgxq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c1e51fa92aec80e93d0fce11c743b8c4fde23fc23d8626e8d5b3e25ab42350d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pwgxq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fvdgt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:10Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:10 crc kubenswrapper[5004]: I1201 08:18:10.478716 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7cl5l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b488f4f3-d385-4d40-bdee-96d8fe2d42a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-glbhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-glbhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:37Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7cl5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:10Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:10 crc 
kubenswrapper[5004]: I1201 08:18:10.496932 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43c9f762-d403-49b9-8294-d66976aca4dd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b358f641ff09bc8f4a452b0d4a8cf5f9790182f76779be31426d48d0278b0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://d610c3a799d4a7df9ddfe2a28f2ced2e5ac4e96c67c93f8e6d3a0eee60d9ac48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43f9f72af3273da055eeb1e13ad6258a1b3b6b205402861beee9f87f02230297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38e3cc2eca559417c71beed561569c4dc5b45ffb8407976e0695fc1b93219d37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c24fbeedbc7c40b6079b7ad89d8564a4f2cdaad2cf996e66682b082382da190\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00e163cd374befc8b8c6cdfe3916235cd63280474c2a09eeebf108fb75d378a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00e163cd374befc8b8c6cdfe3916235cd63280474c2a09eeebf108fb75d378a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37376e093170ad4138e102e3a1e2c2b45b3944bf36b0d77dce126bd493d60803\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37376e093170ad4138e102e3a1e2c2b45b3944bf36b0d77dce126bd493d60803\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e4db8d673550f14f3614e47dc0bf81e0920753be13fb01c7ef0d6bd3716611f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4db8d673550f14f3614e47dc0bf81e0920753be13fb01c7ef0d6bd3716611f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:10Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:10 crc kubenswrapper[5004]: I1201 08:18:10.510450 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:10Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:10 crc kubenswrapper[5004]: I1201 08:18:10.521382 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ww6lq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72500ffd-4ca3-4614-a3a2-bbdc5a7506c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://668cfaa2af20e2bb082fc47e1702cfd9f704c4fdf56a4d27cf25d6915e7cd18a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmsvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ww6lq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:10Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:10 crc kubenswrapper[5004]: I1201 08:18:10.534433 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2fa377e-a3a6-46ce-af23-e677799f0115\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b520771cf1ad6f9360064c5f304243c5878a2f1025c87d1001c97b2d5f95cb9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b65b9bc444101900c62fd90c7d45789d186a89148ac0fd9c7f0c6048c3f55ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edbd3e682d207b8c18d6bcbc26aa2adae2a35645502d31b327ffddd51991e842\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4563974ed1467a87aab104756828e63853c282a61c4fcab1e757eb4da9afaa40\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:10Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:10 crc kubenswrapper[5004]: I1201 08:18:10.549030 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dpkxw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddcaf111-708a-45b3-a342-effd3061ab17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68ecba515f05ca83fdd0cdda10e3e5925a146aadb70ae17859586c12daf55dac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd2f03a8020818d1cc98b6eeaf64a728e627a3401400c3af53654f8771ef02eb\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd2f03a8020818d1cc98b6eeaf64a728e627a3401400c3af53654f8771ef02eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8dd31c10eaf6bc673dec377a884040524418dd1891a96b2eb6e6e983979512a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8dd31c10eaf6bc673dec377a884040524418dd1891a96b2eb6e6e983979512a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a7a8ef66a9486ca3d13826df7aa39db2083cff2c386b4d0d858e0526e0f094d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a7a8ef66a9486ca3d13826df7aa39db2083cff2c386b4d0d858e0526e0f094d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd1c4
7c976c17bf17e4d660288ce30472372b6a7f33f6ffa6d96c1a4436275d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd1c47c976c17bf17e4d660288ce30472372b6a7f33f6ffa6d96c1a4436275d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b7c8e9648e5514ab1fcb63dc72822dd276a2d74987e2fb4f4800b26ec176be7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b7c8e9648e5514ab1fcb63dc72822dd276a2d74987e2fb4f4800b26ec176be7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:28Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d7115db7fe248437f4b8918f06c9aa48b141fcd3f31add80faee81b4347e47b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d7115db7fe248437f4b8918f06c9aa48b141fcd3f31add80faee81b4347e47b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dpkxw\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:10Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:10 crc kubenswrapper[5004]: I1201 08:18:10.560121 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:10 crc kubenswrapper[5004]: I1201 08:18:10.560175 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:10 crc kubenswrapper[5004]: I1201 08:18:10.560184 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:10 crc kubenswrapper[5004]: I1201 08:18:10.560198 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:10 crc kubenswrapper[5004]: I1201 08:18:10.560209 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:10Z","lastTransitionTime":"2025-12-01T08:18:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:10 crc kubenswrapper[5004]: I1201 08:18:10.563094 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4ee478f-9254-4e56-96be-f5a83ff5d77c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37438df65f4dab6700f193e84f81d8ed41b3c208c3f6150bd5836c0642190572\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f873da9c594c885720113b0d0fc01
552050d030e802074043e9ede174fd9b16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ca757be16a55ba8df0e9629f7cc2653e2804a5ee5a2151ee3b07d2c30fe5b07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3930a1c88c6014beb768ed20a37359ade08e77496a86b913f68c6196134f13a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3930a1c88c6014beb768ed20a37359ade08e77496a86b913f68c6196134f13a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:02Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:10Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:10 crc kubenswrapper[5004]: I1201 08:18:10.578455 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0ed2b6d-0c61-4639-bc3b-1c8effc4815d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45a0d7a12e7c114de1740a151e04f7f39844b709740cab958f156ac918af3228\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d69a370af9f8f6c575f99da6bb01cee0f660dccb7e85f16f5d8874d0e263e1fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c82baee0fd931fd7b41cd5a73ca7e653ef13dfa3d573991cbe3238d6b95f6fac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa5f341e99100e985f65c24647d976a24782dfae7f52e752b774f41f69af27fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad581564fc519328ae429bce757797f6d87d8648e862b008637a245866ec4cbe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T08:17:20Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 08:17:15.185125 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 08:17:15.186793 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4071633747/tls.crt::/tmp/serving-cert-4071633747/tls.key\\\\\\\"\\\\nI1201 08:17:20.827614 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 08:17:20.834528 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 08:17:20.834549 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 08:17:20.837611 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 08:17:20.837630 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 08:17:20.848940 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 08:17:20.848957 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 08:17:20.848961 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 08:17:20.848965 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 08:17:20.848968 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 08:17:20.848970 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 08:17:20.848973 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 08:17:20.849115 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1201 08:17:20.854990 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2caeb2ebaf68110cb8728e60aae6db1b1fc48efae6018f66070766a4f42caa69\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2276a185fefa99f4e9fe63b7acce901ce2eb168f71a65f5fa97c1892aebce0e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2276a185fefa99f4e9fe63b7acce901ce
2eb168f71a65f5fa97c1892aebce0e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:10Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:10 crc kubenswrapper[5004]: I1201 08:18:10.661934 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:10 crc kubenswrapper[5004]: I1201 08:18:10.661979 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:10 crc kubenswrapper[5004]: I1201 08:18:10.661988 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:10 crc kubenswrapper[5004]: I1201 08:18:10.662004 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:10 crc kubenswrapper[5004]: I1201 08:18:10.662014 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:10Z","lastTransitionTime":"2025-12-01T08:18:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:10 crc kubenswrapper[5004]: I1201 08:18:10.758875 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7cl5l" Dec 01 08:18:10 crc kubenswrapper[5004]: E1201 08:18:10.759115 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7cl5l" podUID="b488f4f3-d385-4d40-bdee-96d8fe2d42a1" Dec 01 08:18:10 crc kubenswrapper[5004]: I1201 08:18:10.764137 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:10 crc kubenswrapper[5004]: I1201 08:18:10.764203 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:10 crc kubenswrapper[5004]: I1201 08:18:10.764226 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:10 crc kubenswrapper[5004]: I1201 08:18:10.764255 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:10 crc kubenswrapper[5004]: I1201 08:18:10.764276 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:10Z","lastTransitionTime":"2025-12-01T08:18:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:10 crc kubenswrapper[5004]: I1201 08:18:10.867896 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:10 crc kubenswrapper[5004]: I1201 08:18:10.867944 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:10 crc kubenswrapper[5004]: I1201 08:18:10.867956 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:10 crc kubenswrapper[5004]: I1201 08:18:10.867979 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:10 crc kubenswrapper[5004]: I1201 08:18:10.867992 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:10Z","lastTransitionTime":"2025-12-01T08:18:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:10 crc kubenswrapper[5004]: I1201 08:18:10.970316 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:10 crc kubenswrapper[5004]: I1201 08:18:10.970357 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:10 crc kubenswrapper[5004]: I1201 08:18:10.970367 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:10 crc kubenswrapper[5004]: I1201 08:18:10.970384 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:10 crc kubenswrapper[5004]: I1201 08:18:10.970395 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:10Z","lastTransitionTime":"2025-12-01T08:18:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:11 crc kubenswrapper[5004]: I1201 08:18:11.072889 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:11 crc kubenswrapper[5004]: I1201 08:18:11.072956 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:11 crc kubenswrapper[5004]: I1201 08:18:11.072976 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:11 crc kubenswrapper[5004]: I1201 08:18:11.073000 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:11 crc kubenswrapper[5004]: I1201 08:18:11.073019 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:11Z","lastTransitionTime":"2025-12-01T08:18:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:11 crc kubenswrapper[5004]: I1201 08:18:11.175972 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:11 crc kubenswrapper[5004]: I1201 08:18:11.176033 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:11 crc kubenswrapper[5004]: I1201 08:18:11.176054 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:11 crc kubenswrapper[5004]: I1201 08:18:11.176080 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:11 crc kubenswrapper[5004]: I1201 08:18:11.176097 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:11Z","lastTransitionTime":"2025-12-01T08:18:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:11 crc kubenswrapper[5004]: I1201 08:18:11.278961 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:11 crc kubenswrapper[5004]: I1201 08:18:11.278997 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:11 crc kubenswrapper[5004]: I1201 08:18:11.279005 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:11 crc kubenswrapper[5004]: I1201 08:18:11.279020 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:11 crc kubenswrapper[5004]: I1201 08:18:11.279031 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:11Z","lastTransitionTime":"2025-12-01T08:18:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:11 crc kubenswrapper[5004]: I1201 08:18:11.291771 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zjksw_70e79009-93be-49c4-a6b3-e8a06bcea7f4/kube-multus/0.log" Dec 01 08:18:11 crc kubenswrapper[5004]: I1201 08:18:11.291813 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zjksw" event={"ID":"70e79009-93be-49c4-a6b3-e8a06bcea7f4","Type":"ContainerStarted","Data":"862dd7caaf04a01f96f2dba70cd2226e85b55f172ee6a34c178f756a75832a08"} Dec 01 08:18:11 crc kubenswrapper[5004]: I1201 08:18:11.320874 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43c9f762-d403-49b9-8294-d66976aca4dd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b358f641ff09bc8f4a452b0d4a8cf5f9790182f76779be31426d48d0278b0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a6731
4731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d610c3a799d4a7df9ddfe2a28f2ced2e5ac4e96c67c93f8e6d3a0eee60d9ac48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43f9f72af3273da055eeb1e13ad6258a1b3b6b205402861beee9f87f02230297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\
\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38e3cc2eca559417c71beed561569c4dc5b45ffb8407976e0695fc1b93219d37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c24fbeedbc7c40b6079b7ad89d8564a4f2cdaad2cf996e66682b082382da190\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\
":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00e163cd374befc8b8c6cdfe3916235cd63280474c2a09eeebf108fb75d378a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00e163cd374befc8b8c6cdfe3916235cd63280474c2a09eeebf108fb75d378a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37376e093170ad4138e102e3a1e2c2b45b3944bf36b0d77dce126bd493d60803\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37376e093170ad4138e102e3a1e2c2b45b3944bf36b0d77dce126bd493d60803\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e4db8d673550f14f3614e47dc0bf81e0920753be13fb01c7ef0d6bd3716611f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"image
ID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4db8d673550f14f3614e47dc0bf81e0920753be13fb01c7ef0d6bd3716611f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:11Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:11 crc kubenswrapper[5004]: I1201 08:18:11.337107 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:11Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:11 crc kubenswrapper[5004]: I1201 08:18:11.353127 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jjms6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8828af41-beeb-47dd-96cf-3dbcb5175893\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b22220849cdb91592a44a67f9f6a09586146009483f46a0505a2dea3b02bccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvsch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jjms6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:11Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:11 crc kubenswrapper[5004]: I1201 08:18:11.366415 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9977ebb-82de-4e96-8763-0b5a84f8d4ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://baf84c2eea7167eed95a4195d37bf784ffef310bc2079d8d06e1f47cb45a7864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pwgxq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c1e51fa92aec80e93d0fce11c743b8c4fde23fc23d8626e8d5b3e25ab42350d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pwgxq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fvdgt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:11Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:11 crc kubenswrapper[5004]: I1201 08:18:11.382420 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7cl5l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b488f4f3-d385-4d40-bdee-96d8fe2d42a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-glbhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-glbhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:37Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7cl5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:11Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:11 crc 
kubenswrapper[5004]: I1201 08:18:11.382849 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:11 crc kubenswrapper[5004]: I1201 08:18:11.382887 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:11 crc kubenswrapper[5004]: I1201 08:18:11.382908 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:11 crc kubenswrapper[5004]: I1201 08:18:11.382938 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:11 crc kubenswrapper[5004]: I1201 08:18:11.382961 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:11Z","lastTransitionTime":"2025-12-01T08:18:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:11 crc kubenswrapper[5004]: I1201 08:18:11.396683 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2fa377e-a3a6-46ce-af23-e677799f0115\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b520771cf1ad6f9360064c5f304243c5878a2f1025c87d1001c97b2d5f95cb9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b65b9bc444
101900c62fd90c7d45789d186a89148ac0fd9c7f0c6048c3f55ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edbd3e682d207b8c18d6bcbc26aa2adae2a35645502d31b327ffddd51991e842\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4563974ed1467a87aab104756828e63853c282a61c4fcab1e757eb4da9afaa40\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:11Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:11 crc kubenswrapper[5004]: I1201 08:18:11.412411 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dpkxw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddcaf111-708a-45b3-a342-effd3061ab17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68ecba515f05ca83fdd0cdda10e3e5925a146aadb70ae17859586c12daf55dac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd2f03a8020818d1cc98b6eeaf64a728e627a3401400c3af53654f8771ef02eb\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd2f03a8020818d1cc98b6eeaf64a728e627a3401400c3af53654f8771ef02eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8dd31c10eaf6bc673dec377a884040524418dd1891a96b2eb6e6e983979512a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8dd31c10eaf6bc673dec377a884040524418dd1891a96b2eb6e6e983979512a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a7a8ef66a9486ca3d13826df7aa39db2083cff2c386b4d0d858e0526e0f094d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a7a8ef66a9486ca3d13826df7aa39db2083cff2c386b4d0d858e0526e0f094d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd1c4
7c976c17bf17e4d660288ce30472372b6a7f33f6ffa6d96c1a4436275d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd1c47c976c17bf17e4d660288ce30472372b6a7f33f6ffa6d96c1a4436275d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b7c8e9648e5514ab1fcb63dc72822dd276a2d74987e2fb4f4800b26ec176be7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b7c8e9648e5514ab1fcb63dc72822dd276a2d74987e2fb4f4800b26ec176be7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:28Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d7115db7fe248437f4b8918f06c9aa48b141fcd3f31add80faee81b4347e47b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d7115db7fe248437f4b8918f06c9aa48b141fcd3f31add80faee81b4347e47b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dpkxw\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:11Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:11 crc kubenswrapper[5004]: I1201 08:18:11.422792 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ww6lq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72500ffd-4ca3-4614-a3a2-bbdc5a7506c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://668cfaa2af20e2bb082fc47e1702cfd9f704c4fdf56a4d27cf25d6915e7cd18a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2025-12-01T08:17:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmsvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ww6lq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:11Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:11 crc kubenswrapper[5004]: I1201 08:18:11.436348 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4ee478f-9254-4e56-96be-f5a83ff5d77c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37438df65f4dab6700f193e84f81d8ed41b3c208c3f6150bd5836c0642190572\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f873da9c594c885720113b0d0fc01552050d030e802074043e9ede174fd9b16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ca757be16a55ba8df0e9629f7cc2653e2804a5ee5a2151ee3b07d2c30fe5b07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3930a1c88c6014beb768ed20a37359ade08e77496a86b913f68c6196134f13a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://f3930a1c88c6014beb768ed20a37359ade08e77496a86b913f68c6196134f13a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:02Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:11Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:11 crc kubenswrapper[5004]: I1201 08:18:11.454601 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0ed2b6d-0c61-4639-bc3b-1c8effc4815d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45a0d7a12e7c114de1740a151e04f7f39844b709740cab958f156ac918af3228\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d69a370af9f8f6c575f99da6bb01cee0f660dccb7e85f16f5d8874d0e263e1fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c82baee0fd931fd7b41cd5a73ca7e653ef13dfa3d573991cbe3238d6b95f6fac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa5f341e99100e985f65c24647d976a24782dfae7f52e752b774f41f69af27fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad581564fc519328ae429bce757797f6d87d8648e862b008637a245866ec4cbe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T08:17:20Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 08:17:15.185125 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 08:17:15.186793 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4071633747/tls.crt::/tmp/serving-cert-4071633747/tls.key\\\\\\\"\\\\nI1201 08:17:20.827614 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 08:17:20.834528 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 08:17:20.834549 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 08:17:20.837611 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 08:17:20.837630 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 08:17:20.848940 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 08:17:20.848957 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 08:17:20.848961 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 08:17:20.848965 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 08:17:20.848968 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 08:17:20.848970 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 08:17:20.848973 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 08:17:20.849115 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1201 08:17:20.854990 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2caeb2ebaf68110cb8728e60aae6db1b1fc48efae6018f66070766a4f42caa69\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2276a185fefa99f4e9fe63b7acce901ce2eb168f71a65f5fa97c1892aebce0e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2276a185fefa99f4e9fe63b7acce901ce
2eb168f71a65f5fa97c1892aebce0e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:11Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:11 crc kubenswrapper[5004]: I1201 08:18:11.468191 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fd2f2b5d7dc3d7b8e1bb06fdf0eec3addfec3c230e044b6395d6cf535630ebd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b2250161e400b201f0528e33231c9007342002a37cd037a1b1e011f2aaaa22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:11Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:11 crc kubenswrapper[5004]: I1201 08:18:11.482995 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zjksw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"70e79009-93be-49c4-a6b3-e8a06bcea7f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:18:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:18:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://862dd7caaf04a01f96f2dba70cd2226e85b55f172ee6a34c178f756a75832a08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad3ea204ad731500a340e639f0e36abe28ba50464f1b2ff9b009e39c234cf708\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T08:18:10Z\\\",\\\"message\\\":\\\"2025-12-01T08:17:24+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_50c6dd0c-b888-41cb-a4f6-d1199cf62aeb\\\\n2025-12-01T08:17:24+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_50c6dd0c-b888-41cb-a4f6-d1199cf62aeb to /host/opt/cni/bin/\\\\n2025-12-01T08:17:25Z [verbose] multus-daemon started\\\\n2025-12-01T08:17:25Z [verbose] 
Readiness Indicator file check\\\\n2025-12-01T08:18:10Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:18:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd8m6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zjksw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:11Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:11 crc kubenswrapper[5004]: I1201 08:18:11.485595 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:11 crc kubenswrapper[5004]: I1201 08:18:11.485616 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:11 crc kubenswrapper[5004]: I1201 08:18:11.485630 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:11 crc kubenswrapper[5004]: I1201 08:18:11.485643 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:11 crc kubenswrapper[5004]: I1201 08:18:11.485652 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:11Z","lastTransitionTime":"2025-12-01T08:18:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:11 crc kubenswrapper[5004]: I1201 08:18:11.499836 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15cdec0a-5925-4966-a30b-f60c503f633e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f528b94ff12c7a804930dca4b93c23334a9ae3a53317750cd34b5695f08b4582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f001b70a83da28b1ea78a94f1cd70dfc63b3de5a21b041932d33c2ac970c986b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7c5c043438b515e37a6d6d5c47b92479bbb7322fe35c3a1bc57ee7b3a746544\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23df33e45cc27e4d343ae6ec3fb88f2dfc125ba7da424515d4bee5ec19281832\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4a7cab8e1594814a687b1433a92642a19eff7c2eb12f96fa06f08a2e270733d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1844a18a6109c25a25b9c8cb3ff65e69cc657f66be03dd97e883d24a774b7638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76c127e5ff47d030a709c79b7c7f2c2d3b32009052ff95ff522d90ecd86cf993\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76c127e5ff47d030a709c79b7c7f2c2d3b32009052ff95ff522d90ecd86cf993\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T08:17:52Z\\\",\\\"message\\\":\\\"}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1201 08:17:52.744847 6628 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service 
k8s.ovn.org/owner:openshift-marketplace/community-operators]} name:Service_openshift-marketplace/community-operators_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.189:50051:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d389393c-7ba9-422c-b3f5-06e391d537d2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1201 08:17:52.746408 6628 model_client.go:382] Update operations generated as: [{Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.59 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {dce28c51-c9f1-478b-97c8-7e209d6e7cbe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1201 08:17:52.746269 6628 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:51Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-knmdv_openshift-ovn-kubernetes(15cdec0a-5925-4966-a30b-f60c503f633e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f231e0092e1a85cc5d6e00fb8d3bd982223b670191ac3dc21d81ff1fab462a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97c9fab31c2655b5d579a2ed9b837229ee678973643c6a526c1241e3c84e315c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97c9fab31c2655b5d5
79a2ed9b837229ee678973643c6a526c1241e3c84e315c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-knmdv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:11Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:11 crc kubenswrapper[5004]: I1201 08:18:11.514158 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://327477e7db10129d47ddf9078f32d0df2ffd4b8e59fc7bb77c17c60c63e9a936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:11Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:11 crc kubenswrapper[5004]: I1201 08:18:11.527507 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:11Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:11 crc kubenswrapper[5004]: I1201 08:18:11.540889 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:11Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:11 crc kubenswrapper[5004]: I1201 08:18:11.552766 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03c470edd984db3b756e031e33f64a9bc5ca6b9c156ab479e5cd647745655a5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T08:18:11Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:11 crc kubenswrapper[5004]: I1201 08:18:11.565398 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mzsvz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"397b51b7-934a-41d1-a593-500a64161bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4adac5633dfe89777bf019818bab9ee3a208cad9d929c96cf2cb86b18c2d4264\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4h49g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d83844478080e4829975fb6c8e0444d9fdebd44b08afcf45e7d0b04fc534a844\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4h49g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mzsvz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:11Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:11 crc kubenswrapper[5004]: I1201 08:18:11.587825 5004 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:11 crc kubenswrapper[5004]: I1201 08:18:11.587858 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:11 crc kubenswrapper[5004]: I1201 08:18:11.587868 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:11 crc kubenswrapper[5004]: I1201 08:18:11.587884 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:11 crc kubenswrapper[5004]: I1201 08:18:11.587896 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:11Z","lastTransitionTime":"2025-12-01T08:18:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:11 crc kubenswrapper[5004]: I1201 08:18:11.690754 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:11 crc kubenswrapper[5004]: I1201 08:18:11.690809 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:11 crc kubenswrapper[5004]: I1201 08:18:11.690820 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:11 crc kubenswrapper[5004]: I1201 08:18:11.690839 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:11 crc kubenswrapper[5004]: I1201 08:18:11.690852 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:11Z","lastTransitionTime":"2025-12-01T08:18:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:18:11 crc kubenswrapper[5004]: I1201 08:18:11.758046 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 08:18:11 crc kubenswrapper[5004]: I1201 08:18:11.758117 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 08:18:11 crc kubenswrapper[5004]: I1201 08:18:11.758072 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 08:18:11 crc kubenswrapper[5004]: E1201 08:18:11.758259 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 08:18:11 crc kubenswrapper[5004]: E1201 08:18:11.758346 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 08:18:11 crc kubenswrapper[5004]: E1201 08:18:11.758427 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 08:18:11 crc kubenswrapper[5004]: I1201 08:18:11.793124 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:11 crc kubenswrapper[5004]: I1201 08:18:11.793165 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:11 crc kubenswrapper[5004]: I1201 08:18:11.793174 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:11 crc kubenswrapper[5004]: I1201 08:18:11.793189 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:11 crc kubenswrapper[5004]: I1201 08:18:11.793200 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:11Z","lastTransitionTime":"2025-12-01T08:18:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:11 crc kubenswrapper[5004]: I1201 08:18:11.895493 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:11 crc kubenswrapper[5004]: I1201 08:18:11.895536 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:11 crc kubenswrapper[5004]: I1201 08:18:11.895548 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:11 crc kubenswrapper[5004]: I1201 08:18:11.895583 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:11 crc kubenswrapper[5004]: I1201 08:18:11.895595 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:11Z","lastTransitionTime":"2025-12-01T08:18:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:11 crc kubenswrapper[5004]: I1201 08:18:11.997762 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:11 crc kubenswrapper[5004]: I1201 08:18:11.997815 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:11 crc kubenswrapper[5004]: I1201 08:18:11.997831 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:11 crc kubenswrapper[5004]: I1201 08:18:11.997854 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:11 crc kubenswrapper[5004]: I1201 08:18:11.997870 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:11Z","lastTransitionTime":"2025-12-01T08:18:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:12 crc kubenswrapper[5004]: I1201 08:18:12.100897 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:12 crc kubenswrapper[5004]: I1201 08:18:12.100938 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:12 crc kubenswrapper[5004]: I1201 08:18:12.100951 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:12 crc kubenswrapper[5004]: I1201 08:18:12.100967 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:12 crc kubenswrapper[5004]: I1201 08:18:12.100977 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:12Z","lastTransitionTime":"2025-12-01T08:18:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:12 crc kubenswrapper[5004]: I1201 08:18:12.203157 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:12 crc kubenswrapper[5004]: I1201 08:18:12.203200 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:12 crc kubenswrapper[5004]: I1201 08:18:12.203211 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:12 crc kubenswrapper[5004]: I1201 08:18:12.203228 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:12 crc kubenswrapper[5004]: I1201 08:18:12.203239 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:12Z","lastTransitionTime":"2025-12-01T08:18:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:12 crc kubenswrapper[5004]: I1201 08:18:12.305863 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:12 crc kubenswrapper[5004]: I1201 08:18:12.305911 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:12 crc kubenswrapper[5004]: I1201 08:18:12.305924 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:12 crc kubenswrapper[5004]: I1201 08:18:12.305944 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:12 crc kubenswrapper[5004]: I1201 08:18:12.305956 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:12Z","lastTransitionTime":"2025-12-01T08:18:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:12 crc kubenswrapper[5004]: I1201 08:18:12.408135 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:12 crc kubenswrapper[5004]: I1201 08:18:12.408216 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:12 crc kubenswrapper[5004]: I1201 08:18:12.408240 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:12 crc kubenswrapper[5004]: I1201 08:18:12.408267 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:12 crc kubenswrapper[5004]: I1201 08:18:12.408286 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:12Z","lastTransitionTime":"2025-12-01T08:18:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:12 crc kubenswrapper[5004]: I1201 08:18:12.510713 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:12 crc kubenswrapper[5004]: I1201 08:18:12.510757 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:12 crc kubenswrapper[5004]: I1201 08:18:12.510766 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:12 crc kubenswrapper[5004]: I1201 08:18:12.510780 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:12 crc kubenswrapper[5004]: I1201 08:18:12.510789 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:12Z","lastTransitionTime":"2025-12-01T08:18:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:12 crc kubenswrapper[5004]: I1201 08:18:12.613786 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:12 crc kubenswrapper[5004]: I1201 08:18:12.613826 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:12 crc kubenswrapper[5004]: I1201 08:18:12.613834 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:12 crc kubenswrapper[5004]: I1201 08:18:12.613847 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:12 crc kubenswrapper[5004]: I1201 08:18:12.613861 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:12Z","lastTransitionTime":"2025-12-01T08:18:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:12 crc kubenswrapper[5004]: I1201 08:18:12.716805 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:12 crc kubenswrapper[5004]: I1201 08:18:12.716844 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:12 crc kubenswrapper[5004]: I1201 08:18:12.716852 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:12 crc kubenswrapper[5004]: I1201 08:18:12.716865 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:12 crc kubenswrapper[5004]: I1201 08:18:12.716873 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:12Z","lastTransitionTime":"2025-12-01T08:18:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:18:12 crc kubenswrapper[5004]: I1201 08:18:12.762803 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7cl5l" Dec 01 08:18:12 crc kubenswrapper[5004]: E1201 08:18:12.762974 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7cl5l" podUID="b488f4f3-d385-4d40-bdee-96d8fe2d42a1" Dec 01 08:18:12 crc kubenswrapper[5004]: I1201 08:18:12.822936 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:12 crc kubenswrapper[5004]: I1201 08:18:12.822993 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:12 crc kubenswrapper[5004]: I1201 08:18:12.823008 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:12 crc kubenswrapper[5004]: I1201 08:18:12.823030 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:12 crc kubenswrapper[5004]: I1201 08:18:12.823047 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:12Z","lastTransitionTime":"2025-12-01T08:18:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:12 crc kubenswrapper[5004]: I1201 08:18:12.831255 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://327477e7db10129d47ddf9078f32d0df2ffd4b8e59fc7bb77c17c60c63e9a936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:12Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:12 crc kubenswrapper[5004]: I1201 08:18:12.851124 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:12Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:12 crc kubenswrapper[5004]: I1201 08:18:12.866069 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:12Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:12 crc kubenswrapper[5004]: I1201 08:18:12.882927 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03c470edd984db3b756e031e33f64a9bc5ca6b9c156ab479e5cd647745655a5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T08:18:12Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:12 crc kubenswrapper[5004]: I1201 08:18:12.902065 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fd2f2b5d7dc3d7b8e1bb06fdf0eec3addfec3c230e044b6395d6cf535630ebd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b2250161e400b
201f0528e33231c9007342002a37cd037a1b1e011f2aaaa22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:12Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:12 crc kubenswrapper[5004]: I1201 08:18:12.918870 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zjksw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"70e79009-93be-49c4-a6b3-e8a06bcea7f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:18:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:18:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://862dd7caaf04a01f96f2dba70cd2226e85b55f172ee6a34c178f756a75832a08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad3ea204ad731500a340e639f0e36abe28ba50464f1b2ff9b009e39c234cf708\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T08:18:10Z\\\",\\\"message\\\":\\\"2025-12-01T08:17:24+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_50c6dd0c-b888-41cb-a4f6-d1199cf62aeb\\\\n2025-12-01T08:17:24+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_50c6dd0c-b888-41cb-a4f6-d1199cf62aeb to /host/opt/cni/bin/\\\\n2025-12-01T08:17:25Z [verbose] multus-daemon started\\\\n2025-12-01T08:17:25Z [verbose] 
Readiness Indicator file check\\\\n2025-12-01T08:18:10Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:18:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd8m6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zjksw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:12Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:12 crc kubenswrapper[5004]: I1201 08:18:12.925807 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:12 crc kubenswrapper[5004]: I1201 08:18:12.925829 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:12 crc kubenswrapper[5004]: I1201 08:18:12.925837 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:12 crc kubenswrapper[5004]: I1201 08:18:12.925851 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:12 crc kubenswrapper[5004]: I1201 08:18:12.925862 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:12Z","lastTransitionTime":"2025-12-01T08:18:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:12 crc kubenswrapper[5004]: I1201 08:18:12.938263 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15cdec0a-5925-4966-a30b-f60c503f633e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f528b94ff12c7a804930dca4b93c23334a9ae3a53317750cd34b5695f08b4582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f001b70a83da28b1ea78a94f1cd70dfc63b3de5a21b041932d33c2ac970c986b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7c5c043438b515e37a6d6d5c47b92479bbb7322fe35c3a1bc57ee7b3a746544\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23df33e45cc27e4d343ae6ec3fb88f2dfc125ba7da424515d4bee5ec19281832\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4a7cab8e1594814a687b1433a92642a19eff7c2eb12f96fa06f08a2e270733d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1844a18a6109c25a25b9c8cb3ff65e69cc657f66be03dd97e883d24a774b7638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76c127e5ff47d030a709c79b7c7f2c2d3b32009052ff95ff522d90ecd86cf993\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76c127e5ff47d030a709c79b7c7f2c2d3b32009052ff95ff522d90ecd86cf993\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T08:17:52Z\\\",\\\"message\\\":\\\"}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1201 08:17:52.744847 6628 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service 
k8s.ovn.org/owner:openshift-marketplace/community-operators]} name:Service_openshift-marketplace/community-operators_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.189:50051:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d389393c-7ba9-422c-b3f5-06e391d537d2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1201 08:17:52.746408 6628 model_client.go:382] Update operations generated as: [{Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.59 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {dce28c51-c9f1-478b-97c8-7e209d6e7cbe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1201 08:17:52.746269 6628 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:51Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-knmdv_openshift-ovn-kubernetes(15cdec0a-5925-4966-a30b-f60c503f633e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f231e0092e1a85cc5d6e00fb8d3bd982223b670191ac3dc21d81ff1fab462a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97c9fab31c2655b5d579a2ed9b837229ee678973643c6a526c1241e3c84e315c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97c9fab31c2655b5d5
79a2ed9b837229ee678973643c6a526c1241e3c84e315c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-knmdv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:12Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:12 crc kubenswrapper[5004]: I1201 08:18:12.948868 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mzsvz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"397b51b7-934a-41d1-a593-500a64161bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4adac5633dfe89777bf019818bab9ee3a208cad9d929c96cf2cb86b18c2d4264\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4h49g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d83844478080e4829975fb6c8e0444d9fdebd
44b08afcf45e7d0b04fc534a844\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4h49g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mzsvz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:12Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:12 crc kubenswrapper[5004]: I1201 08:18:12.964896 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43c9f762-d403-49b9-8294-d66976aca4dd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b358f641ff09bc8f4a452b0d4a8cf5f9790182f76779be31426d48d0278b0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d610c3a799d4a7df9ddfe2a28f2ced2e5ac4e96c67c93f8e6d3a0eee60d9ac48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43f9f72af3273da055eeb1e13ad6258a1b3b6b205402861beee9f87f02230297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38e3cc2eca559417c71beed561569c4dc5b45ffb8407976e0695fc1b93219d37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c24fbeedbc7c40b6079b7ad89d8564a4f2cdaad2cf996e66682b082382da190\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00e163cd374befc8b8c6cdfe3916235cd63280474c2a09eeebf108fb75d378a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00e163cd374befc8b8c6cdfe3916235cd63280474c2a09eeebf108fb75d378a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-01T08:17:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37376e093170ad4138e102e3a1e2c2b45b3944bf36b0d77dce126bd493d60803\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37376e093170ad4138e102e3a1e2c2b45b3944bf36b0d77dce126bd493d60803\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e4db8d673550f14f3614e47dc0bf81e0920753be13fb01c7ef0d6bd3716611f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4db8d673550f14f3614e47dc0bf81e0920753be13fb01c7ef0d6bd3716611f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:12Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:12 crc kubenswrapper[5004]: I1201 08:18:12.975169 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:12Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:12 crc kubenswrapper[5004]: I1201 08:18:12.984188 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jjms6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8828af41-beeb-47dd-96cf-3dbcb5175893\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b22220849cdb91592a44a67f9f6a09586146009483f46a0505a2dea3b02bccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvsch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jjms6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:12Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:12 crc kubenswrapper[5004]: I1201 08:18:12.994028 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9977ebb-82de-4e96-8763-0b5a84f8d4ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://baf84c2eea7167eed95a4195d37bf784ffef310bc2079d8d06e1f47cb45a7864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pwgxq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c1e51fa92aec80e93d0fce11c743b8c4fde23fc23d8626e8d5b3e25ab42350d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pwgxq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fvdgt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:12Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:13 crc kubenswrapper[5004]: I1201 08:18:13.002505 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7cl5l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b488f4f3-d385-4d40-bdee-96d8fe2d42a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-glbhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-glbhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:37Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7cl5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:13Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:13 crc 
kubenswrapper[5004]: I1201 08:18:13.011638 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2fa377e-a3a6-46ce-af23-e677799f0115\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b520771cf1ad6f9360064c5f304243c5878a2f1025c87d1001c97b2d5f95cb9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b65b9bc444101900c62fd90c7d45789d186a89148ac0fd9c7f0c6048c3f55ad\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edbd3e682d207b8c18d6bcbc26aa2adae2a35645502d31b327ffddd51991e842\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4563974ed1467a87aab104756828e63853c282a61c4fcab1e757eb4da9afaa40\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:13Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:13 crc kubenswrapper[5004]: I1201 08:18:13.023705 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dpkxw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddcaf111-708a-45b3-a342-effd3061ab17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68ecba515f05ca83fdd0cdda10e3e5925a146aadb70ae17859586c12daf55dac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd2f03a8020818d1cc98b6eeaf64a728e627a3401400c3af53654f8771ef02eb\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd2f03a8020818d1cc98b6eeaf64a728e627a3401400c3af53654f8771ef02eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8dd31c10eaf6bc673dec377a884040524418dd1891a96b2eb6e6e983979512a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8dd31c10eaf6bc673dec377a884040524418dd1891a96b2eb6e6e983979512a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a7a8ef66a9486ca3d13826df7aa39db2083cff2c386b4d0d858e0526e0f094d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a7a8ef66a9486ca3d13826df7aa39db2083cff2c386b4d0d858e0526e0f094d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd1c4
7c976c17bf17e4d660288ce30472372b6a7f33f6ffa6d96c1a4436275d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd1c47c976c17bf17e4d660288ce30472372b6a7f33f6ffa6d96c1a4436275d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b7c8e9648e5514ab1fcb63dc72822dd276a2d74987e2fb4f4800b26ec176be7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b7c8e9648e5514ab1fcb63dc72822dd276a2d74987e2fb4f4800b26ec176be7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:28Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d7115db7fe248437f4b8918f06c9aa48b141fcd3f31add80faee81b4347e47b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d7115db7fe248437f4b8918f06c9aa48b141fcd3f31add80faee81b4347e47b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dpkxw\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:13Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:13 crc kubenswrapper[5004]: I1201 08:18:13.029072 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:13 crc kubenswrapper[5004]: I1201 08:18:13.029093 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:13 crc kubenswrapper[5004]: I1201 08:18:13.029106 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:13 crc kubenswrapper[5004]: I1201 08:18:13.029119 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:13 crc kubenswrapper[5004]: I1201 08:18:13.029130 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:13Z","lastTransitionTime":"2025-12-01T08:18:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:13 crc kubenswrapper[5004]: I1201 08:18:13.032092 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ww6lq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72500ffd-4ca3-4614-a3a2-bbdc5a7506c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://668cfaa2af20e2bb082fc47e1702cfd9f704c4fdf56a4d27cf25d6915e7cd18a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmsvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ww6lq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:13Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:13 crc kubenswrapper[5004]: I1201 08:18:13.042441 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4ee478f-9254-4e56-96be-f5a83ff5d77c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37438df65f4dab6700f193e84f81d8ed41b3c208c3f6150bd5836c0642190572\\\",\\\"image\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f873da9c594c885720113b0d0fc01552050d030e802074043e9ede174fd9b16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ca757be16a55ba8df0e9629f7cc2653e2804a5ee5a2151ee3b07d2c30fe5b07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3930a1c88c6014beb768ed20a37359ade08e77496a86b913f68c6196134f13a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3930a1c88c6014beb768ed20a37359ade08e77496a86b913f68c6196134f13a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:02Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:13Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:13 crc kubenswrapper[5004]: I1201 08:18:13.052831 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0ed2b6d-0c61-4639-bc3b-1c8effc4815d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45a0d7a12e7c114de1740a151e04f7f39844b709740cab958f156ac918af3228\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d69a370af9f8f6c575f99da6bb01cee0f660dccb7e85f16f5d8874d0e263e1fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c82baee0fd931fd7b41cd5a73ca7e653ef13dfa3d573991cbe3238d6b95f6fac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa5f341e99100e985f65c24647d976a24782dfae7f52e752b774f41f69af27fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad581564fc519328ae429bce757797f6d87d8648e862b008637a245866ec4cbe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T08:17:20Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 08:17:15.185125 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 08:17:15.186793 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4071633747/tls.crt::/tmp/serving-cert-4071633747/tls.key\\\\\\\"\\\\nI1201 08:17:20.827614 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 08:17:20.834528 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 08:17:20.834549 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 08:17:20.837611 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 08:17:20.837630 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 08:17:20.848940 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 08:17:20.848957 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 08:17:20.848961 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 08:17:20.848965 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 08:17:20.848968 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 08:17:20.848970 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 08:17:20.848973 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 08:17:20.849115 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1201 08:17:20.854990 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2caeb2ebaf68110cb8728e60aae6db1b1fc48efae6018f66070766a4f42caa69\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2276a185fefa99f4e9fe63b7acce901ce2eb168f71a65f5fa97c1892aebce0e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2276a185fefa99f4e9fe63b7acce901ce
2eb168f71a65f5fa97c1892aebce0e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:13Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:13 crc kubenswrapper[5004]: I1201 08:18:13.131641 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:13 crc kubenswrapper[5004]: I1201 08:18:13.131687 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:13 crc kubenswrapper[5004]: I1201 08:18:13.131699 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:13 crc kubenswrapper[5004]: I1201 08:18:13.131717 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:13 crc kubenswrapper[5004]: I1201 08:18:13.131730 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:13Z","lastTransitionTime":"2025-12-01T08:18:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:13 crc kubenswrapper[5004]: I1201 08:18:13.233811 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:13 crc kubenswrapper[5004]: I1201 08:18:13.233852 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:13 crc kubenswrapper[5004]: I1201 08:18:13.233864 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:13 crc kubenswrapper[5004]: I1201 08:18:13.233884 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:13 crc kubenswrapper[5004]: I1201 08:18:13.233897 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:13Z","lastTransitionTime":"2025-12-01T08:18:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:13 crc kubenswrapper[5004]: I1201 08:18:13.335891 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:13 crc kubenswrapper[5004]: I1201 08:18:13.336296 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:13 crc kubenswrapper[5004]: I1201 08:18:13.336461 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:13 crc kubenswrapper[5004]: I1201 08:18:13.336681 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:13 crc kubenswrapper[5004]: I1201 08:18:13.336846 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:13Z","lastTransitionTime":"2025-12-01T08:18:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:13 crc kubenswrapper[5004]: I1201 08:18:13.440366 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:13 crc kubenswrapper[5004]: I1201 08:18:13.440432 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:13 crc kubenswrapper[5004]: I1201 08:18:13.440451 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:13 crc kubenswrapper[5004]: I1201 08:18:13.440475 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:13 crc kubenswrapper[5004]: I1201 08:18:13.440492 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:13Z","lastTransitionTime":"2025-12-01T08:18:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:13 crc kubenswrapper[5004]: I1201 08:18:13.547810 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:13 crc kubenswrapper[5004]: I1201 08:18:13.548149 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:13 crc kubenswrapper[5004]: I1201 08:18:13.548331 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:13 crc kubenswrapper[5004]: I1201 08:18:13.548551 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:13 crc kubenswrapper[5004]: I1201 08:18:13.548811 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:13Z","lastTransitionTime":"2025-12-01T08:18:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:13 crc kubenswrapper[5004]: I1201 08:18:13.651635 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:13 crc kubenswrapper[5004]: I1201 08:18:13.651697 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:13 crc kubenswrapper[5004]: I1201 08:18:13.651714 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:13 crc kubenswrapper[5004]: I1201 08:18:13.651738 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:13 crc kubenswrapper[5004]: I1201 08:18:13.651754 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:13Z","lastTransitionTime":"2025-12-01T08:18:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:13 crc kubenswrapper[5004]: I1201 08:18:13.754186 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:13 crc kubenswrapper[5004]: I1201 08:18:13.754247 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:13 crc kubenswrapper[5004]: I1201 08:18:13.754266 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:13 crc kubenswrapper[5004]: I1201 08:18:13.754289 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:13 crc kubenswrapper[5004]: I1201 08:18:13.754306 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:13Z","lastTransitionTime":"2025-12-01T08:18:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:18:13 crc kubenswrapper[5004]: I1201 08:18:13.758533 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 08:18:13 crc kubenswrapper[5004]: I1201 08:18:13.758627 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 08:18:13 crc kubenswrapper[5004]: I1201 08:18:13.758648 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 08:18:13 crc kubenswrapper[5004]: E1201 08:18:13.759064 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 08:18:13 crc kubenswrapper[5004]: E1201 08:18:13.759217 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 08:18:13 crc kubenswrapper[5004]: E1201 08:18:13.758917 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 08:18:13 crc kubenswrapper[5004]: I1201 08:18:13.857165 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:13 crc kubenswrapper[5004]: I1201 08:18:13.857223 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:13 crc kubenswrapper[5004]: I1201 08:18:13.857240 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:13 crc kubenswrapper[5004]: I1201 08:18:13.857263 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:13 crc kubenswrapper[5004]: I1201 08:18:13.857281 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:13Z","lastTransitionTime":"2025-12-01T08:18:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:13 crc kubenswrapper[5004]: I1201 08:18:13.959965 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:13 crc kubenswrapper[5004]: I1201 08:18:13.960028 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:13 crc kubenswrapper[5004]: I1201 08:18:13.960046 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:13 crc kubenswrapper[5004]: I1201 08:18:13.960072 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:13 crc kubenswrapper[5004]: I1201 08:18:13.960089 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:13Z","lastTransitionTime":"2025-12-01T08:18:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:14 crc kubenswrapper[5004]: I1201 08:18:14.062713 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:14 crc kubenswrapper[5004]: I1201 08:18:14.063056 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:14 crc kubenswrapper[5004]: I1201 08:18:14.063212 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:14 crc kubenswrapper[5004]: I1201 08:18:14.063389 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:14 crc kubenswrapper[5004]: I1201 08:18:14.063536 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:14Z","lastTransitionTime":"2025-12-01T08:18:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:14 crc kubenswrapper[5004]: I1201 08:18:14.166173 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:14 crc kubenswrapper[5004]: I1201 08:18:14.166525 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:14 crc kubenswrapper[5004]: I1201 08:18:14.166729 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:14 crc kubenswrapper[5004]: I1201 08:18:14.166873 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:14 crc kubenswrapper[5004]: I1201 08:18:14.167014 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:14Z","lastTransitionTime":"2025-12-01T08:18:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:14 crc kubenswrapper[5004]: I1201 08:18:14.269983 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:14 crc kubenswrapper[5004]: I1201 08:18:14.270029 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:14 crc kubenswrapper[5004]: I1201 08:18:14.270040 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:14 crc kubenswrapper[5004]: I1201 08:18:14.270059 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:14 crc kubenswrapper[5004]: I1201 08:18:14.270072 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:14Z","lastTransitionTime":"2025-12-01T08:18:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:14 crc kubenswrapper[5004]: I1201 08:18:14.373081 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:14 crc kubenswrapper[5004]: I1201 08:18:14.373143 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:14 crc kubenswrapper[5004]: I1201 08:18:14.373159 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:14 crc kubenswrapper[5004]: I1201 08:18:14.373183 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:14 crc kubenswrapper[5004]: I1201 08:18:14.373200 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:14Z","lastTransitionTime":"2025-12-01T08:18:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:14 crc kubenswrapper[5004]: I1201 08:18:14.476317 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:14 crc kubenswrapper[5004]: I1201 08:18:14.476345 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:14 crc kubenswrapper[5004]: I1201 08:18:14.476353 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:14 crc kubenswrapper[5004]: I1201 08:18:14.476365 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:14 crc kubenswrapper[5004]: I1201 08:18:14.476375 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:14Z","lastTransitionTime":"2025-12-01T08:18:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:14 crc kubenswrapper[5004]: I1201 08:18:14.579192 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:14 crc kubenswrapper[5004]: I1201 08:18:14.579247 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:14 crc kubenswrapper[5004]: I1201 08:18:14.579259 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:14 crc kubenswrapper[5004]: I1201 08:18:14.579280 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:14 crc kubenswrapper[5004]: I1201 08:18:14.579293 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:14Z","lastTransitionTime":"2025-12-01T08:18:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:14 crc kubenswrapper[5004]: I1201 08:18:14.681460 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:14 crc kubenswrapper[5004]: I1201 08:18:14.681497 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:14 crc kubenswrapper[5004]: I1201 08:18:14.681505 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:14 crc kubenswrapper[5004]: I1201 08:18:14.681518 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:14 crc kubenswrapper[5004]: I1201 08:18:14.681527 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:14Z","lastTransitionTime":"2025-12-01T08:18:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:18:14 crc kubenswrapper[5004]: I1201 08:18:14.758754 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7cl5l" Dec 01 08:18:14 crc kubenswrapper[5004]: E1201 08:18:14.758882 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7cl5l" podUID="b488f4f3-d385-4d40-bdee-96d8fe2d42a1" Dec 01 08:18:14 crc kubenswrapper[5004]: I1201 08:18:14.769342 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Dec 01 08:18:14 crc kubenswrapper[5004]: I1201 08:18:14.784068 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:14 crc kubenswrapper[5004]: I1201 08:18:14.784146 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:14 crc kubenswrapper[5004]: I1201 08:18:14.784160 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:14 crc kubenswrapper[5004]: I1201 08:18:14.784176 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:14 crc kubenswrapper[5004]: I1201 08:18:14.784188 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:14Z","lastTransitionTime":"2025-12-01T08:18:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:14 crc kubenswrapper[5004]: I1201 08:18:14.886718 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:14 crc kubenswrapper[5004]: I1201 08:18:14.886755 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:14 crc kubenswrapper[5004]: I1201 08:18:14.886766 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:14 crc kubenswrapper[5004]: I1201 08:18:14.886778 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:14 crc kubenswrapper[5004]: I1201 08:18:14.886788 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:14Z","lastTransitionTime":"2025-12-01T08:18:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:14 crc kubenswrapper[5004]: I1201 08:18:14.989359 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:14 crc kubenswrapper[5004]: I1201 08:18:14.989393 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:14 crc kubenswrapper[5004]: I1201 08:18:14.989405 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:14 crc kubenswrapper[5004]: I1201 08:18:14.989418 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:14 crc kubenswrapper[5004]: I1201 08:18:14.989427 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:14Z","lastTransitionTime":"2025-12-01T08:18:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:15 crc kubenswrapper[5004]: I1201 08:18:15.090853 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:15 crc kubenswrapper[5004]: I1201 08:18:15.090981 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:15 crc kubenswrapper[5004]: I1201 08:18:15.090995 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:15 crc kubenswrapper[5004]: I1201 08:18:15.091010 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:15 crc kubenswrapper[5004]: I1201 08:18:15.091019 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:15Z","lastTransitionTime":"2025-12-01T08:18:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:15 crc kubenswrapper[5004]: I1201 08:18:15.194205 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:15 crc kubenswrapper[5004]: I1201 08:18:15.194253 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:15 crc kubenswrapper[5004]: I1201 08:18:15.194264 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:15 crc kubenswrapper[5004]: I1201 08:18:15.194281 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:15 crc kubenswrapper[5004]: I1201 08:18:15.194292 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:15Z","lastTransitionTime":"2025-12-01T08:18:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:15 crc kubenswrapper[5004]: I1201 08:18:15.296619 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:15 crc kubenswrapper[5004]: I1201 08:18:15.296681 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:15 crc kubenswrapper[5004]: I1201 08:18:15.296700 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:15 crc kubenswrapper[5004]: I1201 08:18:15.296723 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:15 crc kubenswrapper[5004]: I1201 08:18:15.296739 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:15Z","lastTransitionTime":"2025-12-01T08:18:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:15 crc kubenswrapper[5004]: I1201 08:18:15.398426 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:15 crc kubenswrapper[5004]: I1201 08:18:15.398474 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:15 crc kubenswrapper[5004]: I1201 08:18:15.398510 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:15 crc kubenswrapper[5004]: I1201 08:18:15.398527 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:15 crc kubenswrapper[5004]: I1201 08:18:15.398539 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:15Z","lastTransitionTime":"2025-12-01T08:18:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:15 crc kubenswrapper[5004]: I1201 08:18:15.501217 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:15 crc kubenswrapper[5004]: I1201 08:18:15.501275 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:15 crc kubenswrapper[5004]: I1201 08:18:15.501292 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:15 crc kubenswrapper[5004]: I1201 08:18:15.501319 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:15 crc kubenswrapper[5004]: I1201 08:18:15.501337 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:15Z","lastTransitionTime":"2025-12-01T08:18:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:15 crc kubenswrapper[5004]: I1201 08:18:15.604005 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:15 crc kubenswrapper[5004]: I1201 08:18:15.604080 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:15 crc kubenswrapper[5004]: I1201 08:18:15.604104 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:15 crc kubenswrapper[5004]: I1201 08:18:15.604134 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:15 crc kubenswrapper[5004]: I1201 08:18:15.604161 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:15Z","lastTransitionTime":"2025-12-01T08:18:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:15 crc kubenswrapper[5004]: I1201 08:18:15.707666 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:15 crc kubenswrapper[5004]: I1201 08:18:15.707744 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:15 crc kubenswrapper[5004]: I1201 08:18:15.707768 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:15 crc kubenswrapper[5004]: I1201 08:18:15.707801 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:15 crc kubenswrapper[5004]: I1201 08:18:15.707823 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:15Z","lastTransitionTime":"2025-12-01T08:18:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:18:15 crc kubenswrapper[5004]: I1201 08:18:15.758111 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 08:18:15 crc kubenswrapper[5004]: I1201 08:18:15.758210 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 08:18:15 crc kubenswrapper[5004]: I1201 08:18:15.758290 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 08:18:15 crc kubenswrapper[5004]: E1201 08:18:15.758265 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 08:18:15 crc kubenswrapper[5004]: E1201 08:18:15.758427 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 08:18:15 crc kubenswrapper[5004]: E1201 08:18:15.758508 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 08:18:15 crc kubenswrapper[5004]: I1201 08:18:15.810269 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:15 crc kubenswrapper[5004]: I1201 08:18:15.810317 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:15 crc kubenswrapper[5004]: I1201 08:18:15.810334 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:15 crc kubenswrapper[5004]: I1201 08:18:15.810360 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:15 crc kubenswrapper[5004]: I1201 08:18:15.810377 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:15Z","lastTransitionTime":"2025-12-01T08:18:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:15 crc kubenswrapper[5004]: I1201 08:18:15.913947 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:15 crc kubenswrapper[5004]: I1201 08:18:15.914000 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:15 crc kubenswrapper[5004]: I1201 08:18:15.914017 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:15 crc kubenswrapper[5004]: I1201 08:18:15.914039 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:15 crc kubenswrapper[5004]: I1201 08:18:15.914056 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:15Z","lastTransitionTime":"2025-12-01T08:18:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:15 crc kubenswrapper[5004]: I1201 08:18:15.959901 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:15 crc kubenswrapper[5004]: I1201 08:18:15.959980 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:15 crc kubenswrapper[5004]: I1201 08:18:15.960001 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:15 crc kubenswrapper[5004]: I1201 08:18:15.960052 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:15 crc kubenswrapper[5004]: I1201 08:18:15.960071 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:15Z","lastTransitionTime":"2025-12-01T08:18:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:15 crc kubenswrapper[5004]: E1201 08:18:15.980344 5004 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:18:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:18:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:18:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:18:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:18:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:18:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:18:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:18:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c8341267-df98-484f-a5e7-cf024fab437c\\\",\\\"systemUUID\\\":\\\"77547a65-c048-47ee-89b1-8422fc81b7aa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:15Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:15 crc kubenswrapper[5004]: I1201 08:18:15.986399 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:15 crc kubenswrapper[5004]: I1201 08:18:15.986478 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:15 crc kubenswrapper[5004]: I1201 08:18:15.986507 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:15 crc kubenswrapper[5004]: I1201 08:18:15.986540 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:15 crc kubenswrapper[5004]: I1201 08:18:15.986607 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:15Z","lastTransitionTime":"2025-12-01T08:18:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:16 crc kubenswrapper[5004]: E1201 08:18:16.005619 5004 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:18:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:18:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:18:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:18:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:18:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:18:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:18:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:18:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[… image list identical to previous status patch …],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c8341267-df98-484f-a5e7-cf024fab437c\\\",\\\"systemUUID\\\":\\\"77547a65-c048-47ee-89b1-8422fc81b7aa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:16Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:16 crc kubenswrapper[5004]: I1201 08:18:16.010425 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:16 crc kubenswrapper[5004]: I1201 08:18:16.010482 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:16 crc kubenswrapper[5004]: I1201 08:18:16.010500 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:16 crc kubenswrapper[5004]: I1201 08:18:16.010524 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:16 crc kubenswrapper[5004]: I1201 08:18:16.010540 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:16Z","lastTransitionTime":"2025-12-01T08:18:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:16 crc kubenswrapper[5004]: E1201 08:18:16.029802 5004 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:18:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:18:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:18:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:18:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:18:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:18:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:18:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:18:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[… image list identical to previous status patch …],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c8341267-df98-484f-a5e7-cf024fab437c\\\",\\\"systemUUID\\\":\\\"77547a65-c048-47ee-89b1-8422fc81b7aa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:16Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:16 crc kubenswrapper[5004]: I1201 08:18:16.034594 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:16 crc kubenswrapper[5004]: I1201 08:18:16.034645 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:16 crc kubenswrapper[5004]: I1201 08:18:16.034661 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:16 crc kubenswrapper[5004]: I1201 08:18:16.034681 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:16 crc kubenswrapper[5004]: I1201 08:18:16.034698 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:16Z","lastTransitionTime":"2025-12-01T08:18:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:16 crc kubenswrapper[5004]: E1201 08:18:16.051733 5004 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:18:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:18:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:18:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:18:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:18:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:18:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:18:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:18:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[… image list identical to previous status patch, truncated at chunk boundary …
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c8341267-df98-484f-a5e7-cf024fab437c\\\",\\\"systemUUID\\\":\\\"77547a65-c048-47ee-89b1-8422fc81b7aa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:16Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:16 crc kubenswrapper[5004]: I1201 08:18:16.055928 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:16 crc kubenswrapper[5004]: I1201 08:18:16.055973 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:16 crc kubenswrapper[5004]: I1201 08:18:16.055984 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:16 crc kubenswrapper[5004]: I1201 08:18:16.056005 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:16 crc kubenswrapper[5004]: I1201 08:18:16.056016 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:16Z","lastTransitionTime":"2025-12-01T08:18:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:16 crc kubenswrapper[5004]: E1201 08:18:16.068303 5004 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:18:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:18:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:18:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:18:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:18:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:18:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:18:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:18:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c8341267-df98-484f-a5e7-cf024fab437c\\\",\\\"systemUUID\\\":\\\"77547a65-c048-47ee-89b1-8422fc81b7aa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:16Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:16 crc kubenswrapper[5004]: E1201 08:18:16.068533 5004 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 01 08:18:16 crc kubenswrapper[5004]: I1201 08:18:16.070509 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:16 crc kubenswrapper[5004]: I1201 08:18:16.070606 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:16 crc kubenswrapper[5004]: I1201 08:18:16.070624 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:16 crc kubenswrapper[5004]: I1201 08:18:16.070648 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:16 crc kubenswrapper[5004]: I1201 08:18:16.070664 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:16Z","lastTransitionTime":"2025-12-01T08:18:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:16 crc kubenswrapper[5004]: I1201 08:18:16.174219 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:16 crc kubenswrapper[5004]: I1201 08:18:16.174287 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:16 crc kubenswrapper[5004]: I1201 08:18:16.174303 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:16 crc kubenswrapper[5004]: I1201 08:18:16.174327 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:16 crc kubenswrapper[5004]: I1201 08:18:16.174353 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:16Z","lastTransitionTime":"2025-12-01T08:18:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:16 crc kubenswrapper[5004]: I1201 08:18:16.278145 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:16 crc kubenswrapper[5004]: I1201 08:18:16.278204 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:16 crc kubenswrapper[5004]: I1201 08:18:16.278221 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:16 crc kubenswrapper[5004]: I1201 08:18:16.278244 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:16 crc kubenswrapper[5004]: I1201 08:18:16.278261 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:16Z","lastTransitionTime":"2025-12-01T08:18:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:16 crc kubenswrapper[5004]: I1201 08:18:16.381504 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:16 crc kubenswrapper[5004]: I1201 08:18:16.381591 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:16 crc kubenswrapper[5004]: I1201 08:18:16.381645 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:16 crc kubenswrapper[5004]: I1201 08:18:16.381670 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:16 crc kubenswrapper[5004]: I1201 08:18:16.381687 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:16Z","lastTransitionTime":"2025-12-01T08:18:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:16 crc kubenswrapper[5004]: I1201 08:18:16.484795 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:16 crc kubenswrapper[5004]: I1201 08:18:16.484864 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:16 crc kubenswrapper[5004]: I1201 08:18:16.484884 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:16 crc kubenswrapper[5004]: I1201 08:18:16.484909 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:16 crc kubenswrapper[5004]: I1201 08:18:16.484926 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:16Z","lastTransitionTime":"2025-12-01T08:18:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:16 crc kubenswrapper[5004]: I1201 08:18:16.587881 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:16 crc kubenswrapper[5004]: I1201 08:18:16.587925 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:16 crc kubenswrapper[5004]: I1201 08:18:16.587937 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:16 crc kubenswrapper[5004]: I1201 08:18:16.587953 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:16 crc kubenswrapper[5004]: I1201 08:18:16.587966 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:16Z","lastTransitionTime":"2025-12-01T08:18:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:16 crc kubenswrapper[5004]: I1201 08:18:16.691198 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:16 crc kubenswrapper[5004]: I1201 08:18:16.691253 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:16 crc kubenswrapper[5004]: I1201 08:18:16.691270 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:16 crc kubenswrapper[5004]: I1201 08:18:16.691290 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:16 crc kubenswrapper[5004]: I1201 08:18:16.691306 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:16Z","lastTransitionTime":"2025-12-01T08:18:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:18:16 crc kubenswrapper[5004]: I1201 08:18:16.758408 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7cl5l" Dec 01 08:18:16 crc kubenswrapper[5004]: E1201 08:18:16.758655 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7cl5l" podUID="b488f4f3-d385-4d40-bdee-96d8fe2d42a1" Dec 01 08:18:16 crc kubenswrapper[5004]: I1201 08:18:16.794344 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:16 crc kubenswrapper[5004]: I1201 08:18:16.794410 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:16 crc kubenswrapper[5004]: I1201 08:18:16.794428 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:16 crc kubenswrapper[5004]: I1201 08:18:16.794454 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:16 crc kubenswrapper[5004]: I1201 08:18:16.794474 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:16Z","lastTransitionTime":"2025-12-01T08:18:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:16 crc kubenswrapper[5004]: I1201 08:18:16.897701 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:16 crc kubenswrapper[5004]: I1201 08:18:16.897806 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:16 crc kubenswrapper[5004]: I1201 08:18:16.897829 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:16 crc kubenswrapper[5004]: I1201 08:18:16.897891 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:16 crc kubenswrapper[5004]: I1201 08:18:16.897913 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:16Z","lastTransitionTime":"2025-12-01T08:18:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:17 crc kubenswrapper[5004]: I1201 08:18:17.003448 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:17 crc kubenswrapper[5004]: I1201 08:18:17.003816 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:17 crc kubenswrapper[5004]: I1201 08:18:17.003908 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:17 crc kubenswrapper[5004]: I1201 08:18:17.003945 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:17 crc kubenswrapper[5004]: I1201 08:18:17.004930 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:17Z","lastTransitionTime":"2025-12-01T08:18:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:17 crc kubenswrapper[5004]: I1201 08:18:17.108257 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:17 crc kubenswrapper[5004]: I1201 08:18:17.108360 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:17 crc kubenswrapper[5004]: I1201 08:18:17.108382 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:17 crc kubenswrapper[5004]: I1201 08:18:17.108414 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:17 crc kubenswrapper[5004]: I1201 08:18:17.108437 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:17Z","lastTransitionTime":"2025-12-01T08:18:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:17 crc kubenswrapper[5004]: I1201 08:18:17.212183 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:17 crc kubenswrapper[5004]: I1201 08:18:17.212236 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:17 crc kubenswrapper[5004]: I1201 08:18:17.212295 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:17 crc kubenswrapper[5004]: I1201 08:18:17.212317 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:17 crc kubenswrapper[5004]: I1201 08:18:17.212334 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:17Z","lastTransitionTime":"2025-12-01T08:18:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:17 crc kubenswrapper[5004]: I1201 08:18:17.315258 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:17 crc kubenswrapper[5004]: I1201 08:18:17.315324 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:17 crc kubenswrapper[5004]: I1201 08:18:17.315341 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:17 crc kubenswrapper[5004]: I1201 08:18:17.315366 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:17 crc kubenswrapper[5004]: I1201 08:18:17.315383 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:17Z","lastTransitionTime":"2025-12-01T08:18:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:17 crc kubenswrapper[5004]: I1201 08:18:17.417941 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:17 crc kubenswrapper[5004]: I1201 08:18:17.417995 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:17 crc kubenswrapper[5004]: I1201 08:18:17.418007 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:17 crc kubenswrapper[5004]: I1201 08:18:17.418030 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:17 crc kubenswrapper[5004]: I1201 08:18:17.418042 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:17Z","lastTransitionTime":"2025-12-01T08:18:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:17 crc kubenswrapper[5004]: I1201 08:18:17.521850 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:17 crc kubenswrapper[5004]: I1201 08:18:17.521919 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:17 crc kubenswrapper[5004]: I1201 08:18:17.521937 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:17 crc kubenswrapper[5004]: I1201 08:18:17.521965 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:17 crc kubenswrapper[5004]: I1201 08:18:17.521984 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:17Z","lastTransitionTime":"2025-12-01T08:18:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:17 crc kubenswrapper[5004]: I1201 08:18:17.624496 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:17 crc kubenswrapper[5004]: I1201 08:18:17.624592 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:17 crc kubenswrapper[5004]: I1201 08:18:17.624613 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:17 crc kubenswrapper[5004]: I1201 08:18:17.624638 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:17 crc kubenswrapper[5004]: I1201 08:18:17.624655 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:17Z","lastTransitionTime":"2025-12-01T08:18:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:17 crc kubenswrapper[5004]: I1201 08:18:17.732954 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:17 crc kubenswrapper[5004]: I1201 08:18:17.733057 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:17 crc kubenswrapper[5004]: I1201 08:18:17.733177 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:17 crc kubenswrapper[5004]: I1201 08:18:17.733249 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:17 crc kubenswrapper[5004]: I1201 08:18:17.733273 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:17Z","lastTransitionTime":"2025-12-01T08:18:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:18:17 crc kubenswrapper[5004]: I1201 08:18:17.757974 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 08:18:17 crc kubenswrapper[5004]: I1201 08:18:17.757996 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 08:18:17 crc kubenswrapper[5004]: I1201 08:18:17.758040 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 08:18:17 crc kubenswrapper[5004]: E1201 08:18:17.758153 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 08:18:17 crc kubenswrapper[5004]: E1201 08:18:17.758305 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 08:18:17 crc kubenswrapper[5004]: E1201 08:18:17.758597 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 08:18:17 crc kubenswrapper[5004]: I1201 08:18:17.837313 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:17 crc kubenswrapper[5004]: I1201 08:18:17.837378 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:17 crc kubenswrapper[5004]: I1201 08:18:17.837398 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:17 crc kubenswrapper[5004]: I1201 08:18:17.837423 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:17 crc kubenswrapper[5004]: I1201 08:18:17.837442 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:17Z","lastTransitionTime":"2025-12-01T08:18:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:17 crc kubenswrapper[5004]: I1201 08:18:17.940140 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:17 crc kubenswrapper[5004]: I1201 08:18:17.940204 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:17 crc kubenswrapper[5004]: I1201 08:18:17.940221 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:17 crc kubenswrapper[5004]: I1201 08:18:17.940247 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:17 crc kubenswrapper[5004]: I1201 08:18:17.940264 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:17Z","lastTransitionTime":"2025-12-01T08:18:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:18 crc kubenswrapper[5004]: I1201 08:18:18.043478 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:18 crc kubenswrapper[5004]: I1201 08:18:18.043553 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:18 crc kubenswrapper[5004]: I1201 08:18:18.043610 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:18 crc kubenswrapper[5004]: I1201 08:18:18.043640 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:18 crc kubenswrapper[5004]: I1201 08:18:18.043705 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:18Z","lastTransitionTime":"2025-12-01T08:18:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:18 crc kubenswrapper[5004]: I1201 08:18:18.146734 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:18 crc kubenswrapper[5004]: I1201 08:18:18.146794 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:18 crc kubenswrapper[5004]: I1201 08:18:18.146818 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:18 crc kubenswrapper[5004]: I1201 08:18:18.146845 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:18 crc kubenswrapper[5004]: I1201 08:18:18.146867 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:18Z","lastTransitionTime":"2025-12-01T08:18:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:18 crc kubenswrapper[5004]: I1201 08:18:18.249777 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:18 crc kubenswrapper[5004]: I1201 08:18:18.249831 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:18 crc kubenswrapper[5004]: I1201 08:18:18.249846 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:18 crc kubenswrapper[5004]: I1201 08:18:18.249870 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:18 crc kubenswrapper[5004]: I1201 08:18:18.249887 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:18Z","lastTransitionTime":"2025-12-01T08:18:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:18 crc kubenswrapper[5004]: I1201 08:18:18.352940 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:18 crc kubenswrapper[5004]: I1201 08:18:18.353001 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:18 crc kubenswrapper[5004]: I1201 08:18:18.353012 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:18 crc kubenswrapper[5004]: I1201 08:18:18.353033 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:18 crc kubenswrapper[5004]: I1201 08:18:18.353047 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:18Z","lastTransitionTime":"2025-12-01T08:18:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:18 crc kubenswrapper[5004]: I1201 08:18:18.456091 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:18 crc kubenswrapper[5004]: I1201 08:18:18.456179 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:18 crc kubenswrapper[5004]: I1201 08:18:18.456209 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:18 crc kubenswrapper[5004]: I1201 08:18:18.456241 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:18 crc kubenswrapper[5004]: I1201 08:18:18.456262 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:18Z","lastTransitionTime":"2025-12-01T08:18:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:18 crc kubenswrapper[5004]: I1201 08:18:18.559618 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:18 crc kubenswrapper[5004]: I1201 08:18:18.559684 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:18 crc kubenswrapper[5004]: I1201 08:18:18.559701 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:18 crc kubenswrapper[5004]: I1201 08:18:18.559726 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:18 crc kubenswrapper[5004]: I1201 08:18:18.559756 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:18Z","lastTransitionTime":"2025-12-01T08:18:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:18 crc kubenswrapper[5004]: I1201 08:18:18.662879 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:18 crc kubenswrapper[5004]: I1201 08:18:18.662947 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:18 crc kubenswrapper[5004]: I1201 08:18:18.662968 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:18 crc kubenswrapper[5004]: I1201 08:18:18.662997 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:18 crc kubenswrapper[5004]: I1201 08:18:18.663018 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:18Z","lastTransitionTime":"2025-12-01T08:18:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:18:18 crc kubenswrapper[5004]: I1201 08:18:18.758506 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7cl5l" Dec 01 08:18:18 crc kubenswrapper[5004]: E1201 08:18:18.758762 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7cl5l" podUID="b488f4f3-d385-4d40-bdee-96d8fe2d42a1" Dec 01 08:18:18 crc kubenswrapper[5004]: I1201 08:18:18.765886 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:18 crc kubenswrapper[5004]: I1201 08:18:18.765951 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:18 crc kubenswrapper[5004]: I1201 08:18:18.765974 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:18 crc kubenswrapper[5004]: I1201 08:18:18.766002 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:18 crc kubenswrapper[5004]: I1201 08:18:18.766025 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:18Z","lastTransitionTime":"2025-12-01T08:18:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:18 crc kubenswrapper[5004]: I1201 08:18:18.869011 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:18 crc kubenswrapper[5004]: I1201 08:18:18.869077 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:18 crc kubenswrapper[5004]: I1201 08:18:18.869094 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:18 crc kubenswrapper[5004]: I1201 08:18:18.869122 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:18 crc kubenswrapper[5004]: I1201 08:18:18.869144 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:18Z","lastTransitionTime":"2025-12-01T08:18:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:18 crc kubenswrapper[5004]: I1201 08:18:18.971857 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:18 crc kubenswrapper[5004]: I1201 08:18:18.971936 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:18 crc kubenswrapper[5004]: I1201 08:18:18.971953 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:18 crc kubenswrapper[5004]: I1201 08:18:18.971983 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:18 crc kubenswrapper[5004]: I1201 08:18:18.972005 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:18Z","lastTransitionTime":"2025-12-01T08:18:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:19 crc kubenswrapper[5004]: I1201 08:18:19.074802 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:19 crc kubenswrapper[5004]: I1201 08:18:19.074861 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:19 crc kubenswrapper[5004]: I1201 08:18:19.074882 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:19 crc kubenswrapper[5004]: I1201 08:18:19.074910 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:19 crc kubenswrapper[5004]: I1201 08:18:19.074932 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:19Z","lastTransitionTime":"2025-12-01T08:18:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:19 crc kubenswrapper[5004]: I1201 08:18:19.176965 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:19 crc kubenswrapper[5004]: I1201 08:18:19.177027 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:19 crc kubenswrapper[5004]: I1201 08:18:19.177049 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:19 crc kubenswrapper[5004]: I1201 08:18:19.177076 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:19 crc kubenswrapper[5004]: I1201 08:18:19.177096 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:19Z","lastTransitionTime":"2025-12-01T08:18:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:19 crc kubenswrapper[5004]: I1201 08:18:19.279514 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:19 crc kubenswrapper[5004]: I1201 08:18:19.279599 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:19 crc kubenswrapper[5004]: I1201 08:18:19.279617 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:19 crc kubenswrapper[5004]: I1201 08:18:19.279641 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:19 crc kubenswrapper[5004]: I1201 08:18:19.279696 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:19Z","lastTransitionTime":"2025-12-01T08:18:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:19 crc kubenswrapper[5004]: I1201 08:18:19.382362 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:19 crc kubenswrapper[5004]: I1201 08:18:19.382413 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:19 crc kubenswrapper[5004]: I1201 08:18:19.382424 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:19 crc kubenswrapper[5004]: I1201 08:18:19.382443 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:19 crc kubenswrapper[5004]: I1201 08:18:19.382454 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:19Z","lastTransitionTime":"2025-12-01T08:18:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:19 crc kubenswrapper[5004]: I1201 08:18:19.496540 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:19 crc kubenswrapper[5004]: I1201 08:18:19.496632 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:19 crc kubenswrapper[5004]: I1201 08:18:19.496656 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:19 crc kubenswrapper[5004]: I1201 08:18:19.496687 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:19 crc kubenswrapper[5004]: I1201 08:18:19.496710 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:19Z","lastTransitionTime":"2025-12-01T08:18:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:19 crc kubenswrapper[5004]: I1201 08:18:19.599998 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:19 crc kubenswrapper[5004]: I1201 08:18:19.600053 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:19 crc kubenswrapper[5004]: I1201 08:18:19.600075 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:19 crc kubenswrapper[5004]: I1201 08:18:19.600102 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:19 crc kubenswrapper[5004]: I1201 08:18:19.600124 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:19Z","lastTransitionTime":"2025-12-01T08:18:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:19 crc kubenswrapper[5004]: I1201 08:18:19.702439 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:19 crc kubenswrapper[5004]: I1201 08:18:19.702490 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:19 crc kubenswrapper[5004]: I1201 08:18:19.702505 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:19 crc kubenswrapper[5004]: I1201 08:18:19.702526 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:19 crc kubenswrapper[5004]: I1201 08:18:19.702543 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:19Z","lastTransitionTime":"2025-12-01T08:18:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:18:19 crc kubenswrapper[5004]: I1201 08:18:19.758490 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 08:18:19 crc kubenswrapper[5004]: I1201 08:18:19.758528 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 08:18:19 crc kubenswrapper[5004]: E1201 08:18:19.758888 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 08:18:19 crc kubenswrapper[5004]: I1201 08:18:19.758931 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 08:18:19 crc kubenswrapper[5004]: E1201 08:18:19.759090 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 08:18:19 crc kubenswrapper[5004]: E1201 08:18:19.759200 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 08:18:19 crc kubenswrapper[5004]: I1201 08:18:19.760315 5004 scope.go:117] "RemoveContainer" containerID="76c127e5ff47d030a709c79b7c7f2c2d3b32009052ff95ff522d90ecd86cf993" Dec 01 08:18:19 crc kubenswrapper[5004]: I1201 08:18:19.805388 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:19 crc kubenswrapper[5004]: I1201 08:18:19.805751 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:19 crc kubenswrapper[5004]: I1201 08:18:19.805770 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:19 crc kubenswrapper[5004]: I1201 08:18:19.805790 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:19 crc kubenswrapper[5004]: I1201 08:18:19.805806 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:19Z","lastTransitionTime":"2025-12-01T08:18:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:19 crc kubenswrapper[5004]: I1201 08:18:19.908919 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:19 crc kubenswrapper[5004]: I1201 08:18:19.908983 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:19 crc kubenswrapper[5004]: I1201 08:18:19.909001 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:19 crc kubenswrapper[5004]: I1201 08:18:19.909028 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:19 crc kubenswrapper[5004]: I1201 08:18:19.909046 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:19Z","lastTransitionTime":"2025-12-01T08:18:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:20 crc kubenswrapper[5004]: I1201 08:18:20.011330 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:20 crc kubenswrapper[5004]: I1201 08:18:20.011405 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:20 crc kubenswrapper[5004]: I1201 08:18:20.011429 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:20 crc kubenswrapper[5004]: I1201 08:18:20.011460 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:20 crc kubenswrapper[5004]: I1201 08:18:20.011488 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:20Z","lastTransitionTime":"2025-12-01T08:18:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:20 crc kubenswrapper[5004]: I1201 08:18:20.113806 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:20 crc kubenswrapper[5004]: I1201 08:18:20.113863 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:20 crc kubenswrapper[5004]: I1201 08:18:20.113881 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:20 crc kubenswrapper[5004]: I1201 08:18:20.113905 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:20 crc kubenswrapper[5004]: I1201 08:18:20.113923 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:20Z","lastTransitionTime":"2025-12-01T08:18:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:20 crc kubenswrapper[5004]: I1201 08:18:20.219086 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:20 crc kubenswrapper[5004]: I1201 08:18:20.219125 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:20 crc kubenswrapper[5004]: I1201 08:18:20.219135 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:20 crc kubenswrapper[5004]: I1201 08:18:20.219151 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:20 crc kubenswrapper[5004]: I1201 08:18:20.219161 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:20Z","lastTransitionTime":"2025-12-01T08:18:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:20 crc kubenswrapper[5004]: I1201 08:18:20.320452 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:20 crc kubenswrapper[5004]: I1201 08:18:20.320505 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:20 crc kubenswrapper[5004]: I1201 08:18:20.320524 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:20 crc kubenswrapper[5004]: I1201 08:18:20.320548 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:20 crc kubenswrapper[5004]: I1201 08:18:20.320592 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:20Z","lastTransitionTime":"2025-12-01T08:18:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:20 crc kubenswrapper[5004]: I1201 08:18:20.320764 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-knmdv_15cdec0a-5925-4966-a30b-f60c503f633e/ovnkube-controller/2.log" Dec 01 08:18:20 crc kubenswrapper[5004]: I1201 08:18:20.323599 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" event={"ID":"15cdec0a-5925-4966-a30b-f60c503f633e","Type":"ContainerStarted","Data":"06e08a797d0a4810bb4314a26c9ecefd52e0bd2f04615b2bd39c4ccf951a33c9"} Dec 01 08:18:20 crc kubenswrapper[5004]: I1201 08:18:20.324226 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" Dec 01 08:18:20 crc kubenswrapper[5004]: I1201 08:18:20.338180 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9977ebb-82de-4e96-8763-0b5a84f8d4ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://baf84c2e
ea7167eed95a4195d37bf784ffef310bc2079d8d06e1f47cb45a7864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pwgxq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c1e51fa92aec80e93d0fce11c743b8c4fde23fc23d8626e8d5b3e25ab42350d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pwgxq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\
\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fvdgt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:20Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:20 crc kubenswrapper[5004]: I1201 08:18:20.351227 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7cl5l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b488f4f3-d385-4d40-bdee-96d8fe2d42a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-glbhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-glbhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:37Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7cl5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:20Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:20 crc 
kubenswrapper[5004]: I1201 08:18:20.379111 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43c9f762-d403-49b9-8294-d66976aca4dd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b358f641ff09bc8f4a452b0d4a8cf5f9790182f76779be31426d48d0278b0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://d610c3a799d4a7df9ddfe2a28f2ced2e5ac4e96c67c93f8e6d3a0eee60d9ac48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43f9f72af3273da055eeb1e13ad6258a1b3b6b205402861beee9f87f02230297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38e3cc2eca559417c71beed561569c4dc5b45ffb8407976e0695fc1b93219d37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c24fbeedbc7c40b6079b7ad89d8564a4f2cdaad2cf996e66682b082382da190\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00e163cd374befc8b8c6cdfe3916235cd63280474c2a09eeebf108fb75d378a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00e163cd374befc8b8c6cdfe3916235cd63280474c2a09eeebf108fb75d378a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37376e093170ad4138e102e3a1e2c2b45b3944bf36b0d77dce126bd493d60803\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37376e093170ad4138e102e3a1e2c2b45b3944bf36b0d77dce126bd493d60803\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e4db8d673550f14f3614e47dc0bf81e0920753be13fb01c7ef0d6bd3716611f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4db8d673550f14f3614e47dc0bf81e0920753be13fb01c7ef0d6bd3716611f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:20Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:20 crc kubenswrapper[5004]: I1201 08:18:20.394102 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:20Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:20 crc kubenswrapper[5004]: I1201 08:18:20.409663 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jjms6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8828af41-beeb-47dd-96cf-3dbcb5175893\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b22220849cdb91592a44a67f9f6a09586146009483f46a0505a2dea3b02bccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvsch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jjms6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:20Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:20 crc kubenswrapper[5004]: I1201 08:18:20.424121 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:20 crc kubenswrapper[5004]: I1201 08:18:20.424191 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:20 crc kubenswrapper[5004]: I1201 08:18:20.424215 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:20 crc kubenswrapper[5004]: I1201 08:18:20.424245 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:20 crc kubenswrapper[5004]: I1201 08:18:20.424270 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:20Z","lastTransitionTime":"2025-12-01T08:18:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:20 crc kubenswrapper[5004]: I1201 08:18:20.427913 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2fa377e-a3a6-46ce-af23-e677799f0115\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b520771cf1ad6f9360064c5f304243c5878a2f1025c87d1001c97b2d5f95cb9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b65b9bc444
101900c62fd90c7d45789d186a89148ac0fd9c7f0c6048c3f55ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edbd3e682d207b8c18d6bcbc26aa2adae2a35645502d31b327ffddd51991e842\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4563974ed1467a87aab104756828e63853c282a61c4fcab1e757eb4da9afaa40\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:20Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:20 crc kubenswrapper[5004]: I1201 08:18:20.450509 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dpkxw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddcaf111-708a-45b3-a342-effd3061ab17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68ecba515f05ca83fdd0cdda10e3e5925a146aadb70ae17859586c12daf55dac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd2f03a8020818d1cc98b6eeaf64a728e627a3401400c3af53654f8771ef02eb\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd2f03a8020818d1cc98b6eeaf64a728e627a3401400c3af53654f8771ef02eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8dd31c10eaf6bc673dec377a884040524418dd1891a96b2eb6e6e983979512a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8dd31c10eaf6bc673dec377a884040524418dd1891a96b2eb6e6e983979512a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a7a8ef66a9486ca3d13826df7aa39db2083cff2c386b4d0d858e0526e0f094d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a7a8ef66a9486ca3d13826df7aa39db2083cff2c386b4d0d858e0526e0f094d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd1c4
7c976c17bf17e4d660288ce30472372b6a7f33f6ffa6d96c1a4436275d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd1c47c976c17bf17e4d660288ce30472372b6a7f33f6ffa6d96c1a4436275d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b7c8e9648e5514ab1fcb63dc72822dd276a2d74987e2fb4f4800b26ec176be7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b7c8e9648e5514ab1fcb63dc72822dd276a2d74987e2fb4f4800b26ec176be7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:28Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d7115db7fe248437f4b8918f06c9aa48b141fcd3f31add80faee81b4347e47b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d7115db7fe248437f4b8918f06c9aa48b141fcd3f31add80faee81b4347e47b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dpkxw\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:20Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:20 crc kubenswrapper[5004]: I1201 08:18:20.465250 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ww6lq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72500ffd-4ca3-4614-a3a2-bbdc5a7506c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://668cfaa2af20e2bb082fc47e1702cfd9f704c4fdf56a4d27cf25d6915e7cd18a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2025-12-01T08:17:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmsvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ww6lq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:20Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:20 crc kubenswrapper[5004]: I1201 08:18:20.481139 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4ee478f-9254-4e56-96be-f5a83ff5d77c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37438df65f4dab6700f193e84f81d8ed41b3c208c3f6150bd5836c0642190572\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f873da9c594c885720113b0d0fc01552050d030e802074043e9ede174fd9b16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ca757be16a55ba8df0e9629f7cc2653e2804a5ee5a2151ee3b07d2c30fe5b07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3930a1c88c6014beb768ed20a37359ade08e77496a86b913f68c6196134f13a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://f3930a1c88c6014beb768ed20a37359ade08e77496a86b913f68c6196134f13a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:02Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:20Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:20 crc kubenswrapper[5004]: I1201 08:18:20.500705 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0ed2b6d-0c61-4639-bc3b-1c8effc4815d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45a0d7a12e7c114de1740a151e04f7f39844b709740cab958f156ac918af3228\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d69a370af9f8f6c575f99da6bb01cee0f660dccb7e85f16f5d8874d0e263e1fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c82baee0fd931fd7b41cd5a73ca7e653ef13dfa3d573991cbe3238d6b95f6fac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa5f341e99100e985f65c24647d976a24782dfae7f52e752b774f41f69af27fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad581564fc519328ae429bce757797f6d87d8648e862b008637a245866ec4cbe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T08:17:20Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 08:17:15.185125 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 08:17:15.186793 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4071633747/tls.crt::/tmp/serving-cert-4071633747/tls.key\\\\\\\"\\\\nI1201 08:17:20.827614 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 08:17:20.834528 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 08:17:20.834549 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 08:17:20.837611 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 08:17:20.837630 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 08:17:20.848940 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 08:17:20.848957 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 08:17:20.848961 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 08:17:20.848965 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 08:17:20.848968 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 08:17:20.848970 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 08:17:20.848973 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 08:17:20.849115 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1201 08:17:20.854990 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2caeb2ebaf68110cb8728e60aae6db1b1fc48efae6018f66070766a4f42caa69\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2276a185fefa99f4e9fe63b7acce901ce2eb168f71a65f5fa97c1892aebce0e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2276a185fefa99f4e9fe63b7acce901ce
2eb168f71a65f5fa97c1892aebce0e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:20Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:20 crc kubenswrapper[5004]: I1201 08:18:20.519297 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:20Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:20 crc kubenswrapper[5004]: I1201 08:18:20.527462 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:20 crc kubenswrapper[5004]: I1201 08:18:20.527521 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:20 crc kubenswrapper[5004]: I1201 08:18:20.527544 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:20 crc kubenswrapper[5004]: I1201 
08:18:20.527600 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:20 crc kubenswrapper[5004]: I1201 08:18:20.527628 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:20Z","lastTransitionTime":"2025-12-01T08:18:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:18:20 crc kubenswrapper[5004]: I1201 08:18:20.538197 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03c470edd984db3b756e031e33f64a9bc5ca6b9c156ab479e5cd647745655a5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\
\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:20Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:20 crc kubenswrapper[5004]: I1201 08:18:20.556623 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fd2f2b5d7dc3d7b8e1bb06fdf0eec3addfec3c230e044b6395d6cf535630ebd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b2250161e400b201f0528e33231c9007342002a37cd037a1b1e011f2aaaa22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:20Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:20 crc kubenswrapper[5004]: I1201 08:18:20.572915 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zjksw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"70e79009-93be-49c4-a6b3-e8a06bcea7f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:18:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:18:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://862dd7caaf04a01f96f2dba70cd2226e85b55f172ee6a34c178f756a75832a08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad3ea204ad731500a340e639f0e36abe28ba50464f1b2ff9b009e39c234cf708\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T08:18:10Z\\\",\\\"message\\\":\\\"2025-12-01T08:17:24+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_50c6dd0c-b888-41cb-a4f6-d1199cf62aeb\\\\n2025-12-01T08:17:24+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_50c6dd0c-b888-41cb-a4f6-d1199cf62aeb to /host/opt/cni/bin/\\\\n2025-12-01T08:17:25Z [verbose] multus-daemon started\\\\n2025-12-01T08:17:25Z [verbose] 
Readiness Indicator file check\\\\n2025-12-01T08:18:10Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:18:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd8m6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zjksw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:20Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:20 crc kubenswrapper[5004]: I1201 08:18:20.596203 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15cdec0a-5925-4966-a30b-f60c503f633e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f528b94ff12c7a804930dca4b93c23334a9ae3a53317750cd34b5695f08b4582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f001b70a83da28b1ea78a94f1cd70dfc63b3de5a21b041932d33c2ac970c986b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7c5c043438b515e37a6d6d5c47b92479bbb7322fe35c3a1bc57ee7b3a746544\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23df33e45cc27e4d343ae6ec3fb88f2dfc125ba7da424515d4bee5ec19281832\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4a7cab8e1594814a687b1433a92642a19eff7c2eb12f96fa06f08a2e270733d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1844a18a6109c25a25b9c8cb3ff65e69cc657f66be03dd97e883d24a774b7638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06e08a797d0a4810bb4314a26c9ecefd52e0bd2f04615b2bd39c4ccf951a33c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76c127e5ff47d030a709c79b7c7f2c2d3b32009052ff95ff522d90ecd86cf993\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T08:17:52Z\\\",\\\"message\\\":\\\"}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, 
Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1201 08:17:52.744847 6628 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-marketplace/community-operators]} name:Service_openshift-marketplace/community-operators_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.189:50051:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d389393c-7ba9-422c-b3f5-06e391d537d2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1201 08:17:52.746408 6628 model_client.go:382] Update operations generated as: [{Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.59 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {dce28c51-c9f1-478b-97c8-7e209d6e7cbe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1201 08:17:52.746269 6628 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:51Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:18:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"nam
e\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f231e0092e1a85cc5d6e00fb8d3bd982223b670191ac3dc21d81ff1fab462a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97c9fab31c2655b5d579a2ed9b837229ee678973643c6a526c1241e3c84e315c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97c9fab31c2655b5d579a2ed9b837229ee678973643c6a526c1241e3c84e315c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-knmdv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:20Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:20 crc kubenswrapper[5004]: I1201 08:18:20.617357 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59844f9c-e37f-4c08-986d-b0b0dd2870ee\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8561c15ad57f27642507b1cf97c865989aea032f1d31998d18efb066baa9c283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://488ed5075af94d953143c09a2bac601fe1af57a34197f91965f22a1028ebd2b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://488ed5075af94d953143c09a2bac601fe1af57a34197f91965f22a1028ebd2b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:20Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:20 crc kubenswrapper[5004]: I1201 08:18:20.629651 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:20 crc kubenswrapper[5004]: I1201 08:18:20.629697 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:20 crc kubenswrapper[5004]: I1201 08:18:20.629714 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:20 crc kubenswrapper[5004]: I1201 08:18:20.629736 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:20 crc kubenswrapper[5004]: I1201 08:18:20.629753 5004 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:20Z","lastTransitionTime":"2025-12-01T08:18:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:18:20 crc kubenswrapper[5004]: I1201 08:18:20.631554 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://327477e7db10129d47ddf9078f32d0df2ffd4b8e59fc7bb77c17c60c63e9a936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:20Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:20 crc kubenswrapper[5004]: I1201 08:18:20.648429 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:20Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:20 crc kubenswrapper[5004]: I1201 08:18:20.661400 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mzsvz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"397b51b7-934a-41d1-a593-500a64161bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4adac5633dfe89777bf019818bab9ee3a208cad9d929c96cf2cb86b18c2d4264\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4h49g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d83844478080e4829975fb6c8e0444d9fdebd
44b08afcf45e7d0b04fc534a844\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4h49g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mzsvz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:20Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:20 crc kubenswrapper[5004]: I1201 08:18:20.732876 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:20 crc kubenswrapper[5004]: I1201 08:18:20.732937 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:20 crc kubenswrapper[5004]: I1201 08:18:20.732960 5004 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:20 crc kubenswrapper[5004]: I1201 08:18:20.732985 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:20 crc kubenswrapper[5004]: I1201 08:18:20.733003 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:20Z","lastTransitionTime":"2025-12-01T08:18:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:18:20 crc kubenswrapper[5004]: I1201 08:18:20.758412 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7cl5l" Dec 01 08:18:20 crc kubenswrapper[5004]: E1201 08:18:20.758688 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7cl5l" podUID="b488f4f3-d385-4d40-bdee-96d8fe2d42a1" Dec 01 08:18:20 crc kubenswrapper[5004]: I1201 08:18:20.835768 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:20 crc kubenswrapper[5004]: I1201 08:18:20.835833 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:20 crc kubenswrapper[5004]: I1201 08:18:20.835851 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:20 crc kubenswrapper[5004]: I1201 08:18:20.841668 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:20 crc kubenswrapper[5004]: I1201 08:18:20.841733 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:20Z","lastTransitionTime":"2025-12-01T08:18:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:20 crc kubenswrapper[5004]: I1201 08:18:20.945844 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:20 crc kubenswrapper[5004]: I1201 08:18:20.945927 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:20 crc kubenswrapper[5004]: I1201 08:18:20.945950 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:20 crc kubenswrapper[5004]: I1201 08:18:20.945982 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:20 crc kubenswrapper[5004]: I1201 08:18:20.946014 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:20Z","lastTransitionTime":"2025-12-01T08:18:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:21 crc kubenswrapper[5004]: I1201 08:18:21.048887 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:21 crc kubenswrapper[5004]: I1201 08:18:21.048937 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:21 crc kubenswrapper[5004]: I1201 08:18:21.048948 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:21 crc kubenswrapper[5004]: I1201 08:18:21.048967 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:21 crc kubenswrapper[5004]: I1201 08:18:21.048979 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:21Z","lastTransitionTime":"2025-12-01T08:18:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:21 crc kubenswrapper[5004]: I1201 08:18:21.152358 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:21 crc kubenswrapper[5004]: I1201 08:18:21.152432 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:21 crc kubenswrapper[5004]: I1201 08:18:21.152455 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:21 crc kubenswrapper[5004]: I1201 08:18:21.152490 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:21 crc kubenswrapper[5004]: I1201 08:18:21.152512 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:21Z","lastTransitionTime":"2025-12-01T08:18:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:21 crc kubenswrapper[5004]: I1201 08:18:21.254838 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:21 crc kubenswrapper[5004]: I1201 08:18:21.254919 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:21 crc kubenswrapper[5004]: I1201 08:18:21.254962 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:21 crc kubenswrapper[5004]: I1201 08:18:21.254992 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:21 crc kubenswrapper[5004]: I1201 08:18:21.255014 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:21Z","lastTransitionTime":"2025-12-01T08:18:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:21 crc kubenswrapper[5004]: I1201 08:18:21.329230 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-knmdv_15cdec0a-5925-4966-a30b-f60c503f633e/ovnkube-controller/3.log" Dec 01 08:18:21 crc kubenswrapper[5004]: I1201 08:18:21.330218 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-knmdv_15cdec0a-5925-4966-a30b-f60c503f633e/ovnkube-controller/2.log" Dec 01 08:18:21 crc kubenswrapper[5004]: I1201 08:18:21.334332 5004 generic.go:334] "Generic (PLEG): container finished" podID="15cdec0a-5925-4966-a30b-f60c503f633e" containerID="06e08a797d0a4810bb4314a26c9ecefd52e0bd2f04615b2bd39c4ccf951a33c9" exitCode=1 Dec 01 08:18:21 crc kubenswrapper[5004]: I1201 08:18:21.334394 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" event={"ID":"15cdec0a-5925-4966-a30b-f60c503f633e","Type":"ContainerDied","Data":"06e08a797d0a4810bb4314a26c9ecefd52e0bd2f04615b2bd39c4ccf951a33c9"} Dec 01 08:18:21 crc kubenswrapper[5004]: I1201 08:18:21.334454 5004 scope.go:117] "RemoveContainer" containerID="76c127e5ff47d030a709c79b7c7f2c2d3b32009052ff95ff522d90ecd86cf993" Dec 01 08:18:21 crc kubenswrapper[5004]: I1201 08:18:21.335434 5004 scope.go:117] "RemoveContainer" containerID="06e08a797d0a4810bb4314a26c9ecefd52e0bd2f04615b2bd39c4ccf951a33c9" Dec 01 08:18:21 crc kubenswrapper[5004]: E1201 08:18:21.335793 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-knmdv_openshift-ovn-kubernetes(15cdec0a-5925-4966-a30b-f60c503f633e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" podUID="15cdec0a-5925-4966-a30b-f60c503f633e" Dec 01 08:18:21 crc kubenswrapper[5004]: I1201 08:18:21.358045 5004 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:21Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:21 crc kubenswrapper[5004]: I1201 08:18:21.358475 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:21 crc kubenswrapper[5004]: I1201 08:18:21.358516 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:21 crc kubenswrapper[5004]: I1201 08:18:21.358531 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:21 crc kubenswrapper[5004]: I1201 08:18:21.358554 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:21 crc kubenswrapper[5004]: I1201 08:18:21.358603 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:21Z","lastTransitionTime":"2025-12-01T08:18:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:18:21 crc kubenswrapper[5004]: I1201 08:18:21.378652 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:21Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:21 crc kubenswrapper[5004]: I1201 08:18:21.401367 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03c470edd984db3b756e031e33f64a9bc5ca6b9c156ab479e5cd647745655a5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T08:18:21Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:21 crc kubenswrapper[5004]: I1201 08:18:21.423045 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fd2f2b5d7dc3d7b8e1bb06fdf0eec3addfec3c230e044b6395d6cf535630ebd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b2250161e400b
201f0528e33231c9007342002a37cd037a1b1e011f2aaaa22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:21Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:21 crc kubenswrapper[5004]: I1201 08:18:21.443600 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zjksw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"70e79009-93be-49c4-a6b3-e8a06bcea7f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:18:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:18:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://862dd7caaf04a01f96f2dba70cd2226e85b55f172ee6a34c178f756a75832a08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad3ea204ad731500a340e639f0e36abe28ba50464f1b2ff9b009e39c234cf708\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T08:18:10Z\\\",\\\"message\\\":\\\"2025-12-01T08:17:24+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_50c6dd0c-b888-41cb-a4f6-d1199cf62aeb\\\\n2025-12-01T08:17:24+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_50c6dd0c-b888-41cb-a4f6-d1199cf62aeb to /host/opt/cni/bin/\\\\n2025-12-01T08:17:25Z [verbose] multus-daemon started\\\\n2025-12-01T08:17:25Z [verbose] 
Readiness Indicator file check\\\\n2025-12-01T08:18:10Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:18:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd8m6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zjksw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:21Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:21 crc kubenswrapper[5004]: I1201 08:18:21.461800 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:21 crc kubenswrapper[5004]: I1201 08:18:21.461857 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:21 crc kubenswrapper[5004]: I1201 08:18:21.461874 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:21 crc kubenswrapper[5004]: I1201 08:18:21.461899 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:21 crc kubenswrapper[5004]: I1201 08:18:21.461917 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:21Z","lastTransitionTime":"2025-12-01T08:18:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:21 crc kubenswrapper[5004]: I1201 08:18:21.474869 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15cdec0a-5925-4966-a30b-f60c503f633e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f528b94ff12c7a804930dca4b93c23334a9ae3a53317750cd34b5695f08b4582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f001b70a83da28b1ea78a94f1cd70dfc63b3de5a21b041932d33c2ac970c986b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7c5c043438b515e37a6d6d5c47b92479bbb7322fe35c3a1bc57ee7b3a746544\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23df33e45cc27e4d343ae6ec3fb88f2dfc125ba7da424515d4bee5ec19281832\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4a7cab8e1594814a687b1433a92642a19eff7c2eb12f96fa06f08a2e270733d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1844a18a6109c25a25b9c8cb3ff65e69cc657f66be03dd97e883d24a774b7638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06e08a797d0a4810bb4314a26c9ecefd52e0bd2f04615b2bd39c4ccf951a33c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76c127e5ff47d030a709c79b7c7f2c2d3b32009052ff95ff522d90ecd86cf993\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T08:17:52Z\\\",\\\"message\\\":\\\"}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1201 08:17:52.744847 6628 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service 
k8s.ovn.org/owner:openshift-marketplace/community-operators]} name:Service_openshift-marketplace/community-operators_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.189:50051:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d389393c-7ba9-422c-b3f5-06e391d537d2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1201 08:17:52.746408 6628 model_client.go:382] Update operations generated as: [{Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.59 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {dce28c51-c9f1-478b-97c8-7e209d6e7cbe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1201 08:17:52.746269 6628 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:51Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06e08a797d0a4810bb4314a26c9ecefd52e0bd2f04615b2bd39c4ccf951a33c9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T08:18:20Z\\\",\\\"message\\\":\\\"ressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1201 08:18:20.712430 6978 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1201 08:18:20.712460 6978 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1201 08:18:20.712505 6978 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1201 08:18:20.712514 6978 
handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1201 08:18:20.712520 6978 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1201 08:18:20.712548 6978 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1201 08:18:20.712546 6978 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1201 08:18:20.712578 6978 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1201 08:18:20.712586 6978 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1201 08:18:20.712592 6978 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1201 08:18:20.712593 6978 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1201 08:18:20.712599 6978 factory.go:656] Stopping watch factory\\\\nI1201 08:18:20.712611 6978 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1201 08:18:20.712622 6978 ovnkube.go:599] Stopped ovnkube\\\\nI1201 08:18:20.712626 6978 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1201 
08:18:2\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T08:18:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":
\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f231e0092e1a85cc5d6e00fb8d3bd982223b670191ac3dc21d81ff1fab462a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97c9fab31c2655b5d579a2ed9b837229ee678973643c6a526c1241e3c84e315c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97c9fab31c2655b5d579a2ed9b837229ee678973643c6a526c1241e3c84e
315c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-knmdv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:21Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:21 crc kubenswrapper[5004]: I1201 08:18:21.492542 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59844f9c-e37f-4c08-986d-b0b0dd2870ee\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8561c15ad57f27642507b1cf97c865989aea032f1d31998d18efb066baa9c283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://488ed5075af94d953143c09a2bac601fe1af57a34197f91965f22a1028ebd2b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://488ed5075af94d953143c09a2bac601fe1af57a34197f91965f22a1028ebd2b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:21Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:21 crc kubenswrapper[5004]: I1201 08:18:21.513395 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://327477e7db10129d47ddf9078f32d0df2ffd4b8e59fc7bb77c17c60c63e9a936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:21Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:21 crc kubenswrapper[5004]: I1201 08:18:21.531880 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mzsvz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"397b51b7-934a-41d1-a593-500a64161bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4adac5633dfe89777bf019818bab9ee3a208cad9d929c96cf2cb86b18c2d4264\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plan
e-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4h49g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d83844478080e4829975fb6c8e0444d9fdebd44b08afcf45e7d0b04fc534a844\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4h49g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mzsvz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:21Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:21 crc kubenswrapper[5004]: I1201 08:18:21.546811 5004 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-dns/node-resolver-jjms6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8828af41-beeb-47dd-96cf-3dbcb5175893\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b22220849cdb91592a44a67f9f6a09586146009483f46a0505a2dea3b02bccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvsch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.
11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jjms6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:21Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:21 crc kubenswrapper[5004]: I1201 08:18:21.564492 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:21 crc kubenswrapper[5004]: I1201 08:18:21.564548 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:21 crc kubenswrapper[5004]: I1201 08:18:21.564614 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:21 crc kubenswrapper[5004]: I1201 08:18:21.564647 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:21 crc kubenswrapper[5004]: I1201 08:18:21.564667 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:21Z","lastTransitionTime":"2025-12-01T08:18:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:21 crc kubenswrapper[5004]: I1201 08:18:21.564721 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9977ebb-82de-4e96-8763-0b5a84f8d4ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://baf84c2eea7167eed95a4195d37bf784ffef310bc2079d8d06e1f47cb45a7864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pwgxq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c1e51fa92aec80e93d0fce11c743b8c4fde23fc23d8626e8d5b3e25ab42350d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pwgxq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fvdgt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:21Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:21 crc kubenswrapper[5004]: I1201 08:18:21.580647 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7cl5l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b488f4f3-d385-4d40-bdee-96d8fe2d42a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-glbhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-glbhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:37Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7cl5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:21Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:21 crc 
kubenswrapper[5004]: I1201 08:18:21.612870 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43c9f762-d403-49b9-8294-d66976aca4dd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b358f641ff09bc8f4a452b0d4a8cf5f9790182f76779be31426d48d0278b0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://d610c3a799d4a7df9ddfe2a28f2ced2e5ac4e96c67c93f8e6d3a0eee60d9ac48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43f9f72af3273da055eeb1e13ad6258a1b3b6b205402861beee9f87f02230297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38e3cc2eca559417c71beed561569c4dc5b45ffb8407976e0695fc1b93219d37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c24fbeedbc7c40b6079b7ad89d8564a4f2cdaad2cf996e66682b082382da190\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00e163cd374befc8b8c6cdfe3916235cd63280474c2a09eeebf108fb75d378a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00e163cd374befc8b8c6cdfe3916235cd63280474c2a09eeebf108fb75d378a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37376e093170ad4138e102e3a1e2c2b45b3944bf36b0d77dce126bd493d60803\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37376e093170ad4138e102e3a1e2c2b45b3944bf36b0d77dce126bd493d60803\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e4db8d673550f14f3614e47dc0bf81e0920753be13fb01c7ef0d6bd3716611f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4db8d673550f14f3614e47dc0bf81e0920753be13fb01c7ef0d6bd3716611f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:21Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:21 crc kubenswrapper[5004]: I1201 08:18:21.632197 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:21Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:21 crc kubenswrapper[5004]: I1201 08:18:21.647548 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ww6lq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72500ffd-4ca3-4614-a3a2-bbdc5a7506c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://668cfaa2af20e2bb082fc47e1702cfd9f704c4fdf56a4d27cf25d6915e7cd18a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmsvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ww6lq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:21Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:21 crc kubenswrapper[5004]: I1201 08:18:21.664404 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2fa377e-a3a6-46ce-af23-e677799f0115\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b520771cf1ad6f9360064c5f304243c5878a2f1025c87d1001c97b2d5f95cb9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b65b9bc444101900c62fd90c7d45789d186a89148ac0fd9c7f0c6048c3f55ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edbd3e682d207b8c18d6bcbc26aa2adae2a35645502d31b327ffddd51991e842\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4563974ed1467a87aab104756828e63853c282a61c4fcab1e757eb4da9afaa40\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:21Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:21 crc kubenswrapper[5004]: I1201 08:18:21.667481 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:21 crc kubenswrapper[5004]: I1201 08:18:21.667585 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 01 08:18:21 crc kubenswrapper[5004]: I1201 08:18:21.667613 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:21 crc kubenswrapper[5004]: I1201 08:18:21.667644 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:21 crc kubenswrapper[5004]: I1201 08:18:21.667666 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:21Z","lastTransitionTime":"2025-12-01T08:18:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:18:21 crc kubenswrapper[5004]: I1201 08:18:21.687904 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dpkxw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddcaf111-708a-45b3-a342-effd3061ab17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68ecba515f05ca83fdd0cdda10e3e5925a146aadb70ae17859586c12daf55dac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd2f03a8020818d1cc98b6eeaf64a728e627a3401400c3af53654f8771ef02eb\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd2f03a8020818d1cc98b6eeaf64a728e627a3401400c3af53654f8771ef02eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8dd31c10eaf6bc673dec377a884040524418dd1891a96b2eb6e6e983979512a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8dd31c10eaf6bc673dec377a884040524418dd1891a96b2eb6e6e983979512a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a7a8ef66a9486ca3d13826df7aa39db2083cff2c386b4d0d858e0526e0f094d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a7a8ef66a9486ca3d13826df7aa39db2083cff2c386b4d0d858e0526e0f094d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd1c4
7c976c17bf17e4d660288ce30472372b6a7f33f6ffa6d96c1a4436275d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd1c47c976c17bf17e4d660288ce30472372b6a7f33f6ffa6d96c1a4436275d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b7c8e9648e5514ab1fcb63dc72822dd276a2d74987e2fb4f4800b26ec176be7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b7c8e9648e5514ab1fcb63dc72822dd276a2d74987e2fb4f4800b26ec176be7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:28Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d7115db7fe248437f4b8918f06c9aa48b141fcd3f31add80faee81b4347e47b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d7115db7fe248437f4b8918f06c9aa48b141fcd3f31add80faee81b4347e47b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dpkxw\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:21Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:21 crc kubenswrapper[5004]: I1201 08:18:21.709415 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4ee478f-9254-4e56-96be-f5a83ff5d77c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37438df65f4dab6700f193e84f81d8ed41b3c208c3f6150bd5836c0642190572\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mou
ntPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f873da9c594c885720113b0d0fc01552050d030e802074043e9ede174fd9b16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ca757be16a55ba8df0e9629f7cc2653e2804a5ee5a2151ee3b07d2c30fe5b07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3930a1c88c6014beb768ed20
a37359ade08e77496a86b913f68c6196134f13a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3930a1c88c6014beb768ed20a37359ade08e77496a86b913f68c6196134f13a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:02Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:21Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:21 crc kubenswrapper[5004]: I1201 08:18:21.731707 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0ed2b6d-0c61-4639-bc3b-1c8effc4815d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45a0d7a12e7c114de1740a151e04f7f39844b709740cab958f156ac918af3228\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d69a370af9f8f6c575f99da6bb01cee0f660dccb7e85f16f5d8874d0e263e1fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c82baee0fd931fd7b41cd5a73ca7e653ef13dfa3d573991cbe3238d6b95f6fac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa5f341e99100e985f65c24647d976a24782dfae7f52e752b774f41f69af27fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad581564fc519328ae429bce757797f6d87d8648e862b008637a245866ec4cbe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T08:17:20Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 08:17:15.185125 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 08:17:15.186793 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4071633747/tls.crt::/tmp/serving-cert-4071633747/tls.key\\\\\\\"\\\\nI1201 08:17:20.827614 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 08:17:20.834528 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 08:17:20.834549 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 08:17:20.837611 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 08:17:20.837630 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 08:17:20.848940 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 08:17:20.848957 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 08:17:20.848961 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 08:17:20.848965 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 08:17:20.848968 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 08:17:20.848970 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 08:17:20.848973 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 08:17:20.849115 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1201 08:17:20.854990 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2caeb2ebaf68110cb8728e60aae6db1b1fc48efae6018f66070766a4f42caa69\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2276a185fefa99f4e9fe63b7acce901ce2eb168f71a65f5fa97c1892aebce0e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2276a185fefa99f4e9fe63b7acce901ce
2eb168f71a65f5fa97c1892aebce0e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:21Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:21 crc kubenswrapper[5004]: I1201 08:18:21.758455 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 08:18:21 crc kubenswrapper[5004]: I1201 08:18:21.758521 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 08:18:21 crc kubenswrapper[5004]: I1201 08:18:21.758771 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 08:18:21 crc kubenswrapper[5004]: E1201 08:18:21.759251 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 08:18:21 crc kubenswrapper[5004]: E1201 08:18:21.759071 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 08:18:21 crc kubenswrapper[5004]: E1201 08:18:21.759460 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 08:18:21 crc kubenswrapper[5004]: I1201 08:18:21.771235 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:21 crc kubenswrapper[5004]: I1201 08:18:21.771293 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:21 crc kubenswrapper[5004]: I1201 08:18:21.771314 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:21 crc kubenswrapper[5004]: I1201 08:18:21.771345 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:21 crc kubenswrapper[5004]: I1201 08:18:21.771371 5004 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:21Z","lastTransitionTime":"2025-12-01T08:18:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:18:21 crc kubenswrapper[5004]: I1201 08:18:21.874161 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:21 crc kubenswrapper[5004]: I1201 08:18:21.874242 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:21 crc kubenswrapper[5004]: I1201 08:18:21.874267 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:21 crc kubenswrapper[5004]: I1201 08:18:21.874300 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:21 crc kubenswrapper[5004]: I1201 08:18:21.874326 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:21Z","lastTransitionTime":"2025-12-01T08:18:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:21 crc kubenswrapper[5004]: I1201 08:18:21.980489 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:21 crc kubenswrapper[5004]: I1201 08:18:21.980599 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:21 crc kubenswrapper[5004]: I1201 08:18:21.980615 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:21 crc kubenswrapper[5004]: I1201 08:18:21.980634 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:21 crc kubenswrapper[5004]: I1201 08:18:21.980652 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:21Z","lastTransitionTime":"2025-12-01T08:18:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:22 crc kubenswrapper[5004]: I1201 08:18:22.083636 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:22 crc kubenswrapper[5004]: I1201 08:18:22.083692 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:22 crc kubenswrapper[5004]: I1201 08:18:22.083754 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:22 crc kubenswrapper[5004]: I1201 08:18:22.083782 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:22 crc kubenswrapper[5004]: I1201 08:18:22.083798 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:22Z","lastTransitionTime":"2025-12-01T08:18:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:22 crc kubenswrapper[5004]: I1201 08:18:22.187119 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:22 crc kubenswrapper[5004]: I1201 08:18:22.187150 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:22 crc kubenswrapper[5004]: I1201 08:18:22.187161 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:22 crc kubenswrapper[5004]: I1201 08:18:22.187175 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:22 crc kubenswrapper[5004]: I1201 08:18:22.187184 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:22Z","lastTransitionTime":"2025-12-01T08:18:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:22 crc kubenswrapper[5004]: I1201 08:18:22.290231 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:22 crc kubenswrapper[5004]: I1201 08:18:22.290323 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:22 crc kubenswrapper[5004]: I1201 08:18:22.290341 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:22 crc kubenswrapper[5004]: I1201 08:18:22.290367 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:22 crc kubenswrapper[5004]: I1201 08:18:22.290384 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:22Z","lastTransitionTime":"2025-12-01T08:18:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:22 crc kubenswrapper[5004]: I1201 08:18:22.340297 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-knmdv_15cdec0a-5925-4966-a30b-f60c503f633e/ovnkube-controller/3.log" Dec 01 08:18:22 crc kubenswrapper[5004]: I1201 08:18:22.344059 5004 scope.go:117] "RemoveContainer" containerID="06e08a797d0a4810bb4314a26c9ecefd52e0bd2f04615b2bd39c4ccf951a33c9" Dec 01 08:18:22 crc kubenswrapper[5004]: E1201 08:18:22.344265 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-knmdv_openshift-ovn-kubernetes(15cdec0a-5925-4966-a30b-f60c503f633e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" podUID="15cdec0a-5925-4966-a30b-f60c503f633e" Dec 01 08:18:22 crc kubenswrapper[5004]: I1201 08:18:22.360616 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4ee478f-9254-4e56-96be-f5a83ff5d77c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37438df65f4dab6700f193e84f81d8ed41b3c208c3f6150bd5836c0642190572\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f873da9c594c885720113b0d0fc01552050d030e802074043e9ede174fd9b16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ca757be16a55ba8df0e9629f7cc2653e2804a5ee5a2151ee3b07d2c30fe5b07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3930a1c88c6014beb768ed20a37359ade08e77496a86b913f68c6196134f13a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://f3930a1c88c6014beb768ed20a37359ade08e77496a86b913f68c6196134f13a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:02Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:22Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:22 crc kubenswrapper[5004]: I1201 08:18:22.379663 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0ed2b6d-0c61-4639-bc3b-1c8effc4815d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45a0d7a12e7c114de1740a151e04f7f39844b709740cab958f156ac918af3228\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d69a370af9f8f6c575f99da6bb01cee0f660dccb7e85f16f5d8874d0e263e1fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c82baee0fd931fd7b41cd5a73ca7e653ef13dfa3d573991cbe3238d6b95f6fac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa5f341e99100e985f65c24647d976a24782dfae7f52e752b774f41f69af27fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad581564fc519328ae429bce757797f6d87d8648e862b008637a245866ec4cbe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T08:17:20Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 08:17:15.185125 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 08:17:15.186793 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4071633747/tls.crt::/tmp/serving-cert-4071633747/tls.key\\\\\\\"\\\\nI1201 08:17:20.827614 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 08:17:20.834528 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 08:17:20.834549 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 08:17:20.837611 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 08:17:20.837630 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 08:17:20.848940 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 08:17:20.848957 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 08:17:20.848961 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 08:17:20.848965 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 08:17:20.848968 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 08:17:20.848970 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 08:17:20.848973 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 08:17:20.849115 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1201 08:17:20.854990 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2caeb2ebaf68110cb8728e60aae6db1b1fc48efae6018f66070766a4f42caa69\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2276a185fefa99f4e9fe63b7acce901ce2eb168f71a65f5fa97c1892aebce0e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2276a185fefa99f4e9fe63b7acce901ce
2eb168f71a65f5fa97c1892aebce0e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:22Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:22 crc kubenswrapper[5004]: I1201 08:18:22.393210 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:22 crc kubenswrapper[5004]: I1201 08:18:22.393241 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:22 crc kubenswrapper[5004]: I1201 08:18:22.393268 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:22 crc kubenswrapper[5004]: I1201 08:18:22.393284 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:22 crc kubenswrapper[5004]: I1201 08:18:22.393334 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:22Z","lastTransitionTime":"2025-12-01T08:18:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:22 crc kubenswrapper[5004]: I1201 08:18:22.395592 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:22Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:22 crc kubenswrapper[5004]: I1201 08:18:22.413191 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:22Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:22 crc kubenswrapper[5004]: I1201 08:18:22.427185 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03c470edd984db3b756e031e33f64a9bc5ca6b9c156ab479e5cd647745655a5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T08:18:22Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:22 crc kubenswrapper[5004]: I1201 08:18:22.446296 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fd2f2b5d7dc3d7b8e1bb06fdf0eec3addfec3c230e044b6395d6cf535630ebd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b2250161e400b
201f0528e33231c9007342002a37cd037a1b1e011f2aaaa22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:22Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:22 crc kubenswrapper[5004]: I1201 08:18:22.462798 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zjksw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"70e79009-93be-49c4-a6b3-e8a06bcea7f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:18:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:18:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://862dd7caaf04a01f96f2dba70cd2226e85b55f172ee6a34c178f756a75832a08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad3ea204ad731500a340e639f0e36abe28ba50464f1b2ff9b009e39c234cf708\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T08:18:10Z\\\",\\\"message\\\":\\\"2025-12-01T08:17:24+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_50c6dd0c-b888-41cb-a4f6-d1199cf62aeb\\\\n2025-12-01T08:17:24+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_50c6dd0c-b888-41cb-a4f6-d1199cf62aeb to /host/opt/cni/bin/\\\\n2025-12-01T08:17:25Z [verbose] multus-daemon started\\\\n2025-12-01T08:17:25Z [verbose] 
Readiness Indicator file check\\\\n2025-12-01T08:18:10Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:18:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd8m6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zjksw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:22Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:22 crc kubenswrapper[5004]: I1201 08:18:22.487423 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15cdec0a-5925-4966-a30b-f60c503f633e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f528b94ff12c7a804930dca4b93c23334a9ae3a53317750cd34b5695f08b4582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f001b70a83da28b1ea78a94f1cd70dfc63b3de5a21b041932d33c2ac970c986b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7c5c043438b515e37a6d6d5c47b92479bbb7322fe35c3a1bc57ee7b3a746544\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23df33e45cc27e4d343ae6ec3fb88f2dfc125ba7da424515d4bee5ec19281832\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4a7cab8e1594814a687b1433a92642a19eff7c2eb12f96fa06f08a2e270733d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1844a18a6109c25a25b9c8cb3ff65e69cc657f66be03dd97e883d24a774b7638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06e08a797d0a4810bb4314a26c9ecefd52e0bd2f04615b2bd39c4ccf951a33c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06e08a797d0a4810bb4314a26c9ecefd52e0bd2f04615b2bd39c4ccf951a33c9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T08:18:20Z\\\",\\\"message\\\":\\\"ressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1201 08:18:20.712430 6978 
handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1201 08:18:20.712460 6978 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1201 08:18:20.712505 6978 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1201 08:18:20.712514 6978 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1201 08:18:20.712520 6978 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1201 08:18:20.712548 6978 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1201 08:18:20.712546 6978 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1201 08:18:20.712578 6978 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1201 08:18:20.712586 6978 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1201 08:18:20.712592 6978 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1201 08:18:20.712593 6978 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1201 08:18:20.712599 6978 factory.go:656] Stopping watch factory\\\\nI1201 08:18:20.712611 6978 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1201 08:18:20.712622 6978 ovnkube.go:599] Stopped ovnkube\\\\nI1201 08:18:20.712626 6978 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1201 08:18:2\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T08:18:19Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-knmdv_openshift-ovn-kubernetes(15cdec0a-5925-4966-a30b-f60c503f633e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f231e0092e1a85cc5d6e00fb8d3bd982223b670191ac3dc21d81ff1fab462a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97c9fab31c2655b5d579a2ed9b837229ee678973643c6a526c1241e3c84e315c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97c9fab31c2655b5d5
79a2ed9b837229ee678973643c6a526c1241e3c84e315c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-knmdv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:22Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:22 crc kubenswrapper[5004]: I1201 08:18:22.498496 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:22 crc kubenswrapper[5004]: I1201 08:18:22.498553 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:22 crc kubenswrapper[5004]: I1201 08:18:22.498592 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:22 crc kubenswrapper[5004]: I1201 08:18:22.498616 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:22 crc kubenswrapper[5004]: I1201 08:18:22.498633 5004 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:22Z","lastTransitionTime":"2025-12-01T08:18:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:18:22 crc kubenswrapper[5004]: I1201 08:18:22.505497 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59844f9c-e37f-4c08-986d-b0b0dd2870ee\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8561c15ad57f27642507b1cf97c865989aea032f1d31998d18efb066baa9c283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://488ed5075af94d953143c09a2bac601fe1af57a34197f91965f22a1028ebd2b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://488ed5075af94d953143c09a2bac601fe1af57a34197f91965f22a1028ebd2b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:22Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:22 crc kubenswrapper[5004]: I1201 08:18:22.522129 5004 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://327477e7db10129d47ddf9078f32d0df2ffd4b8e59fc7bb77c17c60c63e9a936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:22Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:22 crc kubenswrapper[5004]: I1201 08:18:22.537159 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mzsvz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"397b51b7-934a-41d1-a593-500a64161bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4adac5633dfe89777bf019818bab9ee3a208cad9d929c96cf2cb86b18c2d4264\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:36Z\\\"}},\\\"volumeMount
s\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4h49g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d83844478080e4829975fb6c8e0444d9fdebd44b08afcf45e7d0b04fc534a844\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4h49g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mzsvz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:22Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:22 crc 
kubenswrapper[5004]: I1201 08:18:22.552829 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jjms6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8828af41-beeb-47dd-96cf-3dbcb5175893\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b22220849cdb91592a44a67f9f6a09586146009483f46a0505a2dea3b02bccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvsch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jjms6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:22Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:22 crc kubenswrapper[5004]: I1201 08:18:22.566169 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9977ebb-82de-4e96-8763-0b5a84f8d4ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://baf84c2eea7167eed95a4195d37bf784ffef310bc2079d8d06e1f47cb45a7864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b
3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pwgxq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c1e51fa92aec80e93d0fce11c743b8c4fde23fc23d8626e8d5b3e25ab42350d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pwgxq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fvdgt\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:22Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:22 crc kubenswrapper[5004]: I1201 08:18:22.579198 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7cl5l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b488f4f3-d385-4d40-bdee-96d8fe2d42a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-glbhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-glbhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:37Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7cl5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:22Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:22 crc 
kubenswrapper[5004]: I1201 08:18:22.601971 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:22 crc kubenswrapper[5004]: I1201 08:18:22.602016 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:22 crc kubenswrapper[5004]: I1201 08:18:22.602032 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:22 crc kubenswrapper[5004]: I1201 08:18:22.602055 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:22 crc kubenswrapper[5004]: I1201 08:18:22.602074 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:22Z","lastTransitionTime":"2025-12-01T08:18:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:22 crc kubenswrapper[5004]: I1201 08:18:22.605105 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43c9f762-d403-49b9-8294-d66976aca4dd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b358f641ff09bc8f4a452b0d4a8cf5f9790182f76779be31426d48d0278b0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d610c3a799d4a7df9ddfe2a28f2ced2e5ac4e96c67c93f8e6d3a0eee60d9ac48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43f9f72af3273da055eeb1e13ad6258a1b3b6b205402861beee9f87f02230297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38e3cc2eca559417c71beed561569c4dc5b45ffb8407976e0695fc1b93219d37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c24fbeedbc7c40b6079b7ad89d8564a4f2cdaad2cf996e66682b082382da190\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00e163cd374befc8b8c6cdfe3916235cd63280474c2a09eeebf108fb75d378a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00e163cd374befc8b8c6cdfe3916235cd63280474c2a09eeebf108fb75d378a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37376e093170ad4138e102e3a1e2c2b45b3944bf36b0d77dce126bd493d60803\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37376e093170ad4138e102e3a1e2c2b45b3944bf36b0d77dce126bd493d60803\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e4db8d673550f14f3614e47dc0bf81e0920753be13fb01c7ef0d6bd3716611f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4db8d673550f14f3614e47dc0bf81e0920753be13fb01c7ef0d6bd3716611f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-01T08:17:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:22Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:22 crc kubenswrapper[5004]: I1201 08:18:22.620365 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:22Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:22 crc kubenswrapper[5004]: I1201 08:18:22.630143 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ww6lq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72500ffd-4ca3-4614-a3a2-bbdc5a7506c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://668cfaa2af20e2bb082fc47e1702cfd9f704c4fdf56a4d27cf25d6915e7cd18a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmsvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ww6lq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:22Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:22 crc kubenswrapper[5004]: I1201 08:18:22.640746 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2fa377e-a3a6-46ce-af23-e677799f0115\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b520771cf1ad6f9360064c5f304243c5878a2f1025c87d1001c97b2d5f95cb9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b65b9bc444101900c62fd90c7d45789d186a89148ac0fd9c7f0c6048c3f55ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edbd3e682d207b8c18d6bcbc26aa2adae2a35645502d31b327ffddd51991e842\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4563974ed1467a87aab104756828e63853c282a61c4fcab1e757eb4da9afaa40\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:22Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:22 crc kubenswrapper[5004]: I1201 08:18:22.656332 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dpkxw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddcaf111-708a-45b3-a342-effd3061ab17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68ecba515f05ca83fdd0cdda10e3e5925a146aadb70ae17859586c12daf55dac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd2f03a8020818d1cc98b6eeaf64a728e627a3401400c3af53654f8771ef02eb\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd2f03a8020818d1cc98b6eeaf64a728e627a3401400c3af53654f8771ef02eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8dd31c10eaf6bc673dec377a884040524418dd1891a96b2eb6e6e983979512a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8dd31c10eaf6bc673dec377a884040524418dd1891a96b2eb6e6e983979512a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a7a8ef66a9486ca3d13826df7aa39db2083cff2c386b4d0d858e0526e0f094d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a7a8ef66a9486ca3d13826df7aa39db2083cff2c386b4d0d858e0526e0f094d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd1c4
7c976c17bf17e4d660288ce30472372b6a7f33f6ffa6d96c1a4436275d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd1c47c976c17bf17e4d660288ce30472372b6a7f33f6ffa6d96c1a4436275d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b7c8e9648e5514ab1fcb63dc72822dd276a2d74987e2fb4f4800b26ec176be7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b7c8e9648e5514ab1fcb63dc72822dd276a2d74987e2fb4f4800b26ec176be7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:28Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d7115db7fe248437f4b8918f06c9aa48b141fcd3f31add80faee81b4347e47b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d7115db7fe248437f4b8918f06c9aa48b141fcd3f31add80faee81b4347e47b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dpkxw\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:22Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:22 crc kubenswrapper[5004]: I1201 08:18:22.704920 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:22 crc kubenswrapper[5004]: I1201 08:18:22.705038 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:22 crc kubenswrapper[5004]: I1201 08:18:22.705123 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:22 crc kubenswrapper[5004]: I1201 08:18:22.705203 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:22 crc kubenswrapper[5004]: I1201 08:18:22.705231 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:22Z","lastTransitionTime":"2025-12-01T08:18:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:18:22 crc kubenswrapper[5004]: I1201 08:18:22.758316 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7cl5l" Dec 01 08:18:22 crc kubenswrapper[5004]: E1201 08:18:22.758552 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7cl5l" podUID="b488f4f3-d385-4d40-bdee-96d8fe2d42a1" Dec 01 08:18:22 crc kubenswrapper[5004]: I1201 08:18:22.779217 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:22Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:22 crc kubenswrapper[5004]: I1201 08:18:22.799851 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03c470edd984db3b756e031e33f64a9bc5ca6b9c156ab479e5cd647745655a5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T08:18:22Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:22 crc kubenswrapper[5004]: I1201 08:18:22.812307 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:22 crc kubenswrapper[5004]: I1201 08:18:22.812387 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:22 crc kubenswrapper[5004]: I1201 08:18:22.812404 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:22 crc kubenswrapper[5004]: I1201 08:18:22.812426 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:22 crc kubenswrapper[5004]: I1201 08:18:22.812467 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:22Z","lastTransitionTime":"2025-12-01T08:18:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:22 crc kubenswrapper[5004]: I1201 08:18:22.825553 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fd2f2b5d7dc3d7b8e1bb06fdf0eec3addfec3c230e044b6395d6cf535630ebd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b2250161e400b201f0528e33231c9007342002a37cd037a1b1e011f2aaaa22\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:22Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:22 crc kubenswrapper[5004]: I1201 08:18:22.847515 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zjksw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"70e79009-93be-49c4-a6b3-e8a06bcea7f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:18:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:18:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://862dd7caaf04a01f96f2dba70cd2226e85b55f172ee6a34c178f756a75832a08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad3ea204ad731500a340e639f0e36abe28ba50464f1b2ff9b009e39c234cf708\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T08:18:10Z\\\",\\\"message\\\":\\\"2025-12-01T08:17:24+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_50c6dd0c-b888-41cb-a4f6-d1199cf62aeb\\\\n2025-12-01T08:17:24+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_50c6dd0c-b888-41cb-a4f6-d1199cf62aeb to /host/opt/cni/bin/\\\\n2025-12-01T08:17:25Z [verbose] multus-daemon started\\\\n2025-12-01T08:17:25Z [verbose] 
Readiness Indicator file check\\\\n2025-12-01T08:18:10Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:18:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bd8m6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zjksw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:22Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:22 crc kubenswrapper[5004]: I1201 08:18:22.877385 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15cdec0a-5925-4966-a30b-f60c503f633e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f528b94ff12c7a804930dca4b93c23334a9ae3a53317750cd34b5695f08b4582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f001b70a83da28b1ea78a94f1cd70dfc63b3de5a21b041932d33c2ac970c986b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7c5c043438b515e37a6d6d5c47b92479bbb7322fe35c3a1bc57ee7b3a746544\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23df33e45cc27e4d343ae6ec3fb88f2dfc125ba7da424515d4bee5ec19281832\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4a7cab8e1594814a687b1433a92642a19eff7c2eb12f96fa06f08a2e270733d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1844a18a6109c25a25b9c8cb3ff65e69cc657f66be03dd97e883d24a774b7638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06e08a797d0a4810bb4314a26c9ecefd52e0bd2f04615b2bd39c4ccf951a33c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06e08a797d0a4810bb4314a26c9ecefd52e0bd2f04615b2bd39c4ccf951a33c9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T08:18:20Z\\\",\\\"message\\\":\\\"ressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1201 08:18:20.712430 6978 
handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1201 08:18:20.712460 6978 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1201 08:18:20.712505 6978 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1201 08:18:20.712514 6978 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1201 08:18:20.712520 6978 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1201 08:18:20.712548 6978 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1201 08:18:20.712546 6978 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1201 08:18:20.712578 6978 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1201 08:18:20.712586 6978 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1201 08:18:20.712592 6978 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1201 08:18:20.712593 6978 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1201 08:18:20.712599 6978 factory.go:656] Stopping watch factory\\\\nI1201 08:18:20.712611 6978 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1201 08:18:20.712622 6978 ovnkube.go:599] Stopped ovnkube\\\\nI1201 08:18:20.712626 6978 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1201 08:18:2\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T08:18:19Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-knmdv_openshift-ovn-kubernetes(15cdec0a-5925-4966-a30b-f60c503f633e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f231e0092e1a85cc5d6e00fb8d3bd982223b670191ac3dc21d81ff1fab462a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97c9fab31c2655b5d579a2ed9b837229ee678973643c6a526c1241e3c84e315c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://97c9fab31c2655b5d5
79a2ed9b837229ee678973643c6a526c1241e3c84e315c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f472x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-knmdv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:22Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:22 crc kubenswrapper[5004]: I1201 08:18:22.895015 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59844f9c-e37f-4c08-986d-b0b0dd2870ee\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8561c15ad57f27642507b1cf97c865989aea032f1d31998d18efb066baa9c283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://488ed5075af94d953143c09a2bac601fe1af57a34197f91965f22a1028ebd2b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://488ed5075af94d953143c09a2bac601fe1af57a34197f91965f22a1028ebd2b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:22Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:22 crc kubenswrapper[5004]: I1201 08:18:22.910109 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://327477e7db10129d47ddf9078f32d0df2ffd4b8e59fc7bb77c17c60c63e9a936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:22Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:22 crc kubenswrapper[5004]: I1201 08:18:22.914910 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:22 crc kubenswrapper[5004]: I1201 08:18:22.914976 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:22 crc kubenswrapper[5004]: I1201 08:18:22.914995 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:22 crc kubenswrapper[5004]: I1201 08:18:22.915021 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:22 crc kubenswrapper[5004]: I1201 08:18:22.915042 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:22Z","lastTransitionTime":"2025-12-01T08:18:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:22 crc kubenswrapper[5004]: I1201 08:18:22.929772 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:22Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:22 crc kubenswrapper[5004]: I1201 08:18:22.948204 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mzsvz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"397b51b7-934a-41d1-a593-500a64161bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4adac5633dfe89777bf019818bab9ee3a208cad9d929c96cf2cb86b18c2d4264\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4h49g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d83844478080e4829975fb6c8e0444d9fdebd
44b08afcf45e7d0b04fc534a844\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4h49g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mzsvz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:22Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:22 crc kubenswrapper[5004]: I1201 08:18:22.966444 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9977ebb-82de-4e96-8763-0b5a84f8d4ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://baf84c2eea7167eed95a4195d37bf784ffef310bc2079d8d06e1f47cb45a7864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pwgxq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c1e51fa92aec80e93d0fce11c743b8c4fde23fc
23d8626e8d5b3e25ab42350d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pwgxq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fvdgt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:22Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:22 crc kubenswrapper[5004]: I1201 08:18:22.984272 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7cl5l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b488f4f3-d385-4d40-bdee-96d8fe2d42a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-glbhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-glbhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:37Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7cl5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:22Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:23 crc 
kubenswrapper[5004]: I1201 08:18:23.009182 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43c9f762-d403-49b9-8294-d66976aca4dd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b358f641ff09bc8f4a452b0d4a8cf5f9790182f76779be31426d48d0278b0fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://d610c3a799d4a7df9ddfe2a28f2ced2e5ac4e96c67c93f8e6d3a0eee60d9ac48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43f9f72af3273da055eeb1e13ad6258a1b3b6b205402861beee9f87f02230297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38e3cc2eca559417c71beed561569c4dc5b45ffb8407976e0695fc1b93219d37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c24fbeedbc7c40b6079b7ad89d8564a4f2cdaad2cf996e66682b082382da190\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00e163cd374befc8b8c6cdfe3916235cd63280474c2a09eeebf108fb75d378a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00e163cd374befc8b8c6cdfe3916235cd63280474c2a09eeebf108fb75d378a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://37376e093170ad4138e102e3a1e2c2b45b3944bf36b0d77dce126bd493d60803\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37376e093170ad4138e102e3a1e2c2b45b3944bf36b0d77dce126bd493d60803\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e4db8d673550f14f3614e47dc0bf81e0920753be13fb01c7ef0d6bd3716611f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4db8d673550f14f3614e47dc0bf81e0920753be13fb01c7ef0d6bd3716611f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:23Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:23 crc kubenswrapper[5004]: I1201 08:18:23.018476 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:23 crc kubenswrapper[5004]: I1201 08:18:23.018526 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:23 crc kubenswrapper[5004]: I1201 08:18:23.018544 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:23 crc kubenswrapper[5004]: I1201 08:18:23.018596 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:23 crc kubenswrapper[5004]: I1201 08:18:23.018618 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:23Z","lastTransitionTime":"2025-12-01T08:18:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:23 crc kubenswrapper[5004]: I1201 08:18:23.028445 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:23Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:23 crc kubenswrapper[5004]: I1201 08:18:23.042957 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jjms6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8828af41-beeb-47dd-96cf-3dbcb5175893\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b22220849cdb91592a44a67f9f6a09586146009483f46a0505a2dea3b02bccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvsch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jjms6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:23Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:23 crc kubenswrapper[5004]: I1201 08:18:23.061819 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2fa377e-a3a6-46ce-af23-e677799f0115\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b520771cf1ad6f9360064c5f304243c5878a2f1025c87d1001c97b2d5f95cb9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluste
r-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b65b9bc444101900c62fd90c7d45789d186a89148ac0fd9c7f0c6048c3f55ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edbd3e682d207b8c18d6bcbc26aa2adae2a35645502d31b327ffddd51991e842\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kuber
netes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4563974ed1467a87aab104756828e63853c282a61c4fcab1e757eb4da9afaa40\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:23Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:23 crc kubenswrapper[5004]: I1201 08:18:23.084913 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dpkxw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddcaf111-708a-45b3-a342-effd3061ab17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68ecba515f05ca83fdd0cdda10e3e5925a146aadb70ae17859586c12daf55dac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd2f03a8020818d1cc98b6eeaf64a728e627a3401400c3af53654f8771ef02eb\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd2f03a8020818d1cc98b6eeaf64a728e627a3401400c3af53654f8771ef02eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8dd31c10eaf6bc673dec377a884040524418dd1891a96b2eb6e6e983979512a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8dd31c10eaf6bc673dec377a884040524418dd1891a96b2eb6e6e983979512a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:25Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a7a8ef66a9486ca3d13826df7aa39db2083cff2c386b4d0d858e0526e0f094d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a7a8ef66a9486ca3d13826df7aa39db2083cff2c386b4d0d858e0526e0f094d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd1c4
7c976c17bf17e4d660288ce30472372b6a7f33f6ffa6d96c1a4436275d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd1c47c976c17bf17e4d660288ce30472372b6a7f33f6ffa6d96c1a4436275d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b7c8e9648e5514ab1fcb63dc72822dd276a2d74987e2fb4f4800b26ec176be7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b7c8e9648e5514ab1fcb63dc72822dd276a2d74987e2fb4f4800b26ec176be7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:28Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d7115db7fe248437f4b8918f06c9aa48b141fcd3f31add80faee81b4347e47b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d7115db7fe248437f4b8918f06c9aa48b141fcd3f31add80faee81b4347e47b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xb29s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dpkxw\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:23Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:23 crc kubenswrapper[5004]: I1201 08:18:23.101641 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ww6lq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72500ffd-4ca3-4614-a3a2-bbdc5a7506c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://668cfaa2af20e2bb082fc47e1702cfd9f704c4fdf56a4d27cf25d6915e7cd18a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2025-12-01T08:17:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmsvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ww6lq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:23Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:23 crc kubenswrapper[5004]: I1201 08:18:23.121015 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:23 crc kubenswrapper[5004]: I1201 08:18:23.121096 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:23 crc kubenswrapper[5004]: I1201 08:18:23.121000 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4ee478f-9254-4e56-96be-f5a83ff5d77c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37438df65f4dab6700f193e84f81d8ed41b3c208c3f6150bd5836c0642190572\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f873da9c594c885720113b0d0fc01552050d030e802074043e9ede174fd9b16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ca757be16a55ba8df0e9629f7cc2653e2804a5ee5a2151ee3b07d2c30fe5b07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3930a1c88c6014beb768ed20a37359ade08e77496a86b913f68c6196134f13a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://f3930a1c88c6014beb768ed20a37359ade08e77496a86b913f68c6196134f13a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:02Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:23Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:23 crc kubenswrapper[5004]: I1201 08:18:23.121118 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:23 crc kubenswrapper[5004]: I1201 08:18:23.121308 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:23 crc kubenswrapper[5004]: I1201 08:18:23.121323 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:23Z","lastTransitionTime":"2025-12-01T08:18:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:23 crc kubenswrapper[5004]: I1201 08:18:23.144822 5004 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0ed2b6d-0c61-4639-bc3b-1c8effc4815d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T08:17:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45a0d7a12e7c114de1740a151e04f7f39844b709740cab958f156ac918af3228\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d69a370af9f8f6c575f99da6bb01cee0f660dccb7e85f16f5d8874d0e263e1fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c82baee0fd931fd7b41cd5a73ca7e653ef13dfa3d573991cbe3238d6b95f6fac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa5f341e99100e985f65c24647d976a24782dfae7f52e752b774f41f69af27fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad581564fc519328ae429bce757797f6d87d8648e862b008637a245866ec4cbe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T08:17:20Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 08:17:15.185125 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 08:17:15.186793 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4071633747/tls.crt::/tmp/serving-cert-4071633747/tls.key\\\\\\\"\\\\nI1201 08:17:20.827614 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 08:17:20.834528 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 08:17:20.834549 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 08:17:20.837611 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 08:17:20.837630 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 08:17:20.848940 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 08:17:20.848957 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 08:17:20.848961 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 08:17:20.848965 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 08:17:20.848968 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 08:17:20.848970 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 08:17:20.848973 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 08:17:20.849115 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 08:17:20.854990 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2caeb2ebaf68110cb8728e60aae6db1b1fc48efae6018f66070766a4f42caa69\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T08:17:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2276a185fefa99f4e9fe63b7acce901ce2eb168f71a65f5fa97c1892aebce0e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2276a185fefa99f4e9fe63b7acce901ce2eb168f71a65f5fa97c1892aebce0e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T08:17:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T08:17:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T08:17:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:23Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:23 crc kubenswrapper[5004]: I1201 08:18:23.224760 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:23 crc kubenswrapper[5004]: I1201 08:18:23.224827 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:23 crc kubenswrapper[5004]: I1201 08:18:23.224842 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:23 crc kubenswrapper[5004]: I1201 08:18:23.224865 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:23 crc kubenswrapper[5004]: I1201 08:18:23.224880 5004 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:23Z","lastTransitionTime":"2025-12-01T08:18:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:18:23 crc kubenswrapper[5004]: I1201 08:18:23.327120 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:23 crc kubenswrapper[5004]: I1201 08:18:23.327177 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:23 crc kubenswrapper[5004]: I1201 08:18:23.327192 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:23 crc kubenswrapper[5004]: I1201 08:18:23.327213 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:23 crc kubenswrapper[5004]: I1201 08:18:23.327230 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:23Z","lastTransitionTime":"2025-12-01T08:18:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:23 crc kubenswrapper[5004]: I1201 08:18:23.429886 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:23 crc kubenswrapper[5004]: I1201 08:18:23.429928 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:23 crc kubenswrapper[5004]: I1201 08:18:23.429939 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:23 crc kubenswrapper[5004]: I1201 08:18:23.429957 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:23 crc kubenswrapper[5004]: I1201 08:18:23.429969 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:23Z","lastTransitionTime":"2025-12-01T08:18:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:23 crc kubenswrapper[5004]: I1201 08:18:23.532362 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:23 crc kubenswrapper[5004]: I1201 08:18:23.532405 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:23 crc kubenswrapper[5004]: I1201 08:18:23.532418 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:23 crc kubenswrapper[5004]: I1201 08:18:23.532435 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:23 crc kubenswrapper[5004]: I1201 08:18:23.532450 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:23Z","lastTransitionTime":"2025-12-01T08:18:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:23 crc kubenswrapper[5004]: I1201 08:18:23.635304 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:23 crc kubenswrapper[5004]: I1201 08:18:23.635341 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:23 crc kubenswrapper[5004]: I1201 08:18:23.635353 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:23 crc kubenswrapper[5004]: I1201 08:18:23.635370 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:23 crc kubenswrapper[5004]: I1201 08:18:23.635382 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:23Z","lastTransitionTime":"2025-12-01T08:18:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:23 crc kubenswrapper[5004]: I1201 08:18:23.738428 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:23 crc kubenswrapper[5004]: I1201 08:18:23.738701 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:23 crc kubenswrapper[5004]: I1201 08:18:23.738828 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:23 crc kubenswrapper[5004]: I1201 08:18:23.738931 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:23 crc kubenswrapper[5004]: I1201 08:18:23.739013 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:23Z","lastTransitionTime":"2025-12-01T08:18:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:18:23 crc kubenswrapper[5004]: I1201 08:18:23.757889 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 08:18:23 crc kubenswrapper[5004]: E1201 08:18:23.758151 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 08:18:23 crc kubenswrapper[5004]: I1201 08:18:23.757891 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 08:18:23 crc kubenswrapper[5004]: E1201 08:18:23.758329 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 08:18:23 crc kubenswrapper[5004]: I1201 08:18:23.757889 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 08:18:23 crc kubenswrapper[5004]: E1201 08:18:23.758535 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 08:18:23 crc kubenswrapper[5004]: I1201 08:18:23.841974 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:23 crc kubenswrapper[5004]: I1201 08:18:23.842029 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:23 crc kubenswrapper[5004]: I1201 08:18:23.842046 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:23 crc kubenswrapper[5004]: I1201 08:18:23.842068 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:23 crc kubenswrapper[5004]: I1201 08:18:23.842087 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:23Z","lastTransitionTime":"2025-12-01T08:18:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:23 crc kubenswrapper[5004]: I1201 08:18:23.945176 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:23 crc kubenswrapper[5004]: I1201 08:18:23.945246 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:23 crc kubenswrapper[5004]: I1201 08:18:23.945263 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:23 crc kubenswrapper[5004]: I1201 08:18:23.945298 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:23 crc kubenswrapper[5004]: I1201 08:18:23.945315 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:23Z","lastTransitionTime":"2025-12-01T08:18:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:24 crc kubenswrapper[5004]: I1201 08:18:24.048112 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:24 crc kubenswrapper[5004]: I1201 08:18:24.048186 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:24 crc kubenswrapper[5004]: I1201 08:18:24.048222 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:24 crc kubenswrapper[5004]: I1201 08:18:24.048258 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:24 crc kubenswrapper[5004]: I1201 08:18:24.048275 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:24Z","lastTransitionTime":"2025-12-01T08:18:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:24 crc kubenswrapper[5004]: I1201 08:18:24.151942 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:24 crc kubenswrapper[5004]: I1201 08:18:24.152005 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:24 crc kubenswrapper[5004]: I1201 08:18:24.152022 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:24 crc kubenswrapper[5004]: I1201 08:18:24.152477 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:24 crc kubenswrapper[5004]: I1201 08:18:24.152533 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:24Z","lastTransitionTime":"2025-12-01T08:18:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:24 crc kubenswrapper[5004]: I1201 08:18:24.255948 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:24 crc kubenswrapper[5004]: I1201 08:18:24.256013 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:24 crc kubenswrapper[5004]: I1201 08:18:24.256029 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:24 crc kubenswrapper[5004]: I1201 08:18:24.256053 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:24 crc kubenswrapper[5004]: I1201 08:18:24.256070 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:24Z","lastTransitionTime":"2025-12-01T08:18:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:24 crc kubenswrapper[5004]: I1201 08:18:24.358424 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:24 crc kubenswrapper[5004]: I1201 08:18:24.358484 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:24 crc kubenswrapper[5004]: I1201 08:18:24.358500 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:24 crc kubenswrapper[5004]: I1201 08:18:24.358522 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:24 crc kubenswrapper[5004]: I1201 08:18:24.358539 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:24Z","lastTransitionTime":"2025-12-01T08:18:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:24 crc kubenswrapper[5004]: I1201 08:18:24.461139 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:24 crc kubenswrapper[5004]: I1201 08:18:24.461183 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:24 crc kubenswrapper[5004]: I1201 08:18:24.461195 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:24 crc kubenswrapper[5004]: I1201 08:18:24.461210 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:24 crc kubenswrapper[5004]: I1201 08:18:24.461221 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:24Z","lastTransitionTime":"2025-12-01T08:18:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:24 crc kubenswrapper[5004]: I1201 08:18:24.563543 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:24 crc kubenswrapper[5004]: I1201 08:18:24.563626 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:24 crc kubenswrapper[5004]: I1201 08:18:24.563642 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:24 crc kubenswrapper[5004]: I1201 08:18:24.563665 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:24 crc kubenswrapper[5004]: I1201 08:18:24.563682 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:24Z","lastTransitionTime":"2025-12-01T08:18:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:24 crc kubenswrapper[5004]: I1201 08:18:24.666807 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:24 crc kubenswrapper[5004]: I1201 08:18:24.666859 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:24 crc kubenswrapper[5004]: I1201 08:18:24.666871 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:24 crc kubenswrapper[5004]: I1201 08:18:24.666888 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:24 crc kubenswrapper[5004]: I1201 08:18:24.666897 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:24Z","lastTransitionTime":"2025-12-01T08:18:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:18:24 crc kubenswrapper[5004]: I1201 08:18:24.758764 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7cl5l" Dec 01 08:18:24 crc kubenswrapper[5004]: E1201 08:18:24.759337 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7cl5l" podUID="b488f4f3-d385-4d40-bdee-96d8fe2d42a1" Dec 01 08:18:24 crc kubenswrapper[5004]: I1201 08:18:24.769544 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:24 crc kubenswrapper[5004]: I1201 08:18:24.769606 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:24 crc kubenswrapper[5004]: I1201 08:18:24.769619 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:24 crc kubenswrapper[5004]: I1201 08:18:24.769631 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:24 crc kubenswrapper[5004]: I1201 08:18:24.769641 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:24Z","lastTransitionTime":"2025-12-01T08:18:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:24 crc kubenswrapper[5004]: I1201 08:18:24.872519 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:24 crc kubenswrapper[5004]: I1201 08:18:24.872632 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:24 crc kubenswrapper[5004]: I1201 08:18:24.872657 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:24 crc kubenswrapper[5004]: I1201 08:18:24.872688 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:24 crc kubenswrapper[5004]: I1201 08:18:24.872712 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:24Z","lastTransitionTime":"2025-12-01T08:18:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:24 crc kubenswrapper[5004]: I1201 08:18:24.976115 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:24 crc kubenswrapper[5004]: I1201 08:18:24.976175 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:24 crc kubenswrapper[5004]: I1201 08:18:24.976192 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:24 crc kubenswrapper[5004]: I1201 08:18:24.976217 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:24 crc kubenswrapper[5004]: I1201 08:18:24.976235 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:24Z","lastTransitionTime":"2025-12-01T08:18:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:25 crc kubenswrapper[5004]: I1201 08:18:25.078977 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:25 crc kubenswrapper[5004]: I1201 08:18:25.079043 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:25 crc kubenswrapper[5004]: I1201 08:18:25.079058 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:25 crc kubenswrapper[5004]: I1201 08:18:25.079081 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:25 crc kubenswrapper[5004]: I1201 08:18:25.079097 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:25Z","lastTransitionTime":"2025-12-01T08:18:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:25 crc kubenswrapper[5004]: I1201 08:18:25.182235 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:25 crc kubenswrapper[5004]: I1201 08:18:25.182292 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:25 crc kubenswrapper[5004]: I1201 08:18:25.182311 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:25 crc kubenswrapper[5004]: I1201 08:18:25.182334 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:25 crc kubenswrapper[5004]: I1201 08:18:25.182349 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:25Z","lastTransitionTime":"2025-12-01T08:18:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:25 crc kubenswrapper[5004]: I1201 08:18:25.285209 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:25 crc kubenswrapper[5004]: I1201 08:18:25.285279 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:25 crc kubenswrapper[5004]: I1201 08:18:25.285296 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:25 crc kubenswrapper[5004]: I1201 08:18:25.285319 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:25 crc kubenswrapper[5004]: I1201 08:18:25.285340 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:25Z","lastTransitionTime":"2025-12-01T08:18:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:25 crc kubenswrapper[5004]: I1201 08:18:25.387670 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:25 crc kubenswrapper[5004]: I1201 08:18:25.387709 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:25 crc kubenswrapper[5004]: I1201 08:18:25.387720 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:25 crc kubenswrapper[5004]: I1201 08:18:25.387737 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:25 crc kubenswrapper[5004]: I1201 08:18:25.387748 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:25Z","lastTransitionTime":"2025-12-01T08:18:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:25 crc kubenswrapper[5004]: I1201 08:18:25.490315 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:25 crc kubenswrapper[5004]: I1201 08:18:25.490446 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:25 crc kubenswrapper[5004]: I1201 08:18:25.490476 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:25 crc kubenswrapper[5004]: I1201 08:18:25.490508 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:25 crc kubenswrapper[5004]: I1201 08:18:25.490532 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:25Z","lastTransitionTime":"2025-12-01T08:18:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:25 crc kubenswrapper[5004]: I1201 08:18:25.594000 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:25 crc kubenswrapper[5004]: I1201 08:18:25.594042 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:25 crc kubenswrapper[5004]: I1201 08:18:25.594053 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:25 crc kubenswrapper[5004]: I1201 08:18:25.594068 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:25 crc kubenswrapper[5004]: I1201 08:18:25.594078 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:25Z","lastTransitionTime":"2025-12-01T08:18:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:25 crc kubenswrapper[5004]: I1201 08:18:25.646260 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 08:18:25 crc kubenswrapper[5004]: I1201 08:18:25.646381 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 08:18:25 crc kubenswrapper[5004]: I1201 08:18:25.646428 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 08:18:25 crc kubenswrapper[5004]: E1201 08:18:25.646470 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 08:19:29.646441954 +0000 UTC m=+147.211433966 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 08:18:25 crc kubenswrapper[5004]: I1201 08:18:25.646512 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 08:18:25 crc kubenswrapper[5004]: I1201 08:18:25.646594 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 08:18:25 crc kubenswrapper[5004]: E1201 08:18:25.646616 5004 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 08:18:25 crc kubenswrapper[5004]: E1201 08:18:25.646639 5004 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 08:18:25 crc kubenswrapper[5004]: E1201 08:18:25.646658 5004 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: 
[object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 08:18:25 crc kubenswrapper[5004]: E1201 08:18:25.646715 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-01 08:19:29.64669502 +0000 UTC m=+147.211687042 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 08:18:25 crc kubenswrapper[5004]: E1201 08:18:25.646735 5004 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 08:18:25 crc kubenswrapper[5004]: E1201 08:18:25.646736 5004 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 08:18:25 crc kubenswrapper[5004]: E1201 08:18:25.646760 5004 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 08:18:25 crc kubenswrapper[5004]: E1201 08:18:25.646779 5004 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 08:18:25 crc kubenswrapper[5004]: E1201 08:18:25.646846 5004 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 08:19:29.646814713 +0000 UTC m=+147.211806725 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 08:18:25 crc kubenswrapper[5004]: E1201 08:18:25.646906 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 08:19:29.646873295 +0000 UTC m=+147.211865317 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 08:18:25 crc kubenswrapper[5004]: E1201 08:18:25.646787 5004 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 08:18:25 crc kubenswrapper[5004]: E1201 08:18:25.646975 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-01 08:19:29.646962697 +0000 UTC m=+147.211954719 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 08:18:25 crc kubenswrapper[5004]: I1201 08:18:25.696691 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:25 crc kubenswrapper[5004]: I1201 08:18:25.696741 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:25 crc kubenswrapper[5004]: I1201 08:18:25.696757 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:25 crc kubenswrapper[5004]: I1201 08:18:25.696778 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:25 crc kubenswrapper[5004]: I1201 08:18:25.696796 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:25Z","lastTransitionTime":"2025-12-01T08:18:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:18:25 crc kubenswrapper[5004]: I1201 08:18:25.758889 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 08:18:25 crc kubenswrapper[5004]: I1201 08:18:25.758946 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 08:18:25 crc kubenswrapper[5004]: I1201 08:18:25.758901 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 08:18:25 crc kubenswrapper[5004]: E1201 08:18:25.759043 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 08:18:25 crc kubenswrapper[5004]: E1201 08:18:25.759220 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 08:18:25 crc kubenswrapper[5004]: E1201 08:18:25.759328 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 08:18:25 crc kubenswrapper[5004]: I1201 08:18:25.799702 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:25 crc kubenswrapper[5004]: I1201 08:18:25.799759 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:25 crc kubenswrapper[5004]: I1201 08:18:25.799779 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:25 crc kubenswrapper[5004]: I1201 08:18:25.799803 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:25 crc kubenswrapper[5004]: I1201 08:18:25.799821 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:25Z","lastTransitionTime":"2025-12-01T08:18:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:25 crc kubenswrapper[5004]: I1201 08:18:25.902838 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:25 crc kubenswrapper[5004]: I1201 08:18:25.902915 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:25 crc kubenswrapper[5004]: I1201 08:18:25.902938 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:25 crc kubenswrapper[5004]: I1201 08:18:25.902967 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:25 crc kubenswrapper[5004]: I1201 08:18:25.902989 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:25Z","lastTransitionTime":"2025-12-01T08:18:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:26 crc kubenswrapper[5004]: I1201 08:18:26.005670 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:26 crc kubenswrapper[5004]: I1201 08:18:26.005733 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:26 crc kubenswrapper[5004]: I1201 08:18:26.005751 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:26 crc kubenswrapper[5004]: I1201 08:18:26.005775 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:26 crc kubenswrapper[5004]: I1201 08:18:26.005792 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:26Z","lastTransitionTime":"2025-12-01T08:18:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:26 crc kubenswrapper[5004]: I1201 08:18:26.109164 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:26 crc kubenswrapper[5004]: I1201 08:18:26.109224 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:26 crc kubenswrapper[5004]: I1201 08:18:26.109240 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:26 crc kubenswrapper[5004]: I1201 08:18:26.109267 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:26 crc kubenswrapper[5004]: I1201 08:18:26.109285 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:26Z","lastTransitionTime":"2025-12-01T08:18:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:26 crc kubenswrapper[5004]: I1201 08:18:26.141914 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:26 crc kubenswrapper[5004]: I1201 08:18:26.141961 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:26 crc kubenswrapper[5004]: I1201 08:18:26.141978 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:26 crc kubenswrapper[5004]: I1201 08:18:26.142002 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:26 crc kubenswrapper[5004]: I1201 08:18:26.142025 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:26Z","lastTransitionTime":"2025-12-01T08:18:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:26 crc kubenswrapper[5004]: E1201 08:18:26.163312 5004 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:18:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:18:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:18:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:18:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:18:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:18:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:18:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:18:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c8341267-df98-484f-a5e7-cf024fab437c\\\",\\\"systemUUID\\\":\\\"77547a65-c048-47ee-89b1-8422fc81b7aa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:26Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:26 crc kubenswrapper[5004]: I1201 08:18:26.168007 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:26 crc kubenswrapper[5004]: I1201 08:18:26.168071 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:26 crc kubenswrapper[5004]: I1201 08:18:26.168094 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:26 crc kubenswrapper[5004]: I1201 08:18:26.168122 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:26 crc kubenswrapper[5004]: I1201 08:18:26.168143 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:26Z","lastTransitionTime":"2025-12-01T08:18:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:26 crc kubenswrapper[5004]: E1201 08:18:26.188526 5004 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:18:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:18:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:18:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:18:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:18:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:18:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:18:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:18:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c8341267-df98-484f-a5e7-cf024fab437c\\\",\\\"systemUUID\\\":\\\"77547a65-c048-47ee-89b1-8422fc81b7aa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:26Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:26 crc kubenswrapper[5004]: I1201 08:18:26.194044 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:26 crc kubenswrapper[5004]: I1201 08:18:26.194097 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:26 crc kubenswrapper[5004]: I1201 08:18:26.194114 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:26 crc kubenswrapper[5004]: I1201 08:18:26.194135 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:26 crc kubenswrapper[5004]: I1201 08:18:26.194150 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:26Z","lastTransitionTime":"2025-12-01T08:18:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:26 crc kubenswrapper[5004]: E1201 08:18:26.213009 5004 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:18:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:18:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:18:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:18:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:18:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:18:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:18:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:18:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c8341267-df98-484f-a5e7-cf024fab437c\\\",\\\"systemUUID\\\":\\\"77547a65-c048-47ee-89b1-8422fc81b7aa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:26Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:26 crc kubenswrapper[5004]: I1201 08:18:26.217962 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:26 crc kubenswrapper[5004]: I1201 08:18:26.218013 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:26 crc kubenswrapper[5004]: I1201 08:18:26.218029 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:26 crc kubenswrapper[5004]: I1201 08:18:26.218055 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:26 crc kubenswrapper[5004]: I1201 08:18:26.218079 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:26Z","lastTransitionTime":"2025-12-01T08:18:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:26 crc kubenswrapper[5004]: E1201 08:18:26.237819 5004 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:18:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:18:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:18:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:18:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:18:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:18:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:18:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:18:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c8341267-df98-484f-a5e7-cf024fab437c\\\",\\\"systemUUID\\\":\\\"77547a65-c048-47ee-89b1-8422fc81b7aa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:26Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:26 crc kubenswrapper[5004]: I1201 08:18:26.243355 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:26 crc kubenswrapper[5004]: I1201 08:18:26.243459 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:26 crc kubenswrapper[5004]: I1201 08:18:26.243493 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:26 crc kubenswrapper[5004]: I1201 08:18:26.243532 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:26 crc kubenswrapper[5004]: I1201 08:18:26.243557 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:26Z","lastTransitionTime":"2025-12-01T08:18:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:26 crc kubenswrapper[5004]: E1201 08:18:26.267855 5004 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:18:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:18:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:18:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:18:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:18:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:18:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T08:18:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T08:18:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c8341267-df98-484f-a5e7-cf024fab437c\\\",\\\"systemUUID\\\":\\\"77547a65-c048-47ee-89b1-8422fc81b7aa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T08:18:26Z is after 2025-08-24T17:21:41Z" Dec 01 08:18:26 crc kubenswrapper[5004]: E1201 08:18:26.268197 5004 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 01 08:18:26 crc kubenswrapper[5004]: I1201 08:18:26.270326 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:26 crc kubenswrapper[5004]: I1201 08:18:26.270391 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:26 crc kubenswrapper[5004]: I1201 08:18:26.270416 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:26 crc kubenswrapper[5004]: I1201 08:18:26.270449 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:26 crc kubenswrapper[5004]: I1201 08:18:26.270470 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:26Z","lastTransitionTime":"2025-12-01T08:18:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:26 crc kubenswrapper[5004]: I1201 08:18:26.373391 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:26 crc kubenswrapper[5004]: I1201 08:18:26.373459 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:26 crc kubenswrapper[5004]: I1201 08:18:26.373483 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:26 crc kubenswrapper[5004]: I1201 08:18:26.373511 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:26 crc kubenswrapper[5004]: I1201 08:18:26.373534 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:26Z","lastTransitionTime":"2025-12-01T08:18:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:26 crc kubenswrapper[5004]: I1201 08:18:26.477975 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:26 crc kubenswrapper[5004]: I1201 08:18:26.478046 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:26 crc kubenswrapper[5004]: I1201 08:18:26.478069 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:26 crc kubenswrapper[5004]: I1201 08:18:26.478099 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:26 crc kubenswrapper[5004]: I1201 08:18:26.478119 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:26Z","lastTransitionTime":"2025-12-01T08:18:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:26 crc kubenswrapper[5004]: I1201 08:18:26.581744 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:26 crc kubenswrapper[5004]: I1201 08:18:26.581824 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:26 crc kubenswrapper[5004]: I1201 08:18:26.581849 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:26 crc kubenswrapper[5004]: I1201 08:18:26.581881 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:26 crc kubenswrapper[5004]: I1201 08:18:26.581903 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:26Z","lastTransitionTime":"2025-12-01T08:18:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:26 crc kubenswrapper[5004]: I1201 08:18:26.684862 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:26 crc kubenswrapper[5004]: I1201 08:18:26.684919 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:26 crc kubenswrapper[5004]: I1201 08:18:26.684935 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:26 crc kubenswrapper[5004]: I1201 08:18:26.684959 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:26 crc kubenswrapper[5004]: I1201 08:18:26.684978 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:26Z","lastTransitionTime":"2025-12-01T08:18:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:18:26 crc kubenswrapper[5004]: I1201 08:18:26.758890 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7cl5l" Dec 01 08:18:26 crc kubenswrapper[5004]: E1201 08:18:26.759132 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7cl5l" podUID="b488f4f3-d385-4d40-bdee-96d8fe2d42a1" Dec 01 08:18:26 crc kubenswrapper[5004]: I1201 08:18:26.788029 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:26 crc kubenswrapper[5004]: I1201 08:18:26.788084 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:26 crc kubenswrapper[5004]: I1201 08:18:26.788100 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:26 crc kubenswrapper[5004]: I1201 08:18:26.788124 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:26 crc kubenswrapper[5004]: I1201 08:18:26.788147 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:26Z","lastTransitionTime":"2025-12-01T08:18:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:26 crc kubenswrapper[5004]: I1201 08:18:26.891248 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:26 crc kubenswrapper[5004]: I1201 08:18:26.891293 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:26 crc kubenswrapper[5004]: I1201 08:18:26.891310 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:26 crc kubenswrapper[5004]: I1201 08:18:26.891333 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:26 crc kubenswrapper[5004]: I1201 08:18:26.891352 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:26Z","lastTransitionTime":"2025-12-01T08:18:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:26 crc kubenswrapper[5004]: I1201 08:18:26.995054 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:26 crc kubenswrapper[5004]: I1201 08:18:26.995127 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:26 crc kubenswrapper[5004]: I1201 08:18:26.995146 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:26 crc kubenswrapper[5004]: I1201 08:18:26.995171 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:26 crc kubenswrapper[5004]: I1201 08:18:26.995192 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:26Z","lastTransitionTime":"2025-12-01T08:18:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:27 crc kubenswrapper[5004]: I1201 08:18:27.098188 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:27 crc kubenswrapper[5004]: I1201 08:18:27.098240 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:27 crc kubenswrapper[5004]: I1201 08:18:27.098252 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:27 crc kubenswrapper[5004]: I1201 08:18:27.098268 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:27 crc kubenswrapper[5004]: I1201 08:18:27.098279 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:27Z","lastTransitionTime":"2025-12-01T08:18:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:27 crc kubenswrapper[5004]: I1201 08:18:27.203878 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:27 crc kubenswrapper[5004]: I1201 08:18:27.203963 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:27 crc kubenswrapper[5004]: I1201 08:18:27.204043 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:27 crc kubenswrapper[5004]: I1201 08:18:27.204078 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:27 crc kubenswrapper[5004]: I1201 08:18:27.204099 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:27Z","lastTransitionTime":"2025-12-01T08:18:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:27 crc kubenswrapper[5004]: I1201 08:18:27.306771 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:27 crc kubenswrapper[5004]: I1201 08:18:27.306902 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:27 crc kubenswrapper[5004]: I1201 08:18:27.306920 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:27 crc kubenswrapper[5004]: I1201 08:18:27.306944 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:27 crc kubenswrapper[5004]: I1201 08:18:27.306961 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:27Z","lastTransitionTime":"2025-12-01T08:18:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:27 crc kubenswrapper[5004]: I1201 08:18:27.410242 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:27 crc kubenswrapper[5004]: I1201 08:18:27.410300 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:27 crc kubenswrapper[5004]: I1201 08:18:27.410317 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:27 crc kubenswrapper[5004]: I1201 08:18:27.410340 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:27 crc kubenswrapper[5004]: I1201 08:18:27.410356 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:27Z","lastTransitionTime":"2025-12-01T08:18:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:27 crc kubenswrapper[5004]: I1201 08:18:27.513277 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:27 crc kubenswrapper[5004]: I1201 08:18:27.513342 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:27 crc kubenswrapper[5004]: I1201 08:18:27.513359 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:27 crc kubenswrapper[5004]: I1201 08:18:27.513390 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:27 crc kubenswrapper[5004]: I1201 08:18:27.513501 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:27Z","lastTransitionTime":"2025-12-01T08:18:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:27 crc kubenswrapper[5004]: I1201 08:18:27.616431 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:27 crc kubenswrapper[5004]: I1201 08:18:27.616503 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:27 crc kubenswrapper[5004]: I1201 08:18:27.616528 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:27 crc kubenswrapper[5004]: I1201 08:18:27.616595 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:27 crc kubenswrapper[5004]: I1201 08:18:27.616617 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:27Z","lastTransitionTime":"2025-12-01T08:18:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:27 crc kubenswrapper[5004]: I1201 08:18:27.719422 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:27 crc kubenswrapper[5004]: I1201 08:18:27.719496 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:27 crc kubenswrapper[5004]: I1201 08:18:27.719521 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:27 crc kubenswrapper[5004]: I1201 08:18:27.719547 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:27 crc kubenswrapper[5004]: I1201 08:18:27.719597 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:27Z","lastTransitionTime":"2025-12-01T08:18:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:18:27 crc kubenswrapper[5004]: I1201 08:18:27.758144 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 08:18:27 crc kubenswrapper[5004]: I1201 08:18:27.758179 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 08:18:27 crc kubenswrapper[5004]: I1201 08:18:27.758210 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 08:18:27 crc kubenswrapper[5004]: E1201 08:18:27.758319 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 08:18:27 crc kubenswrapper[5004]: E1201 08:18:27.758445 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 08:18:27 crc kubenswrapper[5004]: E1201 08:18:27.758662 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 08:18:27 crc kubenswrapper[5004]: I1201 08:18:27.821596 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:27 crc kubenswrapper[5004]: I1201 08:18:27.821641 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:27 crc kubenswrapper[5004]: I1201 08:18:27.821668 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:27 crc kubenswrapper[5004]: I1201 08:18:27.821691 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:27 crc kubenswrapper[5004]: I1201 08:18:27.821707 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:27Z","lastTransitionTime":"2025-12-01T08:18:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:27 crc kubenswrapper[5004]: I1201 08:18:27.924106 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:27 crc kubenswrapper[5004]: I1201 08:18:27.924181 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:27 crc kubenswrapper[5004]: I1201 08:18:27.924206 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:27 crc kubenswrapper[5004]: I1201 08:18:27.924235 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:27 crc kubenswrapper[5004]: I1201 08:18:27.924255 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:27Z","lastTransitionTime":"2025-12-01T08:18:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:28 crc kubenswrapper[5004]: I1201 08:18:28.027165 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:28 crc kubenswrapper[5004]: I1201 08:18:28.027228 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:28 crc kubenswrapper[5004]: I1201 08:18:28.027241 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:28 crc kubenswrapper[5004]: I1201 08:18:28.027258 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:28 crc kubenswrapper[5004]: I1201 08:18:28.027270 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:28Z","lastTransitionTime":"2025-12-01T08:18:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:28 crc kubenswrapper[5004]: I1201 08:18:28.129488 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:28 crc kubenswrapper[5004]: I1201 08:18:28.129555 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:28 crc kubenswrapper[5004]: I1201 08:18:28.129597 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:28 crc kubenswrapper[5004]: I1201 08:18:28.129631 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:28 crc kubenswrapper[5004]: I1201 08:18:28.129644 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:28Z","lastTransitionTime":"2025-12-01T08:18:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:28 crc kubenswrapper[5004]: I1201 08:18:28.232291 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:28 crc kubenswrapper[5004]: I1201 08:18:28.232491 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:28 crc kubenswrapper[5004]: I1201 08:18:28.232525 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:28 crc kubenswrapper[5004]: I1201 08:18:28.232605 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:28 crc kubenswrapper[5004]: I1201 08:18:28.232646 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:28Z","lastTransitionTime":"2025-12-01T08:18:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:28 crc kubenswrapper[5004]: I1201 08:18:28.334670 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:28 crc kubenswrapper[5004]: I1201 08:18:28.334751 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:28 crc kubenswrapper[5004]: I1201 08:18:28.334772 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:28 crc kubenswrapper[5004]: I1201 08:18:28.334804 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:28 crc kubenswrapper[5004]: I1201 08:18:28.334831 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:28Z","lastTransitionTime":"2025-12-01T08:18:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:28 crc kubenswrapper[5004]: I1201 08:18:28.437374 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:28 crc kubenswrapper[5004]: I1201 08:18:28.437439 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:28 crc kubenswrapper[5004]: I1201 08:18:28.437457 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:28 crc kubenswrapper[5004]: I1201 08:18:28.437485 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:28 crc kubenswrapper[5004]: I1201 08:18:28.437503 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:28Z","lastTransitionTime":"2025-12-01T08:18:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:28 crc kubenswrapper[5004]: I1201 08:18:28.540272 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:28 crc kubenswrapper[5004]: I1201 08:18:28.540351 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:28 crc kubenswrapper[5004]: I1201 08:18:28.540374 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:28 crc kubenswrapper[5004]: I1201 08:18:28.540399 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:28 crc kubenswrapper[5004]: I1201 08:18:28.540418 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:28Z","lastTransitionTime":"2025-12-01T08:18:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:28 crc kubenswrapper[5004]: I1201 08:18:28.643888 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:28 crc kubenswrapper[5004]: I1201 08:18:28.643955 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:28 crc kubenswrapper[5004]: I1201 08:18:28.643972 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:28 crc kubenswrapper[5004]: I1201 08:18:28.643995 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:28 crc kubenswrapper[5004]: I1201 08:18:28.644013 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:28Z","lastTransitionTime":"2025-12-01T08:18:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:28 crc kubenswrapper[5004]: I1201 08:18:28.748228 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:28 crc kubenswrapper[5004]: I1201 08:18:28.748303 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:28 crc kubenswrapper[5004]: I1201 08:18:28.748321 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:28 crc kubenswrapper[5004]: I1201 08:18:28.748348 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:28 crc kubenswrapper[5004]: I1201 08:18:28.748364 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:28Z","lastTransitionTime":"2025-12-01T08:18:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:18:28 crc kubenswrapper[5004]: I1201 08:18:28.758692 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7cl5l" Dec 01 08:18:28 crc kubenswrapper[5004]: E1201 08:18:28.759135 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7cl5l" podUID="b488f4f3-d385-4d40-bdee-96d8fe2d42a1" Dec 01 08:18:28 crc kubenswrapper[5004]: I1201 08:18:28.850744 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:28 crc kubenswrapper[5004]: I1201 08:18:28.850805 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:28 crc kubenswrapper[5004]: I1201 08:18:28.850827 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:28 crc kubenswrapper[5004]: I1201 08:18:28.850857 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:28 crc kubenswrapper[5004]: I1201 08:18:28.850874 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:28Z","lastTransitionTime":"2025-12-01T08:18:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:28 crc kubenswrapper[5004]: I1201 08:18:28.953732 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:28 crc kubenswrapper[5004]: I1201 08:18:28.953798 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:28 crc kubenswrapper[5004]: I1201 08:18:28.953814 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:28 crc kubenswrapper[5004]: I1201 08:18:28.953839 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:28 crc kubenswrapper[5004]: I1201 08:18:28.953857 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:28Z","lastTransitionTime":"2025-12-01T08:18:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:29 crc kubenswrapper[5004]: I1201 08:18:29.057048 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:29 crc kubenswrapper[5004]: I1201 08:18:29.057545 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:29 crc kubenswrapper[5004]: I1201 08:18:29.057784 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:29 crc kubenswrapper[5004]: I1201 08:18:29.058000 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:29 crc kubenswrapper[5004]: I1201 08:18:29.058224 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:29Z","lastTransitionTime":"2025-12-01T08:18:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:29 crc kubenswrapper[5004]: I1201 08:18:29.161605 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:29 crc kubenswrapper[5004]: I1201 08:18:29.161678 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:29 crc kubenswrapper[5004]: I1201 08:18:29.161701 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:29 crc kubenswrapper[5004]: I1201 08:18:29.161732 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:29 crc kubenswrapper[5004]: I1201 08:18:29.161753 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:29Z","lastTransitionTime":"2025-12-01T08:18:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:29 crc kubenswrapper[5004]: I1201 08:18:29.264392 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:29 crc kubenswrapper[5004]: I1201 08:18:29.264463 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:29 crc kubenswrapper[5004]: I1201 08:18:29.264483 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:29 crc kubenswrapper[5004]: I1201 08:18:29.264511 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:29 crc kubenswrapper[5004]: I1201 08:18:29.264528 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:29Z","lastTransitionTime":"2025-12-01T08:18:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:29 crc kubenswrapper[5004]: I1201 08:18:29.366821 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:29 crc kubenswrapper[5004]: I1201 08:18:29.366967 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:29 crc kubenswrapper[5004]: I1201 08:18:29.367005 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:29 crc kubenswrapper[5004]: I1201 08:18:29.367033 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:29 crc kubenswrapper[5004]: I1201 08:18:29.367058 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:29Z","lastTransitionTime":"2025-12-01T08:18:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:29 crc kubenswrapper[5004]: I1201 08:18:29.469938 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:29 crc kubenswrapper[5004]: I1201 08:18:29.470024 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:29 crc kubenswrapper[5004]: I1201 08:18:29.470047 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:29 crc kubenswrapper[5004]: I1201 08:18:29.470079 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:29 crc kubenswrapper[5004]: I1201 08:18:29.470104 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:29Z","lastTransitionTime":"2025-12-01T08:18:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:29 crc kubenswrapper[5004]: I1201 08:18:29.572651 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:29 crc kubenswrapper[5004]: I1201 08:18:29.572691 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:29 crc kubenswrapper[5004]: I1201 08:18:29.572701 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:29 crc kubenswrapper[5004]: I1201 08:18:29.572715 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:29 crc kubenswrapper[5004]: I1201 08:18:29.572726 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:29Z","lastTransitionTime":"2025-12-01T08:18:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:29 crc kubenswrapper[5004]: I1201 08:18:29.675919 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:29 crc kubenswrapper[5004]: I1201 08:18:29.675987 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:29 crc kubenswrapper[5004]: I1201 08:18:29.676010 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:29 crc kubenswrapper[5004]: I1201 08:18:29.676039 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:29 crc kubenswrapper[5004]: I1201 08:18:29.676061 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:29Z","lastTransitionTime":"2025-12-01T08:18:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:18:29 crc kubenswrapper[5004]: I1201 08:18:29.758360 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 08:18:29 crc kubenswrapper[5004]: I1201 08:18:29.758390 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 08:18:29 crc kubenswrapper[5004]: E1201 08:18:29.758484 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 08:18:29 crc kubenswrapper[5004]: I1201 08:18:29.758378 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 08:18:29 crc kubenswrapper[5004]: E1201 08:18:29.758714 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 08:18:29 crc kubenswrapper[5004]: E1201 08:18:29.758746 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 08:18:29 crc kubenswrapper[5004]: I1201 08:18:29.778208 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:29 crc kubenswrapper[5004]: I1201 08:18:29.778273 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:29 crc kubenswrapper[5004]: I1201 08:18:29.778294 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:29 crc kubenswrapper[5004]: I1201 08:18:29.778321 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:29 crc kubenswrapper[5004]: I1201 08:18:29.778342 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:29Z","lastTransitionTime":"2025-12-01T08:18:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:29 crc kubenswrapper[5004]: I1201 08:18:29.881206 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:29 crc kubenswrapper[5004]: I1201 08:18:29.881268 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:29 crc kubenswrapper[5004]: I1201 08:18:29.881287 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:29 crc kubenswrapper[5004]: I1201 08:18:29.881311 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:29 crc kubenswrapper[5004]: I1201 08:18:29.881332 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:29Z","lastTransitionTime":"2025-12-01T08:18:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:29 crc kubenswrapper[5004]: I1201 08:18:29.984102 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:29 crc kubenswrapper[5004]: I1201 08:18:29.984217 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:29 crc kubenswrapper[5004]: I1201 08:18:29.984243 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:29 crc kubenswrapper[5004]: I1201 08:18:29.984273 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:29 crc kubenswrapper[5004]: I1201 08:18:29.984296 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:29Z","lastTransitionTime":"2025-12-01T08:18:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:30 crc kubenswrapper[5004]: I1201 08:18:30.087419 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:30 crc kubenswrapper[5004]: I1201 08:18:30.087497 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:30 crc kubenswrapper[5004]: I1201 08:18:30.087525 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:30 crc kubenswrapper[5004]: I1201 08:18:30.087604 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:30 crc kubenswrapper[5004]: I1201 08:18:30.087630 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:30Z","lastTransitionTime":"2025-12-01T08:18:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:30 crc kubenswrapper[5004]: I1201 08:18:30.190176 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:30 crc kubenswrapper[5004]: I1201 08:18:30.190245 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:30 crc kubenswrapper[5004]: I1201 08:18:30.190267 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:30 crc kubenswrapper[5004]: I1201 08:18:30.190298 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:30 crc kubenswrapper[5004]: I1201 08:18:30.190320 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:30Z","lastTransitionTime":"2025-12-01T08:18:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:30 crc kubenswrapper[5004]: I1201 08:18:30.293542 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:30 crc kubenswrapper[5004]: I1201 08:18:30.293656 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:30 crc kubenswrapper[5004]: I1201 08:18:30.293680 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:30 crc kubenswrapper[5004]: I1201 08:18:30.293711 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:30 crc kubenswrapper[5004]: I1201 08:18:30.293804 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:30Z","lastTransitionTime":"2025-12-01T08:18:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:30 crc kubenswrapper[5004]: I1201 08:18:30.396601 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:30 crc kubenswrapper[5004]: I1201 08:18:30.396671 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:30 crc kubenswrapper[5004]: I1201 08:18:30.396695 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:30 crc kubenswrapper[5004]: I1201 08:18:30.396724 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:30 crc kubenswrapper[5004]: I1201 08:18:30.396744 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:30Z","lastTransitionTime":"2025-12-01T08:18:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:30 crc kubenswrapper[5004]: I1201 08:18:30.498953 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:30 crc kubenswrapper[5004]: I1201 08:18:30.499025 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:30 crc kubenswrapper[5004]: I1201 08:18:30.499047 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:30 crc kubenswrapper[5004]: I1201 08:18:30.499076 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:30 crc kubenswrapper[5004]: I1201 08:18:30.499099 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:30Z","lastTransitionTime":"2025-12-01T08:18:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:30 crc kubenswrapper[5004]: I1201 08:18:30.601675 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:30 crc kubenswrapper[5004]: I1201 08:18:30.601749 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:30 crc kubenswrapper[5004]: I1201 08:18:30.601775 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:30 crc kubenswrapper[5004]: I1201 08:18:30.601807 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:30 crc kubenswrapper[5004]: I1201 08:18:30.601831 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:30Z","lastTransitionTime":"2025-12-01T08:18:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:30 crc kubenswrapper[5004]: I1201 08:18:30.704608 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:30 crc kubenswrapper[5004]: I1201 08:18:30.704654 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:30 crc kubenswrapper[5004]: I1201 08:18:30.704669 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:30 crc kubenswrapper[5004]: I1201 08:18:30.704692 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:30 crc kubenswrapper[5004]: I1201 08:18:30.704708 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:30Z","lastTransitionTime":"2025-12-01T08:18:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:18:30 crc kubenswrapper[5004]: I1201 08:18:30.758185 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7cl5l" Dec 01 08:18:30 crc kubenswrapper[5004]: E1201 08:18:30.758361 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7cl5l" podUID="b488f4f3-d385-4d40-bdee-96d8fe2d42a1" Dec 01 08:18:30 crc kubenswrapper[5004]: I1201 08:18:30.807409 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:30 crc kubenswrapper[5004]: I1201 08:18:30.807464 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:30 crc kubenswrapper[5004]: I1201 08:18:30.807482 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:30 crc kubenswrapper[5004]: I1201 08:18:30.807508 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:30 crc kubenswrapper[5004]: I1201 08:18:30.807526 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:30Z","lastTransitionTime":"2025-12-01T08:18:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:30 crc kubenswrapper[5004]: I1201 08:18:30.910060 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:30 crc kubenswrapper[5004]: I1201 08:18:30.910147 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:30 crc kubenswrapper[5004]: I1201 08:18:30.910164 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:30 crc kubenswrapper[5004]: I1201 08:18:30.910221 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:30 crc kubenswrapper[5004]: I1201 08:18:30.910238 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:30Z","lastTransitionTime":"2025-12-01T08:18:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:31 crc kubenswrapper[5004]: I1201 08:18:31.013860 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:31 crc kubenswrapper[5004]: I1201 08:18:31.013962 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:31 crc kubenswrapper[5004]: I1201 08:18:31.013981 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:31 crc kubenswrapper[5004]: I1201 08:18:31.014005 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:31 crc kubenswrapper[5004]: I1201 08:18:31.014022 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:31Z","lastTransitionTime":"2025-12-01T08:18:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:31 crc kubenswrapper[5004]: I1201 08:18:31.117134 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:31 crc kubenswrapper[5004]: I1201 08:18:31.117212 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:31 crc kubenswrapper[5004]: I1201 08:18:31.117236 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:31 crc kubenswrapper[5004]: I1201 08:18:31.117265 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:31 crc kubenswrapper[5004]: I1201 08:18:31.117288 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:31Z","lastTransitionTime":"2025-12-01T08:18:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:31 crc kubenswrapper[5004]: I1201 08:18:31.219884 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:31 crc kubenswrapper[5004]: I1201 08:18:31.219948 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:31 crc kubenswrapper[5004]: I1201 08:18:31.219966 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:31 crc kubenswrapper[5004]: I1201 08:18:31.219990 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:31 crc kubenswrapper[5004]: I1201 08:18:31.220009 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:31Z","lastTransitionTime":"2025-12-01T08:18:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:31 crc kubenswrapper[5004]: I1201 08:18:31.323555 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:31 crc kubenswrapper[5004]: I1201 08:18:31.323672 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:31 crc kubenswrapper[5004]: I1201 08:18:31.323691 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:31 crc kubenswrapper[5004]: I1201 08:18:31.323716 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:31 crc kubenswrapper[5004]: I1201 08:18:31.323733 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:31Z","lastTransitionTime":"2025-12-01T08:18:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:31 crc kubenswrapper[5004]: I1201 08:18:31.426875 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:31 crc kubenswrapper[5004]: I1201 08:18:31.426937 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:31 crc kubenswrapper[5004]: I1201 08:18:31.426957 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:31 crc kubenswrapper[5004]: I1201 08:18:31.426983 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:31 crc kubenswrapper[5004]: I1201 08:18:31.426999 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:31Z","lastTransitionTime":"2025-12-01T08:18:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:31 crc kubenswrapper[5004]: I1201 08:18:31.529866 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:31 crc kubenswrapper[5004]: I1201 08:18:31.529928 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:31 crc kubenswrapper[5004]: I1201 08:18:31.529944 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:31 crc kubenswrapper[5004]: I1201 08:18:31.529968 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:31 crc kubenswrapper[5004]: I1201 08:18:31.530000 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:31Z","lastTransitionTime":"2025-12-01T08:18:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:31 crc kubenswrapper[5004]: I1201 08:18:31.633097 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:31 crc kubenswrapper[5004]: I1201 08:18:31.633182 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:31 crc kubenswrapper[5004]: I1201 08:18:31.633205 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:31 crc kubenswrapper[5004]: I1201 08:18:31.633233 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:31 crc kubenswrapper[5004]: I1201 08:18:31.633250 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:31Z","lastTransitionTime":"2025-12-01T08:18:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:31 crc kubenswrapper[5004]: I1201 08:18:31.736339 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:31 crc kubenswrapper[5004]: I1201 08:18:31.736400 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:31 crc kubenswrapper[5004]: I1201 08:18:31.736421 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:31 crc kubenswrapper[5004]: I1201 08:18:31.736449 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:31 crc kubenswrapper[5004]: I1201 08:18:31.736471 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:31Z","lastTransitionTime":"2025-12-01T08:18:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:18:31 crc kubenswrapper[5004]: I1201 08:18:31.758239 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 08:18:31 crc kubenswrapper[5004]: I1201 08:18:31.758295 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 08:18:31 crc kubenswrapper[5004]: I1201 08:18:31.758330 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 08:18:31 crc kubenswrapper[5004]: E1201 08:18:31.758461 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 08:18:31 crc kubenswrapper[5004]: E1201 08:18:31.758694 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 08:18:31 crc kubenswrapper[5004]: E1201 08:18:31.759035 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 08:18:31 crc kubenswrapper[5004]: I1201 08:18:31.839748 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:31 crc kubenswrapper[5004]: I1201 08:18:31.839810 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:31 crc kubenswrapper[5004]: I1201 08:18:31.839831 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:31 crc kubenswrapper[5004]: I1201 08:18:31.839861 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:31 crc kubenswrapper[5004]: I1201 08:18:31.839880 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:31Z","lastTransitionTime":"2025-12-01T08:18:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:31 crc kubenswrapper[5004]: I1201 08:18:31.943241 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:31 crc kubenswrapper[5004]: I1201 08:18:31.943293 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:31 crc kubenswrapper[5004]: I1201 08:18:31.943310 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:31 crc kubenswrapper[5004]: I1201 08:18:31.943332 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:31 crc kubenswrapper[5004]: I1201 08:18:31.943348 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:31Z","lastTransitionTime":"2025-12-01T08:18:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:32 crc kubenswrapper[5004]: I1201 08:18:32.045963 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:32 crc kubenswrapper[5004]: I1201 08:18:32.046031 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:32 crc kubenswrapper[5004]: I1201 08:18:32.046053 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:32 crc kubenswrapper[5004]: I1201 08:18:32.046083 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:32 crc kubenswrapper[5004]: I1201 08:18:32.046105 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:32Z","lastTransitionTime":"2025-12-01T08:18:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:32 crc kubenswrapper[5004]: I1201 08:18:32.149268 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:32 crc kubenswrapper[5004]: I1201 08:18:32.149320 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:32 crc kubenswrapper[5004]: I1201 08:18:32.149336 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:32 crc kubenswrapper[5004]: I1201 08:18:32.149359 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:32 crc kubenswrapper[5004]: I1201 08:18:32.149376 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:32Z","lastTransitionTime":"2025-12-01T08:18:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:32 crc kubenswrapper[5004]: I1201 08:18:32.251305 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:32 crc kubenswrapper[5004]: I1201 08:18:32.251347 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:32 crc kubenswrapper[5004]: I1201 08:18:32.251363 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:32 crc kubenswrapper[5004]: I1201 08:18:32.251383 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:32 crc kubenswrapper[5004]: I1201 08:18:32.251395 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:32Z","lastTransitionTime":"2025-12-01T08:18:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:32 crc kubenswrapper[5004]: I1201 08:18:32.353679 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:32 crc kubenswrapper[5004]: I1201 08:18:32.353735 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:32 crc kubenswrapper[5004]: I1201 08:18:32.353755 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:32 crc kubenswrapper[5004]: I1201 08:18:32.353817 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:32 crc kubenswrapper[5004]: I1201 08:18:32.353839 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:32Z","lastTransitionTime":"2025-12-01T08:18:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:32 crc kubenswrapper[5004]: I1201 08:18:32.456181 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:32 crc kubenswrapper[5004]: I1201 08:18:32.456230 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:32 crc kubenswrapper[5004]: I1201 08:18:32.456245 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:32 crc kubenswrapper[5004]: I1201 08:18:32.456266 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:32 crc kubenswrapper[5004]: I1201 08:18:32.456281 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:32Z","lastTransitionTime":"2025-12-01T08:18:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:32 crc kubenswrapper[5004]: I1201 08:18:32.558146 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:32 crc kubenswrapper[5004]: I1201 08:18:32.558203 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:32 crc kubenswrapper[5004]: I1201 08:18:32.558217 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:32 crc kubenswrapper[5004]: I1201 08:18:32.558236 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:32 crc kubenswrapper[5004]: I1201 08:18:32.558249 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:32Z","lastTransitionTime":"2025-12-01T08:18:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:32 crc kubenswrapper[5004]: I1201 08:18:32.660763 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:32 crc kubenswrapper[5004]: I1201 08:18:32.660796 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:32 crc kubenswrapper[5004]: I1201 08:18:32.660805 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:32 crc kubenswrapper[5004]: I1201 08:18:32.660838 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:32 crc kubenswrapper[5004]: I1201 08:18:32.660847 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:32Z","lastTransitionTime":"2025-12-01T08:18:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:18:32 crc kubenswrapper[5004]: I1201 08:18:32.758003 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7cl5l" Dec 01 08:18:32 crc kubenswrapper[5004]: E1201 08:18:32.758204 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7cl5l" podUID="b488f4f3-d385-4d40-bdee-96d8fe2d42a1" Dec 01 08:18:32 crc kubenswrapper[5004]: I1201 08:18:32.763973 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:32 crc kubenswrapper[5004]: I1201 08:18:32.764005 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:32 crc kubenswrapper[5004]: I1201 08:18:32.764016 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:32 crc kubenswrapper[5004]: I1201 08:18:32.764028 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:32 crc kubenswrapper[5004]: I1201 08:18:32.764041 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:32Z","lastTransitionTime":"2025-12-01T08:18:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:32 crc kubenswrapper[5004]: I1201 08:18:32.785523 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=70.785509934 podStartE2EDuration="1m10.785509934s" podCreationTimestamp="2025-12-01 08:17:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:18:32.784073838 +0000 UTC m=+90.349065830" watchObservedRunningTime="2025-12-01 08:18:32.785509934 +0000 UTC m=+90.350501916" Dec 01 08:18:32 crc kubenswrapper[5004]: I1201 08:18:32.805068 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-dpkxw" podStartSLOduration=70.80503263 podStartE2EDuration="1m10.80503263s" podCreationTimestamp="2025-12-01 08:17:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:18:32.803113812 +0000 UTC m=+90.368105784" watchObservedRunningTime="2025-12-01 08:18:32.80503263 +0000 UTC m=+90.370024612" Dec 01 08:18:32 crc kubenswrapper[5004]: I1201 08:18:32.813819 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-ww6lq" podStartSLOduration=70.813798858 podStartE2EDuration="1m10.813798858s" podCreationTimestamp="2025-12-01 08:17:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:18:32.812763492 +0000 UTC m=+90.377755484" watchObservedRunningTime="2025-12-01 08:18:32.813798858 +0000 UTC m=+90.378790850" Dec 01 08:18:32 crc kubenswrapper[5004]: I1201 08:18:32.829377 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
podStartSLOduration=34.829357856 podStartE2EDuration="34.829357856s" podCreationTimestamp="2025-12-01 08:17:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:18:32.827394997 +0000 UTC m=+90.392386979" watchObservedRunningTime="2025-12-01 08:18:32.829357856 +0000 UTC m=+90.394349838" Dec 01 08:18:32 crc kubenswrapper[5004]: I1201 08:18:32.848974 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=71.848947893 podStartE2EDuration="1m11.848947893s" podCreationTimestamp="2025-12-01 08:17:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:18:32.844460562 +0000 UTC m=+90.409452564" watchObservedRunningTime="2025-12-01 08:18:32.848947893 +0000 UTC m=+90.413939885" Dec 01 08:18:32 crc kubenswrapper[5004]: I1201 08:18:32.865824 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:32 crc kubenswrapper[5004]: I1201 08:18:32.865856 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:32 crc kubenswrapper[5004]: I1201 08:18:32.865864 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:32 crc kubenswrapper[5004]: I1201 08:18:32.865897 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:32 crc kubenswrapper[5004]: I1201 08:18:32.865909 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:32Z","lastTransitionTime":"2025-12-01T08:18:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:18:32 crc kubenswrapper[5004]: I1201 08:18:32.904698 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-zjksw" podStartSLOduration=70.904670361 podStartE2EDuration="1m10.904670361s" podCreationTimestamp="2025-12-01 08:17:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:18:32.903785299 +0000 UTC m=+90.468777311" watchObservedRunningTime="2025-12-01 08:18:32.904670361 +0000 UTC m=+90.469662383" Dec 01 08:18:32 crc kubenswrapper[5004]: I1201 08:18:32.940289 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=18.940269967 podStartE2EDuration="18.940269967s" podCreationTimestamp="2025-12-01 08:18:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:18:32.940129884 +0000 UTC m=+90.505121906" watchObservedRunningTime="2025-12-01 08:18:32.940269967 +0000 UTC m=+90.505261949" Dec 01 08:18:32 crc kubenswrapper[5004]: I1201 08:18:32.968981 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:32 crc kubenswrapper[5004]: I1201 08:18:32.969026 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:32 crc kubenswrapper[5004]: I1201 08:18:32.969039 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:32 crc kubenswrapper[5004]: I1201 08:18:32.969057 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 
08:18:32 crc kubenswrapper[5004]: I1201 08:18:32.969073 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:32Z","lastTransitionTime":"2025-12-01T08:18:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:18:32 crc kubenswrapper[5004]: I1201 08:18:32.987579 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mzsvz" podStartSLOduration=69.987536765 podStartE2EDuration="1m9.987536765s" podCreationTimestamp="2025-12-01 08:17:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:18:32.985689899 +0000 UTC m=+90.550681891" watchObservedRunningTime="2025-12-01 08:18:32.987536765 +0000 UTC m=+90.552528757" Dec 01 08:18:32 crc kubenswrapper[5004]: I1201 08:18:32.997998 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podStartSLOduration=70.997979105 podStartE2EDuration="1m10.997979105s" podCreationTimestamp="2025-12-01 08:17:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:18:32.996932049 +0000 UTC m=+90.561924061" watchObservedRunningTime="2025-12-01 08:18:32.997979105 +0000 UTC m=+90.562971107" Dec 01 08:18:33 crc kubenswrapper[5004]: I1201 08:18:33.043602 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=72.043541199 podStartE2EDuration="1m12.043541199s" podCreationTimestamp="2025-12-01 08:17:21 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:18:33.042640537 +0000 UTC m=+90.607632539" watchObservedRunningTime="2025-12-01 08:18:33.043541199 +0000 UTC m=+90.608533221" Dec 01 08:18:33 crc kubenswrapper[5004]: I1201 08:18:33.071004 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:33 crc kubenswrapper[5004]: I1201 08:18:33.071033 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:33 crc kubenswrapper[5004]: I1201 08:18:33.071044 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:33 crc kubenswrapper[5004]: I1201 08:18:33.071059 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:33 crc kubenswrapper[5004]: I1201 08:18:33.071069 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:33Z","lastTransitionTime":"2025-12-01T08:18:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:33 crc kubenswrapper[5004]: I1201 08:18:33.072683 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-jjms6" podStartSLOduration=71.072665265 podStartE2EDuration="1m11.072665265s" podCreationTimestamp="2025-12-01 08:17:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:18:33.07168361 +0000 UTC m=+90.636675602" watchObservedRunningTime="2025-12-01 08:18:33.072665265 +0000 UTC m=+90.637657287" Dec 01 08:18:33 crc kubenswrapper[5004]: I1201 08:18:33.174074 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:33 crc kubenswrapper[5004]: I1201 08:18:33.174133 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:33 crc kubenswrapper[5004]: I1201 08:18:33.174145 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:33 crc kubenswrapper[5004]: I1201 08:18:33.174164 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:33 crc kubenswrapper[5004]: I1201 08:18:33.174175 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:33Z","lastTransitionTime":"2025-12-01T08:18:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:33 crc kubenswrapper[5004]: I1201 08:18:33.276046 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:33 crc kubenswrapper[5004]: I1201 08:18:33.276104 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:33 crc kubenswrapper[5004]: I1201 08:18:33.276123 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:33 crc kubenswrapper[5004]: I1201 08:18:33.276154 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:33 crc kubenswrapper[5004]: I1201 08:18:33.276175 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:33Z","lastTransitionTime":"2025-12-01T08:18:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:33 crc kubenswrapper[5004]: I1201 08:18:33.380252 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:33 crc kubenswrapper[5004]: I1201 08:18:33.380306 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:33 crc kubenswrapper[5004]: I1201 08:18:33.380322 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:33 crc kubenswrapper[5004]: I1201 08:18:33.380348 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:33 crc kubenswrapper[5004]: I1201 08:18:33.380364 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:33Z","lastTransitionTime":"2025-12-01T08:18:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:33 crc kubenswrapper[5004]: I1201 08:18:33.482645 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:33 crc kubenswrapper[5004]: I1201 08:18:33.482702 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:33 crc kubenswrapper[5004]: I1201 08:18:33.482720 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:33 crc kubenswrapper[5004]: I1201 08:18:33.482743 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:33 crc kubenswrapper[5004]: I1201 08:18:33.482761 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:33Z","lastTransitionTime":"2025-12-01T08:18:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:33 crc kubenswrapper[5004]: I1201 08:18:33.585345 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:33 crc kubenswrapper[5004]: I1201 08:18:33.585401 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:33 crc kubenswrapper[5004]: I1201 08:18:33.585419 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:33 crc kubenswrapper[5004]: I1201 08:18:33.585442 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:33 crc kubenswrapper[5004]: I1201 08:18:33.585463 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:33Z","lastTransitionTime":"2025-12-01T08:18:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:33 crc kubenswrapper[5004]: I1201 08:18:33.688541 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:33 crc kubenswrapper[5004]: I1201 08:18:33.688862 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:33 crc kubenswrapper[5004]: I1201 08:18:33.689119 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:33 crc kubenswrapper[5004]: I1201 08:18:33.689363 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:33 crc kubenswrapper[5004]: I1201 08:18:33.689530 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:33Z","lastTransitionTime":"2025-12-01T08:18:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:18:33 crc kubenswrapper[5004]: I1201 08:18:33.758844 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 08:18:33 crc kubenswrapper[5004]: I1201 08:18:33.758844 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 08:18:33 crc kubenswrapper[5004]: I1201 08:18:33.759175 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 08:18:33 crc kubenswrapper[5004]: E1201 08:18:33.759723 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 08:18:33 crc kubenswrapper[5004]: E1201 08:18:33.759843 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 08:18:33 crc kubenswrapper[5004]: E1201 08:18:33.759984 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 08:18:33 crc kubenswrapper[5004]: I1201 08:18:33.792669 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:33 crc kubenswrapper[5004]: I1201 08:18:33.792745 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:33 crc kubenswrapper[5004]: I1201 08:18:33.792769 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:33 crc kubenswrapper[5004]: I1201 08:18:33.792798 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:33 crc kubenswrapper[5004]: I1201 08:18:33.792823 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:33Z","lastTransitionTime":"2025-12-01T08:18:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:33 crc kubenswrapper[5004]: I1201 08:18:33.895507 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:33 crc kubenswrapper[5004]: I1201 08:18:33.895549 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:33 crc kubenswrapper[5004]: I1201 08:18:33.895574 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:33 crc kubenswrapper[5004]: I1201 08:18:33.895606 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:33 crc kubenswrapper[5004]: I1201 08:18:33.895617 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:33Z","lastTransitionTime":"2025-12-01T08:18:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:33 crc kubenswrapper[5004]: I1201 08:18:33.997850 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:33 crc kubenswrapper[5004]: I1201 08:18:33.997898 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:33 crc kubenswrapper[5004]: I1201 08:18:33.997910 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:33 crc kubenswrapper[5004]: I1201 08:18:33.997927 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:33 crc kubenswrapper[5004]: I1201 08:18:33.997938 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:33Z","lastTransitionTime":"2025-12-01T08:18:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:34 crc kubenswrapper[5004]: I1201 08:18:34.106197 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:34 crc kubenswrapper[5004]: I1201 08:18:34.106231 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:34 crc kubenswrapper[5004]: I1201 08:18:34.106243 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:34 crc kubenswrapper[5004]: I1201 08:18:34.106259 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:34 crc kubenswrapper[5004]: I1201 08:18:34.106272 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:34Z","lastTransitionTime":"2025-12-01T08:18:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:34 crc kubenswrapper[5004]: I1201 08:18:34.208485 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:34 crc kubenswrapper[5004]: I1201 08:18:34.208535 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:34 crc kubenswrapper[5004]: I1201 08:18:34.208552 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:34 crc kubenswrapper[5004]: I1201 08:18:34.208603 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:34 crc kubenswrapper[5004]: I1201 08:18:34.208619 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:34Z","lastTransitionTime":"2025-12-01T08:18:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:34 crc kubenswrapper[5004]: I1201 08:18:34.310991 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:34 crc kubenswrapper[5004]: I1201 08:18:34.311032 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:34 crc kubenswrapper[5004]: I1201 08:18:34.311044 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:34 crc kubenswrapper[5004]: I1201 08:18:34.311059 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:34 crc kubenswrapper[5004]: I1201 08:18:34.311070 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:34Z","lastTransitionTime":"2025-12-01T08:18:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:34 crc kubenswrapper[5004]: I1201 08:18:34.413904 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:34 crc kubenswrapper[5004]: I1201 08:18:34.413945 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:34 crc kubenswrapper[5004]: I1201 08:18:34.413956 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:34 crc kubenswrapper[5004]: I1201 08:18:34.413977 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:34 crc kubenswrapper[5004]: I1201 08:18:34.413990 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:34Z","lastTransitionTime":"2025-12-01T08:18:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:34 crc kubenswrapper[5004]: I1201 08:18:34.516389 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:34 crc kubenswrapper[5004]: I1201 08:18:34.516457 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:34 crc kubenswrapper[5004]: I1201 08:18:34.516478 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:34 crc kubenswrapper[5004]: I1201 08:18:34.516504 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:34 crc kubenswrapper[5004]: I1201 08:18:34.516523 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:34Z","lastTransitionTime":"2025-12-01T08:18:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:34 crc kubenswrapper[5004]: I1201 08:18:34.619104 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:34 crc kubenswrapper[5004]: I1201 08:18:34.619152 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:34 crc kubenswrapper[5004]: I1201 08:18:34.619163 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:34 crc kubenswrapper[5004]: I1201 08:18:34.619178 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:34 crc kubenswrapper[5004]: I1201 08:18:34.619189 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:34Z","lastTransitionTime":"2025-12-01T08:18:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:34 crc kubenswrapper[5004]: I1201 08:18:34.721816 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:34 crc kubenswrapper[5004]: I1201 08:18:34.721892 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:34 crc kubenswrapper[5004]: I1201 08:18:34.721911 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:34 crc kubenswrapper[5004]: I1201 08:18:34.721939 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:34 crc kubenswrapper[5004]: I1201 08:18:34.721956 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:34Z","lastTransitionTime":"2025-12-01T08:18:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:18:34 crc kubenswrapper[5004]: I1201 08:18:34.757959 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7cl5l" Dec 01 08:18:34 crc kubenswrapper[5004]: E1201 08:18:34.758208 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7cl5l" podUID="b488f4f3-d385-4d40-bdee-96d8fe2d42a1" Dec 01 08:18:34 crc kubenswrapper[5004]: I1201 08:18:34.825630 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:34 crc kubenswrapper[5004]: I1201 08:18:34.825708 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:34 crc kubenswrapper[5004]: I1201 08:18:34.825726 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:34 crc kubenswrapper[5004]: I1201 08:18:34.825752 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:34 crc kubenswrapper[5004]: I1201 08:18:34.825769 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:34Z","lastTransitionTime":"2025-12-01T08:18:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:34 crc kubenswrapper[5004]: I1201 08:18:34.929495 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:34 crc kubenswrapper[5004]: I1201 08:18:34.929945 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:34 crc kubenswrapper[5004]: I1201 08:18:34.930102 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:34 crc kubenswrapper[5004]: I1201 08:18:34.930248 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:34 crc kubenswrapper[5004]: I1201 08:18:34.930381 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:34Z","lastTransitionTime":"2025-12-01T08:18:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:35 crc kubenswrapper[5004]: I1201 08:18:35.034323 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:35 crc kubenswrapper[5004]: I1201 08:18:35.034720 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:35 crc kubenswrapper[5004]: I1201 08:18:35.034913 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:35 crc kubenswrapper[5004]: I1201 08:18:35.035070 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:35 crc kubenswrapper[5004]: I1201 08:18:35.035211 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:35Z","lastTransitionTime":"2025-12-01T08:18:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:35 crc kubenswrapper[5004]: I1201 08:18:35.137637 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:35 crc kubenswrapper[5004]: I1201 08:18:35.137679 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:35 crc kubenswrapper[5004]: I1201 08:18:35.137690 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:35 crc kubenswrapper[5004]: I1201 08:18:35.137704 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:35 crc kubenswrapper[5004]: I1201 08:18:35.137714 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:35Z","lastTransitionTime":"2025-12-01T08:18:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:35 crc kubenswrapper[5004]: I1201 08:18:35.240509 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:35 crc kubenswrapper[5004]: I1201 08:18:35.240624 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:35 crc kubenswrapper[5004]: I1201 08:18:35.240661 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:35 crc kubenswrapper[5004]: I1201 08:18:35.240691 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:35 crc kubenswrapper[5004]: I1201 08:18:35.240715 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:35Z","lastTransitionTime":"2025-12-01T08:18:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:35 crc kubenswrapper[5004]: I1201 08:18:35.344237 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:35 crc kubenswrapper[5004]: I1201 08:18:35.344335 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:35 crc kubenswrapper[5004]: I1201 08:18:35.344351 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:35 crc kubenswrapper[5004]: I1201 08:18:35.344368 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:35 crc kubenswrapper[5004]: I1201 08:18:35.344379 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:35Z","lastTransitionTime":"2025-12-01T08:18:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:35 crc kubenswrapper[5004]: I1201 08:18:35.447067 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:35 crc kubenswrapper[5004]: I1201 08:18:35.447124 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:35 crc kubenswrapper[5004]: I1201 08:18:35.447141 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:35 crc kubenswrapper[5004]: I1201 08:18:35.447164 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:35 crc kubenswrapper[5004]: I1201 08:18:35.447180 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:35Z","lastTransitionTime":"2025-12-01T08:18:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:35 crc kubenswrapper[5004]: I1201 08:18:35.550159 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:35 crc kubenswrapper[5004]: I1201 08:18:35.550215 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:35 crc kubenswrapper[5004]: I1201 08:18:35.550239 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:35 crc kubenswrapper[5004]: I1201 08:18:35.550269 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:35 crc kubenswrapper[5004]: I1201 08:18:35.550289 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:35Z","lastTransitionTime":"2025-12-01T08:18:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:35 crc kubenswrapper[5004]: I1201 08:18:35.654801 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:35 crc kubenswrapper[5004]: I1201 08:18:35.654841 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:35 crc kubenswrapper[5004]: I1201 08:18:35.654851 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:35 crc kubenswrapper[5004]: I1201 08:18:35.654867 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:35 crc kubenswrapper[5004]: I1201 08:18:35.654879 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:35Z","lastTransitionTime":"2025-12-01T08:18:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:35 crc kubenswrapper[5004]: I1201 08:18:35.757132 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:35 crc kubenswrapper[5004]: I1201 08:18:35.757184 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:35 crc kubenswrapper[5004]: I1201 08:18:35.757199 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:35 crc kubenswrapper[5004]: I1201 08:18:35.757218 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:35 crc kubenswrapper[5004]: I1201 08:18:35.757232 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:35Z","lastTransitionTime":"2025-12-01T08:18:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:18:35 crc kubenswrapper[5004]: I1201 08:18:35.757753 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 08:18:35 crc kubenswrapper[5004]: I1201 08:18:35.757980 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 08:18:35 crc kubenswrapper[5004]: I1201 08:18:35.757991 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 08:18:35 crc kubenswrapper[5004]: E1201 08:18:35.757979 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 08:18:35 crc kubenswrapper[5004]: E1201 08:18:35.758506 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 08:18:35 crc kubenswrapper[5004]: I1201 08:18:35.758637 5004 scope.go:117] "RemoveContainer" containerID="06e08a797d0a4810bb4314a26c9ecefd52e0bd2f04615b2bd39c4ccf951a33c9" Dec 01 08:18:35 crc kubenswrapper[5004]: E1201 08:18:35.758691 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 08:18:35 crc kubenswrapper[5004]: E1201 08:18:35.759819 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-knmdv_openshift-ovn-kubernetes(15cdec0a-5925-4966-a30b-f60c503f633e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" podUID="15cdec0a-5925-4966-a30b-f60c503f633e" Dec 01 08:18:35 crc kubenswrapper[5004]: I1201 08:18:35.860672 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:35 crc kubenswrapper[5004]: I1201 08:18:35.860732 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:35 crc kubenswrapper[5004]: I1201 08:18:35.860750 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:35 crc kubenswrapper[5004]: I1201 08:18:35.860776 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:35 crc kubenswrapper[5004]: I1201 08:18:35.860793 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:35Z","lastTransitionTime":"2025-12-01T08:18:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:35 crc kubenswrapper[5004]: I1201 08:18:35.963073 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:35 crc kubenswrapper[5004]: I1201 08:18:35.963464 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:35 crc kubenswrapper[5004]: I1201 08:18:35.963661 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:35 crc kubenswrapper[5004]: I1201 08:18:35.963814 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:35 crc kubenswrapper[5004]: I1201 08:18:35.963974 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:35Z","lastTransitionTime":"2025-12-01T08:18:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:36 crc kubenswrapper[5004]: I1201 08:18:36.067083 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:36 crc kubenswrapper[5004]: I1201 08:18:36.067442 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:36 crc kubenswrapper[5004]: I1201 08:18:36.067667 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:36 crc kubenswrapper[5004]: I1201 08:18:36.068012 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:36 crc kubenswrapper[5004]: I1201 08:18:36.068414 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:36Z","lastTransitionTime":"2025-12-01T08:18:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:36 crc kubenswrapper[5004]: I1201 08:18:36.171598 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:36 crc kubenswrapper[5004]: I1201 08:18:36.171640 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:36 crc kubenswrapper[5004]: I1201 08:18:36.171648 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:36 crc kubenswrapper[5004]: I1201 08:18:36.171662 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:36 crc kubenswrapper[5004]: I1201 08:18:36.171671 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:36Z","lastTransitionTime":"2025-12-01T08:18:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:36 crc kubenswrapper[5004]: I1201 08:18:36.274541 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:36 crc kubenswrapper[5004]: I1201 08:18:36.274876 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:36 crc kubenswrapper[5004]: I1201 08:18:36.275125 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:36 crc kubenswrapper[5004]: I1201 08:18:36.275757 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:36 crc kubenswrapper[5004]: I1201 08:18:36.275804 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:36Z","lastTransitionTime":"2025-12-01T08:18:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:36 crc kubenswrapper[5004]: I1201 08:18:36.378867 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:36 crc kubenswrapper[5004]: I1201 08:18:36.378923 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:36 crc kubenswrapper[5004]: I1201 08:18:36.378939 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:36 crc kubenswrapper[5004]: I1201 08:18:36.378963 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:36 crc kubenswrapper[5004]: I1201 08:18:36.378979 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:36Z","lastTransitionTime":"2025-12-01T08:18:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 08:18:36 crc kubenswrapper[5004]: I1201 08:18:36.423226 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 08:18:36 crc kubenswrapper[5004]: I1201 08:18:36.423471 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 08:18:36 crc kubenswrapper[5004]: I1201 08:18:36.423498 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 08:18:36 crc kubenswrapper[5004]: I1201 08:18:36.423520 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 08:18:36 crc kubenswrapper[5004]: I1201 08:18:36.423536 5004 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T08:18:36Z","lastTransitionTime":"2025-12-01T08:18:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 08:18:36 crc kubenswrapper[5004]: I1201 08:18:36.494929 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-rq2zg"] Dec 01 08:18:36 crc kubenswrapper[5004]: I1201 08:18:36.495543 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rq2zg" Dec 01 08:18:36 crc kubenswrapper[5004]: I1201 08:18:36.497961 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 01 08:18:36 crc kubenswrapper[5004]: I1201 08:18:36.498175 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Dec 01 08:18:36 crc kubenswrapper[5004]: I1201 08:18:36.498996 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 01 08:18:36 crc kubenswrapper[5004]: I1201 08:18:36.501078 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 01 08:18:36 crc kubenswrapper[5004]: I1201 08:18:36.556998 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7086b197-aea4-483b-b735-a1f64ea1ccdc-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-rq2zg\" (UID: \"7086b197-aea4-483b-b735-a1f64ea1ccdc\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rq2zg" Dec 01 08:18:36 crc kubenswrapper[5004]: I1201 08:18:36.557113 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/7086b197-aea4-483b-b735-a1f64ea1ccdc-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-rq2zg\" (UID: \"7086b197-aea4-483b-b735-a1f64ea1ccdc\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rq2zg" Dec 01 08:18:36 crc kubenswrapper[5004]: I1201 08:18:36.557331 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/7086b197-aea4-483b-b735-a1f64ea1ccdc-service-ca\") pod \"cluster-version-operator-5c965bbfc6-rq2zg\" (UID: \"7086b197-aea4-483b-b735-a1f64ea1ccdc\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rq2zg" Dec 01 08:18:36 crc kubenswrapper[5004]: I1201 08:18:36.557406 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7086b197-aea4-483b-b735-a1f64ea1ccdc-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-rq2zg\" (UID: \"7086b197-aea4-483b-b735-a1f64ea1ccdc\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rq2zg" Dec 01 08:18:36 crc kubenswrapper[5004]: I1201 08:18:36.557490 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/7086b197-aea4-483b-b735-a1f64ea1ccdc-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-rq2zg\" (UID: \"7086b197-aea4-483b-b735-a1f64ea1ccdc\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rq2zg" Dec 01 08:18:36 crc kubenswrapper[5004]: I1201 08:18:36.658443 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/7086b197-aea4-483b-b735-a1f64ea1ccdc-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-rq2zg\" (UID: \"7086b197-aea4-483b-b735-a1f64ea1ccdc\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rq2zg" Dec 01 08:18:36 crc kubenswrapper[5004]: I1201 08:18:36.658623 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7086b197-aea4-483b-b735-a1f64ea1ccdc-service-ca\") pod \"cluster-version-operator-5c965bbfc6-rq2zg\" (UID: \"7086b197-aea4-483b-b735-a1f64ea1ccdc\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rq2zg" Dec 01 08:18:36 crc kubenswrapper[5004]: I1201 08:18:36.658640 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/7086b197-aea4-483b-b735-a1f64ea1ccdc-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-rq2zg\" (UID: \"7086b197-aea4-483b-b735-a1f64ea1ccdc\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rq2zg" Dec 01 08:18:36 crc kubenswrapper[5004]: I1201 08:18:36.658679 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7086b197-aea4-483b-b735-a1f64ea1ccdc-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-rq2zg\" (UID: \"7086b197-aea4-483b-b735-a1f64ea1ccdc\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rq2zg" Dec 01 08:18:36 crc kubenswrapper[5004]: I1201 08:18:36.658723 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/7086b197-aea4-483b-b735-a1f64ea1ccdc-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-rq2zg\" (UID: \"7086b197-aea4-483b-b735-a1f64ea1ccdc\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rq2zg" Dec 01 08:18:36 crc kubenswrapper[5004]: I1201 08:18:36.658823 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7086b197-aea4-483b-b735-a1f64ea1ccdc-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-rq2zg\" (UID: \"7086b197-aea4-483b-b735-a1f64ea1ccdc\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rq2zg" Dec 01 08:18:36 crc kubenswrapper[5004]: I1201 08:18:36.658890 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: 
\"kubernetes.io/host-path/7086b197-aea4-483b-b735-a1f64ea1ccdc-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-rq2zg\" (UID: \"7086b197-aea4-483b-b735-a1f64ea1ccdc\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rq2zg" Dec 01 08:18:36 crc kubenswrapper[5004]: I1201 08:18:36.660018 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7086b197-aea4-483b-b735-a1f64ea1ccdc-service-ca\") pod \"cluster-version-operator-5c965bbfc6-rq2zg\" (UID: \"7086b197-aea4-483b-b735-a1f64ea1ccdc\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rq2zg" Dec 01 08:18:36 crc kubenswrapper[5004]: I1201 08:18:36.670928 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7086b197-aea4-483b-b735-a1f64ea1ccdc-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-rq2zg\" (UID: \"7086b197-aea4-483b-b735-a1f64ea1ccdc\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rq2zg" Dec 01 08:18:36 crc kubenswrapper[5004]: I1201 08:18:36.688931 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7086b197-aea4-483b-b735-a1f64ea1ccdc-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-rq2zg\" (UID: \"7086b197-aea4-483b-b735-a1f64ea1ccdc\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rq2zg" Dec 01 08:18:36 crc kubenswrapper[5004]: I1201 08:18:36.758747 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-7cl5l" Dec 01 08:18:36 crc kubenswrapper[5004]: E1201 08:18:36.758967 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7cl5l" podUID="b488f4f3-d385-4d40-bdee-96d8fe2d42a1" Dec 01 08:18:36 crc kubenswrapper[5004]: I1201 08:18:36.819959 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rq2zg" Dec 01 08:18:36 crc kubenswrapper[5004]: W1201 08:18:36.841185 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7086b197_aea4_483b_b735_a1f64ea1ccdc.slice/crio-b87c6b6573ccc4ae66d4c2e38f8566dd51381c3e5d88e3fec58ec36224804661 WatchSource:0}: Error finding container b87c6b6573ccc4ae66d4c2e38f8566dd51381c3e5d88e3fec58ec36224804661: Status 404 returned error can't find the container with id b87c6b6573ccc4ae66d4c2e38f8566dd51381c3e5d88e3fec58ec36224804661 Dec 01 08:18:37 crc kubenswrapper[5004]: I1201 08:18:37.396082 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rq2zg" event={"ID":"7086b197-aea4-483b-b735-a1f64ea1ccdc","Type":"ContainerStarted","Data":"2395fc716bc579f1114d5178db6c383b58c848e7b35d620ea7540a36f6599197"} Dec 01 08:18:37 crc kubenswrapper[5004]: I1201 08:18:37.396120 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rq2zg" event={"ID":"7086b197-aea4-483b-b735-a1f64ea1ccdc","Type":"ContainerStarted","Data":"b87c6b6573ccc4ae66d4c2e38f8566dd51381c3e5d88e3fec58ec36224804661"} Dec 01 08:18:37 crc 
kubenswrapper[5004]: I1201 08:18:37.412648 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rq2zg" podStartSLOduration=75.412629611 podStartE2EDuration="1m15.412629611s" podCreationTimestamp="2025-12-01 08:17:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:18:37.411984834 +0000 UTC m=+94.976976906" watchObservedRunningTime="2025-12-01 08:18:37.412629611 +0000 UTC m=+94.977621623" Dec 01 08:18:37 crc kubenswrapper[5004]: I1201 08:18:37.758167 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 08:18:37 crc kubenswrapper[5004]: I1201 08:18:37.758210 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 08:18:37 crc kubenswrapper[5004]: I1201 08:18:37.758275 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 08:18:37 crc kubenswrapper[5004]: E1201 08:18:37.758287 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 08:18:37 crc kubenswrapper[5004]: E1201 08:18:37.758512 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 08:18:37 crc kubenswrapper[5004]: E1201 08:18:37.758552 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 08:18:38 crc kubenswrapper[5004]: I1201 08:18:38.758540 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7cl5l" Dec 01 08:18:38 crc kubenswrapper[5004]: E1201 08:18:38.758797 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7cl5l" podUID="b488f4f3-d385-4d40-bdee-96d8fe2d42a1" Dec 01 08:18:39 crc kubenswrapper[5004]: I1201 08:18:39.758172 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 08:18:39 crc kubenswrapper[5004]: I1201 08:18:39.758227 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 08:18:39 crc kubenswrapper[5004]: I1201 08:18:39.758275 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 08:18:39 crc kubenswrapper[5004]: E1201 08:18:39.758380 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 08:18:39 crc kubenswrapper[5004]: E1201 08:18:39.758601 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 08:18:39 crc kubenswrapper[5004]: E1201 08:18:39.758769 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 08:18:40 crc kubenswrapper[5004]: I1201 08:18:40.758296 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7cl5l" Dec 01 08:18:40 crc kubenswrapper[5004]: E1201 08:18:40.758534 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7cl5l" podUID="b488f4f3-d385-4d40-bdee-96d8fe2d42a1" Dec 01 08:18:41 crc kubenswrapper[5004]: I1201 08:18:41.107997 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b488f4f3-d385-4d40-bdee-96d8fe2d42a1-metrics-certs\") pod \"network-metrics-daemon-7cl5l\" (UID: \"b488f4f3-d385-4d40-bdee-96d8fe2d42a1\") " pod="openshift-multus/network-metrics-daemon-7cl5l" Dec 01 08:18:41 crc kubenswrapper[5004]: E1201 08:18:41.108222 5004 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 08:18:41 crc kubenswrapper[5004]: E1201 08:18:41.108312 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b488f4f3-d385-4d40-bdee-96d8fe2d42a1-metrics-certs podName:b488f4f3-d385-4d40-bdee-96d8fe2d42a1 nodeName:}" failed. No retries permitted until 2025-12-01 08:19:45.108283861 +0000 UTC m=+162.673275883 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b488f4f3-d385-4d40-bdee-96d8fe2d42a1-metrics-certs") pod "network-metrics-daemon-7cl5l" (UID: "b488f4f3-d385-4d40-bdee-96d8fe2d42a1") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 08:18:41 crc kubenswrapper[5004]: I1201 08:18:41.758214 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 08:18:41 crc kubenswrapper[5004]: I1201 08:18:41.758287 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 08:18:41 crc kubenswrapper[5004]: E1201 08:18:41.758408 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 08:18:41 crc kubenswrapper[5004]: I1201 08:18:41.758502 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 08:18:41 crc kubenswrapper[5004]: E1201 08:18:41.758707 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 08:18:41 crc kubenswrapper[5004]: E1201 08:18:41.758881 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 08:18:42 crc kubenswrapper[5004]: I1201 08:18:42.760710 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7cl5l" Dec 01 08:18:42 crc kubenswrapper[5004]: E1201 08:18:42.760867 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7cl5l" podUID="b488f4f3-d385-4d40-bdee-96d8fe2d42a1" Dec 01 08:18:43 crc kubenswrapper[5004]: I1201 08:18:43.757855 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 08:18:43 crc kubenswrapper[5004]: I1201 08:18:43.757888 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 08:18:43 crc kubenswrapper[5004]: I1201 08:18:43.757857 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 08:18:43 crc kubenswrapper[5004]: E1201 08:18:43.758000 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 08:18:43 crc kubenswrapper[5004]: E1201 08:18:43.758055 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 08:18:43 crc kubenswrapper[5004]: E1201 08:18:43.758115 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 08:18:44 crc kubenswrapper[5004]: I1201 08:18:44.758229 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-7cl5l" Dec 01 08:18:44 crc kubenswrapper[5004]: E1201 08:18:44.758779 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7cl5l" podUID="b488f4f3-d385-4d40-bdee-96d8fe2d42a1" Dec 01 08:18:45 crc kubenswrapper[5004]: I1201 08:18:45.758087 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 08:18:45 crc kubenswrapper[5004]: I1201 08:18:45.758108 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 08:18:45 crc kubenswrapper[5004]: I1201 08:18:45.758211 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 08:18:45 crc kubenswrapper[5004]: E1201 08:18:45.758699 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 08:18:45 crc kubenswrapper[5004]: E1201 08:18:45.758795 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 08:18:45 crc kubenswrapper[5004]: E1201 08:18:45.758487 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 08:18:46 crc kubenswrapper[5004]: I1201 08:18:46.758959 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7cl5l" Dec 01 08:18:46 crc kubenswrapper[5004]: E1201 08:18:46.759095 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7cl5l" podUID="b488f4f3-d385-4d40-bdee-96d8fe2d42a1" Dec 01 08:18:47 crc kubenswrapper[5004]: I1201 08:18:47.758162 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 08:18:47 crc kubenswrapper[5004]: I1201 08:18:47.758196 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 08:18:47 crc kubenswrapper[5004]: E1201 08:18:47.758745 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 08:18:47 crc kubenswrapper[5004]: I1201 08:18:47.758211 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 08:18:47 crc kubenswrapper[5004]: E1201 08:18:47.758609 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 08:18:47 crc kubenswrapper[5004]: E1201 08:18:47.758924 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 08:18:48 crc kubenswrapper[5004]: I1201 08:18:48.758218 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-7cl5l" Dec 01 08:18:48 crc kubenswrapper[5004]: E1201 08:18:48.758449 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7cl5l" podUID="b488f4f3-d385-4d40-bdee-96d8fe2d42a1" Dec 01 08:18:49 crc kubenswrapper[5004]: I1201 08:18:49.758169 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 08:18:49 crc kubenswrapper[5004]: I1201 08:18:49.758282 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 08:18:49 crc kubenswrapper[5004]: I1201 08:18:49.758349 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 08:18:49 crc kubenswrapper[5004]: E1201 08:18:49.758495 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 08:18:49 crc kubenswrapper[5004]: E1201 08:18:49.758713 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 08:18:49 crc kubenswrapper[5004]: E1201 08:18:49.758938 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 08:18:49 crc kubenswrapper[5004]: I1201 08:18:49.760896 5004 scope.go:117] "RemoveContainer" containerID="06e08a797d0a4810bb4314a26c9ecefd52e0bd2f04615b2bd39c4ccf951a33c9" Dec 01 08:18:49 crc kubenswrapper[5004]: E1201 08:18:49.761224 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-knmdv_openshift-ovn-kubernetes(15cdec0a-5925-4966-a30b-f60c503f633e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" podUID="15cdec0a-5925-4966-a30b-f60c503f633e" Dec 01 08:18:50 crc kubenswrapper[5004]: I1201 08:18:50.758347 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7cl5l" Dec 01 08:18:50 crc kubenswrapper[5004]: E1201 08:18:50.758706 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7cl5l" podUID="b488f4f3-d385-4d40-bdee-96d8fe2d42a1" Dec 01 08:18:51 crc kubenswrapper[5004]: I1201 08:18:51.758787 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 08:18:51 crc kubenswrapper[5004]: I1201 08:18:51.758839 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 08:18:51 crc kubenswrapper[5004]: I1201 08:18:51.758864 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 08:18:51 crc kubenswrapper[5004]: E1201 08:18:51.758976 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 08:18:51 crc kubenswrapper[5004]: E1201 08:18:51.759099 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 08:18:51 crc kubenswrapper[5004]: E1201 08:18:51.759314 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 08:18:52 crc kubenswrapper[5004]: I1201 08:18:52.758753 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7cl5l" Dec 01 08:18:52 crc kubenswrapper[5004]: E1201 08:18:52.760768 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7cl5l" podUID="b488f4f3-d385-4d40-bdee-96d8fe2d42a1" Dec 01 08:18:53 crc kubenswrapper[5004]: I1201 08:18:53.758044 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 08:18:53 crc kubenswrapper[5004]: I1201 08:18:53.758097 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 08:18:53 crc kubenswrapper[5004]: I1201 08:18:53.758651 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 08:18:53 crc kubenswrapper[5004]: E1201 08:18:53.759202 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 08:18:53 crc kubenswrapper[5004]: E1201 08:18:53.759811 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 08:18:53 crc kubenswrapper[5004]: E1201 08:18:53.759944 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 08:18:54 crc kubenswrapper[5004]: I1201 08:18:54.759771 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-7cl5l" Dec 01 08:18:54 crc kubenswrapper[5004]: E1201 08:18:54.759907 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7cl5l" podUID="b488f4f3-d385-4d40-bdee-96d8fe2d42a1" Dec 01 08:18:55 crc kubenswrapper[5004]: I1201 08:18:55.757886 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 08:18:55 crc kubenswrapper[5004]: I1201 08:18:55.757948 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 08:18:55 crc kubenswrapper[5004]: I1201 08:18:55.757945 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 08:18:55 crc kubenswrapper[5004]: E1201 08:18:55.758036 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 08:18:55 crc kubenswrapper[5004]: E1201 08:18:55.758188 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 08:18:55 crc kubenswrapper[5004]: E1201 08:18:55.758404 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 08:18:56 crc kubenswrapper[5004]: I1201 08:18:56.465605 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zjksw_70e79009-93be-49c4-a6b3-e8a06bcea7f4/kube-multus/1.log" Dec 01 08:18:56 crc kubenswrapper[5004]: I1201 08:18:56.467120 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zjksw_70e79009-93be-49c4-a6b3-e8a06bcea7f4/kube-multus/0.log" Dec 01 08:18:56 crc kubenswrapper[5004]: I1201 08:18:56.467243 5004 generic.go:334] "Generic (PLEG): container finished" podID="70e79009-93be-49c4-a6b3-e8a06bcea7f4" containerID="862dd7caaf04a01f96f2dba70cd2226e85b55f172ee6a34c178f756a75832a08" exitCode=1 Dec 01 08:18:56 crc kubenswrapper[5004]: I1201 08:18:56.467288 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zjksw" event={"ID":"70e79009-93be-49c4-a6b3-e8a06bcea7f4","Type":"ContainerDied","Data":"862dd7caaf04a01f96f2dba70cd2226e85b55f172ee6a34c178f756a75832a08"} Dec 01 08:18:56 crc kubenswrapper[5004]: I1201 08:18:56.467334 5004 scope.go:117] "RemoveContainer" containerID="ad3ea204ad731500a340e639f0e36abe28ba50464f1b2ff9b009e39c234cf708" Dec 01 08:18:56 crc kubenswrapper[5004]: I1201 08:18:56.467954 5004 scope.go:117] "RemoveContainer" containerID="862dd7caaf04a01f96f2dba70cd2226e85b55f172ee6a34c178f756a75832a08" Dec 01 08:18:56 crc 
kubenswrapper[5004]: E1201 08:18:56.468228 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-zjksw_openshift-multus(70e79009-93be-49c4-a6b3-e8a06bcea7f4)\"" pod="openshift-multus/multus-zjksw" podUID="70e79009-93be-49c4-a6b3-e8a06bcea7f4"
Dec 01 08:18:56 crc kubenswrapper[5004]: I1201 08:18:56.758878 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7cl5l"
Dec 01 08:18:56 crc kubenswrapper[5004]: E1201 08:18:56.759113 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7cl5l" podUID="b488f4f3-d385-4d40-bdee-96d8fe2d42a1"
Dec 01 08:18:57 crc kubenswrapper[5004]: I1201 08:18:57.473657 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zjksw_70e79009-93be-49c4-a6b3-e8a06bcea7f4/kube-multus/1.log"
Dec 01 08:18:57 crc kubenswrapper[5004]: I1201 08:18:57.758758 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 01 08:18:57 crc kubenswrapper[5004]: I1201 08:18:57.758814 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 01 08:18:57 crc kubenswrapper[5004]: I1201 08:18:57.759077 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 01 08:18:57 crc kubenswrapper[5004]: E1201 08:18:57.759133 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 01 08:18:57 crc kubenswrapper[5004]: E1201 08:18:57.759270 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 01 08:18:57 crc kubenswrapper[5004]: E1201 08:18:57.759452 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 01 08:18:58 crc kubenswrapper[5004]: I1201 08:18:58.758598 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7cl5l"
Dec 01 08:18:58 crc kubenswrapper[5004]: E1201 08:18:58.758819 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7cl5l" podUID="b488f4f3-d385-4d40-bdee-96d8fe2d42a1"
Dec 01 08:18:59 crc kubenswrapper[5004]: I1201 08:18:59.758672 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 01 08:18:59 crc kubenswrapper[5004]: I1201 08:18:59.758746 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 01 08:18:59 crc kubenswrapper[5004]: E1201 08:18:59.758832 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 01 08:18:59 crc kubenswrapper[5004]: E1201 08:18:59.758957 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 01 08:18:59 crc kubenswrapper[5004]: I1201 08:18:59.758760 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 01 08:18:59 crc kubenswrapper[5004]: E1201 08:18:59.759207 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 01 08:19:00 crc kubenswrapper[5004]: I1201 08:19:00.758605 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7cl5l"
Dec 01 08:19:00 crc kubenswrapper[5004]: E1201 08:19:00.758786 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7cl5l" podUID="b488f4f3-d385-4d40-bdee-96d8fe2d42a1"
Dec 01 08:19:01 crc kubenswrapper[5004]: I1201 08:19:01.759044 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 01 08:19:01 crc kubenswrapper[5004]: I1201 08:19:01.759067 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 01 08:19:01 crc kubenswrapper[5004]: I1201 08:19:01.759148 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 01 08:19:01 crc kubenswrapper[5004]: E1201 08:19:01.759284 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 01 08:19:01 crc kubenswrapper[5004]: E1201 08:19:01.759389 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 01 08:19:01 crc kubenswrapper[5004]: E1201 08:19:01.759549 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 01 08:19:02 crc kubenswrapper[5004]: E1201 08:19:02.697763 5004 kubelet_node_status.go:497] "Node not becoming ready in time after startup"
Dec 01 08:19:02 crc kubenswrapper[5004]: I1201 08:19:02.757978 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7cl5l"
Dec 01 08:19:02 crc kubenswrapper[5004]: E1201 08:19:02.760205 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7cl5l" podUID="b488f4f3-d385-4d40-bdee-96d8fe2d42a1"
Dec 01 08:19:02 crc kubenswrapper[5004]: E1201 08:19:02.877819 5004 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Dec 01 08:19:03 crc kubenswrapper[5004]: I1201 08:19:03.757965 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 01 08:19:03 crc kubenswrapper[5004]: I1201 08:19:03.758022 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 01 08:19:03 crc kubenswrapper[5004]: I1201 08:19:03.758055 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 01 08:19:03 crc kubenswrapper[5004]: E1201 08:19:03.758154 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 01 08:19:03 crc kubenswrapper[5004]: E1201 08:19:03.758215 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 01 08:19:03 crc kubenswrapper[5004]: E1201 08:19:03.758289 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 01 08:19:03 crc kubenswrapper[5004]: I1201 08:19:03.759791 5004 scope.go:117] "RemoveContainer" containerID="06e08a797d0a4810bb4314a26c9ecefd52e0bd2f04615b2bd39c4ccf951a33c9"
Dec 01 08:19:04 crc kubenswrapper[5004]: I1201 08:19:04.500922 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-knmdv_15cdec0a-5925-4966-a30b-f60c503f633e/ovnkube-controller/3.log"
Dec 01 08:19:04 crc kubenswrapper[5004]: I1201 08:19:04.504097 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" event={"ID":"15cdec0a-5925-4966-a30b-f60c503f633e","Type":"ContainerStarted","Data":"57bc8d8aca54ca6d854bd5065cc8d1346968b6531f884d87dc38f291e505419a"}
Dec 01 08:19:04 crc kubenswrapper[5004]: I1201 08:19:04.504806 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-knmdv"
Dec 01 08:19:04 crc kubenswrapper[5004]: I1201 08:19:04.535026 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" podStartSLOduration=102.535009985 podStartE2EDuration="1m42.535009985s" podCreationTimestamp="2025-12-01 08:17:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:19:04.533684391 +0000 UTC m=+122.098676403" watchObservedRunningTime="2025-12-01 08:19:04.535009985 +0000 UTC m=+122.100001967"
Dec 01 08:19:04 crc kubenswrapper[5004]: I1201 08:19:04.738612 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-7cl5l"]
Dec 01 08:19:04 crc kubenswrapper[5004]: I1201 08:19:04.738736 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7cl5l"
Dec 01 08:19:04 crc kubenswrapper[5004]: E1201 08:19:04.738832 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7cl5l" podUID="b488f4f3-d385-4d40-bdee-96d8fe2d42a1"
Dec 01 08:19:05 crc kubenswrapper[5004]: I1201 08:19:05.758348 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 01 08:19:05 crc kubenswrapper[5004]: I1201 08:19:05.758382 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 01 08:19:05 crc kubenswrapper[5004]: I1201 08:19:05.758409 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 01 08:19:05 crc kubenswrapper[5004]: E1201 08:19:05.758556 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 01 08:19:05 crc kubenswrapper[5004]: E1201 08:19:05.758687 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 01 08:19:05 crc kubenswrapper[5004]: E1201 08:19:05.758769 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 01 08:19:06 crc kubenswrapper[5004]: I1201 08:19:06.758736 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7cl5l"
Dec 01 08:19:06 crc kubenswrapper[5004]: E1201 08:19:06.758966 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7cl5l" podUID="b488f4f3-d385-4d40-bdee-96d8fe2d42a1"
Dec 01 08:19:07 crc kubenswrapper[5004]: I1201 08:19:07.758803 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 01 08:19:07 crc kubenswrapper[5004]: I1201 08:19:07.758864 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 01 08:19:07 crc kubenswrapper[5004]: I1201 08:19:07.758912 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 01 08:19:07 crc kubenswrapper[5004]: E1201 08:19:07.758991 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 01 08:19:07 crc kubenswrapper[5004]: E1201 08:19:07.759097 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 01 08:19:07 crc kubenswrapper[5004]: E1201 08:19:07.759196 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 01 08:19:07 crc kubenswrapper[5004]: E1201 08:19:07.879791 5004 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Dec 01 08:19:08 crc kubenswrapper[5004]: I1201 08:19:08.758640 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7cl5l"
Dec 01 08:19:08 crc kubenswrapper[5004]: E1201 08:19:08.758947 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7cl5l" podUID="b488f4f3-d385-4d40-bdee-96d8fe2d42a1"
Dec 01 08:19:08 crc kubenswrapper[5004]: I1201 08:19:08.759257 5004 scope.go:117] "RemoveContainer" containerID="862dd7caaf04a01f96f2dba70cd2226e85b55f172ee6a34c178f756a75832a08"
Dec 01 08:19:09 crc kubenswrapper[5004]: I1201 08:19:09.528179 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zjksw_70e79009-93be-49c4-a6b3-e8a06bcea7f4/kube-multus/1.log"
Dec 01 08:19:09 crc kubenswrapper[5004]: I1201 08:19:09.528280 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zjksw" event={"ID":"70e79009-93be-49c4-a6b3-e8a06bcea7f4","Type":"ContainerStarted","Data":"c4499168a80cb7fe2301c6db0d0d9c80110f6f9bc8fc94b291f0b9b306dbb057"}
Dec 01 08:19:09 crc kubenswrapper[5004]: I1201 08:19:09.758400 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 01 08:19:09 crc kubenswrapper[5004]: I1201 08:19:09.758465 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 01 08:19:09 crc kubenswrapper[5004]: I1201 08:19:09.758402 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 01 08:19:09 crc kubenswrapper[5004]: E1201 08:19:09.758652 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 01 08:19:09 crc kubenswrapper[5004]: E1201 08:19:09.758883 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 01 08:19:09 crc kubenswrapper[5004]: E1201 08:19:09.759042 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 01 08:19:10 crc kubenswrapper[5004]: I1201 08:19:10.758843 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7cl5l"
Dec 01 08:19:10 crc kubenswrapper[5004]: E1201 08:19:10.759040 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7cl5l" podUID="b488f4f3-d385-4d40-bdee-96d8fe2d42a1"
Dec 01 08:19:11 crc kubenswrapper[5004]: I1201 08:19:11.758850 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 01 08:19:11 crc kubenswrapper[5004]: I1201 08:19:11.758920 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 01 08:19:11 crc kubenswrapper[5004]: E1201 08:19:11.759010 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 01 08:19:11 crc kubenswrapper[5004]: I1201 08:19:11.758933 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 01 08:19:11 crc kubenswrapper[5004]: E1201 08:19:11.759165 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 01 08:19:11 crc kubenswrapper[5004]: E1201 08:19:11.759192 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 01 08:19:12 crc kubenswrapper[5004]: I1201 08:19:12.764114 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7cl5l"
Dec 01 08:19:12 crc kubenswrapper[5004]: E1201 08:19:12.764867 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7cl5l" podUID="b488f4f3-d385-4d40-bdee-96d8fe2d42a1"
Dec 01 08:19:13 crc kubenswrapper[5004]: I1201 08:19:13.758813 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 01 08:19:13 crc kubenswrapper[5004]: I1201 08:19:13.758908 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 01 08:19:13 crc kubenswrapper[5004]: I1201 08:19:13.758817 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 01 08:19:13 crc kubenswrapper[5004]: I1201 08:19:13.762338 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Dec 01 08:19:13 crc kubenswrapper[5004]: I1201 08:19:13.762378 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Dec 01 08:19:13 crc kubenswrapper[5004]: I1201 08:19:13.762513 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Dec 01 08:19:13 crc kubenswrapper[5004]: I1201 08:19:13.762643 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Dec 01 08:19:14 crc kubenswrapper[5004]: I1201 08:19:14.758805 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7cl5l"
Dec 01 08:19:14 crc kubenswrapper[5004]: I1201 08:19:14.762008 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Dec 01 08:19:14 crc kubenswrapper[5004]: I1201 08:19:14.762087 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Dec 01 08:19:16 crc kubenswrapper[5004]: I1201 08:19:16.999937 5004 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady"
Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.069131 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-kplt9"]
Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.070092 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-kplt9"
Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.072637 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-b9vjf"]
Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.073489 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-b9vjf"
Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.075870 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-7khqv"]
Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.076473 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7khqv"
Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.084460 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.084476 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.084606 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.084864 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.084949 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.085036 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.085157 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.085158 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.085201 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.085268 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.085417 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.085447 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.085555 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.085602 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.085690 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.085713 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.085768 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.085952 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.085972 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.085975 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.085972 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.087224 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-dsrp6"]
Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.088011 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dsrp6"
Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.090544 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-j9k4x"]
Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.090986 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-j9k4x"
Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.091355 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.095471 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-qjz9p"]
Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.095995 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pkg9x"]
Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.096345 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wvmxq"]
Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.096947 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wvmxq"
Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.097865 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-qjz9p"
Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.098172 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pkg9x"
Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.103637 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4f588"]
Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.104353 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4f588"
Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.104918 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-qmztb"]
Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.105466 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-qmztb"
Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.106952 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-p77t7"]
Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.107540 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-p77t7"
Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.115307 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-jshhn"]
Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.117586 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-jshhn"
Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.122834 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3ebf4d5-102a-4552-b30b-cbacb3a779fa-config\") pod \"machine-api-operator-5694c8668f-b9vjf\" (UID: \"c3ebf4d5-102a-4552-b30b-cbacb3a779fa\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-b9vjf"
Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.122892 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zlvzx\" (UniqueName: \"kubernetes.io/projected/17c7a11e-bffa-4ecf-abe0-c467a33538a8-kube-api-access-zlvzx\") pod \"openshift-apiserver-operator-796bbdcf4f-j9k4x\" (UID: \"17c7a11e-bffa-4ecf-abe0-c467a33538a8\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-j9k4x"
Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.123012 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/29bfa426-07b0-4acb-a886-9f9316644d71-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-pkg9x\" (UID: \"29bfa426-07b0-4acb-a886-9f9316644d71\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pkg9x"
Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.123043 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/2daa468b-e9f8-41a3-ba94-a1e33093fc97-etcd-client\") pod \"apiserver-76f77b778f-kplt9\" (UID: \"2daa468b-e9f8-41a3-ba94-a1e33093fc97\") " pod="openshift-apiserver/apiserver-76f77b778f-kplt9"
Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.123077 5004 reconciler_common.go:245]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d2bbf8d8-0338-4af4-8d6a-402033f87676-serving-cert\") pod \"route-controller-manager-6576b87f9c-7khqv\" (UID: \"d2bbf8d8-0338-4af4-8d6a-402033f87676\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7khqv" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.123107 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b170acc-4880-42d4-ae54-0946ba0029b5-config\") pod \"authentication-operator-69f744f599-qjz9p\" (UID: \"7b170acc-4880-42d4-ae54-0946ba0029b5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qjz9p" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.123281 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17c7a11e-bffa-4ecf-abe0-c467a33538a8-config\") pod \"openshift-apiserver-operator-796bbdcf4f-j9k4x\" (UID: \"17c7a11e-bffa-4ecf-abe0-c467a33538a8\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-j9k4x" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.123323 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2daa468b-e9f8-41a3-ba94-a1e33093fc97-serving-cert\") pod \"apiserver-76f77b778f-kplt9\" (UID: \"2daa468b-e9f8-41a3-ba94-a1e33093fc97\") " pod="openshift-apiserver/apiserver-76f77b778f-kplt9" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.123450 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f694a09-4564-4103-b1b0-ea419e62082e-config\") pod \"machine-approver-56656f9798-dsrp6\" (UID: \"1f694a09-4564-4103-b1b0-ea419e62082e\") 
" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dsrp6" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.123493 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghv98\" (UniqueName: \"kubernetes.io/projected/d2bbf8d8-0338-4af4-8d6a-402033f87676-kube-api-access-ghv98\") pod \"route-controller-manager-6576b87f9c-7khqv\" (UID: \"d2bbf8d8-0338-4af4-8d6a-402033f87676\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7khqv" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.123807 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/2daa468b-e9f8-41a3-ba94-a1e33093fc97-etcd-serving-ca\") pod \"apiserver-76f77b778f-kplt9\" (UID: \"2daa468b-e9f8-41a3-ba94-a1e33093fc97\") " pod="openshift-apiserver/apiserver-76f77b778f-kplt9" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.123853 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29bfa426-07b0-4acb-a886-9f9316644d71-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-pkg9x\" (UID: \"29bfa426-07b0-4acb-a886-9f9316644d71\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pkg9x" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.123905 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7b170acc-4880-42d4-ae54-0946ba0029b5-serving-cert\") pod \"authentication-operator-69f744f599-qjz9p\" (UID: \"7b170acc-4880-42d4-ae54-0946ba0029b5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qjz9p" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.123939 5004 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7b170acc-4880-42d4-ae54-0946ba0029b5-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-qjz9p\" (UID: \"7b170acc-4880-42d4-ae54-0946ba0029b5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qjz9p" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.123973 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bx9bx\" (UniqueName: \"kubernetes.io/projected/16dd04af-b1db-4a72-8f1f-8d53ffd52b41-kube-api-access-bx9bx\") pod \"cluster-samples-operator-665b6dd947-wvmxq\" (UID: \"16dd04af-b1db-4a72-8f1f-8d53ffd52b41\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wvmxq" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.124063 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2daa468b-e9f8-41a3-ba94-a1e33093fc97-config\") pod \"apiserver-76f77b778f-kplt9\" (UID: \"2daa468b-e9f8-41a3-ba94-a1e33093fc97\") " pod="openshift-apiserver/apiserver-76f77b778f-kplt9" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.124096 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ws6fx\" (UniqueName: \"kubernetes.io/projected/2daa468b-e9f8-41a3-ba94-a1e33093fc97-kube-api-access-ws6fx\") pod \"apiserver-76f77b778f-kplt9\" (UID: \"2daa468b-e9f8-41a3-ba94-a1e33093fc97\") " pod="openshift-apiserver/apiserver-76f77b778f-kplt9" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.124225 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/2daa468b-e9f8-41a3-ba94-a1e33093fc97-node-pullsecrets\") pod \"apiserver-76f77b778f-kplt9\" (UID: 
\"2daa468b-e9f8-41a3-ba94-a1e33093fc97\") " pod="openshift-apiserver/apiserver-76f77b778f-kplt9" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.124292 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxmpn\" (UniqueName: \"kubernetes.io/projected/7b170acc-4880-42d4-ae54-0946ba0029b5-kube-api-access-vxmpn\") pod \"authentication-operator-69f744f599-qjz9p\" (UID: \"7b170acc-4880-42d4-ae54-0946ba0029b5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qjz9p" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.124406 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2daa468b-e9f8-41a3-ba94-a1e33093fc97-trusted-ca-bundle\") pod \"apiserver-76f77b778f-kplt9\" (UID: \"2daa468b-e9f8-41a3-ba94-a1e33093fc97\") " pod="openshift-apiserver/apiserver-76f77b778f-kplt9" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.124462 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/29bfa426-07b0-4acb-a886-9f9316644d71-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-pkg9x\" (UID: \"29bfa426-07b0-4acb-a886-9f9316644d71\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pkg9x" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.124489 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/16dd04af-b1db-4a72-8f1f-8d53ffd52b41-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-wvmxq\" (UID: \"16dd04af-b1db-4a72-8f1f-8d53ffd52b41\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wvmxq" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.124516 5004 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/2daa468b-e9f8-41a3-ba94-a1e33093fc97-encryption-config\") pod \"apiserver-76f77b778f-kplt9\" (UID: \"2daa468b-e9f8-41a3-ba94-a1e33093fc97\") " pod="openshift-apiserver/apiserver-76f77b778f-kplt9" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.124553 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/c3ebf4d5-102a-4552-b30b-cbacb3a779fa-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-b9vjf\" (UID: \"c3ebf4d5-102a-4552-b30b-cbacb3a779fa\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-b9vjf" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.124601 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d2bbf8d8-0338-4af4-8d6a-402033f87676-client-ca\") pod \"route-controller-manager-6576b87f9c-7khqv\" (UID: \"d2bbf8d8-0338-4af4-8d6a-402033f87676\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7khqv" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.124632 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/17c7a11e-bffa-4ecf-abe0-c467a33538a8-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-j9k4x\" (UID: \"17c7a11e-bffa-4ecf-abe0-c467a33538a8\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-j9k4x" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.124658 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/2daa468b-e9f8-41a3-ba94-a1e33093fc97-audit\") pod \"apiserver-76f77b778f-kplt9\" 
(UID: \"2daa468b-e9f8-41a3-ba94-a1e33093fc97\") " pod="openshift-apiserver/apiserver-76f77b778f-kplt9" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.124687 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/1f694a09-4564-4103-b1b0-ea419e62082e-machine-approver-tls\") pod \"machine-approver-56656f9798-dsrp6\" (UID: \"1f694a09-4564-4103-b1b0-ea419e62082e\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dsrp6" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.124716 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2daa468b-e9f8-41a3-ba94-a1e33093fc97-audit-dir\") pod \"apiserver-76f77b778f-kplt9\" (UID: \"2daa468b-e9f8-41a3-ba94-a1e33093fc97\") " pod="openshift-apiserver/apiserver-76f77b778f-kplt9" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.124740 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnq4v\" (UniqueName: \"kubernetes.io/projected/1f694a09-4564-4103-b1b0-ea419e62082e-kube-api-access-jnq4v\") pod \"machine-approver-56656f9798-dsrp6\" (UID: \"1f694a09-4564-4103-b1b0-ea419e62082e\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dsrp6" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.124769 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/c3ebf4d5-102a-4552-b30b-cbacb3a779fa-images\") pod \"machine-api-operator-5694c8668f-b9vjf\" (UID: \"c3ebf4d5-102a-4552-b30b-cbacb3a779fa\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-b9vjf" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.124800 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/2daa468b-e9f8-41a3-ba94-a1e33093fc97-image-import-ca\") pod \"apiserver-76f77b778f-kplt9\" (UID: \"2daa468b-e9f8-41a3-ba94-a1e33093fc97\") " pod="openshift-apiserver/apiserver-76f77b778f-kplt9" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.126057 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7b170acc-4880-42d4-ae54-0946ba0029b5-service-ca-bundle\") pod \"authentication-operator-69f744f599-qjz9p\" (UID: \"7b170acc-4880-42d4-ae54-0946ba0029b5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qjz9p" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.126133 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbwpr\" (UniqueName: \"kubernetes.io/projected/c3ebf4d5-102a-4552-b30b-cbacb3a779fa-kube-api-access-qbwpr\") pod \"machine-api-operator-5694c8668f-b9vjf\" (UID: \"c3ebf4d5-102a-4552-b30b-cbacb3a779fa\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-b9vjf" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.126179 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2bbf8d8-0338-4af4-8d6a-402033f87676-config\") pod \"route-controller-manager-6576b87f9c-7khqv\" (UID: \"d2bbf8d8-0338-4af4-8d6a-402033f87676\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7khqv" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.126217 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1f694a09-4564-4103-b1b0-ea419e62082e-auth-proxy-config\") pod \"machine-approver-56656f9798-dsrp6\" (UID: \"1f694a09-4564-4103-b1b0-ea419e62082e\") " 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dsrp6" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.131538 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.131638 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.131845 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.132132 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.135710 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.136723 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-th28b"] Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.136985 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.137278 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.137313 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-th28b" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.137364 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-w4btr"] Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.137316 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.137901 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-w4btr" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.137354 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.138091 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-8jr7l"] Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.137362 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.137387 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.137414 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.137440 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.138313 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 01 08:19:17 
crc kubenswrapper[5004]: I1201 08:19:17.137462 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.137494 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.138497 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-8jr7l" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.137508 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.137521 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.137607 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.137626 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.137641 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.137662 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.138965 5004 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.148081 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.148107 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.148226 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.148281 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.148311 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.148365 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.148421 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.148438 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.148510 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.148522 5004 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.148614 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.148747 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.148778 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.148834 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.148881 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.148935 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.148939 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.148884 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.149010 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.149059 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.149104 5004 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.149152 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.149321 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.149494 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.149509 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.151245 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-829wj"] Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.151921 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-829wj" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.152657 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.154984 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.155616 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.159139 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vrl25"] Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.159347 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.160070 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.160268 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.163197 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.163457 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.163643 5004 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.164128 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.164251 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.164470 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.164653 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.164672 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.164700 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.164785 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.164814 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.165131 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vrl25" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.175594 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.177613 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-h6qmw"] Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.177676 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.179364 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.179910 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.180798 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.186736 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.187286 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-p9c2d"] Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.188094 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-glpkv"] Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.188817 5004 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.189815 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.190750 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-h6qmw" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.194243 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-v5wjg"] Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.194749 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-p9c2d" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.206529 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.207139 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-glpkv" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.207816 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-mrj44"] Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.208254 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-6x76r"] Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.208596 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-zj88j"] Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.208847 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-v5wjg" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.208935 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mrj44" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.209088 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-6x76r" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.209686 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-ft4b5"] Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.210365 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ft4b5" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.210630 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-zj88j" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.210954 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-njrts"] Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.213129 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nsn58"] Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.213215 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-njrts" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.212667 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.213649 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nsn58" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.214025 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.214144 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.214462 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.215266 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.217174 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-cnkdl"] Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.217725 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-44tqp"] Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.218005 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.218456 5004 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-cnkdl" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.218593 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-44tqp" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.219917 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8g8bf"] Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.220378 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-rlzws"] Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.220860 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-rlzws" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.221253 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-f2m5b"] Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.221624 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-f2m5b" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.222269 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-8g8bf" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.222835 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-pbxqh"] Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.223513 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-pbxqh" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.225338 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.225523 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hr5n5"] Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.226029 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409615-5d85l"] Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.226367 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409615-5d85l" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.226367 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hr5n5" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.226842 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-8z2fx"] Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.227514 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-8z2fx" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.228169 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/17c7a11e-bffa-4ecf-abe0-c467a33538a8-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-j9k4x\" (UID: \"17c7a11e-bffa-4ecf-abe0-c467a33538a8\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-j9k4x" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.228200 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/2daa468b-e9f8-41a3-ba94-a1e33093fc97-audit\") pod \"apiserver-76f77b778f-kplt9\" (UID: \"2daa468b-e9f8-41a3-ba94-a1e33093fc97\") " pod="openshift-apiserver/apiserver-76f77b778f-kplt9" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.228217 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/1f694a09-4564-4103-b1b0-ea419e62082e-machine-approver-tls\") pod \"machine-approver-56656f9798-dsrp6\" (UID: \"1f694a09-4564-4103-b1b0-ea419e62082e\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dsrp6" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.228236 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d2bbf8d8-0338-4af4-8d6a-402033f87676-client-ca\") pod \"route-controller-manager-6576b87f9c-7khqv\" (UID: \"d2bbf8d8-0338-4af4-8d6a-402033f87676\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7khqv" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.228252 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/2daa468b-e9f8-41a3-ba94-a1e33093fc97-audit-dir\") pod \"apiserver-76f77b778f-kplt9\" (UID: \"2daa468b-e9f8-41a3-ba94-a1e33093fc97\") " pod="openshift-apiserver/apiserver-76f77b778f-kplt9" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.228269 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jnq4v\" (UniqueName: \"kubernetes.io/projected/1f694a09-4564-4103-b1b0-ea419e62082e-kube-api-access-jnq4v\") pod \"machine-approver-56656f9798-dsrp6\" (UID: \"1f694a09-4564-4103-b1b0-ea419e62082e\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dsrp6" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.228285 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/c3ebf4d5-102a-4552-b30b-cbacb3a779fa-images\") pod \"machine-api-operator-5694c8668f-b9vjf\" (UID: \"c3ebf4d5-102a-4552-b30b-cbacb3a779fa\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-b9vjf" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.228299 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/2daa468b-e9f8-41a3-ba94-a1e33093fc97-image-import-ca\") pod \"apiserver-76f77b778f-kplt9\" (UID: \"2daa468b-e9f8-41a3-ba94-a1e33093fc97\") " pod="openshift-apiserver/apiserver-76f77b778f-kplt9" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.228324 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7b170acc-4880-42d4-ae54-0946ba0029b5-service-ca-bundle\") pod \"authentication-operator-69f744f599-qjz9p\" (UID: \"7b170acc-4880-42d4-ae54-0946ba0029b5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qjz9p" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.228339 5004 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1f694a09-4564-4103-b1b0-ea419e62082e-auth-proxy-config\") pod \"machine-approver-56656f9798-dsrp6\" (UID: \"1f694a09-4564-4103-b1b0-ea419e62082e\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dsrp6" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.228355 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qbwpr\" (UniqueName: \"kubernetes.io/projected/c3ebf4d5-102a-4552-b30b-cbacb3a779fa-kube-api-access-qbwpr\") pod \"machine-api-operator-5694c8668f-b9vjf\" (UID: \"c3ebf4d5-102a-4552-b30b-cbacb3a779fa\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-b9vjf" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.228371 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2bbf8d8-0338-4af4-8d6a-402033f87676-config\") pod \"route-controller-manager-6576b87f9c-7khqv\" (UID: \"d2bbf8d8-0338-4af4-8d6a-402033f87676\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7khqv" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.228403 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zlvzx\" (UniqueName: \"kubernetes.io/projected/17c7a11e-bffa-4ecf-abe0-c467a33538a8-kube-api-access-zlvzx\") pod \"openshift-apiserver-operator-796bbdcf4f-j9k4x\" (UID: \"17c7a11e-bffa-4ecf-abe0-c467a33538a8\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-j9k4x" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.228418 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/29bfa426-07b0-4acb-a886-9f9316644d71-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-pkg9x\" (UID: 
\"29bfa426-07b0-4acb-a886-9f9316644d71\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pkg9x" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.228436 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3ebf4d5-102a-4552-b30b-cbacb3a779fa-config\") pod \"machine-api-operator-5694c8668f-b9vjf\" (UID: \"c3ebf4d5-102a-4552-b30b-cbacb3a779fa\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-b9vjf" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.228451 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/2daa468b-e9f8-41a3-ba94-a1e33093fc97-etcd-client\") pod \"apiserver-76f77b778f-kplt9\" (UID: \"2daa468b-e9f8-41a3-ba94-a1e33093fc97\") " pod="openshift-apiserver/apiserver-76f77b778f-kplt9" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.228465 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d2bbf8d8-0338-4af4-8d6a-402033f87676-serving-cert\") pod \"route-controller-manager-6576b87f9c-7khqv\" (UID: \"d2bbf8d8-0338-4af4-8d6a-402033f87676\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7khqv" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.228480 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b170acc-4880-42d4-ae54-0946ba0029b5-config\") pod \"authentication-operator-69f744f599-qjz9p\" (UID: \"7b170acc-4880-42d4-ae54-0946ba0029b5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qjz9p" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.228497 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/17c7a11e-bffa-4ecf-abe0-c467a33538a8-config\") pod \"openshift-apiserver-operator-796bbdcf4f-j9k4x\" (UID: \"17c7a11e-bffa-4ecf-abe0-c467a33538a8\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-j9k4x" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.228520 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2daa468b-e9f8-41a3-ba94-a1e33093fc97-serving-cert\") pod \"apiserver-76f77b778f-kplt9\" (UID: \"2daa468b-e9f8-41a3-ba94-a1e33093fc97\") " pod="openshift-apiserver/apiserver-76f77b778f-kplt9" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.228538 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f694a09-4564-4103-b1b0-ea419e62082e-config\") pod \"machine-approver-56656f9798-dsrp6\" (UID: \"1f694a09-4564-4103-b1b0-ea419e62082e\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dsrp6" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.228553 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ghv98\" (UniqueName: \"kubernetes.io/projected/d2bbf8d8-0338-4af4-8d6a-402033f87676-kube-api-access-ghv98\") pod \"route-controller-manager-6576b87f9c-7khqv\" (UID: \"d2bbf8d8-0338-4af4-8d6a-402033f87676\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7khqv" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.228585 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/2daa468b-e9f8-41a3-ba94-a1e33093fc97-etcd-serving-ca\") pod \"apiserver-76f77b778f-kplt9\" (UID: \"2daa468b-e9f8-41a3-ba94-a1e33093fc97\") " pod="openshift-apiserver/apiserver-76f77b778f-kplt9" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.228600 5004 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29bfa426-07b0-4acb-a886-9f9316644d71-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-pkg9x\" (UID: \"29bfa426-07b0-4acb-a886-9f9316644d71\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pkg9x" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.228628 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7b170acc-4880-42d4-ae54-0946ba0029b5-serving-cert\") pod \"authentication-operator-69f744f599-qjz9p\" (UID: \"7b170acc-4880-42d4-ae54-0946ba0029b5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qjz9p" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.228721 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7b170acc-4880-42d4-ae54-0946ba0029b5-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-qjz9p\" (UID: \"7b170acc-4880-42d4-ae54-0946ba0029b5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qjz9p" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.228738 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bx9bx\" (UniqueName: \"kubernetes.io/projected/16dd04af-b1db-4a72-8f1f-8d53ffd52b41-kube-api-access-bx9bx\") pod \"cluster-samples-operator-665b6dd947-wvmxq\" (UID: \"16dd04af-b1db-4a72-8f1f-8d53ffd52b41\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wvmxq" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.228753 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2daa468b-e9f8-41a3-ba94-a1e33093fc97-config\") pod \"apiserver-76f77b778f-kplt9\" (UID: \"2daa468b-e9f8-41a3-ba94-a1e33093fc97\") " 
pod="openshift-apiserver/apiserver-76f77b778f-kplt9" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.228770 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/2daa468b-e9f8-41a3-ba94-a1e33093fc97-node-pullsecrets\") pod \"apiserver-76f77b778f-kplt9\" (UID: \"2daa468b-e9f8-41a3-ba94-a1e33093fc97\") " pod="openshift-apiserver/apiserver-76f77b778f-kplt9" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.228785 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ws6fx\" (UniqueName: \"kubernetes.io/projected/2daa468b-e9f8-41a3-ba94-a1e33093fc97-kube-api-access-ws6fx\") pod \"apiserver-76f77b778f-kplt9\" (UID: \"2daa468b-e9f8-41a3-ba94-a1e33093fc97\") " pod="openshift-apiserver/apiserver-76f77b778f-kplt9" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.228816 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vxmpn\" (UniqueName: \"kubernetes.io/projected/7b170acc-4880-42d4-ae54-0946ba0029b5-kube-api-access-vxmpn\") pod \"authentication-operator-69f744f599-qjz9p\" (UID: \"7b170acc-4880-42d4-ae54-0946ba0029b5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qjz9p" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.228834 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2daa468b-e9f8-41a3-ba94-a1e33093fc97-trusted-ca-bundle\") pod \"apiserver-76f77b778f-kplt9\" (UID: \"2daa468b-e9f8-41a3-ba94-a1e33093fc97\") " pod="openshift-apiserver/apiserver-76f77b778f-kplt9" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.228849 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/29bfa426-07b0-4acb-a886-9f9316644d71-serving-cert\") pod 
\"openshift-kube-scheduler-operator-5fdd9b5758-pkg9x\" (UID: \"29bfa426-07b0-4acb-a886-9f9316644d71\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pkg9x" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.228865 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/16dd04af-b1db-4a72-8f1f-8d53ffd52b41-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-wvmxq\" (UID: \"16dd04af-b1db-4a72-8f1f-8d53ffd52b41\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wvmxq" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.228879 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/2daa468b-e9f8-41a3-ba94-a1e33093fc97-encryption-config\") pod \"apiserver-76f77b778f-kplt9\" (UID: \"2daa468b-e9f8-41a3-ba94-a1e33093fc97\") " pod="openshift-apiserver/apiserver-76f77b778f-kplt9" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.228895 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/c3ebf4d5-102a-4552-b30b-cbacb3a779fa-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-b9vjf\" (UID: \"c3ebf4d5-102a-4552-b30b-cbacb3a779fa\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-b9vjf" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.230709 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jdc97"] Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.231080 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-qjz9p"] Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.231158 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wvmxq"] Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.231249 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jdc97" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.232279 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29bfa426-07b0-4acb-a886-9f9316644d71-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-pkg9x\" (UID: \"29bfa426-07b0-4acb-a886-9f9316644d71\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pkg9x" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.232837 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/c3ebf4d5-102a-4552-b30b-cbacb3a779fa-images\") pod \"machine-api-operator-5694c8668f-b9vjf\" (UID: \"c3ebf4d5-102a-4552-b30b-cbacb3a779fa\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-b9vjf" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.232942 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f694a09-4564-4103-b1b0-ea419e62082e-config\") pod \"machine-approver-56656f9798-dsrp6\" (UID: \"1f694a09-4564-4103-b1b0-ea419e62082e\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dsrp6" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.232960 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7b170acc-4880-42d4-ae54-0946ba0029b5-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-qjz9p\" (UID: \"7b170acc-4880-42d4-ae54-0946ba0029b5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qjz9p" Dec 01 08:19:17 crc 
kubenswrapper[5004]: I1201 08:19:17.234065 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/2daa468b-e9f8-41a3-ba94-a1e33093fc97-etcd-serving-ca\") pod \"apiserver-76f77b778f-kplt9\" (UID: \"2daa468b-e9f8-41a3-ba94-a1e33093fc97\") " pod="openshift-apiserver/apiserver-76f77b778f-kplt9" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.234119 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-b9vjf"] Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.242012 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rsfff"] Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.242475 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7b170acc-4880-42d4-ae54-0946ba0029b5-service-ca-bundle\") pod \"authentication-operator-69f744f599-qjz9p\" (UID: \"7b170acc-4880-42d4-ae54-0946ba0029b5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qjz9p" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.242675 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3ebf4d5-102a-4552-b30b-cbacb3a779fa-config\") pod \"machine-api-operator-5694c8668f-b9vjf\" (UID: \"c3ebf4d5-102a-4552-b30b-cbacb3a779fa\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-b9vjf" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.242760 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/c3ebf4d5-102a-4552-b30b-cbacb3a779fa-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-b9vjf\" (UID: \"c3ebf4d5-102a-4552-b30b-cbacb3a779fa\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-b9vjf" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.243032 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2daa468b-e9f8-41a3-ba94-a1e33093fc97-config\") pod \"apiserver-76f77b778f-kplt9\" (UID: \"2daa468b-e9f8-41a3-ba94-a1e33093fc97\") " pod="openshift-apiserver/apiserver-76f77b778f-kplt9" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.243072 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1f694a09-4564-4103-b1b0-ea419e62082e-auth-proxy-config\") pod \"machine-approver-56656f9798-dsrp6\" (UID: \"1f694a09-4564-4103-b1b0-ea419e62082e\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dsrp6" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.244036 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2bbf8d8-0338-4af4-8d6a-402033f87676-config\") pod \"route-controller-manager-6576b87f9c-7khqv\" (UID: \"d2bbf8d8-0338-4af4-8d6a-402033f87676\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7khqv" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.244181 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2daa468b-e9f8-41a3-ba94-a1e33093fc97-trusted-ca-bundle\") pod \"apiserver-76f77b778f-kplt9\" (UID: \"2daa468b-e9f8-41a3-ba94-a1e33093fc97\") " pod="openshift-apiserver/apiserver-76f77b778f-kplt9" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.244285 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/2daa468b-e9f8-41a3-ba94-a1e33093fc97-node-pullsecrets\") pod \"apiserver-76f77b778f-kplt9\" (UID: 
\"2daa468b-e9f8-41a3-ba94-a1e33093fc97\") " pod="openshift-apiserver/apiserver-76f77b778f-kplt9" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.245955 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/17c7a11e-bffa-4ecf-abe0-c467a33538a8-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-j9k4x\" (UID: \"17c7a11e-bffa-4ecf-abe0-c467a33538a8\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-j9k4x" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.246804 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.247779 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/29bfa426-07b0-4acb-a886-9f9316644d71-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-pkg9x\" (UID: \"29bfa426-07b0-4acb-a886-9f9316644d71\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pkg9x" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.248574 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d2bbf8d8-0338-4af4-8d6a-402033f87676-client-ca\") pod \"route-controller-manager-6576b87f9c-7khqv\" (UID: \"d2bbf8d8-0338-4af4-8d6a-402033f87676\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7khqv" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.248661 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/2daa468b-e9f8-41a3-ba94-a1e33093fc97-etcd-client\") pod \"apiserver-76f77b778f-kplt9\" (UID: \"2daa468b-e9f8-41a3-ba94-a1e33093fc97\") " pod="openshift-apiserver/apiserver-76f77b778f-kplt9" Dec 01 08:19:17 crc kubenswrapper[5004]: 
I1201 08:19:17.248932 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2daa468b-e9f8-41a3-ba94-a1e33093fc97-audit-dir\") pod \"apiserver-76f77b778f-kplt9\" (UID: \"2daa468b-e9f8-41a3-ba94-a1e33093fc97\") " pod="openshift-apiserver/apiserver-76f77b778f-kplt9" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.249726 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b170acc-4880-42d4-ae54-0946ba0029b5-config\") pod \"authentication-operator-69f744f599-qjz9p\" (UID: \"7b170acc-4880-42d4-ae54-0946ba0029b5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qjz9p" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.250816 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/1f694a09-4564-4103-b1b0-ea419e62082e-machine-approver-tls\") pod \"machine-approver-56656f9798-dsrp6\" (UID: \"1f694a09-4564-4103-b1b0-ea419e62082e\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dsrp6" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.250915 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/2daa468b-e9f8-41a3-ba94-a1e33093fc97-image-import-ca\") pod \"apiserver-76f77b778f-kplt9\" (UID: \"2daa468b-e9f8-41a3-ba94-a1e33093fc97\") " pod="openshift-apiserver/apiserver-76f77b778f-kplt9" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.251246 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7b170acc-4880-42d4-ae54-0946ba0029b5-serving-cert\") pod \"authentication-operator-69f744f599-qjz9p\" (UID: \"7b170acc-4880-42d4-ae54-0946ba0029b5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qjz9p" Dec 01 
08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.251671 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17c7a11e-bffa-4ecf-abe0-c467a33538a8-config\") pod \"openshift-apiserver-operator-796bbdcf4f-j9k4x\" (UID: \"17c7a11e-bffa-4ecf-abe0-c467a33538a8\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-j9k4x" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.252347 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d2bbf8d8-0338-4af4-8d6a-402033f87676-serving-cert\") pod \"route-controller-manager-6576b87f9c-7khqv\" (UID: \"d2bbf8d8-0338-4af4-8d6a-402033f87676\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7khqv" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.252659 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2daa468b-e9f8-41a3-ba94-a1e33093fc97-serving-cert\") pod \"apiserver-76f77b778f-kplt9\" (UID: \"2daa468b-e9f8-41a3-ba94-a1e33093fc97\") " pod="openshift-apiserver/apiserver-76f77b778f-kplt9" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.252845 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.254397 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/2daa468b-e9f8-41a3-ba94-a1e33093fc97-encryption-config\") pod \"apiserver-76f77b778f-kplt9\" (UID: \"2daa468b-e9f8-41a3-ba94-a1e33093fc97\") " pod="openshift-apiserver/apiserver-76f77b778f-kplt9" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.256984 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-kplt9"] Dec 01 08:19:17 crc 
kubenswrapper[5004]: I1201 08:19:17.257012 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-th28b"] Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.257022 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4f588"] Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.258692 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-h6qmw"] Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.258021 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rsfff" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.258934 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/2daa468b-e9f8-41a3-ba94-a1e33093fc97-audit\") pod \"apiserver-76f77b778f-kplt9\" (UID: \"2daa468b-e9f8-41a3-ba94-a1e33093fc97\") " pod="openshift-apiserver/apiserver-76f77b778f-kplt9" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.259723 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/16dd04af-b1db-4a72-8f1f-8d53ffd52b41-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-wvmxq\" (UID: \"16dd04af-b1db-4a72-8f1f-8d53ffd52b41\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wvmxq" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.263275 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-v5wjg"] Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.264755 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-p77t7"] Dec 01 08:19:17 
crc kubenswrapper[5004]: I1201 08:19:17.265816 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-829wj"] Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.269887 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-c6929"] Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.270872 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-c6929" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.271202 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-j9k4x"] Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.271877 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.274619 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-zj88j"] Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.275800 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pkg9x"] Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.276690 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-p9c2d"] Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.277707 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-44tqp"] Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.278759 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-mrj44"] Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 
08:19:17.279791 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-7khqv"] Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.280822 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-njrts"] Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.281851 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-w4btr"] Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.282860 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-qmztb"] Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.284284 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vrl25"] Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.285154 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-8jr7l"] Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.286176 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-jshhn"] Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.287202 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-glpkv"] Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.288213 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nsn58"] Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.289234 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-ft4b5"] Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.291461 5004 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-ingress-canary/ingress-canary-dpnk8"] Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.292295 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.292386 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-dpnk8" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.292771 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rsfff"] Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.293933 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8g8bf"] Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.295338 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-f2m5b"] Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.296903 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jdc97"] Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.298341 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-qwgn4"] Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.299054 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-qwgn4" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.300357 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409615-5d85l"] Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.301480 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-cnkdl"] Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.302783 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-rlzws"] Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.303847 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-8z2fx"] Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.305727 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-qwgn4"] Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.306785 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-dpnk8"] Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.307829 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hr5n5"] Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.308997 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-pbxqh"] Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.311140 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-pt82b"] Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.312656 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-pt82b" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.313481 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-pt82b"] Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.332970 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.352254 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.372548 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.392523 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.412699 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.432804 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.451899 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.472779 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.492551 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.512074 5004 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.532978 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.561936 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.572765 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.592691 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.612747 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.632804 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.653307 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.673391 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.692890 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.713693 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.732922 5004 reflector.go:368] Caches populated for *v1.Secret 
from object-"openshift-oauth-apiserver"/"etcd-client" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.753077 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.771973 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.792554 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.812496 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.833599 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.852370 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.873436 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.893104 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.912490 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.933078 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.953418 5004 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.972798 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 01 08:19:17 crc kubenswrapper[5004]: I1201 08:19:17.994305 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 01 08:19:18 crc kubenswrapper[5004]: I1201 08:19:18.012639 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 01 08:19:18 crc kubenswrapper[5004]: I1201 08:19:18.032759 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 01 08:19:18 crc kubenswrapper[5004]: I1201 08:19:18.053185 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 01 08:19:18 crc kubenswrapper[5004]: I1201 08:19:18.084538 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 01 08:19:18 crc kubenswrapper[5004]: I1201 08:19:18.093868 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 01 08:19:18 crc kubenswrapper[5004]: I1201 08:19:18.113413 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 01 08:19:18 crc kubenswrapper[5004]: I1201 08:19:18.134011 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 01 08:19:18 crc kubenswrapper[5004]: I1201 08:19:18.153112 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Dec 01 08:19:18 crc kubenswrapper[5004]: 
I1201 08:19:18.174643 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Dec 01 08:19:18 crc kubenswrapper[5004]: I1201 08:19:18.193704 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 01 08:19:18 crc kubenswrapper[5004]: I1201 08:19:18.213535 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Dec 01 08:19:18 crc kubenswrapper[5004]: I1201 08:19:18.230625 5004 request.go:700] Waited for 1.010872597s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-storage-version-migrator/secrets?fieldSelector=metadata.name%3Dkube-storage-version-migrator-sa-dockercfg-5xfcg&limit=500&resourceVersion=0 Dec 01 08:19:18 crc kubenswrapper[5004]: I1201 08:19:18.232435 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Dec 01 08:19:18 crc kubenswrapper[5004]: I1201 08:19:18.273437 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 01 08:19:18 crc kubenswrapper[5004]: I1201 08:19:18.293120 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Dec 01 08:19:18 crc kubenswrapper[5004]: I1201 08:19:18.313334 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 01 08:19:18 crc kubenswrapper[5004]: I1201 08:19:18.333645 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Dec 01 08:19:18 crc 
kubenswrapper[5004]: I1201 08:19:18.353084 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 01 08:19:18 crc kubenswrapper[5004]: I1201 08:19:18.373634 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Dec 01 08:19:18 crc kubenswrapper[5004]: I1201 08:19:18.428507 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 01 08:19:18 crc kubenswrapper[5004]: I1201 08:19:18.428521 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Dec 01 08:19:18 crc kubenswrapper[5004]: I1201 08:19:18.432334 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 01 08:19:18 crc kubenswrapper[5004]: I1201 08:19:18.452037 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 01 08:19:18 crc kubenswrapper[5004]: I1201 08:19:18.473049 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 01 08:19:18 crc kubenswrapper[5004]: I1201 08:19:18.493204 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 01 08:19:18 crc kubenswrapper[5004]: I1201 08:19:18.512166 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 01 08:19:18 crc kubenswrapper[5004]: I1201 08:19:18.532937 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 01 08:19:18 crc kubenswrapper[5004]: I1201 08:19:18.552340 5004 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 01 08:19:18 crc kubenswrapper[5004]: I1201 08:19:18.572915 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 01 08:19:18 crc kubenswrapper[5004]: I1201 08:19:18.592956 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 01 08:19:18 crc kubenswrapper[5004]: I1201 08:19:18.612935 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 01 08:19:18 crc kubenswrapper[5004]: I1201 08:19:18.633031 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 01 08:19:18 crc kubenswrapper[5004]: I1201 08:19:18.653235 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 01 08:19:18 crc kubenswrapper[5004]: I1201 08:19:18.673527 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 01 08:19:18 crc kubenswrapper[5004]: I1201 08:19:18.693946 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 01 08:19:18 crc kubenswrapper[5004]: I1201 08:19:18.713259 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 01 08:19:18 crc kubenswrapper[5004]: I1201 08:19:18.732600 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 01 08:19:18 crc kubenswrapper[5004]: I1201 08:19:18.759596 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 01 08:19:18 crc 
kubenswrapper[5004]: I1201 08:19:18.771770 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 01 08:19:18 crc kubenswrapper[5004]: I1201 08:19:18.793904 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 01 08:19:18 crc kubenswrapper[5004]: I1201 08:19:18.813316 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 01 08:19:18 crc kubenswrapper[5004]: I1201 08:19:18.832848 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 01 08:19:18 crc kubenswrapper[5004]: I1201 08:19:18.852789 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Dec 01 08:19:18 crc kubenswrapper[5004]: I1201 08:19:18.892087 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnq4v\" (UniqueName: \"kubernetes.io/projected/1f694a09-4564-4103-b1b0-ea419e62082e-kube-api-access-jnq4v\") pod \"machine-approver-56656f9798-dsrp6\" (UID: \"1f694a09-4564-4103-b1b0-ea419e62082e\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dsrp6" Dec 01 08:19:18 crc kubenswrapper[5004]: I1201 08:19:18.894204 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 01 08:19:18 crc kubenswrapper[5004]: I1201 08:19:18.939349 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zlvzx\" (UniqueName: \"kubernetes.io/projected/17c7a11e-bffa-4ecf-abe0-c467a33538a8-kube-api-access-zlvzx\") pod \"openshift-apiserver-operator-796bbdcf4f-j9k4x\" (UID: \"17c7a11e-bffa-4ecf-abe0-c467a33538a8\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-j9k4x" Dec 
01 08:19:18 crc kubenswrapper[5004]: I1201 08:19:18.950732 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ghv98\" (UniqueName: \"kubernetes.io/projected/d2bbf8d8-0338-4af4-8d6a-402033f87676-kube-api-access-ghv98\") pod \"route-controller-manager-6576b87f9c-7khqv\" (UID: \"d2bbf8d8-0338-4af4-8d6a-402033f87676\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7khqv" Dec 01 08:19:18 crc kubenswrapper[5004]: I1201 08:19:18.953245 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dsrp6" Dec 01 08:19:18 crc kubenswrapper[5004]: I1201 08:19:18.963742 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-j9k4x" Dec 01 08:19:18 crc kubenswrapper[5004]: W1201 08:19:18.980065 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f694a09_4564_4103_b1b0_ea419e62082e.slice/crio-0ab40725ddd1344d8de61b05f290d0e86dc37a722aaf8d89bc59a73feab74c28 WatchSource:0}: Error finding container 0ab40725ddd1344d8de61b05f290d0e86dc37a722aaf8d89bc59a73feab74c28: Status 404 returned error can't find the container with id 0ab40725ddd1344d8de61b05f290d0e86dc37a722aaf8d89bc59a73feab74c28 Dec 01 08:19:18 crc kubenswrapper[5004]: I1201 08:19:18.984356 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/29bfa426-07b0-4acb-a886-9f9316644d71-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-pkg9x\" (UID: \"29bfa426-07b0-4acb-a886-9f9316644d71\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pkg9x" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.002889 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-qbwpr\" (UniqueName: \"kubernetes.io/projected/c3ebf4d5-102a-4552-b30b-cbacb3a779fa-kube-api-access-qbwpr\") pod \"machine-api-operator-5694c8668f-b9vjf\" (UID: \"c3ebf4d5-102a-4552-b30b-cbacb3a779fa\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-b9vjf" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.012595 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ws6fx\" (UniqueName: \"kubernetes.io/projected/2daa468b-e9f8-41a3-ba94-a1e33093fc97-kube-api-access-ws6fx\") pod \"apiserver-76f77b778f-kplt9\" (UID: \"2daa468b-e9f8-41a3-ba94-a1e33093fc97\") " pod="openshift-apiserver/apiserver-76f77b778f-kplt9" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.013075 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pkg9x" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.057503 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxmpn\" (UniqueName: \"kubernetes.io/projected/7b170acc-4880-42d4-ae54-0946ba0029b5-kube-api-access-vxmpn\") pod \"authentication-operator-69f744f599-qjz9p\" (UID: \"7b170acc-4880-42d4-ae54-0946ba0029b5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qjz9p" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.070705 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.071463 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bx9bx\" (UniqueName: \"kubernetes.io/projected/16dd04af-b1db-4a72-8f1f-8d53ffd52b41-kube-api-access-bx9bx\") pod \"cluster-samples-operator-665b6dd947-wvmxq\" (UID: \"16dd04af-b1db-4a72-8f1f-8d53ffd52b41\") " 
pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wvmxq" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.076075 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.092724 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.116210 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.132207 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.152655 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.174104 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.192850 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.199757 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-kplt9" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.211326 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-j9k4x"] Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.212578 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.214723 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-b9vjf" Dec 01 08:19:19 crc kubenswrapper[5004]: W1201 08:19:19.218700 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17c7a11e_bffa_4ecf_abe0_c467a33538a8.slice/crio-5cf41589bc241bb2e5363c35553f5f6844e9e8a146e666e530c6fa48746ce5be WatchSource:0}: Error finding container 5cf41589bc241bb2e5363c35553f5f6844e9e8a146e666e530c6fa48746ce5be: Status 404 returned error can't find the container with id 5cf41589bc241bb2e5363c35553f5f6844e9e8a146e666e530c6fa48746ce5be Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.232702 5004 request.go:700] Waited for 1.933381147s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-dns/secrets?fieldSelector=metadata.name%3Ddns-default-metrics-tls&limit=500&resourceVersion=0 Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.234215 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.237409 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7khqv" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.251863 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.272941 5004 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.277824 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wvmxq" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.293739 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.294821 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-qjz9p" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.312072 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.357011 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/15dc5e3f-02c6-474d-bd7b-d51ce42340b3-audit-dir\") pod \"oauth-openshift-558db77b4-p77t7\" (UID: \"15dc5e3f-02c6-474d-bd7b-d51ce42340b3\") " pod="openshift-authentication/oauth-openshift-558db77b4-p77t7" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.357047 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cdd6ce32-26c3-4202-860b-8b37ead2941c-config\") pod \"etcd-operator-b45778765-glpkv\" (UID: 
\"cdd6ce32-26c3-4202-860b-8b37ead2941c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-glpkv" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.357071 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9c645213-a3fd-4f35-9edd-60905873a559-bound-sa-token\") pod \"image-registry-697d97f7c8-8jr7l\" (UID: \"9c645213-a3fd-4f35-9edd-60905873a559\") " pod="openshift-image-registry/image-registry-697d97f7c8-8jr7l" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.357094 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/15dc5e3f-02c6-474d-bd7b-d51ce42340b3-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-p77t7\" (UID: \"15dc5e3f-02c6-474d-bd7b-d51ce42340b3\") " pod="openshift-authentication/oauth-openshift-558db77b4-p77t7" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.357119 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hc6xg\" (UniqueName: \"kubernetes.io/projected/15dc5e3f-02c6-474d-bd7b-d51ce42340b3-kube-api-access-hc6xg\") pod \"oauth-openshift-558db77b4-p77t7\" (UID: \"15dc5e3f-02c6-474d-bd7b-d51ce42340b3\") " pod="openshift-authentication/oauth-openshift-558db77b4-p77t7" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.357142 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/15dc5e3f-02c6-474d-bd7b-d51ce42340b3-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-p77t7\" (UID: \"15dc5e3f-02c6-474d-bd7b-d51ce42340b3\") " pod="openshift-authentication/oauth-openshift-558db77b4-p77t7" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.357180 5004 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/15dc5e3f-02c6-474d-bd7b-d51ce42340b3-audit-policies\") pod \"oauth-openshift-558db77b4-p77t7\" (UID: \"15dc5e3f-02c6-474d-bd7b-d51ce42340b3\") " pod="openshift-authentication/oauth-openshift-558db77b4-p77t7" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.357214 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cdd6ce32-26c3-4202-860b-8b37ead2941c-serving-cert\") pod \"etcd-operator-b45778765-glpkv\" (UID: \"cdd6ce32-26c3-4202-860b-8b37ead2941c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-glpkv" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.357259 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hh7c\" (UniqueName: \"kubernetes.io/projected/6047f6c2-4e66-4dde-b262-383c622eef04-kube-api-access-5hh7c\") pod \"cluster-image-registry-operator-dc59b4c8b-829wj\" (UID: \"6047f6c2-4e66-4dde-b262-383c622eef04\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-829wj" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.357279 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/cdd6ce32-26c3-4202-860b-8b37ead2941c-etcd-service-ca\") pod \"etcd-operator-b45778765-glpkv\" (UID: \"cdd6ce32-26c3-4202-860b-8b37ead2941c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-glpkv" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.357303 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mh5lv\" (UniqueName: \"kubernetes.io/projected/59b7fdd8-0d91-4442-a2a8-41c92d027266-kube-api-access-mh5lv\") pod 
\"openshift-controller-manager-operator-756b6f6bc6-4f588\" (UID: \"59b7fdd8-0d91-4442-a2a8-41c92d027266\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4f588" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.357328 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/15dc5e3f-02c6-474d-bd7b-d51ce42340b3-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-p77t7\" (UID: \"15dc5e3f-02c6-474d-bd7b-d51ce42340b3\") " pod="openshift-authentication/oauth-openshift-558db77b4-p77t7" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.357350 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/af840de7-db59-4020-a5c3-2d888069db1e-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-vrl25\" (UID: \"af840de7-db59-4020-a5c3-2d888069db1e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vrl25" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.357374 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aa8bb474-21e8-42b3-a2c6-81ef7d267d9d-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-h6qmw\" (UID: \"aa8bb474-21e8-42b3-a2c6-81ef7d267d9d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-h6qmw" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.357396 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/caebe48f-0ac0-436e-983b-6c5858472cf7-config\") pod \"console-operator-58897d9998-jshhn\" (UID: \"caebe48f-0ac0-436e-983b-6c5858472cf7\") " 
pod="openshift-console-operator/console-operator-58897d9998-jshhn" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.357420 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ce579b07-073d-450d-b056-1be2c7bed20f-console-config\") pod \"console-f9d7485db-th28b\" (UID: \"ce579b07-073d-450d-b056-1be2c7bed20f\") " pod="openshift-console/console-f9d7485db-th28b" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.357454 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ce579b07-073d-450d-b056-1be2c7bed20f-trusted-ca-bundle\") pod \"console-f9d7485db-th28b\" (UID: \"ce579b07-073d-450d-b056-1be2c7bed20f\") " pod="openshift-console/console-f9d7485db-th28b" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.357491 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6047f6c2-4e66-4dde-b262-383c622eef04-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-829wj\" (UID: \"6047f6c2-4e66-4dde-b262-383c622eef04\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-829wj" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.357531 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e0f00c28-2b6e-4127-a70e-43761e7cdb9e-bound-sa-token\") pod \"ingress-operator-5b745b69d9-v5wjg\" (UID: \"e0f00c28-2b6e-4127-a70e-43761e7cdb9e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-v5wjg" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.357573 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/15dc5e3f-02c6-474d-bd7b-d51ce42340b3-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-p77t7\" (UID: \"15dc5e3f-02c6-474d-bd7b-d51ce42340b3\") " pod="openshift-authentication/oauth-openshift-558db77b4-p77t7" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.357628 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6047f6c2-4e66-4dde-b262-383c622eef04-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-829wj\" (UID: \"6047f6c2-4e66-4dde-b262-383c622eef04\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-829wj" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.357654 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/aa8bb474-21e8-42b3-a2c6-81ef7d267d9d-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-h6qmw\" (UID: \"aa8bb474-21e8-42b3-a2c6-81ef7d267d9d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-h6qmw" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.357679 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ce579b07-073d-450d-b056-1be2c7bed20f-console-serving-cert\") pod \"console-f9d7485db-th28b\" (UID: \"ce579b07-073d-450d-b056-1be2c7bed20f\") " pod="openshift-console/console-f9d7485db-th28b" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.357701 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a2404628-0f25-4889-8a15-73576dd41470-metrics-tls\") pod \"dns-operator-744455d44c-w4btr\" (UID: \"a2404628-0f25-4889-8a15-73576dd41470\") " 
pod="openshift-dns-operator/dns-operator-744455d44c-w4btr" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.357732 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plrp2\" (UniqueName: \"kubernetes.io/projected/55711190-8e14-4951-9ac3-dc3675c3a86e-kube-api-access-plrp2\") pod \"openshift-config-operator-7777fb866f-p9c2d\" (UID: \"55711190-8e14-4951-9ac3-dc3675c3a86e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-p9c2d" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.357757 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/15dc5e3f-02c6-474d-bd7b-d51ce42340b3-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-p77t7\" (UID: \"15dc5e3f-02c6-474d-bd7b-d51ce42340b3\") " pod="openshift-authentication/oauth-openshift-558db77b4-p77t7" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.357780 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9c645213-a3fd-4f35-9edd-60905873a559-trusted-ca\") pod \"image-registry-697d97f7c8-8jr7l\" (UID: \"9c645213-a3fd-4f35-9edd-60905873a559\") " pod="openshift-image-registry/image-registry-697d97f7c8-8jr7l" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.357802 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/15dc5e3f-02c6-474d-bd7b-d51ce42340b3-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-p77t7\" (UID: \"15dc5e3f-02c6-474d-bd7b-d51ce42340b3\") " pod="openshift-authentication/oauth-openshift-558db77b4-p77t7" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.357851 5004 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa8bb474-21e8-42b3-a2c6-81ef7d267d9d-config\") pod \"kube-controller-manager-operator-78b949d7b-h6qmw\" (UID: \"aa8bb474-21e8-42b3-a2c6-81ef7d267d9d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-h6qmw" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.357877 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/15dc5e3f-02c6-474d-bd7b-d51ce42340b3-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-p77t7\" (UID: \"15dc5e3f-02c6-474d-bd7b-d51ce42340b3\") " pod="openshift-authentication/oauth-openshift-558db77b4-p77t7" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.357923 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lv9xz\" (UniqueName: \"kubernetes.io/projected/ce579b07-073d-450d-b056-1be2c7bed20f-kube-api-access-lv9xz\") pod \"console-f9d7485db-th28b\" (UID: \"ce579b07-073d-450d-b056-1be2c7bed20f\") " pod="openshift-console/console-f9d7485db-th28b" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.357949 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/15dc5e3f-02c6-474d-bd7b-d51ce42340b3-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-p77t7\" (UID: \"15dc5e3f-02c6-474d-bd7b-d51ce42340b3\") " pod="openshift-authentication/oauth-openshift-558db77b4-p77t7" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.357989 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/6047f6c2-4e66-4dde-b262-383c622eef04-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-829wj\" (UID: \"6047f6c2-4e66-4dde-b262-383c622eef04\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-829wj" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.358014 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e0f00c28-2b6e-4127-a70e-43761e7cdb9e-metrics-tls\") pod \"ingress-operator-5b745b69d9-v5wjg\" (UID: \"e0f00c28-2b6e-4127-a70e-43761e7cdb9e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-v5wjg" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.358239 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/55711190-8e14-4951-9ac3-dc3675c3a86e-serving-cert\") pod \"openshift-config-operator-7777fb866f-p9c2d\" (UID: \"55711190-8e14-4951-9ac3-dc3675c3a86e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-p9c2d" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.358269 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/15dc5e3f-02c6-474d-bd7b-d51ce42340b3-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-p77t7\" (UID: \"15dc5e3f-02c6-474d-bd7b-d51ce42340b3\") " pod="openshift-authentication/oauth-openshift-558db77b4-p77t7" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.358295 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/af840de7-db59-4020-a5c3-2d888069db1e-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-vrl25\" (UID: \"af840de7-db59-4020-a5c3-2d888069db1e\") " 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vrl25" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.358318 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2d4r\" (UniqueName: \"kubernetes.io/projected/c888afb0-ad29-42e2-ba4a-594f27ebbe4e-kube-api-access-g2d4r\") pod \"downloads-7954f5f757-qmztb\" (UID: \"c888afb0-ad29-42e2-ba4a-594f27ebbe4e\") " pod="openshift-console/downloads-7954f5f757-qmztb" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.358344 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bv8b7\" (UniqueName: \"kubernetes.io/projected/9c645213-a3fd-4f35-9edd-60905873a559-kube-api-access-bv8b7\") pod \"image-registry-697d97f7c8-8jr7l\" (UID: \"9c645213-a3fd-4f35-9edd-60905873a559\") " pod="openshift-image-registry/image-registry-697d97f7c8-8jr7l" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.358367 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ce579b07-073d-450d-b056-1be2c7bed20f-service-ca\") pod \"console-f9d7485db-th28b\" (UID: \"ce579b07-073d-450d-b056-1be2c7bed20f\") " pod="openshift-console/console-f9d7485db-th28b" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.358390 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/15dc5e3f-02c6-474d-bd7b-d51ce42340b3-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-p77t7\" (UID: \"15dc5e3f-02c6-474d-bd7b-d51ce42340b3\") " pod="openshift-authentication/oauth-openshift-558db77b4-p77t7" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.358412 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" 
(UniqueName: \"kubernetes.io/configmap/9c645213-a3fd-4f35-9edd-60905873a559-registry-certificates\") pod \"image-registry-697d97f7c8-8jr7l\" (UID: \"9c645213-a3fd-4f35-9edd-60905873a559\") " pod="openshift-image-registry/image-registry-697d97f7c8-8jr7l" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.358453 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8n8g\" (UniqueName: \"kubernetes.io/projected/e0f00c28-2b6e-4127-a70e-43761e7cdb9e-kube-api-access-r8n8g\") pod \"ingress-operator-5b745b69d9-v5wjg\" (UID: \"e0f00c28-2b6e-4127-a70e-43761e7cdb9e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-v5wjg" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.358476 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ce579b07-073d-450d-b056-1be2c7bed20f-oauth-serving-cert\") pod \"console-f9d7485db-th28b\" (UID: \"ce579b07-073d-450d-b056-1be2c7bed20f\") " pod="openshift-console/console-f9d7485db-th28b" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.359971 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-565jz\" (UniqueName: \"kubernetes.io/projected/caebe48f-0ac0-436e-983b-6c5858472cf7-kube-api-access-565jz\") pod \"console-operator-58897d9998-jshhn\" (UID: \"caebe48f-0ac0-436e-983b-6c5858472cf7\") " pod="openshift-console-operator/console-operator-58897d9998-jshhn" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.360002 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/59b7fdd8-0d91-4442-a2a8-41c92d027266-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-4f588\" (UID: \"59b7fdd8-0d91-4442-a2a8-41c92d027266\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4f588" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.360042 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7r65\" (UniqueName: \"kubernetes.io/projected/a2404628-0f25-4889-8a15-73576dd41470-kube-api-access-w7r65\") pod \"dns-operator-744455d44c-w4btr\" (UID: \"a2404628-0f25-4889-8a15-73576dd41470\") " pod="openshift-dns-operator/dns-operator-744455d44c-w4btr" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.360065 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af840de7-db59-4020-a5c3-2d888069db1e-config\") pod \"kube-apiserver-operator-766d6c64bb-vrl25\" (UID: \"af840de7-db59-4020-a5c3-2d888069db1e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vrl25" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.361280 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qg92\" (UniqueName: \"kubernetes.io/projected/cdd6ce32-26c3-4202-860b-8b37ead2941c-kube-api-access-4qg92\") pod \"etcd-operator-b45778765-glpkv\" (UID: \"cdd6ce32-26c3-4202-860b-8b37ead2941c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-glpkv" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.361399 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8jr7l\" (UID: \"9c645213-a3fd-4f35-9edd-60905873a559\") " pod="openshift-image-registry/image-registry-697d97f7c8-8jr7l" Dec 01 08:19:19 crc kubenswrapper[5004]: E1201 08:19:19.362391 5004 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 08:19:19.862375097 +0000 UTC m=+137.427367079 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8jr7l" (UID: "9c645213-a3fd-4f35-9edd-60905873a559") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.362767 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/55711190-8e14-4951-9ac3-dc3675c3a86e-available-featuregates\") pod \"openshift-config-operator-7777fb866f-p9c2d\" (UID: \"55711190-8e14-4951-9ac3-dc3675c3a86e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-p9c2d" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.362967 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ce579b07-073d-450d-b056-1be2c7bed20f-console-oauth-config\") pod \"console-f9d7485db-th28b\" (UID: \"ce579b07-073d-450d-b056-1be2c7bed20f\") " pod="openshift-console/console-f9d7485db-th28b" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.363167 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9c645213-a3fd-4f35-9edd-60905873a559-installation-pull-secrets\") pod \"image-registry-697d97f7c8-8jr7l\" (UID: \"9c645213-a3fd-4f35-9edd-60905873a559\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-8jr7l" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.363419 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e0f00c28-2b6e-4127-a70e-43761e7cdb9e-trusted-ca\") pod \"ingress-operator-5b745b69d9-v5wjg\" (UID: \"e0f00c28-2b6e-4127-a70e-43761e7cdb9e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-v5wjg" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.363481 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9c645213-a3fd-4f35-9edd-60905873a559-ca-trust-extracted\") pod \"image-registry-697d97f7c8-8jr7l\" (UID: \"9c645213-a3fd-4f35-9edd-60905873a559\") " pod="openshift-image-registry/image-registry-697d97f7c8-8jr7l" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.363509 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/caebe48f-0ac0-436e-983b-6c5858472cf7-serving-cert\") pod \"console-operator-58897d9998-jshhn\" (UID: \"caebe48f-0ac0-436e-983b-6c5858472cf7\") " pod="openshift-console-operator/console-operator-58897d9998-jshhn" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.363616 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59b7fdd8-0d91-4442-a2a8-41c92d027266-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-4f588\" (UID: \"59b7fdd8-0d91-4442-a2a8-41c92d027266\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4f588" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.363658 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/caebe48f-0ac0-436e-983b-6c5858472cf7-trusted-ca\") pod \"console-operator-58897d9998-jshhn\" (UID: \"caebe48f-0ac0-436e-983b-6c5858472cf7\") " pod="openshift-console-operator/console-operator-58897d9998-jshhn" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.363717 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/cdd6ce32-26c3-4202-860b-8b37ead2941c-etcd-client\") pod \"etcd-operator-b45778765-glpkv\" (UID: \"cdd6ce32-26c3-4202-860b-8b37ead2941c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-glpkv" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.363844 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9c645213-a3fd-4f35-9edd-60905873a559-registry-tls\") pod \"image-registry-697d97f7c8-8jr7l\" (UID: \"9c645213-a3fd-4f35-9edd-60905873a559\") " pod="openshift-image-registry/image-registry-697d97f7c8-8jr7l" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.363869 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/cdd6ce32-26c3-4202-860b-8b37ead2941c-etcd-ca\") pod \"etcd-operator-b45778765-glpkv\" (UID: \"cdd6ce32-26c3-4202-860b-8b37ead2941c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-glpkv" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.364170 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/15dc5e3f-02c6-474d-bd7b-d51ce42340b3-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-p77t7\" (UID: \"15dc5e3f-02c6-474d-bd7b-d51ce42340b3\") " pod="openshift-authentication/oauth-openshift-558db77b4-p77t7" Dec 01 08:19:19 crc 
kubenswrapper[5004]: I1201 08:19:19.369413 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pkg9x"] Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.372989 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-kplt9"] Dec 01 08:19:19 crc kubenswrapper[5004]: W1201 08:19:19.386755 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod29bfa426_07b0_4acb_a886_9f9316644d71.slice/crio-f9c0f9f1f7161878d988dd81e64f3316f99e068588740fc919ff4c5c597059dc WatchSource:0}: Error finding container f9c0f9f1f7161878d988dd81e64f3316f99e068588740fc919ff4c5c597059dc: Status 404 returned error can't find the container with id f9c0f9f1f7161878d988dd81e64f3316f99e068588740fc919ff4c5c597059dc Dec 01 08:19:19 crc kubenswrapper[5004]: W1201 08:19:19.390260 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2daa468b_e9f8_41a3_ba94_a1e33093fc97.slice/crio-27d4fc4cebc65d31bc005cc2f42ae3f5789d81475e9168e6cb8e90743aa42365 WatchSource:0}: Error finding container 27d4fc4cebc65d31bc005cc2f42ae3f5789d81475e9168e6cb8e90743aa42365: Status 404 returned error can't find the container with id 27d4fc4cebc65d31bc005cc2f42ae3f5789d81475e9168e6cb8e90743aa42365 Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.398544 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-b9vjf"] Dec 01 08:19:19 crc kubenswrapper[5004]: W1201 08:19:19.415435 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc3ebf4d5_102a_4552_b30b_cbacb3a779fa.slice/crio-0542a3a753b9094a234321631acd4025173eabd56a7290f7bcb6024411fe3fcb WatchSource:0}: Error finding container 
0542a3a753b9094a234321631acd4025173eabd56a7290f7bcb6024411fe3fcb: Status 404 returned error can't find the container with id 0542a3a753b9094a234321631acd4025173eabd56a7290f7bcb6024411fe3fcb Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.464633 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 08:19:19 crc kubenswrapper[5004]: E1201 08:19:19.464774 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 08:19:19.964754219 +0000 UTC m=+137.529746201 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.464814 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e796fca9-e620-4e16-bda0-0e722b91b53c-config\") pod \"controller-manager-879f6c89f-zj88j\" (UID: \"e796fca9-e620-4e16-bda0-0e722b91b53c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zj88j" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.464835 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/ecf238f4-a4b7-45ab-8d1f-ff20327f375c-csi-data-dir\") pod \"csi-hostpathplugin-pt82b\" (UID: \"ecf238f4-a4b7-45ab-8d1f-ff20327f375c\") " pod="hostpath-provisioner/csi-hostpathplugin-pt82b" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.464851 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlrf7\" (UniqueName: \"kubernetes.io/projected/45170d05-984d-4bae-8f74-d7d7c60fffca-kube-api-access-rlrf7\") pod \"migrator-59844c95c7-cnkdl\" (UID: \"45170d05-984d-4bae-8f74-d7d7c60fffca\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-cnkdl" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.466699 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e796fca9-e620-4e16-bda0-0e722b91b53c-serving-cert\") pod \"controller-manager-879f6c89f-zj88j\" (UID: \"e796fca9-e620-4e16-bda0-0e722b91b53c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zj88j" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.466726 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/6047f6c2-4e66-4dde-b262-383c622eef04-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-829wj\" (UID: \"6047f6c2-4e66-4dde-b262-383c622eef04\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-829wj" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.466743 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e0f00c28-2b6e-4127-a70e-43761e7cdb9e-metrics-tls\") pod \"ingress-operator-5b745b69d9-v5wjg\" (UID: \"e0f00c28-2b6e-4127-a70e-43761e7cdb9e\") " 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-v5wjg" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.466758 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/2d0ddb3f-b8bc-420e-90ba-d45a29705615-images\") pod \"machine-config-operator-74547568cd-ft4b5\" (UID: \"2d0ddb3f-b8bc-420e-90ba-d45a29705615\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ft4b5" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.466774 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5ed7ffac-66ee-4b90-a582-ba697f8d87b9-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-njrts\" (UID: \"5ed7ffac-66ee-4b90-a582-ba697f8d87b9\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-njrts" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.466789 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3f0d517f-566f-404b-be4d-08adaea5926b-serving-cert\") pod \"service-ca-operator-777779d784-pbxqh\" (UID: \"3f0d517f-566f-404b-be4d-08adaea5926b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-pbxqh" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.466807 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/55711190-8e14-4951-9ac3-dc3675c3a86e-serving-cert\") pod \"openshift-config-operator-7777fb866f-p9c2d\" (UID: \"55711190-8e14-4951-9ac3-dc3675c3a86e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-p9c2d" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.466823 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-g2d4r\" (UniqueName: \"kubernetes.io/projected/c888afb0-ad29-42e2-ba4a-594f27ebbe4e-kube-api-access-g2d4r\") pod \"downloads-7954f5f757-qmztb\" (UID: \"c888afb0-ad29-42e2-ba4a-594f27ebbe4e\") " pod="openshift-console/downloads-7954f5f757-qmztb" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.466839 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xp57r\" (UniqueName: \"kubernetes.io/projected/1456ef72-76a5-4a0b-812b-7a0431444f47-kube-api-access-xp57r\") pod \"olm-operator-6b444d44fb-hr5n5\" (UID: \"1456ef72-76a5-4a0b-812b-7a0431444f47\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hr5n5" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.466854 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/beba9fed-710a-49a6-96ce-951ecb0a4a74-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-mrj44\" (UID: \"beba9fed-710a-49a6-96ce-951ecb0a4a74\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mrj44" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.466872 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8n8g\" (UniqueName: \"kubernetes.io/projected/e0f00c28-2b6e-4127-a70e-43761e7cdb9e-kube-api-access-r8n8g\") pod \"ingress-operator-5b745b69d9-v5wjg\" (UID: \"e0f00c28-2b6e-4127-a70e-43761e7cdb9e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-v5wjg" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.466887 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ce579b07-073d-450d-b056-1be2c7bed20f-oauth-serving-cert\") pod \"console-f9d7485db-th28b\" (UID: \"ce579b07-073d-450d-b056-1be2c7bed20f\") " pod="openshift-console/console-f9d7485db-th28b" Dec 01 08:19:19 crc 
kubenswrapper[5004]: I1201 08:19:19.466973 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lsxw\" (UniqueName: \"kubernetes.io/projected/ecf238f4-a4b7-45ab-8d1f-ff20327f375c-kube-api-access-4lsxw\") pod \"csi-hostpathplugin-pt82b\" (UID: \"ecf238f4-a4b7-45ab-8d1f-ff20327f375c\") " pod="hostpath-provisioner/csi-hostpathplugin-pt82b" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.466992 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/59b7fdd8-0d91-4442-a2a8-41c92d027266-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-4f588\" (UID: \"59b7fdd8-0d91-4442-a2a8-41c92d027266\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4f588" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.467010 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/1456ef72-76a5-4a0b-812b-7a0431444f47-srv-cert\") pod \"olm-operator-6b444d44fb-hr5n5\" (UID: \"1456ef72-76a5-4a0b-812b-7a0431444f47\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hr5n5" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.467025 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/9bba24e8-8799-4012-8d3a-7813ef29344e-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-rsfff\" (UID: \"9bba24e8-8799-4012-8d3a-7813ef29344e\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rsfff" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.467084 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7r65\" (UniqueName: 
\"kubernetes.io/projected/a2404628-0f25-4889-8a15-73576dd41470-kube-api-access-w7r65\") pod \"dns-operator-744455d44c-w4btr\" (UID: \"a2404628-0f25-4889-8a15-73576dd41470\") " pod="openshift-dns-operator/dns-operator-744455d44c-w4btr" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.467103 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af840de7-db59-4020-a5c3-2d888069db1e-config\") pod \"kube-apiserver-operator-766d6c64bb-vrl25\" (UID: \"af840de7-db59-4020-a5c3-2d888069db1e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vrl25" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.467121 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvxmf\" (UniqueName: \"kubernetes.io/projected/6453284d-a0de-451c-9132-d30f6fddc220-kube-api-access-bvxmf\") pod \"ingress-canary-dpnk8\" (UID: \"6453284d-a0de-451c-9132-d30f6fddc220\") " pod="openshift-ingress-canary/ingress-canary-dpnk8" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.467192 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8jr7l\" (UID: \"9c645213-a3fd-4f35-9edd-60905873a559\") " pod="openshift-image-registry/image-registry-697d97f7c8-8jr7l" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.467212 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5ed7ffac-66ee-4b90-a582-ba697f8d87b9-proxy-tls\") pod \"machine-config-controller-84d6567774-njrts\" (UID: \"5ed7ffac-66ee-4b90-a582-ba697f8d87b9\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-njrts" Dec 01 08:19:19 crc 
kubenswrapper[5004]: I1201 08:19:19.467411 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ce579b07-073d-450d-b056-1be2c7bed20f-console-oauth-config\") pod \"console-f9d7485db-th28b\" (UID: \"ce579b07-073d-450d-b056-1be2c7bed20f\") " pod="openshift-console/console-f9d7485db-th28b" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.467501 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7bhz\" (UniqueName: \"kubernetes.io/projected/4f397145-18ab-4b43-b133-cc42f45bc852-kube-api-access-l7bhz\") pod \"collect-profiles-29409615-5d85l\" (UID: \"4f397145-18ab-4b43-b133-cc42f45bc852\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409615-5d85l" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.467533 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9c645213-a3fd-4f35-9edd-60905873a559-installation-pull-secrets\") pod \"image-registry-697d97f7c8-8jr7l\" (UID: \"9c645213-a3fd-4f35-9edd-60905873a559\") " pod="openshift-image-registry/image-registry-697d97f7c8-8jr7l" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.467605 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/1456ef72-76a5-4a0b-812b-7a0431444f47-profile-collector-cert\") pod \"olm-operator-6b444d44fb-hr5n5\" (UID: \"1456ef72-76a5-4a0b-812b-7a0431444f47\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hr5n5" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.467637 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e0f00c28-2b6e-4127-a70e-43761e7cdb9e-trusted-ca\") pod 
\"ingress-operator-5b745b69d9-v5wjg\" (UID: \"e0f00c28-2b6e-4127-a70e-43761e7cdb9e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-v5wjg" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.467660 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/59a2dd51-8b1b-4437-a2f7-591bbf1890e8-node-bootstrap-token\") pod \"machine-config-server-c6929\" (UID: \"59a2dd51-8b1b-4437-a2f7-591bbf1890e8\") " pod="openshift-machine-config-operator/machine-config-server-c6929" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.467679 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/beba9fed-710a-49a6-96ce-951ecb0a4a74-etcd-client\") pod \"apiserver-7bbb656c7d-mrj44\" (UID: \"beba9fed-710a-49a6-96ce-951ecb0a4a74\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mrj44" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.467703 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0bd0713c-01be-44a0-ab66-51056ba04719-config-volume\") pod \"dns-default-qwgn4\" (UID: \"0bd0713c-01be-44a0-ab66-51056ba04719\") " pod="openshift-dns/dns-default-qwgn4" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.467730 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9c645213-a3fd-4f35-9edd-60905873a559-ca-trust-extracted\") pod \"image-registry-697d97f7c8-8jr7l\" (UID: \"9c645213-a3fd-4f35-9edd-60905873a559\") " pod="openshift-image-registry/image-registry-697d97f7c8-8jr7l" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.467755 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krzrl\" 
(UniqueName: \"kubernetes.io/projected/7a169e83-de91-4038-95b3-aa57f9b50861-kube-api-access-krzrl\") pod \"control-plane-machine-set-operator-78cbb6b69f-nsn58\" (UID: \"7a169e83-de91-4038-95b3-aa57f9b50861\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nsn58" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.467780 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c7f8a0cb-a369-4f34-b131-2023a72f1abb-apiservice-cert\") pod \"packageserver-d55dfcdfc-f2m5b\" (UID: \"c7f8a0cb-a369-4f34-b131-2023a72f1abb\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-f2m5b" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.467804 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/59a2dd51-8b1b-4437-a2f7-591bbf1890e8-certs\") pod \"machine-config-server-c6929\" (UID: \"59a2dd51-8b1b-4437-a2f7-591bbf1890e8\") " pod="openshift-machine-config-operator/machine-config-server-c6929" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.467829 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/cdd6ce32-26c3-4202-860b-8b37ead2941c-etcd-client\") pod \"etcd-operator-b45778765-glpkv\" (UID: \"cdd6ce32-26c3-4202-860b-8b37ead2941c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-glpkv" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.467860 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/15dc5e3f-02c6-474d-bd7b-d51ce42340b3-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-p77t7\" (UID: \"15dc5e3f-02c6-474d-bd7b-d51ce42340b3\") " pod="openshift-authentication/oauth-openshift-558db77b4-p77t7" Dec 01 08:19:19 crc 
kubenswrapper[5004]: I1201 08:19:19.467887 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cdd6ce32-26c3-4202-860b-8b37ead2941c-config\") pod \"etcd-operator-b45778765-glpkv\" (UID: \"cdd6ce32-26c3-4202-860b-8b37ead2941c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-glpkv" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.467911 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f0d517f-566f-404b-be4d-08adaea5926b-config\") pod \"service-ca-operator-777779d784-pbxqh\" (UID: \"3f0d517f-566f-404b-be4d-08adaea5926b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-pbxqh" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.467938 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/ecf238f4-a4b7-45ab-8d1f-ff20327f375c-socket-dir\") pod \"csi-hostpathplugin-pt82b\" (UID: \"ecf238f4-a4b7-45ab-8d1f-ff20327f375c\") " pod="hostpath-provisioner/csi-hostpathplugin-pt82b" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.467964 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9c645213-a3fd-4f35-9edd-60905873a559-bound-sa-token\") pod \"image-registry-697d97f7c8-8jr7l\" (UID: \"9c645213-a3fd-4f35-9edd-60905873a559\") " pod="openshift-image-registry/image-registry-697d97f7c8-8jr7l" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.467987 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ecf238f4-a4b7-45ab-8d1f-ff20327f375c-registration-dir\") pod \"csi-hostpathplugin-pt82b\" (UID: \"ecf238f4-a4b7-45ab-8d1f-ff20327f375c\") " 
pod="hostpath-provisioner/csi-hostpathplugin-pt82b" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.468015 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/15dc5e3f-02c6-474d-bd7b-d51ce42340b3-audit-policies\") pod \"oauth-openshift-558db77b4-p77t7\" (UID: \"15dc5e3f-02c6-474d-bd7b-d51ce42340b3\") " pod="openshift-authentication/oauth-openshift-558db77b4-p77t7" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.468040 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4f397145-18ab-4b43-b133-cc42f45bc852-secret-volume\") pod \"collect-profiles-29409615-5d85l\" (UID: \"4f397145-18ab-4b43-b133-cc42f45bc852\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409615-5d85l" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.468065 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjphf\" (UniqueName: \"kubernetes.io/projected/5ed7ffac-66ee-4b90-a582-ba697f8d87b9-kube-api-access-tjphf\") pod \"machine-config-controller-84d6567774-njrts\" (UID: \"5ed7ffac-66ee-4b90-a582-ba697f8d87b9\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-njrts" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.468090 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/39b4186d-328d-4fc5-a106-50e351a34f90-srv-cert\") pod \"catalog-operator-68c6474976-jdc97\" (UID: \"39b4186d-328d-4fc5-a106-50e351a34f90\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jdc97" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.468116 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5hh7c\" (UniqueName: 
\"kubernetes.io/projected/6047f6c2-4e66-4dde-b262-383c622eef04-kube-api-access-5hh7c\") pod \"cluster-image-registry-operator-dc59b4c8b-829wj\" (UID: \"6047f6c2-4e66-4dde-b262-383c622eef04\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-829wj" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.468141 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/15dc5e3f-02c6-474d-bd7b-d51ce42340b3-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-p77t7\" (UID: \"15dc5e3f-02c6-474d-bd7b-d51ce42340b3\") " pod="openshift-authentication/oauth-openshift-558db77b4-p77t7" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.468166 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aa8bb474-21e8-42b3-a2c6-81ef7d267d9d-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-h6qmw\" (UID: \"aa8bb474-21e8-42b3-a2c6-81ef7d267d9d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-h6qmw" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.468188 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/caebe48f-0ac0-436e-983b-6c5858472cf7-config\") pod \"console-operator-58897d9998-jshhn\" (UID: \"caebe48f-0ac0-436e-983b-6c5858472cf7\") " pod="openshift-console-operator/console-operator-58897d9998-jshhn" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.468226 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ce579b07-073d-450d-b056-1be2c7bed20f-oauth-serving-cert\") pod \"console-f9d7485db-th28b\" (UID: \"ce579b07-073d-450d-b056-1be2c7bed20f\") " pod="openshift-console/console-f9d7485db-th28b" Dec 01 08:19:19 crc kubenswrapper[5004]: 
I1201 08:19:19.468212 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b3ef8088-fc82-4ce3-9c1c-662e380e0587-service-ca-bundle\") pod \"router-default-5444994796-6x76r\" (UID: \"b3ef8088-fc82-4ce3-9c1c-662e380e0587\") " pod="openshift-ingress/router-default-5444994796-6x76r" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.468379 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ce579b07-073d-450d-b056-1be2c7bed20f-trusted-ca-bundle\") pod \"console-f9d7485db-th28b\" (UID: \"ce579b07-073d-450d-b056-1be2c7bed20f\") " pod="openshift-console/console-f9d7485db-th28b" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.468406 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ca76c4ad-59c4-4861-9279-4f8107524e44-signing-key\") pod \"service-ca-9c57cc56f-rlzws\" (UID: \"ca76c4ad-59c4-4861-9279-4f8107524e44\") " pod="openshift-service-ca/service-ca-9c57cc56f-rlzws" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.468429 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/beba9fed-710a-49a6-96ce-951ecb0a4a74-serving-cert\") pod \"apiserver-7bbb656c7d-mrj44\" (UID: \"beba9fed-710a-49a6-96ce-951ecb0a4a74\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mrj44" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.468454 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e0f00c28-2b6e-4127-a70e-43761e7cdb9e-bound-sa-token\") pod \"ingress-operator-5b745b69d9-v5wjg\" (UID: \"e0f00c28-2b6e-4127-a70e-43761e7cdb9e\") " 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-v5wjg" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.468478 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/15dc5e3f-02c6-474d-bd7b-d51ce42340b3-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-p77t7\" (UID: \"15dc5e3f-02c6-474d-bd7b-d51ce42340b3\") " pod="openshift-authentication/oauth-openshift-558db77b4-p77t7" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.468500 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/b3ef8088-fc82-4ce3-9c1c-662e380e0587-stats-auth\") pod \"router-default-5444994796-6x76r\" (UID: \"b3ef8088-fc82-4ce3-9c1c-662e380e0587\") " pod="openshift-ingress/router-default-5444994796-6x76r" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.468869 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/beba9fed-710a-49a6-96ce-951ecb0a4a74-encryption-config\") pod \"apiserver-7bbb656c7d-mrj44\" (UID: \"beba9fed-710a-49a6-96ce-951ecb0a4a74\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mrj44" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.468895 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6047f6c2-4e66-4dde-b262-383c622eef04-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-829wj\" (UID: \"6047f6c2-4e66-4dde-b262-383c622eef04\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-829wj" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.468914 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/ce579b07-073d-450d-b056-1be2c7bed20f-console-serving-cert\") pod \"console-f9d7485db-th28b\" (UID: \"ce579b07-073d-450d-b056-1be2c7bed20f\") " pod="openshift-console/console-f9d7485db-th28b" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.468934 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a2404628-0f25-4889-8a15-73576dd41470-metrics-tls\") pod \"dns-operator-744455d44c-w4btr\" (UID: \"a2404628-0f25-4889-8a15-73576dd41470\") " pod="openshift-dns-operator/dns-operator-744455d44c-w4btr" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.468953 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b3ef8088-fc82-4ce3-9c1c-662e380e0587-metrics-certs\") pod \"router-default-5444994796-6x76r\" (UID: \"b3ef8088-fc82-4ce3-9c1c-662e380e0587\") " pod="openshift-ingress/router-default-5444994796-6x76r" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.468975 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfvvw\" (UniqueName: \"kubernetes.io/projected/9bba24e8-8799-4012-8d3a-7813ef29344e-kube-api-access-dfvvw\") pod \"package-server-manager-789f6589d5-rsfff\" (UID: \"9bba24e8-8799-4012-8d3a-7813ef29344e\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rsfff" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.469000 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/15dc5e3f-02c6-474d-bd7b-d51ce42340b3-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-p77t7\" (UID: \"15dc5e3f-02c6-474d-bd7b-d51ce42340b3\") " pod="openshift-authentication/oauth-openshift-558db77b4-p77t7" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 
08:19:19.469021 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9c645213-a3fd-4f35-9edd-60905873a559-trusted-ca\") pod \"image-registry-697d97f7c8-8jr7l\" (UID: \"9c645213-a3fd-4f35-9edd-60905873a559\") " pod="openshift-image-registry/image-registry-697d97f7c8-8jr7l" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.469041 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l95tm\" (UniqueName: \"kubernetes.io/projected/3f0d517f-566f-404b-be4d-08adaea5926b-kube-api-access-l95tm\") pod \"service-ca-operator-777779d784-pbxqh\" (UID: \"3f0d517f-566f-404b-be4d-08adaea5926b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-pbxqh" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.469059 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/b3ef8088-fc82-4ce3-9c1c-662e380e0587-default-certificate\") pod \"router-default-5444994796-6x76r\" (UID: \"b3ef8088-fc82-4ce3-9c1c-662e380e0587\") " pod="openshift-ingress/router-default-5444994796-6x76r" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.469077 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/8a9f98dc-e84b-4fb8-9d4d-69c766486ebb-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-8g8bf\" (UID: \"8a9f98dc-e84b-4fb8-9d4d-69c766486ebb\") " pod="openshift-marketplace/marketplace-operator-79b997595-8g8bf" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.469111 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa8bb474-21e8-42b3-a2c6-81ef7d267d9d-config\") pod 
\"kube-controller-manager-operator-78b949d7b-h6qmw\" (UID: \"aa8bb474-21e8-42b3-a2c6-81ef7d267d9d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-h6qmw" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.469131 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/15dc5e3f-02c6-474d-bd7b-d51ce42340b3-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-p77t7\" (UID: \"15dc5e3f-02c6-474d-bd7b-d51ce42340b3\") " pod="openshift-authentication/oauth-openshift-558db77b4-p77t7" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.469150 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/c7f8a0cb-a369-4f34-b131-2023a72f1abb-tmpfs\") pod \"packageserver-d55dfcdfc-f2m5b\" (UID: \"c7f8a0cb-a369-4f34-b131-2023a72f1abb\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-f2m5b" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.469170 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lv9xz\" (UniqueName: \"kubernetes.io/projected/ce579b07-073d-450d-b056-1be2c7bed20f-kube-api-access-lv9xz\") pod \"console-f9d7485db-th28b\" (UID: \"ce579b07-073d-450d-b056-1be2c7bed20f\") " pod="openshift-console/console-f9d7485db-th28b" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.469189 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/15dc5e3f-02c6-474d-bd7b-d51ce42340b3-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-p77t7\" (UID: \"15dc5e3f-02c6-474d-bd7b-d51ce42340b3\") " pod="openshift-authentication/oauth-openshift-558db77b4-p77t7" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.469207 5004 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6453284d-a0de-451c-9132-d30f6fddc220-cert\") pod \"ingress-canary-dpnk8\" (UID: \"6453284d-a0de-451c-9132-d30f6fddc220\") " pod="openshift-ingress-canary/ingress-canary-dpnk8" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.469233 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/15dc5e3f-02c6-474d-bd7b-d51ce42340b3-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-p77t7\" (UID: \"15dc5e3f-02c6-474d-bd7b-d51ce42340b3\") " pod="openshift-authentication/oauth-openshift-558db77b4-p77t7" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.469256 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/af840de7-db59-4020-a5c3-2d888069db1e-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-vrl25\" (UID: \"af840de7-db59-4020-a5c3-2d888069db1e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vrl25" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.469276 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ca76c4ad-59c4-4861-9279-4f8107524e44-signing-cabundle\") pod \"service-ca-9c57cc56f-rlzws\" (UID: \"ca76c4ad-59c4-4861-9279-4f8107524e44\") " pod="openshift-service-ca/service-ca-9c57cc56f-rlzws" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.469298 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bv8b7\" (UniqueName: \"kubernetes.io/projected/9c645213-a3fd-4f35-9edd-60905873a559-kube-api-access-bv8b7\") pod \"image-registry-697d97f7c8-8jr7l\" (UID: \"9c645213-a3fd-4f35-9edd-60905873a559\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-8jr7l" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.469317 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ce579b07-073d-450d-b056-1be2c7bed20f-service-ca\") pod \"console-f9d7485db-th28b\" (UID: \"ce579b07-073d-450d-b056-1be2c7bed20f\") " pod="openshift-console/console-f9d7485db-th28b" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.469338 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/15dc5e3f-02c6-474d-bd7b-d51ce42340b3-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-p77t7\" (UID: \"15dc5e3f-02c6-474d-bd7b-d51ce42340b3\") " pod="openshift-authentication/oauth-openshift-558db77b4-p77t7" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.469362 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9c645213-a3fd-4f35-9edd-60905873a559-registry-certificates\") pod \"image-registry-697d97f7c8-8jr7l\" (UID: \"9c645213-a3fd-4f35-9edd-60905873a559\") " pod="openshift-image-registry/image-registry-697d97f7c8-8jr7l" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.469388 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-565jz\" (UniqueName: \"kubernetes.io/projected/caebe48f-0ac0-436e-983b-6c5858472cf7-kube-api-access-565jz\") pod \"console-operator-58897d9998-jshhn\" (UID: \"caebe48f-0ac0-436e-983b-6c5858472cf7\") " pod="openshift-console-operator/console-operator-58897d9998-jshhn" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.469412 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4qg92\" (UniqueName: 
\"kubernetes.io/projected/cdd6ce32-26c3-4202-860b-8b37ead2941c-kube-api-access-4qg92\") pod \"etcd-operator-b45778765-glpkv\" (UID: \"cdd6ce32-26c3-4202-860b-8b37ead2941c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-glpkv" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.469434 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4f397145-18ab-4b43-b133-cc42f45bc852-config-volume\") pod \"collect-profiles-29409615-5d85l\" (UID: \"4f397145-18ab-4b43-b133-cc42f45bc852\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409615-5d85l" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.469457 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/55711190-8e14-4951-9ac3-dc3675c3a86e-available-featuregates\") pod \"openshift-config-operator-7777fb866f-p9c2d\" (UID: \"55711190-8e14-4951-9ac3-dc3675c3a86e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-p9c2d" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.469478 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbs6r\" (UniqueName: \"kubernetes.io/projected/39b4186d-328d-4fc5-a106-50e351a34f90-kube-api-access-kbs6r\") pod \"catalog-operator-68c6474976-jdc97\" (UID: \"39b4186d-328d-4fc5-a106-50e351a34f90\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jdc97" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.469497 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ff6n5\" (UniqueName: \"kubernetes.io/projected/beba9fed-710a-49a6-96ce-951ecb0a4a74-kube-api-access-ff6n5\") pod \"apiserver-7bbb656c7d-mrj44\" (UID: \"beba9fed-710a-49a6-96ce-951ecb0a4a74\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mrj44" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.469516 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8wh9\" (UniqueName: \"kubernetes.io/projected/8a9f98dc-e84b-4fb8-9d4d-69c766486ebb-kube-api-access-w8wh9\") pod \"marketplace-operator-79b997595-8g8bf\" (UID: \"8a9f98dc-e84b-4fb8-9d4d-69c766486ebb\") " pod="openshift-marketplace/marketplace-operator-79b997595-8g8bf" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.469542 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8a9f98dc-e84b-4fb8-9d4d-69c766486ebb-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-8g8bf\" (UID: \"8a9f98dc-e84b-4fb8-9d4d-69c766486ebb\") " pod="openshift-marketplace/marketplace-operator-79b997595-8g8bf" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.469580 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0bd0713c-01be-44a0-ab66-51056ba04719-metrics-tls\") pod \"dns-default-qwgn4\" (UID: \"0bd0713c-01be-44a0-ab66-51056ba04719\") " pod="openshift-dns/dns-default-qwgn4" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.469607 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/caebe48f-0ac0-436e-983b-6c5858472cf7-serving-cert\") pod \"console-operator-58897d9998-jshhn\" (UID: \"caebe48f-0ac0-436e-983b-6c5858472cf7\") " pod="openshift-console-operator/console-operator-58897d9998-jshhn" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.469644 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59b7fdd8-0d91-4442-a2a8-41c92d027266-config\") pod 
\"openshift-controller-manager-operator-756b6f6bc6-4f588\" (UID: \"59b7fdd8-0d91-4442-a2a8-41c92d027266\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4f588" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.469669 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/caebe48f-0ac0-436e-983b-6c5858472cf7-trusted-ca\") pod \"console-operator-58897d9998-jshhn\" (UID: \"caebe48f-0ac0-436e-983b-6c5858472cf7\") " pod="openshift-console-operator/console-operator-58897d9998-jshhn" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.469694 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rv442\" (UniqueName: \"kubernetes.io/projected/e796fca9-e620-4e16-bda0-0e722b91b53c-kube-api-access-rv442\") pod \"controller-manager-879f6c89f-zj88j\" (UID: \"e796fca9-e620-4e16-bda0-0e722b91b53c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zj88j" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.469718 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxd4p\" (UniqueName: \"kubernetes.io/projected/c7f8a0cb-a369-4f34-b131-2023a72f1abb-kube-api-access-dxd4p\") pod \"packageserver-d55dfcdfc-f2m5b\" (UID: \"c7f8a0cb-a369-4f34-b131-2023a72f1abb\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-f2m5b" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.469740 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/39b4186d-328d-4fc5-a106-50e351a34f90-profile-collector-cert\") pod \"catalog-operator-68c6474976-jdc97\" (UID: \"39b4186d-328d-4fc5-a106-50e351a34f90\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jdc97" Dec 01 08:19:19 crc 
kubenswrapper[5004]: I1201 08:19:19.469760 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/beba9fed-710a-49a6-96ce-951ecb0a4a74-audit-policies\") pod \"apiserver-7bbb656c7d-mrj44\" (UID: \"beba9fed-710a-49a6-96ce-951ecb0a4a74\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mrj44" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.469782 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9c645213-a3fd-4f35-9edd-60905873a559-registry-tls\") pod \"image-registry-697d97f7c8-8jr7l\" (UID: \"9c645213-a3fd-4f35-9edd-60905873a559\") " pod="openshift-image-registry/image-registry-697d97f7c8-8jr7l" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.469803 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/cdd6ce32-26c3-4202-860b-8b37ead2941c-etcd-ca\") pod \"etcd-operator-b45778765-glpkv\" (UID: \"cdd6ce32-26c3-4202-860b-8b37ead2941c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-glpkv" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.469824 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6j5h\" (UniqueName: \"kubernetes.io/projected/2d0ddb3f-b8bc-420e-90ba-d45a29705615-kube-api-access-p6j5h\") pod \"machine-config-operator-74547568cd-ft4b5\" (UID: \"2d0ddb3f-b8bc-420e-90ba-d45a29705615\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ft4b5" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.469845 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6aedcff-a4a2-4265-9c4c-9a3aee8b9377-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-44tqp\" (UID: 
\"a6aedcff-a4a2-4265-9c4c-9a3aee8b9377\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-44tqp" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.469865 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/29f791b6-e03f-4159-85ee-783401ccf7e1-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-8z2fx\" (UID: \"29f791b6-e03f-4159-85ee-783401ccf7e1\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-8z2fx" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.469931 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c7f8a0cb-a369-4f34-b131-2023a72f1abb-webhook-cert\") pod \"packageserver-d55dfcdfc-f2m5b\" (UID: \"c7f8a0cb-a369-4f34-b131-2023a72f1abb\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-f2m5b" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.469983 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/15dc5e3f-02c6-474d-bd7b-d51ce42340b3-audit-dir\") pod \"oauth-openshift-558db77b4-p77t7\" (UID: \"15dc5e3f-02c6-474d-bd7b-d51ce42340b3\") " pod="openshift-authentication/oauth-openshift-558db77b4-p77t7" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.470006 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqlv7\" (UniqueName: \"kubernetes.io/projected/ca76c4ad-59c4-4861-9279-4f8107524e44-kube-api-access-jqlv7\") pod \"service-ca-9c57cc56f-rlzws\" (UID: \"ca76c4ad-59c4-4861-9279-4f8107524e44\") " pod="openshift-service-ca/service-ca-9c57cc56f-rlzws" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.470029 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/15dc5e3f-02c6-474d-bd7b-d51ce42340b3-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-p77t7\" (UID: \"15dc5e3f-02c6-474d-bd7b-d51ce42340b3\") " pod="openshift-authentication/oauth-openshift-558db77b4-p77t7" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.470050 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hc6xg\" (UniqueName: \"kubernetes.io/projected/15dc5e3f-02c6-474d-bd7b-d51ce42340b3-kube-api-access-hc6xg\") pod \"oauth-openshift-558db77b4-p77t7\" (UID: \"15dc5e3f-02c6-474d-bd7b-d51ce42340b3\") " pod="openshift-authentication/oauth-openshift-558db77b4-p77t7" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.470069 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzqqc\" (UniqueName: \"kubernetes.io/projected/59a2dd51-8b1b-4437-a2f7-591bbf1890e8-kube-api-access-pzqqc\") pod \"machine-config-server-c6929\" (UID: \"59a2dd51-8b1b-4437-a2f7-591bbf1890e8\") " pod="openshift-machine-config-operator/machine-config-server-c6929" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.470088 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dgcd\" (UniqueName: \"kubernetes.io/projected/a6aedcff-a4a2-4265-9c4c-9a3aee8b9377-kube-api-access-4dgcd\") pod \"kube-storage-version-migrator-operator-b67b599dd-44tqp\" (UID: \"a6aedcff-a4a2-4265-9c4c-9a3aee8b9377\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-44tqp" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.470110 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/15dc5e3f-02c6-474d-bd7b-d51ce42340b3-v4-0-config-system-router-certs\") pod 
\"oauth-openshift-558db77b4-p77t7\" (UID: \"15dc5e3f-02c6-474d-bd7b-d51ce42340b3\") " pod="openshift-authentication/oauth-openshift-558db77b4-p77t7" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.470141 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/7a169e83-de91-4038-95b3-aa57f9b50861-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-nsn58\" (UID: \"7a169e83-de91-4038-95b3-aa57f9b50861\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nsn58" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.470165 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7vhw\" (UniqueName: \"kubernetes.io/projected/b3ef8088-fc82-4ce3-9c1c-662e380e0587-kube-api-access-c7vhw\") pod \"router-default-5444994796-6x76r\" (UID: \"b3ef8088-fc82-4ce3-9c1c-662e380e0587\") " pod="openshift-ingress/router-default-5444994796-6x76r" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.470194 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cdd6ce32-26c3-4202-860b-8b37ead2941c-serving-cert\") pod \"etcd-operator-b45778765-glpkv\" (UID: \"cdd6ce32-26c3-4202-860b-8b37ead2941c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-glpkv" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.470213 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e796fca9-e620-4e16-bda0-0e722b91b53c-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-zj88j\" (UID: \"e796fca9-e620-4e16-bda0-0e722b91b53c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zj88j" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.470233 
5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/ecf238f4-a4b7-45ab-8d1f-ff20327f375c-plugins-dir\") pod \"csi-hostpathplugin-pt82b\" (UID: \"ecf238f4-a4b7-45ab-8d1f-ff20327f375c\") " pod="hostpath-provisioner/csi-hostpathplugin-pt82b" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.470252 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ck2qv\" (UniqueName: \"kubernetes.io/projected/0bd0713c-01be-44a0-ab66-51056ba04719-kube-api-access-ck2qv\") pod \"dns-default-qwgn4\" (UID: \"0bd0713c-01be-44a0-ab66-51056ba04719\") " pod="openshift-dns/dns-default-qwgn4" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.470275 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/cdd6ce32-26c3-4202-860b-8b37ead2941c-etcd-service-ca\") pod \"etcd-operator-b45778765-glpkv\" (UID: \"cdd6ce32-26c3-4202-860b-8b37ead2941c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-glpkv" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.470296 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mh5lv\" (UniqueName: \"kubernetes.io/projected/59b7fdd8-0d91-4442-a2a8-41c92d027266-kube-api-access-mh5lv\") pod \"openshift-controller-manager-operator-756b6f6bc6-4f588\" (UID: \"59b7fdd8-0d91-4442-a2a8-41c92d027266\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4f588" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.470319 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/af840de7-db59-4020-a5c3-2d888069db1e-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-vrl25\" (UID: \"af840de7-db59-4020-a5c3-2d888069db1e\") " 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vrl25" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.470343 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2d0ddb3f-b8bc-420e-90ba-d45a29705615-auth-proxy-config\") pod \"machine-config-operator-74547568cd-ft4b5\" (UID: \"2d0ddb3f-b8bc-420e-90ba-d45a29705615\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ft4b5" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.470367 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtz65\" (UniqueName: \"kubernetes.io/projected/29f791b6-e03f-4159-85ee-783401ccf7e1-kube-api-access-mtz65\") pod \"multus-admission-controller-857f4d67dd-8z2fx\" (UID: \"29f791b6-e03f-4159-85ee-783401ccf7e1\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-8z2fx" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.470391 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ce579b07-073d-450d-b056-1be2c7bed20f-console-config\") pod \"console-f9d7485db-th28b\" (UID: \"ce579b07-073d-450d-b056-1be2c7bed20f\") " pod="openshift-console/console-f9d7485db-th28b" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.470413 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e796fca9-e620-4e16-bda0-0e722b91b53c-client-ca\") pod \"controller-manager-879f6c89f-zj88j\" (UID: \"e796fca9-e620-4e16-bda0-0e722b91b53c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zj88j" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.470474 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/6047f6c2-4e66-4dde-b262-383c622eef04-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-829wj\" (UID: \"6047f6c2-4e66-4dde-b262-383c622eef04\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-829wj" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.470497 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/beba9fed-710a-49a6-96ce-951ecb0a4a74-audit-dir\") pod \"apiserver-7bbb656c7d-mrj44\" (UID: \"beba9fed-710a-49a6-96ce-951ecb0a4a74\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mrj44" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.470521 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2d0ddb3f-b8bc-420e-90ba-d45a29705615-proxy-tls\") pod \"machine-config-operator-74547568cd-ft4b5\" (UID: \"2d0ddb3f-b8bc-420e-90ba-d45a29705615\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ft4b5" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.470544 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/ecf238f4-a4b7-45ab-8d1f-ff20327f375c-mountpoint-dir\") pod \"csi-hostpathplugin-pt82b\" (UID: \"ecf238f4-a4b7-45ab-8d1f-ff20327f375c\") " pod="hostpath-provisioner/csi-hostpathplugin-pt82b" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.470583 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a6aedcff-a4a2-4265-9c4c-9a3aee8b9377-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-44tqp\" (UID: \"a6aedcff-a4a2-4265-9c4c-9a3aee8b9377\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-44tqp" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.470609 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/beba9fed-710a-49a6-96ce-951ecb0a4a74-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-mrj44\" (UID: \"beba9fed-710a-49a6-96ce-951ecb0a4a74\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mrj44" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.470641 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/aa8bb474-21e8-42b3-a2c6-81ef7d267d9d-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-h6qmw\" (UID: \"aa8bb474-21e8-42b3-a2c6-81ef7d267d9d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-h6qmw" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.470675 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-plrp2\" (UniqueName: \"kubernetes.io/projected/55711190-8e14-4951-9ac3-dc3675c3a86e-kube-api-access-plrp2\") pod \"openshift-config-operator-7777fb866f-p9c2d\" (UID: \"55711190-8e14-4951-9ac3-dc3675c3a86e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-p9c2d" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.470705 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/15dc5e3f-02c6-474d-bd7b-d51ce42340b3-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-p77t7\" (UID: \"15dc5e3f-02c6-474d-bd7b-d51ce42340b3\") " pod="openshift-authentication/oauth-openshift-558db77b4-p77t7" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.471429 5004 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9c645213-a3fd-4f35-9edd-60905873a559-ca-trust-extracted\") pod \"image-registry-697d97f7c8-8jr7l\" (UID: \"9c645213-a3fd-4f35-9edd-60905873a559\") " pod="openshift-image-registry/image-registry-697d97f7c8-8jr7l" Dec 01 08:19:19 crc kubenswrapper[5004]: E1201 08:19:19.474708 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 08:19:19.974690249 +0000 UTC m=+137.539682231 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8jr7l" (UID: "9c645213-a3fd-4f35-9edd-60905873a559") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.475013 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af840de7-db59-4020-a5c3-2d888069db1e-config\") pod \"kube-apiserver-operator-766d6c64bb-vrl25\" (UID: \"af840de7-db59-4020-a5c3-2d888069db1e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vrl25" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.476183 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59b7fdd8-0d91-4442-a2a8-41c92d027266-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-4f588\" (UID: \"59b7fdd8-0d91-4442-a2a8-41c92d027266\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4f588" Dec 01 08:19:19 crc 
kubenswrapper[5004]: I1201 08:19:19.476226 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/15dc5e3f-02c6-474d-bd7b-d51ce42340b3-audit-dir\") pod \"oauth-openshift-558db77b4-p77t7\" (UID: \"15dc5e3f-02c6-474d-bd7b-d51ce42340b3\") " pod="openshift-authentication/oauth-openshift-558db77b4-p77t7"
Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.476359 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e0f00c28-2b6e-4127-a70e-43761e7cdb9e-trusted-ca\") pod \"ingress-operator-5b745b69d9-v5wjg\" (UID: \"e0f00c28-2b6e-4127-a70e-43761e7cdb9e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-v5wjg"
Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.478910 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aa8bb474-21e8-42b3-a2c6-81ef7d267d9d-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-h6qmw\" (UID: \"aa8bb474-21e8-42b3-a2c6-81ef7d267d9d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-h6qmw"
Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.479547 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/cdd6ce32-26c3-4202-860b-8b37ead2941c-etcd-service-ca\") pod \"etcd-operator-b45778765-glpkv\" (UID: \"cdd6ce32-26c3-4202-860b-8b37ead2941c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-glpkv"
Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.479726 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/caebe48f-0ac0-436e-983b-6c5858472cf7-config\") pod \"console-operator-58897d9998-jshhn\" (UID: \"caebe48f-0ac0-436e-983b-6c5858472cf7\") " pod="openshift-console-operator/console-operator-58897d9998-jshhn"
Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.480167 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/15dc5e3f-02c6-474d-bd7b-d51ce42340b3-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-p77t7\" (UID: \"15dc5e3f-02c6-474d-bd7b-d51ce42340b3\") " pod="openshift-authentication/oauth-openshift-558db77b4-p77t7"
Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.481837 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/15dc5e3f-02c6-474d-bd7b-d51ce42340b3-audit-policies\") pod \"oauth-openshift-558db77b4-p77t7\" (UID: \"15dc5e3f-02c6-474d-bd7b-d51ce42340b3\") " pod="openshift-authentication/oauth-openshift-558db77b4-p77t7"
Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.482258 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cdd6ce32-26c3-4202-860b-8b37ead2941c-config\") pod \"etcd-operator-b45778765-glpkv\" (UID: \"cdd6ce32-26c3-4202-860b-8b37ead2941c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-glpkv"
Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.482612 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/15dc5e3f-02c6-474d-bd7b-d51ce42340b3-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-p77t7\" (UID: \"15dc5e3f-02c6-474d-bd7b-d51ce42340b3\") " pod="openshift-authentication/oauth-openshift-558db77b4-p77t7"
Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.483106 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/15dc5e3f-02c6-474d-bd7b-d51ce42340b3-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-p77t7\" (UID: \"15dc5e3f-02c6-474d-bd7b-d51ce42340b3\") " pod="openshift-authentication/oauth-openshift-558db77b4-p77t7"
Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.483461 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ce579b07-073d-450d-b056-1be2c7bed20f-trusted-ca-bundle\") pod \"console-f9d7485db-th28b\" (UID: \"ce579b07-073d-450d-b056-1be2c7bed20f\") " pod="openshift-console/console-f9d7485db-th28b"
Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.484401 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/caebe48f-0ac0-436e-983b-6c5858472cf7-trusted-ca\") pod \"console-operator-58897d9998-jshhn\" (UID: \"caebe48f-0ac0-436e-983b-6c5858472cf7\") " pod="openshift-console-operator/console-operator-58897d9998-jshhn"
Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.484628 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a2404628-0f25-4889-8a15-73576dd41470-metrics-tls\") pod \"dns-operator-744455d44c-w4btr\" (UID: \"a2404628-0f25-4889-8a15-73576dd41470\") " pod="openshift-dns-operator/dns-operator-744455d44c-w4btr"
Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.484705 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/55711190-8e14-4951-9ac3-dc3675c3a86e-available-featuregates\") pod \"openshift-config-operator-7777fb866f-p9c2d\" (UID: \"55711190-8e14-4951-9ac3-dc3675c3a86e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-p9c2d"
Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.485761 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/cdd6ce32-26c3-4202-860b-8b37ead2941c-etcd-ca\") pod \"etcd-operator-b45778765-glpkv\" (UID: \"cdd6ce32-26c3-4202-860b-8b37ead2941c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-glpkv"
Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.486948 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ce579b07-073d-450d-b056-1be2c7bed20f-console-config\") pod \"console-f9d7485db-th28b\" (UID: \"ce579b07-073d-450d-b056-1be2c7bed20f\") " pod="openshift-console/console-f9d7485db-th28b"
Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.488325 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/15dc5e3f-02c6-474d-bd7b-d51ce42340b3-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-p77t7\" (UID: \"15dc5e3f-02c6-474d-bd7b-d51ce42340b3\") " pod="openshift-authentication/oauth-openshift-558db77b4-p77t7"
Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.488392 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ce579b07-073d-450d-b056-1be2c7bed20f-service-ca\") pod \"console-f9d7485db-th28b\" (UID: \"ce579b07-073d-450d-b056-1be2c7bed20f\") " pod="openshift-console/console-f9d7485db-th28b"
Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.488532 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/55711190-8e14-4951-9ac3-dc3675c3a86e-serving-cert\") pod \"openshift-config-operator-7777fb866f-p9c2d\" (UID: \"55711190-8e14-4951-9ac3-dc3675c3a86e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-p9c2d"
Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.489135 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/15dc5e3f-02c6-474d-bd7b-d51ce42340b3-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-p77t7\" (UID: \"15dc5e3f-02c6-474d-bd7b-d51ce42340b3\") " pod="openshift-authentication/oauth-openshift-558db77b4-p77t7"
Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.489182 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/15dc5e3f-02c6-474d-bd7b-d51ce42340b3-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-p77t7\" (UID: \"15dc5e3f-02c6-474d-bd7b-d51ce42340b3\") " pod="openshift-authentication/oauth-openshift-558db77b4-p77t7"
Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.489614 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/af840de7-db59-4020-a5c3-2d888069db1e-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-vrl25\" (UID: \"af840de7-db59-4020-a5c3-2d888069db1e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vrl25"
Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.490026 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6047f6c2-4e66-4dde-b262-383c622eef04-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-829wj\" (UID: \"6047f6c2-4e66-4dde-b262-383c622eef04\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-829wj"
Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.490181 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/15dc5e3f-02c6-474d-bd7b-d51ce42340b3-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-p77t7\" (UID: \"15dc5e3f-02c6-474d-bd7b-d51ce42340b3\") " pod="openshift-authentication/oauth-openshift-558db77b4-p77t7"
Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.490379 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa8bb474-21e8-42b3-a2c6-81ef7d267d9d-config\") pod \"kube-controller-manager-operator-78b949d7b-h6qmw\" (UID: \"aa8bb474-21e8-42b3-a2c6-81ef7d267d9d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-h6qmw"
Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.490605 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/15dc5e3f-02c6-474d-bd7b-d51ce42340b3-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-p77t7\" (UID: \"15dc5e3f-02c6-474d-bd7b-d51ce42340b3\") " pod="openshift-authentication/oauth-openshift-558db77b4-p77t7"
Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.491202 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9c645213-a3fd-4f35-9edd-60905873a559-registry-certificates\") pod \"image-registry-697d97f7c8-8jr7l\" (UID: \"9c645213-a3fd-4f35-9edd-60905873a559\") " pod="openshift-image-registry/image-registry-697d97f7c8-8jr7l"
Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.493695 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/15dc5e3f-02c6-474d-bd7b-d51ce42340b3-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-p77t7\" (UID: \"15dc5e3f-02c6-474d-bd7b-d51ce42340b3\") " pod="openshift-authentication/oauth-openshift-558db77b4-p77t7"
Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.494181 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/59b7fdd8-0d91-4442-a2a8-41c92d027266-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-4f588\" (UID: \"59b7fdd8-0d91-4442-a2a8-41c92d027266\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4f588"
Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.494427 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/cdd6ce32-26c3-4202-860b-8b37ead2941c-etcd-client\") pod \"etcd-operator-b45778765-glpkv\" (UID: \"cdd6ce32-26c3-4202-860b-8b37ead2941c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-glpkv"
Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.494705 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e0f00c28-2b6e-4127-a70e-43761e7cdb9e-metrics-tls\") pod \"ingress-operator-5b745b69d9-v5wjg\" (UID: \"e0f00c28-2b6e-4127-a70e-43761e7cdb9e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-v5wjg"
Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.494754 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/caebe48f-0ac0-436e-983b-6c5858472cf7-serving-cert\") pod \"console-operator-58897d9998-jshhn\" (UID: \"caebe48f-0ac0-436e-983b-6c5858472cf7\") " pod="openshift-console-operator/console-operator-58897d9998-jshhn"
Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.495162 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cdd6ce32-26c3-4202-860b-8b37ead2941c-serving-cert\") pod \"etcd-operator-b45778765-glpkv\" (UID: \"cdd6ce32-26c3-4202-860b-8b37ead2941c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-glpkv"
Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.495911 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ce579b07-073d-450d-b056-1be2c7bed20f-console-oauth-config\") pod \"console-f9d7485db-th28b\" (UID: \"ce579b07-073d-450d-b056-1be2c7bed20f\") " pod="openshift-console/console-f9d7485db-th28b"
Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.496160 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ce579b07-073d-450d-b056-1be2c7bed20f-console-serving-cert\") pod \"console-f9d7485db-th28b\" (UID: \"ce579b07-073d-450d-b056-1be2c7bed20f\") " pod="openshift-console/console-f9d7485db-th28b"
Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.496496 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9c645213-a3fd-4f35-9edd-60905873a559-trusted-ca\") pod \"image-registry-697d97f7c8-8jr7l\" (UID: \"9c645213-a3fd-4f35-9edd-60905873a559\") " pod="openshift-image-registry/image-registry-697d97f7c8-8jr7l"
Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.499530 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/15dc5e3f-02c6-474d-bd7b-d51ce42340b3-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-p77t7\" (UID: \"15dc5e3f-02c6-474d-bd7b-d51ce42340b3\") " pod="openshift-authentication/oauth-openshift-558db77b4-p77t7"
Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.500090 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9c645213-a3fd-4f35-9edd-60905873a559-registry-tls\") pod \"image-registry-697d97f7c8-8jr7l\" (UID: \"9c645213-a3fd-4f35-9edd-60905873a559\") " pod="openshift-image-registry/image-registry-697d97f7c8-8jr7l"
Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.500593 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/6047f6c2-4e66-4dde-b262-383c622eef04-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-829wj\" (UID: \"6047f6c2-4e66-4dde-b262-383c622eef04\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-829wj"
Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.507354 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/15dc5e3f-02c6-474d-bd7b-d51ce42340b3-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-p77t7\" (UID: \"15dc5e3f-02c6-474d-bd7b-d51ce42340b3\") " pod="openshift-authentication/oauth-openshift-558db77b4-p77t7"
Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.509195 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9c645213-a3fd-4f35-9edd-60905873a559-installation-pull-secrets\") pod \"image-registry-697d97f7c8-8jr7l\" (UID: \"9c645213-a3fd-4f35-9edd-60905873a559\") " pod="openshift-image-registry/image-registry-697d97f7c8-8jr7l"
Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.511106 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8n8g\" (UniqueName: \"kubernetes.io/projected/e0f00c28-2b6e-4127-a70e-43761e7cdb9e-kube-api-access-r8n8g\") pod \"ingress-operator-5b745b69d9-v5wjg\" (UID: \"e0f00c28-2b6e-4127-a70e-43761e7cdb9e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-v5wjg"
Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.529793 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2d4r\" (UniqueName: \"kubernetes.io/projected/c888afb0-ad29-42e2-ba4a-594f27ebbe4e-kube-api-access-g2d4r\") pod \"downloads-7954f5f757-qmztb\" (UID: \"c888afb0-ad29-42e2-ba4a-594f27ebbe4e\") " pod="openshift-console/downloads-7954f5f757-qmztb"
Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.550986 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qg92\" (UniqueName: \"kubernetes.io/projected/cdd6ce32-26c3-4202-860b-8b37ead2941c-kube-api-access-4qg92\") pod \"etcd-operator-b45778765-glpkv\" (UID: \"cdd6ce32-26c3-4202-860b-8b37ead2941c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-glpkv"
Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.564178 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-qjz9p"]
Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.567096 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6047f6c2-4e66-4dde-b262-383c622eef04-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-829wj\" (UID: \"6047f6c2-4e66-4dde-b262-383c622eef04\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-829wj"
Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.571341 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.571532 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/2d0ddb3f-b8bc-420e-90ba-d45a29705615-images\") pod \"machine-config-operator-74547568cd-ft4b5\" (UID: \"2d0ddb3f-b8bc-420e-90ba-d45a29705615\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ft4b5"
Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.571686 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5ed7ffac-66ee-4b90-a582-ba697f8d87b9-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-njrts\" (UID: \"5ed7ffac-66ee-4b90-a582-ba697f8d87b9\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-njrts"
Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.571722 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3f0d517f-566f-404b-be4d-08adaea5926b-serving-cert\") pod \"service-ca-operator-777779d784-pbxqh\" (UID: \"3f0d517f-566f-404b-be4d-08adaea5926b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-pbxqh"
Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.571752 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xp57r\" (UniqueName: \"kubernetes.io/projected/1456ef72-76a5-4a0b-812b-7a0431444f47-kube-api-access-xp57r\") pod \"olm-operator-6b444d44fb-hr5n5\" (UID: \"1456ef72-76a5-4a0b-812b-7a0431444f47\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hr5n5"
Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.571776 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/beba9fed-710a-49a6-96ce-951ecb0a4a74-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-mrj44\" (UID: \"beba9fed-710a-49a6-96ce-951ecb0a4a74\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mrj44"
Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.571800 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4lsxw\" (UniqueName: \"kubernetes.io/projected/ecf238f4-a4b7-45ab-8d1f-ff20327f375c-kube-api-access-4lsxw\") pod \"csi-hostpathplugin-pt82b\" (UID: \"ecf238f4-a4b7-45ab-8d1f-ff20327f375c\") " pod="hostpath-provisioner/csi-hostpathplugin-pt82b"
Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.571827 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/1456ef72-76a5-4a0b-812b-7a0431444f47-srv-cert\") pod \"olm-operator-6b444d44fb-hr5n5\" (UID: \"1456ef72-76a5-4a0b-812b-7a0431444f47\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hr5n5"
Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.571851 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/9bba24e8-8799-4012-8d3a-7813ef29344e-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-rsfff\" (UID: \"9bba24e8-8799-4012-8d3a-7813ef29344e\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rsfff"
Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.571884 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bvxmf\" (UniqueName: \"kubernetes.io/projected/6453284d-a0de-451c-9132-d30f6fddc220-kube-api-access-bvxmf\") pod \"ingress-canary-dpnk8\" (UID: \"6453284d-a0de-451c-9132-d30f6fddc220\") " pod="openshift-ingress-canary/ingress-canary-dpnk8"
Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.571916 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5ed7ffac-66ee-4b90-a582-ba697f8d87b9-proxy-tls\") pod \"machine-config-controller-84d6567774-njrts\" (UID: \"5ed7ffac-66ee-4b90-a582-ba697f8d87b9\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-njrts"
Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.572171 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l7bhz\" (UniqueName: \"kubernetes.io/projected/4f397145-18ab-4b43-b133-cc42f45bc852-kube-api-access-l7bhz\") pod \"collect-profiles-29409615-5d85l\" (UID: \"4f397145-18ab-4b43-b133-cc42f45bc852\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409615-5d85l"
Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.572211 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/1456ef72-76a5-4a0b-812b-7a0431444f47-profile-collector-cert\") pod \"olm-operator-6b444d44fb-hr5n5\" (UID: \"1456ef72-76a5-4a0b-812b-7a0431444f47\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hr5n5"
Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.572233 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/59a2dd51-8b1b-4437-a2f7-591bbf1890e8-node-bootstrap-token\") pod \"machine-config-server-c6929\" (UID: \"59a2dd51-8b1b-4437-a2f7-591bbf1890e8\") " pod="openshift-machine-config-operator/machine-config-server-c6929"
Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.572280 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/beba9fed-710a-49a6-96ce-951ecb0a4a74-etcd-client\") pod \"apiserver-7bbb656c7d-mrj44\" (UID: \"beba9fed-710a-49a6-96ce-951ecb0a4a74\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mrj44"
Dec 01 08:19:19 crc kubenswrapper[5004]: E1201 08:19:19.572410 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 08:19:20.072351168 +0000 UTC m=+137.637343150 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.572432 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0bd0713c-01be-44a0-ab66-51056ba04719-config-volume\") pod \"dns-default-qwgn4\" (UID: \"0bd0713c-01be-44a0-ab66-51056ba04719\") " pod="openshift-dns/dns-default-qwgn4"
Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.572460 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krzrl\" (UniqueName: \"kubernetes.io/projected/7a169e83-de91-4038-95b3-aa57f9b50861-kube-api-access-krzrl\") pod \"control-plane-machine-set-operator-78cbb6b69f-nsn58\" (UID: \"7a169e83-de91-4038-95b3-aa57f9b50861\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nsn58"
Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.572480 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c7f8a0cb-a369-4f34-b131-2023a72f1abb-apiservice-cert\") pod \"packageserver-d55dfcdfc-f2m5b\" (UID: \"c7f8a0cb-a369-4f34-b131-2023a72f1abb\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-f2m5b"
Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.572506 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/59a2dd51-8b1b-4437-a2f7-591bbf1890e8-certs\") pod \"machine-config-server-c6929\" (UID: \"59a2dd51-8b1b-4437-a2f7-591bbf1890e8\") " pod="openshift-machine-config-operator/machine-config-server-c6929"
Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.572532 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f0d517f-566f-404b-be4d-08adaea5926b-config\") pod \"service-ca-operator-777779d784-pbxqh\" (UID: \"3f0d517f-566f-404b-be4d-08adaea5926b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-pbxqh"
Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.572576 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/ecf238f4-a4b7-45ab-8d1f-ff20327f375c-socket-dir\") pod \"csi-hostpathplugin-pt82b\" (UID: \"ecf238f4-a4b7-45ab-8d1f-ff20327f375c\") " pod="hostpath-provisioner/csi-hostpathplugin-pt82b"
Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.572598 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ecf238f4-a4b7-45ab-8d1f-ff20327f375c-registration-dir\") pod \"csi-hostpathplugin-pt82b\" (UID: \"ecf238f4-a4b7-45ab-8d1f-ff20327f375c\") " pod="hostpath-provisioner/csi-hostpathplugin-pt82b"
Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.572633 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4f397145-18ab-4b43-b133-cc42f45bc852-secret-volume\") pod \"collect-profiles-29409615-5d85l\" (UID: \"4f397145-18ab-4b43-b133-cc42f45bc852\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409615-5d85l"
Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.572666 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjphf\" (UniqueName: \"kubernetes.io/projected/5ed7ffac-66ee-4b90-a582-ba697f8d87b9-kube-api-access-tjphf\") pod \"machine-config-controller-84d6567774-njrts\" (UID: \"5ed7ffac-66ee-4b90-a582-ba697f8d87b9\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-njrts"
Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.572690 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/39b4186d-328d-4fc5-a106-50e351a34f90-srv-cert\") pod \"catalog-operator-68c6474976-jdc97\" (UID: \"39b4186d-328d-4fc5-a106-50e351a34f90\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jdc97"
Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.572723 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b3ef8088-fc82-4ce3-9c1c-662e380e0587-service-ca-bundle\") pod \"router-default-5444994796-6x76r\" (UID: \"b3ef8088-fc82-4ce3-9c1c-662e380e0587\") " pod="openshift-ingress/router-default-5444994796-6x76r"
Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.572780 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ca76c4ad-59c4-4861-9279-4f8107524e44-signing-key\") pod \"service-ca-9c57cc56f-rlzws\" (UID: \"ca76c4ad-59c4-4861-9279-4f8107524e44\") " pod="openshift-service-ca/service-ca-9c57cc56f-rlzws"
Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.572805 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/beba9fed-710a-49a6-96ce-951ecb0a4a74-serving-cert\") pod \"apiserver-7bbb656c7d-mrj44\" (UID: \"beba9fed-710a-49a6-96ce-951ecb0a4a74\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mrj44"
Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.572836 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/b3ef8088-fc82-4ce3-9c1c-662e380e0587-stats-auth\") pod \"router-default-5444994796-6x76r\" (UID: \"b3ef8088-fc82-4ce3-9c1c-662e380e0587\") " pod="openshift-ingress/router-default-5444994796-6x76r"
Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.572858 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/beba9fed-710a-49a6-96ce-951ecb0a4a74-encryption-config\") pod \"apiserver-7bbb656c7d-mrj44\" (UID: \"beba9fed-710a-49a6-96ce-951ecb0a4a74\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mrj44"
Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.572883 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b3ef8088-fc82-4ce3-9c1c-662e380e0587-metrics-certs\") pod \"router-default-5444994796-6x76r\" (UID: \"b3ef8088-fc82-4ce3-9c1c-662e380e0587\") " pod="openshift-ingress/router-default-5444994796-6x76r"
Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.572908 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dfvvw\" (UniqueName: \"kubernetes.io/projected/9bba24e8-8799-4012-8d3a-7813ef29344e-kube-api-access-dfvvw\") pod \"package-server-manager-789f6589d5-rsfff\" (UID: \"9bba24e8-8799-4012-8d3a-7813ef29344e\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rsfff"
Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.572935 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l95tm\" (UniqueName: \"kubernetes.io/projected/3f0d517f-566f-404b-be4d-08adaea5926b-kube-api-access-l95tm\") pod \"service-ca-operator-777779d784-pbxqh\" (UID: \"3f0d517f-566f-404b-be4d-08adaea5926b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-pbxqh"
Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.572958 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/b3ef8088-fc82-4ce3-9c1c-662e380e0587-default-certificate\") pod \"router-default-5444994796-6x76r\" (UID: \"b3ef8088-fc82-4ce3-9c1c-662e380e0587\") " pod="openshift-ingress/router-default-5444994796-6x76r"
Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.572979 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/8a9f98dc-e84b-4fb8-9d4d-69c766486ebb-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-8g8bf\" (UID: \"8a9f98dc-e84b-4fb8-9d4d-69c766486ebb\") " pod="openshift-marketplace/marketplace-operator-79b997595-8g8bf"
Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.573014 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/c7f8a0cb-a369-4f34-b131-2023a72f1abb-tmpfs\") pod \"packageserver-d55dfcdfc-f2m5b\" (UID: \"c7f8a0cb-a369-4f34-b131-2023a72f1abb\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-f2m5b"
Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.573044 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6453284d-a0de-451c-9132-d30f6fddc220-cert\") pod \"ingress-canary-dpnk8\" (UID: \"6453284d-a0de-451c-9132-d30f6fddc220\") " pod="openshift-ingress-canary/ingress-canary-dpnk8"
Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.573076 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ca76c4ad-59c4-4861-9279-4f8107524e44-signing-cabundle\") pod \"service-ca-9c57cc56f-rlzws\" (UID: \"ca76c4ad-59c4-4861-9279-4f8107524e44\") " pod="openshift-service-ca/service-ca-9c57cc56f-rlzws"
Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.573119 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4f397145-18ab-4b43-b133-cc42f45bc852-config-volume\") pod \"collect-profiles-29409615-5d85l\" (UID: \"4f397145-18ab-4b43-b133-cc42f45bc852\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409615-5d85l"
Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.573144 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kbs6r\" (UniqueName: \"kubernetes.io/projected/39b4186d-328d-4fc5-a106-50e351a34f90-kube-api-access-kbs6r\") pod \"catalog-operator-68c6474976-jdc97\" (UID: \"39b4186d-328d-4fc5-a106-50e351a34f90\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jdc97"
Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.573168 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ff6n5\" (UniqueName: \"kubernetes.io/projected/beba9fed-710a-49a6-96ce-951ecb0a4a74-kube-api-access-ff6n5\") pod \"apiserver-7bbb656c7d-mrj44\" (UID: \"beba9fed-710a-49a6-96ce-951ecb0a4a74\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mrj44"
Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.573192 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8wh9\" (UniqueName: \"kubernetes.io/projected/8a9f98dc-e84b-4fb8-9d4d-69c766486ebb-kube-api-access-w8wh9\") pod \"marketplace-operator-79b997595-8g8bf\" (UID: \"8a9f98dc-e84b-4fb8-9d4d-69c766486ebb\") " pod="openshift-marketplace/marketplace-operator-79b997595-8g8bf"
Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.573215 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8a9f98dc-e84b-4fb8-9d4d-69c766486ebb-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-8g8bf\" (UID: \"8a9f98dc-e84b-4fb8-9d4d-69c766486ebb\") " pod="openshift-marketplace/marketplace-operator-79b997595-8g8bf"
Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.573238 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0bd0713c-01be-44a0-ab66-51056ba04719-metrics-tls\") pod \"dns-default-qwgn4\" (UID: \"0bd0713c-01be-44a0-ab66-51056ba04719\") " pod="openshift-dns/dns-default-qwgn4"
Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.573274 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rv442\" (UniqueName: \"kubernetes.io/projected/e796fca9-e620-4e16-bda0-0e722b91b53c-kube-api-access-rv442\") pod \"controller-manager-879f6c89f-zj88j\" (UID: \"e796fca9-e620-4e16-bda0-0e722b91b53c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zj88j"
Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.573300 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dxd4p\" (UniqueName: \"kubernetes.io/projected/c7f8a0cb-a369-4f34-b131-2023a72f1abb-kube-api-access-dxd4p\") pod \"packageserver-d55dfcdfc-f2m5b\" (UID: \"c7f8a0cb-a369-4f34-b131-2023a72f1abb\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-f2m5b"
Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.573323 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/39b4186d-328d-4fc5-a106-50e351a34f90-profile-collector-cert\") pod \"catalog-operator-68c6474976-jdc97\" (UID: \"39b4186d-328d-4fc5-a106-50e351a34f90\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jdc97"
Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.573346 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/beba9fed-710a-49a6-96ce-951ecb0a4a74-audit-policies\") pod \"apiserver-7bbb656c7d-mrj44\" (UID: \"beba9fed-710a-49a6-96ce-951ecb0a4a74\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mrj44"
Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.573366 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/2d0ddb3f-b8bc-420e-90ba-d45a29705615-images\") pod \"machine-config-operator-74547568cd-ft4b5\" (UID: \"2d0ddb3f-b8bc-420e-90ba-d45a29705615\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ft4b5"
Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.573373 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6j5h\" (UniqueName: \"kubernetes.io/projected/2d0ddb3f-b8bc-420e-90ba-d45a29705615-kube-api-access-p6j5h\") pod \"machine-config-operator-74547568cd-ft4b5\" (UID: \"2d0ddb3f-b8bc-420e-90ba-d45a29705615\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ft4b5"
Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.573397 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6aedcff-a4a2-4265-9c4c-9a3aee8b9377-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-44tqp\" (UID: \"a6aedcff-a4a2-4265-9c4c-9a3aee8b9377\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-44tqp"
Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.573420 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/29f791b6-e03f-4159-85ee-783401ccf7e1-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-8z2fx\" (UID: \"29f791b6-e03f-4159-85ee-783401ccf7e1\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-8z2fx"
Dec 01 08:19:19 crc
kubenswrapper[5004]: I1201 08:19:19.573443 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c7f8a0cb-a369-4f34-b131-2023a72f1abb-webhook-cert\") pod \"packageserver-d55dfcdfc-f2m5b\" (UID: \"c7f8a0cb-a369-4f34-b131-2023a72f1abb\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-f2m5b" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.573468 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqlv7\" (UniqueName: \"kubernetes.io/projected/ca76c4ad-59c4-4861-9279-4f8107524e44-kube-api-access-jqlv7\") pod \"service-ca-9c57cc56f-rlzws\" (UID: \"ca76c4ad-59c4-4861-9279-4f8107524e44\") " pod="openshift-service-ca/service-ca-9c57cc56f-rlzws" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.573497 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pzqqc\" (UniqueName: \"kubernetes.io/projected/59a2dd51-8b1b-4437-a2f7-591bbf1890e8-kube-api-access-pzqqc\") pod \"machine-config-server-c6929\" (UID: \"59a2dd51-8b1b-4437-a2f7-591bbf1890e8\") " pod="openshift-machine-config-operator/machine-config-server-c6929" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.573522 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4dgcd\" (UniqueName: \"kubernetes.io/projected/a6aedcff-a4a2-4265-9c4c-9a3aee8b9377-kube-api-access-4dgcd\") pod \"kube-storage-version-migrator-operator-b67b599dd-44tqp\" (UID: \"a6aedcff-a4a2-4265-9c4c-9a3aee8b9377\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-44tqp" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.573550 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/7a169e83-de91-4038-95b3-aa57f9b50861-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-nsn58\" (UID: \"7a169e83-de91-4038-95b3-aa57f9b50861\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nsn58" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.573593 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c7vhw\" (UniqueName: \"kubernetes.io/projected/b3ef8088-fc82-4ce3-9c1c-662e380e0587-kube-api-access-c7vhw\") pod \"router-default-5444994796-6x76r\" (UID: \"b3ef8088-fc82-4ce3-9c1c-662e380e0587\") " pod="openshift-ingress/router-default-5444994796-6x76r" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.573620 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e796fca9-e620-4e16-bda0-0e722b91b53c-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-zj88j\" (UID: \"e796fca9-e620-4e16-bda0-0e722b91b53c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zj88j" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.573639 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/ecf238f4-a4b7-45ab-8d1f-ff20327f375c-plugins-dir\") pod \"csi-hostpathplugin-pt82b\" (UID: \"ecf238f4-a4b7-45ab-8d1f-ff20327f375c\") " pod="hostpath-provisioner/csi-hostpathplugin-pt82b" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.573658 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ck2qv\" (UniqueName: \"kubernetes.io/projected/0bd0713c-01be-44a0-ab66-51056ba04719-kube-api-access-ck2qv\") pod \"dns-default-qwgn4\" (UID: \"0bd0713c-01be-44a0-ab66-51056ba04719\") " pod="openshift-dns/dns-default-qwgn4" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.573689 5004 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2d0ddb3f-b8bc-420e-90ba-d45a29705615-auth-proxy-config\") pod \"machine-config-operator-74547568cd-ft4b5\" (UID: \"2d0ddb3f-b8bc-420e-90ba-d45a29705615\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ft4b5" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.573708 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtz65\" (UniqueName: \"kubernetes.io/projected/29f791b6-e03f-4159-85ee-783401ccf7e1-kube-api-access-mtz65\") pod \"multus-admission-controller-857f4d67dd-8z2fx\" (UID: \"29f791b6-e03f-4159-85ee-783401ccf7e1\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-8z2fx" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.573730 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e796fca9-e620-4e16-bda0-0e722b91b53c-client-ca\") pod \"controller-manager-879f6c89f-zj88j\" (UID: \"e796fca9-e620-4e16-bda0-0e722b91b53c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zj88j" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.573765 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/beba9fed-710a-49a6-96ce-951ecb0a4a74-audit-dir\") pod \"apiserver-7bbb656c7d-mrj44\" (UID: \"beba9fed-710a-49a6-96ce-951ecb0a4a74\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mrj44" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.573778 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5ed7ffac-66ee-4b90-a582-ba697f8d87b9-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-njrts\" (UID: \"5ed7ffac-66ee-4b90-a582-ba697f8d87b9\") " 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-njrts" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.573787 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2d0ddb3f-b8bc-420e-90ba-d45a29705615-proxy-tls\") pod \"machine-config-operator-74547568cd-ft4b5\" (UID: \"2d0ddb3f-b8bc-420e-90ba-d45a29705615\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ft4b5" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.573810 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/ecf238f4-a4b7-45ab-8d1f-ff20327f375c-mountpoint-dir\") pod \"csi-hostpathplugin-pt82b\" (UID: \"ecf238f4-a4b7-45ab-8d1f-ff20327f375c\") " pod="hostpath-provisioner/csi-hostpathplugin-pt82b" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.573833 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a6aedcff-a4a2-4265-9c4c-9a3aee8b9377-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-44tqp\" (UID: \"a6aedcff-a4a2-4265-9c4c-9a3aee8b9377\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-44tqp" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.573856 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/beba9fed-710a-49a6-96ce-951ecb0a4a74-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-mrj44\" (UID: \"beba9fed-710a-49a6-96ce-951ecb0a4a74\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mrj44" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.573903 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/e796fca9-e620-4e16-bda0-0e722b91b53c-config\") pod \"controller-manager-879f6c89f-zj88j\" (UID: \"e796fca9-e620-4e16-bda0-0e722b91b53c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zj88j" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.573926 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/ecf238f4-a4b7-45ab-8d1f-ff20327f375c-csi-data-dir\") pod \"csi-hostpathplugin-pt82b\" (UID: \"ecf238f4-a4b7-45ab-8d1f-ff20327f375c\") " pod="hostpath-provisioner/csi-hostpathplugin-pt82b" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.573949 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rlrf7\" (UniqueName: \"kubernetes.io/projected/45170d05-984d-4bae-8f74-d7d7c60fffca-kube-api-access-rlrf7\") pod \"migrator-59844c95c7-cnkdl\" (UID: \"45170d05-984d-4bae-8f74-d7d7c60fffca\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-cnkdl" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.573976 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e796fca9-e620-4e16-bda0-0e722b91b53c-serving-cert\") pod \"controller-manager-879f6c89f-zj88j\" (UID: \"e796fca9-e620-4e16-bda0-0e722b91b53c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zj88j" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.575084 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/c7f8a0cb-a369-4f34-b131-2023a72f1abb-tmpfs\") pod \"packageserver-d55dfcdfc-f2m5b\" (UID: \"c7f8a0cb-a369-4f34-b131-2023a72f1abb\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-f2m5b" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.576671 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"socket-dir\" (UniqueName: \"kubernetes.io/host-path/ecf238f4-a4b7-45ab-8d1f-ff20327f375c-socket-dir\") pod \"csi-hostpathplugin-pt82b\" (UID: \"ecf238f4-a4b7-45ab-8d1f-ff20327f375c\") " pod="hostpath-provisioner/csi-hostpathplugin-pt82b" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.576703 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ca76c4ad-59c4-4861-9279-4f8107524e44-signing-cabundle\") pod \"service-ca-9c57cc56f-rlzws\" (UID: \"ca76c4ad-59c4-4861-9279-4f8107524e44\") " pod="openshift-service-ca/service-ca-9c57cc56f-rlzws" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.576721 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ecf238f4-a4b7-45ab-8d1f-ff20327f375c-registration-dir\") pod \"csi-hostpathplugin-pt82b\" (UID: \"ecf238f4-a4b7-45ab-8d1f-ff20327f375c\") " pod="hostpath-provisioner/csi-hostpathplugin-pt82b" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.577299 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4f397145-18ab-4b43-b133-cc42f45bc852-config-volume\") pod \"collect-profiles-29409615-5d85l\" (UID: \"4f397145-18ab-4b43-b133-cc42f45bc852\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409615-5d85l" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.577646 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f0d517f-566f-404b-be4d-08adaea5926b-config\") pod \"service-ca-operator-777779d784-pbxqh\" (UID: \"3f0d517f-566f-404b-be4d-08adaea5926b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-pbxqh" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.577996 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/beba9fed-710a-49a6-96ce-951ecb0a4a74-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-mrj44\" (UID: \"beba9fed-710a-49a6-96ce-951ecb0a4a74\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mrj44" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.578616 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/8a9f98dc-e84b-4fb8-9d4d-69c766486ebb-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-8g8bf\" (UID: \"8a9f98dc-e84b-4fb8-9d4d-69c766486ebb\") " pod="openshift-marketplace/marketplace-operator-79b997595-8g8bf" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.578797 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/beba9fed-710a-49a6-96ce-951ecb0a4a74-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-mrj44\" (UID: \"beba9fed-710a-49a6-96ce-951ecb0a4a74\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mrj44" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.578869 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8a9f98dc-e84b-4fb8-9d4d-69c766486ebb-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-8g8bf\" (UID: \"8a9f98dc-e84b-4fb8-9d4d-69c766486ebb\") " pod="openshift-marketplace/marketplace-operator-79b997595-8g8bf" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.578969 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/beba9fed-710a-49a6-96ce-951ecb0a4a74-audit-dir\") pod \"apiserver-7bbb656c7d-mrj44\" (UID: \"beba9fed-710a-49a6-96ce-951ecb0a4a74\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mrj44" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.579157 5004 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e796fca9-e620-4e16-bda0-0e722b91b53c-serving-cert\") pod \"controller-manager-879f6c89f-zj88j\" (UID: \"e796fca9-e620-4e16-bda0-0e722b91b53c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zj88j" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.579369 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e796fca9-e620-4e16-bda0-0e722b91b53c-config\") pod \"controller-manager-879f6c89f-zj88j\" (UID: \"e796fca9-e620-4e16-bda0-0e722b91b53c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zj88j" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.579449 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/ecf238f4-a4b7-45ab-8d1f-ff20327f375c-csi-data-dir\") pod \"csi-hostpathplugin-pt82b\" (UID: \"ecf238f4-a4b7-45ab-8d1f-ff20327f375c\") " pod="hostpath-provisioner/csi-hostpathplugin-pt82b" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.579855 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/ecf238f4-a4b7-45ab-8d1f-ff20327f375c-mountpoint-dir\") pod \"csi-hostpathplugin-pt82b\" (UID: \"ecf238f4-a4b7-45ab-8d1f-ff20327f375c\") " pod="hostpath-provisioner/csi-hostpathplugin-pt82b" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.581286 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/7a169e83-de91-4038-95b3-aa57f9b50861-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-nsn58\" (UID: \"7a169e83-de91-4038-95b3-aa57f9b50861\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nsn58" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.581376 
5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4f397145-18ab-4b43-b133-cc42f45bc852-secret-volume\") pod \"collect-profiles-29409615-5d85l\" (UID: \"4f397145-18ab-4b43-b133-cc42f45bc852\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409615-5d85l" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.581449 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6453284d-a0de-451c-9132-d30f6fddc220-cert\") pod \"ingress-canary-dpnk8\" (UID: \"6453284d-a0de-451c-9132-d30f6fddc220\") " pod="openshift-ingress-canary/ingress-canary-dpnk8" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.585317 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/beba9fed-710a-49a6-96ce-951ecb0a4a74-serving-cert\") pod \"apiserver-7bbb656c7d-mrj44\" (UID: \"beba9fed-710a-49a6-96ce-951ecb0a4a74\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mrj44" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.586356 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/beba9fed-710a-49a6-96ce-951ecb0a4a74-encryption-config\") pod \"apiserver-7bbb656c7d-mrj44\" (UID: \"beba9fed-710a-49a6-96ce-951ecb0a4a74\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mrj44" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.586477 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e796fca9-e620-4e16-bda0-0e722b91b53c-client-ca\") pod \"controller-manager-879f6c89f-zj88j\" (UID: \"e796fca9-e620-4e16-bda0-0e722b91b53c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zj88j" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.586618 5004 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/ecf238f4-a4b7-45ab-8d1f-ff20327f375c-plugins-dir\") pod \"csi-hostpathplugin-pt82b\" (UID: \"ecf238f4-a4b7-45ab-8d1f-ff20327f375c\") " pod="hostpath-provisioner/csi-hostpathplugin-pt82b" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.587828 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3f0d517f-566f-404b-be4d-08adaea5926b-serving-cert\") pod \"service-ca-operator-777779d784-pbxqh\" (UID: \"3f0d517f-566f-404b-be4d-08adaea5926b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-pbxqh" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.587925 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e796fca9-e620-4e16-bda0-0e722b91b53c-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-zj88j\" (UID: \"e796fca9-e620-4e16-bda0-0e722b91b53c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zj88j" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.588128 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6aedcff-a4a2-4265-9c4c-9a3aee8b9377-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-44tqp\" (UID: \"a6aedcff-a4a2-4265-9c4c-9a3aee8b9377\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-44tqp" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.588574 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2d0ddb3f-b8bc-420e-90ba-d45a29705615-auth-proxy-config\") pod \"machine-config-operator-74547568cd-ft4b5\" (UID: \"2d0ddb3f-b8bc-420e-90ba-d45a29705615\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ft4b5" 
Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.589037 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2d0ddb3f-b8bc-420e-90ba-d45a29705615-proxy-tls\") pod \"machine-config-operator-74547568cd-ft4b5\" (UID: \"2d0ddb3f-b8bc-420e-90ba-d45a29705615\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ft4b5" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.589160 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/1456ef72-76a5-4a0b-812b-7a0431444f47-srv-cert\") pod \"olm-operator-6b444d44fb-hr5n5\" (UID: \"1456ef72-76a5-4a0b-812b-7a0431444f47\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hr5n5" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.590371 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0bd0713c-01be-44a0-ab66-51056ba04719-config-volume\") pod \"dns-default-qwgn4\" (UID: \"0bd0713c-01be-44a0-ab66-51056ba04719\") " pod="openshift-dns/dns-default-qwgn4" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.590961 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/beba9fed-710a-49a6-96ce-951ecb0a4a74-audit-policies\") pod \"apiserver-7bbb656c7d-mrj44\" (UID: \"beba9fed-710a-49a6-96ce-951ecb0a4a74\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mrj44" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.591764 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b3ef8088-fc82-4ce3-9c1c-662e380e0587-service-ca-bundle\") pod \"router-default-5444994796-6x76r\" (UID: \"b3ef8088-fc82-4ce3-9c1c-662e380e0587\") " pod="openshift-ingress/router-default-5444994796-6x76r" Dec 01 08:19:19 crc 
kubenswrapper[5004]: I1201 08:19:19.592351 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/29f791b6-e03f-4159-85ee-783401ccf7e1-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-8z2fx\" (UID: \"29f791b6-e03f-4159-85ee-783401ccf7e1\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-8z2fx" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.593851 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0bd0713c-01be-44a0-ab66-51056ba04719-metrics-tls\") pod \"dns-default-qwgn4\" (UID: \"0bd0713c-01be-44a0-ab66-51056ba04719\") " pod="openshift-dns/dns-default-qwgn4" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.595035 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a6aedcff-a4a2-4265-9c4c-9a3aee8b9377-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-44tqp\" (UID: \"a6aedcff-a4a2-4265-9c4c-9a3aee8b9377\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-44tqp" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.595791 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/59a2dd51-8b1b-4437-a2f7-591bbf1890e8-certs\") pod \"machine-config-server-c6929\" (UID: \"59a2dd51-8b1b-4437-a2f7-591bbf1890e8\") " pod="openshift-machine-config-operator/machine-config-server-c6929" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.606765 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-7khqv"] Dec 01 08:19:19 crc kubenswrapper[5004]: W1201 08:19:19.606772 5004 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b170acc_4880_42d4_ae54_0946ba0029b5.slice/crio-fb3af3a85092b75e644523ff15bf0c18069507c22346ca47d05c736b47d4e6f3 WatchSource:0}: Error finding container fb3af3a85092b75e644523ff15bf0c18069507c22346ca47d05c736b47d4e6f3: Status 404 returned error can't find the container with id fb3af3a85092b75e644523ff15bf0c18069507c22346ca47d05c736b47d4e6f3 Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.608507 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wvmxq"] Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.609155 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lv9xz\" (UniqueName: \"kubernetes.io/projected/ce579b07-073d-450d-b056-1be2c7bed20f-kube-api-access-lv9xz\") pod \"console-f9d7485db-th28b\" (UID: \"ce579b07-073d-450d-b056-1be2c7bed20f\") " pod="openshift-console/console-f9d7485db-th28b" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.610383 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/af840de7-db59-4020-a5c3-2d888069db1e-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-vrl25\" (UID: \"af840de7-db59-4020-a5c3-2d888069db1e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vrl25" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.611054 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/9bba24e8-8799-4012-8d3a-7813ef29344e-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-rsfff\" (UID: \"9bba24e8-8799-4012-8d3a-7813ef29344e\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rsfff" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.611115 5004 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ca76c4ad-59c4-4861-9279-4f8107524e44-signing-key\") pod \"service-ca-9c57cc56f-rlzws\" (UID: \"ca76c4ad-59c4-4861-9279-4f8107524e44\") " pod="openshift-service-ca/service-ca-9c57cc56f-rlzws" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.611353 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5ed7ffac-66ee-4b90-a582-ba697f8d87b9-proxy-tls\") pod \"machine-config-controller-84d6567774-njrts\" (UID: \"5ed7ffac-66ee-4b90-a582-ba697f8d87b9\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-njrts" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.611491 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/b3ef8088-fc82-4ce3-9c1c-662e380e0587-default-certificate\") pod \"router-default-5444994796-6x76r\" (UID: \"b3ef8088-fc82-4ce3-9c1c-662e380e0587\") " pod="openshift-ingress/router-default-5444994796-6x76r" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.611502 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/1456ef72-76a5-4a0b-812b-7a0431444f47-profile-collector-cert\") pod \"olm-operator-6b444d44fb-hr5n5\" (UID: \"1456ef72-76a5-4a0b-812b-7a0431444f47\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hr5n5" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.611661 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/39b4186d-328d-4fc5-a106-50e351a34f90-profile-collector-cert\") pod \"catalog-operator-68c6474976-jdc97\" (UID: \"39b4186d-328d-4fc5-a106-50e351a34f90\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jdc97" Dec 01 
08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.611917 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/beba9fed-710a-49a6-96ce-951ecb0a4a74-etcd-client\") pod \"apiserver-7bbb656c7d-mrj44\" (UID: \"beba9fed-710a-49a6-96ce-951ecb0a4a74\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mrj44" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.611930 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/39b4186d-328d-4fc5-a106-50e351a34f90-srv-cert\") pod \"catalog-operator-68c6474976-jdc97\" (UID: \"39b4186d-328d-4fc5-a106-50e351a34f90\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jdc97" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.612060 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b3ef8088-fc82-4ce3-9c1c-662e380e0587-metrics-certs\") pod \"router-default-5444994796-6x76r\" (UID: \"b3ef8088-fc82-4ce3-9c1c-662e380e0587\") " pod="openshift-ingress/router-default-5444994796-6x76r" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.612238 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/b3ef8088-fc82-4ce3-9c1c-662e380e0587-stats-auth\") pod \"router-default-5444994796-6x76r\" (UID: \"b3ef8088-fc82-4ce3-9c1c-662e380e0587\") " pod="openshift-ingress/router-default-5444994796-6x76r" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.613048 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/59a2dd51-8b1b-4437-a2f7-591bbf1890e8-node-bootstrap-token\") pod \"machine-config-server-c6929\" (UID: \"59a2dd51-8b1b-4437-a2f7-591bbf1890e8\") " pod="openshift-machine-config-operator/machine-config-server-c6929" Dec 01 08:19:19 crc 
kubenswrapper[5004]: I1201 08:19:19.613408 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c7f8a0cb-a369-4f34-b131-2023a72f1abb-apiservice-cert\") pod \"packageserver-d55dfcdfc-f2m5b\" (UID: \"c7f8a0cb-a369-4f34-b131-2023a72f1abb\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-f2m5b" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.615633 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c7f8a0cb-a369-4f34-b131-2023a72f1abb-webhook-cert\") pod \"packageserver-d55dfcdfc-f2m5b\" (UID: \"c7f8a0cb-a369-4f34-b131-2023a72f1abb\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-f2m5b" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.626022 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9c645213-a3fd-4f35-9edd-60905873a559-bound-sa-token\") pod \"image-registry-697d97f7c8-8jr7l\" (UID: \"9c645213-a3fd-4f35-9edd-60905873a559\") " pod="openshift-image-registry/image-registry-697d97f7c8-8jr7l" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.650209 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hh7c\" (UniqueName: \"kubernetes.io/projected/6047f6c2-4e66-4dde-b262-383c622eef04-kube-api-access-5hh7c\") pod \"cluster-image-registry-operator-dc59b4c8b-829wj\" (UID: \"6047f6c2-4e66-4dde-b262-383c622eef04\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-829wj" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.674903 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8jr7l\" (UID: 
\"9c645213-a3fd-4f35-9edd-60905873a559\") " pod="openshift-image-registry/image-registry-697d97f7c8-8jr7l" Dec 01 08:19:19 crc kubenswrapper[5004]: E1201 08:19:19.675191 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 08:19:20.175177312 +0000 UTC m=+137.740169294 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8jr7l" (UID: "9c645213-a3fd-4f35-9edd-60905873a559") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.678214 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-qmztb" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.685903 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e0f00c28-2b6e-4127-a70e-43761e7cdb9e-bound-sa-token\") pod \"ingress-operator-5b745b69d9-v5wjg\" (UID: \"e0f00c28-2b6e-4127-a70e-43761e7cdb9e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-v5wjg" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.701220 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-th28b" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.710247 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7r65\" (UniqueName: \"kubernetes.io/projected/a2404628-0f25-4889-8a15-73576dd41470-kube-api-access-w7r65\") pod \"dns-operator-744455d44c-w4btr\" (UID: \"a2404628-0f25-4889-8a15-73576dd41470\") " pod="openshift-dns-operator/dns-operator-744455d44c-w4btr" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.724137 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-829wj" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.728152 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mh5lv\" (UniqueName: \"kubernetes.io/projected/59b7fdd8-0d91-4442-a2a8-41c92d027266-kube-api-access-mh5lv\") pod \"openshift-controller-manager-operator-756b6f6bc6-4f588\" (UID: \"59b7fdd8-0d91-4442-a2a8-41c92d027266\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4f588" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.732651 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vrl25" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.745913 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hc6xg\" (UniqueName: \"kubernetes.io/projected/15dc5e3f-02c6-474d-bd7b-d51ce42340b3-kube-api-access-hc6xg\") pod \"oauth-openshift-558db77b4-p77t7\" (UID: \"15dc5e3f-02c6-474d-bd7b-d51ce42340b3\") " pod="openshift-authentication/oauth-openshift-558db77b4-p77t7" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.755529 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-glpkv" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.762870 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-v5wjg" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.774157 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/aa8bb474-21e8-42b3-a2c6-81ef7d267d9d-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-h6qmw\" (UID: \"aa8bb474-21e8-42b3-a2c6-81ef7d267d9d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-h6qmw" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.775573 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 08:19:19 crc kubenswrapper[5004]: E1201 08:19:19.776059 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 08:19:20.276045065 +0000 UTC m=+137.841037047 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.805651 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-kplt9" event={"ID":"2daa468b-e9f8-41a3-ba94-a1e33093fc97","Type":"ContainerStarted","Data":"e33e2e2cb0e94e7c62495fa68e5ace42ae44fd5453d5df47c395d4ef447bae47"} Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.805963 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-kplt9" event={"ID":"2daa468b-e9f8-41a3-ba94-a1e33093fc97","Type":"ContainerStarted","Data":"27d4fc4cebc65d31bc005cc2f42ae3f5789d81475e9168e6cb8e90743aa42365"} Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.813496 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bv8b7\" (UniqueName: \"kubernetes.io/projected/9c645213-a3fd-4f35-9edd-60905873a559-kube-api-access-bv8b7\") pod \"image-registry-697d97f7c8-8jr7l\" (UID: \"9c645213-a3fd-4f35-9edd-60905873a559\") " pod="openshift-image-registry/image-registry-697d97f7c8-8jr7l" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.814852 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-j9k4x" event={"ID":"17c7a11e-bffa-4ecf-abe0-c467a33538a8","Type":"ContainerStarted","Data":"ff0a92f56831e573559bdbec231cf56b1b26c79efc50a3657c6e03df9f7805f2"} Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.814896 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-j9k4x" event={"ID":"17c7a11e-bffa-4ecf-abe0-c467a33538a8","Type":"ContainerStarted","Data":"5cf41589bc241bb2e5363c35553f5f6844e9e8a146e666e530c6fa48746ce5be"} Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.817984 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-565jz\" (UniqueName: \"kubernetes.io/projected/caebe48f-0ac0-436e-983b-6c5858472cf7-kube-api-access-565jz\") pod \"console-operator-58897d9998-jshhn\" (UID: \"caebe48f-0ac0-436e-983b-6c5858472cf7\") " pod="openshift-console-operator/console-operator-58897d9998-jshhn" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.819841 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7khqv" event={"ID":"d2bbf8d8-0338-4af4-8d6a-402033f87676","Type":"ContainerStarted","Data":"eaeaa64e5cf8ec97c4cb29e5099fed5888c6853e6cc142278e293caefc0b2c6d"} Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.821256 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-b9vjf" event={"ID":"c3ebf4d5-102a-4552-b30b-cbacb3a779fa","Type":"ContainerStarted","Data":"d00abd42ebbca4a70c595893ccf6a323a7b5a0ffaded73030bb11e07020cb959"} Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.821288 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-b9vjf" event={"ID":"c3ebf4d5-102a-4552-b30b-cbacb3a779fa","Type":"ContainerStarted","Data":"b7123e8600ba552fec5582d3e9bb796e54b6ae1e91e8c0145f64444515fb8056"} Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.821299 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-b9vjf" 
event={"ID":"c3ebf4d5-102a-4552-b30b-cbacb3a779fa","Type":"ContainerStarted","Data":"0542a3a753b9094a234321631acd4025173eabd56a7290f7bcb6024411fe3fcb"} Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.822804 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wvmxq" event={"ID":"16dd04af-b1db-4a72-8f1f-8d53ffd52b41","Type":"ContainerStarted","Data":"c1119e7049a5d10a60772e363f6a1b7f6712919c04dc41facec14f5d2f12d696"} Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.826046 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pkg9x" event={"ID":"29bfa426-07b0-4acb-a886-9f9316644d71","Type":"ContainerStarted","Data":"1bd1c73fcf5c7852b899150383acc07e6fa99ec260928f804f8dd2ac8ca852ef"} Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.826087 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pkg9x" event={"ID":"29bfa426-07b0-4acb-a886-9f9316644d71","Type":"ContainerStarted","Data":"f9c0f9f1f7161878d988dd81e64f3316f99e068588740fc919ff4c5c597059dc"} Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.829624 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dsrp6" event={"ID":"1f694a09-4564-4103-b1b0-ea419e62082e","Type":"ContainerStarted","Data":"08d800347d2aa8ac47530d31275635b58a11e9959fdb9b720beb1b9978c11ab0"} Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.829669 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dsrp6" event={"ID":"1f694a09-4564-4103-b1b0-ea419e62082e","Type":"ContainerStarted","Data":"66f85f9072b52534e5d7b1b401d9ef7a3ad5356f5f6b3ceffd7ee37012df8fe8"} Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.829681 5004 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dsrp6" event={"ID":"1f694a09-4564-4103-b1b0-ea419e62082e","Type":"ContainerStarted","Data":"0ab40725ddd1344d8de61b05f290d0e86dc37a722aaf8d89bc59a73feab74c28"} Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.845871 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-qjz9p" event={"ID":"7b170acc-4880-42d4-ae54-0946ba0029b5","Type":"ContainerStarted","Data":"78871168adb3a36ad3c56ad0e5b69c8e26fa2a5926fd2645216ea2a29805dcae"} Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.845909 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-qjz9p" event={"ID":"7b170acc-4880-42d4-ae54-0946ba0029b5","Type":"ContainerStarted","Data":"fb3af3a85092b75e644523ff15bf0c18069507c22346ca47d05c736b47d4e6f3"} Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.854195 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-plrp2\" (UniqueName: \"kubernetes.io/projected/55711190-8e14-4951-9ac3-dc3675c3a86e-kube-api-access-plrp2\") pod \"openshift-config-operator-7777fb866f-p9c2d\" (UID: \"55711190-8e14-4951-9ac3-dc3675c3a86e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-p9c2d" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.866719 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xp57r\" (UniqueName: \"kubernetes.io/projected/1456ef72-76a5-4a0b-812b-7a0431444f47-kube-api-access-xp57r\") pod \"olm-operator-6b444d44fb-hr5n5\" (UID: \"1456ef72-76a5-4a0b-812b-7a0431444f47\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hr5n5" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.869450 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-mtz65\" (UniqueName: \"kubernetes.io/projected/29f791b6-e03f-4159-85ee-783401ccf7e1-kube-api-access-mtz65\") pod \"multus-admission-controller-857f4d67dd-8z2fx\" (UID: \"29f791b6-e03f-4159-85ee-783401ccf7e1\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-8z2fx" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.876851 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8jr7l\" (UID: \"9c645213-a3fd-4f35-9edd-60905873a559\") " pod="openshift-image-registry/image-registry-697d97f7c8-8jr7l" Dec 01 08:19:19 crc kubenswrapper[5004]: E1201 08:19:19.880130 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 08:19:20.380118891 +0000 UTC m=+137.945110873 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8jr7l" (UID: "9c645213-a3fd-4f35-9edd-60905873a559") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.880651 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hr5n5" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.887607 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-8z2fx" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.891947 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqlv7\" (UniqueName: \"kubernetes.io/projected/ca76c4ad-59c4-4861-9279-4f8107524e44-kube-api-access-jqlv7\") pod \"service-ca-9c57cc56f-rlzws\" (UID: \"ca76c4ad-59c4-4861-9279-4f8107524e44\") " pod="openshift-service-ca/service-ca-9c57cc56f-rlzws" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.910317 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzqqc\" (UniqueName: \"kubernetes.io/projected/59a2dd51-8b1b-4437-a2f7-591bbf1890e8-kube-api-access-pzqqc\") pod \"machine-config-server-c6929\" (UID: \"59a2dd51-8b1b-4437-a2f7-591bbf1890e8\") " pod="openshift-machine-config-operator/machine-config-server-c6929" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.910555 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-c6929" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.931550 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4f588" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.947209 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dgcd\" (UniqueName: \"kubernetes.io/projected/a6aedcff-a4a2-4265-9c4c-9a3aee8b9377-kube-api-access-4dgcd\") pod \"kube-storage-version-migrator-operator-b67b599dd-44tqp\" (UID: \"a6aedcff-a4a2-4265-9c4c-9a3aee8b9377\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-44tqp" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.961769 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rv442\" (UniqueName: \"kubernetes.io/projected/e796fca9-e620-4e16-bda0-0e722b91b53c-kube-api-access-rv442\") pod \"controller-manager-879f6c89f-zj88j\" (UID: \"e796fca9-e620-4e16-bda0-0e722b91b53c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zj88j" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.970005 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbs6r\" (UniqueName: \"kubernetes.io/projected/39b4186d-328d-4fc5-a106-50e351a34f90-kube-api-access-kbs6r\") pod \"catalog-operator-68c6474976-jdc97\" (UID: \"39b4186d-328d-4fc5-a106-50e351a34f90\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jdc97" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.978975 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 08:19:19 crc kubenswrapper[5004]: E1201 08:19:19.979128 5004 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 08:19:20.479102305 +0000 UTC m=+138.044094297 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.979211 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8jr7l\" (UID: \"9c645213-a3fd-4f35-9edd-60905873a559\") " pod="openshift-image-registry/image-registry-697d97f7c8-8jr7l" Dec 01 08:19:19 crc kubenswrapper[5004]: E1201 08:19:19.981499 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 08:19:20.481488708 +0000 UTC m=+138.046480690 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8jr7l" (UID: "9c645213-a3fd-4f35-9edd-60905873a559") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.987056 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-p77t7" Dec 01 08:19:19 crc kubenswrapper[5004]: I1201 08:19:19.994515 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-jshhn" Dec 01 08:19:20 crc kubenswrapper[5004]: I1201 08:19:20.008235 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-w4btr" Dec 01 08:19:20 crc kubenswrapper[5004]: I1201 08:19:20.012315 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ff6n5\" (UniqueName: \"kubernetes.io/projected/beba9fed-710a-49a6-96ce-951ecb0a4a74-kube-api-access-ff6n5\") pod \"apiserver-7bbb656c7d-mrj44\" (UID: \"beba9fed-710a-49a6-96ce-951ecb0a4a74\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mrj44" Dec 01 08:19:20 crc kubenswrapper[5004]: I1201 08:19:20.016081 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8wh9\" (UniqueName: \"kubernetes.io/projected/8a9f98dc-e84b-4fb8-9d4d-69c766486ebb-kube-api-access-w8wh9\") pod \"marketplace-operator-79b997595-8g8bf\" (UID: \"8a9f98dc-e84b-4fb8-9d4d-69c766486ebb\") " pod="openshift-marketplace/marketplace-operator-79b997595-8g8bf" Dec 01 08:19:20 crc kubenswrapper[5004]: I1201 08:19:20.038170 5004 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-h6qmw" Dec 01 08:19:20 crc kubenswrapper[5004]: I1201 08:19:20.041267 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlrf7\" (UniqueName: \"kubernetes.io/projected/45170d05-984d-4bae-8f74-d7d7c60fffca-kube-api-access-rlrf7\") pod \"migrator-59844c95c7-cnkdl\" (UID: \"45170d05-984d-4bae-8f74-d7d7c60fffca\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-cnkdl" Dec 01 08:19:20 crc kubenswrapper[5004]: I1201 08:19:20.048159 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-p9c2d" Dec 01 08:19:20 crc kubenswrapper[5004]: I1201 08:19:20.070271 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mrj44" Dec 01 08:19:20 crc kubenswrapper[5004]: I1201 08:19:20.079096 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6j5h\" (UniqueName: \"kubernetes.io/projected/2d0ddb3f-b8bc-420e-90ba-d45a29705615-kube-api-access-p6j5h\") pod \"machine-config-operator-74547568cd-ft4b5\" (UID: \"2d0ddb3f-b8bc-420e-90ba-d45a29705615\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ft4b5" Dec 01 08:19:20 crc kubenswrapper[5004]: I1201 08:19:20.080285 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 08:19:20 crc kubenswrapper[5004]: E1201 08:19:20.080740 5004 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 08:19:20.580725138 +0000 UTC m=+138.145717110 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 08:19:20 crc kubenswrapper[5004]: I1201 08:19:20.084007 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ft4b5" Dec 01 08:19:20 crc kubenswrapper[5004]: I1201 08:19:20.085416 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfvvw\" (UniqueName: \"kubernetes.io/projected/9bba24e8-8799-4012-8d3a-7813ef29344e-kube-api-access-dfvvw\") pod \"package-server-manager-789f6589d5-rsfff\" (UID: \"9bba24e8-8799-4012-8d3a-7813ef29344e\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rsfff" Dec 01 08:19:20 crc kubenswrapper[5004]: I1201 08:19:20.090800 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-zj88j" Dec 01 08:19:20 crc kubenswrapper[5004]: I1201 08:19:20.091417 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxd4p\" (UniqueName: \"kubernetes.io/projected/c7f8a0cb-a369-4f34-b131-2023a72f1abb-kube-api-access-dxd4p\") pod \"packageserver-d55dfcdfc-f2m5b\" (UID: \"c7f8a0cb-a369-4f34-b131-2023a72f1abb\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-f2m5b" Dec 01 08:19:20 crc kubenswrapper[5004]: I1201 08:19:20.121552 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-cnkdl" Dec 01 08:19:20 crc kubenswrapper[5004]: I1201 08:19:20.131862 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-44tqp" Dec 01 08:19:20 crc kubenswrapper[5004]: I1201 08:19:20.148777 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-rlzws" Dec 01 08:19:20 crc kubenswrapper[5004]: I1201 08:19:20.149490 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-f2m5b" Dec 01 08:19:20 crc kubenswrapper[5004]: I1201 08:19:20.154758 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-8g8bf" Dec 01 08:19:20 crc kubenswrapper[5004]: I1201 08:19:20.182386 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8jr7l\" (UID: \"9c645213-a3fd-4f35-9edd-60905873a559\") " pod="openshift-image-registry/image-registry-697d97f7c8-8jr7l" Dec 01 08:19:20 crc kubenswrapper[5004]: E1201 08:19:20.182916 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 08:19:20.682901525 +0000 UTC m=+138.247893517 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8jr7l" (UID: "9c645213-a3fd-4f35-9edd-60905873a559") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 08:19:20 crc kubenswrapper[5004]: I1201 08:19:20.194906 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jdc97" Dec 01 08:19:20 crc kubenswrapper[5004]: I1201 08:19:20.203108 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rsfff" Dec 01 08:19:20 crc kubenswrapper[5004]: I1201 08:19:20.220899 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ck2qv\" (UniqueName: \"kubernetes.io/projected/0bd0713c-01be-44a0-ab66-51056ba04719-kube-api-access-ck2qv\") pod \"dns-default-qwgn4\" (UID: \"0bd0713c-01be-44a0-ab66-51056ba04719\") " pod="openshift-dns/dns-default-qwgn4" Dec 01 08:19:20 crc kubenswrapper[5004]: I1201 08:19:20.220950 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l95tm\" (UniqueName: \"kubernetes.io/projected/3f0d517f-566f-404b-be4d-08adaea5926b-kube-api-access-l95tm\") pod \"service-ca-operator-777779d784-pbxqh\" (UID: \"3f0d517f-566f-404b-be4d-08adaea5926b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-pbxqh" Dec 01 08:19:20 crc kubenswrapper[5004]: I1201 08:19:20.221780 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7vhw\" (UniqueName: \"kubernetes.io/projected/b3ef8088-fc82-4ce3-9c1c-662e380e0587-kube-api-access-c7vhw\") pod \"router-default-5444994796-6x76r\" (UID: \"b3ef8088-fc82-4ce3-9c1c-662e380e0587\") " pod="openshift-ingress/router-default-5444994796-6x76r" Dec 01 08:19:20 crc kubenswrapper[5004]: I1201 08:19:20.222226 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-th28b"] Dec 01 08:19:20 crc kubenswrapper[5004]: I1201 08:19:20.223381 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-qmztb"] Dec 01 08:19:20 crc kubenswrapper[5004]: I1201 08:19:20.223995 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4lsxw\" (UniqueName: \"kubernetes.io/projected/ecf238f4-a4b7-45ab-8d1f-ff20327f375c-kube-api-access-4lsxw\") pod \"csi-hostpathplugin-pt82b\" (UID: 
\"ecf238f4-a4b7-45ab-8d1f-ff20327f375c\") " pod="hostpath-provisioner/csi-hostpathplugin-pt82b" Dec 01 08:19:20 crc kubenswrapper[5004]: I1201 08:19:20.229483 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7bhz\" (UniqueName: \"kubernetes.io/projected/4f397145-18ab-4b43-b133-cc42f45bc852-kube-api-access-l7bhz\") pod \"collect-profiles-29409615-5d85l\" (UID: \"4f397145-18ab-4b43-b133-cc42f45bc852\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409615-5d85l" Dec 01 08:19:20 crc kubenswrapper[5004]: I1201 08:19:20.229747 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-qwgn4" Dec 01 08:19:20 crc kubenswrapper[5004]: I1201 08:19:20.232922 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjphf\" (UniqueName: \"kubernetes.io/projected/5ed7ffac-66ee-4b90-a582-ba697f8d87b9-kube-api-access-tjphf\") pod \"machine-config-controller-84d6567774-njrts\" (UID: \"5ed7ffac-66ee-4b90-a582-ba697f8d87b9\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-njrts" Dec 01 08:19:20 crc kubenswrapper[5004]: I1201 08:19:20.258286 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-pt82b" Dec 01 08:19:20 crc kubenswrapper[5004]: I1201 08:19:20.261024 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krzrl\" (UniqueName: \"kubernetes.io/projected/7a169e83-de91-4038-95b3-aa57f9b50861-kube-api-access-krzrl\") pod \"control-plane-machine-set-operator-78cbb6b69f-nsn58\" (UID: \"7a169e83-de91-4038-95b3-aa57f9b50861\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nsn58" Dec 01 08:19:20 crc kubenswrapper[5004]: I1201 08:19:20.268300 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvxmf\" (UniqueName: \"kubernetes.io/projected/6453284d-a0de-451c-9132-d30f6fddc220-kube-api-access-bvxmf\") pod \"ingress-canary-dpnk8\" (UID: \"6453284d-a0de-451c-9132-d30f6fddc220\") " pod="openshift-ingress-canary/ingress-canary-dpnk8" Dec 01 08:19:20 crc kubenswrapper[5004]: I1201 08:19:20.286056 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 08:19:20 crc kubenswrapper[5004]: E1201 08:19:20.286453 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 08:19:20.786436478 +0000 UTC m=+138.351428460 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 08:19:20 crc kubenswrapper[5004]: W1201 08:19:20.311225 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podce579b07_073d_450d_b056_1be2c7bed20f.slice/crio-0db55e0a2ffa2d216635500ddb7467e400bd220b238f92073dbde27e60df51cb WatchSource:0}: Error finding container 0db55e0a2ffa2d216635500ddb7467e400bd220b238f92073dbde27e60df51cb: Status 404 returned error can't find the container with id 0db55e0a2ffa2d216635500ddb7467e400bd220b238f92073dbde27e60df51cb Dec 01 08:19:20 crc kubenswrapper[5004]: I1201 08:19:20.375390 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-6x76r" Dec 01 08:19:20 crc kubenswrapper[5004]: I1201 08:19:20.392687 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8jr7l\" (UID: \"9c645213-a3fd-4f35-9edd-60905873a559\") " pod="openshift-image-registry/image-registry-697d97f7c8-8jr7l" Dec 01 08:19:20 crc kubenswrapper[5004]: E1201 08:19:20.393001 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 08:19:20.892985329 +0000 UTC m=+138.457977311 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8jr7l" (UID: "9c645213-a3fd-4f35-9edd-60905873a559") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 08:19:20 crc kubenswrapper[5004]: I1201 08:19:20.399811 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-njrts" Dec 01 08:19:20 crc kubenswrapper[5004]: I1201 08:19:20.415789 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nsn58" Dec 01 08:19:20 crc kubenswrapper[5004]: I1201 08:19:20.461896 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-pbxqh" Dec 01 08:19:20 crc kubenswrapper[5004]: I1201 08:19:20.470794 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409615-5d85l" Dec 01 08:19:20 crc kubenswrapper[5004]: I1201 08:19:20.493465 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 08:19:20 crc kubenswrapper[5004]: E1201 08:19:20.493874 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 08:19:20.993857322 +0000 UTC m=+138.558849304 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 08:19:20 crc kubenswrapper[5004]: I1201 08:19:20.523011 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-dpnk8" Dec 01 08:19:20 crc kubenswrapper[5004]: I1201 08:19:20.560130 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vrl25"] Dec 01 08:19:20 crc kubenswrapper[5004]: I1201 08:19:20.595092 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8jr7l\" (UID: \"9c645213-a3fd-4f35-9edd-60905873a559\") " pod="openshift-image-registry/image-registry-697d97f7c8-8jr7l" Dec 01 08:19:20 crc kubenswrapper[5004]: E1201 08:19:20.595428 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 08:19:21.095416133 +0000 UTC m=+138.660408115 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8jr7l" (UID: "9c645213-a3fd-4f35-9edd-60905873a559") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 08:19:20 crc kubenswrapper[5004]: I1201 08:19:20.643063 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-w4btr"] Dec 01 08:19:20 crc kubenswrapper[5004]: I1201 08:19:20.657113 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-829wj"] Dec 01 08:19:20 crc kubenswrapper[5004]: I1201 08:19:20.672600 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-v5wjg"] Dec 01 08:19:20 crc kubenswrapper[5004]: I1201 08:19:20.674738 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hr5n5"] Dec 01 08:19:20 crc kubenswrapper[5004]: I1201 08:19:20.702668 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-glpkv"] Dec 01 08:19:20 crc kubenswrapper[5004]: I1201 08:19:20.703129 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 08:19:20 crc kubenswrapper[5004]: E1201 08:19:20.703723 5004 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 08:19:21.20370274 +0000 UTC m=+138.768694732 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 08:19:20 crc kubenswrapper[5004]: W1201 08:19:20.780398 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6047f6c2_4e66_4dde_b262_383c622eef04.slice/crio-7281406906c447651a06af68906b7eb14b00119629c5905b760afd0244d98308 WatchSource:0}: Error finding container 7281406906c447651a06af68906b7eb14b00119629c5905b760afd0244d98308: Status 404 returned error can't find the container with id 7281406906c447651a06af68906b7eb14b00119629c5905b760afd0244d98308 Dec 01 08:19:20 crc kubenswrapper[5004]: I1201 08:19:20.782902 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-h6qmw"] Dec 01 08:19:20 crc kubenswrapper[5004]: I1201 08:19:20.784899 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-8z2fx"] Dec 01 08:19:20 crc kubenswrapper[5004]: I1201 08:19:20.796630 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-b9vjf" podStartSLOduration=117.796614435 podStartE2EDuration="1m57.796614435s" podCreationTimestamp="2025-12-01 08:17:23 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:19:20.796029 +0000 UTC m=+138.361020992" watchObservedRunningTime="2025-12-01 08:19:20.796614435 +0000 UTC m=+138.361606417" Dec 01 08:19:20 crc kubenswrapper[5004]: I1201 08:19:20.805004 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8jr7l\" (UID: \"9c645213-a3fd-4f35-9edd-60905873a559\") " pod="openshift-image-registry/image-registry-697d97f7c8-8jr7l" Dec 01 08:19:20 crc kubenswrapper[5004]: E1201 08:19:20.805369 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 08:19:21.305356953 +0000 UTC m=+138.870348935 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8jr7l" (UID: "9c645213-a3fd-4f35-9edd-60905873a559") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 08:19:20 crc kubenswrapper[5004]: I1201 08:19:20.851289 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-qmztb" event={"ID":"c888afb0-ad29-42e2-ba4a-594f27ebbe4e","Type":"ContainerStarted","Data":"86f1afb63c17dc7a1ef7e5eb7c5c1b325bfa5172c99b129d6d62ffb0a8cbf3d6"} Dec 01 08:19:20 crc kubenswrapper[5004]: I1201 08:19:20.853080 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-glpkv" event={"ID":"cdd6ce32-26c3-4202-860b-8b37ead2941c","Type":"ContainerStarted","Data":"06067ecece31a8440327b7ef1c9e2c6eda1482e2cb1f91f6bb3a2eeb899080f2"} Dec 01 08:19:20 crc kubenswrapper[5004]: I1201 08:19:20.862775 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-th28b" event={"ID":"ce579b07-073d-450d-b056-1be2c7bed20f","Type":"ContainerStarted","Data":"0db55e0a2ffa2d216635500ddb7467e400bd220b238f92073dbde27e60df51cb"} Dec 01 08:19:20 crc kubenswrapper[5004]: I1201 08:19:20.863430 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vrl25" event={"ID":"af840de7-db59-4020-a5c3-2d888069db1e","Type":"ContainerStarted","Data":"eca7ebc5b14ea0b966c34fa4da4146a410fd8ee60b2acf968d3c4aead229d381"} Dec 01 08:19:20 crc kubenswrapper[5004]: I1201 08:19:20.865929 5004 generic.go:334] "Generic (PLEG): container finished" podID="2daa468b-e9f8-41a3-ba94-a1e33093fc97" 
containerID="e33e2e2cb0e94e7c62495fa68e5ace42ae44fd5453d5df47c395d4ef447bae47" exitCode=0 Dec 01 08:19:20 crc kubenswrapper[5004]: I1201 08:19:20.865984 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-kplt9" event={"ID":"2daa468b-e9f8-41a3-ba94-a1e33093fc97","Type":"ContainerDied","Data":"e33e2e2cb0e94e7c62495fa68e5ace42ae44fd5453d5df47c395d4ef447bae47"} Dec 01 08:19:20 crc kubenswrapper[5004]: I1201 08:19:20.883168 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wvmxq" event={"ID":"16dd04af-b1db-4a72-8f1f-8d53ffd52b41","Type":"ContainerStarted","Data":"c61505d1c8060ca8dd8611463e3fe6cbd20a6917d11260b80462b2b399aeb588"} Dec 01 08:19:20 crc kubenswrapper[5004]: I1201 08:19:20.885680 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-c6929" event={"ID":"59a2dd51-8b1b-4437-a2f7-591bbf1890e8","Type":"ContainerStarted","Data":"79f86df8ac3fd9e47bc1b505d2cdf85ff06c8a37259567dc353b30ca0aebf67d"} Dec 01 08:19:20 crc kubenswrapper[5004]: I1201 08:19:20.887260 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-v5wjg" event={"ID":"e0f00c28-2b6e-4127-a70e-43761e7cdb9e","Type":"ContainerStarted","Data":"702f569dd827b64b4ed0934e14149d5e066ddf9c28ea32ce3ec0e5b3718f9b7f"} Dec 01 08:19:20 crc kubenswrapper[5004]: I1201 08:19:20.906631 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 08:19:20 crc kubenswrapper[5004]: E1201 08:19:20.908317 5004 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 08:19:21.40829716 +0000 UTC m=+138.973289142 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 08:19:20 crc kubenswrapper[5004]: I1201 08:19:20.929730 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-w4btr" event={"ID":"a2404628-0f25-4889-8a15-73576dd41470","Type":"ContainerStarted","Data":"06f07115628a9ca88dc20b09676023b8866cf001aae566977cc34ab8041572a8"} Dec 01 08:19:20 crc kubenswrapper[5004]: W1201 08:19:20.932946 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb3ef8088_fc82_4ce3_9c1c_662e380e0587.slice/crio-b1d4058fcf3d38a8388c1e8224c83cae3b1acfcd1ed56c5fc4ccc3e908295946 WatchSource:0}: Error finding container b1d4058fcf3d38a8388c1e8224c83cae3b1acfcd1ed56c5fc4ccc3e908295946: Status 404 returned error can't find the container with id b1d4058fcf3d38a8388c1e8224c83cae3b1acfcd1ed56c5fc4ccc3e908295946 Dec 01 08:19:20 crc kubenswrapper[5004]: I1201 08:19:20.934729 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hr5n5" event={"ID":"1456ef72-76a5-4a0b-812b-7a0431444f47","Type":"ContainerStarted","Data":"21ddbf89ce5c1490755c0b4eb2f2ca3c3b40e76bbb1bd193a4517a8702da3aab"} Dec 01 08:19:20 crc kubenswrapper[5004]: I1201 
08:19:20.968483 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7khqv" event={"ID":"d2bbf8d8-0338-4af4-8d6a-402033f87676","Type":"ContainerStarted","Data":"03edc9ff781b4fd46a8e1e10ed0b366003a6d2f0ffda4f693fc27b63d2c61230"} Dec 01 08:19:20 crc kubenswrapper[5004]: I1201 08:19:20.969086 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7khqv" Dec 01 08:19:20 crc kubenswrapper[5004]: I1201 08:19:20.970286 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-829wj" event={"ID":"6047f6c2-4e66-4dde-b262-383c622eef04","Type":"ContainerStarted","Data":"7281406906c447651a06af68906b7eb14b00119629c5905b760afd0244d98308"} Dec 01 08:19:20 crc kubenswrapper[5004]: I1201 08:19:20.970478 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-qwgn4"] Dec 01 08:19:21 crc kubenswrapper[5004]: I1201 08:19:21.008448 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8jr7l\" (UID: \"9c645213-a3fd-4f35-9edd-60905873a559\") " pod="openshift-image-registry/image-registry-697d97f7c8-8jr7l" Dec 01 08:19:21 crc kubenswrapper[5004]: E1201 08:19:21.010524 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 08:19:21.510509908 +0000 UTC m=+139.075501890 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8jr7l" (UID: "9c645213-a3fd-4f35-9edd-60905873a559") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 08:19:21 crc kubenswrapper[5004]: I1201 08:19:21.074198 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7khqv" Dec 01 08:19:21 crc kubenswrapper[5004]: W1201 08:19:21.101759 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0bd0713c_01be_44a0_ab66_51056ba04719.slice/crio-2cfa3c687c2e53a40510c518b8022d4f0747aab0bd86c51032fab346445463ca WatchSource:0}: Error finding container 2cfa3c687c2e53a40510c518b8022d4f0747aab0bd86c51032fab346445463ca: Status 404 returned error can't find the container with id 2cfa3c687c2e53a40510c518b8022d4f0747aab0bd86c51032fab346445463ca Dec 01 08:19:21 crc kubenswrapper[5004]: I1201 08:19:21.110283 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 08:19:21 crc kubenswrapper[5004]: E1201 08:19:21.110789 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-01 08:19:21.610770225 +0000 UTC m=+139.175762207 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 08:19:21 crc kubenswrapper[5004]: I1201 08:19:21.212216 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8jr7l\" (UID: \"9c645213-a3fd-4f35-9edd-60905873a559\") " pod="openshift-image-registry/image-registry-697d97f7c8-8jr7l" Dec 01 08:19:21 crc kubenswrapper[5004]: E1201 08:19:21.212661 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 08:19:21.712643644 +0000 UTC m=+139.277635636 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8jr7l" (UID: "9c645213-a3fd-4f35-9edd-60905873a559") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 08:19:21 crc kubenswrapper[5004]: I1201 08:19:21.259195 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-j9k4x" podStartSLOduration=119.259179219 podStartE2EDuration="1m59.259179219s" podCreationTimestamp="2025-12-01 08:17:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:19:21.229107154 +0000 UTC m=+138.794099136" watchObservedRunningTime="2025-12-01 08:19:21.259179219 +0000 UTC m=+138.824171201" Dec 01 08:19:21 crc kubenswrapper[5004]: I1201 08:19:21.265371 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-cnkdl"] Dec 01 08:19:21 crc kubenswrapper[5004]: I1201 08:19:21.265712 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-p9c2d"] Dec 01 08:19:21 crc kubenswrapper[5004]: I1201 08:19:21.268216 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4f588"] Dec 01 08:19:21 crc kubenswrapper[5004]: I1201 08:19:21.291615 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-pt82b"] Dec 01 08:19:21 crc kubenswrapper[5004]: I1201 08:19:21.305643 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/marketplace-operator-79b997595-8g8bf"] Dec 01 08:19:21 crc kubenswrapper[5004]: I1201 08:19:21.332828 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 08:19:21 crc kubenswrapper[5004]: E1201 08:19:21.335196 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 08:19:21.835178613 +0000 UTC m=+139.400170595 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 08:19:21 crc kubenswrapper[5004]: I1201 08:19:21.335904 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8jr7l\" (UID: \"9c645213-a3fd-4f35-9edd-60905873a559\") " pod="openshift-image-registry/image-registry-697d97f7c8-8jr7l" Dec 01 08:19:21 crc kubenswrapper[5004]: E1201 08:19:21.343147 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: 
nodeName:}" failed. No retries permitted until 2025-12-01 08:19:21.84310429 +0000 UTC m=+139.408096272 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8jr7l" (UID: "9c645213-a3fd-4f35-9edd-60905873a559") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 08:19:21 crc kubenswrapper[5004]: I1201 08:19:21.355290 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-44tqp"] Dec 01 08:19:21 crc kubenswrapper[5004]: I1201 08:19:21.396801 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-f2m5b"] Dec 01 08:19:21 crc kubenswrapper[5004]: I1201 08:19:21.401706 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-jshhn"] Dec 01 08:19:21 crc kubenswrapper[5004]: W1201 08:19:21.419780 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a9f98dc_e84b_4fb8_9d4d_69c766486ebb.slice/crio-941e8a5c4fc20058261dcf84f0b89e290d51d4d0982ee5fe35dacd50c2652e10 WatchSource:0}: Error finding container 941e8a5c4fc20058261dcf84f0b89e290d51d4d0982ee5fe35dacd50c2652e10: Status 404 returned error can't find the container with id 941e8a5c4fc20058261dcf84f0b89e290d51d4d0982ee5fe35dacd50c2652e10 Dec 01 08:19:21 crc kubenswrapper[5004]: I1201 08:19:21.426654 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jdc97"] Dec 01 08:19:21 crc kubenswrapper[5004]: I1201 08:19:21.426707 5004 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-p77t7"] Dec 01 08:19:21 crc kubenswrapper[5004]: I1201 08:19:21.430691 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-zj88j"] Dec 01 08:19:21 crc kubenswrapper[5004]: W1201 08:19:21.438709 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podecf238f4_a4b7_45ab_8d1f_ff20327f375c.slice/crio-5a6720e726ab67afe6101de9dd1de5de0eacb80eaea2057283dad414afca1b83 WatchSource:0}: Error finding container 5a6720e726ab67afe6101de9dd1de5de0eacb80eaea2057283dad414afca1b83: Status 404 returned error can't find the container with id 5a6720e726ab67afe6101de9dd1de5de0eacb80eaea2057283dad414afca1b83 Dec 01 08:19:21 crc kubenswrapper[5004]: I1201 08:19:21.439278 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 08:19:21 crc kubenswrapper[5004]: E1201 08:19:21.439976 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 08:19:21.939941247 +0000 UTC m=+139.504933229 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 08:19:21 crc kubenswrapper[5004]: I1201 08:19:21.445683 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rsfff"] Dec 01 08:19:21 crc kubenswrapper[5004]: I1201 08:19:21.449703 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-dpnk8"] Dec 01 08:19:21 crc kubenswrapper[5004]: W1201 08:19:21.458954 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode796fca9_e620_4e16_bda0_0e722b91b53c.slice/crio-c3540b12fb85126fc3cb459c2bb712627922475242ff4f7e81cbcb4153a32976 WatchSource:0}: Error finding container c3540b12fb85126fc3cb459c2bb712627922475242ff4f7e81cbcb4153a32976: Status 404 returned error can't find the container with id c3540b12fb85126fc3cb459c2bb712627922475242ff4f7e81cbcb4153a32976 Dec 01 08:19:21 crc kubenswrapper[5004]: I1201 08:19:21.459164 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pkg9x" podStartSLOduration=119.459143309 podStartE2EDuration="1m59.459143309s" podCreationTimestamp="2025-12-01 08:17:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:19:21.433184222 +0000 UTC m=+138.998176224" watchObservedRunningTime="2025-12-01 08:19:21.459143309 +0000 UTC m=+139.024135291" Dec 01 08:19:21 crc 
kubenswrapper[5004]: W1201 08:19:21.477170 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda6aedcff_a4a2_4265_9c4c_9a3aee8b9377.slice/crio-94241532b60fefe4779ac66fb53745c5dda858256c7e6982dbd2e722d15629b3 WatchSource:0}: Error finding container 94241532b60fefe4779ac66fb53745c5dda858256c7e6982dbd2e722d15629b3: Status 404 returned error can't find the container with id 94241532b60fefe4779ac66fb53745c5dda858256c7e6982dbd2e722d15629b3 Dec 01 08:19:21 crc kubenswrapper[5004]: I1201 08:19:21.504948 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dsrp6" podStartSLOduration=119.504928884 podStartE2EDuration="1m59.504928884s" podCreationTimestamp="2025-12-01 08:17:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:19:21.49980802 +0000 UTC m=+139.064800002" watchObservedRunningTime="2025-12-01 08:19:21.504928884 +0000 UTC m=+139.069920866" Dec 01 08:19:21 crc kubenswrapper[5004]: W1201 08:19:21.519498 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6453284d_a0de_451c_9132_d30f6fddc220.slice/crio-4f694ee0accb63e46196c526625269af7d7496e95385a1bed4901d766ceb9d5e WatchSource:0}: Error finding container 4f694ee0accb63e46196c526625269af7d7496e95385a1bed4901d766ceb9d5e: Status 404 returned error can't find the container with id 4f694ee0accb63e46196c526625269af7d7496e95385a1bed4901d766ceb9d5e Dec 01 08:19:21 crc kubenswrapper[5004]: I1201 08:19:21.546510 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8jr7l\" 
(UID: \"9c645213-a3fd-4f35-9edd-60905873a559\") " pod="openshift-image-registry/image-registry-697d97f7c8-8jr7l" Dec 01 08:19:21 crc kubenswrapper[5004]: E1201 08:19:21.546918 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 08:19:22.04690504 +0000 UTC m=+139.611897022 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8jr7l" (UID: "9c645213-a3fd-4f35-9edd-60905873a559") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 08:19:21 crc kubenswrapper[5004]: I1201 08:19:21.570851 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-ft4b5"] Dec 01 08:19:21 crc kubenswrapper[5004]: I1201 08:19:21.575397 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-pbxqh"] Dec 01 08:19:21 crc kubenswrapper[5004]: I1201 08:19:21.581864 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-rlzws"] Dec 01 08:19:21 crc kubenswrapper[5004]: I1201 08:19:21.583302 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-mrj44"] Dec 01 08:19:21 crc kubenswrapper[5004]: I1201 08:19:21.629547 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-njrts"] Dec 01 08:19:21 crc kubenswrapper[5004]: I1201 08:19:21.632674 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nsn58"] Dec 01 08:19:21 crc kubenswrapper[5004]: I1201 08:19:21.640502 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409615-5d85l"] Dec 01 08:19:21 crc kubenswrapper[5004]: I1201 08:19:21.648111 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 08:19:21 crc kubenswrapper[5004]: E1201 08:19:21.648481 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 08:19:22.148466891 +0000 UTC m=+139.713458873 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 08:19:21 crc kubenswrapper[5004]: I1201 08:19:21.750509 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8jr7l\" (UID: \"9c645213-a3fd-4f35-9edd-60905873a559\") " pod="openshift-image-registry/image-registry-697d97f7c8-8jr7l" Dec 01 08:19:21 crc kubenswrapper[5004]: E1201 08:19:21.750934 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 08:19:22.250918126 +0000 UTC m=+139.815910108 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8jr7l" (UID: "9c645213-a3fd-4f35-9edd-60905873a559") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 08:19:21 crc kubenswrapper[5004]: I1201 08:19:21.854084 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 08:19:21 crc kubenswrapper[5004]: E1201 08:19:21.854663 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 08:19:22.354647413 +0000 UTC m=+139.919639395 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 08:19:21 crc kubenswrapper[5004]: I1201 08:19:21.867822 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-qjz9p" podStartSLOduration=119.867804916 podStartE2EDuration="1m59.867804916s" podCreationTimestamp="2025-12-01 08:17:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:19:21.838305916 +0000 UTC m=+139.403297898" watchObservedRunningTime="2025-12-01 08:19:21.867804916 +0000 UTC m=+139.432796898" Dec 01 08:19:21 crc kubenswrapper[5004]: I1201 08:19:21.962940 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8jr7l\" (UID: \"9c645213-a3fd-4f35-9edd-60905873a559\") " pod="openshift-image-registry/image-registry-697d97f7c8-8jr7l" Dec 01 08:19:21 crc kubenswrapper[5004]: E1201 08:19:21.963221 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 08:19:22.463210227 +0000 UTC m=+140.028202209 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8jr7l" (UID: "9c645213-a3fd-4f35-9edd-60905873a559") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 08:19:21 crc kubenswrapper[5004]: I1201 08:19:21.995494 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7khqv" podStartSLOduration=118.995474989 podStartE2EDuration="1m58.995474989s" podCreationTimestamp="2025-12-01 08:17:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:19:21.96831385 +0000 UTC m=+139.533305842" watchObservedRunningTime="2025-12-01 08:19:21.995474989 +0000 UTC m=+139.560466971" Dec 01 08:19:22 crc kubenswrapper[5004]: I1201 08:19:22.021857 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-dpnk8" event={"ID":"6453284d-a0de-451c-9132-d30f6fddc220","Type":"ContainerStarted","Data":"4f694ee0accb63e46196c526625269af7d7496e95385a1bed4901d766ceb9d5e"} Dec 01 08:19:22 crc kubenswrapper[5004]: I1201 08:19:22.026458 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nsn58" event={"ID":"7a169e83-de91-4038-95b3-aa57f9b50861","Type":"ContainerStarted","Data":"fd5d17fb527e01927385778621d0515d3e80ad27c13f6b560000579914790cbd"} Dec 01 08:19:22 crc kubenswrapper[5004]: I1201 08:19:22.038929 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-44tqp" 
event={"ID":"a6aedcff-a4a2-4265-9c4c-9a3aee8b9377","Type":"ContainerStarted","Data":"94241532b60fefe4779ac66fb53745c5dda858256c7e6982dbd2e722d15629b3"} Dec 01 08:19:22 crc kubenswrapper[5004]: I1201 08:19:22.056164 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-njrts" event={"ID":"5ed7ffac-66ee-4b90-a582-ba697f8d87b9","Type":"ContainerStarted","Data":"38133690cd19efee9a704bf1140335d59953657ac0a7333113ef882fcdb3575b"} Dec 01 08:19:22 crc kubenswrapper[5004]: I1201 08:19:22.064061 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-rlzws" event={"ID":"ca76c4ad-59c4-4861-9279-4f8107524e44","Type":"ContainerStarted","Data":"6caa4ace197fab4af368e6d1fcda36d5acbe5c954323cbafab5000691d059cb6"} Dec 01 08:19:22 crc kubenswrapper[5004]: I1201 08:19:22.064240 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 08:19:22 crc kubenswrapper[5004]: E1201 08:19:22.064653 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 08:19:22.564635325 +0000 UTC m=+140.129627307 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 08:19:22 crc kubenswrapper[5004]: I1201 08:19:22.067753 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-th28b" event={"ID":"ce579b07-073d-450d-b056-1be2c7bed20f","Type":"ContainerStarted","Data":"75bc034389fa490d4b05cb7cd396fac0f7ac00cd2e0e51a4f7d03dc71bc13202"} Dec 01 08:19:22 crc kubenswrapper[5004]: I1201 08:19:22.081179 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-f2m5b" event={"ID":"c7f8a0cb-a369-4f34-b131-2023a72f1abb","Type":"ContainerStarted","Data":"6174bbc0f5a564cf49c85adc33587c107b04005820a52211fece19641fc13b1a"} Dec 01 08:19:22 crc kubenswrapper[5004]: I1201 08:19:22.082301 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-f2m5b" Dec 01 08:19:22 crc kubenswrapper[5004]: I1201 08:19:22.083262 5004 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-f2m5b container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.33:5443/healthz\": dial tcp 10.217.0.33:5443: connect: connection refused" start-of-body= Dec 01 08:19:22 crc kubenswrapper[5004]: I1201 08:19:22.083289 5004 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-f2m5b" podUID="c7f8a0cb-a369-4f34-b131-2023a72f1abb" containerName="packageserver" probeResult="failure" output="Get 
\"https://10.217.0.33:5443/healthz\": dial tcp 10.217.0.33:5443: connect: connection refused" Dec 01 08:19:22 crc kubenswrapper[5004]: I1201 08:19:22.086003 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jdc97" event={"ID":"39b4186d-328d-4fc5-a106-50e351a34f90","Type":"ContainerStarted","Data":"178424f14b55558d9cacd76ffc95fc54dd3d472792d8b91014524eb05e78ba0f"} Dec 01 08:19:22 crc kubenswrapper[5004]: I1201 08:19:22.119743 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-w4btr" event={"ID":"a2404628-0f25-4889-8a15-73576dd41470","Type":"ContainerStarted","Data":"d811508314247cfef0229dc74e9a8c115cc28a04bcce11ae737f79bcdd8e6774"} Dec 01 08:19:22 crc kubenswrapper[5004]: I1201 08:19:22.126517 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-kplt9" event={"ID":"2daa468b-e9f8-41a3-ba94-a1e33093fc97","Type":"ContainerStarted","Data":"a0e1f24ac6a3a0b1fc8caf71b550ea47548efbc0106e4cab54863c4eca3aee3d"} Dec 01 08:19:22 crc kubenswrapper[5004]: I1201 08:19:22.131515 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-f2m5b" podStartSLOduration=119.131504889 podStartE2EDuration="1m59.131504889s" podCreationTimestamp="2025-12-01 08:17:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:19:22.129924408 +0000 UTC m=+139.694916390" watchObservedRunningTime="2025-12-01 08:19:22.131504889 +0000 UTC m=+139.696496871" Dec 01 08:19:22 crc kubenswrapper[5004]: I1201 08:19:22.132883 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-th28b" podStartSLOduration=120.132876625 podStartE2EDuration="2m0.132876625s" podCreationTimestamp="2025-12-01 08:17:22 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:19:22.104889435 +0000 UTC m=+139.669881437" watchObservedRunningTime="2025-12-01 08:19:22.132876625 +0000 UTC m=+139.697868597" Dec 01 08:19:22 crc kubenswrapper[5004]: I1201 08:19:22.147008 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-pbxqh" event={"ID":"3f0d517f-566f-404b-be4d-08adaea5926b","Type":"ContainerStarted","Data":"0ce002ef0a49706b961428390fc74709993b2b1cc7397c2e4ec5e96f255b205e"} Dec 01 08:19:22 crc kubenswrapper[5004]: I1201 08:19:22.165384 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8jr7l\" (UID: \"9c645213-a3fd-4f35-9edd-60905873a559\") " pod="openshift-image-registry/image-registry-697d97f7c8-8jr7l" Dec 01 08:19:22 crc kubenswrapper[5004]: I1201 08:19:22.166318 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ft4b5" event={"ID":"2d0ddb3f-b8bc-420e-90ba-d45a29705615","Type":"ContainerStarted","Data":"4e1cf685791f6a46b1bc25edb75080d2647cc356be8e41dc4d11968484948738"} Dec 01 08:19:22 crc kubenswrapper[5004]: E1201 08:19:22.167631 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 08:19:22.667619742 +0000 UTC m=+140.232611724 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8jr7l" (UID: "9c645213-a3fd-4f35-9edd-60905873a559") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 08:19:22 crc kubenswrapper[5004]: I1201 08:19:22.206729 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vrl25" event={"ID":"af840de7-db59-4020-a5c3-2d888069db1e","Type":"ContainerStarted","Data":"d0368ffa4d8d42712b0857793082e96db996bd542cbfcbf610d6aff58d74f175"} Dec 01 08:19:22 crc kubenswrapper[5004]: I1201 08:19:22.223387 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hr5n5" event={"ID":"1456ef72-76a5-4a0b-812b-7a0431444f47","Type":"ContainerStarted","Data":"3f905c74bf59e506f9273e2481c97223018943d83e8a5f6b248e5091e92f5952"} Dec 01 08:19:22 crc kubenswrapper[5004]: I1201 08:19:22.223450 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hr5n5" Dec 01 08:19:22 crc kubenswrapper[5004]: I1201 08:19:22.226246 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vrl25" podStartSLOduration=120.226231482 podStartE2EDuration="2m0.226231482s" podCreationTimestamp="2025-12-01 08:17:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:19:22.225630407 +0000 UTC m=+139.790622399" watchObservedRunningTime="2025-12-01 08:19:22.226231482 +0000 UTC m=+139.791223464" Dec 01 08:19:22 crc 
kubenswrapper[5004]: I1201 08:19:22.229230 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409615-5d85l" event={"ID":"4f397145-18ab-4b43-b133-cc42f45bc852","Type":"ContainerStarted","Data":"939dca2e20736a45d95fbcc9cb9b71970bef62e36c1d69c80200c9cdaf940f2a"} Dec 01 08:19:22 crc kubenswrapper[5004]: I1201 08:19:22.232179 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-qmztb" event={"ID":"c888afb0-ad29-42e2-ba4a-594f27ebbe4e","Type":"ContainerStarted","Data":"33a9b49ff57931041c3e4317ce4a970e3aa9803c7917a0c67003db7a2c63bf63"} Dec 01 08:19:22 crc kubenswrapper[5004]: I1201 08:19:22.232529 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-qmztb" Dec 01 08:19:22 crc kubenswrapper[5004]: I1201 08:19:22.233378 5004 patch_prober.go:28] interesting pod/downloads-7954f5f757-qmztb container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.23:8080/\": dial tcp 10.217.0.23:8080: connect: connection refused" start-of-body= Dec 01 08:19:22 crc kubenswrapper[5004]: I1201 08:19:22.233421 5004 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-qmztb" podUID="c888afb0-ad29-42e2-ba4a-594f27ebbe4e" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.23:8080/\": dial tcp 10.217.0.23:8080: connect: connection refused" Dec 01 08:19:22 crc kubenswrapper[5004]: I1201 08:19:22.245311 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hr5n5" Dec 01 08:19:22 crc kubenswrapper[5004]: I1201 08:19:22.249997 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hr5n5" podStartSLOduration=119.249980832 
podStartE2EDuration="1m59.249980832s" podCreationTimestamp="2025-12-01 08:17:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:19:22.249850209 +0000 UTC m=+139.814842191" watchObservedRunningTime="2025-12-01 08:19:22.249980832 +0000 UTC m=+139.814972814" Dec 01 08:19:22 crc kubenswrapper[5004]: I1201 08:19:22.261705 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-c6929" event={"ID":"59a2dd51-8b1b-4437-a2f7-591bbf1890e8","Type":"ContainerStarted","Data":"9d7baa5d83a0f37494f270ddcf2f29cf7caf532d7bbc349e316cd672648b4fcb"} Dec 01 08:19:22 crc kubenswrapper[5004]: I1201 08:19:22.266677 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 08:19:22 crc kubenswrapper[5004]: E1201 08:19:22.267654 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 08:19:22.767639604 +0000 UTC m=+140.332631586 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 08:19:22 crc kubenswrapper[5004]: I1201 08:19:22.282439 5004 generic.go:334] "Generic (PLEG): container finished" podID="55711190-8e14-4951-9ac3-dc3675c3a86e" containerID="8f8011da57711014c88d6eb3c3ce57636dee418b621fad0e954fb45c085b28db" exitCode=0 Dec 01 08:19:22 crc kubenswrapper[5004]: I1201 08:19:22.283541 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-p9c2d" event={"ID":"55711190-8e14-4951-9ac3-dc3675c3a86e","Type":"ContainerDied","Data":"8f8011da57711014c88d6eb3c3ce57636dee418b621fad0e954fb45c085b28db"} Dec 01 08:19:22 crc kubenswrapper[5004]: I1201 08:19:22.283585 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-p9c2d" event={"ID":"55711190-8e14-4951-9ac3-dc3675c3a86e","Type":"ContainerStarted","Data":"65893c43bcac62b5df29c629dda302332ff953b6efe63f0476b292e2e16a9f81"} Dec 01 08:19:22 crc kubenswrapper[5004]: I1201 08:19:22.296056 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-pt82b" event={"ID":"ecf238f4-a4b7-45ab-8d1f-ff20327f375c","Type":"ContainerStarted","Data":"5a6720e726ab67afe6101de9dd1de5de0eacb80eaea2057283dad414afca1b83"} Dec 01 08:19:22 crc kubenswrapper[5004]: I1201 08:19:22.301846 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-6x76r" 
event={"ID":"b3ef8088-fc82-4ce3-9c1c-662e380e0587","Type":"ContainerStarted","Data":"12f374ba0fb9b4c3e3abe6167a42273c5a0442107ac239a147ef6f4fbbffa7ea"} Dec 01 08:19:22 crc kubenswrapper[5004]: I1201 08:19:22.301892 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-6x76r" event={"ID":"b3ef8088-fc82-4ce3-9c1c-662e380e0587","Type":"ContainerStarted","Data":"b1d4058fcf3d38a8388c1e8224c83cae3b1acfcd1ed56c5fc4ccc3e908295946"} Dec 01 08:19:22 crc kubenswrapper[5004]: I1201 08:19:22.305487 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-p77t7" event={"ID":"15dc5e3f-02c6-474d-bd7b-d51ce42340b3","Type":"ContainerStarted","Data":"cd868af285e19f0a76de95983008553906cc29bbd468ada721dac00ec9b687a0"} Dec 01 08:19:22 crc kubenswrapper[5004]: I1201 08:19:22.316390 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-8g8bf" event={"ID":"8a9f98dc-e84b-4fb8-9d4d-69c766486ebb","Type":"ContainerStarted","Data":"941e8a5c4fc20058261dcf84f0b89e290d51d4d0982ee5fe35dacd50c2652e10"} Dec 01 08:19:22 crc kubenswrapper[5004]: I1201 08:19:22.316761 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-qmztb" podStartSLOduration=120.316738395 podStartE2EDuration="2m0.316738395s" podCreationTimestamp="2025-12-01 08:17:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:19:22.314449955 +0000 UTC m=+139.879441927" watchObservedRunningTime="2025-12-01 08:19:22.316738395 +0000 UTC m=+139.881730377" Dec 01 08:19:22 crc kubenswrapper[5004]: I1201 08:19:22.320814 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-cnkdl" 
event={"ID":"45170d05-984d-4bae-8f74-d7d7c60fffca","Type":"ContainerStarted","Data":"a9ef0177e752d65220b5d1296c0ddd12913dadc6c1565d409cfcb970095cf77b"} Dec 01 08:19:22 crc kubenswrapper[5004]: I1201 08:19:22.320846 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-cnkdl" event={"ID":"45170d05-984d-4bae-8f74-d7d7c60fffca","Type":"ContainerStarted","Data":"af66d1d6802c22212925823e0f1263979ba1a34ab54948696c3f558f65e9be53"} Dec 01 08:19:22 crc kubenswrapper[5004]: I1201 08:19:22.327279 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rsfff" event={"ID":"9bba24e8-8799-4012-8d3a-7813ef29344e","Type":"ContainerStarted","Data":"dd8264d80554f17c374f4402f5499ab7af2adf3bf44fa35e4263aa32d1314b45"} Dec 01 08:19:22 crc kubenswrapper[5004]: I1201 08:19:22.328458 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4f588" event={"ID":"59b7fdd8-0d91-4442-a2a8-41c92d027266","Type":"ContainerStarted","Data":"4f1d1bc884021e4ea2c164797acfa7586a354cb82b1503c91991a08c721bf51d"} Dec 01 08:19:22 crc kubenswrapper[5004]: I1201 08:19:22.356414 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-6x76r" podStartSLOduration=120.35639995 podStartE2EDuration="2m0.35639995s" podCreationTimestamp="2025-12-01 08:17:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:19:22.340984388 +0000 UTC m=+139.905976370" watchObservedRunningTime="2025-12-01 08:19:22.35639995 +0000 UTC m=+139.921391932" Dec 01 08:19:22 crc kubenswrapper[5004]: I1201 08:19:22.356755 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-c6929" 
podStartSLOduration=5.356751979 podStartE2EDuration="5.356751979s" podCreationTimestamp="2025-12-01 08:19:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:19:22.356084682 +0000 UTC m=+139.921076664" watchObservedRunningTime="2025-12-01 08:19:22.356751979 +0000 UTC m=+139.921743961" Dec 01 08:19:22 crc kubenswrapper[5004]: I1201 08:19:22.359613 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-glpkv" event={"ID":"cdd6ce32-26c3-4202-860b-8b37ead2941c","Type":"ContainerStarted","Data":"90843b8cf9f8373f9dee0371a3c9a9cd2e5766a47d406224c895abf3dfbf8bd2"} Dec 01 08:19:22 crc kubenswrapper[5004]: I1201 08:19:22.368448 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8jr7l\" (UID: \"9c645213-a3fd-4f35-9edd-60905873a559\") " pod="openshift-image-registry/image-registry-697d97f7c8-8jr7l" Dec 01 08:19:22 crc kubenswrapper[5004]: I1201 08:19:22.371525 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-jshhn" event={"ID":"caebe48f-0ac0-436e-983b-6c5858472cf7","Type":"ContainerStarted","Data":"0e47554ebfab58677b4bb85ba0118e5ac0d81916b64a0fa74fef8055d8451ce9"} Dec 01 08:19:22 crc kubenswrapper[5004]: E1201 08:19:22.372302 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 08:19:22.872287075 +0000 UTC m=+140.437279057 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8jr7l" (UID: "9c645213-a3fd-4f35-9edd-60905873a559") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 08:19:22 crc kubenswrapper[5004]: I1201 08:19:22.375642 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-v5wjg" event={"ID":"e0f00c28-2b6e-4127-a70e-43761e7cdb9e","Type":"ContainerStarted","Data":"ffcc4fb6f8faa84dc488e2a52e93198bf9b15376f2219f57957ae27a6a10ee13"} Dec 01 08:19:22 crc kubenswrapper[5004]: I1201 08:19:22.376425 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-6x76r" Dec 01 08:19:22 crc kubenswrapper[5004]: I1201 08:19:22.382418 5004 patch_prober.go:28] interesting pod/router-default-5444994796-6x76r container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 08:19:22 crc kubenswrapper[5004]: [-]has-synced failed: reason withheld Dec 01 08:19:22 crc kubenswrapper[5004]: [+]process-running ok Dec 01 08:19:22 crc kubenswrapper[5004]: healthz check failed Dec 01 08:19:22 crc kubenswrapper[5004]: I1201 08:19:22.382465 5004 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6x76r" podUID="b3ef8088-fc82-4ce3-9c1c-662e380e0587" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 08:19:22 crc kubenswrapper[5004]: I1201 08:19:22.390788 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-admission-controller-857f4d67dd-8z2fx" event={"ID":"29f791b6-e03f-4159-85ee-783401ccf7e1","Type":"ContainerStarted","Data":"d50a012c13e2eace5f65a723d8eedb120141fc7abdd78ef11dec4ca82bd5f168"} Dec 01 08:19:22 crc kubenswrapper[5004]: I1201 08:19:22.390830 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-8z2fx" event={"ID":"29f791b6-e03f-4159-85ee-783401ccf7e1","Type":"ContainerStarted","Data":"e304d45745bb45e7eadfb96ab1e76df937b2c54574673874994150bebbe9cdb1"} Dec 01 08:19:22 crc kubenswrapper[5004]: I1201 08:19:22.429428 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wvmxq" event={"ID":"16dd04af-b1db-4a72-8f1f-8d53ffd52b41","Type":"ContainerStarted","Data":"7866b249b9358c5f5cbfb7a2d50a49d3987408dccd62b5a90a7f030155f6cf72"} Dec 01 08:19:22 crc kubenswrapper[5004]: I1201 08:19:22.430195 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-glpkv" podStartSLOduration=120.430177966 podStartE2EDuration="2m0.430177966s" podCreationTimestamp="2025-12-01 08:17:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:19:22.429049817 +0000 UTC m=+139.994041799" watchObservedRunningTime="2025-12-01 08:19:22.430177966 +0000 UTC m=+139.995169968" Dec 01 08:19:22 crc kubenswrapper[5004]: I1201 08:19:22.431584 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-v5wjg" podStartSLOduration=120.431552052 podStartE2EDuration="2m0.431552052s" podCreationTimestamp="2025-12-01 08:17:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:19:22.413400318 +0000 UTC 
m=+139.978392300" watchObservedRunningTime="2025-12-01 08:19:22.431552052 +0000 UTC m=+139.996544024" Dec 01 08:19:22 crc kubenswrapper[5004]: I1201 08:19:22.443642 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-829wj" event={"ID":"6047f6c2-4e66-4dde-b262-383c622eef04","Type":"ContainerStarted","Data":"ce93c4f27c41752f37c435a672fa7b36ce569fb066820c11a66ec7c48fb6c0a4"} Dec 01 08:19:22 crc kubenswrapper[5004]: I1201 08:19:22.445193 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mrj44" event={"ID":"beba9fed-710a-49a6-96ce-951ecb0a4a74","Type":"ContainerStarted","Data":"98b67b66ddef38605c4f98d691c19387ce81a74fce5949f77116ca69678c7e4b"} Dec 01 08:19:22 crc kubenswrapper[5004]: I1201 08:19:22.458634 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-h6qmw" event={"ID":"aa8bb474-21e8-42b3-a2c6-81ef7d267d9d","Type":"ContainerStarted","Data":"1cdabe9a9938564fd5569da4fb25e012d3184edd356ec8418334dbf6da9f70e2"} Dec 01 08:19:22 crc kubenswrapper[5004]: I1201 08:19:22.458920 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-h6qmw" event={"ID":"aa8bb474-21e8-42b3-a2c6-81ef7d267d9d","Type":"ContainerStarted","Data":"870debcc6c4d1ae6d440c78589e5e310a9da5c9f9352f3d8947e0c8826d0d6bc"} Dec 01 08:19:22 crc kubenswrapper[5004]: I1201 08:19:22.469973 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 08:19:22 crc kubenswrapper[5004]: E1201 08:19:22.471474 5004 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 08:19:22.971458663 +0000 UTC m=+140.536450645 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 08:19:22 crc kubenswrapper[5004]: I1201 08:19:22.477348 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wvmxq" podStartSLOduration=120.477328836 podStartE2EDuration="2m0.477328836s" podCreationTimestamp="2025-12-01 08:17:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:19:22.473108006 +0000 UTC m=+140.038099988" watchObservedRunningTime="2025-12-01 08:19:22.477328836 +0000 UTC m=+140.042320818" Dec 01 08:19:22 crc kubenswrapper[5004]: I1201 08:19:22.485268 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-zj88j" event={"ID":"e796fca9-e620-4e16-bda0-0e722b91b53c","Type":"ContainerStarted","Data":"c3540b12fb85126fc3cb459c2bb712627922475242ff4f7e81cbcb4153a32976"} Dec 01 08:19:22 crc kubenswrapper[5004]: I1201 08:19:22.486115 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-zj88j" Dec 01 08:19:22 crc kubenswrapper[5004]: I1201 08:19:22.492744 5004 
patch_prober.go:28] interesting pod/controller-manager-879f6c89f-zj88j container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.17:8443/healthz\": dial tcp 10.217.0.17:8443: connect: connection refused" start-of-body= Dec 01 08:19:22 crc kubenswrapper[5004]: I1201 08:19:22.492797 5004 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-zj88j" podUID="e796fca9-e620-4e16-bda0-0e722b91b53c" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.17:8443/healthz\": dial tcp 10.217.0.17:8443: connect: connection refused" Dec 01 08:19:22 crc kubenswrapper[5004]: I1201 08:19:22.495621 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-qwgn4" event={"ID":"0bd0713c-01be-44a0-ab66-51056ba04719","Type":"ContainerStarted","Data":"2cfa3c687c2e53a40510c518b8022d4f0747aab0bd86c51032fab346445463ca"} Dec 01 08:19:22 crc kubenswrapper[5004]: I1201 08:19:22.541308 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-h6qmw" podStartSLOduration=120.541284106 podStartE2EDuration="2m0.541284106s" podCreationTimestamp="2025-12-01 08:17:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:19:22.531335677 +0000 UTC m=+140.096327659" watchObservedRunningTime="2025-12-01 08:19:22.541284106 +0000 UTC m=+140.106276088" Dec 01 08:19:22 crc kubenswrapper[5004]: I1201 08:19:22.561719 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-829wj" podStartSLOduration=120.56169598 podStartE2EDuration="2m0.56169598s" podCreationTimestamp="2025-12-01 08:17:22 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:19:22.552429177 +0000 UTC m=+140.117421159" watchObservedRunningTime="2025-12-01 08:19:22.56169598 +0000 UTC m=+140.126687982" Dec 01 08:19:22 crc kubenswrapper[5004]: I1201 08:19:22.594973 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8jr7l\" (UID: \"9c645213-a3fd-4f35-9edd-60905873a559\") " pod="openshift-image-registry/image-registry-697d97f7c8-8jr7l" Dec 01 08:19:22 crc kubenswrapper[5004]: E1201 08:19:22.597131 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 08:19:23.097118914 +0000 UTC m=+140.662110896 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8jr7l" (UID: "9c645213-a3fd-4f35-9edd-60905873a559") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 08:19:22 crc kubenswrapper[5004]: I1201 08:19:22.625127 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-zj88j" podStartSLOduration=120.625113595 podStartE2EDuration="2m0.625113595s" podCreationTimestamp="2025-12-01 08:17:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:19:22.623988705 +0000 UTC m=+140.188980687" watchObservedRunningTime="2025-12-01 08:19:22.625113595 +0000 UTC m=+140.190105577" Dec 01 08:19:22 crc kubenswrapper[5004]: I1201 08:19:22.698242 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 08:19:22 crc kubenswrapper[5004]: E1201 08:19:22.698829 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 08:19:23.198815479 +0000 UTC m=+140.763807451 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 08:19:22 crc kubenswrapper[5004]: I1201 08:19:22.805267 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8jr7l\" (UID: \"9c645213-a3fd-4f35-9edd-60905873a559\") " pod="openshift-image-registry/image-registry-697d97f7c8-8jr7l" Dec 01 08:19:22 crc kubenswrapper[5004]: E1201 08:19:22.805549 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 08:19:23.305538984 +0000 UTC m=+140.870530966 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8jr7l" (UID: "9c645213-a3fd-4f35-9edd-60905873a559") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 08:19:22 crc kubenswrapper[5004]: I1201 08:19:22.912726 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 08:19:22 crc kubenswrapper[5004]: E1201 08:19:22.913081 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 08:19:23.413067821 +0000 UTC m=+140.978059803 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 08:19:23 crc kubenswrapper[5004]: I1201 08:19:23.015819 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8jr7l\" (UID: \"9c645213-a3fd-4f35-9edd-60905873a559\") " pod="openshift-image-registry/image-registry-697d97f7c8-8jr7l" Dec 01 08:19:23 crc kubenswrapper[5004]: E1201 08:19:23.016350 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 08:19:23.516337646 +0000 UTC m=+141.081329628 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8jr7l" (UID: "9c645213-a3fd-4f35-9edd-60905873a559") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 08:19:23 crc kubenswrapper[5004]: I1201 08:19:23.116520 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 08:19:23 crc kubenswrapper[5004]: E1201 08:19:23.116856 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 08:19:23.61684001 +0000 UTC m=+141.181831992 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 08:19:23 crc kubenswrapper[5004]: I1201 08:19:23.218403 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8jr7l\" (UID: \"9c645213-a3fd-4f35-9edd-60905873a559\") " pod="openshift-image-registry/image-registry-697d97f7c8-8jr7l" Dec 01 08:19:23 crc kubenswrapper[5004]: E1201 08:19:23.218846 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 08:19:23.718830872 +0000 UTC m=+141.283822854 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8jr7l" (UID: "9c645213-a3fd-4f35-9edd-60905873a559") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 08:19:23 crc kubenswrapper[5004]: I1201 08:19:23.319385 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 08:19:23 crc kubenswrapper[5004]: E1201 08:19:23.319760 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 08:19:23.819745637 +0000 UTC m=+141.384737619 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 08:19:23 crc kubenswrapper[5004]: I1201 08:19:23.387502 5004 patch_prober.go:28] interesting pod/router-default-5444994796-6x76r container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 08:19:23 crc kubenswrapper[5004]: [-]has-synced failed: reason withheld Dec 01 08:19:23 crc kubenswrapper[5004]: [+]process-running ok Dec 01 08:19:23 crc kubenswrapper[5004]: healthz check failed Dec 01 08:19:23 crc kubenswrapper[5004]: I1201 08:19:23.387553 5004 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6x76r" podUID="b3ef8088-fc82-4ce3-9c1c-662e380e0587" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 08:19:23 crc kubenswrapper[5004]: I1201 08:19:23.423135 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8jr7l\" (UID: \"9c645213-a3fd-4f35-9edd-60905873a559\") " pod="openshift-image-registry/image-registry-697d97f7c8-8jr7l" Dec 01 08:19:23 crc kubenswrapper[5004]: E1201 08:19:23.423694 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-01 08:19:23.923682839 +0000 UTC m=+141.488674821 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8jr7l" (UID: "9c645213-a3fd-4f35-9edd-60905873a559") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 08:19:23 crc kubenswrapper[5004]: I1201 08:19:23.515429 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-p77t7" event={"ID":"15dc5e3f-02c6-474d-bd7b-d51ce42340b3","Type":"ContainerStarted","Data":"74772694e1c95b51d67b201d5ad882a5668763ec908b994b999c98e143fd7dd6"} Dec 01 08:19:23 crc kubenswrapper[5004]: I1201 08:19:23.515743 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-p77t7" Dec 01 08:19:23 crc kubenswrapper[5004]: I1201 08:19:23.516881 5004 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-p77t7 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.11:6443/healthz\": dial tcp 10.217.0.11:6443: connect: connection refused" start-of-body= Dec 01 08:19:23 crc kubenswrapper[5004]: I1201 08:19:23.516950 5004 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-p77t7" podUID="15dc5e3f-02c6-474d-bd7b-d51ce42340b3" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.11:6443/healthz\": dial tcp 10.217.0.11:6443: connect: connection refused" Dec 01 08:19:23 crc kubenswrapper[5004]: I1201 08:19:23.518605 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-44tqp" event={"ID":"a6aedcff-a4a2-4265-9c4c-9a3aee8b9377","Type":"ContainerStarted","Data":"8db7383cf45b0c5fd35aceea3bf0e92666b46d5b9f3db8124cca80a1a085563d"} Dec 01 08:19:23 crc kubenswrapper[5004]: I1201 08:19:23.524019 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 08:19:23 crc kubenswrapper[5004]: E1201 08:19:23.524205 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 08:19:24.024169343 +0000 UTC m=+141.589161325 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 08:19:23 crc kubenswrapper[5004]: I1201 08:19:23.524386 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8jr7l\" (UID: \"9c645213-a3fd-4f35-9edd-60905873a559\") " pod="openshift-image-registry/image-registry-697d97f7c8-8jr7l" Dec 01 08:19:23 crc kubenswrapper[5004]: E1201 08:19:23.524723 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 08:19:24.024711086 +0000 UTC m=+141.589703068 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8jr7l" (UID: "9c645213-a3fd-4f35-9edd-60905873a559") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 01 08:19:23 crc kubenswrapper[5004]: I1201 08:19:23.526830 5004 generic.go:334] "Generic (PLEG): container finished" podID="beba9fed-710a-49a6-96ce-951ecb0a4a74" containerID="a81bb40a7ff19bdf7aa1368a9550183b853c6016398487b8cca9acec803d27a8" exitCode=0
Dec 01 08:19:23 crc kubenswrapper[5004]: I1201 08:19:23.527042 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mrj44" event={"ID":"beba9fed-710a-49a6-96ce-951ecb0a4a74","Type":"ContainerDied","Data":"a81bb40a7ff19bdf7aa1368a9550183b853c6016398487b8cca9acec803d27a8"}
Dec 01 08:19:23 crc kubenswrapper[5004]: I1201 08:19:23.535772 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-pt82b" event={"ID":"ecf238f4-a4b7-45ab-8d1f-ff20327f375c","Type":"ContainerStarted","Data":"4f36b3b03fe1a23a487550312d4eb8f448f84a847c4a80c729d8bf4a43ab9a31"}
Dec 01 08:19:23 crc kubenswrapper[5004]: I1201 08:19:23.551846 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nsn58" event={"ID":"7a169e83-de91-4038-95b3-aa57f9b50861","Type":"ContainerStarted","Data":"b95c6291df5197ff16dedacb9597687f1fffb146c19e36f1d9c1725c177551c6"}
Dec 01 08:19:23 crc kubenswrapper[5004]: I1201 08:19:23.571642 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jdc97" event={"ID":"39b4186d-328d-4fc5-a106-50e351a34f90","Type":"ContainerStarted","Data":"688423462a3c839c61d164ca25ed9d5dd747873318705585f24aab9785a87c71"}
Dec 01 08:19:23 crc kubenswrapper[5004]: I1201 08:19:23.571829 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jdc97"
Dec 01 08:19:23 crc kubenswrapper[5004]: I1201 08:19:23.577073 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jdc97"
Dec 01 08:19:23 crc kubenswrapper[5004]: I1201 08:19:23.577207 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-8z2fx" event={"ID":"29f791b6-e03f-4159-85ee-783401ccf7e1","Type":"ContainerStarted","Data":"d884615ab63972bff9e3f89d80ca7ad3a547312b2680ea99f7de55f217d094c3"}
Dec 01 08:19:23 crc kubenswrapper[5004]: I1201 08:19:23.579592 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-kplt9" event={"ID":"2daa468b-e9f8-41a3-ba94-a1e33093fc97","Type":"ContainerStarted","Data":"1bf385b693ac36a9d1e5df07caadfbca8c5f407c1935ebe1b9370ac96e3ab841"}
Dec 01 08:19:23 crc kubenswrapper[5004]: I1201 08:19:23.586714 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-v5wjg" event={"ID":"e0f00c28-2b6e-4127-a70e-43761e7cdb9e","Type":"ContainerStarted","Data":"0b05990b53dbbbe2b57d226345b75f494fe4e0062345cc974ce617b8a2ab49f5"}
Dec 01 08:19:23 crc kubenswrapper[5004]: I1201 08:19:23.598946 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-8g8bf" event={"ID":"8a9f98dc-e84b-4fb8-9d4d-69c766486ebb","Type":"ContainerStarted","Data":"886c8baa541d32465cd9ae76af7323fb43bd8bcafa4a5a26793c1480ef9bf2cc"}
Dec 01 08:19:23 crc kubenswrapper[5004]: I1201 08:19:23.599258 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-8g8bf"
Dec 01 08:19:23 crc kubenswrapper[5004]: I1201 08:19:23.600255 5004 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-8g8bf container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.32:8080/healthz\": dial tcp 10.217.0.32:8080: connect: connection refused" start-of-body=
Dec 01 08:19:23 crc kubenswrapper[5004]: I1201 08:19:23.600293 5004 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-8g8bf" podUID="8a9f98dc-e84b-4fb8-9d4d-69c766486ebb" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.32:8080/healthz\": dial tcp 10.217.0.32:8080: connect: connection refused"
Dec 01 08:19:23 crc kubenswrapper[5004]: I1201 08:19:23.603308 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ft4b5" event={"ID":"2d0ddb3f-b8bc-420e-90ba-d45a29705615","Type":"ContainerStarted","Data":"a3985f325795593c597d167726997ed982394bab98e3dec241fa16c198f52615"}
Dec 01 08:19:23 crc kubenswrapper[5004]: I1201 08:19:23.603336 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ft4b5" event={"ID":"2d0ddb3f-b8bc-420e-90ba-d45a29705615","Type":"ContainerStarted","Data":"49fc0c7a2743f467324661a9bb2f2cd894aeec1bd55432f1caa90ec573a2a10d"}
Dec 01 08:19:23 crc kubenswrapper[5004]: I1201 08:19:23.618930 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-p77t7" podStartSLOduration=121.618915096 podStartE2EDuration="2m1.618915096s" podCreationTimestamp="2025-12-01 08:17:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:19:23.598587605 +0000 UTC m=+141.163579577" watchObservedRunningTime="2025-12-01 08:19:23.618915096 +0000 UTC m=+141.183907068"
Dec 01 08:19:23 crc kubenswrapper[5004]: I1201 08:19:23.619384 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-rlzws" event={"ID":"ca76c4ad-59c4-4861-9279-4f8107524e44","Type":"ContainerStarted","Data":"c23bfd29b9007d31419b77bb63da0c33d6e2480c12d5cd9df6fde562806fadec"}
Dec 01 08:19:23 crc kubenswrapper[5004]: I1201 08:19:23.625546 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 01 08:19:23 crc kubenswrapper[5004]: E1201 08:19:23.626334 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 08:19:24.126320809 +0000 UTC m=+141.691312791 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 01 08:19:23 crc kubenswrapper[5004]: I1201 08:19:23.654787 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-jshhn" event={"ID":"caebe48f-0ac0-436e-983b-6c5858472cf7","Type":"ContainerStarted","Data":"e8b63c804eb36736c64465e0340e2bee558fb93e236a92d82780c9fc773e28d1"}
Dec 01 08:19:23 crc kubenswrapper[5004]: I1201 08:19:23.655296 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-jshhn"
Dec 01 08:19:23 crc kubenswrapper[5004]: I1201 08:19:23.671994 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-44tqp" podStartSLOduration=120.671979481 podStartE2EDuration="2m0.671979481s" podCreationTimestamp="2025-12-01 08:17:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:19:23.620025224 +0000 UTC m=+141.185017206" watchObservedRunningTime="2025-12-01 08:19:23.671979481 +0000 UTC m=+141.236971463"
Dec 01 08:19:23 crc kubenswrapper[5004]: I1201 08:19:23.673215 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-8z2fx" podStartSLOduration=120.673210003 podStartE2EDuration="2m0.673210003s" podCreationTimestamp="2025-12-01 08:17:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:19:23.671597521 +0000 UTC m=+141.236589503" watchObservedRunningTime="2025-12-01 08:19:23.673210003 +0000 UTC m=+141.238201985"
Dec 01 08:19:23 crc kubenswrapper[5004]: I1201 08:19:23.676403 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-f2m5b" event={"ID":"c7f8a0cb-a369-4f34-b131-2023a72f1abb","Type":"ContainerStarted","Data":"9b85796a1ac3beb7a930958dd3f99b510e39bba9b53e96acff63060b97b1b39d"}
Dec 01 08:19:23 crc kubenswrapper[5004]: I1201 08:19:23.708779 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-f2m5b"
Dec 01 08:19:23 crc kubenswrapper[5004]: I1201 08:19:23.718102 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409615-5d85l" event={"ID":"4f397145-18ab-4b43-b133-cc42f45bc852","Type":"ContainerStarted","Data":"88637e7089d85bcdeb4f8e3236cd4a87fe28fff0b21507a073034b85e41e17b7"}
Dec 01 08:19:23 crc kubenswrapper[5004]: I1201 08:19:23.728276 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8jr7l\" (UID: \"9c645213-a3fd-4f35-9edd-60905873a559\") " pod="openshift-image-registry/image-registry-697d97f7c8-8jr7l"
Dec 01 08:19:23 crc kubenswrapper[5004]: E1201 08:19:23.728520 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 08:19:24.228510437 +0000 UTC m=+141.793502419 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8jr7l" (UID: "9c645213-a3fd-4f35-9edd-60905873a559") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 01 08:19:23 crc kubenswrapper[5004]: I1201 08:19:23.749052 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-pbxqh" event={"ID":"3f0d517f-566f-404b-be4d-08adaea5926b","Type":"ContainerStarted","Data":"1396735b266519b588c0d1cca05f8a4824a1c90c03adedd89dac60bcf4c3e647"}
Dec 01 08:19:23 crc kubenswrapper[5004]: I1201 08:19:23.760672 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nsn58" podStartSLOduration=120.760659566 podStartE2EDuration="2m0.760659566s" podCreationTimestamp="2025-12-01 08:17:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:19:23.759058404 +0000 UTC m=+141.324050386" watchObservedRunningTime="2025-12-01 08:19:23.760659566 +0000 UTC m=+141.325651548"
Dec 01 08:19:23 crc kubenswrapper[5004]: I1201 08:19:23.761775 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jdc97" podStartSLOduration=120.761770045 podStartE2EDuration="2m0.761770045s" podCreationTimestamp="2025-12-01 08:17:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:19:23.727315445 +0000 UTC m=+141.292307427" watchObservedRunningTime="2025-12-01 08:19:23.761770045 +0000 UTC m=+141.326762027"
Dec 01 08:19:23 crc kubenswrapper[5004]: I1201 08:19:23.779145 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4f588" event={"ID":"59b7fdd8-0d91-4442-a2a8-41c92d027266","Type":"ContainerStarted","Data":"53538950f9754c37e87cbf1f8b475faf9d3360a3e346624dd339e7e7fcba6298"}
Dec 01 08:19:23 crc kubenswrapper[5004]: I1201 08:19:23.786076 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-dpnk8" event={"ID":"6453284d-a0de-451c-9132-d30f6fddc220","Type":"ContainerStarted","Data":"13f02ef79fc691ffaf8a86a6bd6c5403ac3a50df142ddd7c324c9008b13781e2"}
Dec 01 08:19:23 crc kubenswrapper[5004]: I1201 08:19:23.828763 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 01 08:19:23 crc kubenswrapper[5004]: E1201 08:19:23.829430 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 08:19:24.329416461 +0000 UTC m=+141.894408433 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 01 08:19:23 crc kubenswrapper[5004]: I1201 08:19:23.832467 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-zj88j" event={"ID":"e796fca9-e620-4e16-bda0-0e722b91b53c","Type":"ContainerStarted","Data":"8c1f2a555dc364d531d5a43583c945284728abe28b766059988547fb43857a5b"}
Dec 01 08:19:23 crc kubenswrapper[5004]: I1201 08:19:23.848095 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-zj88j"
Dec 01 08:19:23 crc kubenswrapper[5004]: I1201 08:19:23.849014 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-p9c2d" event={"ID":"55711190-8e14-4951-9ac3-dc3675c3a86e","Type":"ContainerStarted","Data":"57d38701771ea046b768a2a073eb1ce86a269b77041b5564acfbcc25b46b0800"}
Dec 01 08:19:23 crc kubenswrapper[5004]: I1201 08:19:23.849750 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-p9c2d"
Dec 01 08:19:23 crc kubenswrapper[5004]: I1201 08:19:23.859170 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-kplt9" podStartSLOduration=121.859156167 podStartE2EDuration="2m1.859156167s" podCreationTimestamp="2025-12-01 08:17:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:19:23.858224773 +0000 UTC m=+141.423216755" watchObservedRunningTime="2025-12-01 08:19:23.859156167 +0000 UTC m=+141.424148149"
Dec 01 08:19:23 crc kubenswrapper[5004]: I1201 08:19:23.932381 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8jr7l\" (UID: \"9c645213-a3fd-4f35-9edd-60905873a559\") " pod="openshift-image-registry/image-registry-697d97f7c8-8jr7l"
Dec 01 08:19:23 crc kubenswrapper[5004]: I1201 08:19:23.933612 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-njrts" event={"ID":"5ed7ffac-66ee-4b90-a582-ba697f8d87b9","Type":"ContainerStarted","Data":"e4f108bcfb92a4fb5f8d49a680ef8d0019ecbbdcebb5422d80f3df9bbef81d78"}
Dec 01 08:19:23 crc kubenswrapper[5004]: I1201 08:19:23.933656 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-njrts" event={"ID":"5ed7ffac-66ee-4b90-a582-ba697f8d87b9","Type":"ContainerStarted","Data":"8aee14e2c69463dd61ef76bd92debd756956dc4e9c80d38db89b5f8dda9ee917"}
Dec 01 08:19:23 crc kubenswrapper[5004]: I1201 08:19:23.948940 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-p9c2d" podStartSLOduration=121.94892499 podStartE2EDuration="2m1.94892499s" podCreationTimestamp="2025-12-01 08:17:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:19:23.904602564 +0000 UTC m=+141.469594546" watchObservedRunningTime="2025-12-01 08:19:23.94892499 +0000 UTC m=+141.513916972"
Dec 01 08:19:23 crc kubenswrapper[5004]: I1201 08:19:23.950113 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-dpnk8" podStartSLOduration=6.950105741 podStartE2EDuration="6.950105741s" podCreationTimestamp="2025-12-01 08:19:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:19:23.946380344 +0000 UTC m=+141.511372336" watchObservedRunningTime="2025-12-01 08:19:23.950105741 +0000 UTC m=+141.515097723"
Dec 01 08:19:23 crc kubenswrapper[5004]: E1201 08:19:23.951440 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 08:19:24.451423455 +0000 UTC m=+142.016415437 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8jr7l" (UID: "9c645213-a3fd-4f35-9edd-60905873a559") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 01 08:19:23 crc kubenswrapper[5004]: I1201 08:19:23.962932 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-qwgn4" event={"ID":"0bd0713c-01be-44a0-ab66-51056ba04719","Type":"ContainerStarted","Data":"ed7991ad1ba9a0b2399a925ab1b846579ea8067fe0371913087bf43cec22e59e"}
Dec 01 08:19:23 crc kubenswrapper[5004]: I1201 08:19:23.962979 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-qwgn4" event={"ID":"0bd0713c-01be-44a0-ab66-51056ba04719","Type":"ContainerStarted","Data":"844f22d4d6badb269dcab8b1913e14c75960244f9e6d64d5b84ea6e02aa864dd"}
Dec 01 08:19:23 crc kubenswrapper[5004]: I1201 08:19:23.963595 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-qwgn4"
Dec 01 08:19:23 crc kubenswrapper[5004]: I1201 08:19:23.974531 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-w4btr" event={"ID":"a2404628-0f25-4889-8a15-73576dd41470","Type":"ContainerStarted","Data":"174da6bfbec65a139caecd5f6128388f4fc6f7e067c036b230cb40e5166e8dc1"}
Dec 01 08:19:23 crc kubenswrapper[5004]: I1201 08:19:23.995916 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-cnkdl" event={"ID":"45170d05-984d-4bae-8f74-d7d7c60fffca","Type":"ContainerStarted","Data":"b96c2985d0577db33f5820698cd66e60ddc0b829f7317438d3bd9657090fc73a"}
Dec 01 08:19:24 crc kubenswrapper[5004]: I1201 08:19:24.013517 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ft4b5" podStartSLOduration=121.013500506 podStartE2EDuration="2m1.013500506s" podCreationTimestamp="2025-12-01 08:17:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:19:23.983137553 +0000 UTC m=+141.548129545" watchObservedRunningTime="2025-12-01 08:19:24.013500506 +0000 UTC m=+141.578492488"
Dec 01 08:19:24 crc kubenswrapper[5004]: I1201 08:19:24.014975 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-jshhn" podStartSLOduration=122.014968515 podStartE2EDuration="2m2.014968515s" podCreationTimestamp="2025-12-01 08:17:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:19:24.012943561 +0000 UTC m=+141.577935543" watchObservedRunningTime="2025-12-01 08:19:24.014968515 +0000 UTC m=+141.579960497"
Dec 01 08:19:24 crc kubenswrapper[5004]: I1201 08:19:24.036100 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 01 08:19:24 crc kubenswrapper[5004]: I1201 08:19:24.036832 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rsfff" event={"ID":"9bba24e8-8799-4012-8d3a-7813ef29344e","Type":"ContainerStarted","Data":"c462b4e2fb0992175fb17438f9d3a4d60f7457e531f654da36703d32323f90df"}
Dec 01 08:19:24 crc kubenswrapper[5004]: I1201 08:19:24.036877 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rsfff" event={"ID":"9bba24e8-8799-4012-8d3a-7813ef29344e","Type":"ContainerStarted","Data":"8c839df98cf40d6265fdcbab18b9d618b8c1c234a7742eacc836d44deea42017"}
Dec 01 08:19:24 crc kubenswrapper[5004]: E1201 08:19:24.037189 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 08:19:24.537173403 +0000 UTC m=+142.102165385 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 01 08:19:24 crc kubenswrapper[5004]: I1201 08:19:24.038006 5004 patch_prober.go:28] interesting pod/downloads-7954f5f757-qmztb container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.23:8080/\": dial tcp 10.217.0.23:8080: connect: connection refused" start-of-body=
Dec 01 08:19:24 crc kubenswrapper[5004]: I1201 08:19:24.038125 5004 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-qmztb" podUID="c888afb0-ad29-42e2-ba4a-594f27ebbe4e" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.23:8080/\": dial tcp 10.217.0.23:8080: connect: connection refused"
Dec 01 08:19:24 crc kubenswrapper[5004]: I1201 08:19:24.098682 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29409615-5d85l" podStartSLOduration=122.098665199 podStartE2EDuration="2m2.098665199s" podCreationTimestamp="2025-12-01 08:17:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:19:24.095069265 +0000 UTC m=+141.660061247" watchObservedRunningTime="2025-12-01 08:19:24.098665199 +0000 UTC m=+141.663657181"
Dec 01 08:19:24 crc kubenswrapper[5004]: I1201 08:19:24.137593 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8jr7l\" (UID: \"9c645213-a3fd-4f35-9edd-60905873a559\") " pod="openshift-image-registry/image-registry-697d97f7c8-8jr7l"
Dec 01 08:19:24 crc kubenswrapper[5004]: E1201 08:19:24.139174 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 08:19:24.639157686 +0000 UTC m=+142.204149658 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8jr7l" (UID: "9c645213-a3fd-4f35-9edd-60905873a559") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 01 08:19:24 crc kubenswrapper[5004]: I1201 08:19:24.140757 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-pbxqh" podStartSLOduration=121.140744157 podStartE2EDuration="2m1.140744157s" podCreationTimestamp="2025-12-01 08:17:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:19:24.138694814 +0000 UTC m=+141.703686796" watchObservedRunningTime="2025-12-01 08:19:24.140744157 +0000 UTC m=+141.705736139"
Dec 01 08:19:24 crc kubenswrapper[5004]: I1201 08:19:24.185451 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4f588" podStartSLOduration=122.185432054 podStartE2EDuration="2m2.185432054s" podCreationTimestamp="2025-12-01 08:17:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:19:24.182277711 +0000 UTC m=+141.747269693" watchObservedRunningTime="2025-12-01 08:19:24.185432054 +0000 UTC m=+141.750424036"
Dec 01 08:19:24 crc kubenswrapper[5004]: I1201 08:19:24.200633 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-kplt9"
Dec 01 08:19:24 crc kubenswrapper[5004]: I1201 08:19:24.200695 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-kplt9"
Dec 01 08:19:24 crc kubenswrapper[5004]: I1201 08:19:24.239096 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 01 08:19:24 crc kubenswrapper[5004]: E1201 08:19:24.239374 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 08:19:24.739360302 +0000 UTC m=+142.304352284 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 01 08:19:24 crc kubenswrapper[5004]: I1201 08:19:24.252176 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-rlzws" podStartSLOduration=121.252158666 podStartE2EDuration="2m1.252158666s" podCreationTimestamp="2025-12-01 08:17:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:19:24.25078973 +0000 UTC m=+141.815781712" watchObservedRunningTime="2025-12-01 08:19:24.252158666 +0000 UTC m=+141.817150648"
Dec 01 08:19:24 crc kubenswrapper[5004]: I1201 08:19:24.282660 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-8g8bf" podStartSLOduration=121.282641112 podStartE2EDuration="2m1.282641112s" podCreationTimestamp="2025-12-01 08:17:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:19:24.281504972 +0000 UTC m=+141.846496954" watchObservedRunningTime="2025-12-01 08:19:24.282641112 +0000 UTC m=+141.847633094"
Dec 01 08:19:24 crc kubenswrapper[5004]: I1201 08:19:24.306383 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-cnkdl" podStartSLOduration=121.306366341 podStartE2EDuration="2m1.306366341s" podCreationTimestamp="2025-12-01 08:17:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:19:24.305731164 +0000 UTC m=+141.870723146" watchObservedRunningTime="2025-12-01 08:19:24.306366341 +0000 UTC m=+141.871358323"
Dec 01 08:19:24 crc kubenswrapper[5004]: I1201 08:19:24.340382 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8jr7l\" (UID: \"9c645213-a3fd-4f35-9edd-60905873a559\") " pod="openshift-image-registry/image-registry-697d97f7c8-8jr7l"
Dec 01 08:19:24 crc kubenswrapper[5004]: E1201 08:19:24.340754 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 08:19:24.840737168 +0000 UTC m=+142.405729140 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8jr7l" (UID: "9c645213-a3fd-4f35-9edd-60905873a559") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 01 08:19:24 crc kubenswrapper[5004]: I1201 08:19:24.350424 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-qwgn4" podStartSLOduration=7.3504048 podStartE2EDuration="7.3504048s" podCreationTimestamp="2025-12-01 08:19:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:19:24.349801495 +0000 UTC m=+141.914793477" watchObservedRunningTime="2025-12-01 08:19:24.3504048 +0000 UTC m=+141.915396782"
Dec 01 08:19:24 crc kubenswrapper[5004]: I1201 08:19:24.373906 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rsfff" podStartSLOduration=121.373873263 podStartE2EDuration="2m1.373873263s" podCreationTimestamp="2025-12-01 08:17:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:19:24.373195825 +0000 UTC m=+141.938187807" watchObservedRunningTime="2025-12-01 08:19:24.373873263 +0000 UTC m=+141.938865235"
Dec 01 08:19:24 crc kubenswrapper[5004]: I1201 08:19:24.382768 5004 patch_prober.go:28] interesting pod/router-default-5444994796-6x76r container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Dec 01 08:19:24 crc kubenswrapper[5004]: [-]has-synced failed: reason withheld
Dec 01 08:19:24 crc kubenswrapper[5004]: [+]process-running ok
Dec 01 08:19:24 crc kubenswrapper[5004]: healthz check failed
Dec 01 08:19:24 crc kubenswrapper[5004]: I1201 08:19:24.383096 5004 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6x76r" podUID="b3ef8088-fc82-4ce3-9c1c-662e380e0587" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Dec 01 08:19:24 crc kubenswrapper[5004]: I1201 08:19:24.404017 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-njrts" podStartSLOduration=121.403999769 podStartE2EDuration="2m1.403999769s" podCreationTimestamp="2025-12-01 08:17:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:19:24.399726157 +0000 UTC m=+141.964718129" watchObservedRunningTime="2025-12-01 08:19:24.403999769 +0000 UTC m=+141.968991751"
Dec 01 08:19:24 crc kubenswrapper[5004]: I1201 08:19:24.411335 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-jshhn"
Dec 01 08:19:24 crc kubenswrapper[5004]: I1201 08:19:24.441105 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 01 08:19:24 crc kubenswrapper[5004]: E1201 08:19:24.441288 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 08:19:24.941263402 +0000 UTC m=+142.506255384 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 01 08:19:24 crc kubenswrapper[5004]: I1201 08:19:24.441575 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8jr7l\" (UID: \"9c645213-a3fd-4f35-9edd-60905873a559\") " pod="openshift-image-registry/image-registry-697d97f7c8-8jr7l"
Dec 01 08:19:24 crc kubenswrapper[5004]: E1201 08:19:24.441890 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 08:19:24.941883058 +0000 UTC m=+142.506875040 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8jr7l" (UID: "9c645213-a3fd-4f35-9edd-60905873a559") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 08:19:24 crc kubenswrapper[5004]: I1201 08:19:24.501274 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-w4btr" podStartSLOduration=122.501254848 podStartE2EDuration="2m2.501254848s" podCreationTimestamp="2025-12-01 08:17:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:19:24.469417007 +0000 UTC m=+142.034408999" watchObservedRunningTime="2025-12-01 08:19:24.501254848 +0000 UTC m=+142.066246820" Dec 01 08:19:24 crc kubenswrapper[5004]: I1201 08:19:24.503357 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-jd8z9"] Dec 01 08:19:24 crc kubenswrapper[5004]: I1201 08:19:24.504406 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jd8z9" Dec 01 08:19:24 crc kubenswrapper[5004]: I1201 08:19:24.510923 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 01 08:19:24 crc kubenswrapper[5004]: I1201 08:19:24.526530 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jd8z9"] Dec 01 08:19:24 crc kubenswrapper[5004]: I1201 08:19:24.542960 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 08:19:24 crc kubenswrapper[5004]: E1201 08:19:24.543149 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 08:19:25.043125121 +0000 UTC m=+142.608117103 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 08:19:24 crc kubenswrapper[5004]: I1201 08:19:24.543234 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8jr7l\" (UID: \"9c645213-a3fd-4f35-9edd-60905873a559\") " pod="openshift-image-registry/image-registry-697d97f7c8-8jr7l" Dec 01 08:19:24 crc kubenswrapper[5004]: E1201 08:19:24.543605 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 08:19:25.043596223 +0000 UTC m=+142.608588195 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8jr7l" (UID: "9c645213-a3fd-4f35-9edd-60905873a559") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 08:19:24 crc kubenswrapper[5004]: I1201 08:19:24.644264 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 08:19:24 crc kubenswrapper[5004]: I1201 08:19:24.644464 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56a0aeb3-484c-4e7e-9ff7-b3bc75f253f4-catalog-content\") pod \"certified-operators-jd8z9\" (UID: \"56a0aeb3-484c-4e7e-9ff7-b3bc75f253f4\") " pod="openshift-marketplace/certified-operators-jd8z9" Dec 01 08:19:24 crc kubenswrapper[5004]: I1201 08:19:24.644491 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66j5l\" (UniqueName: \"kubernetes.io/projected/56a0aeb3-484c-4e7e-9ff7-b3bc75f253f4-kube-api-access-66j5l\") pod \"certified-operators-jd8z9\" (UID: \"56a0aeb3-484c-4e7e-9ff7-b3bc75f253f4\") " pod="openshift-marketplace/certified-operators-jd8z9" Dec 01 08:19:24 crc kubenswrapper[5004]: I1201 08:19:24.644579 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56a0aeb3-484c-4e7e-9ff7-b3bc75f253f4-utilities\") pod \"certified-operators-jd8z9\" 
(UID: \"56a0aeb3-484c-4e7e-9ff7-b3bc75f253f4\") " pod="openshift-marketplace/certified-operators-jd8z9" Dec 01 08:19:24 crc kubenswrapper[5004]: E1201 08:19:24.644736 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 08:19:25.144711302 +0000 UTC m=+142.709703284 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 08:19:24 crc kubenswrapper[5004]: I1201 08:19:24.657618 5004 patch_prober.go:28] interesting pod/apiserver-76f77b778f-kplt9 container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Dec 01 08:19:24 crc kubenswrapper[5004]: [+]log ok Dec 01 08:19:24 crc kubenswrapper[5004]: [+]etcd ok Dec 01 08:19:24 crc kubenswrapper[5004]: [+]poststarthook/start-apiserver-admission-initializer ok Dec 01 08:19:24 crc kubenswrapper[5004]: [+]poststarthook/generic-apiserver-start-informers ok Dec 01 08:19:24 crc kubenswrapper[5004]: [+]poststarthook/max-in-flight-filter ok Dec 01 08:19:24 crc kubenswrapper[5004]: [+]poststarthook/storage-object-count-tracker-hook ok Dec 01 08:19:24 crc kubenswrapper[5004]: [+]poststarthook/image.openshift.io-apiserver-caches ok Dec 01 08:19:24 crc kubenswrapper[5004]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Dec 01 08:19:24 crc kubenswrapper[5004]: 
[-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Dec 01 08:19:24 crc kubenswrapper[5004]: [+]poststarthook/project.openshift.io-projectcache ok Dec 01 08:19:24 crc kubenswrapper[5004]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Dec 01 08:19:24 crc kubenswrapper[5004]: [+]poststarthook/openshift.io-startinformers ok Dec 01 08:19:24 crc kubenswrapper[5004]: [+]poststarthook/openshift.io-restmapperupdater ok Dec 01 08:19:24 crc kubenswrapper[5004]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Dec 01 08:19:24 crc kubenswrapper[5004]: livez check failed Dec 01 08:19:24 crc kubenswrapper[5004]: I1201 08:19:24.658223 5004 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-kplt9" podUID="2daa468b-e9f8-41a3-ba94-a1e33093fc97" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 08:19:24 crc kubenswrapper[5004]: I1201 08:19:24.699572 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-dg9cf"] Dec 01 08:19:24 crc kubenswrapper[5004]: I1201 08:19:24.700467 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-dg9cf" Dec 01 08:19:24 crc kubenswrapper[5004]: I1201 08:19:24.725410 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 01 08:19:24 crc kubenswrapper[5004]: I1201 08:19:24.734287 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dg9cf"] Dec 01 08:19:24 crc kubenswrapper[5004]: I1201 08:19:24.746971 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66j5l\" (UniqueName: \"kubernetes.io/projected/56a0aeb3-484c-4e7e-9ff7-b3bc75f253f4-kube-api-access-66j5l\") pod \"certified-operators-jd8z9\" (UID: \"56a0aeb3-484c-4e7e-9ff7-b3bc75f253f4\") " pod="openshift-marketplace/certified-operators-jd8z9" Dec 01 08:19:24 crc kubenswrapper[5004]: I1201 08:19:24.747052 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56a0aeb3-484c-4e7e-9ff7-b3bc75f253f4-utilities\") pod \"certified-operators-jd8z9\" (UID: \"56a0aeb3-484c-4e7e-9ff7-b3bc75f253f4\") " pod="openshift-marketplace/certified-operators-jd8z9" Dec 01 08:19:24 crc kubenswrapper[5004]: I1201 08:19:24.747104 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8jr7l\" (UID: \"9c645213-a3fd-4f35-9edd-60905873a559\") " pod="openshift-image-registry/image-registry-697d97f7c8-8jr7l" Dec 01 08:19:24 crc kubenswrapper[5004]: I1201 08:19:24.747127 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56a0aeb3-484c-4e7e-9ff7-b3bc75f253f4-catalog-content\") pod \"certified-operators-jd8z9\" (UID: 
\"56a0aeb3-484c-4e7e-9ff7-b3bc75f253f4\") " pod="openshift-marketplace/certified-operators-jd8z9" Dec 01 08:19:24 crc kubenswrapper[5004]: I1201 08:19:24.747485 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56a0aeb3-484c-4e7e-9ff7-b3bc75f253f4-catalog-content\") pod \"certified-operators-jd8z9\" (UID: \"56a0aeb3-484c-4e7e-9ff7-b3bc75f253f4\") " pod="openshift-marketplace/certified-operators-jd8z9" Dec 01 08:19:24 crc kubenswrapper[5004]: I1201 08:19:24.747819 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56a0aeb3-484c-4e7e-9ff7-b3bc75f253f4-utilities\") pod \"certified-operators-jd8z9\" (UID: \"56a0aeb3-484c-4e7e-9ff7-b3bc75f253f4\") " pod="openshift-marketplace/certified-operators-jd8z9" Dec 01 08:19:24 crc kubenswrapper[5004]: E1201 08:19:24.747927 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 08:19:25.247912496 +0000 UTC m=+142.812904478 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8jr7l" (UID: "9c645213-a3fd-4f35-9edd-60905873a559") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 08:19:24 crc kubenswrapper[5004]: I1201 08:19:24.821818 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66j5l\" (UniqueName: \"kubernetes.io/projected/56a0aeb3-484c-4e7e-9ff7-b3bc75f253f4-kube-api-access-66j5l\") pod \"certified-operators-jd8z9\" (UID: \"56a0aeb3-484c-4e7e-9ff7-b3bc75f253f4\") " pod="openshift-marketplace/certified-operators-jd8z9" Dec 01 08:19:24 crc kubenswrapper[5004]: I1201 08:19:24.849109 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 08:19:24 crc kubenswrapper[5004]: I1201 08:19:24.849278 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lsshm\" (UniqueName: \"kubernetes.io/projected/19da9663-9e98-41f0-a737-0c2683293496-kube-api-access-lsshm\") pod \"community-operators-dg9cf\" (UID: \"19da9663-9e98-41f0-a737-0c2683293496\") " pod="openshift-marketplace/community-operators-dg9cf" Dec 01 08:19:24 crc kubenswrapper[5004]: I1201 08:19:24.849301 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19da9663-9e98-41f0-a737-0c2683293496-catalog-content\") pod \"community-operators-dg9cf\" (UID: 
\"19da9663-9e98-41f0-a737-0c2683293496\") " pod="openshift-marketplace/community-operators-dg9cf" Dec 01 08:19:24 crc kubenswrapper[5004]: I1201 08:19:24.849417 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19da9663-9e98-41f0-a737-0c2683293496-utilities\") pod \"community-operators-dg9cf\" (UID: \"19da9663-9e98-41f0-a737-0c2683293496\") " pod="openshift-marketplace/community-operators-dg9cf" Dec 01 08:19:24 crc kubenswrapper[5004]: E1201 08:19:24.849553 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 08:19:25.34953744 +0000 UTC m=+142.914529422 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 08:19:24 crc kubenswrapper[5004]: I1201 08:19:24.937745 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-dqzk6"] Dec 01 08:19:24 crc kubenswrapper[5004]: I1201 08:19:24.945864 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dqzk6" Dec 01 08:19:24 crc kubenswrapper[5004]: I1201 08:19:24.952199 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19da9663-9e98-41f0-a737-0c2683293496-utilities\") pod \"community-operators-dg9cf\" (UID: \"19da9663-9e98-41f0-a737-0c2683293496\") " pod="openshift-marketplace/community-operators-dg9cf" Dec 01 08:19:24 crc kubenswrapper[5004]: I1201 08:19:24.952417 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8jr7l\" (UID: \"9c645213-a3fd-4f35-9edd-60905873a559\") " pod="openshift-image-registry/image-registry-697d97f7c8-8jr7l" Dec 01 08:19:24 crc kubenswrapper[5004]: I1201 08:19:24.952509 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lsshm\" (UniqueName: \"kubernetes.io/projected/19da9663-9e98-41f0-a737-0c2683293496-kube-api-access-lsshm\") pod \"community-operators-dg9cf\" (UID: \"19da9663-9e98-41f0-a737-0c2683293496\") " pod="openshift-marketplace/community-operators-dg9cf" Dec 01 08:19:24 crc kubenswrapper[5004]: I1201 08:19:24.952616 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19da9663-9e98-41f0-a737-0c2683293496-catalog-content\") pod \"community-operators-dg9cf\" (UID: \"19da9663-9e98-41f0-a737-0c2683293496\") " pod="openshift-marketplace/community-operators-dg9cf" Dec 01 08:19:24 crc kubenswrapper[5004]: I1201 08:19:24.953082 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19da9663-9e98-41f0-a737-0c2683293496-catalog-content\") pod 
\"community-operators-dg9cf\" (UID: \"19da9663-9e98-41f0-a737-0c2683293496\") " pod="openshift-marketplace/community-operators-dg9cf" Dec 01 08:19:24 crc kubenswrapper[5004]: I1201 08:19:24.953532 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19da9663-9e98-41f0-a737-0c2683293496-utilities\") pod \"community-operators-dg9cf\" (UID: \"19da9663-9e98-41f0-a737-0c2683293496\") " pod="openshift-marketplace/community-operators-dg9cf" Dec 01 08:19:24 crc kubenswrapper[5004]: E1201 08:19:24.953863 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 08:19:25.453852002 +0000 UTC m=+143.018843984 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8jr7l" (UID: "9c645213-a3fd-4f35-9edd-60905873a559") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 08:19:24 crc kubenswrapper[5004]: I1201 08:19:24.974814 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dqzk6"] Dec 01 08:19:25 crc kubenswrapper[5004]: I1201 08:19:25.007041 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lsshm\" (UniqueName: \"kubernetes.io/projected/19da9663-9e98-41f0-a737-0c2683293496-kube-api-access-lsshm\") pod \"community-operators-dg9cf\" (UID: \"19da9663-9e98-41f0-a737-0c2683293496\") " pod="openshift-marketplace/community-operators-dg9cf" Dec 01 08:19:25 crc kubenswrapper[5004]: I1201 08:19:25.040109 5004 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dg9cf" Dec 01 08:19:25 crc kubenswrapper[5004]: I1201 08:19:25.047853 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-pt82b" event={"ID":"ecf238f4-a4b7-45ab-8d1f-ff20327f375c","Type":"ContainerStarted","Data":"ea22ed88bde8abc1f138fcb8eae2a3b7f9c4977a3a21bfbce2362876204bbb1d"} Dec 01 08:19:25 crc kubenswrapper[5004]: I1201 08:19:25.062089 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 08:19:25 crc kubenswrapper[5004]: I1201 08:19:25.062233 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mvf9\" (UniqueName: \"kubernetes.io/projected/494e6c31-3cc3-45a4-b45c-8f5ffb4251fa-kube-api-access-9mvf9\") pod \"certified-operators-dqzk6\" (UID: \"494e6c31-3cc3-45a4-b45c-8f5ffb4251fa\") " pod="openshift-marketplace/certified-operators-dqzk6" Dec 01 08:19:25 crc kubenswrapper[5004]: I1201 08:19:25.062279 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/494e6c31-3cc3-45a4-b45c-8f5ffb4251fa-catalog-content\") pod \"certified-operators-dqzk6\" (UID: \"494e6c31-3cc3-45a4-b45c-8f5ffb4251fa\") " pod="openshift-marketplace/certified-operators-dqzk6" Dec 01 08:19:25 crc kubenswrapper[5004]: I1201 08:19:25.062307 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/494e6c31-3cc3-45a4-b45c-8f5ffb4251fa-utilities\") pod \"certified-operators-dqzk6\" (UID: 
\"494e6c31-3cc3-45a4-b45c-8f5ffb4251fa\") " pod="openshift-marketplace/certified-operators-dqzk6" Dec 01 08:19:25 crc kubenswrapper[5004]: E1201 08:19:25.062525 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 08:19:25.562511359 +0000 UTC m=+143.127503341 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 08:19:25 crc kubenswrapper[5004]: I1201 08:19:25.065395 5004 generic.go:334] "Generic (PLEG): container finished" podID="4f397145-18ab-4b43-b133-cc42f45bc852" containerID="88637e7089d85bcdeb4f8e3236cd4a87fe28fff0b21507a073034b85e41e17b7" exitCode=0 Dec 01 08:19:25 crc kubenswrapper[5004]: I1201 08:19:25.065536 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409615-5d85l" event={"ID":"4f397145-18ab-4b43-b133-cc42f45bc852","Type":"ContainerDied","Data":"88637e7089d85bcdeb4f8e3236cd4a87fe28fff0b21507a073034b85e41e17b7"} Dec 01 08:19:25 crc kubenswrapper[5004]: I1201 08:19:25.080365 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mrj44" event={"ID":"beba9fed-710a-49a6-96ce-951ecb0a4a74","Type":"ContainerStarted","Data":"889ea19584139f3a5da34fc40b4e8541c3cd4591b0bc3a9039f41b02ac084022"} Dec 01 08:19:25 crc kubenswrapper[5004]: I1201 08:19:25.083005 5004 patch_prober.go:28] interesting 
pod/marketplace-operator-79b997595-8g8bf container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.32:8080/healthz\": dial tcp 10.217.0.32:8080: connect: connection refused" start-of-body= Dec 01 08:19:25 crc kubenswrapper[5004]: I1201 08:19:25.083056 5004 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-8g8bf" podUID="8a9f98dc-e84b-4fb8-9d4d-69c766486ebb" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.32:8080/healthz\": dial tcp 10.217.0.32:8080: connect: connection refused" Dec 01 08:19:25 crc kubenswrapper[5004]: I1201 08:19:25.084055 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rsfff" Dec 01 08:19:25 crc kubenswrapper[5004]: I1201 08:19:25.111630 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-p77t7" Dec 01 08:19:25 crc kubenswrapper[5004]: I1201 08:19:25.112436 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-mgs69"] Dec 01 08:19:25 crc kubenswrapper[5004]: I1201 08:19:25.113332 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mgs69" Dec 01 08:19:25 crc kubenswrapper[5004]: I1201 08:19:25.122806 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jd8z9" Dec 01 08:19:25 crc kubenswrapper[5004]: I1201 08:19:25.163843 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9mvf9\" (UniqueName: \"kubernetes.io/projected/494e6c31-3cc3-45a4-b45c-8f5ffb4251fa-kube-api-access-9mvf9\") pod \"certified-operators-dqzk6\" (UID: \"494e6c31-3cc3-45a4-b45c-8f5ffb4251fa\") " pod="openshift-marketplace/certified-operators-dqzk6" Dec 01 08:19:25 crc kubenswrapper[5004]: I1201 08:19:25.164120 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/494e6c31-3cc3-45a4-b45c-8f5ffb4251fa-catalog-content\") pod \"certified-operators-dqzk6\" (UID: \"494e6c31-3cc3-45a4-b45c-8f5ffb4251fa\") " pod="openshift-marketplace/certified-operators-dqzk6" Dec 01 08:19:25 crc kubenswrapper[5004]: I1201 08:19:25.164231 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/494e6c31-3cc3-45a4-b45c-8f5ffb4251fa-utilities\") pod \"certified-operators-dqzk6\" (UID: \"494e6c31-3cc3-45a4-b45c-8f5ffb4251fa\") " pod="openshift-marketplace/certified-operators-dqzk6" Dec 01 08:19:25 crc kubenswrapper[5004]: I1201 08:19:25.164386 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8jr7l\" (UID: \"9c645213-a3fd-4f35-9edd-60905873a559\") " pod="openshift-image-registry/image-registry-697d97f7c8-8jr7l" Dec 01 08:19:25 crc kubenswrapper[5004]: I1201 08:19:25.167285 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/494e6c31-3cc3-45a4-b45c-8f5ffb4251fa-utilities\") pod \"certified-operators-dqzk6\" 
(UID: \"494e6c31-3cc3-45a4-b45c-8f5ffb4251fa\") " pod="openshift-marketplace/certified-operators-dqzk6" Dec 01 08:19:25 crc kubenswrapper[5004]: I1201 08:19:25.170066 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/494e6c31-3cc3-45a4-b45c-8f5ffb4251fa-catalog-content\") pod \"certified-operators-dqzk6\" (UID: \"494e6c31-3cc3-45a4-b45c-8f5ffb4251fa\") " pod="openshift-marketplace/certified-operators-dqzk6" Dec 01 08:19:25 crc kubenswrapper[5004]: E1201 08:19:25.171955 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 08:19:25.671939375 +0000 UTC m=+143.236931357 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8jr7l" (UID: "9c645213-a3fd-4f35-9edd-60905873a559") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 08:19:25 crc kubenswrapper[5004]: I1201 08:19:25.190798 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mrj44" podStartSLOduration=122.190779696 podStartE2EDuration="2m2.190779696s" podCreationTimestamp="2025-12-01 08:17:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:19:25.174487342 +0000 UTC m=+142.739479324" watchObservedRunningTime="2025-12-01 08:19:25.190779696 +0000 UTC m=+142.755771668" Dec 01 08:19:25 crc kubenswrapper[5004]: I1201 08:19:25.197054 5004 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-9mvf9\" (UniqueName: \"kubernetes.io/projected/494e6c31-3cc3-45a4-b45c-8f5ffb4251fa-kube-api-access-9mvf9\") pod \"certified-operators-dqzk6\" (UID: \"494e6c31-3cc3-45a4-b45c-8f5ffb4251fa\") " pod="openshift-marketplace/certified-operators-dqzk6" Dec 01 08:19:25 crc kubenswrapper[5004]: I1201 08:19:25.252974 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mgs69"] Dec 01 08:19:25 crc kubenswrapper[5004]: I1201 08:19:25.269294 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 08:19:25 crc kubenswrapper[5004]: I1201 08:19:25.269493 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rn7rw\" (UniqueName: \"kubernetes.io/projected/9ac4710f-ad3b-471c-861d-622e995871cd-kube-api-access-rn7rw\") pod \"community-operators-mgs69\" (UID: \"9ac4710f-ad3b-471c-861d-622e995871cd\") " pod="openshift-marketplace/community-operators-mgs69" Dec 01 08:19:25 crc kubenswrapper[5004]: I1201 08:19:25.269530 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ac4710f-ad3b-471c-861d-622e995871cd-catalog-content\") pod \"community-operators-mgs69\" (UID: \"9ac4710f-ad3b-471c-861d-622e995871cd\") " pod="openshift-marketplace/community-operators-mgs69" Dec 01 08:19:25 crc kubenswrapper[5004]: I1201 08:19:25.269615 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ac4710f-ad3b-471c-861d-622e995871cd-utilities\") pod 
\"community-operators-mgs69\" (UID: \"9ac4710f-ad3b-471c-861d-622e995871cd\") " pod="openshift-marketplace/community-operators-mgs69" Dec 01 08:19:25 crc kubenswrapper[5004]: E1201 08:19:25.269746 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 08:19:25.769723168 +0000 UTC m=+143.334715150 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 08:19:25 crc kubenswrapper[5004]: I1201 08:19:25.318866 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dqzk6" Dec 01 08:19:25 crc kubenswrapper[5004]: I1201 08:19:25.372788 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ac4710f-ad3b-471c-861d-622e995871cd-utilities\") pod \"community-operators-mgs69\" (UID: \"9ac4710f-ad3b-471c-861d-622e995871cd\") " pod="openshift-marketplace/community-operators-mgs69" Dec 01 08:19:25 crc kubenswrapper[5004]: I1201 08:19:25.372858 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rn7rw\" (UniqueName: \"kubernetes.io/projected/9ac4710f-ad3b-471c-861d-622e995871cd-kube-api-access-rn7rw\") pod \"community-operators-mgs69\" (UID: \"9ac4710f-ad3b-471c-861d-622e995871cd\") " pod="openshift-marketplace/community-operators-mgs69" Dec 01 08:19:25 crc kubenswrapper[5004]: I1201 08:19:25.372890 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ac4710f-ad3b-471c-861d-622e995871cd-catalog-content\") pod \"community-operators-mgs69\" (UID: \"9ac4710f-ad3b-471c-861d-622e995871cd\") " pod="openshift-marketplace/community-operators-mgs69" Dec 01 08:19:25 crc kubenswrapper[5004]: I1201 08:19:25.372915 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8jr7l\" (UID: \"9c645213-a3fd-4f35-9edd-60905873a559\") " pod="openshift-image-registry/image-registry-697d97f7c8-8jr7l" Dec 01 08:19:25 crc kubenswrapper[5004]: E1201 08:19:25.373176 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-01 08:19:25.873164978 +0000 UTC m=+143.438156950 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8jr7l" (UID: "9c645213-a3fd-4f35-9edd-60905873a559") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 08:19:25 crc kubenswrapper[5004]: I1201 08:19:25.373641 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ac4710f-ad3b-471c-861d-622e995871cd-utilities\") pod \"community-operators-mgs69\" (UID: \"9ac4710f-ad3b-471c-861d-622e995871cd\") " pod="openshift-marketplace/community-operators-mgs69" Dec 01 08:19:25 crc kubenswrapper[5004]: I1201 08:19:25.374040 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ac4710f-ad3b-471c-861d-622e995871cd-catalog-content\") pod \"community-operators-mgs69\" (UID: \"9ac4710f-ad3b-471c-861d-622e995871cd\") " pod="openshift-marketplace/community-operators-mgs69" Dec 01 08:19:25 crc kubenswrapper[5004]: I1201 08:19:25.385008 5004 patch_prober.go:28] interesting pod/router-default-5444994796-6x76r container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 08:19:25 crc kubenswrapper[5004]: [-]has-synced failed: reason withheld Dec 01 08:19:25 crc kubenswrapper[5004]: [+]process-running ok Dec 01 08:19:25 crc kubenswrapper[5004]: healthz check failed Dec 01 08:19:25 crc kubenswrapper[5004]: I1201 08:19:25.385268 5004 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-5444994796-6x76r" podUID="b3ef8088-fc82-4ce3-9c1c-662e380e0587" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 08:19:25 crc kubenswrapper[5004]: I1201 08:19:25.418401 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rn7rw\" (UniqueName: \"kubernetes.io/projected/9ac4710f-ad3b-471c-861d-622e995871cd-kube-api-access-rn7rw\") pod \"community-operators-mgs69\" (UID: \"9ac4710f-ad3b-471c-861d-622e995871cd\") " pod="openshift-marketplace/community-operators-mgs69" Dec 01 08:19:25 crc kubenswrapper[5004]: I1201 08:19:25.446816 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mgs69" Dec 01 08:19:25 crc kubenswrapper[5004]: I1201 08:19:25.473521 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 08:19:25 crc kubenswrapper[5004]: E1201 08:19:25.474723 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 08:19:25.974707558 +0000 UTC m=+143.539699530 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 08:19:25 crc kubenswrapper[5004]: I1201 08:19:25.501505 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dg9cf"] Dec 01 08:19:25 crc kubenswrapper[5004]: I1201 08:19:25.611050 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8jr7l\" (UID: \"9c645213-a3fd-4f35-9edd-60905873a559\") " pod="openshift-image-registry/image-registry-697d97f7c8-8jr7l" Dec 01 08:19:25 crc kubenswrapper[5004]: E1201 08:19:25.656546 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 08:19:26.156528244 +0000 UTC m=+143.721520226 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8jr7l" (UID: "9c645213-a3fd-4f35-9edd-60905873a559") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 08:19:25 crc kubenswrapper[5004]: I1201 08:19:25.675327 5004 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Dec 01 08:19:25 crc kubenswrapper[5004]: I1201 08:19:25.716201 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 08:19:25 crc kubenswrapper[5004]: E1201 08:19:25.716619 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 08:19:26.216547231 +0000 UTC m=+143.781539213 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 08:19:25 crc kubenswrapper[5004]: I1201 08:19:25.780596 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jd8z9"] Dec 01 08:19:25 crc kubenswrapper[5004]: I1201 08:19:25.817271 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8jr7l\" (UID: \"9c645213-a3fd-4f35-9edd-60905873a559\") " pod="openshift-image-registry/image-registry-697d97f7c8-8jr7l" Dec 01 08:19:25 crc kubenswrapper[5004]: E1201 08:19:25.817719 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 08:19:26.317705851 +0000 UTC m=+143.882697833 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8jr7l" (UID: "9c645213-a3fd-4f35-9edd-60905873a559") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 08:19:25 crc kubenswrapper[5004]: I1201 08:19:25.924162 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 08:19:25 crc kubenswrapper[5004]: E1201 08:19:25.924534 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 08:19:26.424520269 +0000 UTC m=+143.989512251 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 08:19:25 crc kubenswrapper[5004]: I1201 08:19:25.960675 5004 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-12-01T08:19:25.675347805Z","Handler":null,"Name":""} Dec 01 08:19:26 crc kubenswrapper[5004]: I1201 08:19:26.019398 5004 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Dec 01 08:19:26 crc kubenswrapper[5004]: I1201 08:19:26.019452 5004 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Dec 01 08:19:26 crc kubenswrapper[5004]: I1201 08:19:26.031526 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8jr7l\" (UID: \"9c645213-a3fd-4f35-9edd-60905873a559\") " pod="openshift-image-registry/image-registry-697d97f7c8-8jr7l" Dec 01 08:19:26 crc kubenswrapper[5004]: I1201 08:19:26.052698 5004 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 01 08:19:26 crc kubenswrapper[5004]: I1201 08:19:26.052929 5004 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8jr7l\" (UID: \"9c645213-a3fd-4f35-9edd-60905873a559\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-8jr7l" Dec 01 08:19:26 crc kubenswrapper[5004]: I1201 08:19:26.066389 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-p9c2d" Dec 01 08:19:26 crc kubenswrapper[5004]: I1201 08:19:26.115547 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mgs69"] Dec 01 08:19:26 crc kubenswrapper[5004]: I1201 08:19:26.122263 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-pt82b" event={"ID":"ecf238f4-a4b7-45ab-8d1f-ff20327f375c","Type":"ContainerStarted","Data":"d1ca9865ef18585d7182776cfbfde4c7f2376edecf23c6a8d247bb23f8527644"} Dec 01 08:19:26 crc kubenswrapper[5004]: I1201 08:19:26.122316 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-pt82b" event={"ID":"ecf238f4-a4b7-45ab-8d1f-ff20327f375c","Type":"ContainerStarted","Data":"4feebff4f9148d38d2170bdae3bb1688c8fa7d1e054400f19fe57134d60b1c52"} Dec 01 08:19:26 crc kubenswrapper[5004]: I1201 08:19:26.124728 5004 generic.go:334] "Generic (PLEG): container finished" podID="19da9663-9e98-41f0-a737-0c2683293496" containerID="738c7e32c65dc8a55331b1cefb9f74180dbacc98c66732e1b36cccf0c10debf9" exitCode=0 Dec 01 08:19:26 crc kubenswrapper[5004]: I1201 08:19:26.125373 5004 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dg9cf" event={"ID":"19da9663-9e98-41f0-a737-0c2683293496","Type":"ContainerDied","Data":"738c7e32c65dc8a55331b1cefb9f74180dbacc98c66732e1b36cccf0c10debf9"} Dec 01 08:19:26 crc kubenswrapper[5004]: I1201 08:19:26.125399 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dg9cf" event={"ID":"19da9663-9e98-41f0-a737-0c2683293496","Type":"ContainerStarted","Data":"41d9e939e0d909f1a131ac2fea6303ecc0a0a33f3f5527a76a276e35446ba44a"} Dec 01 08:19:26 crc kubenswrapper[5004]: I1201 08:19:26.129973 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dqzk6"] Dec 01 08:19:26 crc kubenswrapper[5004]: I1201 08:19:26.131661 5004 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 01 08:19:26 crc kubenswrapper[5004]: I1201 08:19:26.135427 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jd8z9" event={"ID":"56a0aeb3-484c-4e7e-9ff7-b3bc75f253f4","Type":"ContainerStarted","Data":"ef3d20796b6bb7f0a3f5208286e3b40defefb1b479f815014fd30621a7e76f81"} Dec 01 08:19:26 crc kubenswrapper[5004]: I1201 08:19:26.135490 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jd8z9" event={"ID":"56a0aeb3-484c-4e7e-9ff7-b3bc75f253f4","Type":"ContainerStarted","Data":"1eba94626f192621bf1d375428f6d4084dd29d1d1b4693fd4f7efbbb5ebf7e26"} Dec 01 08:19:26 crc kubenswrapper[5004]: I1201 08:19:26.175671 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-pt82b" podStartSLOduration=9.175653265 podStartE2EDuration="9.175653265s" podCreationTimestamp="2025-12-01 08:19:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 
08:19:26.149645706 +0000 UTC m=+143.714637688" watchObservedRunningTime="2025-12-01 08:19:26.175653265 +0000 UTC m=+143.740645247" Dec 01 08:19:26 crc kubenswrapper[5004]: I1201 08:19:26.242227 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8jr7l\" (UID: \"9c645213-a3fd-4f35-9edd-60905873a559\") " pod="openshift-image-registry/image-registry-697d97f7c8-8jr7l" Dec 01 08:19:26 crc kubenswrapper[5004]: I1201 08:19:26.316396 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-8jr7l" Dec 01 08:19:26 crc kubenswrapper[5004]: I1201 08:19:26.340071 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 08:19:26 crc kubenswrapper[5004]: I1201 08:19:26.365971 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 01 08:19:26 crc kubenswrapper[5004]: I1201 08:19:26.394748 5004 patch_prober.go:28] interesting pod/router-default-5444994796-6x76r container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 08:19:26 crc kubenswrapper[5004]: [-]has-synced failed: reason withheld Dec 01 08:19:26 crc kubenswrapper[5004]: [+]process-running ok Dec 01 08:19:26 crc kubenswrapper[5004]: healthz check failed Dec 01 08:19:26 crc kubenswrapper[5004]: I1201 08:19:26.394801 5004 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6x76r" podUID="b3ef8088-fc82-4ce3-9c1c-662e380e0587" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 08:19:26 crc kubenswrapper[5004]: I1201 08:19:26.503014 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409615-5d85l" Dec 01 08:19:26 crc kubenswrapper[5004]: I1201 08:19:26.646695 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4f397145-18ab-4b43-b133-cc42f45bc852-secret-volume\") pod \"4f397145-18ab-4b43-b133-cc42f45bc852\" (UID: \"4f397145-18ab-4b43-b133-cc42f45bc852\") " Dec 01 08:19:26 crc kubenswrapper[5004]: I1201 08:19:26.646789 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4f397145-18ab-4b43-b133-cc42f45bc852-config-volume\") pod \"4f397145-18ab-4b43-b133-cc42f45bc852\" (UID: \"4f397145-18ab-4b43-b133-cc42f45bc852\") " Dec 01 08:19:26 crc kubenswrapper[5004]: I1201 08:19:26.646826 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l7bhz\" (UniqueName: 
\"kubernetes.io/projected/4f397145-18ab-4b43-b133-cc42f45bc852-kube-api-access-l7bhz\") pod \"4f397145-18ab-4b43-b133-cc42f45bc852\" (UID: \"4f397145-18ab-4b43-b133-cc42f45bc852\") " Dec 01 08:19:26 crc kubenswrapper[5004]: I1201 08:19:26.647233 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f397145-18ab-4b43-b133-cc42f45bc852-config-volume" (OuterVolumeSpecName: "config-volume") pod "4f397145-18ab-4b43-b133-cc42f45bc852" (UID: "4f397145-18ab-4b43-b133-cc42f45bc852"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:19:26 crc kubenswrapper[5004]: I1201 08:19:26.652375 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f397145-18ab-4b43-b133-cc42f45bc852-kube-api-access-l7bhz" (OuterVolumeSpecName: "kube-api-access-l7bhz") pod "4f397145-18ab-4b43-b133-cc42f45bc852" (UID: "4f397145-18ab-4b43-b133-cc42f45bc852"). InnerVolumeSpecName "kube-api-access-l7bhz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:19:26 crc kubenswrapper[5004]: I1201 08:19:26.652880 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f397145-18ab-4b43-b133-cc42f45bc852-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "4f397145-18ab-4b43-b133-cc42f45bc852" (UID: "4f397145-18ab-4b43-b133-cc42f45bc852"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:19:26 crc kubenswrapper[5004]: I1201 08:19:26.672800 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-8jr7l"] Dec 01 08:19:26 crc kubenswrapper[5004]: I1201 08:19:26.690370 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-pl87w"] Dec 01 08:19:26 crc kubenswrapper[5004]: E1201 08:19:26.690568 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f397145-18ab-4b43-b133-cc42f45bc852" containerName="collect-profiles" Dec 01 08:19:26 crc kubenswrapper[5004]: I1201 08:19:26.690579 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f397145-18ab-4b43-b133-cc42f45bc852" containerName="collect-profiles" Dec 01 08:19:26 crc kubenswrapper[5004]: I1201 08:19:26.690676 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f397145-18ab-4b43-b133-cc42f45bc852" containerName="collect-profiles" Dec 01 08:19:26 crc kubenswrapper[5004]: I1201 08:19:26.691301 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pl87w" Dec 01 08:19:26 crc kubenswrapper[5004]: I1201 08:19:26.693395 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 01 08:19:26 crc kubenswrapper[5004]: I1201 08:19:26.703851 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pl87w"] Dec 01 08:19:26 crc kubenswrapper[5004]: I1201 08:19:26.749374 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0286c29-4a56-4e46-8820-21dfbc658c86-utilities\") pod \"redhat-marketplace-pl87w\" (UID: \"b0286c29-4a56-4e46-8820-21dfbc658c86\") " pod="openshift-marketplace/redhat-marketplace-pl87w" Dec 01 08:19:26 crc kubenswrapper[5004]: I1201 08:19:26.749436 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fng7c\" (UniqueName: \"kubernetes.io/projected/b0286c29-4a56-4e46-8820-21dfbc658c86-kube-api-access-fng7c\") pod \"redhat-marketplace-pl87w\" (UID: \"b0286c29-4a56-4e46-8820-21dfbc658c86\") " pod="openshift-marketplace/redhat-marketplace-pl87w" Dec 01 08:19:26 crc kubenswrapper[5004]: I1201 08:19:26.749553 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0286c29-4a56-4e46-8820-21dfbc658c86-catalog-content\") pod \"redhat-marketplace-pl87w\" (UID: \"b0286c29-4a56-4e46-8820-21dfbc658c86\") " pod="openshift-marketplace/redhat-marketplace-pl87w" Dec 01 08:19:26 crc kubenswrapper[5004]: I1201 08:19:26.749676 5004 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4f397145-18ab-4b43-b133-cc42f45bc852-config-volume\") on node \"crc\" DevicePath \"\"" Dec 01 08:19:26 crc kubenswrapper[5004]: I1201 
08:19:26.749696 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l7bhz\" (UniqueName: \"kubernetes.io/projected/4f397145-18ab-4b43-b133-cc42f45bc852-kube-api-access-l7bhz\") on node \"crc\" DevicePath \"\"" Dec 01 08:19:26 crc kubenswrapper[5004]: I1201 08:19:26.749707 5004 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4f397145-18ab-4b43-b133-cc42f45bc852-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 01 08:19:26 crc kubenswrapper[5004]: I1201 08:19:26.764381 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Dec 01 08:19:26 crc kubenswrapper[5004]: I1201 08:19:26.807097 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" Dec 01 08:19:26 crc kubenswrapper[5004]: I1201 08:19:26.851936 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fng7c\" (UniqueName: \"kubernetes.io/projected/b0286c29-4a56-4e46-8820-21dfbc658c86-kube-api-access-fng7c\") pod \"redhat-marketplace-pl87w\" (UID: \"b0286c29-4a56-4e46-8820-21dfbc658c86\") " pod="openshift-marketplace/redhat-marketplace-pl87w" Dec 01 08:19:26 crc kubenswrapper[5004]: I1201 08:19:26.852189 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0286c29-4a56-4e46-8820-21dfbc658c86-catalog-content\") pod \"redhat-marketplace-pl87w\" (UID: \"b0286c29-4a56-4e46-8820-21dfbc658c86\") " pod="openshift-marketplace/redhat-marketplace-pl87w" Dec 01 08:19:26 crc kubenswrapper[5004]: I1201 08:19:26.852255 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0286c29-4a56-4e46-8820-21dfbc658c86-utilities\") pod 
\"redhat-marketplace-pl87w\" (UID: \"b0286c29-4a56-4e46-8820-21dfbc658c86\") " pod="openshift-marketplace/redhat-marketplace-pl87w" Dec 01 08:19:26 crc kubenswrapper[5004]: I1201 08:19:26.852740 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0286c29-4a56-4e46-8820-21dfbc658c86-catalog-content\") pod \"redhat-marketplace-pl87w\" (UID: \"b0286c29-4a56-4e46-8820-21dfbc658c86\") " pod="openshift-marketplace/redhat-marketplace-pl87w" Dec 01 08:19:26 crc kubenswrapper[5004]: I1201 08:19:26.852780 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0286c29-4a56-4e46-8820-21dfbc658c86-utilities\") pod \"redhat-marketplace-pl87w\" (UID: \"b0286c29-4a56-4e46-8820-21dfbc658c86\") " pod="openshift-marketplace/redhat-marketplace-pl87w" Dec 01 08:19:26 crc kubenswrapper[5004]: I1201 08:19:26.867667 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fng7c\" (UniqueName: \"kubernetes.io/projected/b0286c29-4a56-4e46-8820-21dfbc658c86-kube-api-access-fng7c\") pod \"redhat-marketplace-pl87w\" (UID: \"b0286c29-4a56-4e46-8820-21dfbc658c86\") " pod="openshift-marketplace/redhat-marketplace-pl87w" Dec 01 08:19:27 crc kubenswrapper[5004]: I1201 08:19:27.022286 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pl87w" Dec 01 08:19:27 crc kubenswrapper[5004]: I1201 08:19:27.093420 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-6fl5f"] Dec 01 08:19:27 crc kubenswrapper[5004]: I1201 08:19:27.094630 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6fl5f" Dec 01 08:19:27 crc kubenswrapper[5004]: I1201 08:19:27.099634 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6fl5f"] Dec 01 08:19:27 crc kubenswrapper[5004]: I1201 08:19:27.150135 5004 generic.go:334] "Generic (PLEG): container finished" podID="9ac4710f-ad3b-471c-861d-622e995871cd" containerID="0c041dd9cc98ed40a4edf455b208a49e72d4b8ca2d88ebb1aec12f9560fa8640" exitCode=0 Dec 01 08:19:27 crc kubenswrapper[5004]: I1201 08:19:27.150352 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mgs69" event={"ID":"9ac4710f-ad3b-471c-861d-622e995871cd","Type":"ContainerDied","Data":"0c041dd9cc98ed40a4edf455b208a49e72d4b8ca2d88ebb1aec12f9560fa8640"} Dec 01 08:19:27 crc kubenswrapper[5004]: I1201 08:19:27.150392 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mgs69" event={"ID":"9ac4710f-ad3b-471c-861d-622e995871cd","Type":"ContainerStarted","Data":"8fd7fd4f0691c759ff143a52dc6f3242fa255b64d63bcf8200d30713d3b2a58b"} Dec 01 08:19:27 crc kubenswrapper[5004]: I1201 08:19:27.155075 5004 generic.go:334] "Generic (PLEG): container finished" podID="56a0aeb3-484c-4e7e-9ff7-b3bc75f253f4" containerID="ef3d20796b6bb7f0a3f5208286e3b40defefb1b479f815014fd30621a7e76f81" exitCode=0 Dec 01 08:19:27 crc kubenswrapper[5004]: I1201 08:19:27.155148 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jd8z9" event={"ID":"56a0aeb3-484c-4e7e-9ff7-b3bc75f253f4","Type":"ContainerDied","Data":"ef3d20796b6bb7f0a3f5208286e3b40defefb1b479f815014fd30621a7e76f81"} Dec 01 08:19:27 crc kubenswrapper[5004]: I1201 08:19:27.157142 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5dss\" (UniqueName: 
\"kubernetes.io/projected/4d7ad504-8dea-4c9c-9a5a-682d56793c9b-kube-api-access-x5dss\") pod \"redhat-marketplace-6fl5f\" (UID: \"4d7ad504-8dea-4c9c-9a5a-682d56793c9b\") " pod="openshift-marketplace/redhat-marketplace-6fl5f" Dec 01 08:19:27 crc kubenswrapper[5004]: I1201 08:19:27.157212 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d7ad504-8dea-4c9c-9a5a-682d56793c9b-catalog-content\") pod \"redhat-marketplace-6fl5f\" (UID: \"4d7ad504-8dea-4c9c-9a5a-682d56793c9b\") " pod="openshift-marketplace/redhat-marketplace-6fl5f" Dec 01 08:19:27 crc kubenswrapper[5004]: I1201 08:19:27.157252 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d7ad504-8dea-4c9c-9a5a-682d56793c9b-utilities\") pod \"redhat-marketplace-6fl5f\" (UID: \"4d7ad504-8dea-4c9c-9a5a-682d56793c9b\") " pod="openshift-marketplace/redhat-marketplace-6fl5f" Dec 01 08:19:27 crc kubenswrapper[5004]: I1201 08:19:27.158389 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409615-5d85l" event={"ID":"4f397145-18ab-4b43-b133-cc42f45bc852","Type":"ContainerDied","Data":"939dca2e20736a45d95fbcc9cb9b71970bef62e36c1d69c80200c9cdaf940f2a"} Dec 01 08:19:27 crc kubenswrapper[5004]: I1201 08:19:27.158434 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="939dca2e20736a45d95fbcc9cb9b71970bef62e36c1d69c80200c9cdaf940f2a" Dec 01 08:19:27 crc kubenswrapper[5004]: I1201 08:19:27.158533 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409615-5d85l" Dec 01 08:19:27 crc kubenswrapper[5004]: I1201 08:19:27.169076 5004 generic.go:334] "Generic (PLEG): container finished" podID="494e6c31-3cc3-45a4-b45c-8f5ffb4251fa" containerID="3e4cb69daf5619b550dcd20e5c8f317a0838f7c101dc4aa258321fdbf713ceb0" exitCode=0 Dec 01 08:19:27 crc kubenswrapper[5004]: I1201 08:19:27.169151 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dqzk6" event={"ID":"494e6c31-3cc3-45a4-b45c-8f5ffb4251fa","Type":"ContainerDied","Data":"3e4cb69daf5619b550dcd20e5c8f317a0838f7c101dc4aa258321fdbf713ceb0"} Dec 01 08:19:27 crc kubenswrapper[5004]: I1201 08:19:27.169181 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dqzk6" event={"ID":"494e6c31-3cc3-45a4-b45c-8f5ffb4251fa","Type":"ContainerStarted","Data":"080fbd3ee83e7487abed4046e69ff255161759096254669ee15564cfb7cd6101"} Dec 01 08:19:27 crc kubenswrapper[5004]: I1201 08:19:27.174960 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-8jr7l" event={"ID":"9c645213-a3fd-4f35-9edd-60905873a559","Type":"ContainerStarted","Data":"e54f7026de6305decc27f119174eb4e406e62b4b6ac16f898f07825ff5ef24ce"} Dec 01 08:19:27 crc kubenswrapper[5004]: I1201 08:19:27.174992 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-8jr7l" Dec 01 08:19:27 crc kubenswrapper[5004]: I1201 08:19:27.175003 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-8jr7l" event={"ID":"9c645213-a3fd-4f35-9edd-60905873a559","Type":"ContainerStarted","Data":"122870c1f8f7094cb7442e50daf493b4d8e00b41dbe2d955d8ad6493dce19c96"} Dec 01 08:19:27 crc kubenswrapper[5004]: I1201 08:19:27.241104 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-image-registry/image-registry-697d97f7c8-8jr7l" podStartSLOduration=125.241083266 podStartE2EDuration="2m5.241083266s" podCreationTimestamp="2025-12-01 08:17:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:19:27.213480856 +0000 UTC m=+144.778472838" watchObservedRunningTime="2025-12-01 08:19:27.241083266 +0000 UTC m=+144.806075248" Dec 01 08:19:27 crc kubenswrapper[5004]: I1201 08:19:27.244267 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pl87w"] Dec 01 08:19:27 crc kubenswrapper[5004]: I1201 08:19:27.257882 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x5dss\" (UniqueName: \"kubernetes.io/projected/4d7ad504-8dea-4c9c-9a5a-682d56793c9b-kube-api-access-x5dss\") pod \"redhat-marketplace-6fl5f\" (UID: \"4d7ad504-8dea-4c9c-9a5a-682d56793c9b\") " pod="openshift-marketplace/redhat-marketplace-6fl5f" Dec 01 08:19:27 crc kubenswrapper[5004]: I1201 08:19:27.259305 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d7ad504-8dea-4c9c-9a5a-682d56793c9b-catalog-content\") pod \"redhat-marketplace-6fl5f\" (UID: \"4d7ad504-8dea-4c9c-9a5a-682d56793c9b\") " pod="openshift-marketplace/redhat-marketplace-6fl5f" Dec 01 08:19:27 crc kubenswrapper[5004]: I1201 08:19:27.259372 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d7ad504-8dea-4c9c-9a5a-682d56793c9b-utilities\") pod \"redhat-marketplace-6fl5f\" (UID: \"4d7ad504-8dea-4c9c-9a5a-682d56793c9b\") " pod="openshift-marketplace/redhat-marketplace-6fl5f" Dec 01 08:19:27 crc kubenswrapper[5004]: I1201 08:19:27.261761 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/4d7ad504-8dea-4c9c-9a5a-682d56793c9b-utilities\") pod \"redhat-marketplace-6fl5f\" (UID: \"4d7ad504-8dea-4c9c-9a5a-682d56793c9b\") " pod="openshift-marketplace/redhat-marketplace-6fl5f" Dec 01 08:19:27 crc kubenswrapper[5004]: I1201 08:19:27.262174 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d7ad504-8dea-4c9c-9a5a-682d56793c9b-catalog-content\") pod \"redhat-marketplace-6fl5f\" (UID: \"4d7ad504-8dea-4c9c-9a5a-682d56793c9b\") " pod="openshift-marketplace/redhat-marketplace-6fl5f" Dec 01 08:19:27 crc kubenswrapper[5004]: I1201 08:19:27.283378 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5dss\" (UniqueName: \"kubernetes.io/projected/4d7ad504-8dea-4c9c-9a5a-682d56793c9b-kube-api-access-x5dss\") pod \"redhat-marketplace-6fl5f\" (UID: \"4d7ad504-8dea-4c9c-9a5a-682d56793c9b\") " pod="openshift-marketplace/redhat-marketplace-6fl5f" Dec 01 08:19:27 crc kubenswrapper[5004]: I1201 08:19:27.379431 5004 patch_prober.go:28] interesting pod/router-default-5444994796-6x76r container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 08:19:27 crc kubenswrapper[5004]: [-]has-synced failed: reason withheld Dec 01 08:19:27 crc kubenswrapper[5004]: [+]process-running ok Dec 01 08:19:27 crc kubenswrapper[5004]: healthz check failed Dec 01 08:19:27 crc kubenswrapper[5004]: I1201 08:19:27.379506 5004 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6x76r" podUID="b3ef8088-fc82-4ce3-9c1c-662e380e0587" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 08:19:27 crc kubenswrapper[5004]: I1201 08:19:27.444583 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6fl5f" Dec 01 08:19:27 crc kubenswrapper[5004]: I1201 08:19:27.635065 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 01 08:19:27 crc kubenswrapper[5004]: I1201 08:19:27.635816 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 01 08:19:27 crc kubenswrapper[5004]: I1201 08:19:27.640601 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Dec 01 08:19:27 crc kubenswrapper[5004]: I1201 08:19:27.640833 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Dec 01 08:19:27 crc kubenswrapper[5004]: I1201 08:19:27.647974 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 01 08:19:27 crc kubenswrapper[5004]: I1201 08:19:27.651326 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6fl5f"] Dec 01 08:19:27 crc kubenswrapper[5004]: I1201 08:19:27.708755 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-wvfds"] Dec 01 08:19:27 crc kubenswrapper[5004]: I1201 08:19:27.712976 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wvfds" Dec 01 08:19:27 crc kubenswrapper[5004]: I1201 08:19:27.715358 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 01 08:19:27 crc kubenswrapper[5004]: I1201 08:19:27.716607 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wvfds"] Dec 01 08:19:27 crc kubenswrapper[5004]: I1201 08:19:27.766184 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8ece7887-299b-4c39-8669-f23a01e922eb-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"8ece7887-299b-4c39-8669-f23a01e922eb\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 01 08:19:27 crc kubenswrapper[5004]: I1201 08:19:27.766223 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8ece7887-299b-4c39-8669-f23a01e922eb-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"8ece7887-299b-4c39-8669-f23a01e922eb\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 01 08:19:27 crc kubenswrapper[5004]: I1201 08:19:27.766266 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f079082-1b6f-4919-8d2c-c640f30de417-utilities\") pod \"redhat-operators-wvfds\" (UID: \"8f079082-1b6f-4919-8d2c-c640f30de417\") " pod="openshift-marketplace/redhat-operators-wvfds" Dec 01 08:19:27 crc kubenswrapper[5004]: I1201 08:19:27.766396 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f079082-1b6f-4919-8d2c-c640f30de417-catalog-content\") pod \"redhat-operators-wvfds\" (UID: 
\"8f079082-1b6f-4919-8d2c-c640f30de417\") " pod="openshift-marketplace/redhat-operators-wvfds" Dec 01 08:19:27 crc kubenswrapper[5004]: I1201 08:19:27.766511 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cghx8\" (UniqueName: \"kubernetes.io/projected/8f079082-1b6f-4919-8d2c-c640f30de417-kube-api-access-cghx8\") pod \"redhat-operators-wvfds\" (UID: \"8f079082-1b6f-4919-8d2c-c640f30de417\") " pod="openshift-marketplace/redhat-operators-wvfds" Dec 01 08:19:27 crc kubenswrapper[5004]: I1201 08:19:27.867747 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8ece7887-299b-4c39-8669-f23a01e922eb-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"8ece7887-299b-4c39-8669-f23a01e922eb\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 01 08:19:27 crc kubenswrapper[5004]: I1201 08:19:27.867797 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8ece7887-299b-4c39-8669-f23a01e922eb-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"8ece7887-299b-4c39-8669-f23a01e922eb\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 01 08:19:27 crc kubenswrapper[5004]: I1201 08:19:27.867850 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f079082-1b6f-4919-8d2c-c640f30de417-utilities\") pod \"redhat-operators-wvfds\" (UID: \"8f079082-1b6f-4919-8d2c-c640f30de417\") " pod="openshift-marketplace/redhat-operators-wvfds" Dec 01 08:19:27 crc kubenswrapper[5004]: I1201 08:19:27.867855 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8ece7887-299b-4c39-8669-f23a01e922eb-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: 
\"8ece7887-299b-4c39-8669-f23a01e922eb\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 01 08:19:27 crc kubenswrapper[5004]: I1201 08:19:27.867875 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f079082-1b6f-4919-8d2c-c640f30de417-catalog-content\") pod \"redhat-operators-wvfds\" (UID: \"8f079082-1b6f-4919-8d2c-c640f30de417\") " pod="openshift-marketplace/redhat-operators-wvfds" Dec 01 08:19:27 crc kubenswrapper[5004]: I1201 08:19:27.867915 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cghx8\" (UniqueName: \"kubernetes.io/projected/8f079082-1b6f-4919-8d2c-c640f30de417-kube-api-access-cghx8\") pod \"redhat-operators-wvfds\" (UID: \"8f079082-1b6f-4919-8d2c-c640f30de417\") " pod="openshift-marketplace/redhat-operators-wvfds" Dec 01 08:19:27 crc kubenswrapper[5004]: I1201 08:19:27.868336 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f079082-1b6f-4919-8d2c-c640f30de417-utilities\") pod \"redhat-operators-wvfds\" (UID: \"8f079082-1b6f-4919-8d2c-c640f30de417\") " pod="openshift-marketplace/redhat-operators-wvfds" Dec 01 08:19:27 crc kubenswrapper[5004]: I1201 08:19:27.868475 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f079082-1b6f-4919-8d2c-c640f30de417-catalog-content\") pod \"redhat-operators-wvfds\" (UID: \"8f079082-1b6f-4919-8d2c-c640f30de417\") " pod="openshift-marketplace/redhat-operators-wvfds" Dec 01 08:19:27 crc kubenswrapper[5004]: I1201 08:19:27.891160 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-djhzh"] Dec 01 08:19:27 crc kubenswrapper[5004]: I1201 08:19:27.896207 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-djhzh" Dec 01 08:19:27 crc kubenswrapper[5004]: I1201 08:19:27.898246 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8ece7887-299b-4c39-8669-f23a01e922eb-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"8ece7887-299b-4c39-8669-f23a01e922eb\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 01 08:19:27 crc kubenswrapper[5004]: I1201 08:19:27.899597 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cghx8\" (UniqueName: \"kubernetes.io/projected/8f079082-1b6f-4919-8d2c-c640f30de417-kube-api-access-cghx8\") pod \"redhat-operators-wvfds\" (UID: \"8f079082-1b6f-4919-8d2c-c640f30de417\") " pod="openshift-marketplace/redhat-operators-wvfds" Dec 01 08:19:27 crc kubenswrapper[5004]: I1201 08:19:27.910121 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-djhzh"] Dec 01 08:19:27 crc kubenswrapper[5004]: I1201 08:19:27.956792 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 01 08:19:27 crc kubenswrapper[5004]: I1201 08:19:27.969458 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqcrk\" (UniqueName: \"kubernetes.io/projected/50e55125-2439-4ff5-87f4-e15db8c26dae-kube-api-access-qqcrk\") pod \"redhat-operators-djhzh\" (UID: \"50e55125-2439-4ff5-87f4-e15db8c26dae\") " pod="openshift-marketplace/redhat-operators-djhzh" Dec 01 08:19:27 crc kubenswrapper[5004]: I1201 08:19:27.969915 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50e55125-2439-4ff5-87f4-e15db8c26dae-catalog-content\") pod \"redhat-operators-djhzh\" (UID: \"50e55125-2439-4ff5-87f4-e15db8c26dae\") " pod="openshift-marketplace/redhat-operators-djhzh" Dec 01 08:19:27 crc kubenswrapper[5004]: I1201 08:19:27.969971 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50e55125-2439-4ff5-87f4-e15db8c26dae-utilities\") pod \"redhat-operators-djhzh\" (UID: \"50e55125-2439-4ff5-87f4-e15db8c26dae\") " pod="openshift-marketplace/redhat-operators-djhzh" Dec 01 08:19:28 crc kubenswrapper[5004]: I1201 08:19:28.051986 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wvfds" Dec 01 08:19:28 crc kubenswrapper[5004]: I1201 08:19:28.071596 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqcrk\" (UniqueName: \"kubernetes.io/projected/50e55125-2439-4ff5-87f4-e15db8c26dae-kube-api-access-qqcrk\") pod \"redhat-operators-djhzh\" (UID: \"50e55125-2439-4ff5-87f4-e15db8c26dae\") " pod="openshift-marketplace/redhat-operators-djhzh" Dec 01 08:19:28 crc kubenswrapper[5004]: I1201 08:19:28.071697 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50e55125-2439-4ff5-87f4-e15db8c26dae-catalog-content\") pod \"redhat-operators-djhzh\" (UID: \"50e55125-2439-4ff5-87f4-e15db8c26dae\") " pod="openshift-marketplace/redhat-operators-djhzh" Dec 01 08:19:28 crc kubenswrapper[5004]: I1201 08:19:28.071749 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50e55125-2439-4ff5-87f4-e15db8c26dae-utilities\") pod \"redhat-operators-djhzh\" (UID: \"50e55125-2439-4ff5-87f4-e15db8c26dae\") " pod="openshift-marketplace/redhat-operators-djhzh" Dec 01 08:19:28 crc kubenswrapper[5004]: I1201 08:19:28.072162 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50e55125-2439-4ff5-87f4-e15db8c26dae-utilities\") pod \"redhat-operators-djhzh\" (UID: \"50e55125-2439-4ff5-87f4-e15db8c26dae\") " pod="openshift-marketplace/redhat-operators-djhzh" Dec 01 08:19:28 crc kubenswrapper[5004]: I1201 08:19:28.072928 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50e55125-2439-4ff5-87f4-e15db8c26dae-catalog-content\") pod \"redhat-operators-djhzh\" (UID: \"50e55125-2439-4ff5-87f4-e15db8c26dae\") " 
pod="openshift-marketplace/redhat-operators-djhzh" Dec 01 08:19:28 crc kubenswrapper[5004]: I1201 08:19:28.112411 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqcrk\" (UniqueName: \"kubernetes.io/projected/50e55125-2439-4ff5-87f4-e15db8c26dae-kube-api-access-qqcrk\") pod \"redhat-operators-djhzh\" (UID: \"50e55125-2439-4ff5-87f4-e15db8c26dae\") " pod="openshift-marketplace/redhat-operators-djhzh" Dec 01 08:19:28 crc kubenswrapper[5004]: I1201 08:19:28.181077 5004 generic.go:334] "Generic (PLEG): container finished" podID="b0286c29-4a56-4e46-8820-21dfbc658c86" containerID="96e1f7b40fe70c152339afe6573337284eb05591732581b555d6a7a8b1e564a0" exitCode=0 Dec 01 08:19:28 crc kubenswrapper[5004]: I1201 08:19:28.181146 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pl87w" event={"ID":"b0286c29-4a56-4e46-8820-21dfbc658c86","Type":"ContainerDied","Data":"96e1f7b40fe70c152339afe6573337284eb05591732581b555d6a7a8b1e564a0"} Dec 01 08:19:28 crc kubenswrapper[5004]: I1201 08:19:28.181171 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pl87w" event={"ID":"b0286c29-4a56-4e46-8820-21dfbc658c86","Type":"ContainerStarted","Data":"722a3f9b52cf235625e44e5718f1f3371bd16330168a7bb4c05370dd23f51940"} Dec 01 08:19:28 crc kubenswrapper[5004]: I1201 08:19:28.186815 5004 generic.go:334] "Generic (PLEG): container finished" podID="4d7ad504-8dea-4c9c-9a5a-682d56793c9b" containerID="e59bdbc20757ab353ba06d898d2fa03876048272352084f2ac28982c71b8b19b" exitCode=0 Dec 01 08:19:28 crc kubenswrapper[5004]: I1201 08:19:28.186908 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6fl5f" event={"ID":"4d7ad504-8dea-4c9c-9a5a-682d56793c9b","Type":"ContainerDied","Data":"e59bdbc20757ab353ba06d898d2fa03876048272352084f2ac28982c71b8b19b"} Dec 01 08:19:28 crc kubenswrapper[5004]: I1201 08:19:28.186959 5004 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6fl5f" event={"ID":"4d7ad504-8dea-4c9c-9a5a-682d56793c9b","Type":"ContainerStarted","Data":"03009d147f96e714e3ae70449468f3a71750dd69c196de3267463cb845b938e2"} Dec 01 08:19:28 crc kubenswrapper[5004]: I1201 08:19:28.190930 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 01 08:19:28 crc kubenswrapper[5004]: W1201 08:19:28.211325 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod8ece7887_299b_4c39_8669_f23a01e922eb.slice/crio-4a4c6803af8eeba307c6c19e8125b1d38eff46b50bdd1a1a54311cdaa15c131c WatchSource:0}: Error finding container 4a4c6803af8eeba307c6c19e8125b1d38eff46b50bdd1a1a54311cdaa15c131c: Status 404 returned error can't find the container with id 4a4c6803af8eeba307c6c19e8125b1d38eff46b50bdd1a1a54311cdaa15c131c Dec 01 08:19:28 crc kubenswrapper[5004]: I1201 08:19:28.224745 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-djhzh" Dec 01 08:19:28 crc kubenswrapper[5004]: I1201 08:19:28.290740 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wvfds"] Dec 01 08:19:28 crc kubenswrapper[5004]: W1201 08:19:28.321630 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8f079082_1b6f_4919_8d2c_c640f30de417.slice/crio-e3d302ab764558a23ad99272aae3abf968e4b0dce735241d75edd3fea79c1c00 WatchSource:0}: Error finding container e3d302ab764558a23ad99272aae3abf968e4b0dce735241d75edd3fea79c1c00: Status 404 returned error can't find the container with id e3d302ab764558a23ad99272aae3abf968e4b0dce735241d75edd3fea79c1c00 Dec 01 08:19:28 crc kubenswrapper[5004]: I1201 08:19:28.379212 5004 patch_prober.go:28] interesting pod/router-default-5444994796-6x76r container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 08:19:28 crc kubenswrapper[5004]: [-]has-synced failed: reason withheld Dec 01 08:19:28 crc kubenswrapper[5004]: [+]process-running ok Dec 01 08:19:28 crc kubenswrapper[5004]: healthz check failed Dec 01 08:19:28 crc kubenswrapper[5004]: I1201 08:19:28.379283 5004 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6x76r" podUID="b3ef8088-fc82-4ce3-9c1c-662e380e0587" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 08:19:28 crc kubenswrapper[5004]: I1201 08:19:28.703064 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-djhzh"] Dec 01 08:19:29 crc kubenswrapper[5004]: I1201 08:19:29.196630 5004 generic.go:334] "Generic (PLEG): container finished" podID="8f079082-1b6f-4919-8d2c-c640f30de417" 
containerID="0020faff5bf1262e9cb333a1f3852acfcaf730ef0f690befd288586307f1caca" exitCode=0 Dec 01 08:19:29 crc kubenswrapper[5004]: I1201 08:19:29.196709 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wvfds" event={"ID":"8f079082-1b6f-4919-8d2c-c640f30de417","Type":"ContainerDied","Data":"0020faff5bf1262e9cb333a1f3852acfcaf730ef0f690befd288586307f1caca"} Dec 01 08:19:29 crc kubenswrapper[5004]: I1201 08:19:29.196760 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wvfds" event={"ID":"8f079082-1b6f-4919-8d2c-c640f30de417","Type":"ContainerStarted","Data":"e3d302ab764558a23ad99272aae3abf968e4b0dce735241d75edd3fea79c1c00"} Dec 01 08:19:29 crc kubenswrapper[5004]: I1201 08:19:29.201993 5004 generic.go:334] "Generic (PLEG): container finished" podID="8ece7887-299b-4c39-8669-f23a01e922eb" containerID="eebb359cefde72472187a047eb09551fbc1ac81c1349fd488117e0d53fde4570" exitCode=0 Dec 01 08:19:29 crc kubenswrapper[5004]: I1201 08:19:29.202048 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"8ece7887-299b-4c39-8669-f23a01e922eb","Type":"ContainerDied","Data":"eebb359cefde72472187a047eb09551fbc1ac81c1349fd488117e0d53fde4570"} Dec 01 08:19:29 crc kubenswrapper[5004]: I1201 08:19:29.202069 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"8ece7887-299b-4c39-8669-f23a01e922eb","Type":"ContainerStarted","Data":"4a4c6803af8eeba307c6c19e8125b1d38eff46b50bdd1a1a54311cdaa15c131c"} Dec 01 08:19:29 crc kubenswrapper[5004]: I1201 08:19:29.204384 5004 generic.go:334] "Generic (PLEG): container finished" podID="50e55125-2439-4ff5-87f4-e15db8c26dae" containerID="bf6ce909c7e5e2a43b2308b70506a049ee95d531e9282fbf54f2d0cc0c78ed35" exitCode=0 Dec 01 08:19:29 crc kubenswrapper[5004]: I1201 08:19:29.204410 5004 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/redhat-operators-djhzh" event={"ID":"50e55125-2439-4ff5-87f4-e15db8c26dae","Type":"ContainerDied","Data":"bf6ce909c7e5e2a43b2308b70506a049ee95d531e9282fbf54f2d0cc0c78ed35"} Dec 01 08:19:29 crc kubenswrapper[5004]: I1201 08:19:29.204425 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-djhzh" event={"ID":"50e55125-2439-4ff5-87f4-e15db8c26dae","Type":"ContainerStarted","Data":"d271da7abd74019a5c1ca0e401792e264670963646b26b095fb75c0d480570a7"} Dec 01 08:19:29 crc kubenswrapper[5004]: I1201 08:19:29.205348 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-kplt9" Dec 01 08:19:29 crc kubenswrapper[5004]: I1201 08:19:29.211302 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-kplt9" Dec 01 08:19:29 crc kubenswrapper[5004]: I1201 08:19:29.379392 5004 patch_prober.go:28] interesting pod/router-default-5444994796-6x76r container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 08:19:29 crc kubenswrapper[5004]: [-]has-synced failed: reason withheld Dec 01 08:19:29 crc kubenswrapper[5004]: [+]process-running ok Dec 01 08:19:29 crc kubenswrapper[5004]: healthz check failed Dec 01 08:19:29 crc kubenswrapper[5004]: I1201 08:19:29.379486 5004 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6x76r" podUID="b3ef8088-fc82-4ce3-9c1c-662e380e0587" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 08:19:29 crc kubenswrapper[5004]: I1201 08:19:29.684634 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-qmztb" Dec 01 08:19:29 crc kubenswrapper[5004]: I1201 
08:19:29.702397 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-th28b" Dec 01 08:19:29 crc kubenswrapper[5004]: I1201 08:19:29.702439 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-th28b" Dec 01 08:19:29 crc kubenswrapper[5004]: I1201 08:19:29.703346 5004 patch_prober.go:28] interesting pod/console-f9d7485db-th28b container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.12:8443/health\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body= Dec 01 08:19:29 crc kubenswrapper[5004]: I1201 08:19:29.703395 5004 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-th28b" podUID="ce579b07-073d-450d-b056-1be2c7bed20f" containerName="console" probeResult="failure" output="Get \"https://10.217.0.12:8443/health\": dial tcp 10.217.0.12:8443: connect: connection refused" Dec 01 08:19:29 crc kubenswrapper[5004]: I1201 08:19:29.718181 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 08:19:29 crc kubenswrapper[5004]: I1201 08:19:29.718434 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 08:19:29 crc kubenswrapper[5004]: I1201 08:19:29.718463 5004 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 08:19:29 crc kubenswrapper[5004]: I1201 08:19:29.718494 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 08:19:29 crc kubenswrapper[5004]: I1201 08:19:29.728522 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 08:19:29 crc kubenswrapper[5004]: I1201 08:19:29.733034 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 08:19:29 crc kubenswrapper[5004]: I1201 08:19:29.738187 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: 
\"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 08:19:29 crc kubenswrapper[5004]: I1201 08:19:29.905005 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 08:19:29 crc kubenswrapper[5004]: I1201 08:19:29.975133 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 08:19:29 crc kubenswrapper[5004]: I1201 08:19:29.983707 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 08:19:29 crc kubenswrapper[5004]: I1201 08:19:29.989399 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 08:19:30 crc kubenswrapper[5004]: I1201 08:19:30.072134 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mrj44" Dec 01 08:19:30 crc kubenswrapper[5004]: I1201 08:19:30.072383 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mrj44" Dec 01 08:19:30 crc kubenswrapper[5004]: I1201 08:19:30.077551 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mrj44" Dec 01 08:19:30 crc kubenswrapper[5004]: I1201 08:19:30.159297 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-8g8bf" Dec 01 08:19:30 crc kubenswrapper[5004]: I1201 08:19:30.229717 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mrj44" Dec 01 08:19:30 crc kubenswrapper[5004]: W1201 08:19:30.311845 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-90f14c8b9c715c570c531ebe1a640fb50c118a9ab208945b924b321d4d333146 WatchSource:0}: Error finding container 90f14c8b9c715c570c531ebe1a640fb50c118a9ab208945b924b321d4d333146: Status 404 returned error can't find the container with id 90f14c8b9c715c570c531ebe1a640fb50c118a9ab208945b924b321d4d333146 Dec 01 08:19:30 crc kubenswrapper[5004]: I1201 08:19:30.376204 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-6x76r" Dec 01 08:19:30 crc kubenswrapper[5004]: I1201 08:19:30.382303 5004 patch_prober.go:28] interesting pod/router-default-5444994796-6x76r container/router namespace/openshift-ingress: Startup probe 
status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 08:19:30 crc kubenswrapper[5004]: [-]has-synced failed: reason withheld Dec 01 08:19:30 crc kubenswrapper[5004]: [+]process-running ok Dec 01 08:19:30 crc kubenswrapper[5004]: healthz check failed Dec 01 08:19:30 crc kubenswrapper[5004]: I1201 08:19:30.382348 5004 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6x76r" podUID="b3ef8088-fc82-4ce3-9c1c-662e380e0587" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 08:19:30 crc kubenswrapper[5004]: I1201 08:19:30.499103 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 01 08:19:30 crc kubenswrapper[5004]: I1201 08:19:30.499735 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 01 08:19:30 crc kubenswrapper[5004]: I1201 08:19:30.501907 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 01 08:19:30 crc kubenswrapper[5004]: I1201 08:19:30.502042 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 01 08:19:30 crc kubenswrapper[5004]: W1201 08:19:30.503017 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-a7cf3e0a4edf9b8dcd76ec93842c09a53101e2d338d6280b211c41c8b3511a19 WatchSource:0}: Error finding container a7cf3e0a4edf9b8dcd76ec93842c09a53101e2d338d6280b211c41c8b3511a19: Status 404 returned error can't find the container with id a7cf3e0a4edf9b8dcd76ec93842c09a53101e2d338d6280b211c41c8b3511a19 Dec 01 08:19:30 crc kubenswrapper[5004]: I1201 08:19:30.514980 5004 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 01 08:19:30 crc kubenswrapper[5004]: I1201 08:19:30.566585 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 01 08:19:30 crc kubenswrapper[5004]: I1201 08:19:30.630817 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8ece7887-299b-4c39-8669-f23a01e922eb-kubelet-dir\") pod \"8ece7887-299b-4c39-8669-f23a01e922eb\" (UID: \"8ece7887-299b-4c39-8669-f23a01e922eb\") " Dec 01 08:19:30 crc kubenswrapper[5004]: I1201 08:19:30.631133 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8ece7887-299b-4c39-8669-f23a01e922eb-kube-api-access\") pod \"8ece7887-299b-4c39-8669-f23a01e922eb\" (UID: \"8ece7887-299b-4c39-8669-f23a01e922eb\") " Dec 01 08:19:30 crc kubenswrapper[5004]: I1201 08:19:30.631327 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e452fcc5-664c-48a3-878a-3e92334c81a8-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"e452fcc5-664c-48a3-878a-3e92334c81a8\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 01 08:19:30 crc kubenswrapper[5004]: I1201 08:19:30.631403 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e452fcc5-664c-48a3-878a-3e92334c81a8-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"e452fcc5-664c-48a3-878a-3e92334c81a8\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 01 08:19:30 crc kubenswrapper[5004]: I1201 08:19:30.630934 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/8ece7887-299b-4c39-8669-f23a01e922eb-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "8ece7887-299b-4c39-8669-f23a01e922eb" (UID: "8ece7887-299b-4c39-8669-f23a01e922eb"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 08:19:30 crc kubenswrapper[5004]: I1201 08:19:30.635740 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ece7887-299b-4c39-8669-f23a01e922eb-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "8ece7887-299b-4c39-8669-f23a01e922eb" (UID: "8ece7887-299b-4c39-8669-f23a01e922eb"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:19:30 crc kubenswrapper[5004]: I1201 08:19:30.733077 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e452fcc5-664c-48a3-878a-3e92334c81a8-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"e452fcc5-664c-48a3-878a-3e92334c81a8\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 01 08:19:30 crc kubenswrapper[5004]: I1201 08:19:30.733200 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e452fcc5-664c-48a3-878a-3e92334c81a8-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"e452fcc5-664c-48a3-878a-3e92334c81a8\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 01 08:19:30 crc kubenswrapper[5004]: I1201 08:19:30.733283 5004 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8ece7887-299b-4c39-8669-f23a01e922eb-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 01 08:19:30 crc kubenswrapper[5004]: I1201 08:19:30.733299 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/8ece7887-299b-4c39-8669-f23a01e922eb-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 01 08:19:30 crc kubenswrapper[5004]: I1201 08:19:30.733316 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e452fcc5-664c-48a3-878a-3e92334c81a8-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"e452fcc5-664c-48a3-878a-3e92334c81a8\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 01 08:19:30 crc kubenswrapper[5004]: I1201 08:19:30.750189 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e452fcc5-664c-48a3-878a-3e92334c81a8-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"e452fcc5-664c-48a3-878a-3e92334c81a8\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 01 08:19:30 crc kubenswrapper[5004]: I1201 08:19:30.820266 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 01 08:19:31 crc kubenswrapper[5004]: I1201 08:19:31.233471 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"1f078b8a66182d7fa0f1686428819af513c5282635ef8c81a11bda8318fd3e49"} Dec 01 08:19:31 crc kubenswrapper[5004]: I1201 08:19:31.233520 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"90f14c8b9c715c570c531ebe1a640fb50c118a9ab208945b924b321d4d333146"} Dec 01 08:19:31 crc kubenswrapper[5004]: I1201 08:19:31.252319 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"4cfdf23c034b9bbc6a0fbe9cd26625eb5836b0c90b23c4f14473634504089f7f"} Dec 01 08:19:31 crc kubenswrapper[5004]: I1201 08:19:31.252389 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"0fa01c39c760cd50158a571b92680916869d4c6594195f1975a08c4fc10891af"} Dec 01 08:19:31 crc kubenswrapper[5004]: I1201 08:19:31.252598 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 08:19:31 crc kubenswrapper[5004]: I1201 08:19:31.259403 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"c2ba18491aa78fa725523bf76ca5b74d66b6dfc35f49622f440a06d9a5ed1766"} Dec 01 08:19:31 crc kubenswrapper[5004]: I1201 08:19:31.259437 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"a7cf3e0a4edf9b8dcd76ec93842c09a53101e2d338d6280b211c41c8b3511a19"} Dec 01 08:19:31 crc kubenswrapper[5004]: I1201 08:19:31.262775 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"8ece7887-299b-4c39-8669-f23a01e922eb","Type":"ContainerDied","Data":"4a4c6803af8eeba307c6c19e8125b1d38eff46b50bdd1a1a54311cdaa15c131c"} Dec 01 08:19:31 crc kubenswrapper[5004]: I1201 08:19:31.262806 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4a4c6803af8eeba307c6c19e8125b1d38eff46b50bdd1a1a54311cdaa15c131c" Dec 01 08:19:31 crc kubenswrapper[5004]: I1201 08:19:31.262837 5004 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 01 08:19:31 crc kubenswrapper[5004]: I1201 08:19:31.381150 5004 patch_prober.go:28] interesting pod/router-default-5444994796-6x76r container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 08:19:31 crc kubenswrapper[5004]: [-]has-synced failed: reason withheld Dec 01 08:19:31 crc kubenswrapper[5004]: [+]process-running ok Dec 01 08:19:31 crc kubenswrapper[5004]: healthz check failed Dec 01 08:19:31 crc kubenswrapper[5004]: I1201 08:19:31.381210 5004 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6x76r" podUID="b3ef8088-fc82-4ce3-9c1c-662e380e0587" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 08:19:32 crc kubenswrapper[5004]: I1201 08:19:32.233458 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-qwgn4" Dec 01 08:19:32 crc kubenswrapper[5004]: I1201 08:19:32.379283 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-6x76r" Dec 01 08:19:32 crc kubenswrapper[5004]: I1201 08:19:32.380867 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-6x76r" Dec 01 08:19:38 crc kubenswrapper[5004]: I1201 08:19:38.729300 5004 patch_prober.go:28] interesting pod/machine-config-daemon-fvdgt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 08:19:38 crc kubenswrapper[5004]: I1201 08:19:38.729901 5004 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 08:19:39 crc kubenswrapper[5004]: I1201 08:19:39.712812 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-th28b" Dec 01 08:19:39 crc kubenswrapper[5004]: I1201 08:19:39.719222 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-th28b" Dec 01 08:19:45 crc kubenswrapper[5004]: I1201 08:19:45.156913 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b488f4f3-d385-4d40-bdee-96d8fe2d42a1-metrics-certs\") pod \"network-metrics-daemon-7cl5l\" (UID: \"b488f4f3-d385-4d40-bdee-96d8fe2d42a1\") " pod="openshift-multus/network-metrics-daemon-7cl5l" Dec 01 08:19:45 crc kubenswrapper[5004]: I1201 08:19:45.166908 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b488f4f3-d385-4d40-bdee-96d8fe2d42a1-metrics-certs\") pod \"network-metrics-daemon-7cl5l\" (UID: \"b488f4f3-d385-4d40-bdee-96d8fe2d42a1\") " pod="openshift-multus/network-metrics-daemon-7cl5l" Dec 01 08:19:45 crc kubenswrapper[5004]: I1201 08:19:45.381982 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-7cl5l" Dec 01 08:19:46 crc kubenswrapper[5004]: I1201 08:19:46.327514 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-8jr7l" Dec 01 08:19:57 crc kubenswrapper[5004]: E1201 08:19:57.213274 5004 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Dec 01 08:19:57 crc kubenswrapper[5004]: E1201 08:19:57.213989 5004 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qqcrk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppAr
morProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-djhzh_openshift-marketplace(50e55125-2439-4ff5-87f4-e15db8c26dae): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 01 08:19:57 crc kubenswrapper[5004]: E1201 08:19:57.215811 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-djhzh" podUID="50e55125-2439-4ff5-87f4-e15db8c26dae" Dec 01 08:19:58 crc kubenswrapper[5004]: E1201 08:19:58.382068 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-djhzh" podUID="50e55125-2439-4ff5-87f4-e15db8c26dae" Dec 01 08:19:58 crc kubenswrapper[5004]: E1201 08:19:58.435602 5004 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Dec 01 08:19:58 crc kubenswrapper[5004]: E1201 08:19:58.435741 5004 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rn7rw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-mgs69_openshift-marketplace(9ac4710f-ad3b-471c-861d-622e995871cd): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 01 08:19:58 crc kubenswrapper[5004]: E1201 08:19:58.438132 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-mgs69" podUID="9ac4710f-ad3b-471c-861d-622e995871cd" Dec 01 08:19:59 crc 
kubenswrapper[5004]: E1201 08:19:59.449602 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-mgs69" podUID="9ac4710f-ad3b-471c-861d-622e995871cd" Dec 01 08:19:59 crc kubenswrapper[5004]: E1201 08:19:59.461815 5004 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Dec 01 08:19:59 crc kubenswrapper[5004]: E1201 08:19:59.462182 5004 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fng7c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-pl87w_openshift-marketplace(b0286c29-4a56-4e46-8820-21dfbc658c86): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 01 08:19:59 crc kubenswrapper[5004]: E1201 08:19:59.465384 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-pl87w" podUID="b0286c29-4a56-4e46-8820-21dfbc658c86" Dec 01 08:19:59 crc 
kubenswrapper[5004]: E1201 08:19:59.510516 5004 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Dec 01 08:19:59 crc kubenswrapper[5004]: E1201 08:19:59.510643 5004 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-66j5l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
certified-operators-jd8z9_openshift-marketplace(56a0aeb3-484c-4e7e-9ff7-b3bc75f253f4): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 01 08:19:59 crc kubenswrapper[5004]: E1201 08:19:59.510840 5004 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Dec 01 08:19:59 crc kubenswrapper[5004]: E1201 08:19:59.510888 5004 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-x5dss,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnc
e:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-6fl5f_openshift-marketplace(4d7ad504-8dea-4c9c-9a5a-682d56793c9b): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 01 08:19:59 crc kubenswrapper[5004]: E1201 08:19:59.512004 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-6fl5f" podUID="4d7ad504-8dea-4c9c-9a5a-682d56793c9b" Dec 01 08:19:59 crc kubenswrapper[5004]: E1201 08:19:59.512046 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-jd8z9" podUID="56a0aeb3-484c-4e7e-9ff7-b3bc75f253f4" Dec 01 08:19:59 crc kubenswrapper[5004]: E1201 08:19:59.553576 5004 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Dec 01 08:19:59 crc kubenswrapper[5004]: E1201 08:19:59.553711 5004 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lsshm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-dg9cf_openshift-marketplace(19da9663-9e98-41f0-a737-0c2683293496): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 01 08:19:59 crc kubenswrapper[5004]: E1201 08:19:59.555020 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-dg9cf" podUID="19da9663-9e98-41f0-a737-0c2683293496" Dec 01 08:19:59 crc 
kubenswrapper[5004]: I1201 08:19:59.603260 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 01 08:19:59 crc kubenswrapper[5004]: I1201 08:19:59.863676 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-7cl5l"] Dec 01 08:19:59 crc kubenswrapper[5004]: W1201 08:19:59.872214 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb488f4f3_d385_4d40_bdee_96d8fe2d42a1.slice/crio-02f3061c36ad44643d09e52beb83e705a5355201c5853266e5e7d18db0b4314e WatchSource:0}: Error finding container 02f3061c36ad44643d09e52beb83e705a5355201c5853266e5e7d18db0b4314e: Status 404 returned error can't find the container with id 02f3061c36ad44643d09e52beb83e705a5355201c5853266e5e7d18db0b4314e Dec 01 08:20:00 crc kubenswrapper[5004]: I1201 08:20:00.211543 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rsfff" Dec 01 08:20:00 crc kubenswrapper[5004]: I1201 08:20:00.456046 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"e452fcc5-664c-48a3-878a-3e92334c81a8","Type":"ContainerStarted","Data":"c4af0c5d1df16403105b211e3dca43cd9a07db23fd77965f200d6fa2d3cca620"} Dec 01 08:20:00 crc kubenswrapper[5004]: I1201 08:20:00.456424 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"e452fcc5-664c-48a3-878a-3e92334c81a8","Type":"ContainerStarted","Data":"4fdbbecaa65bf6aac8bf0a4d256741d427ebd6384df1879b36ea84523a42d986"} Dec 01 08:20:00 crc kubenswrapper[5004]: I1201 08:20:00.460454 5004 generic.go:334] "Generic (PLEG): container finished" podID="494e6c31-3cc3-45a4-b45c-8f5ffb4251fa" containerID="82328ea9d94e5415b9e2a6792045138c763aad3324f611d140ea9eca1e044c31" exitCode=0 Dec 01 08:20:00 
crc kubenswrapper[5004]: I1201 08:20:00.460510 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dqzk6" event={"ID":"494e6c31-3cc3-45a4-b45c-8f5ffb4251fa","Type":"ContainerDied","Data":"82328ea9d94e5415b9e2a6792045138c763aad3324f611d140ea9eca1e044c31"} Dec 01 08:20:00 crc kubenswrapper[5004]: I1201 08:20:00.463058 5004 generic.go:334] "Generic (PLEG): container finished" podID="8f079082-1b6f-4919-8d2c-c640f30de417" containerID="4ac44ca1597f9ba8f4e6822986319e334129b0cbf52bbdfc96aeb5c30e74d163" exitCode=0 Dec 01 08:20:00 crc kubenswrapper[5004]: I1201 08:20:00.463145 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wvfds" event={"ID":"8f079082-1b6f-4919-8d2c-c640f30de417","Type":"ContainerDied","Data":"4ac44ca1597f9ba8f4e6822986319e334129b0cbf52bbdfc96aeb5c30e74d163"} Dec 01 08:20:00 crc kubenswrapper[5004]: I1201 08:20:00.468149 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-7cl5l" event={"ID":"b488f4f3-d385-4d40-bdee-96d8fe2d42a1","Type":"ContainerStarted","Data":"236252d20508bf78957490baedb16e65697251ede0f14c2850c6e3f27c2e6c5c"} Dec 01 08:20:00 crc kubenswrapper[5004]: I1201 08:20:00.468210 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-7cl5l" event={"ID":"b488f4f3-d385-4d40-bdee-96d8fe2d42a1","Type":"ContainerStarted","Data":"d808cf6eb82079d2662d7dea713b3470ad02f12af6b0b211a3433b6924ddb38f"} Dec 01 08:20:00 crc kubenswrapper[5004]: I1201 08:20:00.468231 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-7cl5l" event={"ID":"b488f4f3-d385-4d40-bdee-96d8fe2d42a1","Type":"ContainerStarted","Data":"02f3061c36ad44643d09e52beb83e705a5355201c5853266e5e7d18db0b4314e"} Dec 01 08:20:00 crc kubenswrapper[5004]: E1201 08:20:00.471195 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" 
for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-pl87w" podUID="b0286c29-4a56-4e46-8820-21dfbc658c86" Dec 01 08:20:00 crc kubenswrapper[5004]: E1201 08:20:00.471505 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-6fl5f" podUID="4d7ad504-8dea-4c9c-9a5a-682d56793c9b" Dec 01 08:20:00 crc kubenswrapper[5004]: E1201 08:20:00.471664 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-dg9cf" podUID="19da9663-9e98-41f0-a737-0c2683293496" Dec 01 08:20:00 crc kubenswrapper[5004]: I1201 08:20:00.484144 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=30.484117025 podStartE2EDuration="30.484117025s" podCreationTimestamp="2025-12-01 08:19:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:20:00.474406822 +0000 UTC m=+178.039398844" watchObservedRunningTime="2025-12-01 08:20:00.484117025 +0000 UTC m=+178.049109047" Dec 01 08:20:00 crc kubenswrapper[5004]: E1201 08:20:00.489203 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-jd8z9" podUID="56a0aeb3-484c-4e7e-9ff7-b3bc75f253f4" Dec 
01 08:20:00 crc kubenswrapper[5004]: I1201 08:20:00.636167 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-7cl5l" podStartSLOduration=158.636137784 podStartE2EDuration="2m38.636137784s" podCreationTimestamp="2025-12-01 08:17:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:20:00.635668532 +0000 UTC m=+178.200660564" watchObservedRunningTime="2025-12-01 08:20:00.636137784 +0000 UTC m=+178.201129806" Dec 01 08:20:01 crc kubenswrapper[5004]: I1201 08:20:01.477524 5004 generic.go:334] "Generic (PLEG): container finished" podID="e452fcc5-664c-48a3-878a-3e92334c81a8" containerID="c4af0c5d1df16403105b211e3dca43cd9a07db23fd77965f200d6fa2d3cca620" exitCode=0 Dec 01 08:20:01 crc kubenswrapper[5004]: I1201 08:20:01.477695 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"e452fcc5-664c-48a3-878a-3e92334c81a8","Type":"ContainerDied","Data":"c4af0c5d1df16403105b211e3dca43cd9a07db23fd77965f200d6fa2d3cca620"} Dec 01 08:20:02 crc kubenswrapper[5004]: I1201 08:20:02.496162 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dqzk6" event={"ID":"494e6c31-3cc3-45a4-b45c-8f5ffb4251fa","Type":"ContainerStarted","Data":"f3ad39fdb9dce7b3e51d34f6a3fb2800e1ec2a34b1bd336ee5ba8a5221b916d5"} Dec 01 08:20:02 crc kubenswrapper[5004]: I1201 08:20:02.498349 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wvfds" event={"ID":"8f079082-1b6f-4919-8d2c-c640f30de417","Type":"ContainerStarted","Data":"98d8c746f45777a77a60247a36da7f37e3cba4aab2752e59ee09374f7e491cbd"} Dec 01 08:20:02 crc kubenswrapper[5004]: I1201 08:20:02.514757 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-dqzk6" 
podStartSLOduration=3.672149099 podStartE2EDuration="38.514742472s" podCreationTimestamp="2025-12-01 08:19:24 +0000 UTC" firstStartedPulling="2025-12-01 08:19:27.170733079 +0000 UTC m=+144.735725061" lastFinishedPulling="2025-12-01 08:20:02.013326452 +0000 UTC m=+179.578318434" observedRunningTime="2025-12-01 08:20:02.514205997 +0000 UTC m=+180.079197979" watchObservedRunningTime="2025-12-01 08:20:02.514742472 +0000 UTC m=+180.079734454" Dec 01 08:20:02 crc kubenswrapper[5004]: I1201 08:20:02.538992 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-wvfds" podStartSLOduration=2.95389337 podStartE2EDuration="35.538976614s" podCreationTimestamp="2025-12-01 08:19:27 +0000 UTC" firstStartedPulling="2025-12-01 08:19:29.198550973 +0000 UTC m=+146.763542955" lastFinishedPulling="2025-12-01 08:20:01.783634217 +0000 UTC m=+179.348626199" observedRunningTime="2025-12-01 08:20:02.535031741 +0000 UTC m=+180.100023743" watchObservedRunningTime="2025-12-01 08:20:02.538976614 +0000 UTC m=+180.103968596" Dec 01 08:20:02 crc kubenswrapper[5004]: I1201 08:20:02.741743 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 01 08:20:02 crc kubenswrapper[5004]: I1201 08:20:02.829786 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e452fcc5-664c-48a3-878a-3e92334c81a8-kube-api-access\") pod \"e452fcc5-664c-48a3-878a-3e92334c81a8\" (UID: \"e452fcc5-664c-48a3-878a-3e92334c81a8\") " Dec 01 08:20:02 crc kubenswrapper[5004]: I1201 08:20:02.829974 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e452fcc5-664c-48a3-878a-3e92334c81a8-kubelet-dir\") pod \"e452fcc5-664c-48a3-878a-3e92334c81a8\" (UID: \"e452fcc5-664c-48a3-878a-3e92334c81a8\") " Dec 01 08:20:02 crc kubenswrapper[5004]: I1201 08:20:02.830261 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e452fcc5-664c-48a3-878a-3e92334c81a8-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "e452fcc5-664c-48a3-878a-3e92334c81a8" (UID: "e452fcc5-664c-48a3-878a-3e92334c81a8"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 08:20:02 crc kubenswrapper[5004]: I1201 08:20:02.838607 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e452fcc5-664c-48a3-878a-3e92334c81a8-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e452fcc5-664c-48a3-878a-3e92334c81a8" (UID: "e452fcc5-664c-48a3-878a-3e92334c81a8"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:20:02 crc kubenswrapper[5004]: I1201 08:20:02.930603 5004 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e452fcc5-664c-48a3-878a-3e92334c81a8-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 01 08:20:02 crc kubenswrapper[5004]: I1201 08:20:02.930908 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e452fcc5-664c-48a3-878a-3e92334c81a8-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 01 08:20:03 crc kubenswrapper[5004]: I1201 08:20:03.504802 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"e452fcc5-664c-48a3-878a-3e92334c81a8","Type":"ContainerDied","Data":"4fdbbecaa65bf6aac8bf0a4d256741d427ebd6384df1879b36ea84523a42d986"} Dec 01 08:20:03 crc kubenswrapper[5004]: I1201 08:20:03.504850 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4fdbbecaa65bf6aac8bf0a4d256741d427ebd6384df1879b36ea84523a42d986" Dec 01 08:20:03 crc kubenswrapper[5004]: I1201 08:20:03.504915 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 01 08:20:05 crc kubenswrapper[5004]: I1201 08:20:05.320531 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-dqzk6" Dec 01 08:20:05 crc kubenswrapper[5004]: I1201 08:20:05.320882 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-dqzk6" Dec 01 08:20:05 crc kubenswrapper[5004]: I1201 08:20:05.382606 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-dqzk6" Dec 01 08:20:07 crc kubenswrapper[5004]: I1201 08:20:07.502358 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 01 08:20:07 crc kubenswrapper[5004]: E1201 08:20:07.502609 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ece7887-299b-4c39-8669-f23a01e922eb" containerName="pruner" Dec 01 08:20:07 crc kubenswrapper[5004]: I1201 08:20:07.502623 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ece7887-299b-4c39-8669-f23a01e922eb" containerName="pruner" Dec 01 08:20:07 crc kubenswrapper[5004]: E1201 08:20:07.502647 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e452fcc5-664c-48a3-878a-3e92334c81a8" containerName="pruner" Dec 01 08:20:07 crc kubenswrapper[5004]: I1201 08:20:07.502654 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="e452fcc5-664c-48a3-878a-3e92334c81a8" containerName="pruner" Dec 01 08:20:07 crc kubenswrapper[5004]: I1201 08:20:07.502772 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ece7887-299b-4c39-8669-f23a01e922eb" containerName="pruner" Dec 01 08:20:07 crc kubenswrapper[5004]: I1201 08:20:07.502792 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="e452fcc5-664c-48a3-878a-3e92334c81a8" containerName="pruner" Dec 01 08:20:07 crc 
kubenswrapper[5004]: I1201 08:20:07.503181 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 01 08:20:07 crc kubenswrapper[5004]: I1201 08:20:07.505434 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 01 08:20:07 crc kubenswrapper[5004]: I1201 08:20:07.505875 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 01 08:20:07 crc kubenswrapper[5004]: I1201 08:20:07.526944 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 01 08:20:07 crc kubenswrapper[5004]: I1201 08:20:07.592640 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7566183a-c465-40a1-962f-8035ce711b49-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"7566183a-c465-40a1-962f-8035ce711b49\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 01 08:20:07 crc kubenswrapper[5004]: I1201 08:20:07.592744 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7566183a-c465-40a1-962f-8035ce711b49-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"7566183a-c465-40a1-962f-8035ce711b49\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 01 08:20:07 crc kubenswrapper[5004]: I1201 08:20:07.694206 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7566183a-c465-40a1-962f-8035ce711b49-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"7566183a-c465-40a1-962f-8035ce711b49\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 01 08:20:07 crc kubenswrapper[5004]: I1201 08:20:07.694785 5004 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7566183a-c465-40a1-962f-8035ce711b49-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"7566183a-c465-40a1-962f-8035ce711b49\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 01 08:20:07 crc kubenswrapper[5004]: I1201 08:20:07.694864 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7566183a-c465-40a1-962f-8035ce711b49-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"7566183a-c465-40a1-962f-8035ce711b49\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 01 08:20:07 crc kubenswrapper[5004]: I1201 08:20:07.731097 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7566183a-c465-40a1-962f-8035ce711b49-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"7566183a-c465-40a1-962f-8035ce711b49\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 01 08:20:07 crc kubenswrapper[5004]: I1201 08:20:07.872859 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 01 08:20:08 crc kubenswrapper[5004]: I1201 08:20:08.052930 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-wvfds" Dec 01 08:20:08 crc kubenswrapper[5004]: I1201 08:20:08.053301 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-wvfds" Dec 01 08:20:08 crc kubenswrapper[5004]: I1201 08:20:08.067669 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 01 08:20:08 crc kubenswrapper[5004]: W1201 08:20:08.074507 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod7566183a_c465_40a1_962f_8035ce711b49.slice/crio-2a15012a15a688ede7f2d045bd322f024618f7f411cc699a3eeba1756b13dd1a WatchSource:0}: Error finding container 2a15012a15a688ede7f2d045bd322f024618f7f411cc699a3eeba1756b13dd1a: Status 404 returned error can't find the container with id 2a15012a15a688ede7f2d045bd322f024618f7f411cc699a3eeba1756b13dd1a Dec 01 08:20:08 crc kubenswrapper[5004]: I1201 08:20:08.098804 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-wvfds" Dec 01 08:20:08 crc kubenswrapper[5004]: I1201 08:20:08.532497 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"7566183a-c465-40a1-962f-8035ce711b49","Type":"ContainerStarted","Data":"2a15012a15a688ede7f2d045bd322f024618f7f411cc699a3eeba1756b13dd1a"} Dec 01 08:20:08 crc kubenswrapper[5004]: I1201 08:20:08.590247 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-wvfds" Dec 01 08:20:08 crc kubenswrapper[5004]: I1201 08:20:08.728933 5004 patch_prober.go:28] interesting pod/machine-config-daemon-fvdgt container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 08:20:08 crc kubenswrapper[5004]: I1201 08:20:08.728994 5004 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 08:20:09 crc kubenswrapper[5004]: I1201 08:20:09.539198 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"7566183a-c465-40a1-962f-8035ce711b49","Type":"ContainerStarted","Data":"af535d99d8d338ebf8fff3c77113f3edd71a374af08c944b9b3a00431ab2f362"} Dec 01 08:20:09 crc kubenswrapper[5004]: I1201 08:20:09.551302 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=2.551285868 podStartE2EDuration="2.551285868s" podCreationTimestamp="2025-12-01 08:20:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:20:09.550849616 +0000 UTC m=+187.115841598" watchObservedRunningTime="2025-12-01 08:20:09.551285868 +0000 UTC m=+187.116277850" Dec 01 08:20:09 crc kubenswrapper[5004]: I1201 08:20:09.993516 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 08:20:10 crc kubenswrapper[5004]: I1201 08:20:10.544591 5004 generic.go:334] "Generic (PLEG): container finished" podID="7566183a-c465-40a1-962f-8035ce711b49" containerID="af535d99d8d338ebf8fff3c77113f3edd71a374af08c944b9b3a00431ab2f362" exitCode=0 Dec 01 08:20:10 crc kubenswrapper[5004]: I1201 
08:20:10.544639 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"7566183a-c465-40a1-962f-8035ce711b49","Type":"ContainerDied","Data":"af535d99d8d338ebf8fff3c77113f3edd71a374af08c944b9b3a00431ab2f362"} Dec 01 08:20:11 crc kubenswrapper[5004]: I1201 08:20:11.813310 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 01 08:20:11 crc kubenswrapper[5004]: I1201 08:20:11.953631 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7566183a-c465-40a1-962f-8035ce711b49-kubelet-dir\") pod \"7566183a-c465-40a1-962f-8035ce711b49\" (UID: \"7566183a-c465-40a1-962f-8035ce711b49\") " Dec 01 08:20:11 crc kubenswrapper[5004]: I1201 08:20:11.953722 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7566183a-c465-40a1-962f-8035ce711b49-kube-api-access\") pod \"7566183a-c465-40a1-962f-8035ce711b49\" (UID: \"7566183a-c465-40a1-962f-8035ce711b49\") " Dec 01 08:20:11 crc kubenswrapper[5004]: I1201 08:20:11.953765 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7566183a-c465-40a1-962f-8035ce711b49-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "7566183a-c465-40a1-962f-8035ce711b49" (UID: "7566183a-c465-40a1-962f-8035ce711b49"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 08:20:11 crc kubenswrapper[5004]: I1201 08:20:11.954199 5004 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7566183a-c465-40a1-962f-8035ce711b49-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 01 08:20:11 crc kubenswrapper[5004]: I1201 08:20:11.960996 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7566183a-c465-40a1-962f-8035ce711b49-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "7566183a-c465-40a1-962f-8035ce711b49" (UID: "7566183a-c465-40a1-962f-8035ce711b49"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:20:12 crc kubenswrapper[5004]: I1201 08:20:12.055367 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7566183a-c465-40a1-962f-8035ce711b49-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 01 08:20:12 crc kubenswrapper[5004]: I1201 08:20:12.557059 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"7566183a-c465-40a1-962f-8035ce711b49","Type":"ContainerDied","Data":"2a15012a15a688ede7f2d045bd322f024618f7f411cc699a3eeba1756b13dd1a"} Dec 01 08:20:12 crc kubenswrapper[5004]: I1201 08:20:12.557672 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2a15012a15a688ede7f2d045bd322f024618f7f411cc699a3eeba1756b13dd1a" Dec 01 08:20:12 crc kubenswrapper[5004]: I1201 08:20:12.557799 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 01 08:20:13 crc kubenswrapper[5004]: I1201 08:20:13.496457 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 01 08:20:13 crc kubenswrapper[5004]: E1201 08:20:13.496720 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7566183a-c465-40a1-962f-8035ce711b49" containerName="pruner" Dec 01 08:20:13 crc kubenswrapper[5004]: I1201 08:20:13.496737 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="7566183a-c465-40a1-962f-8035ce711b49" containerName="pruner" Dec 01 08:20:13 crc kubenswrapper[5004]: I1201 08:20:13.496854 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="7566183a-c465-40a1-962f-8035ce711b49" containerName="pruner" Dec 01 08:20:13 crc kubenswrapper[5004]: I1201 08:20:13.497220 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 01 08:20:13 crc kubenswrapper[5004]: I1201 08:20:13.498839 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 01 08:20:13 crc kubenswrapper[5004]: I1201 08:20:13.499602 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 01 08:20:13 crc kubenswrapper[5004]: I1201 08:20:13.508193 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 01 08:20:13 crc kubenswrapper[5004]: I1201 08:20:13.673775 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/af8fc39f-8aa1-4e0d-8fe1-33aa79852f78-kube-api-access\") pod \"installer-9-crc\" (UID: \"af8fc39f-8aa1-4e0d-8fe1-33aa79852f78\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 01 08:20:13 crc kubenswrapper[5004]: I1201 08:20:13.674079 5004 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/af8fc39f-8aa1-4e0d-8fe1-33aa79852f78-var-lock\") pod \"installer-9-crc\" (UID: \"af8fc39f-8aa1-4e0d-8fe1-33aa79852f78\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 01 08:20:13 crc kubenswrapper[5004]: I1201 08:20:13.674173 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/af8fc39f-8aa1-4e0d-8fe1-33aa79852f78-kubelet-dir\") pod \"installer-9-crc\" (UID: \"af8fc39f-8aa1-4e0d-8fe1-33aa79852f78\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 01 08:20:13 crc kubenswrapper[5004]: I1201 08:20:13.775250 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/af8fc39f-8aa1-4e0d-8fe1-33aa79852f78-kube-api-access\") pod \"installer-9-crc\" (UID: \"af8fc39f-8aa1-4e0d-8fe1-33aa79852f78\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 01 08:20:13 crc kubenswrapper[5004]: I1201 08:20:13.775321 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/af8fc39f-8aa1-4e0d-8fe1-33aa79852f78-var-lock\") pod \"installer-9-crc\" (UID: \"af8fc39f-8aa1-4e0d-8fe1-33aa79852f78\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 01 08:20:13 crc kubenswrapper[5004]: I1201 08:20:13.775348 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/af8fc39f-8aa1-4e0d-8fe1-33aa79852f78-kubelet-dir\") pod \"installer-9-crc\" (UID: \"af8fc39f-8aa1-4e0d-8fe1-33aa79852f78\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 01 08:20:13 crc kubenswrapper[5004]: I1201 08:20:13.775421 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: 
\"kubernetes.io/host-path/af8fc39f-8aa1-4e0d-8fe1-33aa79852f78-var-lock\") pod \"installer-9-crc\" (UID: \"af8fc39f-8aa1-4e0d-8fe1-33aa79852f78\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 01 08:20:13 crc kubenswrapper[5004]: I1201 08:20:13.775464 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/af8fc39f-8aa1-4e0d-8fe1-33aa79852f78-kubelet-dir\") pod \"installer-9-crc\" (UID: \"af8fc39f-8aa1-4e0d-8fe1-33aa79852f78\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 01 08:20:13 crc kubenswrapper[5004]: I1201 08:20:13.795461 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/af8fc39f-8aa1-4e0d-8fe1-33aa79852f78-kube-api-access\") pod \"installer-9-crc\" (UID: \"af8fc39f-8aa1-4e0d-8fe1-33aa79852f78\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 01 08:20:13 crc kubenswrapper[5004]: I1201 08:20:13.820463 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 01 08:20:14 crc kubenswrapper[5004]: I1201 08:20:14.016813 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 01 08:20:14 crc kubenswrapper[5004]: W1201 08:20:14.026191 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podaf8fc39f_8aa1_4e0d_8fe1_33aa79852f78.slice/crio-de753a5332f8cf7d717c3afe9c6e3657d53435793a1d2776f9228762a754c26b WatchSource:0}: Error finding container de753a5332f8cf7d717c3afe9c6e3657d53435793a1d2776f9228762a754c26b: Status 404 returned error can't find the container with id de753a5332f8cf7d717c3afe9c6e3657d53435793a1d2776f9228762a754c26b Dec 01 08:20:14 crc kubenswrapper[5004]: I1201 08:20:14.572665 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"af8fc39f-8aa1-4e0d-8fe1-33aa79852f78","Type":"ContainerStarted","Data":"3cac78010ceea6d134852b8f0e220fd6f5eec37a19b29bb2279cef8ebe2cee13"} Dec 01 08:20:14 crc kubenswrapper[5004]: I1201 08:20:14.573289 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"af8fc39f-8aa1-4e0d-8fe1-33aa79852f78","Type":"ContainerStarted","Data":"de753a5332f8cf7d717c3afe9c6e3657d53435793a1d2776f9228762a754c26b"} Dec 01 08:20:14 crc kubenswrapper[5004]: I1201 08:20:14.602441 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=1.602401168 podStartE2EDuration="1.602401168s" podCreationTimestamp="2025-12-01 08:20:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:20:14.588232889 +0000 UTC m=+192.153224881" watchObservedRunningTime="2025-12-01 08:20:14.602401168 +0000 UTC m=+192.167393150" Dec 01 08:20:15 crc kubenswrapper[5004]: I1201 
08:20:15.380855 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-dqzk6" Dec 01 08:20:15 crc kubenswrapper[5004]: I1201 08:20:15.426886 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dqzk6"] Dec 01 08:20:15 crc kubenswrapper[5004]: I1201 08:20:15.578131 5004 generic.go:334] "Generic (PLEG): container finished" podID="19da9663-9e98-41f0-a737-0c2683293496" containerID="82271e5359a06cdbb1f6c56401f792c5b2fb6bafb7fed178312d8a044b0e6d60" exitCode=0 Dec 01 08:20:15 crc kubenswrapper[5004]: I1201 08:20:15.578190 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dg9cf" event={"ID":"19da9663-9e98-41f0-a737-0c2683293496","Type":"ContainerDied","Data":"82271e5359a06cdbb1f6c56401f792c5b2fb6bafb7fed178312d8a044b0e6d60"} Dec 01 08:20:15 crc kubenswrapper[5004]: I1201 08:20:15.582753 5004 generic.go:334] "Generic (PLEG): container finished" podID="50e55125-2439-4ff5-87f4-e15db8c26dae" containerID="5a25de6c21b3c95c5c331bde6b74062ba094b575da200747f1b11db8fe63d69b" exitCode=0 Dec 01 08:20:15 crc kubenswrapper[5004]: I1201 08:20:15.582796 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-djhzh" event={"ID":"50e55125-2439-4ff5-87f4-e15db8c26dae","Type":"ContainerDied","Data":"5a25de6c21b3c95c5c331bde6b74062ba094b575da200747f1b11db8fe63d69b"} Dec 01 08:20:15 crc kubenswrapper[5004]: I1201 08:20:15.585444 5004 generic.go:334] "Generic (PLEG): container finished" podID="9ac4710f-ad3b-471c-861d-622e995871cd" containerID="67ef06c26d9aff37fd4b87284932273cc996f6df75479b6a8cb00b9c6cdb8f99" exitCode=0 Dec 01 08:20:15 crc kubenswrapper[5004]: I1201 08:20:15.585495 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mgs69" 
event={"ID":"9ac4710f-ad3b-471c-861d-622e995871cd","Type":"ContainerDied","Data":"67ef06c26d9aff37fd4b87284932273cc996f6df75479b6a8cb00b9c6cdb8f99"} Dec 01 08:20:15 crc kubenswrapper[5004]: I1201 08:20:15.587885 5004 generic.go:334] "Generic (PLEG): container finished" podID="56a0aeb3-484c-4e7e-9ff7-b3bc75f253f4" containerID="782aa8929e5fd32c5ea22d3c039cad33119a0899cd482e7f975aeb590a3461ab" exitCode=0 Dec 01 08:20:15 crc kubenswrapper[5004]: I1201 08:20:15.587988 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jd8z9" event={"ID":"56a0aeb3-484c-4e7e-9ff7-b3bc75f253f4","Type":"ContainerDied","Data":"782aa8929e5fd32c5ea22d3c039cad33119a0899cd482e7f975aeb590a3461ab"} Dec 01 08:20:15 crc kubenswrapper[5004]: I1201 08:20:15.588080 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-dqzk6" podUID="494e6c31-3cc3-45a4-b45c-8f5ffb4251fa" containerName="registry-server" containerID="cri-o://f3ad39fdb9dce7b3e51d34f6a3fb2800e1ec2a34b1bd336ee5ba8a5221b916d5" gracePeriod=2 Dec 01 08:20:16 crc kubenswrapper[5004]: I1201 08:20:16.038864 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dqzk6" Dec 01 08:20:16 crc kubenswrapper[5004]: I1201 08:20:16.207509 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9mvf9\" (UniqueName: \"kubernetes.io/projected/494e6c31-3cc3-45a4-b45c-8f5ffb4251fa-kube-api-access-9mvf9\") pod \"494e6c31-3cc3-45a4-b45c-8f5ffb4251fa\" (UID: \"494e6c31-3cc3-45a4-b45c-8f5ffb4251fa\") " Dec 01 08:20:16 crc kubenswrapper[5004]: I1201 08:20:16.207549 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/494e6c31-3cc3-45a4-b45c-8f5ffb4251fa-catalog-content\") pod \"494e6c31-3cc3-45a4-b45c-8f5ffb4251fa\" (UID: \"494e6c31-3cc3-45a4-b45c-8f5ffb4251fa\") " Dec 01 08:20:16 crc kubenswrapper[5004]: I1201 08:20:16.207645 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/494e6c31-3cc3-45a4-b45c-8f5ffb4251fa-utilities\") pod \"494e6c31-3cc3-45a4-b45c-8f5ffb4251fa\" (UID: \"494e6c31-3cc3-45a4-b45c-8f5ffb4251fa\") " Dec 01 08:20:16 crc kubenswrapper[5004]: I1201 08:20:16.208424 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/494e6c31-3cc3-45a4-b45c-8f5ffb4251fa-utilities" (OuterVolumeSpecName: "utilities") pod "494e6c31-3cc3-45a4-b45c-8f5ffb4251fa" (UID: "494e6c31-3cc3-45a4-b45c-8f5ffb4251fa"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 08:20:16 crc kubenswrapper[5004]: I1201 08:20:16.213047 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/494e6c31-3cc3-45a4-b45c-8f5ffb4251fa-kube-api-access-9mvf9" (OuterVolumeSpecName: "kube-api-access-9mvf9") pod "494e6c31-3cc3-45a4-b45c-8f5ffb4251fa" (UID: "494e6c31-3cc3-45a4-b45c-8f5ffb4251fa"). InnerVolumeSpecName "kube-api-access-9mvf9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:20:16 crc kubenswrapper[5004]: I1201 08:20:16.273477 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/494e6c31-3cc3-45a4-b45c-8f5ffb4251fa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "494e6c31-3cc3-45a4-b45c-8f5ffb4251fa" (UID: "494e6c31-3cc3-45a4-b45c-8f5ffb4251fa"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 08:20:16 crc kubenswrapper[5004]: I1201 08:20:16.309121 5004 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/494e6c31-3cc3-45a4-b45c-8f5ffb4251fa-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 08:20:16 crc kubenswrapper[5004]: I1201 08:20:16.309158 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9mvf9\" (UniqueName: \"kubernetes.io/projected/494e6c31-3cc3-45a4-b45c-8f5ffb4251fa-kube-api-access-9mvf9\") on node \"crc\" DevicePath \"\"" Dec 01 08:20:16 crc kubenswrapper[5004]: I1201 08:20:16.309168 5004 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/494e6c31-3cc3-45a4-b45c-8f5ffb4251fa-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 08:20:16 crc kubenswrapper[5004]: I1201 08:20:16.610474 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dg9cf" event={"ID":"19da9663-9e98-41f0-a737-0c2683293496","Type":"ContainerStarted","Data":"ec801c83d20861c9e218221b9eb3a2728ff98a78239de7472bae8eb8d0d9af11"} Dec 01 08:20:16 crc kubenswrapper[5004]: I1201 08:20:16.614496 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-djhzh" event={"ID":"50e55125-2439-4ff5-87f4-e15db8c26dae","Type":"ContainerStarted","Data":"bfcd1bbf6870e76913b273ad7c527c24adf446ec364494906e515d729bc349ab"} Dec 01 08:20:16 crc kubenswrapper[5004]: 
I1201 08:20:16.619742 5004 generic.go:334] "Generic (PLEG): container finished" podID="b0286c29-4a56-4e46-8820-21dfbc658c86" containerID="d40961f147a0a57e437432250ec1c726b9307da206e459eba42bc49fca28ef6e" exitCode=0 Dec 01 08:20:16 crc kubenswrapper[5004]: I1201 08:20:16.619819 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pl87w" event={"ID":"b0286c29-4a56-4e46-8820-21dfbc658c86","Type":"ContainerDied","Data":"d40961f147a0a57e437432250ec1c726b9307da206e459eba42bc49fca28ef6e"} Dec 01 08:20:16 crc kubenswrapper[5004]: I1201 08:20:16.625838 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jd8z9" event={"ID":"56a0aeb3-484c-4e7e-9ff7-b3bc75f253f4","Type":"ContainerStarted","Data":"c63f5e37abd10023683d84c37956975712137d4a6df7a00047f5d62ada1bc723"} Dec 01 08:20:16 crc kubenswrapper[5004]: I1201 08:20:16.627170 5004 generic.go:334] "Generic (PLEG): container finished" podID="494e6c31-3cc3-45a4-b45c-8f5ffb4251fa" containerID="f3ad39fdb9dce7b3e51d34f6a3fb2800e1ec2a34b1bd336ee5ba8a5221b916d5" exitCode=0 Dec 01 08:20:16 crc kubenswrapper[5004]: I1201 08:20:16.627196 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dqzk6" event={"ID":"494e6c31-3cc3-45a4-b45c-8f5ffb4251fa","Type":"ContainerDied","Data":"f3ad39fdb9dce7b3e51d34f6a3fb2800e1ec2a34b1bd336ee5ba8a5221b916d5"} Dec 01 08:20:16 crc kubenswrapper[5004]: I1201 08:20:16.627212 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dqzk6" event={"ID":"494e6c31-3cc3-45a4-b45c-8f5ffb4251fa","Type":"ContainerDied","Data":"080fbd3ee83e7487abed4046e69ff255161759096254669ee15564cfb7cd6101"} Dec 01 08:20:16 crc kubenswrapper[5004]: I1201 08:20:16.627228 5004 scope.go:117] "RemoveContainer" containerID="f3ad39fdb9dce7b3e51d34f6a3fb2800e1ec2a34b1bd336ee5ba8a5221b916d5" Dec 01 08:20:16 crc kubenswrapper[5004]: I1201 08:20:16.627322 
5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dqzk6" Dec 01 08:20:16 crc kubenswrapper[5004]: I1201 08:20:16.643197 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-dg9cf" podStartSLOduration=2.661780115 podStartE2EDuration="52.643175149s" podCreationTimestamp="2025-12-01 08:19:24 +0000 UTC" firstStartedPulling="2025-12-01 08:19:26.13140789 +0000 UTC m=+143.696399872" lastFinishedPulling="2025-12-01 08:20:16.112802924 +0000 UTC m=+193.677794906" observedRunningTime="2025-12-01 08:20:16.639338359 +0000 UTC m=+194.204330381" watchObservedRunningTime="2025-12-01 08:20:16.643175149 +0000 UTC m=+194.208167161" Dec 01 08:20:16 crc kubenswrapper[5004]: I1201 08:20:16.666274 5004 scope.go:117] "RemoveContainer" containerID="82328ea9d94e5415b9e2a6792045138c763aad3324f611d140ea9eca1e044c31" Dec 01 08:20:16 crc kubenswrapper[5004]: I1201 08:20:16.686603 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-jd8z9" podStartSLOduration=2.762608737 podStartE2EDuration="52.6865244s" podCreationTimestamp="2025-12-01 08:19:24 +0000 UTC" firstStartedPulling="2025-12-01 08:19:26.137738046 +0000 UTC m=+143.702730028" lastFinishedPulling="2025-12-01 08:20:16.061653709 +0000 UTC m=+193.626645691" observedRunningTime="2025-12-01 08:20:16.682571737 +0000 UTC m=+194.247563729" watchObservedRunningTime="2025-12-01 08:20:16.6865244 +0000 UTC m=+194.251516372" Dec 01 08:20:16 crc kubenswrapper[5004]: I1201 08:20:16.688961 5004 scope.go:117] "RemoveContainer" containerID="3e4cb69daf5619b550dcd20e5c8f317a0838f7c101dc4aa258321fdbf713ceb0" Dec 01 08:20:16 crc kubenswrapper[5004]: I1201 08:20:16.704632 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-djhzh" podStartSLOduration=2.571962618 
podStartE2EDuration="49.704612712s" podCreationTimestamp="2025-12-01 08:19:27 +0000 UTC" firstStartedPulling="2025-12-01 08:19:29.206099669 +0000 UTC m=+146.771091651" lastFinishedPulling="2025-12-01 08:20:16.338749753 +0000 UTC m=+193.903741745" observedRunningTime="2025-12-01 08:20:16.701945863 +0000 UTC m=+194.266937855" watchObservedRunningTime="2025-12-01 08:20:16.704612712 +0000 UTC m=+194.269604694" Dec 01 08:20:16 crc kubenswrapper[5004]: I1201 08:20:16.718391 5004 scope.go:117] "RemoveContainer" containerID="f3ad39fdb9dce7b3e51d34f6a3fb2800e1ec2a34b1bd336ee5ba8a5221b916d5" Dec 01 08:20:16 crc kubenswrapper[5004]: I1201 08:20:16.718531 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dqzk6"] Dec 01 08:20:16 crc kubenswrapper[5004]: E1201 08:20:16.718860 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3ad39fdb9dce7b3e51d34f6a3fb2800e1ec2a34b1bd336ee5ba8a5221b916d5\": container with ID starting with f3ad39fdb9dce7b3e51d34f6a3fb2800e1ec2a34b1bd336ee5ba8a5221b916d5 not found: ID does not exist" containerID="f3ad39fdb9dce7b3e51d34f6a3fb2800e1ec2a34b1bd336ee5ba8a5221b916d5" Dec 01 08:20:16 crc kubenswrapper[5004]: I1201 08:20:16.718910 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3ad39fdb9dce7b3e51d34f6a3fb2800e1ec2a34b1bd336ee5ba8a5221b916d5"} err="failed to get container status \"f3ad39fdb9dce7b3e51d34f6a3fb2800e1ec2a34b1bd336ee5ba8a5221b916d5\": rpc error: code = NotFound desc = could not find container \"f3ad39fdb9dce7b3e51d34f6a3fb2800e1ec2a34b1bd336ee5ba8a5221b916d5\": container with ID starting with f3ad39fdb9dce7b3e51d34f6a3fb2800e1ec2a34b1bd336ee5ba8a5221b916d5 not found: ID does not exist" Dec 01 08:20:16 crc kubenswrapper[5004]: I1201 08:20:16.718960 5004 scope.go:117] "RemoveContainer" containerID="82328ea9d94e5415b9e2a6792045138c763aad3324f611d140ea9eca1e044c31" 
Dec 01 08:20:16 crc kubenswrapper[5004]: E1201 08:20:16.719219 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82328ea9d94e5415b9e2a6792045138c763aad3324f611d140ea9eca1e044c31\": container with ID starting with 82328ea9d94e5415b9e2a6792045138c763aad3324f611d140ea9eca1e044c31 not found: ID does not exist" containerID="82328ea9d94e5415b9e2a6792045138c763aad3324f611d140ea9eca1e044c31" Dec 01 08:20:16 crc kubenswrapper[5004]: I1201 08:20:16.719239 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82328ea9d94e5415b9e2a6792045138c763aad3324f611d140ea9eca1e044c31"} err="failed to get container status \"82328ea9d94e5415b9e2a6792045138c763aad3324f611d140ea9eca1e044c31\": rpc error: code = NotFound desc = could not find container \"82328ea9d94e5415b9e2a6792045138c763aad3324f611d140ea9eca1e044c31\": container with ID starting with 82328ea9d94e5415b9e2a6792045138c763aad3324f611d140ea9eca1e044c31 not found: ID does not exist" Dec 01 08:20:16 crc kubenswrapper[5004]: I1201 08:20:16.719254 5004 scope.go:117] "RemoveContainer" containerID="3e4cb69daf5619b550dcd20e5c8f317a0838f7c101dc4aa258321fdbf713ceb0" Dec 01 08:20:16 crc kubenswrapper[5004]: E1201 08:20:16.719489 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e4cb69daf5619b550dcd20e5c8f317a0838f7c101dc4aa258321fdbf713ceb0\": container with ID starting with 3e4cb69daf5619b550dcd20e5c8f317a0838f7c101dc4aa258321fdbf713ceb0 not found: ID does not exist" containerID="3e4cb69daf5619b550dcd20e5c8f317a0838f7c101dc4aa258321fdbf713ceb0" Dec 01 08:20:16 crc kubenswrapper[5004]: I1201 08:20:16.719535 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e4cb69daf5619b550dcd20e5c8f317a0838f7c101dc4aa258321fdbf713ceb0"} err="failed to get container status 
\"3e4cb69daf5619b550dcd20e5c8f317a0838f7c101dc4aa258321fdbf713ceb0\": rpc error: code = NotFound desc = could not find container \"3e4cb69daf5619b550dcd20e5c8f317a0838f7c101dc4aa258321fdbf713ceb0\": container with ID starting with 3e4cb69daf5619b550dcd20e5c8f317a0838f7c101dc4aa258321fdbf713ceb0 not found: ID does not exist" Dec 01 08:20:16 crc kubenswrapper[5004]: I1201 08:20:16.720582 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-dqzk6"] Dec 01 08:20:16 crc kubenswrapper[5004]: I1201 08:20:16.766341 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="494e6c31-3cc3-45a4-b45c-8f5ffb4251fa" path="/var/lib/kubelet/pods/494e6c31-3cc3-45a4-b45c-8f5ffb4251fa/volumes" Dec 01 08:20:17 crc kubenswrapper[5004]: E1201 08:20:17.098443 5004 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = parsing image configuration: Get \"https://cdn01.quay.io/quayio-production-s3/sha256/b8/b856e4d37af238240aaa3504ebf72881a05d3e5875365377d4fbd3a313fe7d06?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=AKIATAAF2YHTGR23ZTE6%2F20251201%2Fus-east-1%2Fs3%2Faws4_request&X-Amz-Date=20251201T082016Z&X-Amz-Expires=600&X-Amz-SignedHeaders=host&X-Amz-Signature=71c15c92c306d4df074059ec0ef62939f82f2a7e5f32dcf4ac56bf38400d65b2&region=us-east-1&namespace=openshift-release-dev&username=openshift-release-dev+ocm_access_1b89217552bc42d1be3fb06a1aed001a&repo_name=ocp-v4.0-art-dev&akamai_signature=exp=1764578116~hmac=521ccaa01dec84ce8f2dcedc57246cdcc3ef32f9bf41a1ab12522065c6d15e3e\": remote error: tls: internal error" image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad" Dec 01 08:20:17 crc kubenswrapper[5004]: E1201 08:20:17.098699 5004 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:registry-server,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad,Command:[/bin/opm],Args:[serve /extracted-catalog/catalog --cache-dir=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:grpc,HostPort:0,ContainerPort:50051,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:GOMEMLIMIT,Value:120MiB,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{125829120 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rn7rw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[grpc_health_probe -addr=:50051],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[grpc_health_probe 
-addr=:50051],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[grpc_health_probe -addr=:50051],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:10,TerminationGracePeriodSeconds:nil,},ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-mgs69_openshift-marketplace(9ac4710f-ad3b-471c-861d-622e995871cd): ErrImagePull: parsing image configuration: Get \"https://cdn01.quay.io/quayio-production-s3/sha256/b8/b856e4d37af238240aaa3504ebf72881a05d3e5875365377d4fbd3a313fe7d06?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=AKIATAAF2YHTGR23ZTE6%2F20251201%2Fus-east-1%2Fs3%2Faws4_request&X-Amz-Date=20251201T082016Z&X-Amz-Expires=600&X-Amz-SignedHeaders=host&X-Amz-Signature=71c15c92c306d4df074059ec0ef62939f82f2a7e5f32dcf4ac56bf38400d65b2&region=us-east-1&namespace=openshift-release-dev&username=openshift-release-dev+ocm_access_1b89217552bc42d1be3fb06a1aed001a&repo_name=ocp-v4.0-art-dev&akamai_signature=exp=1764578116~hmac=521ccaa01dec84ce8f2dcedc57246cdcc3ef32f9bf41a1ab12522065c6d15e3e\": remote error: tls: internal error" logger="UnhandledError" Dec 01 08:20:17 crc kubenswrapper[5004]: E1201 08:20:17.099928 5004 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ErrImagePull: \"parsing image configuration: Get \\\"https://cdn01.quay.io/quayio-production-s3/sha256/b8/b856e4d37af238240aaa3504ebf72881a05d3e5875365377d4fbd3a313fe7d06?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=AKIATAAF2YHTGR23ZTE6%2F20251201%2Fus-east-1%2Fs3%2Faws4_request&X-Amz-Date=20251201T082016Z&X-Amz-Expires=600&X-Amz-SignedHeaders=host&X-Amz-Signature=71c15c92c306d4df074059ec0ef62939f82f2a7e5f32dcf4ac56bf38400d65b2&region=us-east-1&namespace=openshift-release-dev&username=openshift-release-dev+ocm_access_1b89217552bc42d1be3fb06a1aed001a&repo_name=ocp-v4.0-art-dev&akamai_signature=exp=1764578116~hmac=521ccaa01dec84ce8f2dcedc57246cdcc3ef32f9bf41a1ab12522065c6d15e3e\\\": remote error: tls: internal error\"" pod="openshift-marketplace/community-operators-mgs69" podUID="9ac4710f-ad3b-471c-861d-622e995871cd" Dec 01 08:20:17 crc kubenswrapper[5004]: I1201 08:20:17.635289 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pl87w" event={"ID":"b0286c29-4a56-4e46-8820-21dfbc658c86","Type":"ContainerStarted","Data":"edff7af1925601359de2784bf43e9f064567d2588c070ac07415d42a8213ae54"} Dec 01 08:20:17 crc kubenswrapper[5004]: I1201 08:20:17.637488 5004 generic.go:334] "Generic (PLEG): container finished" podID="4d7ad504-8dea-4c9c-9a5a-682d56793c9b" containerID="456763b524e30db540665a3625feccc6549aa9851d2467b7e90f85bd46921cc7" exitCode=0 Dec 01 08:20:17 crc kubenswrapper[5004]: I1201 08:20:17.637569 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6fl5f" event={"ID":"4d7ad504-8dea-4c9c-9a5a-682d56793c9b","Type":"ContainerDied","Data":"456763b524e30db540665a3625feccc6549aa9851d2467b7e90f85bd46921cc7"} Dec 01 08:20:17 crc kubenswrapper[5004]: I1201 08:20:17.658263 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/redhat-marketplace-pl87w" podStartSLOduration=2.714863996 podStartE2EDuration="51.658242415s" podCreationTimestamp="2025-12-01 08:19:26 +0000 UTC" firstStartedPulling="2025-12-01 08:19:28.182748756 +0000 UTC m=+145.747740738" lastFinishedPulling="2025-12-01 08:20:17.126127175 +0000 UTC m=+194.691119157" observedRunningTime="2025-12-01 08:20:17.655844733 +0000 UTC m=+195.220836735" watchObservedRunningTime="2025-12-01 08:20:17.658242415 +0000 UTC m=+195.223234397" Dec 01 08:20:18 crc kubenswrapper[5004]: I1201 08:20:18.225188 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-djhzh" Dec 01 08:20:18 crc kubenswrapper[5004]: I1201 08:20:18.225528 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-djhzh" Dec 01 08:20:19 crc kubenswrapper[5004]: I1201 08:20:19.257746 5004 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-djhzh" podUID="50e55125-2439-4ff5-87f4-e15db8c26dae" containerName="registry-server" probeResult="failure" output=< Dec 01 08:20:19 crc kubenswrapper[5004]: timeout: failed to connect service ":50051" within 1s Dec 01 08:20:19 crc kubenswrapper[5004]: > Dec 01 08:20:21 crc kubenswrapper[5004]: I1201 08:20:21.660714 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6fl5f" event={"ID":"4d7ad504-8dea-4c9c-9a5a-682d56793c9b","Type":"ContainerStarted","Data":"a2d6feb3df2ed8f67a38a21d62615bbae31b65216304049543c6a45a8af2a7ce"} Dec 01 08:20:21 crc kubenswrapper[5004]: I1201 08:20:21.680416 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-6fl5f" podStartSLOduration=2.867931158 podStartE2EDuration="54.680400294s" podCreationTimestamp="2025-12-01 08:19:27 +0000 UTC" firstStartedPulling="2025-12-01 08:19:28.188147868 +0000 UTC m=+145.753139850" 
lastFinishedPulling="2025-12-01 08:20:20.000616994 +0000 UTC m=+197.565608986" observedRunningTime="2025-12-01 08:20:21.678005779 +0000 UTC m=+199.242997761" watchObservedRunningTime="2025-12-01 08:20:21.680400294 +0000 UTC m=+199.245392276" Dec 01 08:20:25 crc kubenswrapper[5004]: I1201 08:20:25.040737 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-dg9cf" Dec 01 08:20:25 crc kubenswrapper[5004]: I1201 08:20:25.041108 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-dg9cf" Dec 01 08:20:25 crc kubenswrapper[5004]: I1201 08:20:25.090996 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-dg9cf" Dec 01 08:20:25 crc kubenswrapper[5004]: I1201 08:20:25.123863 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-jd8z9" Dec 01 08:20:25 crc kubenswrapper[5004]: I1201 08:20:25.124047 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-jd8z9" Dec 01 08:20:25 crc kubenswrapper[5004]: I1201 08:20:25.184255 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-jd8z9" Dec 01 08:20:25 crc kubenswrapper[5004]: I1201 08:20:25.744185 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-jd8z9" Dec 01 08:20:25 crc kubenswrapper[5004]: I1201 08:20:25.745354 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-dg9cf" Dec 01 08:20:27 crc kubenswrapper[5004]: I1201 08:20:27.025593 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-pl87w" Dec 01 08:20:27 crc kubenswrapper[5004]: 
I1201 08:20:27.025647 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-pl87w" Dec 01 08:20:27 crc kubenswrapper[5004]: I1201 08:20:27.098735 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-pl87w" Dec 01 08:20:27 crc kubenswrapper[5004]: I1201 08:20:27.449960 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-6fl5f" Dec 01 08:20:27 crc kubenswrapper[5004]: I1201 08:20:27.450028 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-6fl5f" Dec 01 08:20:27 crc kubenswrapper[5004]: I1201 08:20:27.520413 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-6fl5f" Dec 01 08:20:27 crc kubenswrapper[5004]: I1201 08:20:27.754798 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-pl87w" Dec 01 08:20:27 crc kubenswrapper[5004]: I1201 08:20:27.767985 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-6fl5f" Dec 01 08:20:28 crc kubenswrapper[5004]: I1201 08:20:28.285673 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-djhzh" Dec 01 08:20:28 crc kubenswrapper[5004]: I1201 08:20:28.352912 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-djhzh" Dec 01 08:20:29 crc kubenswrapper[5004]: I1201 08:20:29.696419 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6fl5f"] Dec 01 08:20:29 crc kubenswrapper[5004]: I1201 08:20:29.727328 5004 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/redhat-marketplace-6fl5f" podUID="4d7ad504-8dea-4c9c-9a5a-682d56793c9b" containerName="registry-server" containerID="cri-o://a2d6feb3df2ed8f67a38a21d62615bbae31b65216304049543c6a45a8af2a7ce" gracePeriod=2 Dec 01 08:20:30 crc kubenswrapper[5004]: I1201 08:20:30.179230 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6fl5f" Dec 01 08:20:30 crc kubenswrapper[5004]: I1201 08:20:30.348583 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x5dss\" (UniqueName: \"kubernetes.io/projected/4d7ad504-8dea-4c9c-9a5a-682d56793c9b-kube-api-access-x5dss\") pod \"4d7ad504-8dea-4c9c-9a5a-682d56793c9b\" (UID: \"4d7ad504-8dea-4c9c-9a5a-682d56793c9b\") " Dec 01 08:20:30 crc kubenswrapper[5004]: I1201 08:20:30.348751 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d7ad504-8dea-4c9c-9a5a-682d56793c9b-utilities\") pod \"4d7ad504-8dea-4c9c-9a5a-682d56793c9b\" (UID: \"4d7ad504-8dea-4c9c-9a5a-682d56793c9b\") " Dec 01 08:20:30 crc kubenswrapper[5004]: I1201 08:20:30.348830 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d7ad504-8dea-4c9c-9a5a-682d56793c9b-catalog-content\") pod \"4d7ad504-8dea-4c9c-9a5a-682d56793c9b\" (UID: \"4d7ad504-8dea-4c9c-9a5a-682d56793c9b\") " Dec 01 08:20:30 crc kubenswrapper[5004]: I1201 08:20:30.349866 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d7ad504-8dea-4c9c-9a5a-682d56793c9b-utilities" (OuterVolumeSpecName: "utilities") pod "4d7ad504-8dea-4c9c-9a5a-682d56793c9b" (UID: "4d7ad504-8dea-4c9c-9a5a-682d56793c9b"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 08:20:30 crc kubenswrapper[5004]: I1201 08:20:30.359696 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d7ad504-8dea-4c9c-9a5a-682d56793c9b-kube-api-access-x5dss" (OuterVolumeSpecName: "kube-api-access-x5dss") pod "4d7ad504-8dea-4c9c-9a5a-682d56793c9b" (UID: "4d7ad504-8dea-4c9c-9a5a-682d56793c9b"). InnerVolumeSpecName "kube-api-access-x5dss". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:20:30 crc kubenswrapper[5004]: I1201 08:20:30.372147 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d7ad504-8dea-4c9c-9a5a-682d56793c9b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4d7ad504-8dea-4c9c-9a5a-682d56793c9b" (UID: "4d7ad504-8dea-4c9c-9a5a-682d56793c9b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 08:20:30 crc kubenswrapper[5004]: I1201 08:20:30.450404 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x5dss\" (UniqueName: \"kubernetes.io/projected/4d7ad504-8dea-4c9c-9a5a-682d56793c9b-kube-api-access-x5dss\") on node \"crc\" DevicePath \"\"" Dec 01 08:20:30 crc kubenswrapper[5004]: I1201 08:20:30.450456 5004 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d7ad504-8dea-4c9c-9a5a-682d56793c9b-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 08:20:30 crc kubenswrapper[5004]: I1201 08:20:30.450479 5004 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d7ad504-8dea-4c9c-9a5a-682d56793c9b-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 08:20:30 crc kubenswrapper[5004]: I1201 08:20:30.736728 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mgs69" 
event={"ID":"9ac4710f-ad3b-471c-861d-622e995871cd","Type":"ContainerStarted","Data":"1488ee8f8915ae5a8d85e3c4e8568265969c9bfa45ae9de2fe5bb52a36cdf944"} Dec 01 08:20:30 crc kubenswrapper[5004]: I1201 08:20:30.743307 5004 generic.go:334] "Generic (PLEG): container finished" podID="4d7ad504-8dea-4c9c-9a5a-682d56793c9b" containerID="a2d6feb3df2ed8f67a38a21d62615bbae31b65216304049543c6a45a8af2a7ce" exitCode=0 Dec 01 08:20:30 crc kubenswrapper[5004]: I1201 08:20:30.743382 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6fl5f" event={"ID":"4d7ad504-8dea-4c9c-9a5a-682d56793c9b","Type":"ContainerDied","Data":"a2d6feb3df2ed8f67a38a21d62615bbae31b65216304049543c6a45a8af2a7ce"} Dec 01 08:20:30 crc kubenswrapper[5004]: I1201 08:20:30.743414 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6fl5f" event={"ID":"4d7ad504-8dea-4c9c-9a5a-682d56793c9b","Type":"ContainerDied","Data":"03009d147f96e714e3ae70449468f3a71750dd69c196de3267463cb845b938e2"} Dec 01 08:20:30 crc kubenswrapper[5004]: I1201 08:20:30.743432 5004 scope.go:117] "RemoveContainer" containerID="a2d6feb3df2ed8f67a38a21d62615bbae31b65216304049543c6a45a8af2a7ce" Dec 01 08:20:30 crc kubenswrapper[5004]: I1201 08:20:30.743748 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6fl5f" Dec 01 08:20:30 crc kubenswrapper[5004]: I1201 08:20:30.772199 5004 scope.go:117] "RemoveContainer" containerID="456763b524e30db540665a3625feccc6549aa9851d2467b7e90f85bd46921cc7" Dec 01 08:20:30 crc kubenswrapper[5004]: I1201 08:20:30.787362 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-mgs69" podStartSLOduration=3.249485625 podStartE2EDuration="1m5.787347741s" podCreationTimestamp="2025-12-01 08:19:25 +0000 UTC" firstStartedPulling="2025-12-01 08:19:27.153395387 +0000 UTC m=+144.718387369" lastFinishedPulling="2025-12-01 08:20:29.691257483 +0000 UTC m=+207.256249485" observedRunningTime="2025-12-01 08:20:30.784133795 +0000 UTC m=+208.349125787" watchObservedRunningTime="2025-12-01 08:20:30.787347741 +0000 UTC m=+208.352339713" Dec 01 08:20:30 crc kubenswrapper[5004]: I1201 08:20:30.806618 5004 scope.go:117] "RemoveContainer" containerID="e59bdbc20757ab353ba06d898d2fa03876048272352084f2ac28982c71b8b19b" Dec 01 08:20:30 crc kubenswrapper[5004]: I1201 08:20:30.810418 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6fl5f"] Dec 01 08:20:30 crc kubenswrapper[5004]: I1201 08:20:30.813940 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-6fl5f"] Dec 01 08:20:30 crc kubenswrapper[5004]: I1201 08:20:30.819204 5004 scope.go:117] "RemoveContainer" containerID="a2d6feb3df2ed8f67a38a21d62615bbae31b65216304049543c6a45a8af2a7ce" Dec 01 08:20:30 crc kubenswrapper[5004]: E1201 08:20:30.819663 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2d6feb3df2ed8f67a38a21d62615bbae31b65216304049543c6a45a8af2a7ce\": container with ID starting with a2d6feb3df2ed8f67a38a21d62615bbae31b65216304049543c6a45a8af2a7ce not found: ID does not exist" 
containerID="a2d6feb3df2ed8f67a38a21d62615bbae31b65216304049543c6a45a8af2a7ce" Dec 01 08:20:30 crc kubenswrapper[5004]: I1201 08:20:30.819705 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2d6feb3df2ed8f67a38a21d62615bbae31b65216304049543c6a45a8af2a7ce"} err="failed to get container status \"a2d6feb3df2ed8f67a38a21d62615bbae31b65216304049543c6a45a8af2a7ce\": rpc error: code = NotFound desc = could not find container \"a2d6feb3df2ed8f67a38a21d62615bbae31b65216304049543c6a45a8af2a7ce\": container with ID starting with a2d6feb3df2ed8f67a38a21d62615bbae31b65216304049543c6a45a8af2a7ce not found: ID does not exist" Dec 01 08:20:30 crc kubenswrapper[5004]: I1201 08:20:30.819733 5004 scope.go:117] "RemoveContainer" containerID="456763b524e30db540665a3625feccc6549aa9851d2467b7e90f85bd46921cc7" Dec 01 08:20:30 crc kubenswrapper[5004]: E1201 08:20:30.820046 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"456763b524e30db540665a3625feccc6549aa9851d2467b7e90f85bd46921cc7\": container with ID starting with 456763b524e30db540665a3625feccc6549aa9851d2467b7e90f85bd46921cc7 not found: ID does not exist" containerID="456763b524e30db540665a3625feccc6549aa9851d2467b7e90f85bd46921cc7" Dec 01 08:20:30 crc kubenswrapper[5004]: I1201 08:20:30.820158 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"456763b524e30db540665a3625feccc6549aa9851d2467b7e90f85bd46921cc7"} err="failed to get container status \"456763b524e30db540665a3625feccc6549aa9851d2467b7e90f85bd46921cc7\": rpc error: code = NotFound desc = could not find container \"456763b524e30db540665a3625feccc6549aa9851d2467b7e90f85bd46921cc7\": container with ID starting with 456763b524e30db540665a3625feccc6549aa9851d2467b7e90f85bd46921cc7 not found: ID does not exist" Dec 01 08:20:30 crc kubenswrapper[5004]: I1201 08:20:30.820272 5004 scope.go:117] 
"RemoveContainer" containerID="e59bdbc20757ab353ba06d898d2fa03876048272352084f2ac28982c71b8b19b" Dec 01 08:20:30 crc kubenswrapper[5004]: E1201 08:20:30.820939 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e59bdbc20757ab353ba06d898d2fa03876048272352084f2ac28982c71b8b19b\": container with ID starting with e59bdbc20757ab353ba06d898d2fa03876048272352084f2ac28982c71b8b19b not found: ID does not exist" containerID="e59bdbc20757ab353ba06d898d2fa03876048272352084f2ac28982c71b8b19b" Dec 01 08:20:30 crc kubenswrapper[5004]: I1201 08:20:30.820988 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e59bdbc20757ab353ba06d898d2fa03876048272352084f2ac28982c71b8b19b"} err="failed to get container status \"e59bdbc20757ab353ba06d898d2fa03876048272352084f2ac28982c71b8b19b\": rpc error: code = NotFound desc = could not find container \"e59bdbc20757ab353ba06d898d2fa03876048272352084f2ac28982c71b8b19b\": container with ID starting with e59bdbc20757ab353ba06d898d2fa03876048272352084f2ac28982c71b8b19b not found: ID does not exist" Dec 01 08:20:32 crc kubenswrapper[5004]: I1201 08:20:32.100337 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-djhzh"] Dec 01 08:20:32 crc kubenswrapper[5004]: I1201 08:20:32.102385 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-djhzh" podUID="50e55125-2439-4ff5-87f4-e15db8c26dae" containerName="registry-server" containerID="cri-o://bfcd1bbf6870e76913b273ad7c527c24adf446ec364494906e515d729bc349ab" gracePeriod=2 Dec 01 08:20:32 crc kubenswrapper[5004]: I1201 08:20:32.574999 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-djhzh" Dec 01 08:20:32 crc kubenswrapper[5004]: I1201 08:20:32.679811 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50e55125-2439-4ff5-87f4-e15db8c26dae-utilities\") pod \"50e55125-2439-4ff5-87f4-e15db8c26dae\" (UID: \"50e55125-2439-4ff5-87f4-e15db8c26dae\") " Dec 01 08:20:32 crc kubenswrapper[5004]: I1201 08:20:32.679912 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50e55125-2439-4ff5-87f4-e15db8c26dae-catalog-content\") pod \"50e55125-2439-4ff5-87f4-e15db8c26dae\" (UID: \"50e55125-2439-4ff5-87f4-e15db8c26dae\") " Dec 01 08:20:32 crc kubenswrapper[5004]: I1201 08:20:32.679982 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qqcrk\" (UniqueName: \"kubernetes.io/projected/50e55125-2439-4ff5-87f4-e15db8c26dae-kube-api-access-qqcrk\") pod \"50e55125-2439-4ff5-87f4-e15db8c26dae\" (UID: \"50e55125-2439-4ff5-87f4-e15db8c26dae\") " Dec 01 08:20:32 crc kubenswrapper[5004]: I1201 08:20:32.680952 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50e55125-2439-4ff5-87f4-e15db8c26dae-utilities" (OuterVolumeSpecName: "utilities") pod "50e55125-2439-4ff5-87f4-e15db8c26dae" (UID: "50e55125-2439-4ff5-87f4-e15db8c26dae"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 08:20:32 crc kubenswrapper[5004]: I1201 08:20:32.686815 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50e55125-2439-4ff5-87f4-e15db8c26dae-kube-api-access-qqcrk" (OuterVolumeSpecName: "kube-api-access-qqcrk") pod "50e55125-2439-4ff5-87f4-e15db8c26dae" (UID: "50e55125-2439-4ff5-87f4-e15db8c26dae"). InnerVolumeSpecName "kube-api-access-qqcrk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:20:32 crc kubenswrapper[5004]: I1201 08:20:32.765828 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d7ad504-8dea-4c9c-9a5a-682d56793c9b" path="/var/lib/kubelet/pods/4d7ad504-8dea-4c9c-9a5a-682d56793c9b/volumes" Dec 01 08:20:32 crc kubenswrapper[5004]: I1201 08:20:32.771336 5004 generic.go:334] "Generic (PLEG): container finished" podID="50e55125-2439-4ff5-87f4-e15db8c26dae" containerID="bfcd1bbf6870e76913b273ad7c527c24adf446ec364494906e515d729bc349ab" exitCode=0 Dec 01 08:20:32 crc kubenswrapper[5004]: I1201 08:20:32.771396 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-djhzh" event={"ID":"50e55125-2439-4ff5-87f4-e15db8c26dae","Type":"ContainerDied","Data":"bfcd1bbf6870e76913b273ad7c527c24adf446ec364494906e515d729bc349ab"} Dec 01 08:20:32 crc kubenswrapper[5004]: I1201 08:20:32.771431 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-djhzh" event={"ID":"50e55125-2439-4ff5-87f4-e15db8c26dae","Type":"ContainerDied","Data":"d271da7abd74019a5c1ca0e401792e264670963646b26b095fb75c0d480570a7"} Dec 01 08:20:32 crc kubenswrapper[5004]: I1201 08:20:32.771456 5004 scope.go:117] "RemoveContainer" containerID="bfcd1bbf6870e76913b273ad7c527c24adf446ec364494906e515d729bc349ab" Dec 01 08:20:32 crc kubenswrapper[5004]: I1201 08:20:32.771547 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-djhzh" Dec 01 08:20:32 crc kubenswrapper[5004]: I1201 08:20:32.783154 5004 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50e55125-2439-4ff5-87f4-e15db8c26dae-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 08:20:32 crc kubenswrapper[5004]: I1201 08:20:32.783201 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qqcrk\" (UniqueName: \"kubernetes.io/projected/50e55125-2439-4ff5-87f4-e15db8c26dae-kube-api-access-qqcrk\") on node \"crc\" DevicePath \"\"" Dec 01 08:20:32 crc kubenswrapper[5004]: I1201 08:20:32.795948 5004 scope.go:117] "RemoveContainer" containerID="5a25de6c21b3c95c5c331bde6b74062ba094b575da200747f1b11db8fe63d69b" Dec 01 08:20:32 crc kubenswrapper[5004]: I1201 08:20:32.804460 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50e55125-2439-4ff5-87f4-e15db8c26dae-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "50e55125-2439-4ff5-87f4-e15db8c26dae" (UID: "50e55125-2439-4ff5-87f4-e15db8c26dae"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 08:20:32 crc kubenswrapper[5004]: I1201 08:20:32.820040 5004 scope.go:117] "RemoveContainer" containerID="bf6ce909c7e5e2a43b2308b70506a049ee95d531e9282fbf54f2d0cc0c78ed35" Dec 01 08:20:32 crc kubenswrapper[5004]: I1201 08:20:32.839764 5004 scope.go:117] "RemoveContainer" containerID="bfcd1bbf6870e76913b273ad7c527c24adf446ec364494906e515d729bc349ab" Dec 01 08:20:32 crc kubenswrapper[5004]: E1201 08:20:32.840267 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bfcd1bbf6870e76913b273ad7c527c24adf446ec364494906e515d729bc349ab\": container with ID starting with bfcd1bbf6870e76913b273ad7c527c24adf446ec364494906e515d729bc349ab not found: ID does not exist" containerID="bfcd1bbf6870e76913b273ad7c527c24adf446ec364494906e515d729bc349ab" Dec 01 08:20:32 crc kubenswrapper[5004]: I1201 08:20:32.840347 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bfcd1bbf6870e76913b273ad7c527c24adf446ec364494906e515d729bc349ab"} err="failed to get container status \"bfcd1bbf6870e76913b273ad7c527c24adf446ec364494906e515d729bc349ab\": rpc error: code = NotFound desc = could not find container \"bfcd1bbf6870e76913b273ad7c527c24adf446ec364494906e515d729bc349ab\": container with ID starting with bfcd1bbf6870e76913b273ad7c527c24adf446ec364494906e515d729bc349ab not found: ID does not exist" Dec 01 08:20:32 crc kubenswrapper[5004]: I1201 08:20:32.840372 5004 scope.go:117] "RemoveContainer" containerID="5a25de6c21b3c95c5c331bde6b74062ba094b575da200747f1b11db8fe63d69b" Dec 01 08:20:32 crc kubenswrapper[5004]: E1201 08:20:32.840861 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a25de6c21b3c95c5c331bde6b74062ba094b575da200747f1b11db8fe63d69b\": container with ID starting with 
5a25de6c21b3c95c5c331bde6b74062ba094b575da200747f1b11db8fe63d69b not found: ID does not exist" containerID="5a25de6c21b3c95c5c331bde6b74062ba094b575da200747f1b11db8fe63d69b" Dec 01 08:20:32 crc kubenswrapper[5004]: I1201 08:20:32.840900 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a25de6c21b3c95c5c331bde6b74062ba094b575da200747f1b11db8fe63d69b"} err="failed to get container status \"5a25de6c21b3c95c5c331bde6b74062ba094b575da200747f1b11db8fe63d69b\": rpc error: code = NotFound desc = could not find container \"5a25de6c21b3c95c5c331bde6b74062ba094b575da200747f1b11db8fe63d69b\": container with ID starting with 5a25de6c21b3c95c5c331bde6b74062ba094b575da200747f1b11db8fe63d69b not found: ID does not exist" Dec 01 08:20:32 crc kubenswrapper[5004]: I1201 08:20:32.840926 5004 scope.go:117] "RemoveContainer" containerID="bf6ce909c7e5e2a43b2308b70506a049ee95d531e9282fbf54f2d0cc0c78ed35" Dec 01 08:20:32 crc kubenswrapper[5004]: E1201 08:20:32.841368 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf6ce909c7e5e2a43b2308b70506a049ee95d531e9282fbf54f2d0cc0c78ed35\": container with ID starting with bf6ce909c7e5e2a43b2308b70506a049ee95d531e9282fbf54f2d0cc0c78ed35 not found: ID does not exist" containerID="bf6ce909c7e5e2a43b2308b70506a049ee95d531e9282fbf54f2d0cc0c78ed35" Dec 01 08:20:32 crc kubenswrapper[5004]: I1201 08:20:32.841422 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf6ce909c7e5e2a43b2308b70506a049ee95d531e9282fbf54f2d0cc0c78ed35"} err="failed to get container status \"bf6ce909c7e5e2a43b2308b70506a049ee95d531e9282fbf54f2d0cc0c78ed35\": rpc error: code = NotFound desc = could not find container \"bf6ce909c7e5e2a43b2308b70506a049ee95d531e9282fbf54f2d0cc0c78ed35\": container with ID starting with bf6ce909c7e5e2a43b2308b70506a049ee95d531e9282fbf54f2d0cc0c78ed35 not found: ID does not 
exist" Dec 01 08:20:32 crc kubenswrapper[5004]: I1201 08:20:32.884476 5004 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50e55125-2439-4ff5-87f4-e15db8c26dae-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 08:20:33 crc kubenswrapper[5004]: I1201 08:20:33.095037 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-djhzh"] Dec 01 08:20:33 crc kubenswrapper[5004]: I1201 08:20:33.098510 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-djhzh"] Dec 01 08:20:34 crc kubenswrapper[5004]: I1201 08:20:34.772775 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50e55125-2439-4ff5-87f4-e15db8c26dae" path="/var/lib/kubelet/pods/50e55125-2439-4ff5-87f4-e15db8c26dae/volumes" Dec 01 08:20:35 crc kubenswrapper[5004]: I1201 08:20:35.448760 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-mgs69" Dec 01 08:20:35 crc kubenswrapper[5004]: I1201 08:20:35.448827 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-mgs69" Dec 01 08:20:35 crc kubenswrapper[5004]: I1201 08:20:35.491519 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-mgs69" Dec 01 08:20:35 crc kubenswrapper[5004]: I1201 08:20:35.840281 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-mgs69" Dec 01 08:20:36 crc kubenswrapper[5004]: I1201 08:20:36.900702 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mgs69"] Dec 01 08:20:37 crc kubenswrapper[5004]: I1201 08:20:37.812431 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-mgs69" 
podUID="9ac4710f-ad3b-471c-861d-622e995871cd" containerName="registry-server" containerID="cri-o://1488ee8f8915ae5a8d85e3c4e8568265969c9bfa45ae9de2fe5bb52a36cdf944" gracePeriod=2 Dec 01 08:20:38 crc kubenswrapper[5004]: I1201 08:20:38.384970 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mgs69" Dec 01 08:20:38 crc kubenswrapper[5004]: I1201 08:20:38.565201 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ac4710f-ad3b-471c-861d-622e995871cd-catalog-content\") pod \"9ac4710f-ad3b-471c-861d-622e995871cd\" (UID: \"9ac4710f-ad3b-471c-861d-622e995871cd\") " Dec 01 08:20:38 crc kubenswrapper[5004]: I1201 08:20:38.565354 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rn7rw\" (UniqueName: \"kubernetes.io/projected/9ac4710f-ad3b-471c-861d-622e995871cd-kube-api-access-rn7rw\") pod \"9ac4710f-ad3b-471c-861d-622e995871cd\" (UID: \"9ac4710f-ad3b-471c-861d-622e995871cd\") " Dec 01 08:20:38 crc kubenswrapper[5004]: I1201 08:20:38.565392 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ac4710f-ad3b-471c-861d-622e995871cd-utilities\") pod \"9ac4710f-ad3b-471c-861d-622e995871cd\" (UID: \"9ac4710f-ad3b-471c-861d-622e995871cd\") " Dec 01 08:20:38 crc kubenswrapper[5004]: I1201 08:20:38.567718 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ac4710f-ad3b-471c-861d-622e995871cd-utilities" (OuterVolumeSpecName: "utilities") pod "9ac4710f-ad3b-471c-861d-622e995871cd" (UID: "9ac4710f-ad3b-471c-861d-622e995871cd"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 08:20:38 crc kubenswrapper[5004]: I1201 08:20:38.576600 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ac4710f-ad3b-471c-861d-622e995871cd-kube-api-access-rn7rw" (OuterVolumeSpecName: "kube-api-access-rn7rw") pod "9ac4710f-ad3b-471c-861d-622e995871cd" (UID: "9ac4710f-ad3b-471c-861d-622e995871cd"). InnerVolumeSpecName "kube-api-access-rn7rw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:20:38 crc kubenswrapper[5004]: I1201 08:20:38.629854 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ac4710f-ad3b-471c-861d-622e995871cd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9ac4710f-ad3b-471c-861d-622e995871cd" (UID: "9ac4710f-ad3b-471c-861d-622e995871cd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 08:20:38 crc kubenswrapper[5004]: I1201 08:20:38.667348 5004 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ac4710f-ad3b-471c-861d-622e995871cd-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 08:20:38 crc kubenswrapper[5004]: I1201 08:20:38.667379 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rn7rw\" (UniqueName: \"kubernetes.io/projected/9ac4710f-ad3b-471c-861d-622e995871cd-kube-api-access-rn7rw\") on node \"crc\" DevicePath \"\"" Dec 01 08:20:38 crc kubenswrapper[5004]: I1201 08:20:38.667395 5004 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ac4710f-ad3b-471c-861d-622e995871cd-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 08:20:38 crc kubenswrapper[5004]: I1201 08:20:38.729864 5004 patch_prober.go:28] interesting pod/machine-config-daemon-fvdgt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe 
status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 08:20:38 crc kubenswrapper[5004]: I1201 08:20:38.729925 5004 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 08:20:38 crc kubenswrapper[5004]: I1201 08:20:38.729974 5004 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" Dec 01 08:20:38 crc kubenswrapper[5004]: I1201 08:20:38.730501 5004 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6c1e51fa92aec80e93d0fce11c743b8c4fde23fc23d8626e8d5b3e25ab42350d"} pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 08:20:38 crc kubenswrapper[5004]: I1201 08:20:38.730593 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" containerName="machine-config-daemon" containerID="cri-o://6c1e51fa92aec80e93d0fce11c743b8c4fde23fc23d8626e8d5b3e25ab42350d" gracePeriod=600 Dec 01 08:20:38 crc kubenswrapper[5004]: I1201 08:20:38.821418 5004 generic.go:334] "Generic (PLEG): container finished" podID="9ac4710f-ad3b-471c-861d-622e995871cd" containerID="1488ee8f8915ae5a8d85e3c4e8568265969c9bfa45ae9de2fe5bb52a36cdf944" exitCode=0 Dec 01 08:20:38 crc kubenswrapper[5004]: I1201 08:20:38.821464 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mgs69" 
event={"ID":"9ac4710f-ad3b-471c-861d-622e995871cd","Type":"ContainerDied","Data":"1488ee8f8915ae5a8d85e3c4e8568265969c9bfa45ae9de2fe5bb52a36cdf944"} Dec 01 08:20:38 crc kubenswrapper[5004]: I1201 08:20:38.821493 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mgs69" event={"ID":"9ac4710f-ad3b-471c-861d-622e995871cd","Type":"ContainerDied","Data":"8fd7fd4f0691c759ff143a52dc6f3242fa255b64d63bcf8200d30713d3b2a58b"} Dec 01 08:20:38 crc kubenswrapper[5004]: I1201 08:20:38.821535 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mgs69" Dec 01 08:20:38 crc kubenswrapper[5004]: I1201 08:20:38.821536 5004 scope.go:117] "RemoveContainer" containerID="1488ee8f8915ae5a8d85e3c4e8568265969c9bfa45ae9de2fe5bb52a36cdf944" Dec 01 08:20:38 crc kubenswrapper[5004]: I1201 08:20:38.847680 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mgs69"] Dec 01 08:20:38 crc kubenswrapper[5004]: I1201 08:20:38.847888 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-mgs69"] Dec 01 08:20:38 crc kubenswrapper[5004]: I1201 08:20:38.853686 5004 scope.go:117] "RemoveContainer" containerID="67ef06c26d9aff37fd4b87284932273cc996f6df75479b6a8cb00b9c6cdb8f99" Dec 01 08:20:38 crc kubenswrapper[5004]: I1201 08:20:38.893242 5004 scope.go:117] "RemoveContainer" containerID="0c041dd9cc98ed40a4edf455b208a49e72d4b8ca2d88ebb1aec12f9560fa8640" Dec 01 08:20:38 crc kubenswrapper[5004]: I1201 08:20:38.920998 5004 scope.go:117] "RemoveContainer" containerID="1488ee8f8915ae5a8d85e3c4e8568265969c9bfa45ae9de2fe5bb52a36cdf944" Dec 01 08:20:38 crc kubenswrapper[5004]: E1201 08:20:38.921533 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1488ee8f8915ae5a8d85e3c4e8568265969c9bfa45ae9de2fe5bb52a36cdf944\": container 
with ID starting with 1488ee8f8915ae5a8d85e3c4e8568265969c9bfa45ae9de2fe5bb52a36cdf944 not found: ID does not exist" containerID="1488ee8f8915ae5a8d85e3c4e8568265969c9bfa45ae9de2fe5bb52a36cdf944" Dec 01 08:20:38 crc kubenswrapper[5004]: I1201 08:20:38.921619 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1488ee8f8915ae5a8d85e3c4e8568265969c9bfa45ae9de2fe5bb52a36cdf944"} err="failed to get container status \"1488ee8f8915ae5a8d85e3c4e8568265969c9bfa45ae9de2fe5bb52a36cdf944\": rpc error: code = NotFound desc = could not find container \"1488ee8f8915ae5a8d85e3c4e8568265969c9bfa45ae9de2fe5bb52a36cdf944\": container with ID starting with 1488ee8f8915ae5a8d85e3c4e8568265969c9bfa45ae9de2fe5bb52a36cdf944 not found: ID does not exist" Dec 01 08:20:38 crc kubenswrapper[5004]: I1201 08:20:38.921653 5004 scope.go:117] "RemoveContainer" containerID="67ef06c26d9aff37fd4b87284932273cc996f6df75479b6a8cb00b9c6cdb8f99" Dec 01 08:20:38 crc kubenswrapper[5004]: E1201 08:20:38.922008 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"67ef06c26d9aff37fd4b87284932273cc996f6df75479b6a8cb00b9c6cdb8f99\": container with ID starting with 67ef06c26d9aff37fd4b87284932273cc996f6df75479b6a8cb00b9c6cdb8f99 not found: ID does not exist" containerID="67ef06c26d9aff37fd4b87284932273cc996f6df75479b6a8cb00b9c6cdb8f99" Dec 01 08:20:38 crc kubenswrapper[5004]: I1201 08:20:38.922050 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67ef06c26d9aff37fd4b87284932273cc996f6df75479b6a8cb00b9c6cdb8f99"} err="failed to get container status \"67ef06c26d9aff37fd4b87284932273cc996f6df75479b6a8cb00b9c6cdb8f99\": rpc error: code = NotFound desc = could not find container \"67ef06c26d9aff37fd4b87284932273cc996f6df75479b6a8cb00b9c6cdb8f99\": container with ID starting with 67ef06c26d9aff37fd4b87284932273cc996f6df75479b6a8cb00b9c6cdb8f99 not 
found: ID does not exist" Dec 01 08:20:38 crc kubenswrapper[5004]: I1201 08:20:38.922076 5004 scope.go:117] "RemoveContainer" containerID="0c041dd9cc98ed40a4edf455b208a49e72d4b8ca2d88ebb1aec12f9560fa8640" Dec 01 08:20:38 crc kubenswrapper[5004]: E1201 08:20:38.922455 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c041dd9cc98ed40a4edf455b208a49e72d4b8ca2d88ebb1aec12f9560fa8640\": container with ID starting with 0c041dd9cc98ed40a4edf455b208a49e72d4b8ca2d88ebb1aec12f9560fa8640 not found: ID does not exist" containerID="0c041dd9cc98ed40a4edf455b208a49e72d4b8ca2d88ebb1aec12f9560fa8640" Dec 01 08:20:38 crc kubenswrapper[5004]: I1201 08:20:38.922512 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c041dd9cc98ed40a4edf455b208a49e72d4b8ca2d88ebb1aec12f9560fa8640"} err="failed to get container status \"0c041dd9cc98ed40a4edf455b208a49e72d4b8ca2d88ebb1aec12f9560fa8640\": rpc error: code = NotFound desc = could not find container \"0c041dd9cc98ed40a4edf455b208a49e72d4b8ca2d88ebb1aec12f9560fa8640\": container with ID starting with 0c041dd9cc98ed40a4edf455b208a49e72d4b8ca2d88ebb1aec12f9560fa8640 not found: ID does not exist" Dec 01 08:20:39 crc kubenswrapper[5004]: I1201 08:20:39.164844 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-p77t7"] Dec 01 08:20:39 crc kubenswrapper[5004]: I1201 08:20:39.833266 5004 generic.go:334] "Generic (PLEG): container finished" podID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" containerID="6c1e51fa92aec80e93d0fce11c743b8c4fde23fc23d8626e8d5b3e25ab42350d" exitCode=0 Dec 01 08:20:39 crc kubenswrapper[5004]: I1201 08:20:39.833335 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" 
event={"ID":"f9977ebb-82de-4e96-8763-0b5a84f8d4ce","Type":"ContainerDied","Data":"6c1e51fa92aec80e93d0fce11c743b8c4fde23fc23d8626e8d5b3e25ab42350d"} Dec 01 08:20:39 crc kubenswrapper[5004]: I1201 08:20:39.833391 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" event={"ID":"f9977ebb-82de-4e96-8763-0b5a84f8d4ce","Type":"ContainerStarted","Data":"aee81d40a16962a7717cc3a5a3263157cb0e536c40bc2b3b83dfa0f852f31e2a"} Dec 01 08:20:40 crc kubenswrapper[5004]: I1201 08:20:40.766973 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ac4710f-ad3b-471c-861d-622e995871cd" path="/var/lib/kubelet/pods/9ac4710f-ad3b-471c-861d-622e995871cd/volumes" Dec 01 08:20:51 crc kubenswrapper[5004]: I1201 08:20:51.939680 5004 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 01 08:20:51 crc kubenswrapper[5004]: E1201 08:20:51.941100 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="494e6c31-3cc3-45a4-b45c-8f5ffb4251fa" containerName="registry-server" Dec 01 08:20:51 crc kubenswrapper[5004]: I1201 08:20:51.941123 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="494e6c31-3cc3-45a4-b45c-8f5ffb4251fa" containerName="registry-server" Dec 01 08:20:51 crc kubenswrapper[5004]: E1201 08:20:51.941139 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d7ad504-8dea-4c9c-9a5a-682d56793c9b" containerName="extract-content" Dec 01 08:20:51 crc kubenswrapper[5004]: I1201 08:20:51.941149 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d7ad504-8dea-4c9c-9a5a-682d56793c9b" containerName="extract-content" Dec 01 08:20:51 crc kubenswrapper[5004]: E1201 08:20:51.941164 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d7ad504-8dea-4c9c-9a5a-682d56793c9b" containerName="extract-utilities" Dec 01 08:20:51 crc kubenswrapper[5004]: I1201 08:20:51.941172 5004 
state_mem.go:107] "Deleted CPUSet assignment" podUID="4d7ad504-8dea-4c9c-9a5a-682d56793c9b" containerName="extract-utilities" Dec 01 08:20:51 crc kubenswrapper[5004]: E1201 08:20:51.941184 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ac4710f-ad3b-471c-861d-622e995871cd" containerName="registry-server" Dec 01 08:20:51 crc kubenswrapper[5004]: I1201 08:20:51.941194 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ac4710f-ad3b-471c-861d-622e995871cd" containerName="registry-server" Dec 01 08:20:51 crc kubenswrapper[5004]: E1201 08:20:51.941214 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="494e6c31-3cc3-45a4-b45c-8f5ffb4251fa" containerName="extract-utilities" Dec 01 08:20:51 crc kubenswrapper[5004]: I1201 08:20:51.941222 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="494e6c31-3cc3-45a4-b45c-8f5ffb4251fa" containerName="extract-utilities" Dec 01 08:20:51 crc kubenswrapper[5004]: E1201 08:20:51.941233 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ac4710f-ad3b-471c-861d-622e995871cd" containerName="extract-content" Dec 01 08:20:51 crc kubenswrapper[5004]: I1201 08:20:51.941241 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ac4710f-ad3b-471c-861d-622e995871cd" containerName="extract-content" Dec 01 08:20:51 crc kubenswrapper[5004]: E1201 08:20:51.941254 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="494e6c31-3cc3-45a4-b45c-8f5ffb4251fa" containerName="extract-content" Dec 01 08:20:51 crc kubenswrapper[5004]: I1201 08:20:51.941263 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="494e6c31-3cc3-45a4-b45c-8f5ffb4251fa" containerName="extract-content" Dec 01 08:20:51 crc kubenswrapper[5004]: E1201 08:20:51.941275 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d7ad504-8dea-4c9c-9a5a-682d56793c9b" containerName="registry-server" Dec 01 08:20:51 crc kubenswrapper[5004]: I1201 08:20:51.941283 5004 
state_mem.go:107] "Deleted CPUSet assignment" podUID="4d7ad504-8dea-4c9c-9a5a-682d56793c9b" containerName="registry-server" Dec 01 08:20:51 crc kubenswrapper[5004]: E1201 08:20:51.941296 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50e55125-2439-4ff5-87f4-e15db8c26dae" containerName="registry-server" Dec 01 08:20:51 crc kubenswrapper[5004]: I1201 08:20:51.941306 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="50e55125-2439-4ff5-87f4-e15db8c26dae" containerName="registry-server" Dec 01 08:20:51 crc kubenswrapper[5004]: E1201 08:20:51.941318 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50e55125-2439-4ff5-87f4-e15db8c26dae" containerName="extract-content" Dec 01 08:20:51 crc kubenswrapper[5004]: I1201 08:20:51.941327 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="50e55125-2439-4ff5-87f4-e15db8c26dae" containerName="extract-content" Dec 01 08:20:51 crc kubenswrapper[5004]: E1201 08:20:51.941339 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ac4710f-ad3b-471c-861d-622e995871cd" containerName="extract-utilities" Dec 01 08:20:51 crc kubenswrapper[5004]: I1201 08:20:51.941347 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ac4710f-ad3b-471c-861d-622e995871cd" containerName="extract-utilities" Dec 01 08:20:51 crc kubenswrapper[5004]: E1201 08:20:51.941358 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50e55125-2439-4ff5-87f4-e15db8c26dae" containerName="extract-utilities" Dec 01 08:20:51 crc kubenswrapper[5004]: I1201 08:20:51.941366 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="50e55125-2439-4ff5-87f4-e15db8c26dae" containerName="extract-utilities" Dec 01 08:20:51 crc kubenswrapper[5004]: I1201 08:20:51.941507 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="50e55125-2439-4ff5-87f4-e15db8c26dae" containerName="registry-server" Dec 01 08:20:51 crc kubenswrapper[5004]: I1201 08:20:51.941529 5004 
memory_manager.go:354] "RemoveStaleState removing state" podUID="4d7ad504-8dea-4c9c-9a5a-682d56793c9b" containerName="registry-server" Dec 01 08:20:51 crc kubenswrapper[5004]: I1201 08:20:51.941543 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="494e6c31-3cc3-45a4-b45c-8f5ffb4251fa" containerName="registry-server" Dec 01 08:20:51 crc kubenswrapper[5004]: I1201 08:20:51.941575 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ac4710f-ad3b-471c-861d-622e995871cd" containerName="registry-server" Dec 01 08:20:51 crc kubenswrapper[5004]: I1201 08:20:51.942117 5004 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 01 08:20:51 crc kubenswrapper[5004]: I1201 08:20:51.942302 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 08:20:51 crc kubenswrapper[5004]: I1201 08:20:51.942536 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://45a0d7a12e7c114de1740a151e04f7f39844b709740cab958f156ac918af3228" gracePeriod=15 Dec 01 08:20:51 crc kubenswrapper[5004]: I1201 08:20:51.942812 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://aa5f341e99100e985f65c24647d976a24782dfae7f52e752b774f41f69af27fd" gracePeriod=15 Dec 01 08:20:51 crc kubenswrapper[5004]: I1201 08:20:51.942874 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" 
containerID="cri-o://2caeb2ebaf68110cb8728e60aae6db1b1fc48efae6018f66070766a4f42caa69" gracePeriod=15 Dec 01 08:20:51 crc kubenswrapper[5004]: I1201 08:20:51.942950 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://d69a370af9f8f6c575f99da6bb01cee0f660dccb7e85f16f5d8874d0e263e1fd" gracePeriod=15 Dec 01 08:20:51 crc kubenswrapper[5004]: I1201 08:20:51.942971 5004 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 01 08:20:51 crc kubenswrapper[5004]: I1201 08:20:51.943003 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://c82baee0fd931fd7b41cd5a73ca7e653ef13dfa3d573991cbe3238d6b95f6fac" gracePeriod=15 Dec 01 08:20:51 crc kubenswrapper[5004]: E1201 08:20:51.943618 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 01 08:20:51 crc kubenswrapper[5004]: I1201 08:20:51.943649 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 01 08:20:51 crc kubenswrapper[5004]: E1201 08:20:51.943680 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 01 08:20:51 crc kubenswrapper[5004]: I1201 08:20:51.943692 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 01 08:20:51 crc kubenswrapper[5004]: E1201 08:20:51.943706 5004 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 01 08:20:51 crc kubenswrapper[5004]: I1201 08:20:51.943722 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 01 08:20:51 crc kubenswrapper[5004]: E1201 08:20:51.943739 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 01 08:20:51 crc kubenswrapper[5004]: I1201 08:20:51.943752 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 01 08:20:51 crc kubenswrapper[5004]: E1201 08:20:51.943777 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 01 08:20:51 crc kubenswrapper[5004]: I1201 08:20:51.943789 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 01 08:20:51 crc kubenswrapper[5004]: E1201 08:20:51.943809 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 01 08:20:51 crc kubenswrapper[5004]: I1201 08:20:51.943821 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 01 08:20:51 crc kubenswrapper[5004]: I1201 08:20:51.944005 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 01 08:20:51 crc kubenswrapper[5004]: I1201 08:20:51.944085 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 01 08:20:51 crc kubenswrapper[5004]: I1201 08:20:51.944104 5004 
memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 01 08:20:51 crc kubenswrapper[5004]: I1201 08:20:51.944120 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 01 08:20:51 crc kubenswrapper[5004]: I1201 08:20:51.944139 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 01 08:20:51 crc kubenswrapper[5004]: I1201 08:20:51.944157 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 01 08:20:51 crc kubenswrapper[5004]: E1201 08:20:51.944333 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 01 08:20:51 crc kubenswrapper[5004]: I1201 08:20:51.944348 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 01 08:20:51 crc kubenswrapper[5004]: I1201 08:20:51.947154 5004 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="f4b27818a5e8e43d0dc095d08835c792" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" Dec 01 08:20:52 crc kubenswrapper[5004]: I1201 08:20:52.061944 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 08:20:52 crc kubenswrapper[5004]: I1201 08:20:52.062024 5004 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 08:20:52 crc kubenswrapper[5004]: I1201 08:20:52.062067 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 08:20:52 crc kubenswrapper[5004]: I1201 08:20:52.062095 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 08:20:52 crc kubenswrapper[5004]: I1201 08:20:52.062234 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 08:20:52 crc kubenswrapper[5004]: I1201 08:20:52.062269 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 08:20:52 crc kubenswrapper[5004]: I1201 08:20:52.062298 5004 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 08:20:52 crc kubenswrapper[5004]: I1201 08:20:52.062338 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 08:20:52 crc kubenswrapper[5004]: I1201 08:20:52.163093 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 08:20:52 crc kubenswrapper[5004]: I1201 08:20:52.163158 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 08:20:52 crc kubenswrapper[5004]: I1201 08:20:52.163198 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 08:20:52 crc kubenswrapper[5004]: I1201 
08:20:52.163226 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 08:20:52 crc kubenswrapper[5004]: I1201 08:20:52.163239 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 08:20:52 crc kubenswrapper[5004]: I1201 08:20:52.163272 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 08:20:52 crc kubenswrapper[5004]: I1201 08:20:52.163303 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 08:20:52 crc kubenswrapper[5004]: I1201 08:20:52.163319 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 08:20:52 crc kubenswrapper[5004]: I1201 08:20:52.163344 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 08:20:52 crc kubenswrapper[5004]: I1201 08:20:52.163353 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 08:20:52 crc kubenswrapper[5004]: I1201 08:20:52.163375 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 08:20:52 crc kubenswrapper[5004]: I1201 08:20:52.163406 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 08:20:52 crc kubenswrapper[5004]: I1201 08:20:52.163442 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 08:20:52 crc kubenswrapper[5004]: I1201 08:20:52.163483 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 08:20:52 crc kubenswrapper[5004]: I1201 08:20:52.163501 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 08:20:52 crc kubenswrapper[5004]: I1201 08:20:52.163515 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 08:20:52 crc kubenswrapper[5004]: I1201 08:20:52.929123 5004 generic.go:334] "Generic (PLEG): container finished" podID="af8fc39f-8aa1-4e0d-8fe1-33aa79852f78" containerID="3cac78010ceea6d134852b8f0e220fd6f5eec37a19b29bb2279cef8ebe2cee13" exitCode=0 Dec 01 08:20:52 crc kubenswrapper[5004]: I1201 08:20:52.929788 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"af8fc39f-8aa1-4e0d-8fe1-33aa79852f78","Type":"ContainerDied","Data":"3cac78010ceea6d134852b8f0e220fd6f5eec37a19b29bb2279cef8ebe2cee13"} Dec 01 08:20:52 crc kubenswrapper[5004]: I1201 08:20:52.931238 5004 status_manager.go:851] "Failed to get status for pod" podUID="af8fc39f-8aa1-4e0d-8fe1-33aa79852f78" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.75:6443: connect: connection refused" Dec 01 08:20:52 crc kubenswrapper[5004]: I1201 08:20:52.933537 
5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 01 08:20:52 crc kubenswrapper[5004]: I1201 08:20:52.935738 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 01 08:20:52 crc kubenswrapper[5004]: I1201 08:20:52.937614 5004 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="aa5f341e99100e985f65c24647d976a24782dfae7f52e752b774f41f69af27fd" exitCode=0 Dec 01 08:20:52 crc kubenswrapper[5004]: I1201 08:20:52.937852 5004 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="2caeb2ebaf68110cb8728e60aae6db1b1fc48efae6018f66070766a4f42caa69" exitCode=0 Dec 01 08:20:52 crc kubenswrapper[5004]: I1201 08:20:52.937784 5004 scope.go:117] "RemoveContainer" containerID="ad581564fc519328ae429bce757797f6d87d8648e862b008637a245866ec4cbe" Dec 01 08:20:52 crc kubenswrapper[5004]: I1201 08:20:52.938036 5004 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="d69a370af9f8f6c575f99da6bb01cee0f660dccb7e85f16f5d8874d0e263e1fd" exitCode=0 Dec 01 08:20:52 crc kubenswrapper[5004]: I1201 08:20:52.938215 5004 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="c82baee0fd931fd7b41cd5a73ca7e653ef13dfa3d573991cbe3238d6b95f6fac" exitCode=2 Dec 01 08:20:53 crc kubenswrapper[5004]: I1201 08:20:53.951853 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 01 08:20:54 crc kubenswrapper[5004]: I1201 08:20:54.351015 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 01 08:20:54 crc kubenswrapper[5004]: I1201 08:20:54.352266 5004 status_manager.go:851] "Failed to get status for pod" podUID="af8fc39f-8aa1-4e0d-8fe1-33aa79852f78" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.75:6443: connect: connection refused" Dec 01 08:20:54 crc kubenswrapper[5004]: I1201 08:20:54.397756 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/af8fc39f-8aa1-4e0d-8fe1-33aa79852f78-var-lock\") pod \"af8fc39f-8aa1-4e0d-8fe1-33aa79852f78\" (UID: \"af8fc39f-8aa1-4e0d-8fe1-33aa79852f78\") " Dec 01 08:20:54 crc kubenswrapper[5004]: I1201 08:20:54.397801 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/af8fc39f-8aa1-4e0d-8fe1-33aa79852f78-kubelet-dir\") pod \"af8fc39f-8aa1-4e0d-8fe1-33aa79852f78\" (UID: \"af8fc39f-8aa1-4e0d-8fe1-33aa79852f78\") " Dec 01 08:20:54 crc kubenswrapper[5004]: I1201 08:20:54.397843 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/af8fc39f-8aa1-4e0d-8fe1-33aa79852f78-kube-api-access\") pod \"af8fc39f-8aa1-4e0d-8fe1-33aa79852f78\" (UID: \"af8fc39f-8aa1-4e0d-8fe1-33aa79852f78\") " Dec 01 08:20:54 crc kubenswrapper[5004]: I1201 08:20:54.397950 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/af8fc39f-8aa1-4e0d-8fe1-33aa79852f78-var-lock" (OuterVolumeSpecName: "var-lock") pod "af8fc39f-8aa1-4e0d-8fe1-33aa79852f78" (UID: "af8fc39f-8aa1-4e0d-8fe1-33aa79852f78"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 08:20:54 crc kubenswrapper[5004]: I1201 08:20:54.398005 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/af8fc39f-8aa1-4e0d-8fe1-33aa79852f78-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "af8fc39f-8aa1-4e0d-8fe1-33aa79852f78" (UID: "af8fc39f-8aa1-4e0d-8fe1-33aa79852f78"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 08:20:54 crc kubenswrapper[5004]: I1201 08:20:54.398353 5004 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/af8fc39f-8aa1-4e0d-8fe1-33aa79852f78-var-lock\") on node \"crc\" DevicePath \"\"" Dec 01 08:20:54 crc kubenswrapper[5004]: I1201 08:20:54.398380 5004 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/af8fc39f-8aa1-4e0d-8fe1-33aa79852f78-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 01 08:20:54 crc kubenswrapper[5004]: I1201 08:20:54.405847 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af8fc39f-8aa1-4e0d-8fe1-33aa79852f78-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "af8fc39f-8aa1-4e0d-8fe1-33aa79852f78" (UID: "af8fc39f-8aa1-4e0d-8fe1-33aa79852f78"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:20:54 crc kubenswrapper[5004]: I1201 08:20:54.499339 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/af8fc39f-8aa1-4e0d-8fe1-33aa79852f78-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 01 08:20:54 crc kubenswrapper[5004]: I1201 08:20:54.837525 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 01 08:20:54 crc kubenswrapper[5004]: I1201 08:20:54.838845 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 08:20:54 crc kubenswrapper[5004]: I1201 08:20:54.839811 5004 status_manager.go:851] "Failed to get status for pod" podUID="af8fc39f-8aa1-4e0d-8fe1-33aa79852f78" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.75:6443: connect: connection refused" Dec 01 08:20:54 crc kubenswrapper[5004]: I1201 08:20:54.840448 5004 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.75:6443: connect: connection refused" Dec 01 08:20:54 crc kubenswrapper[5004]: I1201 08:20:54.902976 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 01 08:20:54 crc kubenswrapper[5004]: I1201 08:20:54.903012 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 01 08:20:54 crc kubenswrapper[5004]: I1201 08:20:54.903042 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 01 08:20:54 crc kubenswrapper[5004]: I1201 08:20:54.903253 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 08:20:54 crc kubenswrapper[5004]: I1201 08:20:54.903283 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 08:20:54 crc kubenswrapper[5004]: I1201 08:20:54.903300 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 08:20:54 crc kubenswrapper[5004]: I1201 08:20:54.965817 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 01 08:20:54 crc kubenswrapper[5004]: I1201 08:20:54.966734 5004 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="45a0d7a12e7c114de1740a151e04f7f39844b709740cab958f156ac918af3228" exitCode=0 Dec 01 08:20:54 crc kubenswrapper[5004]: I1201 08:20:54.966955 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 08:20:54 crc kubenswrapper[5004]: I1201 08:20:54.967196 5004 scope.go:117] "RemoveContainer" containerID="aa5f341e99100e985f65c24647d976a24782dfae7f52e752b774f41f69af27fd" Dec 01 08:20:54 crc kubenswrapper[5004]: I1201 08:20:54.969704 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"af8fc39f-8aa1-4e0d-8fe1-33aa79852f78","Type":"ContainerDied","Data":"de753a5332f8cf7d717c3afe9c6e3657d53435793a1d2776f9228762a754c26b"} Dec 01 08:20:54 crc kubenswrapper[5004]: I1201 08:20:54.969741 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="de753a5332f8cf7d717c3afe9c6e3657d53435793a1d2776f9228762a754c26b" Dec 01 08:20:54 crc kubenswrapper[5004]: I1201 08:20:54.969800 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 01 08:20:54 crc kubenswrapper[5004]: I1201 08:20:54.974235 5004 status_manager.go:851] "Failed to get status for pod" podUID="af8fc39f-8aa1-4e0d-8fe1-33aa79852f78" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.75:6443: connect: connection refused" Dec 01 08:20:54 crc kubenswrapper[5004]: I1201 08:20:54.974910 5004 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.75:6443: connect: connection refused" Dec 01 08:20:54 crc kubenswrapper[5004]: I1201 08:20:54.995463 5004 status_manager.go:851] "Failed to get status for pod" podUID="af8fc39f-8aa1-4e0d-8fe1-33aa79852f78" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.75:6443: connect: connection refused" Dec 01 08:20:54 crc kubenswrapper[5004]: I1201 08:20:54.996372 5004 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.75:6443: connect: connection refused" Dec 01 08:20:54 crc kubenswrapper[5004]: I1201 08:20:54.996887 5004 scope.go:117] "RemoveContainer" containerID="2caeb2ebaf68110cb8728e60aae6db1b1fc48efae6018f66070766a4f42caa69" Dec 01 08:20:55 crc kubenswrapper[5004]: I1201 08:20:55.004485 5004 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Dec 01 08:20:55 crc kubenswrapper[5004]: I1201 08:20:55.004547 5004 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 01 08:20:55 crc kubenswrapper[5004]: I1201 08:20:55.004626 5004 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 01 08:20:55 crc kubenswrapper[5004]: I1201 08:20:55.014848 5004 scope.go:117] "RemoveContainer" containerID="d69a370af9f8f6c575f99da6bb01cee0f660dccb7e85f16f5d8874d0e263e1fd" Dec 01 08:20:55 crc kubenswrapper[5004]: I1201 08:20:55.035040 5004 scope.go:117] "RemoveContainer" containerID="c82baee0fd931fd7b41cd5a73ca7e653ef13dfa3d573991cbe3238d6b95f6fac" Dec 01 08:20:55 crc kubenswrapper[5004]: I1201 08:20:55.062967 5004 scope.go:117] "RemoveContainer" containerID="45a0d7a12e7c114de1740a151e04f7f39844b709740cab958f156ac918af3228" Dec 01 08:20:55 crc kubenswrapper[5004]: I1201 08:20:55.088943 5004 scope.go:117] "RemoveContainer" containerID="2276a185fefa99f4e9fe63b7acce901ce2eb168f71a65f5fa97c1892aebce0e0" Dec 01 08:20:55 crc kubenswrapper[5004]: I1201 08:20:55.117269 5004 scope.go:117] "RemoveContainer" containerID="aa5f341e99100e985f65c24647d976a24782dfae7f52e752b774f41f69af27fd" Dec 01 08:20:55 crc kubenswrapper[5004]: E1201 08:20:55.119186 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa5f341e99100e985f65c24647d976a24782dfae7f52e752b774f41f69af27fd\": container with ID starting with aa5f341e99100e985f65c24647d976a24782dfae7f52e752b774f41f69af27fd not found: ID does not exist" containerID="aa5f341e99100e985f65c24647d976a24782dfae7f52e752b774f41f69af27fd" Dec 01 08:20:55 crc 
kubenswrapper[5004]: I1201 08:20:55.119234 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa5f341e99100e985f65c24647d976a24782dfae7f52e752b774f41f69af27fd"} err="failed to get container status \"aa5f341e99100e985f65c24647d976a24782dfae7f52e752b774f41f69af27fd\": rpc error: code = NotFound desc = could not find container \"aa5f341e99100e985f65c24647d976a24782dfae7f52e752b774f41f69af27fd\": container with ID starting with aa5f341e99100e985f65c24647d976a24782dfae7f52e752b774f41f69af27fd not found: ID does not exist" Dec 01 08:20:55 crc kubenswrapper[5004]: I1201 08:20:55.119266 5004 scope.go:117] "RemoveContainer" containerID="2caeb2ebaf68110cb8728e60aae6db1b1fc48efae6018f66070766a4f42caa69" Dec 01 08:20:55 crc kubenswrapper[5004]: E1201 08:20:55.119867 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2caeb2ebaf68110cb8728e60aae6db1b1fc48efae6018f66070766a4f42caa69\": container with ID starting with 2caeb2ebaf68110cb8728e60aae6db1b1fc48efae6018f66070766a4f42caa69 not found: ID does not exist" containerID="2caeb2ebaf68110cb8728e60aae6db1b1fc48efae6018f66070766a4f42caa69" Dec 01 08:20:55 crc kubenswrapper[5004]: I1201 08:20:55.119907 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2caeb2ebaf68110cb8728e60aae6db1b1fc48efae6018f66070766a4f42caa69"} err="failed to get container status \"2caeb2ebaf68110cb8728e60aae6db1b1fc48efae6018f66070766a4f42caa69\": rpc error: code = NotFound desc = could not find container \"2caeb2ebaf68110cb8728e60aae6db1b1fc48efae6018f66070766a4f42caa69\": container with ID starting with 2caeb2ebaf68110cb8728e60aae6db1b1fc48efae6018f66070766a4f42caa69 not found: ID does not exist" Dec 01 08:20:55 crc kubenswrapper[5004]: I1201 08:20:55.119935 5004 scope.go:117] "RemoveContainer" containerID="d69a370af9f8f6c575f99da6bb01cee0f660dccb7e85f16f5d8874d0e263e1fd" Dec 01 
08:20:55 crc kubenswrapper[5004]: E1201 08:20:55.120418 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d69a370af9f8f6c575f99da6bb01cee0f660dccb7e85f16f5d8874d0e263e1fd\": container with ID starting with d69a370af9f8f6c575f99da6bb01cee0f660dccb7e85f16f5d8874d0e263e1fd not found: ID does not exist" containerID="d69a370af9f8f6c575f99da6bb01cee0f660dccb7e85f16f5d8874d0e263e1fd" Dec 01 08:20:55 crc kubenswrapper[5004]: I1201 08:20:55.120470 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d69a370af9f8f6c575f99da6bb01cee0f660dccb7e85f16f5d8874d0e263e1fd"} err="failed to get container status \"d69a370af9f8f6c575f99da6bb01cee0f660dccb7e85f16f5d8874d0e263e1fd\": rpc error: code = NotFound desc = could not find container \"d69a370af9f8f6c575f99da6bb01cee0f660dccb7e85f16f5d8874d0e263e1fd\": container with ID starting with d69a370af9f8f6c575f99da6bb01cee0f660dccb7e85f16f5d8874d0e263e1fd not found: ID does not exist" Dec 01 08:20:55 crc kubenswrapper[5004]: I1201 08:20:55.120508 5004 scope.go:117] "RemoveContainer" containerID="c82baee0fd931fd7b41cd5a73ca7e653ef13dfa3d573991cbe3238d6b95f6fac" Dec 01 08:20:55 crc kubenswrapper[5004]: E1201 08:20:55.120809 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c82baee0fd931fd7b41cd5a73ca7e653ef13dfa3d573991cbe3238d6b95f6fac\": container with ID starting with c82baee0fd931fd7b41cd5a73ca7e653ef13dfa3d573991cbe3238d6b95f6fac not found: ID does not exist" containerID="c82baee0fd931fd7b41cd5a73ca7e653ef13dfa3d573991cbe3238d6b95f6fac" Dec 01 08:20:55 crc kubenswrapper[5004]: I1201 08:20:55.120875 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c82baee0fd931fd7b41cd5a73ca7e653ef13dfa3d573991cbe3238d6b95f6fac"} err="failed to get container status 
\"c82baee0fd931fd7b41cd5a73ca7e653ef13dfa3d573991cbe3238d6b95f6fac\": rpc error: code = NotFound desc = could not find container \"c82baee0fd931fd7b41cd5a73ca7e653ef13dfa3d573991cbe3238d6b95f6fac\": container with ID starting with c82baee0fd931fd7b41cd5a73ca7e653ef13dfa3d573991cbe3238d6b95f6fac not found: ID does not exist" Dec 01 08:20:55 crc kubenswrapper[5004]: I1201 08:20:55.120896 5004 scope.go:117] "RemoveContainer" containerID="45a0d7a12e7c114de1740a151e04f7f39844b709740cab958f156ac918af3228" Dec 01 08:20:55 crc kubenswrapper[5004]: E1201 08:20:55.121090 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45a0d7a12e7c114de1740a151e04f7f39844b709740cab958f156ac918af3228\": container with ID starting with 45a0d7a12e7c114de1740a151e04f7f39844b709740cab958f156ac918af3228 not found: ID does not exist" containerID="45a0d7a12e7c114de1740a151e04f7f39844b709740cab958f156ac918af3228" Dec 01 08:20:55 crc kubenswrapper[5004]: I1201 08:20:55.121129 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45a0d7a12e7c114de1740a151e04f7f39844b709740cab958f156ac918af3228"} err="failed to get container status \"45a0d7a12e7c114de1740a151e04f7f39844b709740cab958f156ac918af3228\": rpc error: code = NotFound desc = could not find container \"45a0d7a12e7c114de1740a151e04f7f39844b709740cab958f156ac918af3228\": container with ID starting with 45a0d7a12e7c114de1740a151e04f7f39844b709740cab958f156ac918af3228 not found: ID does not exist" Dec 01 08:20:55 crc kubenswrapper[5004]: I1201 08:20:55.121149 5004 scope.go:117] "RemoveContainer" containerID="2276a185fefa99f4e9fe63b7acce901ce2eb168f71a65f5fa97c1892aebce0e0" Dec 01 08:20:55 crc kubenswrapper[5004]: E1201 08:20:55.121408 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"2276a185fefa99f4e9fe63b7acce901ce2eb168f71a65f5fa97c1892aebce0e0\": container with ID starting with 2276a185fefa99f4e9fe63b7acce901ce2eb168f71a65f5fa97c1892aebce0e0 not found: ID does not exist" containerID="2276a185fefa99f4e9fe63b7acce901ce2eb168f71a65f5fa97c1892aebce0e0" Dec 01 08:20:55 crc kubenswrapper[5004]: I1201 08:20:55.121436 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2276a185fefa99f4e9fe63b7acce901ce2eb168f71a65f5fa97c1892aebce0e0"} err="failed to get container status \"2276a185fefa99f4e9fe63b7acce901ce2eb168f71a65f5fa97c1892aebce0e0\": rpc error: code = NotFound desc = could not find container \"2276a185fefa99f4e9fe63b7acce901ce2eb168f71a65f5fa97c1892aebce0e0\": container with ID starting with 2276a185fefa99f4e9fe63b7acce901ce2eb168f71a65f5fa97c1892aebce0e0 not found: ID does not exist" Dec 01 08:20:56 crc kubenswrapper[5004]: I1201 08:20:56.767937 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Dec 01 08:20:56 crc kubenswrapper[5004]: E1201 08:20:56.999866 5004 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.75:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 08:20:57 crc kubenswrapper[5004]: I1201 08:20:57.000356 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 08:20:57 crc kubenswrapper[5004]: E1201 08:20:57.048444 5004 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.75:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187d09aa890f62ae openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-01 08:20:57.047679662 +0000 UTC m=+234.612671684,LastTimestamp:2025-12-01 08:20:57.047679662 +0000 UTC m=+234.612671684,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 01 08:20:57 crc kubenswrapper[5004]: E1201 08:20:57.141880 5004 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.75:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187d09aa890f62ae openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-01 08:20:57.047679662 +0000 UTC m=+234.612671684,LastTimestamp:2025-12-01 08:20:57.047679662 +0000 UTC m=+234.612671684,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 01 08:20:57 crc kubenswrapper[5004]: E1201 08:20:57.227988 5004 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.75:6443: connect: connection refused" Dec 01 08:20:57 crc kubenswrapper[5004]: E1201 08:20:57.228415 5004 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.75:6443: connect: connection refused" Dec 01 08:20:57 crc kubenswrapper[5004]: E1201 08:20:57.228694 5004 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.75:6443: connect: connection refused" Dec 01 08:20:57 crc kubenswrapper[5004]: E1201 08:20:57.228916 5004 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.75:6443: connect: connection refused" Dec 01 08:20:57 crc kubenswrapper[5004]: E1201 
08:20:57.229199 5004 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.75:6443: connect: connection refused" Dec 01 08:20:57 crc kubenswrapper[5004]: I1201 08:20:57.229230 5004 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Dec 01 08:20:57 crc kubenswrapper[5004]: E1201 08:20:57.229509 5004 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.75:6443: connect: connection refused" interval="200ms" Dec 01 08:20:57 crc kubenswrapper[5004]: E1201 08:20:57.430738 5004 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.75:6443: connect: connection refused" interval="400ms" Dec 01 08:20:57 crc kubenswrapper[5004]: E1201 08:20:57.831553 5004 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.75:6443: connect: connection refused" interval="800ms" Dec 01 08:20:57 crc kubenswrapper[5004]: I1201 08:20:57.997448 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"1c5cdd1b3c526285033c11c62efe5731c0942b80c2e6d88eaeca2818df30f71e"} Dec 01 08:20:57 crc kubenswrapper[5004]: I1201 08:20:57.997504 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" 
event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"5d7338c4f88b16d6c2832b781f997cd20b822b6e00064b1be449d309006a6ff7"} Dec 01 08:20:57 crc kubenswrapper[5004]: E1201 08:20:57.998256 5004 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.75:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 08:20:57 crc kubenswrapper[5004]: I1201 08:20:57.998691 5004 status_manager.go:851] "Failed to get status for pod" podUID="af8fc39f-8aa1-4e0d-8fe1-33aa79852f78" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.75:6443: connect: connection refused" Dec 01 08:20:58 crc kubenswrapper[5004]: E1201 08:20:58.632882 5004 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.75:6443: connect: connection refused" interval="1.6s" Dec 01 08:21:00 crc kubenswrapper[5004]: E1201 08:21:00.234117 5004 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.75:6443: connect: connection refused" interval="3.2s" Dec 01 08:21:02 crc kubenswrapper[5004]: I1201 08:21:02.759064 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 08:21:02 crc kubenswrapper[5004]: I1201 08:21:02.762219 5004 status_manager.go:851] "Failed to get status for pod" podUID="af8fc39f-8aa1-4e0d-8fe1-33aa79852f78" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.75:6443: connect: connection refused" Dec 01 08:21:02 crc kubenswrapper[5004]: I1201 08:21:02.762961 5004 status_manager.go:851] "Failed to get status for pod" podUID="af8fc39f-8aa1-4e0d-8fe1-33aa79852f78" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.75:6443: connect: connection refused" Dec 01 08:21:02 crc kubenswrapper[5004]: I1201 08:21:02.786160 5004 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f0ed2b6d-0c61-4639-bc3b-1c8effc4815d" Dec 01 08:21:02 crc kubenswrapper[5004]: I1201 08:21:02.786201 5004 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f0ed2b6d-0c61-4639-bc3b-1c8effc4815d" Dec 01 08:21:02 crc kubenswrapper[5004]: E1201 08:21:02.786759 5004 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.75:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 08:21:02 crc kubenswrapper[5004]: I1201 08:21:02.790506 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 08:21:03 crc kubenswrapper[5004]: I1201 08:21:03.040579 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"f41460f7dd71d2bdbf19ed93f187f8dd1fe86db1f64b616d59b35cba14db6e96"} Dec 01 08:21:03 crc kubenswrapper[5004]: E1201 08:21:03.434868 5004 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.75:6443: connect: connection refused" interval="6.4s" Dec 01 08:21:04 crc kubenswrapper[5004]: I1201 08:21:04.048201 5004 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="353c1fe2ca877b01a623f86a8e9e97e0ab793033b7408fa52cf06f585ef2935d" exitCode=0 Dec 01 08:21:04 crc kubenswrapper[5004]: I1201 08:21:04.048285 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"353c1fe2ca877b01a623f86a8e9e97e0ab793033b7408fa52cf06f585ef2935d"} Dec 01 08:21:04 crc kubenswrapper[5004]: I1201 08:21:04.048702 5004 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f0ed2b6d-0c61-4639-bc3b-1c8effc4815d" Dec 01 08:21:04 crc kubenswrapper[5004]: I1201 08:21:04.048734 5004 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f0ed2b6d-0c61-4639-bc3b-1c8effc4815d" Dec 01 08:21:04 crc kubenswrapper[5004]: I1201 08:21:04.049012 5004 status_manager.go:851] "Failed to get status for pod" podUID="af8fc39f-8aa1-4e0d-8fe1-33aa79852f78" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.75:6443: connect: connection refused" Dec 01 08:21:04 crc kubenswrapper[5004]: E1201 08:21:04.049318 5004 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.75:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 08:21:04 crc kubenswrapper[5004]: I1201 08:21:04.192962 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-p77t7" podUID="15dc5e3f-02c6-474d-bd7b-d51ce42340b3" containerName="oauth-openshift" containerID="cri-o://74772694e1c95b51d67b201d5ad882a5668763ec908b994b999c98e143fd7dd6" gracePeriod=15 Dec 01 08:21:04 crc kubenswrapper[5004]: I1201 08:21:04.618819 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-p77t7" Dec 01 08:21:04 crc kubenswrapper[5004]: I1201 08:21:04.642783 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/15dc5e3f-02c6-474d-bd7b-d51ce42340b3-v4-0-config-user-template-error\") pod \"15dc5e3f-02c6-474d-bd7b-d51ce42340b3\" (UID: \"15dc5e3f-02c6-474d-bd7b-d51ce42340b3\") " Dec 01 08:21:04 crc kubenswrapper[5004]: I1201 08:21:04.642829 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/15dc5e3f-02c6-474d-bd7b-d51ce42340b3-audit-dir\") pod \"15dc5e3f-02c6-474d-bd7b-d51ce42340b3\" (UID: \"15dc5e3f-02c6-474d-bd7b-d51ce42340b3\") " Dec 01 08:21:04 crc kubenswrapper[5004]: I1201 08:21:04.642860 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/15dc5e3f-02c6-474d-bd7b-d51ce42340b3-v4-0-config-user-idp-0-file-data\") pod \"15dc5e3f-02c6-474d-bd7b-d51ce42340b3\" (UID: \"15dc5e3f-02c6-474d-bd7b-d51ce42340b3\") " Dec 01 08:21:04 crc kubenswrapper[5004]: I1201 08:21:04.642886 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/15dc5e3f-02c6-474d-bd7b-d51ce42340b3-v4-0-config-system-serving-cert\") pod \"15dc5e3f-02c6-474d-bd7b-d51ce42340b3\" (UID: \"15dc5e3f-02c6-474d-bd7b-d51ce42340b3\") " Dec 01 08:21:04 crc kubenswrapper[5004]: I1201 08:21:04.642921 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/15dc5e3f-02c6-474d-bd7b-d51ce42340b3-v4-0-config-system-trusted-ca-bundle\") pod \"15dc5e3f-02c6-474d-bd7b-d51ce42340b3\" (UID: \"15dc5e3f-02c6-474d-bd7b-d51ce42340b3\") " Dec 01 
08:21:04 crc kubenswrapper[5004]: I1201 08:21:04.642943 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/15dc5e3f-02c6-474d-bd7b-d51ce42340b3-audit-policies\") pod \"15dc5e3f-02c6-474d-bd7b-d51ce42340b3\" (UID: \"15dc5e3f-02c6-474d-bd7b-d51ce42340b3\") " Dec 01 08:21:04 crc kubenswrapper[5004]: I1201 08:21:04.642967 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/15dc5e3f-02c6-474d-bd7b-d51ce42340b3-v4-0-config-system-service-ca\") pod \"15dc5e3f-02c6-474d-bd7b-d51ce42340b3\" (UID: \"15dc5e3f-02c6-474d-bd7b-d51ce42340b3\") " Dec 01 08:21:04 crc kubenswrapper[5004]: I1201 08:21:04.642997 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/15dc5e3f-02c6-474d-bd7b-d51ce42340b3-v4-0-config-system-cliconfig\") pod \"15dc5e3f-02c6-474d-bd7b-d51ce42340b3\" (UID: \"15dc5e3f-02c6-474d-bd7b-d51ce42340b3\") " Dec 01 08:21:04 crc kubenswrapper[5004]: I1201 08:21:04.642927 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/15dc5e3f-02c6-474d-bd7b-d51ce42340b3-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "15dc5e3f-02c6-474d-bd7b-d51ce42340b3" (UID: "15dc5e3f-02c6-474d-bd7b-d51ce42340b3"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 08:21:04 crc kubenswrapper[5004]: I1201 08:21:04.643023 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/15dc5e3f-02c6-474d-bd7b-d51ce42340b3-v4-0-config-system-session\") pod \"15dc5e3f-02c6-474d-bd7b-d51ce42340b3\" (UID: \"15dc5e3f-02c6-474d-bd7b-d51ce42340b3\") " Dec 01 08:21:04 crc kubenswrapper[5004]: I1201 08:21:04.643051 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/15dc5e3f-02c6-474d-bd7b-d51ce42340b3-v4-0-config-system-router-certs\") pod \"15dc5e3f-02c6-474d-bd7b-d51ce42340b3\" (UID: \"15dc5e3f-02c6-474d-bd7b-d51ce42340b3\") " Dec 01 08:21:04 crc kubenswrapper[5004]: I1201 08:21:04.643077 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/15dc5e3f-02c6-474d-bd7b-d51ce42340b3-v4-0-config-user-template-login\") pod \"15dc5e3f-02c6-474d-bd7b-d51ce42340b3\" (UID: \"15dc5e3f-02c6-474d-bd7b-d51ce42340b3\") " Dec 01 08:21:04 crc kubenswrapper[5004]: I1201 08:21:04.643104 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hc6xg\" (UniqueName: \"kubernetes.io/projected/15dc5e3f-02c6-474d-bd7b-d51ce42340b3-kube-api-access-hc6xg\") pod \"15dc5e3f-02c6-474d-bd7b-d51ce42340b3\" (UID: \"15dc5e3f-02c6-474d-bd7b-d51ce42340b3\") " Dec 01 08:21:04 crc kubenswrapper[5004]: I1201 08:21:04.643129 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/15dc5e3f-02c6-474d-bd7b-d51ce42340b3-v4-0-config-system-ocp-branding-template\") pod \"15dc5e3f-02c6-474d-bd7b-d51ce42340b3\" (UID: \"15dc5e3f-02c6-474d-bd7b-d51ce42340b3\") " Dec 01 08:21:04 crc 
kubenswrapper[5004]: I1201 08:21:04.643161 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/15dc5e3f-02c6-474d-bd7b-d51ce42340b3-v4-0-config-user-template-provider-selection\") pod \"15dc5e3f-02c6-474d-bd7b-d51ce42340b3\" (UID: \"15dc5e3f-02c6-474d-bd7b-d51ce42340b3\") " Dec 01 08:21:04 crc kubenswrapper[5004]: I1201 08:21:04.643320 5004 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/15dc5e3f-02c6-474d-bd7b-d51ce42340b3-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 01 08:21:04 crc kubenswrapper[5004]: I1201 08:21:04.643854 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/15dc5e3f-02c6-474d-bd7b-d51ce42340b3-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "15dc5e3f-02c6-474d-bd7b-d51ce42340b3" (UID: "15dc5e3f-02c6-474d-bd7b-d51ce42340b3"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:21:04 crc kubenswrapper[5004]: I1201 08:21:04.643871 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/15dc5e3f-02c6-474d-bd7b-d51ce42340b3-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "15dc5e3f-02c6-474d-bd7b-d51ce42340b3" (UID: "15dc5e3f-02c6-474d-bd7b-d51ce42340b3"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:21:04 crc kubenswrapper[5004]: I1201 08:21:04.643895 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/15dc5e3f-02c6-474d-bd7b-d51ce42340b3-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "15dc5e3f-02c6-474d-bd7b-d51ce42340b3" (UID: "15dc5e3f-02c6-474d-bd7b-d51ce42340b3"). 
InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:21:04 crc kubenswrapper[5004]: I1201 08:21:04.644737 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/15dc5e3f-02c6-474d-bd7b-d51ce42340b3-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "15dc5e3f-02c6-474d-bd7b-d51ce42340b3" (UID: "15dc5e3f-02c6-474d-bd7b-d51ce42340b3"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:21:04 crc kubenswrapper[5004]: I1201 08:21:04.649076 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15dc5e3f-02c6-474d-bd7b-d51ce42340b3-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "15dc5e3f-02c6-474d-bd7b-d51ce42340b3" (UID: "15dc5e3f-02c6-474d-bd7b-d51ce42340b3"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:21:04 crc kubenswrapper[5004]: I1201 08:21:04.649473 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15dc5e3f-02c6-474d-bd7b-d51ce42340b3-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "15dc5e3f-02c6-474d-bd7b-d51ce42340b3" (UID: "15dc5e3f-02c6-474d-bd7b-d51ce42340b3"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:21:04 crc kubenswrapper[5004]: I1201 08:21:04.649696 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15dc5e3f-02c6-474d-bd7b-d51ce42340b3-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "15dc5e3f-02c6-474d-bd7b-d51ce42340b3" (UID: "15dc5e3f-02c6-474d-bd7b-d51ce42340b3"). 
InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:21:04 crc kubenswrapper[5004]: I1201 08:21:04.650082 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15dc5e3f-02c6-474d-bd7b-d51ce42340b3-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "15dc5e3f-02c6-474d-bd7b-d51ce42340b3" (UID: "15dc5e3f-02c6-474d-bd7b-d51ce42340b3"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:21:04 crc kubenswrapper[5004]: I1201 08:21:04.650101 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15dc5e3f-02c6-474d-bd7b-d51ce42340b3-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "15dc5e3f-02c6-474d-bd7b-d51ce42340b3" (UID: "15dc5e3f-02c6-474d-bd7b-d51ce42340b3"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:21:04 crc kubenswrapper[5004]: I1201 08:21:04.651448 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15dc5e3f-02c6-474d-bd7b-d51ce42340b3-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "15dc5e3f-02c6-474d-bd7b-d51ce42340b3" (UID: "15dc5e3f-02c6-474d-bd7b-d51ce42340b3"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:21:04 crc kubenswrapper[5004]: I1201 08:21:04.652005 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15dc5e3f-02c6-474d-bd7b-d51ce42340b3-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "15dc5e3f-02c6-474d-bd7b-d51ce42340b3" (UID: "15dc5e3f-02c6-474d-bd7b-d51ce42340b3"). 
InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:21:04 crc kubenswrapper[5004]: I1201 08:21:04.652157 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15dc5e3f-02c6-474d-bd7b-d51ce42340b3-kube-api-access-hc6xg" (OuterVolumeSpecName: "kube-api-access-hc6xg") pod "15dc5e3f-02c6-474d-bd7b-d51ce42340b3" (UID: "15dc5e3f-02c6-474d-bd7b-d51ce42340b3"). InnerVolumeSpecName "kube-api-access-hc6xg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:21:04 crc kubenswrapper[5004]: I1201 08:21:04.653875 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15dc5e3f-02c6-474d-bd7b-d51ce42340b3-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "15dc5e3f-02c6-474d-bd7b-d51ce42340b3" (UID: "15dc5e3f-02c6-474d-bd7b-d51ce42340b3"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:21:04 crc kubenswrapper[5004]: I1201 08:21:04.744189 5004 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/15dc5e3f-02c6-474d-bd7b-d51ce42340b3-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 01 08:21:04 crc kubenswrapper[5004]: I1201 08:21:04.744243 5004 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/15dc5e3f-02c6-474d-bd7b-d51ce42340b3-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 01 08:21:04 crc kubenswrapper[5004]: I1201 08:21:04.744267 5004 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/15dc5e3f-02c6-474d-bd7b-d51ce42340b3-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 01 08:21:04 crc kubenswrapper[5004]: I1201 08:21:04.744287 5004 reconciler_common.go:293] 
"Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/15dc5e3f-02c6-474d-bd7b-d51ce42340b3-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 01 08:21:04 crc kubenswrapper[5004]: I1201 08:21:04.744308 5004 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/15dc5e3f-02c6-474d-bd7b-d51ce42340b3-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 01 08:21:04 crc kubenswrapper[5004]: I1201 08:21:04.744326 5004 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/15dc5e3f-02c6-474d-bd7b-d51ce42340b3-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 01 08:21:04 crc kubenswrapper[5004]: I1201 08:21:04.744343 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hc6xg\" (UniqueName: \"kubernetes.io/projected/15dc5e3f-02c6-474d-bd7b-d51ce42340b3-kube-api-access-hc6xg\") on node \"crc\" DevicePath \"\"" Dec 01 08:21:04 crc kubenswrapper[5004]: I1201 08:21:04.744361 5004 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/15dc5e3f-02c6-474d-bd7b-d51ce42340b3-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 01 08:21:04 crc kubenswrapper[5004]: I1201 08:21:04.744379 5004 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/15dc5e3f-02c6-474d-bd7b-d51ce42340b3-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 01 08:21:04 crc kubenswrapper[5004]: I1201 08:21:04.744398 5004 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/15dc5e3f-02c6-474d-bd7b-d51ce42340b3-v4-0-config-user-template-error\") on node \"crc\" 
DevicePath \"\"" Dec 01 08:21:04 crc kubenswrapper[5004]: I1201 08:21:04.744415 5004 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/15dc5e3f-02c6-474d-bd7b-d51ce42340b3-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 01 08:21:04 crc kubenswrapper[5004]: I1201 08:21:04.744432 5004 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/15dc5e3f-02c6-474d-bd7b-d51ce42340b3-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 08:21:04 crc kubenswrapper[5004]: I1201 08:21:04.744450 5004 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/15dc5e3f-02c6-474d-bd7b-d51ce42340b3-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 08:21:05 crc kubenswrapper[5004]: I1201 08:21:05.059455 5004 generic.go:334] "Generic (PLEG): container finished" podID="15dc5e3f-02c6-474d-bd7b-d51ce42340b3" containerID="74772694e1c95b51d67b201d5ad882a5668763ec908b994b999c98e143fd7dd6" exitCode=0 Dec 01 08:21:05 crc kubenswrapper[5004]: I1201 08:21:05.059545 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-p77t7" event={"ID":"15dc5e3f-02c6-474d-bd7b-d51ce42340b3","Type":"ContainerDied","Data":"74772694e1c95b51d67b201d5ad882a5668763ec908b994b999c98e143fd7dd6"} Dec 01 08:21:05 crc kubenswrapper[5004]: I1201 08:21:05.059553 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-p77t7" Dec 01 08:21:05 crc kubenswrapper[5004]: I1201 08:21:05.059593 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-p77t7" event={"ID":"15dc5e3f-02c6-474d-bd7b-d51ce42340b3","Type":"ContainerDied","Data":"cd868af285e19f0a76de95983008553906cc29bbd468ada721dac00ec9b687a0"} Dec 01 08:21:05 crc kubenswrapper[5004]: I1201 08:21:05.059617 5004 scope.go:117] "RemoveContainer" containerID="74772694e1c95b51d67b201d5ad882a5668763ec908b994b999c98e143fd7dd6" Dec 01 08:21:05 crc kubenswrapper[5004]: I1201 08:21:05.064498 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"284a57a2244c2b8385efe4d89f415d8907ba973a1532545b5b0f8986c3b30f23"} Dec 01 08:21:05 crc kubenswrapper[5004]: I1201 08:21:05.064538 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"daeb267b7d0029bbacbaba237179f9492216a44def4c1623c030c84577da2e12"} Dec 01 08:21:05 crc kubenswrapper[5004]: I1201 08:21:05.064548 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"8c87e336731800587f9f4ae5579a892db334a9e7de2269504e1e01296a47fb78"} Dec 01 08:21:05 crc kubenswrapper[5004]: I1201 08:21:05.098736 5004 scope.go:117] "RemoveContainer" containerID="74772694e1c95b51d67b201d5ad882a5668763ec908b994b999c98e143fd7dd6" Dec 01 08:21:05 crc kubenswrapper[5004]: E1201 08:21:05.099208 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74772694e1c95b51d67b201d5ad882a5668763ec908b994b999c98e143fd7dd6\": container with 
ID starting with 74772694e1c95b51d67b201d5ad882a5668763ec908b994b999c98e143fd7dd6 not found: ID does not exist" containerID="74772694e1c95b51d67b201d5ad882a5668763ec908b994b999c98e143fd7dd6" Dec 01 08:21:05 crc kubenswrapper[5004]: I1201 08:21:05.099262 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74772694e1c95b51d67b201d5ad882a5668763ec908b994b999c98e143fd7dd6"} err="failed to get container status \"74772694e1c95b51d67b201d5ad882a5668763ec908b994b999c98e143fd7dd6\": rpc error: code = NotFound desc = could not find container \"74772694e1c95b51d67b201d5ad882a5668763ec908b994b999c98e143fd7dd6\": container with ID starting with 74772694e1c95b51d67b201d5ad882a5668763ec908b994b999c98e143fd7dd6 not found: ID does not exist" Dec 01 08:21:06 crc kubenswrapper[5004]: I1201 08:21:06.072752 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 01 08:21:06 crc kubenswrapper[5004]: I1201 08:21:06.073013 5004 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="9b65b9bc444101900c62fd90c7d45789d186a89148ac0fd9c7f0c6048c3f55ad" exitCode=1 Dec 01 08:21:06 crc kubenswrapper[5004]: I1201 08:21:06.073080 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"9b65b9bc444101900c62fd90c7d45789d186a89148ac0fd9c7f0c6048c3f55ad"} Dec 01 08:21:06 crc kubenswrapper[5004]: I1201 08:21:06.073644 5004 scope.go:117] "RemoveContainer" containerID="9b65b9bc444101900c62fd90c7d45789d186a89148ac0fd9c7f0c6048c3f55ad" Dec 01 08:21:06 crc kubenswrapper[5004]: I1201 08:21:06.076006 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"59a7647331dae6f8279ccc39f070d9e827a8dba23d3edef158882ccce1a49d0a"} Dec 01 08:21:06 crc kubenswrapper[5004]: I1201 08:21:06.076039 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"f6a57305f651e99402f6a2cfab9e5ca13ec15d043d82725b8574cf8ddfa37e8a"} Dec 01 08:21:06 crc kubenswrapper[5004]: I1201 08:21:06.076229 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 08:21:06 crc kubenswrapper[5004]: I1201 08:21:06.076270 5004 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f0ed2b6d-0c61-4639-bc3b-1c8effc4815d" Dec 01 08:21:06 crc kubenswrapper[5004]: I1201 08:21:06.076286 5004 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f0ed2b6d-0c61-4639-bc3b-1c8effc4815d" Dec 01 08:21:07 crc kubenswrapper[5004]: I1201 08:21:07.086463 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 01 08:21:07 crc kubenswrapper[5004]: I1201 08:21:07.086877 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"164bff45e935875c9439920f6d54d1444d870e61f2b983d5a274e3b17df6ec77"} Dec 01 08:21:07 crc kubenswrapper[5004]: I1201 08:21:07.790310 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 08:21:07 crc kubenswrapper[5004]: I1201 08:21:07.790386 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 08:21:07 crc kubenswrapper[5004]: I1201 08:21:07.798924 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 08:21:10 crc kubenswrapper[5004]: I1201 08:21:10.964275 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 08:21:10 crc kubenswrapper[5004]: I1201 08:21:10.991397 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 08:21:11 crc kubenswrapper[5004]: I1201 08:21:11.087967 5004 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 08:21:11 crc kubenswrapper[5004]: I1201 08:21:11.116133 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 08:21:11 crc kubenswrapper[5004]: I1201 08:21:11.116479 5004 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f0ed2b6d-0c61-4639-bc3b-1c8effc4815d" Dec 01 08:21:11 crc kubenswrapper[5004]: I1201 08:21:11.116502 5004 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f0ed2b6d-0c61-4639-bc3b-1c8effc4815d" Dec 01 08:21:11 crc kubenswrapper[5004]: I1201 08:21:11.129087 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 08:21:12 crc kubenswrapper[5004]: I1201 08:21:12.120962 5004 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f0ed2b6d-0c61-4639-bc3b-1c8effc4815d" Dec 01 08:21:12 crc kubenswrapper[5004]: I1201 08:21:12.120993 5004 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
podUID="f0ed2b6d-0c61-4639-bc3b-1c8effc4815d" Dec 01 08:21:12 crc kubenswrapper[5004]: I1201 08:21:12.772362 5004 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="28c74d53-1f98-454b-834a-dba0c3138ecb" Dec 01 08:21:19 crc kubenswrapper[5004]: I1201 08:21:19.377445 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 01 08:21:20 crc kubenswrapper[5004]: I1201 08:21:20.739444 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 01 08:21:21 crc kubenswrapper[5004]: I1201 08:21:21.361447 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 08:21:22 crc kubenswrapper[5004]: I1201 08:21:22.325836 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 01 08:21:22 crc kubenswrapper[5004]: I1201 08:21:22.607912 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Dec 01 08:21:22 crc kubenswrapper[5004]: I1201 08:21:22.908190 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 01 08:21:23 crc kubenswrapper[5004]: I1201 08:21:23.071822 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Dec 01 08:21:23 crc kubenswrapper[5004]: I1201 08:21:23.250981 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 01 08:21:23 crc kubenswrapper[5004]: I1201 08:21:23.352644 5004 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-service-ca"/"signing-cabundle" Dec 01 08:21:23 crc kubenswrapper[5004]: I1201 08:21:23.437836 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 01 08:21:23 crc kubenswrapper[5004]: I1201 08:21:23.481325 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 01 08:21:23 crc kubenswrapper[5004]: I1201 08:21:23.496353 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 01 08:21:23 crc kubenswrapper[5004]: I1201 08:21:23.548927 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 01 08:21:23 crc kubenswrapper[5004]: I1201 08:21:23.791005 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Dec 01 08:21:23 crc kubenswrapper[5004]: I1201 08:21:23.820163 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 01 08:21:23 crc kubenswrapper[5004]: I1201 08:21:23.826410 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 01 08:21:23 crc kubenswrapper[5004]: I1201 08:21:23.912507 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 01 08:21:24 crc kubenswrapper[5004]: I1201 08:21:24.088469 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 01 08:21:24 crc kubenswrapper[5004]: I1201 08:21:24.126683 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 01 08:21:24 crc kubenswrapper[5004]: I1201 08:21:24.363385 5004 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Dec 01 08:21:24 crc kubenswrapper[5004]: I1201 08:21:24.479892 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 01 08:21:24 crc kubenswrapper[5004]: I1201 08:21:24.485017 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 01 08:21:24 crc kubenswrapper[5004]: I1201 08:21:24.709825 5004 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 01 08:21:25 crc kubenswrapper[5004]: I1201 08:21:25.036702 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 01 08:21:25 crc kubenswrapper[5004]: I1201 08:21:25.069964 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Dec 01 08:21:25 crc kubenswrapper[5004]: I1201 08:21:25.171150 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 01 08:21:25 crc kubenswrapper[5004]: I1201 08:21:25.176758 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Dec 01 08:21:25 crc kubenswrapper[5004]: I1201 08:21:25.371348 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 01 08:21:25 crc kubenswrapper[5004]: I1201 08:21:25.412140 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 01 08:21:25 crc kubenswrapper[5004]: I1201 08:21:25.473474 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Dec 01 08:21:25 crc kubenswrapper[5004]: I1201 08:21:25.576967 5004 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 01 08:21:25 crc kubenswrapper[5004]: I1201 08:21:25.620674 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 01 08:21:25 crc kubenswrapper[5004]: I1201 08:21:25.755172 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 01 08:21:25 crc kubenswrapper[5004]: I1201 08:21:25.806214 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 01 08:21:25 crc kubenswrapper[5004]: I1201 08:21:25.814013 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Dec 01 08:21:25 crc kubenswrapper[5004]: I1201 08:21:25.818831 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 01 08:21:26 crc kubenswrapper[5004]: I1201 08:21:26.022402 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 01 08:21:26 crc kubenswrapper[5004]: I1201 08:21:26.023635 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 01 08:21:26 crc kubenswrapper[5004]: I1201 08:21:26.249156 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 01 08:21:26 crc kubenswrapper[5004]: I1201 08:21:26.407598 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 01 08:21:26 crc kubenswrapper[5004]: I1201 08:21:26.423824 5004 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 01 08:21:26 crc kubenswrapper[5004]: I1201 
08:21:26.450841 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 01 08:21:26 crc kubenswrapper[5004]: I1201 08:21:26.475022 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Dec 01 08:21:26 crc kubenswrapper[5004]: I1201 08:21:26.585556 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 01 08:21:26 crc kubenswrapper[5004]: I1201 08:21:26.607321 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Dec 01 08:21:26 crc kubenswrapper[5004]: I1201 08:21:26.689931 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 01 08:21:26 crc kubenswrapper[5004]: I1201 08:21:26.698488 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 01 08:21:26 crc kubenswrapper[5004]: I1201 08:21:26.706509 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 01 08:21:26 crc kubenswrapper[5004]: I1201 08:21:26.797481 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 01 08:21:26 crc kubenswrapper[5004]: I1201 08:21:26.887163 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Dec 01 08:21:26 crc kubenswrapper[5004]: I1201 08:21:26.916395 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 01 08:21:26 crc kubenswrapper[5004]: I1201 08:21:26.927316 5004 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 01 08:21:26 crc kubenswrapper[5004]: I1201 08:21:26.948817 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 01 08:21:26 crc kubenswrapper[5004]: I1201 08:21:26.990098 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 01 08:21:26 crc kubenswrapper[5004]: I1201 08:21:26.999809 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 01 08:21:27 crc kubenswrapper[5004]: I1201 08:21:27.027461 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 01 08:21:27 crc kubenswrapper[5004]: I1201 08:21:27.109374 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 01 08:21:27 crc kubenswrapper[5004]: I1201 08:21:27.147722 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Dec 01 08:21:27 crc kubenswrapper[5004]: I1201 08:21:27.334712 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Dec 01 08:21:27 crc kubenswrapper[5004]: I1201 08:21:27.362417 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Dec 01 08:21:27 crc kubenswrapper[5004]: I1201 08:21:27.418760 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 01 08:21:27 crc kubenswrapper[5004]: I1201 08:21:27.429296 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 01 08:21:27 crc 
kubenswrapper[5004]: I1201 08:21:27.558214 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 01 08:21:27 crc kubenswrapper[5004]: I1201 08:21:27.600428 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 01 08:21:27 crc kubenswrapper[5004]: I1201 08:21:27.630343 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 01 08:21:27 crc kubenswrapper[5004]: I1201 08:21:27.648953 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 01 08:21:27 crc kubenswrapper[5004]: I1201 08:21:27.719149 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 01 08:21:27 crc kubenswrapper[5004]: I1201 08:21:27.789315 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Dec 01 08:21:27 crc kubenswrapper[5004]: I1201 08:21:27.798379 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Dec 01 08:21:27 crc kubenswrapper[5004]: I1201 08:21:27.950425 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 01 08:21:27 crc kubenswrapper[5004]: I1201 08:21:27.975211 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 01 08:21:28 crc kubenswrapper[5004]: I1201 08:21:28.014640 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 01 08:21:28 crc kubenswrapper[5004]: I1201 08:21:28.037048 5004 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Dec 01 08:21:28 crc kubenswrapper[5004]: I1201 08:21:28.103779 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 01 08:21:28 crc kubenswrapper[5004]: I1201 08:21:28.180006 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 01 08:21:28 crc kubenswrapper[5004]: I1201 08:21:28.261330 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Dec 01 08:21:28 crc kubenswrapper[5004]: I1201 08:21:28.359100 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 01 08:21:28 crc kubenswrapper[5004]: I1201 08:21:28.359641 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 01 08:21:28 crc kubenswrapper[5004]: I1201 08:21:28.456215 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 01 08:21:28 crc kubenswrapper[5004]: I1201 08:21:28.474615 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 01 08:21:28 crc kubenswrapper[5004]: I1201 08:21:28.517445 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 01 08:21:28 crc kubenswrapper[5004]: I1201 08:21:28.534686 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 01 08:21:28 crc kubenswrapper[5004]: I1201 08:21:28.756353 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 01 
08:21:28 crc kubenswrapper[5004]: I1201 08:21:28.767473 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 01 08:21:28 crc kubenswrapper[5004]: I1201 08:21:28.916838 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 01 08:21:29 crc kubenswrapper[5004]: I1201 08:21:29.055529 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Dec 01 08:21:29 crc kubenswrapper[5004]: I1201 08:21:29.112020 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Dec 01 08:21:29 crc kubenswrapper[5004]: I1201 08:21:29.145504 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Dec 01 08:21:29 crc kubenswrapper[5004]: I1201 08:21:29.213710 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 01 08:21:29 crc kubenswrapper[5004]: I1201 08:21:29.317114 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 01 08:21:29 crc kubenswrapper[5004]: I1201 08:21:29.430981 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 01 08:21:29 crc kubenswrapper[5004]: I1201 08:21:29.509592 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 01 08:21:29 crc kubenswrapper[5004]: I1201 08:21:29.564208 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 01 08:21:29 crc kubenswrapper[5004]: I1201 08:21:29.569707 5004 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 01 08:21:29 crc kubenswrapper[5004]: I1201 08:21:29.673954 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 01 08:21:29 crc kubenswrapper[5004]: I1201 08:21:29.682065 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 01 08:21:29 crc kubenswrapper[5004]: I1201 08:21:29.685250 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 01 08:21:29 crc kubenswrapper[5004]: I1201 08:21:29.688980 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 01 08:21:29 crc kubenswrapper[5004]: I1201 08:21:29.778710 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Dec 01 08:21:29 crc kubenswrapper[5004]: I1201 08:21:29.834874 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Dec 01 08:21:29 crc kubenswrapper[5004]: I1201 08:21:29.864756 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 01 08:21:29 crc kubenswrapper[5004]: I1201 08:21:29.957486 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 01 08:21:29 crc kubenswrapper[5004]: I1201 08:21:29.962814 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 01 08:21:30 crc kubenswrapper[5004]: I1201 08:21:30.032036 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Dec 01 08:21:30 crc kubenswrapper[5004]: I1201 08:21:30.038467 5004 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-oauth-apiserver"/"audit-1" Dec 01 08:21:30 crc kubenswrapper[5004]: I1201 08:21:30.040489 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Dec 01 08:21:30 crc kubenswrapper[5004]: I1201 08:21:30.090430 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 01 08:21:30 crc kubenswrapper[5004]: I1201 08:21:30.139772 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 01 08:21:30 crc kubenswrapper[5004]: I1201 08:21:30.285835 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Dec 01 08:21:30 crc kubenswrapper[5004]: I1201 08:21:30.335160 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 01 08:21:30 crc kubenswrapper[5004]: I1201 08:21:30.336990 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 01 08:21:30 crc kubenswrapper[5004]: I1201 08:21:30.397977 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 01 08:21:30 crc kubenswrapper[5004]: I1201 08:21:30.487482 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 01 08:21:30 crc kubenswrapper[5004]: I1201 08:21:30.516444 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 01 08:21:30 crc kubenswrapper[5004]: I1201 08:21:30.564846 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 01 08:21:30 crc kubenswrapper[5004]: I1201 
08:21:30.574935 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 01 08:21:30 crc kubenswrapper[5004]: I1201 08:21:30.622574 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Dec 01 08:21:30 crc kubenswrapper[5004]: I1201 08:21:30.725228 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 01 08:21:30 crc kubenswrapper[5004]: I1201 08:21:30.779757 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 01 08:21:30 crc kubenswrapper[5004]: I1201 08:21:30.804689 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 01 08:21:31 crc kubenswrapper[5004]: I1201 08:21:31.116115 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 01 08:21:31 crc kubenswrapper[5004]: I1201 08:21:31.211185 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 01 08:21:31 crc kubenswrapper[5004]: I1201 08:21:31.246169 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Dec 01 08:21:31 crc kubenswrapper[5004]: I1201 08:21:31.248083 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Dec 01 08:21:31 crc kubenswrapper[5004]: I1201 08:21:31.351031 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 01 08:21:31 crc kubenswrapper[5004]: I1201 08:21:31.462992 5004 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-apiserver"/"serving-cert" Dec 01 08:21:31 crc kubenswrapper[5004]: I1201 08:21:31.541673 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 01 08:21:31 crc kubenswrapper[5004]: I1201 08:21:31.643050 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Dec 01 08:21:31 crc kubenswrapper[5004]: I1201 08:21:31.687958 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Dec 01 08:21:31 crc kubenswrapper[5004]: I1201 08:21:31.787089 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 01 08:21:31 crc kubenswrapper[5004]: I1201 08:21:31.809475 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 01 08:21:31 crc kubenswrapper[5004]: I1201 08:21:31.868932 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 01 08:21:31 crc kubenswrapper[5004]: I1201 08:21:31.883158 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 01 08:21:31 crc kubenswrapper[5004]: I1201 08:21:31.895315 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 01 08:21:31 crc kubenswrapper[5004]: I1201 08:21:31.963551 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 01 08:21:32 crc kubenswrapper[5004]: I1201 08:21:32.027104 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 01 08:21:32 crc kubenswrapper[5004]: I1201 08:21:32.040869 5004 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 01 08:21:32 crc kubenswrapper[5004]: I1201 08:21:32.224615 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 01 08:21:32 crc kubenswrapper[5004]: I1201 08:21:32.242284 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 01 08:21:32 crc kubenswrapper[5004]: I1201 08:21:32.274384 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 01 08:21:32 crc kubenswrapper[5004]: I1201 08:21:32.560985 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 01 08:21:32 crc kubenswrapper[5004]: I1201 08:21:32.582121 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 01 08:21:32 crc kubenswrapper[5004]: I1201 08:21:32.588249 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 01 08:21:32 crc kubenswrapper[5004]: I1201 08:21:32.655690 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Dec 01 08:21:32 crc kubenswrapper[5004]: I1201 08:21:32.704646 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 01 08:21:32 crc kubenswrapper[5004]: I1201 08:21:32.829668 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 01 08:21:32 crc kubenswrapper[5004]: I1201 08:21:32.853445 5004 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 01 08:21:32 crc kubenswrapper[5004]: I1201 08:21:32.910002 5004 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 01 08:21:32 crc kubenswrapper[5004]: I1201 08:21:32.913633 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 01 08:21:32 crc kubenswrapper[5004]: I1201 08:21:32.937269 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 01 08:21:33 crc kubenswrapper[5004]: I1201 08:21:33.070915 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 01 08:21:33 crc kubenswrapper[5004]: I1201 08:21:33.146414 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 01 08:21:33 crc kubenswrapper[5004]: I1201 08:21:33.178123 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Dec 01 08:21:33 crc kubenswrapper[5004]: I1201 08:21:33.192664 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 01 08:21:33 crc kubenswrapper[5004]: I1201 08:21:33.295132 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 01 08:21:33 crc kubenswrapper[5004]: I1201 08:21:33.393621 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Dec 01 08:21:33 crc kubenswrapper[5004]: I1201 08:21:33.411128 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 01 08:21:33 crc kubenswrapper[5004]: I1201 08:21:33.530532 5004 reflector.go:368] Caches populated for *v1.Pod 
from pkg/kubelet/config/apiserver.go:66 Dec 01 08:21:33 crc kubenswrapper[5004]: I1201 08:21:33.540598 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-p77t7","openshift-kube-apiserver/kube-apiserver-crc"] Dec 01 08:21:33 crc kubenswrapper[5004]: I1201 08:21:33.540728 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 01 08:21:33 crc kubenswrapper[5004]: I1201 08:21:33.547811 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 08:21:33 crc kubenswrapper[5004]: I1201 08:21:33.563045 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 01 08:21:33 crc kubenswrapper[5004]: I1201 08:21:33.577551 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=22.577516613 podStartE2EDuration="22.577516613s" podCreationTimestamp="2025-12-01 08:21:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:21:33.569721593 +0000 UTC m=+271.134713645" watchObservedRunningTime="2025-12-01 08:21:33.577516613 +0000 UTC m=+271.142508625" Dec 01 08:21:33 crc kubenswrapper[5004]: I1201 08:21:33.580384 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 01 08:21:33 crc kubenswrapper[5004]: I1201 08:21:33.657430 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 01 08:21:33 crc kubenswrapper[5004]: I1201 08:21:33.666603 5004 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 01 08:21:33 crc kubenswrapper[5004]: I1201 08:21:33.680290 5004 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 01 08:21:33 crc kubenswrapper[5004]: I1201 08:21:33.680881 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://1c5cdd1b3c526285033c11c62efe5731c0942b80c2e6d88eaeca2818df30f71e" gracePeriod=5 Dec 01 08:21:33 crc kubenswrapper[5004]: I1201 08:21:33.686624 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 01 08:21:33 crc kubenswrapper[5004]: I1201 08:21:33.690629 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 01 08:21:33 crc kubenswrapper[5004]: I1201 08:21:33.695473 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Dec 01 08:21:33 crc kubenswrapper[5004]: I1201 08:21:33.743480 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Dec 01 08:21:33 crc kubenswrapper[5004]: I1201 08:21:33.800672 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Dec 01 08:21:33 crc kubenswrapper[5004]: I1201 08:21:33.859398 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Dec 01 08:21:33 crc kubenswrapper[5004]: I1201 08:21:33.890889 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 01 08:21:34 crc kubenswrapper[5004]: I1201 08:21:34.041103 5004 reflector.go:368] Caches populated 
for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 01 08:21:34 crc kubenswrapper[5004]: I1201 08:21:34.171456 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Dec 01 08:21:34 crc kubenswrapper[5004]: I1201 08:21:34.248435 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-86f87f4d46-ps5p4"] Dec 01 08:21:34 crc kubenswrapper[5004]: E1201 08:21:34.248651 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15dc5e3f-02c6-474d-bd7b-d51ce42340b3" containerName="oauth-openshift" Dec 01 08:21:34 crc kubenswrapper[5004]: I1201 08:21:34.248664 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="15dc5e3f-02c6-474d-bd7b-d51ce42340b3" containerName="oauth-openshift" Dec 01 08:21:34 crc kubenswrapper[5004]: E1201 08:21:34.248674 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 01 08:21:34 crc kubenswrapper[5004]: I1201 08:21:34.248680 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 01 08:21:34 crc kubenswrapper[5004]: E1201 08:21:34.248695 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af8fc39f-8aa1-4e0d-8fe1-33aa79852f78" containerName="installer" Dec 01 08:21:34 crc kubenswrapper[5004]: I1201 08:21:34.248702 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="af8fc39f-8aa1-4e0d-8fe1-33aa79852f78" containerName="installer" Dec 01 08:21:34 crc kubenswrapper[5004]: I1201 08:21:34.248779 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 01 08:21:34 crc kubenswrapper[5004]: I1201 08:21:34.248790 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="af8fc39f-8aa1-4e0d-8fe1-33aa79852f78" 
containerName="installer" Dec 01 08:21:34 crc kubenswrapper[5004]: I1201 08:21:34.248799 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="15dc5e3f-02c6-474d-bd7b-d51ce42340b3" containerName="oauth-openshift" Dec 01 08:21:34 crc kubenswrapper[5004]: I1201 08:21:34.249140 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-86f87f4d46-ps5p4" Dec 01 08:21:34 crc kubenswrapper[5004]: I1201 08:21:34.252599 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 01 08:21:34 crc kubenswrapper[5004]: I1201 08:21:34.254856 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 01 08:21:34 crc kubenswrapper[5004]: I1201 08:21:34.255118 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 01 08:21:34 crc kubenswrapper[5004]: I1201 08:21:34.256541 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 01 08:21:34 crc kubenswrapper[5004]: I1201 08:21:34.257711 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 01 08:21:34 crc kubenswrapper[5004]: I1201 08:21:34.258695 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 01 08:21:34 crc kubenswrapper[5004]: I1201 08:21:34.258738 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 01 08:21:34 crc kubenswrapper[5004]: I1201 08:21:34.258889 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 01 08:21:34 crc 
kubenswrapper[5004]: I1201 08:21:34.258955 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 01 08:21:34 crc kubenswrapper[5004]: I1201 08:21:34.259122 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 01 08:21:34 crc kubenswrapper[5004]: I1201 08:21:34.259801 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 01 08:21:34 crc kubenswrapper[5004]: I1201 08:21:34.264042 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 01 08:21:34 crc kubenswrapper[5004]: I1201 08:21:34.268143 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 01 08:21:34 crc kubenswrapper[5004]: I1201 08:21:34.271320 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 01 08:21:34 crc kubenswrapper[5004]: I1201 08:21:34.279089 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 01 08:21:34 crc kubenswrapper[5004]: I1201 08:21:34.279769 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4a839f85-e334-4696-bbd6-d80f255d98f6-v4-0-config-system-router-certs\") pod \"oauth-openshift-86f87f4d46-ps5p4\" (UID: \"4a839f85-e334-4696-bbd6-d80f255d98f6\") " pod="openshift-authentication/oauth-openshift-86f87f4d46-ps5p4" Dec 01 08:21:34 crc kubenswrapper[5004]: I1201 08:21:34.279861 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/4a839f85-e334-4696-bbd6-d80f255d98f6-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-86f87f4d46-ps5p4\" (UID: \"4a839f85-e334-4696-bbd6-d80f255d98f6\") " pod="openshift-authentication/oauth-openshift-86f87f4d46-ps5p4" Dec 01 08:21:34 crc kubenswrapper[5004]: I1201 08:21:34.279905 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4a839f85-e334-4696-bbd6-d80f255d98f6-v4-0-config-user-template-login\") pod \"oauth-openshift-86f87f4d46-ps5p4\" (UID: \"4a839f85-e334-4696-bbd6-d80f255d98f6\") " pod="openshift-authentication/oauth-openshift-86f87f4d46-ps5p4" Dec 01 08:21:34 crc kubenswrapper[5004]: I1201 08:21:34.279986 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4a839f85-e334-4696-bbd6-d80f255d98f6-v4-0-config-system-session\") pod \"oauth-openshift-86f87f4d46-ps5p4\" (UID: \"4a839f85-e334-4696-bbd6-d80f255d98f6\") " pod="openshift-authentication/oauth-openshift-86f87f4d46-ps5p4" Dec 01 08:21:34 crc kubenswrapper[5004]: I1201 08:21:34.280044 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4a839f85-e334-4696-bbd6-d80f255d98f6-audit-policies\") pod \"oauth-openshift-86f87f4d46-ps5p4\" (UID: \"4a839f85-e334-4696-bbd6-d80f255d98f6\") " pod="openshift-authentication/oauth-openshift-86f87f4d46-ps5p4" Dec 01 08:21:34 crc kubenswrapper[5004]: I1201 08:21:34.280087 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4a839f85-e334-4696-bbd6-d80f255d98f6-v4-0-config-user-template-error\") pod \"oauth-openshift-86f87f4d46-ps5p4\" (UID: \"4a839f85-e334-4696-bbd6-d80f255d98f6\") 
" pod="openshift-authentication/oauth-openshift-86f87f4d46-ps5p4" Dec 01 08:21:34 crc kubenswrapper[5004]: I1201 08:21:34.280134 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4a839f85-e334-4696-bbd6-d80f255d98f6-v4-0-config-system-service-ca\") pod \"oauth-openshift-86f87f4d46-ps5p4\" (UID: \"4a839f85-e334-4696-bbd6-d80f255d98f6\") " pod="openshift-authentication/oauth-openshift-86f87f4d46-ps5p4" Dec 01 08:21:34 crc kubenswrapper[5004]: I1201 08:21:34.280175 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4a839f85-e334-4696-bbd6-d80f255d98f6-v4-0-config-system-serving-cert\") pod \"oauth-openshift-86f87f4d46-ps5p4\" (UID: \"4a839f85-e334-4696-bbd6-d80f255d98f6\") " pod="openshift-authentication/oauth-openshift-86f87f4d46-ps5p4" Dec 01 08:21:34 crc kubenswrapper[5004]: I1201 08:21:34.280221 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpn8w\" (UniqueName: \"kubernetes.io/projected/4a839f85-e334-4696-bbd6-d80f255d98f6-kube-api-access-xpn8w\") pod \"oauth-openshift-86f87f4d46-ps5p4\" (UID: \"4a839f85-e334-4696-bbd6-d80f255d98f6\") " pod="openshift-authentication/oauth-openshift-86f87f4d46-ps5p4" Dec 01 08:21:34 crc kubenswrapper[5004]: I1201 08:21:34.280268 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4a839f85-e334-4696-bbd6-d80f255d98f6-audit-dir\") pod \"oauth-openshift-86f87f4d46-ps5p4\" (UID: \"4a839f85-e334-4696-bbd6-d80f255d98f6\") " pod="openshift-authentication/oauth-openshift-86f87f4d46-ps5p4" Dec 01 08:21:34 crc kubenswrapper[5004]: I1201 08:21:34.280283 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-authentication/oauth-openshift-86f87f4d46-ps5p4"] Dec 01 08:21:34 crc kubenswrapper[5004]: I1201 08:21:34.280299 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4a839f85-e334-4696-bbd6-d80f255d98f6-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-86f87f4d46-ps5p4\" (UID: \"4a839f85-e334-4696-bbd6-d80f255d98f6\") " pod="openshift-authentication/oauth-openshift-86f87f4d46-ps5p4" Dec 01 08:21:34 crc kubenswrapper[5004]: I1201 08:21:34.280398 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4a839f85-e334-4696-bbd6-d80f255d98f6-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-86f87f4d46-ps5p4\" (UID: \"4a839f85-e334-4696-bbd6-d80f255d98f6\") " pod="openshift-authentication/oauth-openshift-86f87f4d46-ps5p4" Dec 01 08:21:34 crc kubenswrapper[5004]: I1201 08:21:34.280441 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4a839f85-e334-4696-bbd6-d80f255d98f6-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-86f87f4d46-ps5p4\" (UID: \"4a839f85-e334-4696-bbd6-d80f255d98f6\") " pod="openshift-authentication/oauth-openshift-86f87f4d46-ps5p4" Dec 01 08:21:34 crc kubenswrapper[5004]: I1201 08:21:34.280482 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4a839f85-e334-4696-bbd6-d80f255d98f6-v4-0-config-system-cliconfig\") pod \"oauth-openshift-86f87f4d46-ps5p4\" (UID: \"4a839f85-e334-4696-bbd6-d80f255d98f6\") " pod="openshift-authentication/oauth-openshift-86f87f4d46-ps5p4" Dec 01 08:21:34 crc kubenswrapper[5004]: 
I1201 08:21:34.316475 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 01 08:21:34 crc kubenswrapper[5004]: I1201 08:21:34.381398 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4a839f85-e334-4696-bbd6-d80f255d98f6-v4-0-config-system-session\") pod \"oauth-openshift-86f87f4d46-ps5p4\" (UID: \"4a839f85-e334-4696-bbd6-d80f255d98f6\") " pod="openshift-authentication/oauth-openshift-86f87f4d46-ps5p4" Dec 01 08:21:34 crc kubenswrapper[5004]: I1201 08:21:34.381709 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4a839f85-e334-4696-bbd6-d80f255d98f6-audit-policies\") pod \"oauth-openshift-86f87f4d46-ps5p4\" (UID: \"4a839f85-e334-4696-bbd6-d80f255d98f6\") " pod="openshift-authentication/oauth-openshift-86f87f4d46-ps5p4" Dec 01 08:21:34 crc kubenswrapper[5004]: I1201 08:21:34.381810 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4a839f85-e334-4696-bbd6-d80f255d98f6-v4-0-config-user-template-error\") pod \"oauth-openshift-86f87f4d46-ps5p4\" (UID: \"4a839f85-e334-4696-bbd6-d80f255d98f6\") " pod="openshift-authentication/oauth-openshift-86f87f4d46-ps5p4" Dec 01 08:21:34 crc kubenswrapper[5004]: I1201 08:21:34.381917 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4a839f85-e334-4696-bbd6-d80f255d98f6-v4-0-config-system-service-ca\") pod \"oauth-openshift-86f87f4d46-ps5p4\" (UID: \"4a839f85-e334-4696-bbd6-d80f255d98f6\") " pod="openshift-authentication/oauth-openshift-86f87f4d46-ps5p4" Dec 01 08:21:34 crc kubenswrapper[5004]: I1201 08:21:34.382050 5004 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4a839f85-e334-4696-bbd6-d80f255d98f6-v4-0-config-system-serving-cert\") pod \"oauth-openshift-86f87f4d46-ps5p4\" (UID: \"4a839f85-e334-4696-bbd6-d80f255d98f6\") " pod="openshift-authentication/oauth-openshift-86f87f4d46-ps5p4" Dec 01 08:21:34 crc kubenswrapper[5004]: I1201 08:21:34.382147 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xpn8w\" (UniqueName: \"kubernetes.io/projected/4a839f85-e334-4696-bbd6-d80f255d98f6-kube-api-access-xpn8w\") pod \"oauth-openshift-86f87f4d46-ps5p4\" (UID: \"4a839f85-e334-4696-bbd6-d80f255d98f6\") " pod="openshift-authentication/oauth-openshift-86f87f4d46-ps5p4" Dec 01 08:21:34 crc kubenswrapper[5004]: I1201 08:21:34.382239 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4a839f85-e334-4696-bbd6-d80f255d98f6-audit-dir\") pod \"oauth-openshift-86f87f4d46-ps5p4\" (UID: \"4a839f85-e334-4696-bbd6-d80f255d98f6\") " pod="openshift-authentication/oauth-openshift-86f87f4d46-ps5p4" Dec 01 08:21:34 crc kubenswrapper[5004]: I1201 08:21:34.382317 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4a839f85-e334-4696-bbd6-d80f255d98f6-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-86f87f4d46-ps5p4\" (UID: \"4a839f85-e334-4696-bbd6-d80f255d98f6\") " pod="openshift-authentication/oauth-openshift-86f87f4d46-ps5p4" Dec 01 08:21:34 crc kubenswrapper[5004]: I1201 08:21:34.382422 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4a839f85-e334-4696-bbd6-d80f255d98f6-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-86f87f4d46-ps5p4\" (UID: 
\"4a839f85-e334-4696-bbd6-d80f255d98f6\") " pod="openshift-authentication/oauth-openshift-86f87f4d46-ps5p4" Dec 01 08:21:34 crc kubenswrapper[5004]: I1201 08:21:34.382525 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4a839f85-e334-4696-bbd6-d80f255d98f6-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-86f87f4d46-ps5p4\" (UID: \"4a839f85-e334-4696-bbd6-d80f255d98f6\") " pod="openshift-authentication/oauth-openshift-86f87f4d46-ps5p4" Dec 01 08:21:34 crc kubenswrapper[5004]: I1201 08:21:34.382644 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4a839f85-e334-4696-bbd6-d80f255d98f6-v4-0-config-system-cliconfig\") pod \"oauth-openshift-86f87f4d46-ps5p4\" (UID: \"4a839f85-e334-4696-bbd6-d80f255d98f6\") " pod="openshift-authentication/oauth-openshift-86f87f4d46-ps5p4" Dec 01 08:21:34 crc kubenswrapper[5004]: I1201 08:21:34.382743 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4a839f85-e334-4696-bbd6-d80f255d98f6-v4-0-config-system-router-certs\") pod \"oauth-openshift-86f87f4d46-ps5p4\" (UID: \"4a839f85-e334-4696-bbd6-d80f255d98f6\") " pod="openshift-authentication/oauth-openshift-86f87f4d46-ps5p4" Dec 01 08:21:34 crc kubenswrapper[5004]: I1201 08:21:34.382346 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4a839f85-e334-4696-bbd6-d80f255d98f6-audit-dir\") pod \"oauth-openshift-86f87f4d46-ps5p4\" (UID: \"4a839f85-e334-4696-bbd6-d80f255d98f6\") " pod="openshift-authentication/oauth-openshift-86f87f4d46-ps5p4" Dec 01 08:21:34 crc kubenswrapper[5004]: I1201 08:21:34.382837 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4a839f85-e334-4696-bbd6-d80f255d98f6-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-86f87f4d46-ps5p4\" (UID: \"4a839f85-e334-4696-bbd6-d80f255d98f6\") " pod="openshift-authentication/oauth-openshift-86f87f4d46-ps5p4" Dec 01 08:21:34 crc kubenswrapper[5004]: I1201 08:21:34.383009 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4a839f85-e334-4696-bbd6-d80f255d98f6-v4-0-config-user-template-login\") pod \"oauth-openshift-86f87f4d46-ps5p4\" (UID: \"4a839f85-e334-4696-bbd6-d80f255d98f6\") " pod="openshift-authentication/oauth-openshift-86f87f4d46-ps5p4" Dec 01 08:21:34 crc kubenswrapper[5004]: I1201 08:21:34.383537 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4a839f85-e334-4696-bbd6-d80f255d98f6-v4-0-config-system-cliconfig\") pod \"oauth-openshift-86f87f4d46-ps5p4\" (UID: \"4a839f85-e334-4696-bbd6-d80f255d98f6\") " pod="openshift-authentication/oauth-openshift-86f87f4d46-ps5p4" Dec 01 08:21:34 crc kubenswrapper[5004]: I1201 08:21:34.383716 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4a839f85-e334-4696-bbd6-d80f255d98f6-audit-policies\") pod \"oauth-openshift-86f87f4d46-ps5p4\" (UID: \"4a839f85-e334-4696-bbd6-d80f255d98f6\") " pod="openshift-authentication/oauth-openshift-86f87f4d46-ps5p4" Dec 01 08:21:34 crc kubenswrapper[5004]: I1201 08:21:34.384621 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4a839f85-e334-4696-bbd6-d80f255d98f6-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-86f87f4d46-ps5p4\" (UID: \"4a839f85-e334-4696-bbd6-d80f255d98f6\") " 
pod="openshift-authentication/oauth-openshift-86f87f4d46-ps5p4" Dec 01 08:21:34 crc kubenswrapper[5004]: I1201 08:21:34.387509 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4a839f85-e334-4696-bbd6-d80f255d98f6-v4-0-config-system-service-ca\") pod \"oauth-openshift-86f87f4d46-ps5p4\" (UID: \"4a839f85-e334-4696-bbd6-d80f255d98f6\") " pod="openshift-authentication/oauth-openshift-86f87f4d46-ps5p4" Dec 01 08:21:34 crc kubenswrapper[5004]: I1201 08:21:34.389222 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4a839f85-e334-4696-bbd6-d80f255d98f6-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-86f87f4d46-ps5p4\" (UID: \"4a839f85-e334-4696-bbd6-d80f255d98f6\") " pod="openshift-authentication/oauth-openshift-86f87f4d46-ps5p4" Dec 01 08:21:34 crc kubenswrapper[5004]: I1201 08:21:34.389743 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4a839f85-e334-4696-bbd6-d80f255d98f6-v4-0-config-user-template-login\") pod \"oauth-openshift-86f87f4d46-ps5p4\" (UID: \"4a839f85-e334-4696-bbd6-d80f255d98f6\") " pod="openshift-authentication/oauth-openshift-86f87f4d46-ps5p4" Dec 01 08:21:34 crc kubenswrapper[5004]: I1201 08:21:34.390032 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Dec 01 08:21:34 crc kubenswrapper[5004]: I1201 08:21:34.390014 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4a839f85-e334-4696-bbd6-d80f255d98f6-v4-0-config-system-router-certs\") pod \"oauth-openshift-86f87f4d46-ps5p4\" (UID: \"4a839f85-e334-4696-bbd6-d80f255d98f6\") " pod="openshift-authentication/oauth-openshift-86f87f4d46-ps5p4" Dec 01 08:21:34 crc 
kubenswrapper[5004]: I1201 08:21:34.390485 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4a839f85-e334-4696-bbd6-d80f255d98f6-v4-0-config-system-serving-cert\") pod \"oauth-openshift-86f87f4d46-ps5p4\" (UID: \"4a839f85-e334-4696-bbd6-d80f255d98f6\") " pod="openshift-authentication/oauth-openshift-86f87f4d46-ps5p4" Dec 01 08:21:34 crc kubenswrapper[5004]: I1201 08:21:34.392129 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4a839f85-e334-4696-bbd6-d80f255d98f6-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-86f87f4d46-ps5p4\" (UID: \"4a839f85-e334-4696-bbd6-d80f255d98f6\") " pod="openshift-authentication/oauth-openshift-86f87f4d46-ps5p4" Dec 01 08:21:34 crc kubenswrapper[5004]: I1201 08:21:34.393965 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4a839f85-e334-4696-bbd6-d80f255d98f6-v4-0-config-system-session\") pod \"oauth-openshift-86f87f4d46-ps5p4\" (UID: \"4a839f85-e334-4696-bbd6-d80f255d98f6\") " pod="openshift-authentication/oauth-openshift-86f87f4d46-ps5p4" Dec 01 08:21:34 crc kubenswrapper[5004]: I1201 08:21:34.395403 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4a839f85-e334-4696-bbd6-d80f255d98f6-v4-0-config-user-template-error\") pod \"oauth-openshift-86f87f4d46-ps5p4\" (UID: \"4a839f85-e334-4696-bbd6-d80f255d98f6\") " pod="openshift-authentication/oauth-openshift-86f87f4d46-ps5p4" Dec 01 08:21:34 crc kubenswrapper[5004]: I1201 08:21:34.399830 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Dec 01 08:21:34 crc kubenswrapper[5004]: I1201 08:21:34.403139 5004 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4a839f85-e334-4696-bbd6-d80f255d98f6-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-86f87f4d46-ps5p4\" (UID: \"4a839f85-e334-4696-bbd6-d80f255d98f6\") " pod="openshift-authentication/oauth-openshift-86f87f4d46-ps5p4" Dec 01 08:21:34 crc kubenswrapper[5004]: I1201 08:21:34.409829 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpn8w\" (UniqueName: \"kubernetes.io/projected/4a839f85-e334-4696-bbd6-d80f255d98f6-kube-api-access-xpn8w\") pod \"oauth-openshift-86f87f4d46-ps5p4\" (UID: \"4a839f85-e334-4696-bbd6-d80f255d98f6\") " pod="openshift-authentication/oauth-openshift-86f87f4d46-ps5p4" Dec 01 08:21:34 crc kubenswrapper[5004]: I1201 08:21:34.466864 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 01 08:21:34 crc kubenswrapper[5004]: I1201 08:21:34.468857 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 01 08:21:34 crc kubenswrapper[5004]: I1201 08:21:34.486923 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Dec 01 08:21:34 crc kubenswrapper[5004]: I1201 08:21:34.523533 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 01 08:21:34 crc kubenswrapper[5004]: I1201 08:21:34.576509 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-86f87f4d46-ps5p4" Dec 01 08:21:34 crc kubenswrapper[5004]: I1201 08:21:34.655299 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Dec 01 08:21:34 crc kubenswrapper[5004]: I1201 08:21:34.763379 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 01 08:21:34 crc kubenswrapper[5004]: I1201 08:21:34.779497 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15dc5e3f-02c6-474d-bd7b-d51ce42340b3" path="/var/lib/kubelet/pods/15dc5e3f-02c6-474d-bd7b-d51ce42340b3/volumes" Dec 01 08:21:34 crc kubenswrapper[5004]: I1201 08:21:34.780714 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-86f87f4d46-ps5p4"] Dec 01 08:21:34 crc kubenswrapper[5004]: I1201 08:21:34.811287 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 01 08:21:34 crc kubenswrapper[5004]: I1201 08:21:34.817858 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 01 08:21:34 crc kubenswrapper[5004]: I1201 08:21:34.817918 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 01 08:21:34 crc kubenswrapper[5004]: I1201 08:21:34.848940 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 01 08:21:34 crc kubenswrapper[5004]: I1201 08:21:34.859246 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 01 08:21:35 crc kubenswrapper[5004]: I1201 08:21:35.095744 5004 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Dec 01 08:21:35 crc kubenswrapper[5004]: I1201 08:21:35.122193 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 01 08:21:35 crc kubenswrapper[5004]: I1201 08:21:35.124215 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Dec 01 08:21:35 crc kubenswrapper[5004]: I1201 08:21:35.204279 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 01 08:21:35 crc kubenswrapper[5004]: I1201 08:21:35.231016 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 01 08:21:35 crc kubenswrapper[5004]: I1201 08:21:35.273285 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-86f87f4d46-ps5p4" event={"ID":"4a839f85-e334-4696-bbd6-d80f255d98f6","Type":"ContainerStarted","Data":"1e6cd266ad7795d97bbccb4a19913adeb5d6fe706acb942e387e263913226810"} Dec 01 08:21:35 crc kubenswrapper[5004]: I1201 08:21:35.273338 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-86f87f4d46-ps5p4" event={"ID":"4a839f85-e334-4696-bbd6-d80f255d98f6","Type":"ContainerStarted","Data":"3157b3a95ec09ad608e0f036f9e153d05f6a9f8bcc3877dc15a56ac4f77edf16"} Dec 01 08:21:35 crc kubenswrapper[5004]: I1201 08:21:35.273611 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-86f87f4d46-ps5p4" Dec 01 08:21:35 crc kubenswrapper[5004]: I1201 08:21:35.294316 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-86f87f4d46-ps5p4" podStartSLOduration=56.294297415 podStartE2EDuration="56.294297415s" podCreationTimestamp="2025-12-01 08:20:39 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:21:35.293338019 +0000 UTC m=+272.858329991" watchObservedRunningTime="2025-12-01 08:21:35.294297415 +0000 UTC m=+272.859289417" Dec 01 08:21:35 crc kubenswrapper[5004]: I1201 08:21:35.297095 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 01 08:21:35 crc kubenswrapper[5004]: I1201 08:21:35.436750 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Dec 01 08:21:35 crc kubenswrapper[5004]: I1201 08:21:35.503426 5004 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Dec 01 08:21:35 crc kubenswrapper[5004]: I1201 08:21:35.519225 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 01 08:21:35 crc kubenswrapper[5004]: I1201 08:21:35.667899 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 01 08:21:35 crc kubenswrapper[5004]: I1201 08:21:35.695316 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Dec 01 08:21:35 crc kubenswrapper[5004]: I1201 08:21:35.704332 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Dec 01 08:21:35 crc kubenswrapper[5004]: I1201 08:21:35.719325 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 01 08:21:35 crc kubenswrapper[5004]: I1201 08:21:35.754745 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-authentication/oauth-openshift-86f87f4d46-ps5p4" Dec 01 08:21:35 crc kubenswrapper[5004]: I1201 08:21:35.783903 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Dec 01 08:21:35 crc kubenswrapper[5004]: I1201 08:21:35.800263 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 01 08:21:35 crc kubenswrapper[5004]: I1201 08:21:35.815995 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 01 08:21:36 crc kubenswrapper[5004]: I1201 08:21:36.011518 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 01 08:21:36 crc kubenswrapper[5004]: I1201 08:21:36.077230 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Dec 01 08:21:36 crc kubenswrapper[5004]: I1201 08:21:36.118244 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Dec 01 08:21:36 crc kubenswrapper[5004]: I1201 08:21:36.324753 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 01 08:21:36 crc kubenswrapper[5004]: I1201 08:21:36.389995 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Dec 01 08:21:36 crc kubenswrapper[5004]: I1201 08:21:36.441665 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Dec 01 08:21:36 crc kubenswrapper[5004]: I1201 08:21:36.693914 5004 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-apiserver"/"image-import-ca" Dec 01 08:21:36 crc kubenswrapper[5004]: I1201 08:21:36.700371 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 01 08:21:36 crc kubenswrapper[5004]: I1201 08:21:36.766918 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 01 08:21:36 crc kubenswrapper[5004]: I1201 08:21:36.910298 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 01 08:21:37 crc kubenswrapper[5004]: I1201 08:21:37.180513 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 01 08:21:37 crc kubenswrapper[5004]: I1201 08:21:37.315035 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Dec 01 08:21:37 crc kubenswrapper[5004]: I1201 08:21:37.328751 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 01 08:21:37 crc kubenswrapper[5004]: I1201 08:21:37.337027 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Dec 01 08:21:37 crc kubenswrapper[5004]: I1201 08:21:37.594280 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 01 08:21:37 crc kubenswrapper[5004]: I1201 08:21:37.616854 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 01 08:21:37 crc kubenswrapper[5004]: I1201 08:21:37.825739 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 01 08:21:37 crc kubenswrapper[5004]: I1201 08:21:37.885865 5004 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 01 08:21:37 crc kubenswrapper[5004]: I1201 08:21:37.945427 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Dec 01 08:21:37 crc kubenswrapper[5004]: I1201 08:21:37.988051 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 01 08:21:38 crc kubenswrapper[5004]: I1201 08:21:38.041207 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 01 08:21:38 crc kubenswrapper[5004]: I1201 08:21:38.050300 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Dec 01 08:21:38 crc kubenswrapper[5004]: I1201 08:21:38.260543 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 01 08:21:38 crc kubenswrapper[5004]: I1201 08:21:38.664327 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 01 08:21:38 crc kubenswrapper[5004]: I1201 08:21:38.692080 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Dec 01 08:21:38 crc kubenswrapper[5004]: I1201 08:21:38.814817 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 01 08:21:38 crc kubenswrapper[5004]: I1201 08:21:38.850983 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 01 08:21:38 crc kubenswrapper[5004]: I1201 08:21:38.881737 5004 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-config-operator"/"config-operator-serving-cert" Dec 01 08:21:38 crc kubenswrapper[5004]: I1201 08:21:38.902979 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 01 08:21:38 crc kubenswrapper[5004]: I1201 08:21:38.928970 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 01 08:21:38 crc kubenswrapper[5004]: I1201 08:21:38.943760 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 01 08:21:39 crc kubenswrapper[5004]: I1201 08:21:39.192281 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 01 08:21:39 crc kubenswrapper[5004]: I1201 08:21:39.266330 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 01 08:21:39 crc kubenswrapper[5004]: I1201 08:21:39.266418 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 08:21:39 crc kubenswrapper[5004]: I1201 08:21:39.301635 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 01 08:21:39 crc kubenswrapper[5004]: I1201 08:21:39.301713 5004 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="1c5cdd1b3c526285033c11c62efe5731c0942b80c2e6d88eaeca2818df30f71e" exitCode=137 Dec 01 08:21:39 crc kubenswrapper[5004]: I1201 08:21:39.301759 5004 scope.go:117] "RemoveContainer" containerID="1c5cdd1b3c526285033c11c62efe5731c0942b80c2e6d88eaeca2818df30f71e" Dec 01 08:21:39 crc kubenswrapper[5004]: I1201 08:21:39.301803 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 08:21:39 crc kubenswrapper[5004]: I1201 08:21:39.319073 5004 scope.go:117] "RemoveContainer" containerID="1c5cdd1b3c526285033c11c62efe5731c0942b80c2e6d88eaeca2818df30f71e" Dec 01 08:21:39 crc kubenswrapper[5004]: E1201 08:21:39.319643 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c5cdd1b3c526285033c11c62efe5731c0942b80c2e6d88eaeca2818df30f71e\": container with ID starting with 1c5cdd1b3c526285033c11c62efe5731c0942b80c2e6d88eaeca2818df30f71e not found: ID does not exist" containerID="1c5cdd1b3c526285033c11c62efe5731c0942b80c2e6d88eaeca2818df30f71e" Dec 01 08:21:39 crc kubenswrapper[5004]: I1201 08:21:39.319694 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c5cdd1b3c526285033c11c62efe5731c0942b80c2e6d88eaeca2818df30f71e"} err="failed to get container status \"1c5cdd1b3c526285033c11c62efe5731c0942b80c2e6d88eaeca2818df30f71e\": rpc error: code = NotFound desc = could 
not find container \"1c5cdd1b3c526285033c11c62efe5731c0942b80c2e6d88eaeca2818df30f71e\": container with ID starting with 1c5cdd1b3c526285033c11c62efe5731c0942b80c2e6d88eaeca2818df30f71e not found: ID does not exist" Dec 01 08:21:39 crc kubenswrapper[5004]: I1201 08:21:39.351361 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 01 08:21:39 crc kubenswrapper[5004]: I1201 08:21:39.354929 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 01 08:21:39 crc kubenswrapper[5004]: I1201 08:21:39.355023 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 01 08:21:39 crc kubenswrapper[5004]: I1201 08:21:39.355032 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 08:21:39 crc kubenswrapper[5004]: I1201 08:21:39.355102 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 08:21:39 crc kubenswrapper[5004]: I1201 08:21:39.355075 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 01 08:21:39 crc kubenswrapper[5004]: I1201 08:21:39.355148 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 08:21:39 crc kubenswrapper[5004]: I1201 08:21:39.355202 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 01 08:21:39 crc kubenswrapper[5004]: I1201 08:21:39.355265 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 01 08:21:39 crc kubenswrapper[5004]: I1201 08:21:39.355427 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 08:21:39 crc kubenswrapper[5004]: I1201 08:21:39.355649 5004 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 01 08:21:39 crc kubenswrapper[5004]: I1201 08:21:39.355679 5004 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Dec 01 08:21:39 crc kubenswrapper[5004]: I1201 08:21:39.355698 5004 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Dec 01 08:21:39 crc kubenswrapper[5004]: I1201 08:21:39.355715 5004 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Dec 01 08:21:39 crc kubenswrapper[5004]: I1201 08:21:39.363221 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 01 08:21:39 crc kubenswrapper[5004]: I1201 08:21:39.366431 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 08:21:39 crc kubenswrapper[5004]: I1201 08:21:39.457307 5004 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 01 08:21:39 crc kubenswrapper[5004]: I1201 08:21:39.644980 5004 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 01 08:21:39 crc kubenswrapper[5004]: I1201 08:21:39.999234 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 01 08:21:40 crc kubenswrapper[5004]: I1201 08:21:40.769003 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Dec 01 08:21:40 crc kubenswrapper[5004]: I1201 08:21:40.866029 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 01 08:21:56 crc kubenswrapper[5004]: I1201 08:21:56.410302 5004 generic.go:334] "Generic (PLEG): container finished" podID="8a9f98dc-e84b-4fb8-9d4d-69c766486ebb" containerID="886c8baa541d32465cd9ae76af7323fb43bd8bcafa4a5a26793c1480ef9bf2cc" exitCode=0 Dec 01 08:21:56 crc kubenswrapper[5004]: I1201 08:21:56.410817 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-8g8bf" event={"ID":"8a9f98dc-e84b-4fb8-9d4d-69c766486ebb","Type":"ContainerDied","Data":"886c8baa541d32465cd9ae76af7323fb43bd8bcafa4a5a26793c1480ef9bf2cc"} Dec 01 08:21:56 crc kubenswrapper[5004]: I1201 08:21:56.412461 5004 scope.go:117] "RemoveContainer" containerID="886c8baa541d32465cd9ae76af7323fb43bd8bcafa4a5a26793c1480ef9bf2cc" Dec 01 08:21:57 crc kubenswrapper[5004]: I1201 08:21:57.420948 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/marketplace-operator-79b997595-8g8bf" event={"ID":"8a9f98dc-e84b-4fb8-9d4d-69c766486ebb","Type":"ContainerStarted","Data":"37403851b7a3995647d9472cee025a2ab98a3adeeb95ee3d5d85053ebf1f1384"} Dec 01 08:21:57 crc kubenswrapper[5004]: I1201 08:21:57.421893 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-8g8bf" Dec 01 08:21:57 crc kubenswrapper[5004]: I1201 08:21:57.426763 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-8g8bf" Dec 01 08:22:02 crc kubenswrapper[5004]: I1201 08:22:02.607399 5004 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials Dec 01 08:22:13 crc kubenswrapper[5004]: I1201 08:22:13.309779 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-zj88j"] Dec 01 08:22:13 crc kubenswrapper[5004]: I1201 08:22:13.310551 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-zj88j" podUID="e796fca9-e620-4e16-bda0-0e722b91b53c" containerName="controller-manager" containerID="cri-o://8c1f2a555dc364d531d5a43583c945284728abe28b766059988547fb43857a5b" gracePeriod=30 Dec 01 08:22:13 crc kubenswrapper[5004]: I1201 08:22:13.411981 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-7khqv"] Dec 01 08:22:13 crc kubenswrapper[5004]: I1201 08:22:13.412247 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7khqv" podUID="d2bbf8d8-0338-4af4-8d6a-402033f87676" containerName="route-controller-manager" containerID="cri-o://03edc9ff781b4fd46a8e1e10ed0b366003a6d2f0ffda4f693fc27b63d2c61230" gracePeriod=30 Dec 01 
08:22:13 crc kubenswrapper[5004]: I1201 08:22:13.514964 5004 generic.go:334] "Generic (PLEG): container finished" podID="e796fca9-e620-4e16-bda0-0e722b91b53c" containerID="8c1f2a555dc364d531d5a43583c945284728abe28b766059988547fb43857a5b" exitCode=0 Dec 01 08:22:13 crc kubenswrapper[5004]: I1201 08:22:13.515481 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-zj88j" event={"ID":"e796fca9-e620-4e16-bda0-0e722b91b53c","Type":"ContainerDied","Data":"8c1f2a555dc364d531d5a43583c945284728abe28b766059988547fb43857a5b"} Dec 01 08:22:13 crc kubenswrapper[5004]: I1201 08:22:13.656529 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-zj88j" Dec 01 08:22:13 crc kubenswrapper[5004]: I1201 08:22:13.820455 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e796fca9-e620-4e16-bda0-0e722b91b53c-serving-cert\") pod \"e796fca9-e620-4e16-bda0-0e722b91b53c\" (UID: \"e796fca9-e620-4e16-bda0-0e722b91b53c\") " Dec 01 08:22:13 crc kubenswrapper[5004]: I1201 08:22:13.820502 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e796fca9-e620-4e16-bda0-0e722b91b53c-proxy-ca-bundles\") pod \"e796fca9-e620-4e16-bda0-0e722b91b53c\" (UID: \"e796fca9-e620-4e16-bda0-0e722b91b53c\") " Dec 01 08:22:13 crc kubenswrapper[5004]: I1201 08:22:13.820523 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e796fca9-e620-4e16-bda0-0e722b91b53c-client-ca\") pod \"e796fca9-e620-4e16-bda0-0e722b91b53c\" (UID: \"e796fca9-e620-4e16-bda0-0e722b91b53c\") " Dec 01 08:22:13 crc kubenswrapper[5004]: I1201 08:22:13.820547 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/e796fca9-e620-4e16-bda0-0e722b91b53c-config\") pod \"e796fca9-e620-4e16-bda0-0e722b91b53c\" (UID: \"e796fca9-e620-4e16-bda0-0e722b91b53c\") " Dec 01 08:22:13 crc kubenswrapper[5004]: I1201 08:22:13.820645 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rv442\" (UniqueName: \"kubernetes.io/projected/e796fca9-e620-4e16-bda0-0e722b91b53c-kube-api-access-rv442\") pod \"e796fca9-e620-4e16-bda0-0e722b91b53c\" (UID: \"e796fca9-e620-4e16-bda0-0e722b91b53c\") " Dec 01 08:22:13 crc kubenswrapper[5004]: I1201 08:22:13.821248 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e796fca9-e620-4e16-bda0-0e722b91b53c-client-ca" (OuterVolumeSpecName: "client-ca") pod "e796fca9-e620-4e16-bda0-0e722b91b53c" (UID: "e796fca9-e620-4e16-bda0-0e722b91b53c"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:22:13 crc kubenswrapper[5004]: I1201 08:22:13.821306 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e796fca9-e620-4e16-bda0-0e722b91b53c-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "e796fca9-e620-4e16-bda0-0e722b91b53c" (UID: "e796fca9-e620-4e16-bda0-0e722b91b53c"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:22:13 crc kubenswrapper[5004]: I1201 08:22:13.821412 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e796fca9-e620-4e16-bda0-0e722b91b53c-config" (OuterVolumeSpecName: "config") pod "e796fca9-e620-4e16-bda0-0e722b91b53c" (UID: "e796fca9-e620-4e16-bda0-0e722b91b53c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:22:13 crc kubenswrapper[5004]: I1201 08:22:13.836811 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e796fca9-e620-4e16-bda0-0e722b91b53c-kube-api-access-rv442" (OuterVolumeSpecName: "kube-api-access-rv442") pod "e796fca9-e620-4e16-bda0-0e722b91b53c" (UID: "e796fca9-e620-4e16-bda0-0e722b91b53c"). InnerVolumeSpecName "kube-api-access-rv442". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:22:13 crc kubenswrapper[5004]: I1201 08:22:13.837053 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e796fca9-e620-4e16-bda0-0e722b91b53c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e796fca9-e620-4e16-bda0-0e722b91b53c" (UID: "e796fca9-e620-4e16-bda0-0e722b91b53c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:22:13 crc kubenswrapper[5004]: I1201 08:22:13.846429 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7khqv" Dec 01 08:22:13 crc kubenswrapper[5004]: I1201 08:22:13.921732 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rv442\" (UniqueName: \"kubernetes.io/projected/e796fca9-e620-4e16-bda0-0e722b91b53c-kube-api-access-rv442\") on node \"crc\" DevicePath \"\"" Dec 01 08:22:13 crc kubenswrapper[5004]: I1201 08:22:13.921759 5004 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e796fca9-e620-4e16-bda0-0e722b91b53c-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 08:22:13 crc kubenswrapper[5004]: I1201 08:22:13.921769 5004 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e796fca9-e620-4e16-bda0-0e722b91b53c-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 01 08:22:13 crc kubenswrapper[5004]: I1201 08:22:13.921777 5004 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e796fca9-e620-4e16-bda0-0e722b91b53c-client-ca\") on node \"crc\" DevicePath \"\"" Dec 01 08:22:13 crc kubenswrapper[5004]: I1201 08:22:13.921786 5004 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e796fca9-e620-4e16-bda0-0e722b91b53c-config\") on node \"crc\" DevicePath \"\"" Dec 01 08:22:14 crc kubenswrapper[5004]: I1201 08:22:14.022534 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d2bbf8d8-0338-4af4-8d6a-402033f87676-client-ca\") pod \"d2bbf8d8-0338-4af4-8d6a-402033f87676\" (UID: \"d2bbf8d8-0338-4af4-8d6a-402033f87676\") " Dec 01 08:22:14 crc kubenswrapper[5004]: I1201 08:22:14.022663 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/d2bbf8d8-0338-4af4-8d6a-402033f87676-config\") pod \"d2bbf8d8-0338-4af4-8d6a-402033f87676\" (UID: \"d2bbf8d8-0338-4af4-8d6a-402033f87676\") " Dec 01 08:22:14 crc kubenswrapper[5004]: I1201 08:22:14.022706 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d2bbf8d8-0338-4af4-8d6a-402033f87676-serving-cert\") pod \"d2bbf8d8-0338-4af4-8d6a-402033f87676\" (UID: \"d2bbf8d8-0338-4af4-8d6a-402033f87676\") " Dec 01 08:22:14 crc kubenswrapper[5004]: I1201 08:22:14.022742 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ghv98\" (UniqueName: \"kubernetes.io/projected/d2bbf8d8-0338-4af4-8d6a-402033f87676-kube-api-access-ghv98\") pod \"d2bbf8d8-0338-4af4-8d6a-402033f87676\" (UID: \"d2bbf8d8-0338-4af4-8d6a-402033f87676\") " Dec 01 08:22:14 crc kubenswrapper[5004]: I1201 08:22:14.023541 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2bbf8d8-0338-4af4-8d6a-402033f87676-config" (OuterVolumeSpecName: "config") pod "d2bbf8d8-0338-4af4-8d6a-402033f87676" (UID: "d2bbf8d8-0338-4af4-8d6a-402033f87676"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:22:14 crc kubenswrapper[5004]: I1201 08:22:14.023825 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2bbf8d8-0338-4af4-8d6a-402033f87676-client-ca" (OuterVolumeSpecName: "client-ca") pod "d2bbf8d8-0338-4af4-8d6a-402033f87676" (UID: "d2bbf8d8-0338-4af4-8d6a-402033f87676"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:22:14 crc kubenswrapper[5004]: I1201 08:22:14.027536 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2bbf8d8-0338-4af4-8d6a-402033f87676-kube-api-access-ghv98" (OuterVolumeSpecName: "kube-api-access-ghv98") pod "d2bbf8d8-0338-4af4-8d6a-402033f87676" (UID: "d2bbf8d8-0338-4af4-8d6a-402033f87676"). InnerVolumeSpecName "kube-api-access-ghv98". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:22:14 crc kubenswrapper[5004]: I1201 08:22:14.027764 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2bbf8d8-0338-4af4-8d6a-402033f87676-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d2bbf8d8-0338-4af4-8d6a-402033f87676" (UID: "d2bbf8d8-0338-4af4-8d6a-402033f87676"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:22:14 crc kubenswrapper[5004]: I1201 08:22:14.123903 5004 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2bbf8d8-0338-4af4-8d6a-402033f87676-config\") on node \"crc\" DevicePath \"\"" Dec 01 08:22:14 crc kubenswrapper[5004]: I1201 08:22:14.123986 5004 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d2bbf8d8-0338-4af4-8d6a-402033f87676-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 08:22:14 crc kubenswrapper[5004]: I1201 08:22:14.124018 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ghv98\" (UniqueName: \"kubernetes.io/projected/d2bbf8d8-0338-4af4-8d6a-402033f87676-kube-api-access-ghv98\") on node \"crc\" DevicePath \"\"" Dec 01 08:22:14 crc kubenswrapper[5004]: I1201 08:22:14.124042 5004 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d2bbf8d8-0338-4af4-8d6a-402033f87676-client-ca\") on node \"crc\" DevicePath 
\"\"" Dec 01 08:22:14 crc kubenswrapper[5004]: I1201 08:22:14.523873 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-zj88j" Dec 01 08:22:14 crc kubenswrapper[5004]: I1201 08:22:14.524307 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-zj88j" event={"ID":"e796fca9-e620-4e16-bda0-0e722b91b53c","Type":"ContainerDied","Data":"c3540b12fb85126fc3cb459c2bb712627922475242ff4f7e81cbcb4153a32976"} Dec 01 08:22:14 crc kubenswrapper[5004]: I1201 08:22:14.524434 5004 scope.go:117] "RemoveContainer" containerID="8c1f2a555dc364d531d5a43583c945284728abe28b766059988547fb43857a5b" Dec 01 08:22:14 crc kubenswrapper[5004]: I1201 08:22:14.528083 5004 generic.go:334] "Generic (PLEG): container finished" podID="d2bbf8d8-0338-4af4-8d6a-402033f87676" containerID="03edc9ff781b4fd46a8e1e10ed0b366003a6d2f0ffda4f693fc27b63d2c61230" exitCode=0 Dec 01 08:22:14 crc kubenswrapper[5004]: I1201 08:22:14.528190 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7khqv" Dec 01 08:22:14 crc kubenswrapper[5004]: I1201 08:22:14.528197 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7khqv" event={"ID":"d2bbf8d8-0338-4af4-8d6a-402033f87676","Type":"ContainerDied","Data":"03edc9ff781b4fd46a8e1e10ed0b366003a6d2f0ffda4f693fc27b63d2c61230"} Dec 01 08:22:14 crc kubenswrapper[5004]: I1201 08:22:14.528282 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7khqv" event={"ID":"d2bbf8d8-0338-4af4-8d6a-402033f87676","Type":"ContainerDied","Data":"eaeaa64e5cf8ec97c4cb29e5099fed5888c6853e6cc142278e293caefc0b2c6d"} Dec 01 08:22:14 crc kubenswrapper[5004]: I1201 08:22:14.553915 5004 scope.go:117] "RemoveContainer" containerID="03edc9ff781b4fd46a8e1e10ed0b366003a6d2f0ffda4f693fc27b63d2c61230" Dec 01 08:22:14 crc kubenswrapper[5004]: I1201 08:22:14.579411 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-7khqv"] Dec 01 08:22:14 crc kubenswrapper[5004]: I1201 08:22:14.583749 5004 scope.go:117] "RemoveContainer" containerID="03edc9ff781b4fd46a8e1e10ed0b366003a6d2f0ffda4f693fc27b63d2c61230" Dec 01 08:22:14 crc kubenswrapper[5004]: E1201 08:22:14.585249 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03edc9ff781b4fd46a8e1e10ed0b366003a6d2f0ffda4f693fc27b63d2c61230\": container with ID starting with 03edc9ff781b4fd46a8e1e10ed0b366003a6d2f0ffda4f693fc27b63d2c61230 not found: ID does not exist" containerID="03edc9ff781b4fd46a8e1e10ed0b366003a6d2f0ffda4f693fc27b63d2c61230" Dec 01 08:22:14 crc kubenswrapper[5004]: I1201 08:22:14.585321 5004 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"03edc9ff781b4fd46a8e1e10ed0b366003a6d2f0ffda4f693fc27b63d2c61230"} err="failed to get container status \"03edc9ff781b4fd46a8e1e10ed0b366003a6d2f0ffda4f693fc27b63d2c61230\": rpc error: code = NotFound desc = could not find container \"03edc9ff781b4fd46a8e1e10ed0b366003a6d2f0ffda4f693fc27b63d2c61230\": container with ID starting with 03edc9ff781b4fd46a8e1e10ed0b366003a6d2f0ffda4f693fc27b63d2c61230 not found: ID does not exist" Dec 01 08:22:14 crc kubenswrapper[5004]: I1201 08:22:14.593855 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-7khqv"] Dec 01 08:22:14 crc kubenswrapper[5004]: I1201 08:22:14.603316 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-zj88j"] Dec 01 08:22:14 crc kubenswrapper[5004]: I1201 08:22:14.609100 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-zj88j"] Dec 01 08:22:14 crc kubenswrapper[5004]: I1201 08:22:14.769206 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2bbf8d8-0338-4af4-8d6a-402033f87676" path="/var/lib/kubelet/pods/d2bbf8d8-0338-4af4-8d6a-402033f87676/volumes" Dec 01 08:22:14 crc kubenswrapper[5004]: I1201 08:22:14.770602 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e796fca9-e620-4e16-bda0-0e722b91b53c" path="/var/lib/kubelet/pods/e796fca9-e620-4e16-bda0-0e722b91b53c/volumes" Dec 01 08:22:15 crc kubenswrapper[5004]: I1201 08:22:15.230446 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-84d754dbcf-qct8w"] Dec 01 08:22:15 crc kubenswrapper[5004]: E1201 08:22:15.230834 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e796fca9-e620-4e16-bda0-0e722b91b53c" containerName="controller-manager" Dec 01 08:22:15 crc kubenswrapper[5004]: I1201 
08:22:15.230863 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="e796fca9-e620-4e16-bda0-0e722b91b53c" containerName="controller-manager" Dec 01 08:22:15 crc kubenswrapper[5004]: E1201 08:22:15.230893 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2bbf8d8-0338-4af4-8d6a-402033f87676" containerName="route-controller-manager" Dec 01 08:22:15 crc kubenswrapper[5004]: I1201 08:22:15.230907 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2bbf8d8-0338-4af4-8d6a-402033f87676" containerName="route-controller-manager" Dec 01 08:22:15 crc kubenswrapper[5004]: I1201 08:22:15.231108 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2bbf8d8-0338-4af4-8d6a-402033f87676" containerName="route-controller-manager" Dec 01 08:22:15 crc kubenswrapper[5004]: I1201 08:22:15.231140 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="e796fca9-e620-4e16-bda0-0e722b91b53c" containerName="controller-manager" Dec 01 08:22:15 crc kubenswrapper[5004]: I1201 08:22:15.231775 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-84d754dbcf-qct8w" Dec 01 08:22:15 crc kubenswrapper[5004]: I1201 08:22:15.240253 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 01 08:22:15 crc kubenswrapper[5004]: I1201 08:22:15.240492 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 01 08:22:15 crc kubenswrapper[5004]: I1201 08:22:15.241187 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 01 08:22:15 crc kubenswrapper[5004]: I1201 08:22:15.241395 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 01 08:22:15 crc kubenswrapper[5004]: I1201 08:22:15.242682 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 01 08:22:15 crc kubenswrapper[5004]: I1201 08:22:15.243087 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 01 08:22:15 crc kubenswrapper[5004]: I1201 08:22:15.249728 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 01 08:22:15 crc kubenswrapper[5004]: I1201 08:22:15.255014 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7884944565-frcmw"] Dec 01 08:22:15 crc kubenswrapper[5004]: I1201 08:22:15.258998 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7884944565-frcmw"] Dec 01 08:22:15 crc kubenswrapper[5004]: I1201 08:22:15.259178 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7884944565-frcmw" Dec 01 08:22:15 crc kubenswrapper[5004]: I1201 08:22:15.260852 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 01 08:22:15 crc kubenswrapper[5004]: I1201 08:22:15.262051 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-84d754dbcf-qct8w"] Dec 01 08:22:15 crc kubenswrapper[5004]: I1201 08:22:15.264685 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 01 08:22:15 crc kubenswrapper[5004]: I1201 08:22:15.264878 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 01 08:22:15 crc kubenswrapper[5004]: I1201 08:22:15.265021 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 01 08:22:15 crc kubenswrapper[5004]: I1201 08:22:15.265083 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 01 08:22:15 crc kubenswrapper[5004]: I1201 08:22:15.265433 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 01 08:22:15 crc kubenswrapper[5004]: I1201 08:22:15.340107 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trv7v\" (UniqueName: \"kubernetes.io/projected/a7916295-91e0-4f50-b691-b6ac5e25ceff-kube-api-access-trv7v\") pod \"controller-manager-84d754dbcf-qct8w\" (UID: \"a7916295-91e0-4f50-b691-b6ac5e25ceff\") " pod="openshift-controller-manager/controller-manager-84d754dbcf-qct8w" Dec 01 08:22:15 crc kubenswrapper[5004]: I1201 08:22:15.340173 5004 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a7916295-91e0-4f50-b691-b6ac5e25ceff-client-ca\") pod \"controller-manager-84d754dbcf-qct8w\" (UID: \"a7916295-91e0-4f50-b691-b6ac5e25ceff\") " pod="openshift-controller-manager/controller-manager-84d754dbcf-qct8w" Dec 01 08:22:15 crc kubenswrapper[5004]: I1201 08:22:15.340218 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7916295-91e0-4f50-b691-b6ac5e25ceff-config\") pod \"controller-manager-84d754dbcf-qct8w\" (UID: \"a7916295-91e0-4f50-b691-b6ac5e25ceff\") " pod="openshift-controller-manager/controller-manager-84d754dbcf-qct8w" Dec 01 08:22:15 crc kubenswrapper[5004]: I1201 08:22:15.340246 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a7916295-91e0-4f50-b691-b6ac5e25ceff-proxy-ca-bundles\") pod \"controller-manager-84d754dbcf-qct8w\" (UID: \"a7916295-91e0-4f50-b691-b6ac5e25ceff\") " pod="openshift-controller-manager/controller-manager-84d754dbcf-qct8w" Dec 01 08:22:15 crc kubenswrapper[5004]: I1201 08:22:15.340291 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a7916295-91e0-4f50-b691-b6ac5e25ceff-serving-cert\") pod \"controller-manager-84d754dbcf-qct8w\" (UID: \"a7916295-91e0-4f50-b691-b6ac5e25ceff\") " pod="openshift-controller-manager/controller-manager-84d754dbcf-qct8w" Dec 01 08:22:15 crc kubenswrapper[5004]: I1201 08:22:15.441976 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/252142d1-1b95-41a8-9143-af6b9285de17-client-ca\") pod \"route-controller-manager-7884944565-frcmw\" (UID: 
\"252142d1-1b95-41a8-9143-af6b9285de17\") " pod="openshift-route-controller-manager/route-controller-manager-7884944565-frcmw" Dec 01 08:22:15 crc kubenswrapper[5004]: I1201 08:22:15.442033 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trv7v\" (UniqueName: \"kubernetes.io/projected/a7916295-91e0-4f50-b691-b6ac5e25ceff-kube-api-access-trv7v\") pod \"controller-manager-84d754dbcf-qct8w\" (UID: \"a7916295-91e0-4f50-b691-b6ac5e25ceff\") " pod="openshift-controller-manager/controller-manager-84d754dbcf-qct8w" Dec 01 08:22:15 crc kubenswrapper[5004]: I1201 08:22:15.442218 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a7916295-91e0-4f50-b691-b6ac5e25ceff-client-ca\") pod \"controller-manager-84d754dbcf-qct8w\" (UID: \"a7916295-91e0-4f50-b691-b6ac5e25ceff\") " pod="openshift-controller-manager/controller-manager-84d754dbcf-qct8w" Dec 01 08:22:15 crc kubenswrapper[5004]: I1201 08:22:15.442263 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/252142d1-1b95-41a8-9143-af6b9285de17-config\") pod \"route-controller-manager-7884944565-frcmw\" (UID: \"252142d1-1b95-41a8-9143-af6b9285de17\") " pod="openshift-route-controller-manager/route-controller-manager-7884944565-frcmw" Dec 01 08:22:15 crc kubenswrapper[5004]: I1201 08:22:15.442301 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/252142d1-1b95-41a8-9143-af6b9285de17-serving-cert\") pod \"route-controller-manager-7884944565-frcmw\" (UID: \"252142d1-1b95-41a8-9143-af6b9285de17\") " pod="openshift-route-controller-manager/route-controller-manager-7884944565-frcmw" Dec 01 08:22:15 crc kubenswrapper[5004]: I1201 08:22:15.442339 5004 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7916295-91e0-4f50-b691-b6ac5e25ceff-config\") pod \"controller-manager-84d754dbcf-qct8w\" (UID: \"a7916295-91e0-4f50-b691-b6ac5e25ceff\") " pod="openshift-controller-manager/controller-manager-84d754dbcf-qct8w" Dec 01 08:22:15 crc kubenswrapper[5004]: I1201 08:22:15.442373 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a7916295-91e0-4f50-b691-b6ac5e25ceff-proxy-ca-bundles\") pod \"controller-manager-84d754dbcf-qct8w\" (UID: \"a7916295-91e0-4f50-b691-b6ac5e25ceff\") " pod="openshift-controller-manager/controller-manager-84d754dbcf-qct8w" Dec 01 08:22:15 crc kubenswrapper[5004]: I1201 08:22:15.442433 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8l7z\" (UniqueName: \"kubernetes.io/projected/252142d1-1b95-41a8-9143-af6b9285de17-kube-api-access-h8l7z\") pod \"route-controller-manager-7884944565-frcmw\" (UID: \"252142d1-1b95-41a8-9143-af6b9285de17\") " pod="openshift-route-controller-manager/route-controller-manager-7884944565-frcmw" Dec 01 08:22:15 crc kubenswrapper[5004]: I1201 08:22:15.442478 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a7916295-91e0-4f50-b691-b6ac5e25ceff-serving-cert\") pod \"controller-manager-84d754dbcf-qct8w\" (UID: \"a7916295-91e0-4f50-b691-b6ac5e25ceff\") " pod="openshift-controller-manager/controller-manager-84d754dbcf-qct8w" Dec 01 08:22:15 crc kubenswrapper[5004]: I1201 08:22:15.444880 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a7916295-91e0-4f50-b691-b6ac5e25ceff-client-ca\") pod \"controller-manager-84d754dbcf-qct8w\" (UID: \"a7916295-91e0-4f50-b691-b6ac5e25ceff\") " pod="openshift-controller-manager/controller-manager-84d754dbcf-qct8w" 
Dec 01 08:22:15 crc kubenswrapper[5004]: I1201 08:22:15.445449 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a7916295-91e0-4f50-b691-b6ac5e25ceff-proxy-ca-bundles\") pod \"controller-manager-84d754dbcf-qct8w\" (UID: \"a7916295-91e0-4f50-b691-b6ac5e25ceff\") " pod="openshift-controller-manager/controller-manager-84d754dbcf-qct8w" Dec 01 08:22:15 crc kubenswrapper[5004]: I1201 08:22:15.446091 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7916295-91e0-4f50-b691-b6ac5e25ceff-config\") pod \"controller-manager-84d754dbcf-qct8w\" (UID: \"a7916295-91e0-4f50-b691-b6ac5e25ceff\") " pod="openshift-controller-manager/controller-manager-84d754dbcf-qct8w" Dec 01 08:22:15 crc kubenswrapper[5004]: I1201 08:22:15.456829 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a7916295-91e0-4f50-b691-b6ac5e25ceff-serving-cert\") pod \"controller-manager-84d754dbcf-qct8w\" (UID: \"a7916295-91e0-4f50-b691-b6ac5e25ceff\") " pod="openshift-controller-manager/controller-manager-84d754dbcf-qct8w" Dec 01 08:22:15 crc kubenswrapper[5004]: I1201 08:22:15.459066 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trv7v\" (UniqueName: \"kubernetes.io/projected/a7916295-91e0-4f50-b691-b6ac5e25ceff-kube-api-access-trv7v\") pod \"controller-manager-84d754dbcf-qct8w\" (UID: \"a7916295-91e0-4f50-b691-b6ac5e25ceff\") " pod="openshift-controller-manager/controller-manager-84d754dbcf-qct8w" Dec 01 08:22:15 crc kubenswrapper[5004]: I1201 08:22:15.543928 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/252142d1-1b95-41a8-9143-af6b9285de17-client-ca\") pod \"route-controller-manager-7884944565-frcmw\" (UID: \"252142d1-1b95-41a8-9143-af6b9285de17\") " 
pod="openshift-route-controller-manager/route-controller-manager-7884944565-frcmw" Dec 01 08:22:15 crc kubenswrapper[5004]: I1201 08:22:15.544034 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/252142d1-1b95-41a8-9143-af6b9285de17-config\") pod \"route-controller-manager-7884944565-frcmw\" (UID: \"252142d1-1b95-41a8-9143-af6b9285de17\") " pod="openshift-route-controller-manager/route-controller-manager-7884944565-frcmw" Dec 01 08:22:15 crc kubenswrapper[5004]: I1201 08:22:15.544082 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/252142d1-1b95-41a8-9143-af6b9285de17-serving-cert\") pod \"route-controller-manager-7884944565-frcmw\" (UID: \"252142d1-1b95-41a8-9143-af6b9285de17\") " pod="openshift-route-controller-manager/route-controller-manager-7884944565-frcmw" Dec 01 08:22:15 crc kubenswrapper[5004]: I1201 08:22:15.544234 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h8l7z\" (UniqueName: \"kubernetes.io/projected/252142d1-1b95-41a8-9143-af6b9285de17-kube-api-access-h8l7z\") pod \"route-controller-manager-7884944565-frcmw\" (UID: \"252142d1-1b95-41a8-9143-af6b9285de17\") " pod="openshift-route-controller-manager/route-controller-manager-7884944565-frcmw" Dec 01 08:22:15 crc kubenswrapper[5004]: I1201 08:22:15.545485 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/252142d1-1b95-41a8-9143-af6b9285de17-client-ca\") pod \"route-controller-manager-7884944565-frcmw\" (UID: \"252142d1-1b95-41a8-9143-af6b9285de17\") " pod="openshift-route-controller-manager/route-controller-manager-7884944565-frcmw" Dec 01 08:22:15 crc kubenswrapper[5004]: I1201 08:22:15.546763 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/252142d1-1b95-41a8-9143-af6b9285de17-config\") pod \"route-controller-manager-7884944565-frcmw\" (UID: \"252142d1-1b95-41a8-9143-af6b9285de17\") " pod="openshift-route-controller-manager/route-controller-manager-7884944565-frcmw" Dec 01 08:22:15 crc kubenswrapper[5004]: I1201 08:22:15.562283 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-84d754dbcf-qct8w" Dec 01 08:22:15 crc kubenswrapper[5004]: I1201 08:22:15.566240 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/252142d1-1b95-41a8-9143-af6b9285de17-serving-cert\") pod \"route-controller-manager-7884944565-frcmw\" (UID: \"252142d1-1b95-41a8-9143-af6b9285de17\") " pod="openshift-route-controller-manager/route-controller-manager-7884944565-frcmw" Dec 01 08:22:15 crc kubenswrapper[5004]: I1201 08:22:15.570219 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8l7z\" (UniqueName: \"kubernetes.io/projected/252142d1-1b95-41a8-9143-af6b9285de17-kube-api-access-h8l7z\") pod \"route-controller-manager-7884944565-frcmw\" (UID: \"252142d1-1b95-41a8-9143-af6b9285de17\") " pod="openshift-route-controller-manager/route-controller-manager-7884944565-frcmw" Dec 01 08:22:15 crc kubenswrapper[5004]: I1201 08:22:15.581945 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7884944565-frcmw" Dec 01 08:22:15 crc kubenswrapper[5004]: I1201 08:22:15.832825 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-84d754dbcf-qct8w"] Dec 01 08:22:15 crc kubenswrapper[5004]: W1201 08:22:15.834486 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7916295_91e0_4f50_b691_b6ac5e25ceff.slice/crio-6b02deb030b77125ca4591a93250ce22d6b4665d54b25b70d0810bb3c656f03d WatchSource:0}: Error finding container 6b02deb030b77125ca4591a93250ce22d6b4665d54b25b70d0810bb3c656f03d: Status 404 returned error can't find the container with id 6b02deb030b77125ca4591a93250ce22d6b4665d54b25b70d0810bb3c656f03d Dec 01 08:22:15 crc kubenswrapper[5004]: I1201 08:22:15.881932 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7884944565-frcmw"] Dec 01 08:22:15 crc kubenswrapper[5004]: W1201 08:22:15.889853 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod252142d1_1b95_41a8_9143_af6b9285de17.slice/crio-2a4f25aa742a6a31511e11dff0b294d6381741ca29a739c083ca35ebeb74bbde WatchSource:0}: Error finding container 2a4f25aa742a6a31511e11dff0b294d6381741ca29a739c083ca35ebeb74bbde: Status 404 returned error can't find the container with id 2a4f25aa742a6a31511e11dff0b294d6381741ca29a739c083ca35ebeb74bbde Dec 01 08:22:16 crc kubenswrapper[5004]: I1201 08:22:16.545225 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7884944565-frcmw" event={"ID":"252142d1-1b95-41a8-9143-af6b9285de17","Type":"ContainerStarted","Data":"92bbac1ed5b27eb5deb8f118ad119bf46ffab2f8261006f1ec915b67ad130d0e"} Dec 01 08:22:16 crc kubenswrapper[5004]: I1201 08:22:16.545264 5004 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7884944565-frcmw" event={"ID":"252142d1-1b95-41a8-9143-af6b9285de17","Type":"ContainerStarted","Data":"2a4f25aa742a6a31511e11dff0b294d6381741ca29a739c083ca35ebeb74bbde"} Dec 01 08:22:16 crc kubenswrapper[5004]: I1201 08:22:16.545400 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7884944565-frcmw" Dec 01 08:22:16 crc kubenswrapper[5004]: I1201 08:22:16.548953 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-84d754dbcf-qct8w" event={"ID":"a7916295-91e0-4f50-b691-b6ac5e25ceff","Type":"ContainerStarted","Data":"21f9d9a2cdeee37aa731f4732deb8d66a0444b8634cbc4f765dc0fd8fd555044"} Dec 01 08:22:16 crc kubenswrapper[5004]: I1201 08:22:16.548995 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-84d754dbcf-qct8w" event={"ID":"a7916295-91e0-4f50-b691-b6ac5e25ceff","Type":"ContainerStarted","Data":"6b02deb030b77125ca4591a93250ce22d6b4665d54b25b70d0810bb3c656f03d"} Dec 01 08:22:16 crc kubenswrapper[5004]: I1201 08:22:16.549203 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-84d754dbcf-qct8w" Dec 01 08:22:16 crc kubenswrapper[5004]: I1201 08:22:16.555181 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-84d754dbcf-qct8w" Dec 01 08:22:16 crc kubenswrapper[5004]: I1201 08:22:16.568002 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7884944565-frcmw" podStartSLOduration=3.567986052 podStartE2EDuration="3.567986052s" podCreationTimestamp="2025-12-01 08:22:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:22:16.566008999 +0000 UTC m=+314.131000981" watchObservedRunningTime="2025-12-01 08:22:16.567986052 +0000 UTC m=+314.132978034" Dec 01 08:22:16 crc kubenswrapper[5004]: I1201 08:22:16.653380 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7884944565-frcmw" Dec 01 08:22:16 crc kubenswrapper[5004]: I1201 08:22:16.671154 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-84d754dbcf-qct8w" podStartSLOduration=3.671137398 podStartE2EDuration="3.671137398s" podCreationTimestamp="2025-12-01 08:22:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:22:16.594161946 +0000 UTC m=+314.159153958" watchObservedRunningTime="2025-12-01 08:22:16.671137398 +0000 UTC m=+314.236129380" Dec 01 08:22:17 crc kubenswrapper[5004]: I1201 08:22:17.940789 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-84d754dbcf-qct8w"] Dec 01 08:22:17 crc kubenswrapper[5004]: I1201 08:22:17.977495 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7884944565-frcmw"] Dec 01 08:22:19 crc kubenswrapper[5004]: I1201 08:22:19.567020 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-7884944565-frcmw" podUID="252142d1-1b95-41a8-9143-af6b9285de17" containerName="route-controller-manager" containerID="cri-o://92bbac1ed5b27eb5deb8f118ad119bf46ffab2f8261006f1ec915b67ad130d0e" gracePeriod=30 Dec 01 08:22:19 crc kubenswrapper[5004]: I1201 08:22:19.567180 5004 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-controller-manager/controller-manager-84d754dbcf-qct8w" podUID="a7916295-91e0-4f50-b691-b6ac5e25ceff" containerName="controller-manager" containerID="cri-o://21f9d9a2cdeee37aa731f4732deb8d66a0444b8634cbc4f765dc0fd8fd555044" gracePeriod=30 Dec 01 08:22:20 crc kubenswrapper[5004]: I1201 08:22:20.127417 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7884944565-frcmw" Dec 01 08:22:20 crc kubenswrapper[5004]: I1201 08:22:20.159320 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7595559b7b-qjt8j"] Dec 01 08:22:20 crc kubenswrapper[5004]: E1201 08:22:20.159650 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="252142d1-1b95-41a8-9143-af6b9285de17" containerName="route-controller-manager" Dec 01 08:22:20 crc kubenswrapper[5004]: I1201 08:22:20.159671 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="252142d1-1b95-41a8-9143-af6b9285de17" containerName="route-controller-manager" Dec 01 08:22:20 crc kubenswrapper[5004]: I1201 08:22:20.159835 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="252142d1-1b95-41a8-9143-af6b9285de17" containerName="route-controller-manager" Dec 01 08:22:20 crc kubenswrapper[5004]: I1201 08:22:20.160419 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7595559b7b-qjt8j" Dec 01 08:22:20 crc kubenswrapper[5004]: I1201 08:22:20.175989 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7595559b7b-qjt8j"] Dec 01 08:22:20 crc kubenswrapper[5004]: I1201 08:22:20.223890 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/252142d1-1b95-41a8-9143-af6b9285de17-client-ca\") pod \"252142d1-1b95-41a8-9143-af6b9285de17\" (UID: \"252142d1-1b95-41a8-9143-af6b9285de17\") " Dec 01 08:22:20 crc kubenswrapper[5004]: I1201 08:22:20.224224 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/252142d1-1b95-41a8-9143-af6b9285de17-serving-cert\") pod \"252142d1-1b95-41a8-9143-af6b9285de17\" (UID: \"252142d1-1b95-41a8-9143-af6b9285de17\") " Dec 01 08:22:20 crc kubenswrapper[5004]: I1201 08:22:20.224257 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/252142d1-1b95-41a8-9143-af6b9285de17-config\") pod \"252142d1-1b95-41a8-9143-af6b9285de17\" (UID: \"252142d1-1b95-41a8-9143-af6b9285de17\") " Dec 01 08:22:20 crc kubenswrapper[5004]: I1201 08:22:20.224315 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h8l7z\" (UniqueName: \"kubernetes.io/projected/252142d1-1b95-41a8-9143-af6b9285de17-kube-api-access-h8l7z\") pod \"252142d1-1b95-41a8-9143-af6b9285de17\" (UID: \"252142d1-1b95-41a8-9143-af6b9285de17\") " Dec 01 08:22:20 crc kubenswrapper[5004]: I1201 08:22:20.224455 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e6ad3c76-f4e2-43c9-8dc0-abd71417f683-client-ca\") pod 
\"route-controller-manager-7595559b7b-qjt8j\" (UID: \"e6ad3c76-f4e2-43c9-8dc0-abd71417f683\") " pod="openshift-route-controller-manager/route-controller-manager-7595559b7b-qjt8j" Dec 01 08:22:20 crc kubenswrapper[5004]: I1201 08:22:20.224476 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9s7s\" (UniqueName: \"kubernetes.io/projected/e6ad3c76-f4e2-43c9-8dc0-abd71417f683-kube-api-access-b9s7s\") pod \"route-controller-manager-7595559b7b-qjt8j\" (UID: \"e6ad3c76-f4e2-43c9-8dc0-abd71417f683\") " pod="openshift-route-controller-manager/route-controller-manager-7595559b7b-qjt8j" Dec 01 08:22:20 crc kubenswrapper[5004]: I1201 08:22:20.224525 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6ad3c76-f4e2-43c9-8dc0-abd71417f683-config\") pod \"route-controller-manager-7595559b7b-qjt8j\" (UID: \"e6ad3c76-f4e2-43c9-8dc0-abd71417f683\") " pod="openshift-route-controller-manager/route-controller-manager-7595559b7b-qjt8j" Dec 01 08:22:20 crc kubenswrapper[5004]: I1201 08:22:20.224551 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e6ad3c76-f4e2-43c9-8dc0-abd71417f683-serving-cert\") pod \"route-controller-manager-7595559b7b-qjt8j\" (UID: \"e6ad3c76-f4e2-43c9-8dc0-abd71417f683\") " pod="openshift-route-controller-manager/route-controller-manager-7595559b7b-qjt8j" Dec 01 08:22:20 crc kubenswrapper[5004]: I1201 08:22:20.225012 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/252142d1-1b95-41a8-9143-af6b9285de17-config" (OuterVolumeSpecName: "config") pod "252142d1-1b95-41a8-9143-af6b9285de17" (UID: "252142d1-1b95-41a8-9143-af6b9285de17"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:22:20 crc kubenswrapper[5004]: I1201 08:22:20.225379 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/252142d1-1b95-41a8-9143-af6b9285de17-client-ca" (OuterVolumeSpecName: "client-ca") pod "252142d1-1b95-41a8-9143-af6b9285de17" (UID: "252142d1-1b95-41a8-9143-af6b9285de17"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:22:20 crc kubenswrapper[5004]: I1201 08:22:20.228321 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-84d754dbcf-qct8w" Dec 01 08:22:20 crc kubenswrapper[5004]: I1201 08:22:20.232051 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/252142d1-1b95-41a8-9143-af6b9285de17-kube-api-access-h8l7z" (OuterVolumeSpecName: "kube-api-access-h8l7z") pod "252142d1-1b95-41a8-9143-af6b9285de17" (UID: "252142d1-1b95-41a8-9143-af6b9285de17"). InnerVolumeSpecName "kube-api-access-h8l7z". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:22:20 crc kubenswrapper[5004]: I1201 08:22:20.232317 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/252142d1-1b95-41a8-9143-af6b9285de17-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "252142d1-1b95-41a8-9143-af6b9285de17" (UID: "252142d1-1b95-41a8-9143-af6b9285de17"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:22:20 crc kubenswrapper[5004]: I1201 08:22:20.325156 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7916295-91e0-4f50-b691-b6ac5e25ceff-config\") pod \"a7916295-91e0-4f50-b691-b6ac5e25ceff\" (UID: \"a7916295-91e0-4f50-b691-b6ac5e25ceff\") " Dec 01 08:22:20 crc kubenswrapper[5004]: I1201 08:22:20.325583 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a7916295-91e0-4f50-b691-b6ac5e25ceff-proxy-ca-bundles\") pod \"a7916295-91e0-4f50-b691-b6ac5e25ceff\" (UID: \"a7916295-91e0-4f50-b691-b6ac5e25ceff\") " Dec 01 08:22:20 crc kubenswrapper[5004]: I1201 08:22:20.325739 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a7916295-91e0-4f50-b691-b6ac5e25ceff-serving-cert\") pod \"a7916295-91e0-4f50-b691-b6ac5e25ceff\" (UID: \"a7916295-91e0-4f50-b691-b6ac5e25ceff\") " Dec 01 08:22:20 crc kubenswrapper[5004]: I1201 08:22:20.325842 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a7916295-91e0-4f50-b691-b6ac5e25ceff-config" (OuterVolumeSpecName: "config") pod "a7916295-91e0-4f50-b691-b6ac5e25ceff" (UID: "a7916295-91e0-4f50-b691-b6ac5e25ceff"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:22:20 crc kubenswrapper[5004]: I1201 08:22:20.325866 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-trv7v\" (UniqueName: \"kubernetes.io/projected/a7916295-91e0-4f50-b691-b6ac5e25ceff-kube-api-access-trv7v\") pod \"a7916295-91e0-4f50-b691-b6ac5e25ceff\" (UID: \"a7916295-91e0-4f50-b691-b6ac5e25ceff\") " Dec 01 08:22:20 crc kubenswrapper[5004]: I1201 08:22:20.326065 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a7916295-91e0-4f50-b691-b6ac5e25ceff-client-ca\") pod \"a7916295-91e0-4f50-b691-b6ac5e25ceff\" (UID: \"a7916295-91e0-4f50-b691-b6ac5e25ceff\") " Dec 01 08:22:20 crc kubenswrapper[5004]: I1201 08:22:20.326212 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a7916295-91e0-4f50-b691-b6ac5e25ceff-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "a7916295-91e0-4f50-b691-b6ac5e25ceff" (UID: "a7916295-91e0-4f50-b691-b6ac5e25ceff"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:22:20 crc kubenswrapper[5004]: I1201 08:22:20.326541 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e6ad3c76-f4e2-43c9-8dc0-abd71417f683-client-ca\") pod \"route-controller-manager-7595559b7b-qjt8j\" (UID: \"e6ad3c76-f4e2-43c9-8dc0-abd71417f683\") " pod="openshift-route-controller-manager/route-controller-manager-7595559b7b-qjt8j" Dec 01 08:22:20 crc kubenswrapper[5004]: I1201 08:22:20.326683 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a7916295-91e0-4f50-b691-b6ac5e25ceff-client-ca" (OuterVolumeSpecName: "client-ca") pod "a7916295-91e0-4f50-b691-b6ac5e25ceff" (UID: "a7916295-91e0-4f50-b691-b6ac5e25ceff"). 
InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:22:20 crc kubenswrapper[5004]: I1201 08:22:20.326693 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b9s7s\" (UniqueName: \"kubernetes.io/projected/e6ad3c76-f4e2-43c9-8dc0-abd71417f683-kube-api-access-b9s7s\") pod \"route-controller-manager-7595559b7b-qjt8j\" (UID: \"e6ad3c76-f4e2-43c9-8dc0-abd71417f683\") " pod="openshift-route-controller-manager/route-controller-manager-7595559b7b-qjt8j" Dec 01 08:22:20 crc kubenswrapper[5004]: I1201 08:22:20.326930 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6ad3c76-f4e2-43c9-8dc0-abd71417f683-config\") pod \"route-controller-manager-7595559b7b-qjt8j\" (UID: \"e6ad3c76-f4e2-43c9-8dc0-abd71417f683\") " pod="openshift-route-controller-manager/route-controller-manager-7595559b7b-qjt8j" Dec 01 08:22:20 crc kubenswrapper[5004]: I1201 08:22:20.327090 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e6ad3c76-f4e2-43c9-8dc0-abd71417f683-serving-cert\") pod \"route-controller-manager-7595559b7b-qjt8j\" (UID: \"e6ad3c76-f4e2-43c9-8dc0-abd71417f683\") " pod="openshift-route-controller-manager/route-controller-manager-7595559b7b-qjt8j" Dec 01 08:22:20 crc kubenswrapper[5004]: I1201 08:22:20.327257 5004 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a7916295-91e0-4f50-b691-b6ac5e25ceff-client-ca\") on node \"crc\" DevicePath \"\"" Dec 01 08:22:20 crc kubenswrapper[5004]: I1201 08:22:20.327359 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h8l7z\" (UniqueName: \"kubernetes.io/projected/252142d1-1b95-41a8-9143-af6b9285de17-kube-api-access-h8l7z\") on node \"crc\" DevicePath \"\"" Dec 01 08:22:20 crc kubenswrapper[5004]: I1201 
08:22:20.327451 5004 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7916295-91e0-4f50-b691-b6ac5e25ceff-config\") on node \"crc\" DevicePath \"\"" Dec 01 08:22:20 crc kubenswrapper[5004]: I1201 08:22:20.327589 5004 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/252142d1-1b95-41a8-9143-af6b9285de17-client-ca\") on node \"crc\" DevicePath \"\"" Dec 01 08:22:20 crc kubenswrapper[5004]: I1201 08:22:20.327703 5004 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/252142d1-1b95-41a8-9143-af6b9285de17-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 08:22:20 crc kubenswrapper[5004]: I1201 08:22:20.327814 5004 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a7916295-91e0-4f50-b691-b6ac5e25ceff-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 01 08:22:20 crc kubenswrapper[5004]: I1201 08:22:20.327930 5004 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/252142d1-1b95-41a8-9143-af6b9285de17-config\") on node \"crc\" DevicePath \"\"" Dec 01 08:22:20 crc kubenswrapper[5004]: I1201 08:22:20.327416 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e6ad3c76-f4e2-43c9-8dc0-abd71417f683-client-ca\") pod \"route-controller-manager-7595559b7b-qjt8j\" (UID: \"e6ad3c76-f4e2-43c9-8dc0-abd71417f683\") " pod="openshift-route-controller-manager/route-controller-manager-7595559b7b-qjt8j" Dec 01 08:22:20 crc kubenswrapper[5004]: I1201 08:22:20.328275 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7916295-91e0-4f50-b691-b6ac5e25ceff-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "a7916295-91e0-4f50-b691-b6ac5e25ceff" (UID: 
"a7916295-91e0-4f50-b691-b6ac5e25ceff"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:22:20 crc kubenswrapper[5004]: I1201 08:22:20.328540 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6ad3c76-f4e2-43c9-8dc0-abd71417f683-config\") pod \"route-controller-manager-7595559b7b-qjt8j\" (UID: \"e6ad3c76-f4e2-43c9-8dc0-abd71417f683\") " pod="openshift-route-controller-manager/route-controller-manager-7595559b7b-qjt8j" Dec 01 08:22:20 crc kubenswrapper[5004]: I1201 08:22:20.328856 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7916295-91e0-4f50-b691-b6ac5e25ceff-kube-api-access-trv7v" (OuterVolumeSpecName: "kube-api-access-trv7v") pod "a7916295-91e0-4f50-b691-b6ac5e25ceff" (UID: "a7916295-91e0-4f50-b691-b6ac5e25ceff"). InnerVolumeSpecName "kube-api-access-trv7v". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:22:20 crc kubenswrapper[5004]: I1201 08:22:20.332837 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e6ad3c76-f4e2-43c9-8dc0-abd71417f683-serving-cert\") pod \"route-controller-manager-7595559b7b-qjt8j\" (UID: \"e6ad3c76-f4e2-43c9-8dc0-abd71417f683\") " pod="openshift-route-controller-manager/route-controller-manager-7595559b7b-qjt8j" Dec 01 08:22:20 crc kubenswrapper[5004]: I1201 08:22:20.344058 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9s7s\" (UniqueName: \"kubernetes.io/projected/e6ad3c76-f4e2-43c9-8dc0-abd71417f683-kube-api-access-b9s7s\") pod \"route-controller-manager-7595559b7b-qjt8j\" (UID: \"e6ad3c76-f4e2-43c9-8dc0-abd71417f683\") " pod="openshift-route-controller-manager/route-controller-manager-7595559b7b-qjt8j" Dec 01 08:22:20 crc kubenswrapper[5004]: I1201 08:22:20.429866 5004 reconciler_common.go:293] "Volume detached for 
volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a7916295-91e0-4f50-b691-b6ac5e25ceff-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 08:22:20 crc kubenswrapper[5004]: I1201 08:22:20.429945 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-trv7v\" (UniqueName: \"kubernetes.io/projected/a7916295-91e0-4f50-b691-b6ac5e25ceff-kube-api-access-trv7v\") on node \"crc\" DevicePath \"\"" Dec 01 08:22:20 crc kubenswrapper[5004]: I1201 08:22:20.526007 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7595559b7b-qjt8j" Dec 01 08:22:20 crc kubenswrapper[5004]: I1201 08:22:20.575797 5004 generic.go:334] "Generic (PLEG): container finished" podID="252142d1-1b95-41a8-9143-af6b9285de17" containerID="92bbac1ed5b27eb5deb8f118ad119bf46ffab2f8261006f1ec915b67ad130d0e" exitCode=0 Dec 01 08:22:20 crc kubenswrapper[5004]: I1201 08:22:20.575919 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7884944565-frcmw" Dec 01 08:22:20 crc kubenswrapper[5004]: I1201 08:22:20.575925 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7884944565-frcmw" event={"ID":"252142d1-1b95-41a8-9143-af6b9285de17","Type":"ContainerDied","Data":"92bbac1ed5b27eb5deb8f118ad119bf46ffab2f8261006f1ec915b67ad130d0e"} Dec 01 08:22:20 crc kubenswrapper[5004]: I1201 08:22:20.576024 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7884944565-frcmw" event={"ID":"252142d1-1b95-41a8-9143-af6b9285de17","Type":"ContainerDied","Data":"2a4f25aa742a6a31511e11dff0b294d6381741ca29a739c083ca35ebeb74bbde"} Dec 01 08:22:20 crc kubenswrapper[5004]: I1201 08:22:20.576069 5004 scope.go:117] "RemoveContainer" containerID="92bbac1ed5b27eb5deb8f118ad119bf46ffab2f8261006f1ec915b67ad130d0e" Dec 01 08:22:20 crc kubenswrapper[5004]: I1201 08:22:20.580300 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-84d754dbcf-qct8w" Dec 01 08:22:20 crc kubenswrapper[5004]: I1201 08:22:20.580349 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-84d754dbcf-qct8w" event={"ID":"a7916295-91e0-4f50-b691-b6ac5e25ceff","Type":"ContainerDied","Data":"21f9d9a2cdeee37aa731f4732deb8d66a0444b8634cbc4f765dc0fd8fd555044"} Dec 01 08:22:20 crc kubenswrapper[5004]: I1201 08:22:20.582847 5004 generic.go:334] "Generic (PLEG): container finished" podID="a7916295-91e0-4f50-b691-b6ac5e25ceff" containerID="21f9d9a2cdeee37aa731f4732deb8d66a0444b8634cbc4f765dc0fd8fd555044" exitCode=0 Dec 01 08:22:20 crc kubenswrapper[5004]: I1201 08:22:20.583341 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-84d754dbcf-qct8w" event={"ID":"a7916295-91e0-4f50-b691-b6ac5e25ceff","Type":"ContainerDied","Data":"6b02deb030b77125ca4591a93250ce22d6b4665d54b25b70d0810bb3c656f03d"} Dec 01 08:22:20 crc kubenswrapper[5004]: I1201 08:22:20.611709 5004 scope.go:117] "RemoveContainer" containerID="92bbac1ed5b27eb5deb8f118ad119bf46ffab2f8261006f1ec915b67ad130d0e" Dec 01 08:22:20 crc kubenswrapper[5004]: E1201 08:22:20.612202 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92bbac1ed5b27eb5deb8f118ad119bf46ffab2f8261006f1ec915b67ad130d0e\": container with ID starting with 92bbac1ed5b27eb5deb8f118ad119bf46ffab2f8261006f1ec915b67ad130d0e not found: ID does not exist" containerID="92bbac1ed5b27eb5deb8f118ad119bf46ffab2f8261006f1ec915b67ad130d0e" Dec 01 08:22:20 crc kubenswrapper[5004]: I1201 08:22:20.612272 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92bbac1ed5b27eb5deb8f118ad119bf46ffab2f8261006f1ec915b67ad130d0e"} err="failed to get container status 
\"92bbac1ed5b27eb5deb8f118ad119bf46ffab2f8261006f1ec915b67ad130d0e\": rpc error: code = NotFound desc = could not find container \"92bbac1ed5b27eb5deb8f118ad119bf46ffab2f8261006f1ec915b67ad130d0e\": container with ID starting with 92bbac1ed5b27eb5deb8f118ad119bf46ffab2f8261006f1ec915b67ad130d0e not found: ID does not exist" Dec 01 08:22:20 crc kubenswrapper[5004]: I1201 08:22:20.612314 5004 scope.go:117] "RemoveContainer" containerID="21f9d9a2cdeee37aa731f4732deb8d66a0444b8634cbc4f765dc0fd8fd555044" Dec 01 08:22:20 crc kubenswrapper[5004]: I1201 08:22:20.646007 5004 scope.go:117] "RemoveContainer" containerID="21f9d9a2cdeee37aa731f4732deb8d66a0444b8634cbc4f765dc0fd8fd555044" Dec 01 08:22:20 crc kubenswrapper[5004]: I1201 08:22:20.647199 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-84d754dbcf-qct8w"] Dec 01 08:22:20 crc kubenswrapper[5004]: E1201 08:22:20.647804 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21f9d9a2cdeee37aa731f4732deb8d66a0444b8634cbc4f765dc0fd8fd555044\": container with ID starting with 21f9d9a2cdeee37aa731f4732deb8d66a0444b8634cbc4f765dc0fd8fd555044 not found: ID does not exist" containerID="21f9d9a2cdeee37aa731f4732deb8d66a0444b8634cbc4f765dc0fd8fd555044" Dec 01 08:22:20 crc kubenswrapper[5004]: I1201 08:22:20.647832 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21f9d9a2cdeee37aa731f4732deb8d66a0444b8634cbc4f765dc0fd8fd555044"} err="failed to get container status \"21f9d9a2cdeee37aa731f4732deb8d66a0444b8634cbc4f765dc0fd8fd555044\": rpc error: code = NotFound desc = could not find container \"21f9d9a2cdeee37aa731f4732deb8d66a0444b8634cbc4f765dc0fd8fd555044\": container with ID starting with 21f9d9a2cdeee37aa731f4732deb8d66a0444b8634cbc4f765dc0fd8fd555044 not found: ID does not exist" Dec 01 08:22:20 crc kubenswrapper[5004]: I1201 08:22:20.654478 5004 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-84d754dbcf-qct8w"] Dec 01 08:22:20 crc kubenswrapper[5004]: I1201 08:22:20.660890 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7884944565-frcmw"] Dec 01 08:22:20 crc kubenswrapper[5004]: I1201 08:22:20.667482 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7884944565-frcmw"] Dec 01 08:22:20 crc kubenswrapper[5004]: I1201 08:22:20.772546 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="252142d1-1b95-41a8-9143-af6b9285de17" path="/var/lib/kubelet/pods/252142d1-1b95-41a8-9143-af6b9285de17/volumes" Dec 01 08:22:20 crc kubenswrapper[5004]: I1201 08:22:20.773192 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7916295-91e0-4f50-b691-b6ac5e25ceff" path="/var/lib/kubelet/pods/a7916295-91e0-4f50-b691-b6ac5e25ceff/volumes" Dec 01 08:22:21 crc kubenswrapper[5004]: I1201 08:22:21.052675 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7595559b7b-qjt8j"] Dec 01 08:22:21 crc kubenswrapper[5004]: I1201 08:22:21.591335 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7595559b7b-qjt8j" event={"ID":"e6ad3c76-f4e2-43c9-8dc0-abd71417f683","Type":"ContainerStarted","Data":"d54428b9d2ec7eba50b047e3c28d9ff45d9885a631a698cf44bd1abaab93f3b5"} Dec 01 08:22:21 crc kubenswrapper[5004]: I1201 08:22:21.591408 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7595559b7b-qjt8j" event={"ID":"e6ad3c76-f4e2-43c9-8dc0-abd71417f683","Type":"ContainerStarted","Data":"34e7fbc3c1a3fc3277466e60f327684f487dc56f4f4c6695e27d9aaa535be7b2"} Dec 01 08:22:21 crc kubenswrapper[5004]: I1201 08:22:21.593006 
5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7595559b7b-qjt8j" Dec 01 08:22:21 crc kubenswrapper[5004]: I1201 08:22:21.602078 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7595559b7b-qjt8j" Dec 01 08:22:21 crc kubenswrapper[5004]: I1201 08:22:21.621754 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7595559b7b-qjt8j" podStartSLOduration=3.621733191 podStartE2EDuration="3.621733191s" podCreationTimestamp="2025-12-01 08:22:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:22:21.618608837 +0000 UTC m=+319.183600829" watchObservedRunningTime="2025-12-01 08:22:21.621733191 +0000 UTC m=+319.186725193" Dec 01 08:22:22 crc kubenswrapper[5004]: I1201 08:22:22.237734 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7877d8b9bb-nkhzr"] Dec 01 08:22:22 crc kubenswrapper[5004]: E1201 08:22:22.238602 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7916295-91e0-4f50-b691-b6ac5e25ceff" containerName="controller-manager" Dec 01 08:22:22 crc kubenswrapper[5004]: I1201 08:22:22.238632 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7916295-91e0-4f50-b691-b6ac5e25ceff" containerName="controller-manager" Dec 01 08:22:22 crc kubenswrapper[5004]: I1201 08:22:22.238898 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7916295-91e0-4f50-b691-b6ac5e25ceff" containerName="controller-manager" Dec 01 08:22:22 crc kubenswrapper[5004]: I1201 08:22:22.239489 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7877d8b9bb-nkhzr" Dec 01 08:22:22 crc kubenswrapper[5004]: I1201 08:22:22.243788 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 01 08:22:22 crc kubenswrapper[5004]: I1201 08:22:22.244185 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 01 08:22:22 crc kubenswrapper[5004]: I1201 08:22:22.244765 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 01 08:22:22 crc kubenswrapper[5004]: I1201 08:22:22.245201 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 01 08:22:22 crc kubenswrapper[5004]: I1201 08:22:22.245450 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 01 08:22:22 crc kubenswrapper[5004]: I1201 08:22:22.249973 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 01 08:22:22 crc kubenswrapper[5004]: I1201 08:22:22.251496 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8949\" (UniqueName: \"kubernetes.io/projected/391720eb-78be-4c56-9be9-edb627920635-kube-api-access-g8949\") pod \"controller-manager-7877d8b9bb-nkhzr\" (UID: \"391720eb-78be-4c56-9be9-edb627920635\") " pod="openshift-controller-manager/controller-manager-7877d8b9bb-nkhzr" Dec 01 08:22:22 crc kubenswrapper[5004]: I1201 08:22:22.251659 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/391720eb-78be-4c56-9be9-edb627920635-serving-cert\") pod \"controller-manager-7877d8b9bb-nkhzr\" (UID: 
\"391720eb-78be-4c56-9be9-edb627920635\") " pod="openshift-controller-manager/controller-manager-7877d8b9bb-nkhzr" Dec 01 08:22:22 crc kubenswrapper[5004]: I1201 08:22:22.251719 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/391720eb-78be-4c56-9be9-edb627920635-proxy-ca-bundles\") pod \"controller-manager-7877d8b9bb-nkhzr\" (UID: \"391720eb-78be-4c56-9be9-edb627920635\") " pod="openshift-controller-manager/controller-manager-7877d8b9bb-nkhzr" Dec 01 08:22:22 crc kubenswrapper[5004]: I1201 08:22:22.251766 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/391720eb-78be-4c56-9be9-edb627920635-client-ca\") pod \"controller-manager-7877d8b9bb-nkhzr\" (UID: \"391720eb-78be-4c56-9be9-edb627920635\") " pod="openshift-controller-manager/controller-manager-7877d8b9bb-nkhzr" Dec 01 08:22:22 crc kubenswrapper[5004]: I1201 08:22:22.252007 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/391720eb-78be-4c56-9be9-edb627920635-config\") pod \"controller-manager-7877d8b9bb-nkhzr\" (UID: \"391720eb-78be-4c56-9be9-edb627920635\") " pod="openshift-controller-manager/controller-manager-7877d8b9bb-nkhzr" Dec 01 08:22:22 crc kubenswrapper[5004]: I1201 08:22:22.258168 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7877d8b9bb-nkhzr"] Dec 01 08:22:22 crc kubenswrapper[5004]: I1201 08:22:22.259892 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 01 08:22:22 crc kubenswrapper[5004]: I1201 08:22:22.353780 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/391720eb-78be-4c56-9be9-edb627920635-config\") pod \"controller-manager-7877d8b9bb-nkhzr\" (UID: \"391720eb-78be-4c56-9be9-edb627920635\") " pod="openshift-controller-manager/controller-manager-7877d8b9bb-nkhzr" Dec 01 08:22:22 crc kubenswrapper[5004]: I1201 08:22:22.353872 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8949\" (UniqueName: \"kubernetes.io/projected/391720eb-78be-4c56-9be9-edb627920635-kube-api-access-g8949\") pod \"controller-manager-7877d8b9bb-nkhzr\" (UID: \"391720eb-78be-4c56-9be9-edb627920635\") " pod="openshift-controller-manager/controller-manager-7877d8b9bb-nkhzr" Dec 01 08:22:22 crc kubenswrapper[5004]: I1201 08:22:22.353924 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/391720eb-78be-4c56-9be9-edb627920635-serving-cert\") pod \"controller-manager-7877d8b9bb-nkhzr\" (UID: \"391720eb-78be-4c56-9be9-edb627920635\") " pod="openshift-controller-manager/controller-manager-7877d8b9bb-nkhzr" Dec 01 08:22:22 crc kubenswrapper[5004]: I1201 08:22:22.353963 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/391720eb-78be-4c56-9be9-edb627920635-proxy-ca-bundles\") pod \"controller-manager-7877d8b9bb-nkhzr\" (UID: \"391720eb-78be-4c56-9be9-edb627920635\") " pod="openshift-controller-manager/controller-manager-7877d8b9bb-nkhzr" Dec 01 08:22:22 crc kubenswrapper[5004]: I1201 08:22:22.353994 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/391720eb-78be-4c56-9be9-edb627920635-client-ca\") pod \"controller-manager-7877d8b9bb-nkhzr\" (UID: \"391720eb-78be-4c56-9be9-edb627920635\") " pod="openshift-controller-manager/controller-manager-7877d8b9bb-nkhzr" Dec 01 08:22:22 crc kubenswrapper[5004]: I1201 08:22:22.356707 5004 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/391720eb-78be-4c56-9be9-edb627920635-proxy-ca-bundles\") pod \"controller-manager-7877d8b9bb-nkhzr\" (UID: \"391720eb-78be-4c56-9be9-edb627920635\") " pod="openshift-controller-manager/controller-manager-7877d8b9bb-nkhzr" Dec 01 08:22:22 crc kubenswrapper[5004]: I1201 08:22:22.357251 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/391720eb-78be-4c56-9be9-edb627920635-config\") pod \"controller-manager-7877d8b9bb-nkhzr\" (UID: \"391720eb-78be-4c56-9be9-edb627920635\") " pod="openshift-controller-manager/controller-manager-7877d8b9bb-nkhzr" Dec 01 08:22:22 crc kubenswrapper[5004]: I1201 08:22:22.357354 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/391720eb-78be-4c56-9be9-edb627920635-client-ca\") pod \"controller-manager-7877d8b9bb-nkhzr\" (UID: \"391720eb-78be-4c56-9be9-edb627920635\") " pod="openshift-controller-manager/controller-manager-7877d8b9bb-nkhzr" Dec 01 08:22:22 crc kubenswrapper[5004]: I1201 08:22:22.366642 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/391720eb-78be-4c56-9be9-edb627920635-serving-cert\") pod \"controller-manager-7877d8b9bb-nkhzr\" (UID: \"391720eb-78be-4c56-9be9-edb627920635\") " pod="openshift-controller-manager/controller-manager-7877d8b9bb-nkhzr" Dec 01 08:22:22 crc kubenswrapper[5004]: I1201 08:22:22.384448 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8949\" (UniqueName: \"kubernetes.io/projected/391720eb-78be-4c56-9be9-edb627920635-kube-api-access-g8949\") pod \"controller-manager-7877d8b9bb-nkhzr\" (UID: \"391720eb-78be-4c56-9be9-edb627920635\") " pod="openshift-controller-manager/controller-manager-7877d8b9bb-nkhzr" Dec 01 
08:22:22 crc kubenswrapper[5004]: I1201 08:22:22.571140 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7877d8b9bb-nkhzr" Dec 01 08:22:23 crc kubenswrapper[5004]: W1201 08:22:23.064021 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod391720eb_78be_4c56_9be9_edb627920635.slice/crio-3b54a841dabe6670e5a0f38fd96f87c14be1d5fd565bea4edafae99cef0c983d WatchSource:0}: Error finding container 3b54a841dabe6670e5a0f38fd96f87c14be1d5fd565bea4edafae99cef0c983d: Status 404 returned error can't find the container with id 3b54a841dabe6670e5a0f38fd96f87c14be1d5fd565bea4edafae99cef0c983d Dec 01 08:22:23 crc kubenswrapper[5004]: I1201 08:22:23.064480 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7877d8b9bb-nkhzr"] Dec 01 08:22:23 crc kubenswrapper[5004]: I1201 08:22:23.610694 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7877d8b9bb-nkhzr" event={"ID":"391720eb-78be-4c56-9be9-edb627920635","Type":"ContainerStarted","Data":"d3a148b97c9f2d80720ace4d333eb47e14ab3474575a7fc7f16095674440618b"} Dec 01 08:22:23 crc kubenswrapper[5004]: I1201 08:22:23.610745 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7877d8b9bb-nkhzr" event={"ID":"391720eb-78be-4c56-9be9-edb627920635","Type":"ContainerStarted","Data":"3b54a841dabe6670e5a0f38fd96f87c14be1d5fd565bea4edafae99cef0c983d"} Dec 01 08:22:23 crc kubenswrapper[5004]: I1201 08:22:23.612390 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7877d8b9bb-nkhzr" Dec 01 08:22:23 crc kubenswrapper[5004]: I1201 08:22:23.617782 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-controller-manager/controller-manager-7877d8b9bb-nkhzr" Dec 01 08:22:23 crc kubenswrapper[5004]: I1201 08:22:23.650960 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7877d8b9bb-nkhzr" podStartSLOduration=5.650934533 podStartE2EDuration="5.650934533s" podCreationTimestamp="2025-12-01 08:22:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:22:23.627378798 +0000 UTC m=+321.192370800" watchObservedRunningTime="2025-12-01 08:22:23.650934533 +0000 UTC m=+321.215926525" Dec 01 08:22:53 crc kubenswrapper[5004]: I1201 08:22:53.295914 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7877d8b9bb-nkhzr"] Dec 01 08:22:53 crc kubenswrapper[5004]: I1201 08:22:53.296611 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-7877d8b9bb-nkhzr" podUID="391720eb-78be-4c56-9be9-edb627920635" containerName="controller-manager" containerID="cri-o://d3a148b97c9f2d80720ace4d333eb47e14ab3474575a7fc7f16095674440618b" gracePeriod=30 Dec 01 08:22:53 crc kubenswrapper[5004]: I1201 08:22:53.324838 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7595559b7b-qjt8j"] Dec 01 08:22:53 crc kubenswrapper[5004]: I1201 08:22:53.325329 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-7595559b7b-qjt8j" podUID="e6ad3c76-f4e2-43c9-8dc0-abd71417f683" containerName="route-controller-manager" containerID="cri-o://d54428b9d2ec7eba50b047e3c28d9ff45d9885a631a698cf44bd1abaab93f3b5" gracePeriod=30 Dec 01 08:22:53 crc kubenswrapper[5004]: I1201 08:22:53.801223 5004 generic.go:334] "Generic (PLEG): container finished" 
podID="e6ad3c76-f4e2-43c9-8dc0-abd71417f683" containerID="d54428b9d2ec7eba50b047e3c28d9ff45d9885a631a698cf44bd1abaab93f3b5" exitCode=0 Dec 01 08:22:53 crc kubenswrapper[5004]: I1201 08:22:53.801321 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7595559b7b-qjt8j" event={"ID":"e6ad3c76-f4e2-43c9-8dc0-abd71417f683","Type":"ContainerDied","Data":"d54428b9d2ec7eba50b047e3c28d9ff45d9885a631a698cf44bd1abaab93f3b5"} Dec 01 08:22:53 crc kubenswrapper[5004]: I1201 08:22:53.803524 5004 generic.go:334] "Generic (PLEG): container finished" podID="391720eb-78be-4c56-9be9-edb627920635" containerID="d3a148b97c9f2d80720ace4d333eb47e14ab3474575a7fc7f16095674440618b" exitCode=0 Dec 01 08:22:53 crc kubenswrapper[5004]: I1201 08:22:53.803549 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7877d8b9bb-nkhzr" event={"ID":"391720eb-78be-4c56-9be9-edb627920635","Type":"ContainerDied","Data":"d3a148b97c9f2d80720ace4d333eb47e14ab3474575a7fc7f16095674440618b"} Dec 01 08:22:53 crc kubenswrapper[5004]: I1201 08:22:53.856631 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7595559b7b-qjt8j" Dec 01 08:22:53 crc kubenswrapper[5004]: I1201 08:22:53.892084 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7877d8b9bb-nkhzr" Dec 01 08:22:54 crc kubenswrapper[5004]: I1201 08:22:54.001360 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b9s7s\" (UniqueName: \"kubernetes.io/projected/e6ad3c76-f4e2-43c9-8dc0-abd71417f683-kube-api-access-b9s7s\") pod \"e6ad3c76-f4e2-43c9-8dc0-abd71417f683\" (UID: \"e6ad3c76-f4e2-43c9-8dc0-abd71417f683\") " Dec 01 08:22:54 crc kubenswrapper[5004]: I1201 08:22:54.001515 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e6ad3c76-f4e2-43c9-8dc0-abd71417f683-serving-cert\") pod \"e6ad3c76-f4e2-43c9-8dc0-abd71417f683\" (UID: \"e6ad3c76-f4e2-43c9-8dc0-abd71417f683\") " Dec 01 08:22:54 crc kubenswrapper[5004]: I1201 08:22:54.001586 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/391720eb-78be-4c56-9be9-edb627920635-config\") pod \"391720eb-78be-4c56-9be9-edb627920635\" (UID: \"391720eb-78be-4c56-9be9-edb627920635\") " Dec 01 08:22:54 crc kubenswrapper[5004]: I1201 08:22:54.001632 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/391720eb-78be-4c56-9be9-edb627920635-proxy-ca-bundles\") pod \"391720eb-78be-4c56-9be9-edb627920635\" (UID: \"391720eb-78be-4c56-9be9-edb627920635\") " Dec 01 08:22:54 crc kubenswrapper[5004]: I1201 08:22:54.001690 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g8949\" (UniqueName: \"kubernetes.io/projected/391720eb-78be-4c56-9be9-edb627920635-kube-api-access-g8949\") pod \"391720eb-78be-4c56-9be9-edb627920635\" (UID: \"391720eb-78be-4c56-9be9-edb627920635\") " Dec 01 08:22:54 crc kubenswrapper[5004]: I1201 08:22:54.001727 5004 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e6ad3c76-f4e2-43c9-8dc0-abd71417f683-client-ca\") pod \"e6ad3c76-f4e2-43c9-8dc0-abd71417f683\" (UID: \"e6ad3c76-f4e2-43c9-8dc0-abd71417f683\") " Dec 01 08:22:54 crc kubenswrapper[5004]: I1201 08:22:54.001791 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6ad3c76-f4e2-43c9-8dc0-abd71417f683-config\") pod \"e6ad3c76-f4e2-43c9-8dc0-abd71417f683\" (UID: \"e6ad3c76-f4e2-43c9-8dc0-abd71417f683\") " Dec 01 08:22:54 crc kubenswrapper[5004]: I1201 08:22:54.001826 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/391720eb-78be-4c56-9be9-edb627920635-client-ca\") pod \"391720eb-78be-4c56-9be9-edb627920635\" (UID: \"391720eb-78be-4c56-9be9-edb627920635\") " Dec 01 08:22:54 crc kubenswrapper[5004]: I1201 08:22:54.001878 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/391720eb-78be-4c56-9be9-edb627920635-serving-cert\") pod \"391720eb-78be-4c56-9be9-edb627920635\" (UID: \"391720eb-78be-4c56-9be9-edb627920635\") " Dec 01 08:22:54 crc kubenswrapper[5004]: I1201 08:22:54.002786 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/391720eb-78be-4c56-9be9-edb627920635-config" (OuterVolumeSpecName: "config") pod "391720eb-78be-4c56-9be9-edb627920635" (UID: "391720eb-78be-4c56-9be9-edb627920635"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:22:54 crc kubenswrapper[5004]: I1201 08:22:54.003023 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6ad3c76-f4e2-43c9-8dc0-abd71417f683-config" (OuterVolumeSpecName: "config") pod "e6ad3c76-f4e2-43c9-8dc0-abd71417f683" (UID: "e6ad3c76-f4e2-43c9-8dc0-abd71417f683"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:22:54 crc kubenswrapper[5004]: I1201 08:22:54.003212 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/391720eb-78be-4c56-9be9-edb627920635-client-ca" (OuterVolumeSpecName: "client-ca") pod "391720eb-78be-4c56-9be9-edb627920635" (UID: "391720eb-78be-4c56-9be9-edb627920635"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:22:54 crc kubenswrapper[5004]: I1201 08:22:54.003244 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/391720eb-78be-4c56-9be9-edb627920635-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "391720eb-78be-4c56-9be9-edb627920635" (UID: "391720eb-78be-4c56-9be9-edb627920635"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:22:54 crc kubenswrapper[5004]: I1201 08:22:54.003338 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6ad3c76-f4e2-43c9-8dc0-abd71417f683-client-ca" (OuterVolumeSpecName: "client-ca") pod "e6ad3c76-f4e2-43c9-8dc0-abd71417f683" (UID: "e6ad3c76-f4e2-43c9-8dc0-abd71417f683"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:22:54 crc kubenswrapper[5004]: I1201 08:22:54.006828 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/391720eb-78be-4c56-9be9-edb627920635-kube-api-access-g8949" (OuterVolumeSpecName: "kube-api-access-g8949") pod "391720eb-78be-4c56-9be9-edb627920635" (UID: "391720eb-78be-4c56-9be9-edb627920635"). InnerVolumeSpecName "kube-api-access-g8949". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:22:54 crc kubenswrapper[5004]: I1201 08:22:54.008432 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6ad3c76-f4e2-43c9-8dc0-abd71417f683-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e6ad3c76-f4e2-43c9-8dc0-abd71417f683" (UID: "e6ad3c76-f4e2-43c9-8dc0-abd71417f683"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:22:54 crc kubenswrapper[5004]: I1201 08:22:54.008481 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6ad3c76-f4e2-43c9-8dc0-abd71417f683-kube-api-access-b9s7s" (OuterVolumeSpecName: "kube-api-access-b9s7s") pod "e6ad3c76-f4e2-43c9-8dc0-abd71417f683" (UID: "e6ad3c76-f4e2-43c9-8dc0-abd71417f683"). InnerVolumeSpecName "kube-api-access-b9s7s". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:22:54 crc kubenswrapper[5004]: I1201 08:22:54.009105 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/391720eb-78be-4c56-9be9-edb627920635-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "391720eb-78be-4c56-9be9-edb627920635" (UID: "391720eb-78be-4c56-9be9-edb627920635"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:22:54 crc kubenswrapper[5004]: I1201 08:22:54.102795 5004 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/391720eb-78be-4c56-9be9-edb627920635-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 08:22:54 crc kubenswrapper[5004]: I1201 08:22:54.103073 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b9s7s\" (UniqueName: \"kubernetes.io/projected/e6ad3c76-f4e2-43c9-8dc0-abd71417f683-kube-api-access-b9s7s\") on node \"crc\" DevicePath \"\"" Dec 01 08:22:54 crc kubenswrapper[5004]: I1201 08:22:54.103091 5004 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e6ad3c76-f4e2-43c9-8dc0-abd71417f683-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 08:22:54 crc kubenswrapper[5004]: I1201 08:22:54.103103 5004 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/391720eb-78be-4c56-9be9-edb627920635-config\") on node \"crc\" DevicePath \"\"" Dec 01 08:22:54 crc kubenswrapper[5004]: I1201 08:22:54.103114 5004 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/391720eb-78be-4c56-9be9-edb627920635-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 01 08:22:54 crc kubenswrapper[5004]: I1201 08:22:54.103125 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g8949\" (UniqueName: \"kubernetes.io/projected/391720eb-78be-4c56-9be9-edb627920635-kube-api-access-g8949\") on node \"crc\" DevicePath \"\"" Dec 01 08:22:54 crc kubenswrapper[5004]: I1201 08:22:54.103135 5004 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e6ad3c76-f4e2-43c9-8dc0-abd71417f683-client-ca\") on node \"crc\" DevicePath \"\"" Dec 01 08:22:54 crc kubenswrapper[5004]: I1201 08:22:54.103147 5004 
reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6ad3c76-f4e2-43c9-8dc0-abd71417f683-config\") on node \"crc\" DevicePath \"\"" Dec 01 08:22:54 crc kubenswrapper[5004]: I1201 08:22:54.103157 5004 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/391720eb-78be-4c56-9be9-edb627920635-client-ca\") on node \"crc\" DevicePath \"\"" Dec 01 08:22:54 crc kubenswrapper[5004]: I1201 08:22:54.812757 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7595559b7b-qjt8j" Dec 01 08:22:54 crc kubenswrapper[5004]: I1201 08:22:54.812751 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7595559b7b-qjt8j" event={"ID":"e6ad3c76-f4e2-43c9-8dc0-abd71417f683","Type":"ContainerDied","Data":"34e7fbc3c1a3fc3277466e60f327684f487dc56f4f4c6695e27d9aaa535be7b2"} Dec 01 08:22:54 crc kubenswrapper[5004]: I1201 08:22:54.812979 5004 scope.go:117] "RemoveContainer" containerID="d54428b9d2ec7eba50b047e3c28d9ff45d9885a631a698cf44bd1abaab93f3b5" Dec 01 08:22:54 crc kubenswrapper[5004]: I1201 08:22:54.816814 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7877d8b9bb-nkhzr" event={"ID":"391720eb-78be-4c56-9be9-edb627920635","Type":"ContainerDied","Data":"3b54a841dabe6670e5a0f38fd96f87c14be1d5fd565bea4edafae99cef0c983d"} Dec 01 08:22:54 crc kubenswrapper[5004]: I1201 08:22:54.816912 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7877d8b9bb-nkhzr" Dec 01 08:22:54 crc kubenswrapper[5004]: I1201 08:22:54.849632 5004 scope.go:117] "RemoveContainer" containerID="d3a148b97c9f2d80720ace4d333eb47e14ab3474575a7fc7f16095674440618b" Dec 01 08:22:54 crc kubenswrapper[5004]: I1201 08:22:54.866336 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7595559b7b-qjt8j"] Dec 01 08:22:54 crc kubenswrapper[5004]: I1201 08:22:54.874960 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7595559b7b-qjt8j"] Dec 01 08:22:54 crc kubenswrapper[5004]: I1201 08:22:54.880446 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7877d8b9bb-nkhzr"] Dec 01 08:22:54 crc kubenswrapper[5004]: I1201 08:22:54.884225 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7877d8b9bb-nkhzr"] Dec 01 08:22:55 crc kubenswrapper[5004]: I1201 08:22:55.248915 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-77f4d6b9f5-5fd4t"] Dec 01 08:22:55 crc kubenswrapper[5004]: E1201 08:22:55.249145 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="391720eb-78be-4c56-9be9-edb627920635" containerName="controller-manager" Dec 01 08:22:55 crc kubenswrapper[5004]: I1201 08:22:55.249159 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="391720eb-78be-4c56-9be9-edb627920635" containerName="controller-manager" Dec 01 08:22:55 crc kubenswrapper[5004]: E1201 08:22:55.249179 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6ad3c76-f4e2-43c9-8dc0-abd71417f683" containerName="route-controller-manager" Dec 01 08:22:55 crc kubenswrapper[5004]: I1201 08:22:55.249189 5004 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="e6ad3c76-f4e2-43c9-8dc0-abd71417f683" containerName="route-controller-manager" Dec 01 08:22:55 crc kubenswrapper[5004]: I1201 08:22:55.249304 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="391720eb-78be-4c56-9be9-edb627920635" containerName="controller-manager" Dec 01 08:22:55 crc kubenswrapper[5004]: I1201 08:22:55.249325 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6ad3c76-f4e2-43c9-8dc0-abd71417f683" containerName="route-controller-manager" Dec 01 08:22:55 crc kubenswrapper[5004]: I1201 08:22:55.249738 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-77f4d6b9f5-5fd4t" Dec 01 08:22:55 crc kubenswrapper[5004]: I1201 08:22:55.253135 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 01 08:22:55 crc kubenswrapper[5004]: I1201 08:22:55.253766 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 01 08:22:55 crc kubenswrapper[5004]: I1201 08:22:55.253969 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 01 08:22:55 crc kubenswrapper[5004]: I1201 08:22:55.253980 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 01 08:22:55 crc kubenswrapper[5004]: I1201 08:22:55.254212 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6b485c8cc-zpj69"] Dec 01 08:22:55 crc kubenswrapper[5004]: I1201 08:22:55.254907 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6b485c8cc-zpj69" Dec 01 08:22:55 crc kubenswrapper[5004]: I1201 08:22:55.255531 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 01 08:22:55 crc kubenswrapper[5004]: I1201 08:22:55.255643 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 01 08:22:55 crc kubenswrapper[5004]: I1201 08:22:55.258158 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 01 08:22:55 crc kubenswrapper[5004]: I1201 08:22:55.258254 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 01 08:22:55 crc kubenswrapper[5004]: I1201 08:22:55.258392 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 01 08:22:55 crc kubenswrapper[5004]: I1201 08:22:55.258412 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 01 08:22:55 crc kubenswrapper[5004]: I1201 08:22:55.258430 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 01 08:22:55 crc kubenswrapper[5004]: I1201 08:22:55.263993 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6b485c8cc-zpj69"] Dec 01 08:22:55 crc kubenswrapper[5004]: I1201 08:22:55.264547 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 01 08:22:55 crc kubenswrapper[5004]: I1201 08:22:55.267957 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 01 08:22:55 crc 
kubenswrapper[5004]: I1201 08:22:55.268217 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-77f4d6b9f5-5fd4t"] Dec 01 08:22:55 crc kubenswrapper[5004]: I1201 08:22:55.422888 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc138938-43ad-49bc-9c98-4f3e9fa7b8b9-config\") pod \"route-controller-manager-77f4d6b9f5-5fd4t\" (UID: \"cc138938-43ad-49bc-9c98-4f3e9fa7b8b9\") " pod="openshift-route-controller-manager/route-controller-manager-77f4d6b9f5-5fd4t" Dec 01 08:22:55 crc kubenswrapper[5004]: I1201 08:22:55.422996 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a61921a-71af-4454-9a7c-175daab33ea7-config\") pod \"controller-manager-6b485c8cc-zpj69\" (UID: \"9a61921a-71af-4454-9a7c-175daab33ea7\") " pod="openshift-controller-manager/controller-manager-6b485c8cc-zpj69" Dec 01 08:22:55 crc kubenswrapper[5004]: I1201 08:22:55.423156 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9a61921a-71af-4454-9a7c-175daab33ea7-client-ca\") pod \"controller-manager-6b485c8cc-zpj69\" (UID: \"9a61921a-71af-4454-9a7c-175daab33ea7\") " pod="openshift-controller-manager/controller-manager-6b485c8cc-zpj69" Dec 01 08:22:55 crc kubenswrapper[5004]: I1201 08:22:55.423294 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9a61921a-71af-4454-9a7c-175daab33ea7-serving-cert\") pod \"controller-manager-6b485c8cc-zpj69\" (UID: \"9a61921a-71af-4454-9a7c-175daab33ea7\") " pod="openshift-controller-manager/controller-manager-6b485c8cc-zpj69" Dec 01 08:22:55 crc kubenswrapper[5004]: I1201 08:22:55.423404 5004 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cc138938-43ad-49bc-9c98-4f3e9fa7b8b9-serving-cert\") pod \"route-controller-manager-77f4d6b9f5-5fd4t\" (UID: \"cc138938-43ad-49bc-9c98-4f3e9fa7b8b9\") " pod="openshift-route-controller-manager/route-controller-manager-77f4d6b9f5-5fd4t" Dec 01 08:22:55 crc kubenswrapper[5004]: I1201 08:22:55.423550 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hsmml\" (UniqueName: \"kubernetes.io/projected/cc138938-43ad-49bc-9c98-4f3e9fa7b8b9-kube-api-access-hsmml\") pod \"route-controller-manager-77f4d6b9f5-5fd4t\" (UID: \"cc138938-43ad-49bc-9c98-4f3e9fa7b8b9\") " pod="openshift-route-controller-manager/route-controller-manager-77f4d6b9f5-5fd4t" Dec 01 08:22:55 crc kubenswrapper[5004]: I1201 08:22:55.423614 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9a61921a-71af-4454-9a7c-175daab33ea7-proxy-ca-bundles\") pod \"controller-manager-6b485c8cc-zpj69\" (UID: \"9a61921a-71af-4454-9a7c-175daab33ea7\") " pod="openshift-controller-manager/controller-manager-6b485c8cc-zpj69" Dec 01 08:22:55 crc kubenswrapper[5004]: I1201 08:22:55.423681 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cc138938-43ad-49bc-9c98-4f3e9fa7b8b9-client-ca\") pod \"route-controller-manager-77f4d6b9f5-5fd4t\" (UID: \"cc138938-43ad-49bc-9c98-4f3e9fa7b8b9\") " pod="openshift-route-controller-manager/route-controller-manager-77f4d6b9f5-5fd4t" Dec 01 08:22:55 crc kubenswrapper[5004]: I1201 08:22:55.423729 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzbpq\" (UniqueName: 
\"kubernetes.io/projected/9a61921a-71af-4454-9a7c-175daab33ea7-kube-api-access-tzbpq\") pod \"controller-manager-6b485c8cc-zpj69\" (UID: \"9a61921a-71af-4454-9a7c-175daab33ea7\") " pod="openshift-controller-manager/controller-manager-6b485c8cc-zpj69" Dec 01 08:22:55 crc kubenswrapper[5004]: I1201 08:22:55.524888 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc138938-43ad-49bc-9c98-4f3e9fa7b8b9-config\") pod \"route-controller-manager-77f4d6b9f5-5fd4t\" (UID: \"cc138938-43ad-49bc-9c98-4f3e9fa7b8b9\") " pod="openshift-route-controller-manager/route-controller-manager-77f4d6b9f5-5fd4t" Dec 01 08:22:55 crc kubenswrapper[5004]: I1201 08:22:55.525009 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a61921a-71af-4454-9a7c-175daab33ea7-config\") pod \"controller-manager-6b485c8cc-zpj69\" (UID: \"9a61921a-71af-4454-9a7c-175daab33ea7\") " pod="openshift-controller-manager/controller-manager-6b485c8cc-zpj69" Dec 01 08:22:55 crc kubenswrapper[5004]: I1201 08:22:55.525080 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9a61921a-71af-4454-9a7c-175daab33ea7-client-ca\") pod \"controller-manager-6b485c8cc-zpj69\" (UID: \"9a61921a-71af-4454-9a7c-175daab33ea7\") " pod="openshift-controller-manager/controller-manager-6b485c8cc-zpj69" Dec 01 08:22:55 crc kubenswrapper[5004]: I1201 08:22:55.525168 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9a61921a-71af-4454-9a7c-175daab33ea7-serving-cert\") pod \"controller-manager-6b485c8cc-zpj69\" (UID: \"9a61921a-71af-4454-9a7c-175daab33ea7\") " pod="openshift-controller-manager/controller-manager-6b485c8cc-zpj69" Dec 01 08:22:55 crc kubenswrapper[5004]: I1201 08:22:55.525252 5004 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cc138938-43ad-49bc-9c98-4f3e9fa7b8b9-serving-cert\") pod \"route-controller-manager-77f4d6b9f5-5fd4t\" (UID: \"cc138938-43ad-49bc-9c98-4f3e9fa7b8b9\") " pod="openshift-route-controller-manager/route-controller-manager-77f4d6b9f5-5fd4t" Dec 01 08:22:55 crc kubenswrapper[5004]: I1201 08:22:55.525309 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hsmml\" (UniqueName: \"kubernetes.io/projected/cc138938-43ad-49bc-9c98-4f3e9fa7b8b9-kube-api-access-hsmml\") pod \"route-controller-manager-77f4d6b9f5-5fd4t\" (UID: \"cc138938-43ad-49bc-9c98-4f3e9fa7b8b9\") " pod="openshift-route-controller-manager/route-controller-manager-77f4d6b9f5-5fd4t" Dec 01 08:22:55 crc kubenswrapper[5004]: I1201 08:22:55.525355 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9a61921a-71af-4454-9a7c-175daab33ea7-proxy-ca-bundles\") pod \"controller-manager-6b485c8cc-zpj69\" (UID: \"9a61921a-71af-4454-9a7c-175daab33ea7\") " pod="openshift-controller-manager/controller-manager-6b485c8cc-zpj69" Dec 01 08:22:55 crc kubenswrapper[5004]: I1201 08:22:55.525425 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cc138938-43ad-49bc-9c98-4f3e9fa7b8b9-client-ca\") pod \"route-controller-manager-77f4d6b9f5-5fd4t\" (UID: \"cc138938-43ad-49bc-9c98-4f3e9fa7b8b9\") " pod="openshift-route-controller-manager/route-controller-manager-77f4d6b9f5-5fd4t" Dec 01 08:22:55 crc kubenswrapper[5004]: I1201 08:22:55.525484 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzbpq\" (UniqueName: \"kubernetes.io/projected/9a61921a-71af-4454-9a7c-175daab33ea7-kube-api-access-tzbpq\") pod \"controller-manager-6b485c8cc-zpj69\" (UID: 
\"9a61921a-71af-4454-9a7c-175daab33ea7\") " pod="openshift-controller-manager/controller-manager-6b485c8cc-zpj69" Dec 01 08:22:55 crc kubenswrapper[5004]: I1201 08:22:55.528418 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9a61921a-71af-4454-9a7c-175daab33ea7-client-ca\") pod \"controller-manager-6b485c8cc-zpj69\" (UID: \"9a61921a-71af-4454-9a7c-175daab33ea7\") " pod="openshift-controller-manager/controller-manager-6b485c8cc-zpj69" Dec 01 08:22:55 crc kubenswrapper[5004]: I1201 08:22:55.528417 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cc138938-43ad-49bc-9c98-4f3e9fa7b8b9-client-ca\") pod \"route-controller-manager-77f4d6b9f5-5fd4t\" (UID: \"cc138938-43ad-49bc-9c98-4f3e9fa7b8b9\") " pod="openshift-route-controller-manager/route-controller-manager-77f4d6b9f5-5fd4t" Dec 01 08:22:55 crc kubenswrapper[5004]: I1201 08:22:55.528731 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9a61921a-71af-4454-9a7c-175daab33ea7-proxy-ca-bundles\") pod \"controller-manager-6b485c8cc-zpj69\" (UID: \"9a61921a-71af-4454-9a7c-175daab33ea7\") " pod="openshift-controller-manager/controller-manager-6b485c8cc-zpj69" Dec 01 08:22:55 crc kubenswrapper[5004]: I1201 08:22:55.529423 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a61921a-71af-4454-9a7c-175daab33ea7-config\") pod \"controller-manager-6b485c8cc-zpj69\" (UID: \"9a61921a-71af-4454-9a7c-175daab33ea7\") " pod="openshift-controller-manager/controller-manager-6b485c8cc-zpj69" Dec 01 08:22:55 crc kubenswrapper[5004]: I1201 08:22:55.529454 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc138938-43ad-49bc-9c98-4f3e9fa7b8b9-config\") pod 
\"route-controller-manager-77f4d6b9f5-5fd4t\" (UID: \"cc138938-43ad-49bc-9c98-4f3e9fa7b8b9\") " pod="openshift-route-controller-manager/route-controller-manager-77f4d6b9f5-5fd4t" Dec 01 08:22:55 crc kubenswrapper[5004]: I1201 08:22:55.536414 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cc138938-43ad-49bc-9c98-4f3e9fa7b8b9-serving-cert\") pod \"route-controller-manager-77f4d6b9f5-5fd4t\" (UID: \"cc138938-43ad-49bc-9c98-4f3e9fa7b8b9\") " pod="openshift-route-controller-manager/route-controller-manager-77f4d6b9f5-5fd4t" Dec 01 08:22:55 crc kubenswrapper[5004]: I1201 08:22:55.538540 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9a61921a-71af-4454-9a7c-175daab33ea7-serving-cert\") pod \"controller-manager-6b485c8cc-zpj69\" (UID: \"9a61921a-71af-4454-9a7c-175daab33ea7\") " pod="openshift-controller-manager/controller-manager-6b485c8cc-zpj69" Dec 01 08:22:55 crc kubenswrapper[5004]: I1201 08:22:55.564402 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzbpq\" (UniqueName: \"kubernetes.io/projected/9a61921a-71af-4454-9a7c-175daab33ea7-kube-api-access-tzbpq\") pod \"controller-manager-6b485c8cc-zpj69\" (UID: \"9a61921a-71af-4454-9a7c-175daab33ea7\") " pod="openshift-controller-manager/controller-manager-6b485c8cc-zpj69" Dec 01 08:22:55 crc kubenswrapper[5004]: I1201 08:22:55.577444 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6b485c8cc-zpj69" Dec 01 08:22:55 crc kubenswrapper[5004]: I1201 08:22:55.578594 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hsmml\" (UniqueName: \"kubernetes.io/projected/cc138938-43ad-49bc-9c98-4f3e9fa7b8b9-kube-api-access-hsmml\") pod \"route-controller-manager-77f4d6b9f5-5fd4t\" (UID: \"cc138938-43ad-49bc-9c98-4f3e9fa7b8b9\") " pod="openshift-route-controller-manager/route-controller-manager-77f4d6b9f5-5fd4t" Dec 01 08:22:55 crc kubenswrapper[5004]: I1201 08:22:55.865593 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-77f4d6b9f5-5fd4t" Dec 01 08:22:56 crc kubenswrapper[5004]: I1201 08:22:56.045273 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6b485c8cc-zpj69"] Dec 01 08:22:56 crc kubenswrapper[5004]: I1201 08:22:56.362156 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-77f4d6b9f5-5fd4t"] Dec 01 08:22:56 crc kubenswrapper[5004]: W1201 08:22:56.368350 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcc138938_43ad_49bc_9c98_4f3e9fa7b8b9.slice/crio-d3d2da931a05a27f4e44278f7ce9bae3f819a2ee9ee1e14e0991453bf9041f4d WatchSource:0}: Error finding container d3d2da931a05a27f4e44278f7ce9bae3f819a2ee9ee1e14e0991453bf9041f4d: Status 404 returned error can't find the container with id d3d2da931a05a27f4e44278f7ce9bae3f819a2ee9ee1e14e0991453bf9041f4d Dec 01 08:22:56 crc kubenswrapper[5004]: I1201 08:22:56.653338 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-885ts"] Dec 01 08:22:56 crc kubenswrapper[5004]: I1201 08:22:56.654253 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-885ts" Dec 01 08:22:56 crc kubenswrapper[5004]: I1201 08:22:56.670543 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-885ts"] Dec 01 08:22:56 crc kubenswrapper[5004]: I1201 08:22:56.747159 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/686a9183-c95d-49cd-82fc-77470e843f0d-registry-certificates\") pod \"image-registry-66df7c8f76-885ts\" (UID: \"686a9183-c95d-49cd-82fc-77470e843f0d\") " pod="openshift-image-registry/image-registry-66df7c8f76-885ts" Dec 01 08:22:56 crc kubenswrapper[5004]: I1201 08:22:56.747212 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/686a9183-c95d-49cd-82fc-77470e843f0d-trusted-ca\") pod \"image-registry-66df7c8f76-885ts\" (UID: \"686a9183-c95d-49cd-82fc-77470e843f0d\") " pod="openshift-image-registry/image-registry-66df7c8f76-885ts" Dec 01 08:22:56 crc kubenswrapper[5004]: I1201 08:22:56.747229 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/686a9183-c95d-49cd-82fc-77470e843f0d-registry-tls\") pod \"image-registry-66df7c8f76-885ts\" (UID: \"686a9183-c95d-49cd-82fc-77470e843f0d\") " pod="openshift-image-registry/image-registry-66df7c8f76-885ts" Dec 01 08:22:56 crc kubenswrapper[5004]: I1201 08:22:56.747246 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/686a9183-c95d-49cd-82fc-77470e843f0d-bound-sa-token\") pod \"image-registry-66df7c8f76-885ts\" (UID: \"686a9183-c95d-49cd-82fc-77470e843f0d\") " pod="openshift-image-registry/image-registry-66df7c8f76-885ts" Dec 01 
08:22:56 crc kubenswrapper[5004]: I1201 08:22:56.747474 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kf89w\" (UniqueName: \"kubernetes.io/projected/686a9183-c95d-49cd-82fc-77470e843f0d-kube-api-access-kf89w\") pod \"image-registry-66df7c8f76-885ts\" (UID: \"686a9183-c95d-49cd-82fc-77470e843f0d\") " pod="openshift-image-registry/image-registry-66df7c8f76-885ts" Dec 01 08:22:56 crc kubenswrapper[5004]: I1201 08:22:56.747521 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/686a9183-c95d-49cd-82fc-77470e843f0d-ca-trust-extracted\") pod \"image-registry-66df7c8f76-885ts\" (UID: \"686a9183-c95d-49cd-82fc-77470e843f0d\") " pod="openshift-image-registry/image-registry-66df7c8f76-885ts" Dec 01 08:22:56 crc kubenswrapper[5004]: I1201 08:22:56.747633 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/686a9183-c95d-49cd-82fc-77470e843f0d-installation-pull-secrets\") pod \"image-registry-66df7c8f76-885ts\" (UID: \"686a9183-c95d-49cd-82fc-77470e843f0d\") " pod="openshift-image-registry/image-registry-66df7c8f76-885ts" Dec 01 08:22:56 crc kubenswrapper[5004]: I1201 08:22:56.747706 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-885ts\" (UID: \"686a9183-c95d-49cd-82fc-77470e843f0d\") " pod="openshift-image-registry/image-registry-66df7c8f76-885ts" Dec 01 08:22:56 crc kubenswrapper[5004]: I1201 08:22:56.768933 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="391720eb-78be-4c56-9be9-edb627920635" 
path="/var/lib/kubelet/pods/391720eb-78be-4c56-9be9-edb627920635/volumes" Dec 01 08:22:56 crc kubenswrapper[5004]: I1201 08:22:56.769911 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6ad3c76-f4e2-43c9-8dc0-abd71417f683" path="/var/lib/kubelet/pods/e6ad3c76-f4e2-43c9-8dc0-abd71417f683/volumes" Dec 01 08:22:56 crc kubenswrapper[5004]: I1201 08:22:56.778431 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-885ts\" (UID: \"686a9183-c95d-49cd-82fc-77470e843f0d\") " pod="openshift-image-registry/image-registry-66df7c8f76-885ts" Dec 01 08:22:56 crc kubenswrapper[5004]: I1201 08:22:56.835392 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-77f4d6b9f5-5fd4t" event={"ID":"cc138938-43ad-49bc-9c98-4f3e9fa7b8b9","Type":"ContainerStarted","Data":"46e39000f954e55cbce54649433d101208ce2aeee566d0ac5ce826d3e9b56bb3"} Dec 01 08:22:56 crc kubenswrapper[5004]: I1201 08:22:56.835711 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-77f4d6b9f5-5fd4t" event={"ID":"cc138938-43ad-49bc-9c98-4f3e9fa7b8b9","Type":"ContainerStarted","Data":"d3d2da931a05a27f4e44278f7ce9bae3f819a2ee9ee1e14e0991453bf9041f4d"} Dec 01 08:22:56 crc kubenswrapper[5004]: I1201 08:22:56.836394 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-77f4d6b9f5-5fd4t" Dec 01 08:22:56 crc kubenswrapper[5004]: I1201 08:22:56.837104 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6b485c8cc-zpj69" 
event={"ID":"9a61921a-71af-4454-9a7c-175daab33ea7","Type":"ContainerStarted","Data":"5c05e1bcad53138248e33e724bf4bc42286e8a69253baa86ed692d0bf8b104a8"} Dec 01 08:22:56 crc kubenswrapper[5004]: I1201 08:22:56.837143 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6b485c8cc-zpj69" event={"ID":"9a61921a-71af-4454-9a7c-175daab33ea7","Type":"ContainerStarted","Data":"bfc8c9cfa472e4eed72967316a4a58300df86f65a5691eb42306c3ab54ab50c6"} Dec 01 08:22:56 crc kubenswrapper[5004]: I1201 08:22:56.837381 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6b485c8cc-zpj69" Dec 01 08:22:56 crc kubenswrapper[5004]: I1201 08:22:56.842041 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6b485c8cc-zpj69" Dec 01 08:22:56 crc kubenswrapper[5004]: I1201 08:22:56.849369 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kf89w\" (UniqueName: \"kubernetes.io/projected/686a9183-c95d-49cd-82fc-77470e843f0d-kube-api-access-kf89w\") pod \"image-registry-66df7c8f76-885ts\" (UID: \"686a9183-c95d-49cd-82fc-77470e843f0d\") " pod="openshift-image-registry/image-registry-66df7c8f76-885ts" Dec 01 08:22:56 crc kubenswrapper[5004]: I1201 08:22:56.849447 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/686a9183-c95d-49cd-82fc-77470e843f0d-ca-trust-extracted\") pod \"image-registry-66df7c8f76-885ts\" (UID: \"686a9183-c95d-49cd-82fc-77470e843f0d\") " pod="openshift-image-registry/image-registry-66df7c8f76-885ts" Dec 01 08:22:56 crc kubenswrapper[5004]: I1201 08:22:56.849483 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/686a9183-c95d-49cd-82fc-77470e843f0d-installation-pull-secrets\") pod \"image-registry-66df7c8f76-885ts\" (UID: \"686a9183-c95d-49cd-82fc-77470e843f0d\") " pod="openshift-image-registry/image-registry-66df7c8f76-885ts" Dec 01 08:22:56 crc kubenswrapper[5004]: I1201 08:22:56.849538 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/686a9183-c95d-49cd-82fc-77470e843f0d-registry-certificates\") pod \"image-registry-66df7c8f76-885ts\" (UID: \"686a9183-c95d-49cd-82fc-77470e843f0d\") " pod="openshift-image-registry/image-registry-66df7c8f76-885ts" Dec 01 08:22:56 crc kubenswrapper[5004]: I1201 08:22:56.849598 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/686a9183-c95d-49cd-82fc-77470e843f0d-trusted-ca\") pod \"image-registry-66df7c8f76-885ts\" (UID: \"686a9183-c95d-49cd-82fc-77470e843f0d\") " pod="openshift-image-registry/image-registry-66df7c8f76-885ts" Dec 01 08:22:56 crc kubenswrapper[5004]: I1201 08:22:56.849621 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/686a9183-c95d-49cd-82fc-77470e843f0d-registry-tls\") pod \"image-registry-66df7c8f76-885ts\" (UID: \"686a9183-c95d-49cd-82fc-77470e843f0d\") " pod="openshift-image-registry/image-registry-66df7c8f76-885ts" Dec 01 08:22:56 crc kubenswrapper[5004]: I1201 08:22:56.849642 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/686a9183-c95d-49cd-82fc-77470e843f0d-bound-sa-token\") pod \"image-registry-66df7c8f76-885ts\" (UID: \"686a9183-c95d-49cd-82fc-77470e843f0d\") " pod="openshift-image-registry/image-registry-66df7c8f76-885ts" Dec 01 08:22:56 crc kubenswrapper[5004]: I1201 08:22:56.850315 5004 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/686a9183-c95d-49cd-82fc-77470e843f0d-ca-trust-extracted\") pod \"image-registry-66df7c8f76-885ts\" (UID: \"686a9183-c95d-49cd-82fc-77470e843f0d\") " pod="openshift-image-registry/image-registry-66df7c8f76-885ts" Dec 01 08:22:56 crc kubenswrapper[5004]: I1201 08:22:56.851804 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/686a9183-c95d-49cd-82fc-77470e843f0d-registry-certificates\") pod \"image-registry-66df7c8f76-885ts\" (UID: \"686a9183-c95d-49cd-82fc-77470e843f0d\") " pod="openshift-image-registry/image-registry-66df7c8f76-885ts" Dec 01 08:22:56 crc kubenswrapper[5004]: I1201 08:22:56.851942 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/686a9183-c95d-49cd-82fc-77470e843f0d-trusted-ca\") pod \"image-registry-66df7c8f76-885ts\" (UID: \"686a9183-c95d-49cd-82fc-77470e843f0d\") " pod="openshift-image-registry/image-registry-66df7c8f76-885ts" Dec 01 08:22:56 crc kubenswrapper[5004]: I1201 08:22:56.858727 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/686a9183-c95d-49cd-82fc-77470e843f0d-installation-pull-secrets\") pod \"image-registry-66df7c8f76-885ts\" (UID: \"686a9183-c95d-49cd-82fc-77470e843f0d\") " pod="openshift-image-registry/image-registry-66df7c8f76-885ts" Dec 01 08:22:56 crc kubenswrapper[5004]: I1201 08:22:56.858736 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/686a9183-c95d-49cd-82fc-77470e843f0d-registry-tls\") pod \"image-registry-66df7c8f76-885ts\" (UID: \"686a9183-c95d-49cd-82fc-77470e843f0d\") " pod="openshift-image-registry/image-registry-66df7c8f76-885ts" Dec 01 08:22:56 crc kubenswrapper[5004]: I1201 08:22:56.862675 5004 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-77f4d6b9f5-5fd4t" podStartSLOduration=3.862658061 podStartE2EDuration="3.862658061s" podCreationTimestamp="2025-12-01 08:22:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:22:56.852938019 +0000 UTC m=+354.417930031" watchObservedRunningTime="2025-12-01 08:22:56.862658061 +0000 UTC m=+354.427650053" Dec 01 08:22:56 crc kubenswrapper[5004]: I1201 08:22:56.880208 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/686a9183-c95d-49cd-82fc-77470e843f0d-bound-sa-token\") pod \"image-registry-66df7c8f76-885ts\" (UID: \"686a9183-c95d-49cd-82fc-77470e843f0d\") " pod="openshift-image-registry/image-registry-66df7c8f76-885ts" Dec 01 08:22:56 crc kubenswrapper[5004]: I1201 08:22:56.884954 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6b485c8cc-zpj69" podStartSLOduration=3.884935988 podStartE2EDuration="3.884935988s" podCreationTimestamp="2025-12-01 08:22:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:22:56.880617146 +0000 UTC m=+354.445609138" watchObservedRunningTime="2025-12-01 08:22:56.884935988 +0000 UTC m=+354.449927970" Dec 01 08:22:56 crc kubenswrapper[5004]: I1201 08:22:56.902392 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kf89w\" (UniqueName: \"kubernetes.io/projected/686a9183-c95d-49cd-82fc-77470e843f0d-kube-api-access-kf89w\") pod \"image-registry-66df7c8f76-885ts\" (UID: \"686a9183-c95d-49cd-82fc-77470e843f0d\") " pod="openshift-image-registry/image-registry-66df7c8f76-885ts" Dec 01 08:22:56 crc kubenswrapper[5004]: I1201 
08:22:56.969434 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-885ts" Dec 01 08:22:56 crc kubenswrapper[5004]: I1201 08:22:56.985723 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-77f4d6b9f5-5fd4t" Dec 01 08:22:57 crc kubenswrapper[5004]: I1201 08:22:57.227850 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-885ts"] Dec 01 08:22:57 crc kubenswrapper[5004]: W1201 08:22:57.232042 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod686a9183_c95d_49cd_82fc_77470e843f0d.slice/crio-9e2c5a1bbe712ddd7abe04b0d5723b2d5bb526157f681536d01ac903a4b1d81d WatchSource:0}: Error finding container 9e2c5a1bbe712ddd7abe04b0d5723b2d5bb526157f681536d01ac903a4b1d81d: Status 404 returned error can't find the container with id 9e2c5a1bbe712ddd7abe04b0d5723b2d5bb526157f681536d01ac903a4b1d81d Dec 01 08:22:57 crc kubenswrapper[5004]: I1201 08:22:57.848249 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-885ts" event={"ID":"686a9183-c95d-49cd-82fc-77470e843f0d","Type":"ContainerStarted","Data":"3dd79b5f2d8984fa502a07de7eb9a4eca4bb88eb8449cc21fd5c26f15c6d2e72"} Dec 01 08:22:57 crc kubenswrapper[5004]: I1201 08:22:57.848309 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-885ts" event={"ID":"686a9183-c95d-49cd-82fc-77470e843f0d","Type":"ContainerStarted","Data":"9e2c5a1bbe712ddd7abe04b0d5723b2d5bb526157f681536d01ac903a4b1d81d"} Dec 01 08:22:57 crc kubenswrapper[5004]: I1201 08:22:57.890787 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-885ts" podStartSLOduration=1.890761838 
podStartE2EDuration="1.890761838s" podCreationTimestamp="2025-12-01 08:22:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:22:57.88579354 +0000 UTC m=+355.450785592" watchObservedRunningTime="2025-12-01 08:22:57.890761838 +0000 UTC m=+355.455753860" Dec 01 08:22:58 crc kubenswrapper[5004]: I1201 08:22:58.851818 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-885ts" Dec 01 08:23:08 crc kubenswrapper[5004]: I1201 08:23:08.729035 5004 patch_prober.go:28] interesting pod/machine-config-daemon-fvdgt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 08:23:08 crc kubenswrapper[5004]: I1201 08:23:08.729655 5004 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 08:23:16 crc kubenswrapper[5004]: I1201 08:23:16.978358 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-885ts" Dec 01 08:23:17 crc kubenswrapper[5004]: I1201 08:23:17.102073 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-8jr7l"] Dec 01 08:23:24 crc kubenswrapper[5004]: I1201 08:23:24.997266 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jd8z9"] Dec 01 08:23:24 crc kubenswrapper[5004]: I1201 08:23:24.998208 5004 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/certified-operators-jd8z9" podUID="56a0aeb3-484c-4e7e-9ff7-b3bc75f253f4" containerName="registry-server" containerID="cri-o://c63f5e37abd10023683d84c37956975712137d4a6df7a00047f5d62ada1bc723" gracePeriod=30 Dec 01 08:23:25 crc kubenswrapper[5004]: I1201 08:23:25.018597 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-dg9cf"] Dec 01 08:23:25 crc kubenswrapper[5004]: I1201 08:23:25.019071 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-dg9cf" podUID="19da9663-9e98-41f0-a737-0c2683293496" containerName="registry-server" containerID="cri-o://ec801c83d20861c9e218221b9eb3a2728ff98a78239de7472bae8eb8d0d9af11" gracePeriod=30 Dec 01 08:23:25 crc kubenswrapper[5004]: E1201 08:23:25.044749 5004 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ec801c83d20861c9e218221b9eb3a2728ff98a78239de7472bae8eb8d0d9af11" cmd=["grpc_health_probe","-addr=:50051"] Dec 01 08:23:25 crc kubenswrapper[5004]: I1201 08:23:25.048167 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8g8bf"] Dec 01 08:23:25 crc kubenswrapper[5004]: I1201 08:23:25.048416 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-8g8bf" podUID="8a9f98dc-e84b-4fb8-9d4d-69c766486ebb" containerName="marketplace-operator" containerID="cri-o://37403851b7a3995647d9472cee025a2ab98a3adeeb95ee3d5d85053ebf1f1384" gracePeriod=30 Dec 01 08:23:25 crc kubenswrapper[5004]: E1201 08:23:25.049754 5004 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="ec801c83d20861c9e218221b9eb3a2728ff98a78239de7472bae8eb8d0d9af11" cmd=["grpc_health_probe","-addr=:50051"] Dec 01 08:23:25 crc kubenswrapper[5004]: E1201 08:23:25.050545 5004 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ec801c83d20861c9e218221b9eb3a2728ff98a78239de7472bae8eb8d0d9af11 is running failed: container process not found" containerID="ec801c83d20861c9e218221b9eb3a2728ff98a78239de7472bae8eb8d0d9af11" cmd=["grpc_health_probe","-addr=:50051"] Dec 01 08:23:25 crc kubenswrapper[5004]: E1201 08:23:25.050639 5004 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ec801c83d20861c9e218221b9eb3a2728ff98a78239de7472bae8eb8d0d9af11 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/community-operators-dg9cf" podUID="19da9663-9e98-41f0-a737-0c2683293496" containerName="registry-server" Dec 01 08:23:25 crc kubenswrapper[5004]: I1201 08:23:25.054175 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pl87w"] Dec 01 08:23:25 crc kubenswrapper[5004]: I1201 08:23:25.054445 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-pl87w" podUID="b0286c29-4a56-4e46-8820-21dfbc658c86" containerName="registry-server" containerID="cri-o://edff7af1925601359de2784bf43e9f064567d2588c070ac07415d42a8213ae54" gracePeriod=30 Dec 01 08:23:25 crc kubenswrapper[5004]: I1201 08:23:25.066074 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wvfds"] Dec 01 08:23:25 crc kubenswrapper[5004]: I1201 08:23:25.066353 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-wvfds" podUID="8f079082-1b6f-4919-8d2c-c640f30de417" 
containerName="registry-server" containerID="cri-o://98d8c746f45777a77a60247a36da7f37e3cba4aab2752e59ee09374f7e491cbd" gracePeriod=30 Dec 01 08:23:25 crc kubenswrapper[5004]: I1201 08:23:25.074349 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-5ffg4"] Dec 01 08:23:25 crc kubenswrapper[5004]: I1201 08:23:25.075540 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-5ffg4" Dec 01 08:23:25 crc kubenswrapper[5004]: I1201 08:23:25.083806 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/2370cb7e-e860-40eb-a3a2-0a711f1e05b1-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-5ffg4\" (UID: \"2370cb7e-e860-40eb-a3a2-0a711f1e05b1\") " pod="openshift-marketplace/marketplace-operator-79b997595-5ffg4" Dec 01 08:23:25 crc kubenswrapper[5004]: I1201 08:23:25.083882 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2370cb7e-e860-40eb-a3a2-0a711f1e05b1-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-5ffg4\" (UID: \"2370cb7e-e860-40eb-a3a2-0a711f1e05b1\") " pod="openshift-marketplace/marketplace-operator-79b997595-5ffg4" Dec 01 08:23:25 crc kubenswrapper[5004]: I1201 08:23:25.083971 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7lnt4\" (UniqueName: \"kubernetes.io/projected/2370cb7e-e860-40eb-a3a2-0a711f1e05b1-kube-api-access-7lnt4\") pod \"marketplace-operator-79b997595-5ffg4\" (UID: \"2370cb7e-e860-40eb-a3a2-0a711f1e05b1\") " pod="openshift-marketplace/marketplace-operator-79b997595-5ffg4" Dec 01 08:23:25 crc kubenswrapper[5004]: I1201 08:23:25.092238 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/marketplace-operator-79b997595-5ffg4"] Dec 01 08:23:25 crc kubenswrapper[5004]: E1201 08:23:25.123750 5004 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c63f5e37abd10023683d84c37956975712137d4a6df7a00047f5d62ada1bc723 is running failed: container process not found" containerID="c63f5e37abd10023683d84c37956975712137d4a6df7a00047f5d62ada1bc723" cmd=["grpc_health_probe","-addr=:50051"] Dec 01 08:23:25 crc kubenswrapper[5004]: E1201 08:23:25.127685 5004 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c63f5e37abd10023683d84c37956975712137d4a6df7a00047f5d62ada1bc723 is running failed: container process not found" containerID="c63f5e37abd10023683d84c37956975712137d4a6df7a00047f5d62ada1bc723" cmd=["grpc_health_probe","-addr=:50051"] Dec 01 08:23:25 crc kubenswrapper[5004]: E1201 08:23:25.128179 5004 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c63f5e37abd10023683d84c37956975712137d4a6df7a00047f5d62ada1bc723 is running failed: container process not found" containerID="c63f5e37abd10023683d84c37956975712137d4a6df7a00047f5d62ada1bc723" cmd=["grpc_health_probe","-addr=:50051"] Dec 01 08:23:25 crc kubenswrapper[5004]: E1201 08:23:25.128240 5004 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c63f5e37abd10023683d84c37956975712137d4a6df7a00047f5d62ada1bc723 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/certified-operators-jd8z9" podUID="56a0aeb3-484c-4e7e-9ff7-b3bc75f253f4" containerName="registry-server" Dec 01 08:23:25 crc kubenswrapper[5004]: I1201 08:23:25.185883 5004 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2370cb7e-e860-40eb-a3a2-0a711f1e05b1-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-5ffg4\" (UID: \"2370cb7e-e860-40eb-a3a2-0a711f1e05b1\") " pod="openshift-marketplace/marketplace-operator-79b997595-5ffg4"
Dec 01 08:23:25 crc kubenswrapper[5004]: I1201 08:23:25.185951 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7lnt4\" (UniqueName: \"kubernetes.io/projected/2370cb7e-e860-40eb-a3a2-0a711f1e05b1-kube-api-access-7lnt4\") pod \"marketplace-operator-79b997595-5ffg4\" (UID: \"2370cb7e-e860-40eb-a3a2-0a711f1e05b1\") " pod="openshift-marketplace/marketplace-operator-79b997595-5ffg4"
Dec 01 08:23:25 crc kubenswrapper[5004]: I1201 08:23:25.185999 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/2370cb7e-e860-40eb-a3a2-0a711f1e05b1-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-5ffg4\" (UID: \"2370cb7e-e860-40eb-a3a2-0a711f1e05b1\") " pod="openshift-marketplace/marketplace-operator-79b997595-5ffg4"
Dec 01 08:23:25 crc kubenswrapper[5004]: I1201 08:23:25.192947 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2370cb7e-e860-40eb-a3a2-0a711f1e05b1-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-5ffg4\" (UID: \"2370cb7e-e860-40eb-a3a2-0a711f1e05b1\") " pod="openshift-marketplace/marketplace-operator-79b997595-5ffg4"
Dec 01 08:23:25 crc kubenswrapper[5004]: I1201 08:23:25.199085 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/2370cb7e-e860-40eb-a3a2-0a711f1e05b1-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-5ffg4\" (UID: \"2370cb7e-e860-40eb-a3a2-0a711f1e05b1\") " pod="openshift-marketplace/marketplace-operator-79b997595-5ffg4"
Dec 01 08:23:25 crc kubenswrapper[5004]: I1201 08:23:25.209162 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7lnt4\" (UniqueName: \"kubernetes.io/projected/2370cb7e-e860-40eb-a3a2-0a711f1e05b1-kube-api-access-7lnt4\") pod \"marketplace-operator-79b997595-5ffg4\" (UID: \"2370cb7e-e860-40eb-a3a2-0a711f1e05b1\") " pod="openshift-marketplace/marketplace-operator-79b997595-5ffg4"
Dec 01 08:23:25 crc kubenswrapper[5004]: I1201 08:23:25.497838 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-5ffg4"
Dec 01 08:23:25 crc kubenswrapper[5004]: I1201 08:23:25.589327 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jd8z9"
Dec 01 08:23:25 crc kubenswrapper[5004]: I1201 08:23:25.593142 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-66j5l\" (UniqueName: \"kubernetes.io/projected/56a0aeb3-484c-4e7e-9ff7-b3bc75f253f4-kube-api-access-66j5l\") pod \"56a0aeb3-484c-4e7e-9ff7-b3bc75f253f4\" (UID: \"56a0aeb3-484c-4e7e-9ff7-b3bc75f253f4\") "
Dec 01 08:23:25 crc kubenswrapper[5004]: I1201 08:23:25.593182 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56a0aeb3-484c-4e7e-9ff7-b3bc75f253f4-utilities\") pod \"56a0aeb3-484c-4e7e-9ff7-b3bc75f253f4\" (UID: \"56a0aeb3-484c-4e7e-9ff7-b3bc75f253f4\") "
Dec 01 08:23:25 crc kubenswrapper[5004]: I1201 08:23:25.594658 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56a0aeb3-484c-4e7e-9ff7-b3bc75f253f4-utilities" (OuterVolumeSpecName: "utilities") pod "56a0aeb3-484c-4e7e-9ff7-b3bc75f253f4" (UID: "56a0aeb3-484c-4e7e-9ff7-b3bc75f253f4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 08:23:25 crc kubenswrapper[5004]: I1201 08:23:25.597433 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56a0aeb3-484c-4e7e-9ff7-b3bc75f253f4-kube-api-access-66j5l" (OuterVolumeSpecName: "kube-api-access-66j5l") pod "56a0aeb3-484c-4e7e-9ff7-b3bc75f253f4" (UID: "56a0aeb3-484c-4e7e-9ff7-b3bc75f253f4"). InnerVolumeSpecName "kube-api-access-66j5l". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 08:23:25 crc kubenswrapper[5004]: I1201 08:23:25.694574 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56a0aeb3-484c-4e7e-9ff7-b3bc75f253f4-catalog-content\") pod \"56a0aeb3-484c-4e7e-9ff7-b3bc75f253f4\" (UID: \"56a0aeb3-484c-4e7e-9ff7-b3bc75f253f4\") "
Dec 01 08:23:25 crc kubenswrapper[5004]: I1201 08:23:25.695152 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-66j5l\" (UniqueName: \"kubernetes.io/projected/56a0aeb3-484c-4e7e-9ff7-b3bc75f253f4-kube-api-access-66j5l\") on node \"crc\" DevicePath \"\""
Dec 01 08:23:25 crc kubenswrapper[5004]: I1201 08:23:25.695173 5004 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56a0aeb3-484c-4e7e-9ff7-b3bc75f253f4-utilities\") on node \"crc\" DevicePath \"\""
Dec 01 08:23:25 crc kubenswrapper[5004]: I1201 08:23:25.723192 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-8g8bf"
Dec 01 08:23:25 crc kubenswrapper[5004]: I1201 08:23:25.740342 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dg9cf"
Dec 01 08:23:25 crc kubenswrapper[5004]: I1201 08:23:25.749552 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56a0aeb3-484c-4e7e-9ff7-b3bc75f253f4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "56a0aeb3-484c-4e7e-9ff7-b3bc75f253f4" (UID: "56a0aeb3-484c-4e7e-9ff7-b3bc75f253f4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 08:23:25 crc kubenswrapper[5004]: I1201 08:23:25.758238 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wvfds"
Dec 01 08:23:25 crc kubenswrapper[5004]: I1201 08:23:25.764883 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pl87w"
Dec 01 08:23:25 crc kubenswrapper[5004]: I1201 08:23:25.795773 5004 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56a0aeb3-484c-4e7e-9ff7-b3bc75f253f4-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 01 08:23:25 crc kubenswrapper[5004]: I1201 08:23:25.896424 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0286c29-4a56-4e46-8820-21dfbc658c86-utilities\") pod \"b0286c29-4a56-4e46-8820-21dfbc658c86\" (UID: \"b0286c29-4a56-4e46-8820-21dfbc658c86\") "
Dec 01 08:23:25 crc kubenswrapper[5004]: I1201 08:23:25.896497 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fng7c\" (UniqueName: \"kubernetes.io/projected/b0286c29-4a56-4e46-8820-21dfbc658c86-kube-api-access-fng7c\") pod \"b0286c29-4a56-4e46-8820-21dfbc658c86\" (UID: \"b0286c29-4a56-4e46-8820-21dfbc658c86\") "
Dec 01 08:23:25 crc kubenswrapper[5004]: I1201 08:23:25.896536 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8a9f98dc-e84b-4fb8-9d4d-69c766486ebb-marketplace-trusted-ca\") pod \"8a9f98dc-e84b-4fb8-9d4d-69c766486ebb\" (UID: \"8a9f98dc-e84b-4fb8-9d4d-69c766486ebb\") "
Dec 01 08:23:25 crc kubenswrapper[5004]: I1201 08:23:25.896637 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f079082-1b6f-4919-8d2c-c640f30de417-utilities\") pod \"8f079082-1b6f-4919-8d2c-c640f30de417\" (UID: \"8f079082-1b6f-4919-8d2c-c640f30de417\") "
Dec 01 08:23:25 crc kubenswrapper[5004]: I1201 08:23:25.896684 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19da9663-9e98-41f0-a737-0c2683293496-utilities\") pod \"19da9663-9e98-41f0-a737-0c2683293496\" (UID: \"19da9663-9e98-41f0-a737-0c2683293496\") "
Dec 01 08:23:25 crc kubenswrapper[5004]: I1201 08:23:25.896709 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lsshm\" (UniqueName: \"kubernetes.io/projected/19da9663-9e98-41f0-a737-0c2683293496-kube-api-access-lsshm\") pod \"19da9663-9e98-41f0-a737-0c2683293496\" (UID: \"19da9663-9e98-41f0-a737-0c2683293496\") "
Dec 01 08:23:25 crc kubenswrapper[5004]: I1201 08:23:25.896745 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cghx8\" (UniqueName: \"kubernetes.io/projected/8f079082-1b6f-4919-8d2c-c640f30de417-kube-api-access-cghx8\") pod \"8f079082-1b6f-4919-8d2c-c640f30de417\" (UID: \"8f079082-1b6f-4919-8d2c-c640f30de417\") "
Dec 01 08:23:25 crc kubenswrapper[5004]: I1201 08:23:25.896773 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19da9663-9e98-41f0-a737-0c2683293496-catalog-content\") pod \"19da9663-9e98-41f0-a737-0c2683293496\" (UID: \"19da9663-9e98-41f0-a737-0c2683293496\") "
Dec 01 08:23:25 crc kubenswrapper[5004]: I1201 08:23:25.896818 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/8a9f98dc-e84b-4fb8-9d4d-69c766486ebb-marketplace-operator-metrics\") pod \"8a9f98dc-e84b-4fb8-9d4d-69c766486ebb\" (UID: \"8a9f98dc-e84b-4fb8-9d4d-69c766486ebb\") "
Dec 01 08:23:25 crc kubenswrapper[5004]: I1201 08:23:25.896853 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0286c29-4a56-4e46-8820-21dfbc658c86-catalog-content\") pod \"b0286c29-4a56-4e46-8820-21dfbc658c86\" (UID: \"b0286c29-4a56-4e46-8820-21dfbc658c86\") "
Dec 01 08:23:25 crc kubenswrapper[5004]: I1201 08:23:25.896882 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w8wh9\" (UniqueName: \"kubernetes.io/projected/8a9f98dc-e84b-4fb8-9d4d-69c766486ebb-kube-api-access-w8wh9\") pod \"8a9f98dc-e84b-4fb8-9d4d-69c766486ebb\" (UID: \"8a9f98dc-e84b-4fb8-9d4d-69c766486ebb\") "
Dec 01 08:23:25 crc kubenswrapper[5004]: I1201 08:23:25.896920 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f079082-1b6f-4919-8d2c-c640f30de417-catalog-content\") pod \"8f079082-1b6f-4919-8d2c-c640f30de417\" (UID: \"8f079082-1b6f-4919-8d2c-c640f30de417\") "
Dec 01 08:23:25 crc kubenswrapper[5004]: I1201 08:23:25.898270 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a9f98dc-e84b-4fb8-9d4d-69c766486ebb-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "8a9f98dc-e84b-4fb8-9d4d-69c766486ebb" (UID: "8a9f98dc-e84b-4fb8-9d4d-69c766486ebb"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 08:23:25 crc kubenswrapper[5004]: I1201 08:23:25.898364 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0286c29-4a56-4e46-8820-21dfbc658c86-utilities" (OuterVolumeSpecName: "utilities") pod "b0286c29-4a56-4e46-8820-21dfbc658c86" (UID: "b0286c29-4a56-4e46-8820-21dfbc658c86"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 08:23:25 crc kubenswrapper[5004]: I1201 08:23:25.899246 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f079082-1b6f-4919-8d2c-c640f30de417-utilities" (OuterVolumeSpecName: "utilities") pod "8f079082-1b6f-4919-8d2c-c640f30de417" (UID: "8f079082-1b6f-4919-8d2c-c640f30de417"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 08:23:25 crc kubenswrapper[5004]: I1201 08:23:25.899543 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/19da9663-9e98-41f0-a737-0c2683293496-utilities" (OuterVolumeSpecName: "utilities") pod "19da9663-9e98-41f0-a737-0c2683293496" (UID: "19da9663-9e98-41f0-a737-0c2683293496"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 08:23:25 crc kubenswrapper[5004]: I1201 08:23:25.901844 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0286c29-4a56-4e46-8820-21dfbc658c86-kube-api-access-fng7c" (OuterVolumeSpecName: "kube-api-access-fng7c") pod "b0286c29-4a56-4e46-8820-21dfbc658c86" (UID: "b0286c29-4a56-4e46-8820-21dfbc658c86"). InnerVolumeSpecName "kube-api-access-fng7c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 08:23:25 crc kubenswrapper[5004]: I1201 08:23:25.905139 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19da9663-9e98-41f0-a737-0c2683293496-kube-api-access-lsshm" (OuterVolumeSpecName: "kube-api-access-lsshm") pod "19da9663-9e98-41f0-a737-0c2683293496" (UID: "19da9663-9e98-41f0-a737-0c2683293496"). InnerVolumeSpecName "kube-api-access-lsshm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 08:23:25 crc kubenswrapper[5004]: I1201 08:23:25.905651 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a9f98dc-e84b-4fb8-9d4d-69c766486ebb-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "8a9f98dc-e84b-4fb8-9d4d-69c766486ebb" (UID: "8a9f98dc-e84b-4fb8-9d4d-69c766486ebb"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 08:23:25 crc kubenswrapper[5004]: I1201 08:23:25.905830 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a9f98dc-e84b-4fb8-9d4d-69c766486ebb-kube-api-access-w8wh9" (OuterVolumeSpecName: "kube-api-access-w8wh9") pod "8a9f98dc-e84b-4fb8-9d4d-69c766486ebb" (UID: "8a9f98dc-e84b-4fb8-9d4d-69c766486ebb"). InnerVolumeSpecName "kube-api-access-w8wh9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 08:23:25 crc kubenswrapper[5004]: I1201 08:23:25.906262 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f079082-1b6f-4919-8d2c-c640f30de417-kube-api-access-cghx8" (OuterVolumeSpecName: "kube-api-access-cghx8") pod "8f079082-1b6f-4919-8d2c-c640f30de417" (UID: "8f079082-1b6f-4919-8d2c-c640f30de417"). InnerVolumeSpecName "kube-api-access-cghx8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 08:23:25 crc kubenswrapper[5004]: I1201 08:23:25.922472 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0286c29-4a56-4e46-8820-21dfbc658c86-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b0286c29-4a56-4e46-8820-21dfbc658c86" (UID: "b0286c29-4a56-4e46-8820-21dfbc658c86"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 08:23:25 crc kubenswrapper[5004]: I1201 08:23:25.956264 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/19da9663-9e98-41f0-a737-0c2683293496-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "19da9663-9e98-41f0-a737-0c2683293496" (UID: "19da9663-9e98-41f0-a737-0c2683293496"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 08:23:25 crc kubenswrapper[5004]: I1201 08:23:25.998620 5004 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0286c29-4a56-4e46-8820-21dfbc658c86-utilities\") on node \"crc\" DevicePath \"\""
Dec 01 08:23:25 crc kubenswrapper[5004]: I1201 08:23:25.998657 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fng7c\" (UniqueName: \"kubernetes.io/projected/b0286c29-4a56-4e46-8820-21dfbc658c86-kube-api-access-fng7c\") on node \"crc\" DevicePath \"\""
Dec 01 08:23:25 crc kubenswrapper[5004]: I1201 08:23:25.998671 5004 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8a9f98dc-e84b-4fb8-9d4d-69c766486ebb-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\""
Dec 01 08:23:25 crc kubenswrapper[5004]: I1201 08:23:25.998684 5004 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f079082-1b6f-4919-8d2c-c640f30de417-utilities\") on node \"crc\" DevicePath \"\""
Dec 01 08:23:25 crc kubenswrapper[5004]: I1201 08:23:25.998696 5004 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19da9663-9e98-41f0-a737-0c2683293496-utilities\") on node \"crc\" DevicePath \"\""
Dec 01 08:23:25 crc kubenswrapper[5004]: I1201 08:23:25.998709 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lsshm\" (UniqueName: \"kubernetes.io/projected/19da9663-9e98-41f0-a737-0c2683293496-kube-api-access-lsshm\") on node \"crc\" DevicePath \"\""
Dec 01 08:23:25 crc kubenswrapper[5004]: I1201 08:23:25.998720 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cghx8\" (UniqueName: \"kubernetes.io/projected/8f079082-1b6f-4919-8d2c-c640f30de417-kube-api-access-cghx8\") on node \"crc\" DevicePath \"\""
Dec 01 08:23:25 crc kubenswrapper[5004]: I1201 08:23:25.998731 5004 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19da9663-9e98-41f0-a737-0c2683293496-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 01 08:23:25 crc kubenswrapper[5004]: I1201 08:23:25.998742 5004 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/8a9f98dc-e84b-4fb8-9d4d-69c766486ebb-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\""
Dec 01 08:23:25 crc kubenswrapper[5004]: I1201 08:23:25.998755 5004 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0286c29-4a56-4e46-8820-21dfbc658c86-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 01 08:23:25 crc kubenswrapper[5004]: I1201 08:23:25.998765 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w8wh9\" (UniqueName: \"kubernetes.io/projected/8a9f98dc-e84b-4fb8-9d4d-69c766486ebb-kube-api-access-w8wh9\") on node \"crc\" DevicePath \"\""
Dec 01 08:23:26 crc kubenswrapper[5004]: I1201 08:23:26.015834 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f079082-1b6f-4919-8d2c-c640f30de417-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8f079082-1b6f-4919-8d2c-c640f30de417" (UID: "8f079082-1b6f-4919-8d2c-c640f30de417"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 08:23:26 crc kubenswrapper[5004]: I1201 08:23:26.052723 5004 generic.go:334] "Generic (PLEG): container finished" podID="56a0aeb3-484c-4e7e-9ff7-b3bc75f253f4" containerID="c63f5e37abd10023683d84c37956975712137d4a6df7a00047f5d62ada1bc723" exitCode=0
Dec 01 08:23:26 crc kubenswrapper[5004]: I1201 08:23:26.052843 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jd8z9" event={"ID":"56a0aeb3-484c-4e7e-9ff7-b3bc75f253f4","Type":"ContainerDied","Data":"c63f5e37abd10023683d84c37956975712137d4a6df7a00047f5d62ada1bc723"}
Dec 01 08:23:26 crc kubenswrapper[5004]: I1201 08:23:26.052881 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jd8z9" event={"ID":"56a0aeb3-484c-4e7e-9ff7-b3bc75f253f4","Type":"ContainerDied","Data":"1eba94626f192621bf1d375428f6d4084dd29d1d1b4693fd4f7efbbb5ebf7e26"}
Dec 01 08:23:26 crc kubenswrapper[5004]: I1201 08:23:26.052961 5004 scope.go:117] "RemoveContainer" containerID="c63f5e37abd10023683d84c37956975712137d4a6df7a00047f5d62ada1bc723"
Dec 01 08:23:26 crc kubenswrapper[5004]: I1201 08:23:26.053545 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jd8z9"
Dec 01 08:23:26 crc kubenswrapper[5004]: I1201 08:23:26.055415 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-5ffg4"]
Dec 01 08:23:26 crc kubenswrapper[5004]: I1201 08:23:26.060429 5004 generic.go:334] "Generic (PLEG): container finished" podID="8f079082-1b6f-4919-8d2c-c640f30de417" containerID="98d8c746f45777a77a60247a36da7f37e3cba4aab2752e59ee09374f7e491cbd" exitCode=0
Dec 01 08:23:26 crc kubenswrapper[5004]: I1201 08:23:26.060513 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wvfds"
Dec 01 08:23:26 crc kubenswrapper[5004]: I1201 08:23:26.060660 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wvfds" event={"ID":"8f079082-1b6f-4919-8d2c-c640f30de417","Type":"ContainerDied","Data":"98d8c746f45777a77a60247a36da7f37e3cba4aab2752e59ee09374f7e491cbd"}
Dec 01 08:23:26 crc kubenswrapper[5004]: I1201 08:23:26.060704 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wvfds" event={"ID":"8f079082-1b6f-4919-8d2c-c640f30de417","Type":"ContainerDied","Data":"e3d302ab764558a23ad99272aae3abf968e4b0dce735241d75edd3fea79c1c00"}
Dec 01 08:23:26 crc kubenswrapper[5004]: I1201 08:23:26.072605 5004 generic.go:334] "Generic (PLEG): container finished" podID="19da9663-9e98-41f0-a737-0c2683293496" containerID="ec801c83d20861c9e218221b9eb3a2728ff98a78239de7472bae8eb8d0d9af11" exitCode=0
Dec 01 08:23:26 crc kubenswrapper[5004]: I1201 08:23:26.072697 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dg9cf" event={"ID":"19da9663-9e98-41f0-a737-0c2683293496","Type":"ContainerDied","Data":"ec801c83d20861c9e218221b9eb3a2728ff98a78239de7472bae8eb8d0d9af11"}
Dec 01 08:23:26 crc kubenswrapper[5004]: I1201 08:23:26.072719 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dg9cf" event={"ID":"19da9663-9e98-41f0-a737-0c2683293496","Type":"ContainerDied","Data":"41d9e939e0d909f1a131ac2fea6303ecc0a0a33f3f5527a76a276e35446ba44a"}
Dec 01 08:23:26 crc kubenswrapper[5004]: I1201 08:23:26.072848 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dg9cf"
Dec 01 08:23:26 crc kubenswrapper[5004]: I1201 08:23:26.077654 5004 scope.go:117] "RemoveContainer" containerID="782aa8929e5fd32c5ea22d3c039cad33119a0899cd482e7f975aeb590a3461ab"
Dec 01 08:23:26 crc kubenswrapper[5004]: I1201 08:23:26.078253 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-8g8bf"
Dec 01 08:23:26 crc kubenswrapper[5004]: I1201 08:23:26.078308 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-8g8bf" event={"ID":"8a9f98dc-e84b-4fb8-9d4d-69c766486ebb","Type":"ContainerDied","Data":"37403851b7a3995647d9472cee025a2ab98a3adeeb95ee3d5d85053ebf1f1384"}
Dec 01 08:23:26 crc kubenswrapper[5004]: I1201 08:23:26.080427 5004 generic.go:334] "Generic (PLEG): container finished" podID="8a9f98dc-e84b-4fb8-9d4d-69c766486ebb" containerID="37403851b7a3995647d9472cee025a2ab98a3adeeb95ee3d5d85053ebf1f1384" exitCode=0
Dec 01 08:23:26 crc kubenswrapper[5004]: I1201 08:23:26.081391 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-8g8bf" event={"ID":"8a9f98dc-e84b-4fb8-9d4d-69c766486ebb","Type":"ContainerDied","Data":"941e8a5c4fc20058261dcf84f0b89e290d51d4d0982ee5fe35dacd50c2652e10"}
Dec 01 08:23:26 crc kubenswrapper[5004]: I1201 08:23:26.086333 5004 generic.go:334] "Generic (PLEG): container finished" podID="b0286c29-4a56-4e46-8820-21dfbc658c86" containerID="edff7af1925601359de2784bf43e9f064567d2588c070ac07415d42a8213ae54" exitCode=0
Dec 01 08:23:26 crc kubenswrapper[5004]: I1201 08:23:26.086375 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pl87w" event={"ID":"b0286c29-4a56-4e46-8820-21dfbc658c86","Type":"ContainerDied","Data":"edff7af1925601359de2784bf43e9f064567d2588c070ac07415d42a8213ae54"}
Dec 01 08:23:26 crc kubenswrapper[5004]: I1201 08:23:26.086405 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pl87w" event={"ID":"b0286c29-4a56-4e46-8820-21dfbc658c86","Type":"ContainerDied","Data":"722a3f9b52cf235625e44e5718f1f3371bd16330168a7bb4c05370dd23f51940"}
Dec 01 08:23:26 crc kubenswrapper[5004]: I1201 08:23:26.086517 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pl87w"
Dec 01 08:23:26 crc kubenswrapper[5004]: I1201 08:23:26.099892 5004 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f079082-1b6f-4919-8d2c-c640f30de417-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 01 08:23:26 crc kubenswrapper[5004]: I1201 08:23:26.101082 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wvfds"]
Dec 01 08:23:26 crc kubenswrapper[5004]: I1201 08:23:26.105582 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-wvfds"]
Dec 01 08:23:26 crc kubenswrapper[5004]: I1201 08:23:26.119385 5004 scope.go:117] "RemoveContainer" containerID="ef3d20796b6bb7f0a3f5208286e3b40defefb1b479f815014fd30621a7e76f81"
Dec 01 08:23:26 crc kubenswrapper[5004]: I1201 08:23:26.123905 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jd8z9"]
Dec 01 08:23:26 crc kubenswrapper[5004]: I1201 08:23:26.126792 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-jd8z9"]
Dec 01 08:23:26 crc kubenswrapper[5004]: I1201 08:23:26.130114 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8g8bf"]
Dec 01 08:23:26 crc kubenswrapper[5004]: I1201 08:23:26.132233 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8g8bf"]
Dec 01 08:23:26 crc kubenswrapper[5004]: I1201 08:23:26.136519 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-dg9cf"]
Dec 01 08:23:26 crc kubenswrapper[5004]: I1201 08:23:26.146969 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-dg9cf"]
Dec 01 08:23:26 crc kubenswrapper[5004]: I1201 08:23:26.155955 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pl87w"]
Dec 01 08:23:26 crc kubenswrapper[5004]: I1201 08:23:26.161462 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-pl87w"]
Dec 01 08:23:26 crc kubenswrapper[5004]: I1201 08:23:26.199681 5004 scope.go:117] "RemoveContainer" containerID="c63f5e37abd10023683d84c37956975712137d4a6df7a00047f5d62ada1bc723"
Dec 01 08:23:26 crc kubenswrapper[5004]: E1201 08:23:26.200310 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c63f5e37abd10023683d84c37956975712137d4a6df7a00047f5d62ada1bc723\": container with ID starting with c63f5e37abd10023683d84c37956975712137d4a6df7a00047f5d62ada1bc723 not found: ID does not exist" containerID="c63f5e37abd10023683d84c37956975712137d4a6df7a00047f5d62ada1bc723"
Dec 01 08:23:26 crc kubenswrapper[5004]: I1201 08:23:26.200411 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c63f5e37abd10023683d84c37956975712137d4a6df7a00047f5d62ada1bc723"} err="failed to get container status \"c63f5e37abd10023683d84c37956975712137d4a6df7a00047f5d62ada1bc723\": rpc error: code = NotFound desc = could not find container \"c63f5e37abd10023683d84c37956975712137d4a6df7a00047f5d62ada1bc723\": container with ID starting with c63f5e37abd10023683d84c37956975712137d4a6df7a00047f5d62ada1bc723 not found: ID does not exist"
Dec 01 08:23:26 crc kubenswrapper[5004]: I1201 08:23:26.200511 5004 scope.go:117] "RemoveContainer" containerID="782aa8929e5fd32c5ea22d3c039cad33119a0899cd482e7f975aeb590a3461ab"
Dec 01 08:23:26 crc kubenswrapper[5004]: E1201 08:23:26.201178 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"782aa8929e5fd32c5ea22d3c039cad33119a0899cd482e7f975aeb590a3461ab\": container with ID starting with 782aa8929e5fd32c5ea22d3c039cad33119a0899cd482e7f975aeb590a3461ab not found: ID does not exist" containerID="782aa8929e5fd32c5ea22d3c039cad33119a0899cd482e7f975aeb590a3461ab"
Dec 01 08:23:26 crc kubenswrapper[5004]: I1201 08:23:26.201213 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"782aa8929e5fd32c5ea22d3c039cad33119a0899cd482e7f975aeb590a3461ab"} err="failed to get container status \"782aa8929e5fd32c5ea22d3c039cad33119a0899cd482e7f975aeb590a3461ab\": rpc error: code = NotFound desc = could not find container \"782aa8929e5fd32c5ea22d3c039cad33119a0899cd482e7f975aeb590a3461ab\": container with ID starting with 782aa8929e5fd32c5ea22d3c039cad33119a0899cd482e7f975aeb590a3461ab not found: ID does not exist"
Dec 01 08:23:26 crc kubenswrapper[5004]: I1201 08:23:26.201236 5004 scope.go:117] "RemoveContainer" containerID="ef3d20796b6bb7f0a3f5208286e3b40defefb1b479f815014fd30621a7e76f81"
Dec 01 08:23:26 crc kubenswrapper[5004]: E1201 08:23:26.201851 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef3d20796b6bb7f0a3f5208286e3b40defefb1b479f815014fd30621a7e76f81\": container with ID starting with ef3d20796b6bb7f0a3f5208286e3b40defefb1b479f815014fd30621a7e76f81 not found: ID does not exist" containerID="ef3d20796b6bb7f0a3f5208286e3b40defefb1b479f815014fd30621a7e76f81"
Dec 01 08:23:26 crc kubenswrapper[5004]: I1201 08:23:26.201948 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef3d20796b6bb7f0a3f5208286e3b40defefb1b479f815014fd30621a7e76f81"} err="failed to get container status \"ef3d20796b6bb7f0a3f5208286e3b40defefb1b479f815014fd30621a7e76f81\": rpc error: code = NotFound desc = could not find container \"ef3d20796b6bb7f0a3f5208286e3b40defefb1b479f815014fd30621a7e76f81\": container with ID starting with ef3d20796b6bb7f0a3f5208286e3b40defefb1b479f815014fd30621a7e76f81 not found: ID does not exist"
Dec 01 08:23:26 crc kubenswrapper[5004]: I1201 08:23:26.202020 5004 scope.go:117] "RemoveContainer" containerID="98d8c746f45777a77a60247a36da7f37e3cba4aab2752e59ee09374f7e491cbd"
Dec 01 08:23:26 crc kubenswrapper[5004]: I1201 08:23:26.227376 5004 scope.go:117] "RemoveContainer" containerID="4ac44ca1597f9ba8f4e6822986319e334129b0cbf52bbdfc96aeb5c30e74d163"
Dec 01 08:23:26 crc kubenswrapper[5004]: I1201 08:23:26.256509 5004 scope.go:117] "RemoveContainer" containerID="0020faff5bf1262e9cb333a1f3852acfcaf730ef0f690befd288586307f1caca"
Dec 01 08:23:26 crc kubenswrapper[5004]: I1201 08:23:26.272159 5004 scope.go:117] "RemoveContainer" containerID="98d8c746f45777a77a60247a36da7f37e3cba4aab2752e59ee09374f7e491cbd"
Dec 01 08:23:26 crc kubenswrapper[5004]: E1201 08:23:26.272528 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"98d8c746f45777a77a60247a36da7f37e3cba4aab2752e59ee09374f7e491cbd\": container with ID starting with 98d8c746f45777a77a60247a36da7f37e3cba4aab2752e59ee09374f7e491cbd not found: ID does not exist" containerID="98d8c746f45777a77a60247a36da7f37e3cba4aab2752e59ee09374f7e491cbd"
Dec 01 08:23:26 crc kubenswrapper[5004]: I1201 08:23:26.272590 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98d8c746f45777a77a60247a36da7f37e3cba4aab2752e59ee09374f7e491cbd"} err="failed to get container status \"98d8c746f45777a77a60247a36da7f37e3cba4aab2752e59ee09374f7e491cbd\": rpc error: code = NotFound desc = could not find container \"98d8c746f45777a77a60247a36da7f37e3cba4aab2752e59ee09374f7e491cbd\": container with ID starting with 98d8c746f45777a77a60247a36da7f37e3cba4aab2752e59ee09374f7e491cbd not found: ID does not exist"
Dec 01 08:23:26 crc kubenswrapper[5004]: I1201 08:23:26.272625 5004 scope.go:117] "RemoveContainer" containerID="4ac44ca1597f9ba8f4e6822986319e334129b0cbf52bbdfc96aeb5c30e74d163"
Dec 01 08:23:26 crc kubenswrapper[5004]: E1201 08:23:26.272908 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ac44ca1597f9ba8f4e6822986319e334129b0cbf52bbdfc96aeb5c30e74d163\": container with ID starting with 4ac44ca1597f9ba8f4e6822986319e334129b0cbf52bbdfc96aeb5c30e74d163 not found: ID does not exist" containerID="4ac44ca1597f9ba8f4e6822986319e334129b0cbf52bbdfc96aeb5c30e74d163"
Dec 01 08:23:26 crc kubenswrapper[5004]: I1201 08:23:26.272972 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ac44ca1597f9ba8f4e6822986319e334129b0cbf52bbdfc96aeb5c30e74d163"} err="failed to get container status \"4ac44ca1597f9ba8f4e6822986319e334129b0cbf52bbdfc96aeb5c30e74d163\": rpc error: code = NotFound desc = could not find container \"4ac44ca1597f9ba8f4e6822986319e334129b0cbf52bbdfc96aeb5c30e74d163\": container with ID starting with 4ac44ca1597f9ba8f4e6822986319e334129b0cbf52bbdfc96aeb5c30e74d163 not found: ID does not exist"
Dec 01 08:23:26 crc kubenswrapper[5004]: I1201 08:23:26.272994 5004 scope.go:117] "RemoveContainer" containerID="0020faff5bf1262e9cb333a1f3852acfcaf730ef0f690befd288586307f1caca"
Dec 01 08:23:26 crc kubenswrapper[5004]: E1201 08:23:26.273413 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0020faff5bf1262e9cb333a1f3852acfcaf730ef0f690befd288586307f1caca\": container with ID starting with 0020faff5bf1262e9cb333a1f3852acfcaf730ef0f690befd288586307f1caca not found: ID does not exist" containerID="0020faff5bf1262e9cb333a1f3852acfcaf730ef0f690befd288586307f1caca"
Dec 01 08:23:26 crc kubenswrapper[5004]: I1201 08:23:26.273437 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0020faff5bf1262e9cb333a1f3852acfcaf730ef0f690befd288586307f1caca"} err="failed to get container status \"0020faff5bf1262e9cb333a1f3852acfcaf730ef0f690befd288586307f1caca\": rpc error: code = NotFound desc = could not find container \"0020faff5bf1262e9cb333a1f3852acfcaf730ef0f690befd288586307f1caca\": container with ID starting with 0020faff5bf1262e9cb333a1f3852acfcaf730ef0f690befd288586307f1caca not found: ID does not exist"
Dec 01 08:23:26 crc kubenswrapper[5004]: I1201 08:23:26.273454 5004 scope.go:117] "RemoveContainer" containerID="ec801c83d20861c9e218221b9eb3a2728ff98a78239de7472bae8eb8d0d9af11"
Dec 01 08:23:26 crc kubenswrapper[5004]: I1201 08:23:26.287808 5004 scope.go:117] "RemoveContainer" containerID="82271e5359a06cdbb1f6c56401f792c5b2fb6bafb7fed178312d8a044b0e6d60"
Dec 01 08:23:26 crc kubenswrapper[5004]: I1201 08:23:26.307915 5004 scope.go:117] "RemoveContainer" containerID="738c7e32c65dc8a55331b1cefb9f74180dbacc98c66732e1b36cccf0c10debf9"
Dec 01 08:23:26 crc kubenswrapper[5004]: I1201 08:23:26.322674 5004 scope.go:117] "RemoveContainer" containerID="ec801c83d20861c9e218221b9eb3a2728ff98a78239de7472bae8eb8d0d9af11"
Dec 01 08:23:26 crc kubenswrapper[5004]: E1201 08:23:26.323078 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec801c83d20861c9e218221b9eb3a2728ff98a78239de7472bae8eb8d0d9af11\": container with ID starting with ec801c83d20861c9e218221b9eb3a2728ff98a78239de7472bae8eb8d0d9af11 not found: ID does not exist" containerID="ec801c83d20861c9e218221b9eb3a2728ff98a78239de7472bae8eb8d0d9af11"
Dec 01 08:23:26 crc kubenswrapper[5004]: I1201 08:23:26.323167 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec801c83d20861c9e218221b9eb3a2728ff98a78239de7472bae8eb8d0d9af11"} err="failed to get container status \"ec801c83d20861c9e218221b9eb3a2728ff98a78239de7472bae8eb8d0d9af11\": rpc error: code = NotFound desc = could not find container \"ec801c83d20861c9e218221b9eb3a2728ff98a78239de7472bae8eb8d0d9af11\": container with ID starting with ec801c83d20861c9e218221b9eb3a2728ff98a78239de7472bae8eb8d0d9af11 not found: ID does not exist"
Dec 01 08:23:26 crc kubenswrapper[5004]: I1201 08:23:26.323193 5004 scope.go:117] "RemoveContainer" containerID="82271e5359a06cdbb1f6c56401f792c5b2fb6bafb7fed178312d8a044b0e6d60"
Dec 01 08:23:26 crc kubenswrapper[5004]: E1201 08:23:26.323546 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82271e5359a06cdbb1f6c56401f792c5b2fb6bafb7fed178312d8a044b0e6d60\": container with ID starting with 82271e5359a06cdbb1f6c56401f792c5b2fb6bafb7fed178312d8a044b0e6d60 not found: ID does not exist" containerID="82271e5359a06cdbb1f6c56401f792c5b2fb6bafb7fed178312d8a044b0e6d60"
Dec 01 08:23:26 crc kubenswrapper[5004]: I1201 08:23:26.323578 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82271e5359a06cdbb1f6c56401f792c5b2fb6bafb7fed178312d8a044b0e6d60"} err="failed to get container status \"82271e5359a06cdbb1f6c56401f792c5b2fb6bafb7fed178312d8a044b0e6d60\": rpc error: code = NotFound desc = could not find container \"82271e5359a06cdbb1f6c56401f792c5b2fb6bafb7fed178312d8a044b0e6d60\": container with ID starting with 82271e5359a06cdbb1f6c56401f792c5b2fb6bafb7fed178312d8a044b0e6d60 not found: ID does not exist"
Dec 01 08:23:26 crc kubenswrapper[5004]: I1201 08:23:26.323592 5004 scope.go:117] "RemoveContainer" containerID="738c7e32c65dc8a55331b1cefb9f74180dbacc98c66732e1b36cccf0c10debf9"
Dec 01 08:23:26 crc kubenswrapper[5004]: E1201 08:23:26.323841 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"738c7e32c65dc8a55331b1cefb9f74180dbacc98c66732e1b36cccf0c10debf9\": container with ID starting with 738c7e32c65dc8a55331b1cefb9f74180dbacc98c66732e1b36cccf0c10debf9 not found: ID does not exist" containerID="738c7e32c65dc8a55331b1cefb9f74180dbacc98c66732e1b36cccf0c10debf9"
Dec 01 08:23:26 crc kubenswrapper[5004]: I1201 08:23:26.323862 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"738c7e32c65dc8a55331b1cefb9f74180dbacc98c66732e1b36cccf0c10debf9"} err="failed to get container status \"738c7e32c65dc8a55331b1cefb9f74180dbacc98c66732e1b36cccf0c10debf9\": rpc error: code = NotFound desc = could not find container \"738c7e32c65dc8a55331b1cefb9f74180dbacc98c66732e1b36cccf0c10debf9\": container with ID starting with 738c7e32c65dc8a55331b1cefb9f74180dbacc98c66732e1b36cccf0c10debf9 not found: ID does not exist"
Dec 01 08:23:26 crc kubenswrapper[5004]: I1201 08:23:26.323873 5004 scope.go:117] "RemoveContainer" containerID="37403851b7a3995647d9472cee025a2ab98a3adeeb95ee3d5d85053ebf1f1384"
Dec 01 08:23:26 crc kubenswrapper[5004]: I1201 08:23:26.334695 5004 scope.go:117] "RemoveContainer" containerID="886c8baa541d32465cd9ae76af7323fb43bd8bcafa4a5a26793c1480ef9bf2cc"
Dec 01 08:23:26 crc kubenswrapper[5004]: I1201 08:23:26.352195 5004 
scope.go:117] "RemoveContainer" containerID="37403851b7a3995647d9472cee025a2ab98a3adeeb95ee3d5d85053ebf1f1384" Dec 01 08:23:26 crc kubenswrapper[5004]: E1201 08:23:26.356054 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37403851b7a3995647d9472cee025a2ab98a3adeeb95ee3d5d85053ebf1f1384\": container with ID starting with 37403851b7a3995647d9472cee025a2ab98a3adeeb95ee3d5d85053ebf1f1384 not found: ID does not exist" containerID="37403851b7a3995647d9472cee025a2ab98a3adeeb95ee3d5d85053ebf1f1384" Dec 01 08:23:26 crc kubenswrapper[5004]: I1201 08:23:26.356106 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37403851b7a3995647d9472cee025a2ab98a3adeeb95ee3d5d85053ebf1f1384"} err="failed to get container status \"37403851b7a3995647d9472cee025a2ab98a3adeeb95ee3d5d85053ebf1f1384\": rpc error: code = NotFound desc = could not find container \"37403851b7a3995647d9472cee025a2ab98a3adeeb95ee3d5d85053ebf1f1384\": container with ID starting with 37403851b7a3995647d9472cee025a2ab98a3adeeb95ee3d5d85053ebf1f1384 not found: ID does not exist" Dec 01 08:23:26 crc kubenswrapper[5004]: I1201 08:23:26.356145 5004 scope.go:117] "RemoveContainer" containerID="886c8baa541d32465cd9ae76af7323fb43bd8bcafa4a5a26793c1480ef9bf2cc" Dec 01 08:23:26 crc kubenswrapper[5004]: E1201 08:23:26.357536 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"886c8baa541d32465cd9ae76af7323fb43bd8bcafa4a5a26793c1480ef9bf2cc\": container with ID starting with 886c8baa541d32465cd9ae76af7323fb43bd8bcafa4a5a26793c1480ef9bf2cc not found: ID does not exist" containerID="886c8baa541d32465cd9ae76af7323fb43bd8bcafa4a5a26793c1480ef9bf2cc" Dec 01 08:23:26 crc kubenswrapper[5004]: I1201 08:23:26.357614 5004 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"886c8baa541d32465cd9ae76af7323fb43bd8bcafa4a5a26793c1480ef9bf2cc"} err="failed to get container status \"886c8baa541d32465cd9ae76af7323fb43bd8bcafa4a5a26793c1480ef9bf2cc\": rpc error: code = NotFound desc = could not find container \"886c8baa541d32465cd9ae76af7323fb43bd8bcafa4a5a26793c1480ef9bf2cc\": container with ID starting with 886c8baa541d32465cd9ae76af7323fb43bd8bcafa4a5a26793c1480ef9bf2cc not found: ID does not exist" Dec 01 08:23:26 crc kubenswrapper[5004]: I1201 08:23:26.357658 5004 scope.go:117] "RemoveContainer" containerID="edff7af1925601359de2784bf43e9f064567d2588c070ac07415d42a8213ae54" Dec 01 08:23:26 crc kubenswrapper[5004]: I1201 08:23:26.374789 5004 scope.go:117] "RemoveContainer" containerID="d40961f147a0a57e437432250ec1c726b9307da206e459eba42bc49fca28ef6e" Dec 01 08:23:26 crc kubenswrapper[5004]: I1201 08:23:26.390628 5004 scope.go:117] "RemoveContainer" containerID="96e1f7b40fe70c152339afe6573337284eb05591732581b555d6a7a8b1e564a0" Dec 01 08:23:26 crc kubenswrapper[5004]: I1201 08:23:26.405119 5004 scope.go:117] "RemoveContainer" containerID="edff7af1925601359de2784bf43e9f064567d2588c070ac07415d42a8213ae54" Dec 01 08:23:26 crc kubenswrapper[5004]: E1201 08:23:26.405535 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"edff7af1925601359de2784bf43e9f064567d2588c070ac07415d42a8213ae54\": container with ID starting with edff7af1925601359de2784bf43e9f064567d2588c070ac07415d42a8213ae54 not found: ID does not exist" containerID="edff7af1925601359de2784bf43e9f064567d2588c070ac07415d42a8213ae54" Dec 01 08:23:26 crc kubenswrapper[5004]: I1201 08:23:26.405599 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"edff7af1925601359de2784bf43e9f064567d2588c070ac07415d42a8213ae54"} err="failed to get container status \"edff7af1925601359de2784bf43e9f064567d2588c070ac07415d42a8213ae54\": rpc error: code = 
NotFound desc = could not find container \"edff7af1925601359de2784bf43e9f064567d2588c070ac07415d42a8213ae54\": container with ID starting with edff7af1925601359de2784bf43e9f064567d2588c070ac07415d42a8213ae54 not found: ID does not exist" Dec 01 08:23:26 crc kubenswrapper[5004]: I1201 08:23:26.405629 5004 scope.go:117] "RemoveContainer" containerID="d40961f147a0a57e437432250ec1c726b9307da206e459eba42bc49fca28ef6e" Dec 01 08:23:26 crc kubenswrapper[5004]: E1201 08:23:26.405974 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d40961f147a0a57e437432250ec1c726b9307da206e459eba42bc49fca28ef6e\": container with ID starting with d40961f147a0a57e437432250ec1c726b9307da206e459eba42bc49fca28ef6e not found: ID does not exist" containerID="d40961f147a0a57e437432250ec1c726b9307da206e459eba42bc49fca28ef6e" Dec 01 08:23:26 crc kubenswrapper[5004]: I1201 08:23:26.406054 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d40961f147a0a57e437432250ec1c726b9307da206e459eba42bc49fca28ef6e"} err="failed to get container status \"d40961f147a0a57e437432250ec1c726b9307da206e459eba42bc49fca28ef6e\": rpc error: code = NotFound desc = could not find container \"d40961f147a0a57e437432250ec1c726b9307da206e459eba42bc49fca28ef6e\": container with ID starting with d40961f147a0a57e437432250ec1c726b9307da206e459eba42bc49fca28ef6e not found: ID does not exist" Dec 01 08:23:26 crc kubenswrapper[5004]: I1201 08:23:26.406108 5004 scope.go:117] "RemoveContainer" containerID="96e1f7b40fe70c152339afe6573337284eb05591732581b555d6a7a8b1e564a0" Dec 01 08:23:26 crc kubenswrapper[5004]: E1201 08:23:26.406528 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96e1f7b40fe70c152339afe6573337284eb05591732581b555d6a7a8b1e564a0\": container with ID starting with 
96e1f7b40fe70c152339afe6573337284eb05591732581b555d6a7a8b1e564a0 not found: ID does not exist" containerID="96e1f7b40fe70c152339afe6573337284eb05591732581b555d6a7a8b1e564a0" Dec 01 08:23:26 crc kubenswrapper[5004]: I1201 08:23:26.406553 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96e1f7b40fe70c152339afe6573337284eb05591732581b555d6a7a8b1e564a0"} err="failed to get container status \"96e1f7b40fe70c152339afe6573337284eb05591732581b555d6a7a8b1e564a0\": rpc error: code = NotFound desc = could not find container \"96e1f7b40fe70c152339afe6573337284eb05591732581b555d6a7a8b1e564a0\": container with ID starting with 96e1f7b40fe70c152339afe6573337284eb05591732581b555d6a7a8b1e564a0 not found: ID does not exist" Dec 01 08:23:26 crc kubenswrapper[5004]: I1201 08:23:26.772357 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19da9663-9e98-41f0-a737-0c2683293496" path="/var/lib/kubelet/pods/19da9663-9e98-41f0-a737-0c2683293496/volumes" Dec 01 08:23:26 crc kubenswrapper[5004]: I1201 08:23:26.773344 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56a0aeb3-484c-4e7e-9ff7-b3bc75f253f4" path="/var/lib/kubelet/pods/56a0aeb3-484c-4e7e-9ff7-b3bc75f253f4/volumes" Dec 01 08:23:26 crc kubenswrapper[5004]: I1201 08:23:26.774192 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a9f98dc-e84b-4fb8-9d4d-69c766486ebb" path="/var/lib/kubelet/pods/8a9f98dc-e84b-4fb8-9d4d-69c766486ebb/volumes" Dec 01 08:23:26 crc kubenswrapper[5004]: I1201 08:23:26.775364 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f079082-1b6f-4919-8d2c-c640f30de417" path="/var/lib/kubelet/pods/8f079082-1b6f-4919-8d2c-c640f30de417/volumes" Dec 01 08:23:26 crc kubenswrapper[5004]: I1201 08:23:26.776127 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0286c29-4a56-4e46-8820-21dfbc658c86" 
path="/var/lib/kubelet/pods/b0286c29-4a56-4e46-8820-21dfbc658c86/volumes" Dec 01 08:23:26 crc kubenswrapper[5004]: I1201 08:23:26.843783 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-zsqrg"] Dec 01 08:23:26 crc kubenswrapper[5004]: E1201 08:23:26.844016 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0286c29-4a56-4e46-8820-21dfbc658c86" containerName="extract-utilities" Dec 01 08:23:26 crc kubenswrapper[5004]: I1201 08:23:26.844030 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0286c29-4a56-4e46-8820-21dfbc658c86" containerName="extract-utilities" Dec 01 08:23:26 crc kubenswrapper[5004]: E1201 08:23:26.844040 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56a0aeb3-484c-4e7e-9ff7-b3bc75f253f4" containerName="registry-server" Dec 01 08:23:26 crc kubenswrapper[5004]: I1201 08:23:26.844050 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="56a0aeb3-484c-4e7e-9ff7-b3bc75f253f4" containerName="registry-server" Dec 01 08:23:26 crc kubenswrapper[5004]: E1201 08:23:26.844069 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f079082-1b6f-4919-8d2c-c640f30de417" containerName="extract-utilities" Dec 01 08:23:26 crc kubenswrapper[5004]: I1201 08:23:26.844082 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f079082-1b6f-4919-8d2c-c640f30de417" containerName="extract-utilities" Dec 01 08:23:26 crc kubenswrapper[5004]: E1201 08:23:26.844096 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56a0aeb3-484c-4e7e-9ff7-b3bc75f253f4" containerName="extract-utilities" Dec 01 08:23:26 crc kubenswrapper[5004]: I1201 08:23:26.844105 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="56a0aeb3-484c-4e7e-9ff7-b3bc75f253f4" containerName="extract-utilities" Dec 01 08:23:26 crc kubenswrapper[5004]: E1201 08:23:26.844115 5004 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="8a9f98dc-e84b-4fb8-9d4d-69c766486ebb" containerName="marketplace-operator" Dec 01 08:23:26 crc kubenswrapper[5004]: I1201 08:23:26.844123 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a9f98dc-e84b-4fb8-9d4d-69c766486ebb" containerName="marketplace-operator" Dec 01 08:23:26 crc kubenswrapper[5004]: E1201 08:23:26.844132 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19da9663-9e98-41f0-a737-0c2683293496" containerName="registry-server" Dec 01 08:23:26 crc kubenswrapper[5004]: I1201 08:23:26.844140 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="19da9663-9e98-41f0-a737-0c2683293496" containerName="registry-server" Dec 01 08:23:26 crc kubenswrapper[5004]: E1201 08:23:26.844155 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0286c29-4a56-4e46-8820-21dfbc658c86" containerName="registry-server" Dec 01 08:23:26 crc kubenswrapper[5004]: I1201 08:23:26.844165 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0286c29-4a56-4e46-8820-21dfbc658c86" containerName="registry-server" Dec 01 08:23:26 crc kubenswrapper[5004]: E1201 08:23:26.844183 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19da9663-9e98-41f0-a737-0c2683293496" containerName="extract-utilities" Dec 01 08:23:26 crc kubenswrapper[5004]: I1201 08:23:26.844193 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="19da9663-9e98-41f0-a737-0c2683293496" containerName="extract-utilities" Dec 01 08:23:26 crc kubenswrapper[5004]: E1201 08:23:26.844205 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0286c29-4a56-4e46-8820-21dfbc658c86" containerName="extract-content" Dec 01 08:23:26 crc kubenswrapper[5004]: I1201 08:23:26.844212 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0286c29-4a56-4e46-8820-21dfbc658c86" containerName="extract-content" Dec 01 08:23:26 crc kubenswrapper[5004]: E1201 08:23:26.844225 5004 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="19da9663-9e98-41f0-a737-0c2683293496" containerName="extract-content" Dec 01 08:23:26 crc kubenswrapper[5004]: I1201 08:23:26.844234 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="19da9663-9e98-41f0-a737-0c2683293496" containerName="extract-content" Dec 01 08:23:26 crc kubenswrapper[5004]: E1201 08:23:26.844253 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a9f98dc-e84b-4fb8-9d4d-69c766486ebb" containerName="marketplace-operator" Dec 01 08:23:26 crc kubenswrapper[5004]: I1201 08:23:26.844265 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a9f98dc-e84b-4fb8-9d4d-69c766486ebb" containerName="marketplace-operator" Dec 01 08:23:26 crc kubenswrapper[5004]: E1201 08:23:26.844282 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f079082-1b6f-4919-8d2c-c640f30de417" containerName="extract-content" Dec 01 08:23:26 crc kubenswrapper[5004]: I1201 08:23:26.844292 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f079082-1b6f-4919-8d2c-c640f30de417" containerName="extract-content" Dec 01 08:23:26 crc kubenswrapper[5004]: E1201 08:23:26.844305 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f079082-1b6f-4919-8d2c-c640f30de417" containerName="registry-server" Dec 01 08:23:26 crc kubenswrapper[5004]: I1201 08:23:26.844313 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f079082-1b6f-4919-8d2c-c640f30de417" containerName="registry-server" Dec 01 08:23:26 crc kubenswrapper[5004]: E1201 08:23:26.844324 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56a0aeb3-484c-4e7e-9ff7-b3bc75f253f4" containerName="extract-content" Dec 01 08:23:26 crc kubenswrapper[5004]: I1201 08:23:26.844332 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="56a0aeb3-484c-4e7e-9ff7-b3bc75f253f4" containerName="extract-content" Dec 01 08:23:26 crc kubenswrapper[5004]: I1201 08:23:26.844463 5004 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="8a9f98dc-e84b-4fb8-9d4d-69c766486ebb" containerName="marketplace-operator" Dec 01 08:23:26 crc kubenswrapper[5004]: I1201 08:23:26.844479 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="56a0aeb3-484c-4e7e-9ff7-b3bc75f253f4" containerName="registry-server" Dec 01 08:23:26 crc kubenswrapper[5004]: I1201 08:23:26.844495 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="19da9663-9e98-41f0-a737-0c2683293496" containerName="registry-server" Dec 01 08:23:26 crc kubenswrapper[5004]: I1201 08:23:26.844504 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0286c29-4a56-4e46-8820-21dfbc658c86" containerName="registry-server" Dec 01 08:23:26 crc kubenswrapper[5004]: I1201 08:23:26.844528 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f079082-1b6f-4919-8d2c-c640f30de417" containerName="registry-server" Dec 01 08:23:26 crc kubenswrapper[5004]: I1201 08:23:26.844824 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a9f98dc-e84b-4fb8-9d4d-69c766486ebb" containerName="marketplace-operator" Dec 01 08:23:26 crc kubenswrapper[5004]: I1201 08:23:26.845647 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zsqrg" Dec 01 08:23:26 crc kubenswrapper[5004]: I1201 08:23:26.849967 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 01 08:23:26 crc kubenswrapper[5004]: I1201 08:23:26.858390 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zsqrg"] Dec 01 08:23:27 crc kubenswrapper[5004]: I1201 08:23:27.009821 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/14df59e3-a048-40c2-9400-9accbd0badd7-utilities\") pod \"certified-operators-zsqrg\" (UID: \"14df59e3-a048-40c2-9400-9accbd0badd7\") " pod="openshift-marketplace/certified-operators-zsqrg" Dec 01 08:23:27 crc kubenswrapper[5004]: I1201 08:23:27.009925 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/14df59e3-a048-40c2-9400-9accbd0badd7-catalog-content\") pod \"certified-operators-zsqrg\" (UID: \"14df59e3-a048-40c2-9400-9accbd0badd7\") " pod="openshift-marketplace/certified-operators-zsqrg" Dec 01 08:23:27 crc kubenswrapper[5004]: I1201 08:23:27.009979 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8j6kl\" (UniqueName: \"kubernetes.io/projected/14df59e3-a048-40c2-9400-9accbd0badd7-kube-api-access-8j6kl\") pod \"certified-operators-zsqrg\" (UID: \"14df59e3-a048-40c2-9400-9accbd0badd7\") " pod="openshift-marketplace/certified-operators-zsqrg" Dec 01 08:23:27 crc kubenswrapper[5004]: I1201 08:23:27.101667 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-5ffg4" 
event={"ID":"2370cb7e-e860-40eb-a3a2-0a711f1e05b1","Type":"ContainerStarted","Data":"b1faf556f7d90f66710b321661bfe48755d0eb162b60ba71844dede8d3877a77"} Dec 01 08:23:27 crc kubenswrapper[5004]: I1201 08:23:27.101708 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-5ffg4" event={"ID":"2370cb7e-e860-40eb-a3a2-0a711f1e05b1","Type":"ContainerStarted","Data":"8c504540267060972957df630c514f5b568733a57e54157a9fa108b8b8fe5938"} Dec 01 08:23:27 crc kubenswrapper[5004]: I1201 08:23:27.102620 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-5ffg4" Dec 01 08:23:27 crc kubenswrapper[5004]: I1201 08:23:27.108212 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-5ffg4" Dec 01 08:23:27 crc kubenswrapper[5004]: I1201 08:23:27.110950 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/14df59e3-a048-40c2-9400-9accbd0badd7-catalog-content\") pod \"certified-operators-zsqrg\" (UID: \"14df59e3-a048-40c2-9400-9accbd0badd7\") " pod="openshift-marketplace/certified-operators-zsqrg" Dec 01 08:23:27 crc kubenswrapper[5004]: I1201 08:23:27.111020 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8j6kl\" (UniqueName: \"kubernetes.io/projected/14df59e3-a048-40c2-9400-9accbd0badd7-kube-api-access-8j6kl\") pod \"certified-operators-zsqrg\" (UID: \"14df59e3-a048-40c2-9400-9accbd0badd7\") " pod="openshift-marketplace/certified-operators-zsqrg" Dec 01 08:23:27 crc kubenswrapper[5004]: I1201 08:23:27.111072 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/14df59e3-a048-40c2-9400-9accbd0badd7-utilities\") pod \"certified-operators-zsqrg\" (UID: 
\"14df59e3-a048-40c2-9400-9accbd0badd7\") " pod="openshift-marketplace/certified-operators-zsqrg" Dec 01 08:23:27 crc kubenswrapper[5004]: I1201 08:23:27.111364 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/14df59e3-a048-40c2-9400-9accbd0badd7-catalog-content\") pod \"certified-operators-zsqrg\" (UID: \"14df59e3-a048-40c2-9400-9accbd0badd7\") " pod="openshift-marketplace/certified-operators-zsqrg" Dec 01 08:23:27 crc kubenswrapper[5004]: I1201 08:23:27.111525 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/14df59e3-a048-40c2-9400-9accbd0badd7-utilities\") pod \"certified-operators-zsqrg\" (UID: \"14df59e3-a048-40c2-9400-9accbd0badd7\") " pod="openshift-marketplace/certified-operators-zsqrg" Dec 01 08:23:27 crc kubenswrapper[5004]: I1201 08:23:27.123300 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-5ffg4" podStartSLOduration=2.123193782 podStartE2EDuration="2.123193782s" podCreationTimestamp="2025-12-01 08:23:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:23:27.117707031 +0000 UTC m=+384.682699063" watchObservedRunningTime="2025-12-01 08:23:27.123193782 +0000 UTC m=+384.688185834" Dec 01 08:23:27 crc kubenswrapper[5004]: I1201 08:23:27.131970 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8j6kl\" (UniqueName: \"kubernetes.io/projected/14df59e3-a048-40c2-9400-9accbd0badd7-kube-api-access-8j6kl\") pod \"certified-operators-zsqrg\" (UID: \"14df59e3-a048-40c2-9400-9accbd0badd7\") " pod="openshift-marketplace/certified-operators-zsqrg" Dec 01 08:23:27 crc kubenswrapper[5004]: I1201 08:23:27.165436 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zsqrg" Dec 01 08:23:27 crc kubenswrapper[5004]: I1201 08:23:27.554868 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zsqrg"] Dec 01 08:23:27 crc kubenswrapper[5004]: W1201 08:23:27.560968 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod14df59e3_a048_40c2_9400_9accbd0badd7.slice/crio-1de3802aa2e66c35efd66e019d489bc237bd9eb1c1c70cacf769ae38ecb2296b WatchSource:0}: Error finding container 1de3802aa2e66c35efd66e019d489bc237bd9eb1c1c70cacf769ae38ecb2296b: Status 404 returned error can't find the container with id 1de3802aa2e66c35efd66e019d489bc237bd9eb1c1c70cacf769ae38ecb2296b Dec 01 08:23:27 crc kubenswrapper[5004]: I1201 08:23:27.814258 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-kwt85"] Dec 01 08:23:27 crc kubenswrapper[5004]: I1201 08:23:27.815365 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kwt85" Dec 01 08:23:27 crc kubenswrapper[5004]: I1201 08:23:27.818730 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 01 08:23:27 crc kubenswrapper[5004]: I1201 08:23:27.823598 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kwt85"] Dec 01 08:23:27 crc kubenswrapper[5004]: I1201 08:23:27.922190 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bw7c\" (UniqueName: \"kubernetes.io/projected/408c336f-4cb7-4ebd-80c3-53bf49c6b884-kube-api-access-9bw7c\") pod \"redhat-marketplace-kwt85\" (UID: \"408c336f-4cb7-4ebd-80c3-53bf49c6b884\") " pod="openshift-marketplace/redhat-marketplace-kwt85" Dec 01 08:23:27 crc kubenswrapper[5004]: I1201 08:23:27.922259 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/408c336f-4cb7-4ebd-80c3-53bf49c6b884-utilities\") pod \"redhat-marketplace-kwt85\" (UID: \"408c336f-4cb7-4ebd-80c3-53bf49c6b884\") " pod="openshift-marketplace/redhat-marketplace-kwt85" Dec 01 08:23:27 crc kubenswrapper[5004]: I1201 08:23:27.922299 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/408c336f-4cb7-4ebd-80c3-53bf49c6b884-catalog-content\") pod \"redhat-marketplace-kwt85\" (UID: \"408c336f-4cb7-4ebd-80c3-53bf49c6b884\") " pod="openshift-marketplace/redhat-marketplace-kwt85" Dec 01 08:23:28 crc kubenswrapper[5004]: I1201 08:23:28.023210 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9bw7c\" (UniqueName: \"kubernetes.io/projected/408c336f-4cb7-4ebd-80c3-53bf49c6b884-kube-api-access-9bw7c\") pod \"redhat-marketplace-kwt85\" (UID: 
\"408c336f-4cb7-4ebd-80c3-53bf49c6b884\") " pod="openshift-marketplace/redhat-marketplace-kwt85" Dec 01 08:23:28 crc kubenswrapper[5004]: I1201 08:23:28.023281 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/408c336f-4cb7-4ebd-80c3-53bf49c6b884-utilities\") pod \"redhat-marketplace-kwt85\" (UID: \"408c336f-4cb7-4ebd-80c3-53bf49c6b884\") " pod="openshift-marketplace/redhat-marketplace-kwt85" Dec 01 08:23:28 crc kubenswrapper[5004]: I1201 08:23:28.023308 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/408c336f-4cb7-4ebd-80c3-53bf49c6b884-catalog-content\") pod \"redhat-marketplace-kwt85\" (UID: \"408c336f-4cb7-4ebd-80c3-53bf49c6b884\") " pod="openshift-marketplace/redhat-marketplace-kwt85" Dec 01 08:23:28 crc kubenswrapper[5004]: I1201 08:23:28.024007 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/408c336f-4cb7-4ebd-80c3-53bf49c6b884-catalog-content\") pod \"redhat-marketplace-kwt85\" (UID: \"408c336f-4cb7-4ebd-80c3-53bf49c6b884\") " pod="openshift-marketplace/redhat-marketplace-kwt85" Dec 01 08:23:28 crc kubenswrapper[5004]: I1201 08:23:28.024198 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/408c336f-4cb7-4ebd-80c3-53bf49c6b884-utilities\") pod \"redhat-marketplace-kwt85\" (UID: \"408c336f-4cb7-4ebd-80c3-53bf49c6b884\") " pod="openshift-marketplace/redhat-marketplace-kwt85" Dec 01 08:23:28 crc kubenswrapper[5004]: I1201 08:23:28.047702 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bw7c\" (UniqueName: \"kubernetes.io/projected/408c336f-4cb7-4ebd-80c3-53bf49c6b884-kube-api-access-9bw7c\") pod \"redhat-marketplace-kwt85\" (UID: \"408c336f-4cb7-4ebd-80c3-53bf49c6b884\") " 
pod="openshift-marketplace/redhat-marketplace-kwt85" Dec 01 08:23:28 crc kubenswrapper[5004]: I1201 08:23:28.110266 5004 generic.go:334] "Generic (PLEG): container finished" podID="14df59e3-a048-40c2-9400-9accbd0badd7" containerID="1fb4b9b8d86574d95ac09867b57d653a82e6a3786753c6cf6a45501553879e60" exitCode=0 Dec 01 08:23:28 crc kubenswrapper[5004]: I1201 08:23:28.110367 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zsqrg" event={"ID":"14df59e3-a048-40c2-9400-9accbd0badd7","Type":"ContainerDied","Data":"1fb4b9b8d86574d95ac09867b57d653a82e6a3786753c6cf6a45501553879e60"} Dec 01 08:23:28 crc kubenswrapper[5004]: I1201 08:23:28.110416 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zsqrg" event={"ID":"14df59e3-a048-40c2-9400-9accbd0badd7","Type":"ContainerStarted","Data":"1de3802aa2e66c35efd66e019d489bc237bd9eb1c1c70cacf769ae38ecb2296b"} Dec 01 08:23:28 crc kubenswrapper[5004]: I1201 08:23:28.189038 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kwt85" Dec 01 08:23:28 crc kubenswrapper[5004]: I1201 08:23:28.610743 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kwt85"] Dec 01 08:23:29 crc kubenswrapper[5004]: I1201 08:23:29.117689 5004 generic.go:334] "Generic (PLEG): container finished" podID="408c336f-4cb7-4ebd-80c3-53bf49c6b884" containerID="f60e7aaddc256c85cb62478086cae23530ecac84a49801e998c2fd1868ce1429" exitCode=0 Dec 01 08:23:29 crc kubenswrapper[5004]: I1201 08:23:29.117875 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kwt85" event={"ID":"408c336f-4cb7-4ebd-80c3-53bf49c6b884","Type":"ContainerDied","Data":"f60e7aaddc256c85cb62478086cae23530ecac84a49801e998c2fd1868ce1429"} Dec 01 08:23:29 crc kubenswrapper[5004]: I1201 08:23:29.118099 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kwt85" event={"ID":"408c336f-4cb7-4ebd-80c3-53bf49c6b884","Type":"ContainerStarted","Data":"36ba0cd6f327fc02098dd9a9cbb7b52d1327c3e3ae6ba106464ee9a553f6e28a"} Dec 01 08:23:29 crc kubenswrapper[5004]: I1201 08:23:29.121536 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zsqrg" event={"ID":"14df59e3-a048-40c2-9400-9accbd0badd7","Type":"ContainerStarted","Data":"ed5dd56fcbca0522c7582906cd70075a8dba38ccf161b6fb328687d5bd484ef8"} Dec 01 08:23:29 crc kubenswrapper[5004]: I1201 08:23:29.211372 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-mrv6v"] Dec 01 08:23:29 crc kubenswrapper[5004]: I1201 08:23:29.212944 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-mrv6v" Dec 01 08:23:29 crc kubenswrapper[5004]: I1201 08:23:29.214341 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 01 08:23:29 crc kubenswrapper[5004]: I1201 08:23:29.216672 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mrv6v"] Dec 01 08:23:29 crc kubenswrapper[5004]: I1201 08:23:29.252804 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gbr6\" (UniqueName: \"kubernetes.io/projected/de6dd90c-9ef5-4754-8979-7c4efaf00386-kube-api-access-4gbr6\") pod \"redhat-operators-mrv6v\" (UID: \"de6dd90c-9ef5-4754-8979-7c4efaf00386\") " pod="openshift-marketplace/redhat-operators-mrv6v" Dec 01 08:23:29 crc kubenswrapper[5004]: I1201 08:23:29.252853 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de6dd90c-9ef5-4754-8979-7c4efaf00386-utilities\") pod \"redhat-operators-mrv6v\" (UID: \"de6dd90c-9ef5-4754-8979-7c4efaf00386\") " pod="openshift-marketplace/redhat-operators-mrv6v" Dec 01 08:23:29 crc kubenswrapper[5004]: I1201 08:23:29.252907 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de6dd90c-9ef5-4754-8979-7c4efaf00386-catalog-content\") pod \"redhat-operators-mrv6v\" (UID: \"de6dd90c-9ef5-4754-8979-7c4efaf00386\") " pod="openshift-marketplace/redhat-operators-mrv6v" Dec 01 08:23:29 crc kubenswrapper[5004]: I1201 08:23:29.353725 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4gbr6\" (UniqueName: \"kubernetes.io/projected/de6dd90c-9ef5-4754-8979-7c4efaf00386-kube-api-access-4gbr6\") pod \"redhat-operators-mrv6v\" (UID: 
\"de6dd90c-9ef5-4754-8979-7c4efaf00386\") " pod="openshift-marketplace/redhat-operators-mrv6v" Dec 01 08:23:29 crc kubenswrapper[5004]: I1201 08:23:29.353793 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de6dd90c-9ef5-4754-8979-7c4efaf00386-utilities\") pod \"redhat-operators-mrv6v\" (UID: \"de6dd90c-9ef5-4754-8979-7c4efaf00386\") " pod="openshift-marketplace/redhat-operators-mrv6v" Dec 01 08:23:29 crc kubenswrapper[5004]: I1201 08:23:29.353848 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de6dd90c-9ef5-4754-8979-7c4efaf00386-catalog-content\") pod \"redhat-operators-mrv6v\" (UID: \"de6dd90c-9ef5-4754-8979-7c4efaf00386\") " pod="openshift-marketplace/redhat-operators-mrv6v" Dec 01 08:23:29 crc kubenswrapper[5004]: I1201 08:23:29.354610 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de6dd90c-9ef5-4754-8979-7c4efaf00386-catalog-content\") pod \"redhat-operators-mrv6v\" (UID: \"de6dd90c-9ef5-4754-8979-7c4efaf00386\") " pod="openshift-marketplace/redhat-operators-mrv6v" Dec 01 08:23:29 crc kubenswrapper[5004]: I1201 08:23:29.355762 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de6dd90c-9ef5-4754-8979-7c4efaf00386-utilities\") pod \"redhat-operators-mrv6v\" (UID: \"de6dd90c-9ef5-4754-8979-7c4efaf00386\") " pod="openshift-marketplace/redhat-operators-mrv6v" Dec 01 08:23:29 crc kubenswrapper[5004]: I1201 08:23:29.376698 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gbr6\" (UniqueName: \"kubernetes.io/projected/de6dd90c-9ef5-4754-8979-7c4efaf00386-kube-api-access-4gbr6\") pod \"redhat-operators-mrv6v\" (UID: \"de6dd90c-9ef5-4754-8979-7c4efaf00386\") " 
pod="openshift-marketplace/redhat-operators-mrv6v" Dec 01 08:23:29 crc kubenswrapper[5004]: I1201 08:23:29.584876 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mrv6v" Dec 01 08:23:30 crc kubenswrapper[5004]: I1201 08:23:30.018123 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mrv6v"] Dec 01 08:23:30 crc kubenswrapper[5004]: I1201 08:23:30.139079 5004 generic.go:334] "Generic (PLEG): container finished" podID="14df59e3-a048-40c2-9400-9accbd0badd7" containerID="ed5dd56fcbca0522c7582906cd70075a8dba38ccf161b6fb328687d5bd484ef8" exitCode=0 Dec 01 08:23:30 crc kubenswrapper[5004]: I1201 08:23:30.139191 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zsqrg" event={"ID":"14df59e3-a048-40c2-9400-9accbd0badd7","Type":"ContainerDied","Data":"ed5dd56fcbca0522c7582906cd70075a8dba38ccf161b6fb328687d5bd484ef8"} Dec 01 08:23:30 crc kubenswrapper[5004]: I1201 08:23:30.143991 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mrv6v" event={"ID":"de6dd90c-9ef5-4754-8979-7c4efaf00386","Type":"ContainerStarted","Data":"6f198c91cee3005c2107c1cc39e1949efa5e5d55b945caebc0783eee5f64459d"} Dec 01 08:23:30 crc kubenswrapper[5004]: I1201 08:23:30.144237 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mrv6v" event={"ID":"de6dd90c-9ef5-4754-8979-7c4efaf00386","Type":"ContainerStarted","Data":"12957fa2fe9b7bb8cf3ab23a3740cb786aea61ab5057d8a9cfa8416e82327015"} Dec 01 08:23:30 crc kubenswrapper[5004]: I1201 08:23:30.149004 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kwt85" event={"ID":"408c336f-4cb7-4ebd-80c3-53bf49c6b884","Type":"ContainerStarted","Data":"4a9e0624febcd3311616df278e5c08c3a1e55ddf37a60ecaf030f61ccf5c05ba"} Dec 01 08:23:30 crc kubenswrapper[5004]: I1201 
08:23:30.214398 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-lzmgh"] Dec 01 08:23:30 crc kubenswrapper[5004]: I1201 08:23:30.216023 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lzmgh" Dec 01 08:23:30 crc kubenswrapper[5004]: I1201 08:23:30.218705 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 01 08:23:30 crc kubenswrapper[5004]: I1201 08:23:30.224868 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lzmgh"] Dec 01 08:23:30 crc kubenswrapper[5004]: I1201 08:23:30.265847 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nk2k\" (UniqueName: \"kubernetes.io/projected/df757840-7c38-4de3-829b-759182d9c96d-kube-api-access-9nk2k\") pod \"community-operators-lzmgh\" (UID: \"df757840-7c38-4de3-829b-759182d9c96d\") " pod="openshift-marketplace/community-operators-lzmgh" Dec 01 08:23:30 crc kubenswrapper[5004]: I1201 08:23:30.265897 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df757840-7c38-4de3-829b-759182d9c96d-catalog-content\") pod \"community-operators-lzmgh\" (UID: \"df757840-7c38-4de3-829b-759182d9c96d\") " pod="openshift-marketplace/community-operators-lzmgh" Dec 01 08:23:30 crc kubenswrapper[5004]: I1201 08:23:30.265956 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df757840-7c38-4de3-829b-759182d9c96d-utilities\") pod \"community-operators-lzmgh\" (UID: \"df757840-7c38-4de3-829b-759182d9c96d\") " pod="openshift-marketplace/community-operators-lzmgh" Dec 01 08:23:30 crc kubenswrapper[5004]: I1201 08:23:30.367022 5004 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df757840-7c38-4de3-829b-759182d9c96d-utilities\") pod \"community-operators-lzmgh\" (UID: \"df757840-7c38-4de3-829b-759182d9c96d\") " pod="openshift-marketplace/community-operators-lzmgh" Dec 01 08:23:30 crc kubenswrapper[5004]: I1201 08:23:30.367125 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9nk2k\" (UniqueName: \"kubernetes.io/projected/df757840-7c38-4de3-829b-759182d9c96d-kube-api-access-9nk2k\") pod \"community-operators-lzmgh\" (UID: \"df757840-7c38-4de3-829b-759182d9c96d\") " pod="openshift-marketplace/community-operators-lzmgh" Dec 01 08:23:30 crc kubenswrapper[5004]: I1201 08:23:30.367184 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df757840-7c38-4de3-829b-759182d9c96d-catalog-content\") pod \"community-operators-lzmgh\" (UID: \"df757840-7c38-4de3-829b-759182d9c96d\") " pod="openshift-marketplace/community-operators-lzmgh" Dec 01 08:23:30 crc kubenswrapper[5004]: I1201 08:23:30.367585 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df757840-7c38-4de3-829b-759182d9c96d-utilities\") pod \"community-operators-lzmgh\" (UID: \"df757840-7c38-4de3-829b-759182d9c96d\") " pod="openshift-marketplace/community-operators-lzmgh" Dec 01 08:23:30 crc kubenswrapper[5004]: I1201 08:23:30.367811 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df757840-7c38-4de3-829b-759182d9c96d-catalog-content\") pod \"community-operators-lzmgh\" (UID: \"df757840-7c38-4de3-829b-759182d9c96d\") " pod="openshift-marketplace/community-operators-lzmgh" Dec 01 08:23:30 crc kubenswrapper[5004]: I1201 08:23:30.404297 5004 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-9nk2k\" (UniqueName: \"kubernetes.io/projected/df757840-7c38-4de3-829b-759182d9c96d-kube-api-access-9nk2k\") pod \"community-operators-lzmgh\" (UID: \"df757840-7c38-4de3-829b-759182d9c96d\") " pod="openshift-marketplace/community-operators-lzmgh" Dec 01 08:23:30 crc kubenswrapper[5004]: I1201 08:23:30.582356 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lzmgh" Dec 01 08:23:30 crc kubenswrapper[5004]: I1201 08:23:30.992131 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lzmgh"] Dec 01 08:23:31 crc kubenswrapper[5004]: I1201 08:23:31.156868 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lzmgh" event={"ID":"df757840-7c38-4de3-829b-759182d9c96d","Type":"ContainerStarted","Data":"d8107835040e63c1bba04c5b9332fed22616c141a47453aa447fa8aa99a06177"} Dec 01 08:23:31 crc kubenswrapper[5004]: I1201 08:23:31.156920 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lzmgh" event={"ID":"df757840-7c38-4de3-829b-759182d9c96d","Type":"ContainerStarted","Data":"87ced3db20294ef1bfcc68fec4c0abdbddf46a0b0d06dd530137bc0c2cb75cda"} Dec 01 08:23:31 crc kubenswrapper[5004]: I1201 08:23:31.160133 5004 generic.go:334] "Generic (PLEG): container finished" podID="408c336f-4cb7-4ebd-80c3-53bf49c6b884" containerID="4a9e0624febcd3311616df278e5c08c3a1e55ddf37a60ecaf030f61ccf5c05ba" exitCode=0 Dec 01 08:23:31 crc kubenswrapper[5004]: I1201 08:23:31.160229 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kwt85" event={"ID":"408c336f-4cb7-4ebd-80c3-53bf49c6b884","Type":"ContainerDied","Data":"4a9e0624febcd3311616df278e5c08c3a1e55ddf37a60ecaf030f61ccf5c05ba"} Dec 01 08:23:31 crc kubenswrapper[5004]: I1201 08:23:31.162933 5004 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/certified-operators-zsqrg" event={"ID":"14df59e3-a048-40c2-9400-9accbd0badd7","Type":"ContainerStarted","Data":"cc6173be1c2e49fe596839dcc6de799fe661766631b9208608a00e8b44be24ef"} Dec 01 08:23:31 crc kubenswrapper[5004]: I1201 08:23:31.164818 5004 generic.go:334] "Generic (PLEG): container finished" podID="de6dd90c-9ef5-4754-8979-7c4efaf00386" containerID="6f198c91cee3005c2107c1cc39e1949efa5e5d55b945caebc0783eee5f64459d" exitCode=0 Dec 01 08:23:31 crc kubenswrapper[5004]: I1201 08:23:31.164864 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mrv6v" event={"ID":"de6dd90c-9ef5-4754-8979-7c4efaf00386","Type":"ContainerDied","Data":"6f198c91cee3005c2107c1cc39e1949efa5e5d55b945caebc0783eee5f64459d"} Dec 01 08:23:31 crc kubenswrapper[5004]: I1201 08:23:31.254366 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-zsqrg" podStartSLOduration=2.692595309 podStartE2EDuration="5.254336916s" podCreationTimestamp="2025-12-01 08:23:26 +0000 UTC" firstStartedPulling="2025-12-01 08:23:28.112022054 +0000 UTC m=+385.677014076" lastFinishedPulling="2025-12-01 08:23:30.673763711 +0000 UTC m=+388.238755683" observedRunningTime="2025-12-01 08:23:31.247271843 +0000 UTC m=+388.812263835" watchObservedRunningTime="2025-12-01 08:23:31.254336916 +0000 UTC m=+388.819328928" Dec 01 08:23:32 crc kubenswrapper[5004]: I1201 08:23:32.170863 5004 generic.go:334] "Generic (PLEG): container finished" podID="df757840-7c38-4de3-829b-759182d9c96d" containerID="d8107835040e63c1bba04c5b9332fed22616c141a47453aa447fa8aa99a06177" exitCode=0 Dec 01 08:23:32 crc kubenswrapper[5004]: I1201 08:23:32.171195 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lzmgh" 
event={"ID":"df757840-7c38-4de3-829b-759182d9c96d","Type":"ContainerDied","Data":"d8107835040e63c1bba04c5b9332fed22616c141a47453aa447fa8aa99a06177"} Dec 01 08:23:32 crc kubenswrapper[5004]: I1201 08:23:32.175849 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kwt85" event={"ID":"408c336f-4cb7-4ebd-80c3-53bf49c6b884","Type":"ContainerStarted","Data":"2830b496df12f2614f3d7f277f0edbc97980264f3962c0aad6c2169410d4000d"} Dec 01 08:23:32 crc kubenswrapper[5004]: I1201 08:23:32.178889 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mrv6v" event={"ID":"de6dd90c-9ef5-4754-8979-7c4efaf00386","Type":"ContainerStarted","Data":"0f2c1c05059774506536e55da080ecd4f13ef4505b8b01a641d1c41a5878533d"} Dec 01 08:23:33 crc kubenswrapper[5004]: I1201 08:23:33.185715 5004 generic.go:334] "Generic (PLEG): container finished" podID="de6dd90c-9ef5-4754-8979-7c4efaf00386" containerID="0f2c1c05059774506536e55da080ecd4f13ef4505b8b01a641d1c41a5878533d" exitCode=0 Dec 01 08:23:33 crc kubenswrapper[5004]: I1201 08:23:33.185784 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mrv6v" event={"ID":"de6dd90c-9ef5-4754-8979-7c4efaf00386","Type":"ContainerDied","Data":"0f2c1c05059774506536e55da080ecd4f13ef4505b8b01a641d1c41a5878533d"} Dec 01 08:23:33 crc kubenswrapper[5004]: I1201 08:23:33.202503 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-kwt85" podStartSLOduration=3.463076068 podStartE2EDuration="6.202481734s" podCreationTimestamp="2025-12-01 08:23:27 +0000 UTC" firstStartedPulling="2025-12-01 08:23:29.122136945 +0000 UTC m=+386.687128927" lastFinishedPulling="2025-12-01 08:23:31.861542611 +0000 UTC m=+389.426534593" observedRunningTime="2025-12-01 08:23:32.233112157 +0000 UTC m=+389.798104149" watchObservedRunningTime="2025-12-01 08:23:33.202481734 +0000 UTC m=+390.767473726" 
Dec 01 08:23:34 crc kubenswrapper[5004]: I1201 08:23:34.194758 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mrv6v" event={"ID":"de6dd90c-9ef5-4754-8979-7c4efaf00386","Type":"ContainerStarted","Data":"c13c42ffcda8fe14761e34915935c6dd1ef7244d8e923ad5fd0132a9255e3a16"} Dec 01 08:23:34 crc kubenswrapper[5004]: I1201 08:23:34.197638 5004 generic.go:334] "Generic (PLEG): container finished" podID="df757840-7c38-4de3-829b-759182d9c96d" containerID="5001cfd699548e88f27495bae9358b98a0dacac724da0da5679228e94705f0e6" exitCode=0 Dec 01 08:23:34 crc kubenswrapper[5004]: I1201 08:23:34.197683 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lzmgh" event={"ID":"df757840-7c38-4de3-829b-759182d9c96d","Type":"ContainerDied","Data":"5001cfd699548e88f27495bae9358b98a0dacac724da0da5679228e94705f0e6"} Dec 01 08:23:34 crc kubenswrapper[5004]: I1201 08:23:34.210257 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-mrv6v" podStartSLOduration=2.507116016 podStartE2EDuration="5.210238442s" podCreationTimestamp="2025-12-01 08:23:29 +0000 UTC" firstStartedPulling="2025-12-01 08:23:31.165911468 +0000 UTC m=+388.730903450" lastFinishedPulling="2025-12-01 08:23:33.869033874 +0000 UTC m=+391.434025876" observedRunningTime="2025-12-01 08:23:34.207665736 +0000 UTC m=+391.772657748" watchObservedRunningTime="2025-12-01 08:23:34.210238442 +0000 UTC m=+391.775230424" Dec 01 08:23:36 crc kubenswrapper[5004]: I1201 08:23:36.209499 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lzmgh" event={"ID":"df757840-7c38-4de3-829b-759182d9c96d","Type":"ContainerStarted","Data":"782332edccb146590f6c58fc9817f1c50c21ec328fdd227aeb22e7e514d4be6e"} Dec 01 08:23:36 crc kubenswrapper[5004]: I1201 08:23:36.228265 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/community-operators-lzmgh" podStartSLOduration=3.362205097 podStartE2EDuration="6.228241746s" podCreationTimestamp="2025-12-01 08:23:30 +0000 UTC" firstStartedPulling="2025-12-01 08:23:32.172858568 +0000 UTC m=+389.737850550" lastFinishedPulling="2025-12-01 08:23:35.038895217 +0000 UTC m=+392.603887199" observedRunningTime="2025-12-01 08:23:36.223815563 +0000 UTC m=+393.788807555" watchObservedRunningTime="2025-12-01 08:23:36.228241746 +0000 UTC m=+393.793233768" Dec 01 08:23:37 crc kubenswrapper[5004]: I1201 08:23:37.166664 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-zsqrg" Dec 01 08:23:37 crc kubenswrapper[5004]: I1201 08:23:37.166965 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-zsqrg" Dec 01 08:23:37 crc kubenswrapper[5004]: I1201 08:23:37.225597 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-zsqrg" Dec 01 08:23:37 crc kubenswrapper[5004]: I1201 08:23:37.272725 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-zsqrg" Dec 01 08:23:38 crc kubenswrapper[5004]: I1201 08:23:38.189755 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-kwt85" Dec 01 08:23:38 crc kubenswrapper[5004]: I1201 08:23:38.189812 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-kwt85" Dec 01 08:23:38 crc kubenswrapper[5004]: I1201 08:23:38.256262 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-kwt85" Dec 01 08:23:38 crc kubenswrapper[5004]: I1201 08:23:38.302275 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-marketplace-kwt85" Dec 01 08:23:38 crc kubenswrapper[5004]: I1201 08:23:38.729708 5004 patch_prober.go:28] interesting pod/machine-config-daemon-fvdgt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 08:23:38 crc kubenswrapper[5004]: I1201 08:23:38.729835 5004 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 08:23:39 crc kubenswrapper[5004]: I1201 08:23:39.586123 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-mrv6v" Dec 01 08:23:39 crc kubenswrapper[5004]: I1201 08:23:39.586186 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-mrv6v" Dec 01 08:23:39 crc kubenswrapper[5004]: I1201 08:23:39.647655 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-mrv6v" Dec 01 08:23:40 crc kubenswrapper[5004]: I1201 08:23:40.282602 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-mrv6v" Dec 01 08:23:40 crc kubenswrapper[5004]: I1201 08:23:40.583514 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-lzmgh" Dec 01 08:23:40 crc kubenswrapper[5004]: I1201 08:23:40.583962 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-lzmgh" Dec 01 08:23:40 crc kubenswrapper[5004]: I1201 08:23:40.638241 5004 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-lzmgh" Dec 01 08:23:41 crc kubenswrapper[5004]: I1201 08:23:41.283222 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-lzmgh" Dec 01 08:23:42 crc kubenswrapper[5004]: I1201 08:23:42.140320 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-8jr7l" podUID="9c645213-a3fd-4f35-9edd-60905873a559" containerName="registry" containerID="cri-o://e54f7026de6305decc27f119174eb4e406e62b4b6ac16f898f07825ff5ef24ce" gracePeriod=30 Dec 01 08:23:42 crc kubenswrapper[5004]: I1201 08:23:42.586767 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-8jr7l" Dec 01 08:23:42 crc kubenswrapper[5004]: I1201 08:23:42.737401 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bv8b7\" (UniqueName: \"kubernetes.io/projected/9c645213-a3fd-4f35-9edd-60905873a559-kube-api-access-bv8b7\") pod \"9c645213-a3fd-4f35-9edd-60905873a559\" (UID: \"9c645213-a3fd-4f35-9edd-60905873a559\") " Dec 01 08:23:42 crc kubenswrapper[5004]: I1201 08:23:42.737459 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9c645213-a3fd-4f35-9edd-60905873a559-bound-sa-token\") pod \"9c645213-a3fd-4f35-9edd-60905873a559\" (UID: \"9c645213-a3fd-4f35-9edd-60905873a559\") " Dec 01 08:23:42 crc kubenswrapper[5004]: I1201 08:23:42.737609 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"9c645213-a3fd-4f35-9edd-60905873a559\" (UID: \"9c645213-a3fd-4f35-9edd-60905873a559\") " Dec 01 
08:23:42 crc kubenswrapper[5004]: I1201 08:23:42.737639 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9c645213-a3fd-4f35-9edd-60905873a559-registry-certificates\") pod \"9c645213-a3fd-4f35-9edd-60905873a559\" (UID: \"9c645213-a3fd-4f35-9edd-60905873a559\") " Dec 01 08:23:42 crc kubenswrapper[5004]: I1201 08:23:42.737658 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9c645213-a3fd-4f35-9edd-60905873a559-ca-trust-extracted\") pod \"9c645213-a3fd-4f35-9edd-60905873a559\" (UID: \"9c645213-a3fd-4f35-9edd-60905873a559\") " Dec 01 08:23:42 crc kubenswrapper[5004]: I1201 08:23:42.737682 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9c645213-a3fd-4f35-9edd-60905873a559-installation-pull-secrets\") pod \"9c645213-a3fd-4f35-9edd-60905873a559\" (UID: \"9c645213-a3fd-4f35-9edd-60905873a559\") " Dec 01 08:23:42 crc kubenswrapper[5004]: I1201 08:23:42.737711 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9c645213-a3fd-4f35-9edd-60905873a559-registry-tls\") pod \"9c645213-a3fd-4f35-9edd-60905873a559\" (UID: \"9c645213-a3fd-4f35-9edd-60905873a559\") " Dec 01 08:23:42 crc kubenswrapper[5004]: I1201 08:23:42.738371 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c645213-a3fd-4f35-9edd-60905873a559-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "9c645213-a3fd-4f35-9edd-60905873a559" (UID: "9c645213-a3fd-4f35-9edd-60905873a559"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:23:42 crc kubenswrapper[5004]: I1201 08:23:42.738436 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9c645213-a3fd-4f35-9edd-60905873a559-trusted-ca\") pod \"9c645213-a3fd-4f35-9edd-60905873a559\" (UID: \"9c645213-a3fd-4f35-9edd-60905873a559\") " Dec 01 08:23:42 crc kubenswrapper[5004]: I1201 08:23:42.738617 5004 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9c645213-a3fd-4f35-9edd-60905873a559-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 01 08:23:42 crc kubenswrapper[5004]: I1201 08:23:42.738923 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c645213-a3fd-4f35-9edd-60905873a559-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9c645213-a3fd-4f35-9edd-60905873a559" (UID: "9c645213-a3fd-4f35-9edd-60905873a559"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:23:42 crc kubenswrapper[5004]: I1201 08:23:42.748226 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c645213-a3fd-4f35-9edd-60905873a559-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "9c645213-a3fd-4f35-9edd-60905873a559" (UID: "9c645213-a3fd-4f35-9edd-60905873a559"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:23:42 crc kubenswrapper[5004]: I1201 08:23:42.758703 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9c645213-a3fd-4f35-9edd-60905873a559-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "9c645213-a3fd-4f35-9edd-60905873a559" (UID: "9c645213-a3fd-4f35-9edd-60905873a559"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 08:23:42 crc kubenswrapper[5004]: I1201 08:23:42.759303 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c645213-a3fd-4f35-9edd-60905873a559-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "9c645213-a3fd-4f35-9edd-60905873a559" (UID: "9c645213-a3fd-4f35-9edd-60905873a559"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:23:42 crc kubenswrapper[5004]: I1201 08:23:42.763748 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c645213-a3fd-4f35-9edd-60905873a559-kube-api-access-bv8b7" (OuterVolumeSpecName: "kube-api-access-bv8b7") pod "9c645213-a3fd-4f35-9edd-60905873a559" (UID: "9c645213-a3fd-4f35-9edd-60905873a559"). InnerVolumeSpecName "kube-api-access-bv8b7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:23:42 crc kubenswrapper[5004]: I1201 08:23:42.765811 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c645213-a3fd-4f35-9edd-60905873a559-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "9c645213-a3fd-4f35-9edd-60905873a559" (UID: "9c645213-a3fd-4f35-9edd-60905873a559"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:23:42 crc kubenswrapper[5004]: I1201 08:23:42.765985 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "9c645213-a3fd-4f35-9edd-60905873a559" (UID: "9c645213-a3fd-4f35-9edd-60905873a559"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 01 08:23:42 crc kubenswrapper[5004]: I1201 08:23:42.840438 5004 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9c645213-a3fd-4f35-9edd-60905873a559-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 01 08:23:42 crc kubenswrapper[5004]: I1201 08:23:42.840525 5004 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9c645213-a3fd-4f35-9edd-60905873a559-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 01 08:23:42 crc kubenswrapper[5004]: I1201 08:23:42.840541 5004 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9c645213-a3fd-4f35-9edd-60905873a559-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 01 08:23:42 crc kubenswrapper[5004]: I1201 08:23:42.840570 5004 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9c645213-a3fd-4f35-9edd-60905873a559-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 01 08:23:42 crc kubenswrapper[5004]: I1201 08:23:42.840586 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bv8b7\" (UniqueName: \"kubernetes.io/projected/9c645213-a3fd-4f35-9edd-60905873a559-kube-api-access-bv8b7\") on node \"crc\" DevicePath \"\"" Dec 01 08:23:42 crc kubenswrapper[5004]: I1201 08:23:42.840597 5004 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9c645213-a3fd-4f35-9edd-60905873a559-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 01 08:23:43 crc kubenswrapper[5004]: I1201 08:23:43.245041 5004 generic.go:334] "Generic (PLEG): container finished" podID="9c645213-a3fd-4f35-9edd-60905873a559" containerID="e54f7026de6305decc27f119174eb4e406e62b4b6ac16f898f07825ff5ef24ce" exitCode=0 Dec 01 08:23:43 crc kubenswrapper[5004]: I1201 
08:23:43.245093 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-8jr7l" event={"ID":"9c645213-a3fd-4f35-9edd-60905873a559","Type":"ContainerDied","Data":"e54f7026de6305decc27f119174eb4e406e62b4b6ac16f898f07825ff5ef24ce"} Dec 01 08:23:43 crc kubenswrapper[5004]: I1201 08:23:43.245119 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-8jr7l" event={"ID":"9c645213-a3fd-4f35-9edd-60905873a559","Type":"ContainerDied","Data":"122870c1f8f7094cb7442e50daf493b4d8e00b41dbe2d955d8ad6493dce19c96"} Dec 01 08:23:43 crc kubenswrapper[5004]: I1201 08:23:43.245134 5004 scope.go:117] "RemoveContainer" containerID="e54f7026de6305decc27f119174eb4e406e62b4b6ac16f898f07825ff5ef24ce" Dec 01 08:23:43 crc kubenswrapper[5004]: I1201 08:23:43.245230 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-8jr7l" Dec 01 08:23:43 crc kubenswrapper[5004]: I1201 08:23:43.258796 5004 scope.go:117] "RemoveContainer" containerID="e54f7026de6305decc27f119174eb4e406e62b4b6ac16f898f07825ff5ef24ce" Dec 01 08:23:43 crc kubenswrapper[5004]: E1201 08:23:43.259159 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e54f7026de6305decc27f119174eb4e406e62b4b6ac16f898f07825ff5ef24ce\": container with ID starting with e54f7026de6305decc27f119174eb4e406e62b4b6ac16f898f07825ff5ef24ce not found: ID does not exist" containerID="e54f7026de6305decc27f119174eb4e406e62b4b6ac16f898f07825ff5ef24ce" Dec 01 08:23:43 crc kubenswrapper[5004]: I1201 08:23:43.259186 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e54f7026de6305decc27f119174eb4e406e62b4b6ac16f898f07825ff5ef24ce"} err="failed to get container status \"e54f7026de6305decc27f119174eb4e406e62b4b6ac16f898f07825ff5ef24ce\": rpc error: code = 
NotFound desc = could not find container \"e54f7026de6305decc27f119174eb4e406e62b4b6ac16f898f07825ff5ef24ce\": container with ID starting with e54f7026de6305decc27f119174eb4e406e62b4b6ac16f898f07825ff5ef24ce not found: ID does not exist" Dec 01 08:23:43 crc kubenswrapper[5004]: I1201 08:23:43.265199 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-8jr7l"] Dec 01 08:23:43 crc kubenswrapper[5004]: I1201 08:23:43.274869 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-8jr7l"] Dec 01 08:23:44 crc kubenswrapper[5004]: I1201 08:23:44.764317 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c645213-a3fd-4f35-9edd-60905873a559" path="/var/lib/kubelet/pods/9c645213-a3fd-4f35-9edd-60905873a559/volumes" Dec 01 08:23:54 crc kubenswrapper[5004]: I1201 08:23:54.970099 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6d5b84845-kf449"] Dec 01 08:23:54 crc kubenswrapper[5004]: E1201 08:23:54.970944 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c645213-a3fd-4f35-9edd-60905873a559" containerName="registry" Dec 01 08:23:54 crc kubenswrapper[5004]: I1201 08:23:54.970960 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c645213-a3fd-4f35-9edd-60905873a559" containerName="registry" Dec 01 08:23:54 crc kubenswrapper[5004]: I1201 08:23:54.971097 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c645213-a3fd-4f35-9edd-60905873a559" containerName="registry" Dec 01 08:23:54 crc kubenswrapper[5004]: I1201 08:23:54.971588 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-kf449" Dec 01 08:23:54 crc kubenswrapper[5004]: I1201 08:23:54.974173 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"cluster-monitoring-operator-tls" Dec 01 08:23:54 crc kubenswrapper[5004]: I1201 08:23:54.975068 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-root-ca.crt" Dec 01 08:23:54 crc kubenswrapper[5004]: I1201 08:23:54.975094 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"openshift-service-ca.crt" Dec 01 08:23:54 crc kubenswrapper[5004]: I1201 08:23:54.977473 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemetry-config" Dec 01 08:23:54 crc kubenswrapper[5004]: I1201 08:23:54.979066 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"cluster-monitoring-operator-dockercfg-wwt9l" Dec 01 08:23:54 crc kubenswrapper[5004]: I1201 08:23:54.986807 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6d5b84845-kf449"] Dec 01 08:23:55 crc kubenswrapper[5004]: I1201 08:23:55.021241 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/2181cb45-a12c-43b9-8c32-309ed3c31e74-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6d5b84845-kf449\" (UID: \"2181cb45-a12c-43b9-8c32-309ed3c31e74\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-kf449" Dec 01 08:23:55 crc kubenswrapper[5004]: I1201 08:23:55.021294 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68n6n\" (UniqueName: \"kubernetes.io/projected/2181cb45-a12c-43b9-8c32-309ed3c31e74-kube-api-access-68n6n\") pod 
\"cluster-monitoring-operator-6d5b84845-kf449\" (UID: \"2181cb45-a12c-43b9-8c32-309ed3c31e74\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-kf449" Dec 01 08:23:55 crc kubenswrapper[5004]: I1201 08:23:55.021364 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/2181cb45-a12c-43b9-8c32-309ed3c31e74-telemetry-config\") pod \"cluster-monitoring-operator-6d5b84845-kf449\" (UID: \"2181cb45-a12c-43b9-8c32-309ed3c31e74\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-kf449" Dec 01 08:23:55 crc kubenswrapper[5004]: I1201 08:23:55.122397 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/2181cb45-a12c-43b9-8c32-309ed3c31e74-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6d5b84845-kf449\" (UID: \"2181cb45-a12c-43b9-8c32-309ed3c31e74\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-kf449" Dec 01 08:23:55 crc kubenswrapper[5004]: I1201 08:23:55.122449 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68n6n\" (UniqueName: \"kubernetes.io/projected/2181cb45-a12c-43b9-8c32-309ed3c31e74-kube-api-access-68n6n\") pod \"cluster-monitoring-operator-6d5b84845-kf449\" (UID: \"2181cb45-a12c-43b9-8c32-309ed3c31e74\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-kf449" Dec 01 08:23:55 crc kubenswrapper[5004]: I1201 08:23:55.122482 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/2181cb45-a12c-43b9-8c32-309ed3c31e74-telemetry-config\") pod \"cluster-monitoring-operator-6d5b84845-kf449\" (UID: \"2181cb45-a12c-43b9-8c32-309ed3c31e74\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-kf449" Dec 01 08:23:55 crc 
kubenswrapper[5004]: I1201 08:23:55.123430 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/2181cb45-a12c-43b9-8c32-309ed3c31e74-telemetry-config\") pod \"cluster-monitoring-operator-6d5b84845-kf449\" (UID: \"2181cb45-a12c-43b9-8c32-309ed3c31e74\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-kf449" Dec 01 08:23:55 crc kubenswrapper[5004]: I1201 08:23:55.129882 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/2181cb45-a12c-43b9-8c32-309ed3c31e74-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6d5b84845-kf449\" (UID: \"2181cb45-a12c-43b9-8c32-309ed3c31e74\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-kf449" Dec 01 08:23:55 crc kubenswrapper[5004]: I1201 08:23:55.141098 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68n6n\" (UniqueName: \"kubernetes.io/projected/2181cb45-a12c-43b9-8c32-309ed3c31e74-kube-api-access-68n6n\") pod \"cluster-monitoring-operator-6d5b84845-kf449\" (UID: \"2181cb45-a12c-43b9-8c32-309ed3c31e74\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-kf449" Dec 01 08:23:55 crc kubenswrapper[5004]: I1201 08:23:55.295082 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-kf449" Dec 01 08:23:55 crc kubenswrapper[5004]: I1201 08:23:55.757991 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6d5b84845-kf449"] Dec 01 08:23:56 crc kubenswrapper[5004]: I1201 08:23:56.358762 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-kf449" event={"ID":"2181cb45-a12c-43b9-8c32-309ed3c31e74","Type":"ContainerStarted","Data":"451db11b5266b5af3925339880853ca306b64028e08f424d18ee15c3f125b3c2"} Dec 01 08:23:59 crc kubenswrapper[5004]: I1201 08:23:59.231129 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-twlk4"] Dec 01 08:23:59 crc kubenswrapper[5004]: I1201 08:23:59.232427 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-twlk4" Dec 01 08:23:59 crc kubenswrapper[5004]: I1201 08:23:59.234397 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-admission-webhook-dockercfg-pv5gx" Dec 01 08:23:59 crc kubenswrapper[5004]: I1201 08:23:59.235226 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-admission-webhook-tls" Dec 01 08:23:59 crc kubenswrapper[5004]: I1201 08:23:59.242228 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-twlk4"] Dec 01 08:23:59 crc kubenswrapper[5004]: I1201 08:23:59.291938 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/00db2d0b-826a-4dac-a07e-dbe92a86b17f-tls-certificates\") pod \"prometheus-operator-admission-webhook-f54c54754-twlk4\" (UID: 
\"00db2d0b-826a-4dac-a07e-dbe92a86b17f\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-twlk4" Dec 01 08:23:59 crc kubenswrapper[5004]: I1201 08:23:59.379673 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-kf449" event={"ID":"2181cb45-a12c-43b9-8c32-309ed3c31e74","Type":"ContainerStarted","Data":"998248a66a532fa55c1eb830c5a8b27473fb5f9a54ef716a59a1d0c7abb297c9"} Dec 01 08:23:59 crc kubenswrapper[5004]: I1201 08:23:59.393425 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/00db2d0b-826a-4dac-a07e-dbe92a86b17f-tls-certificates\") pod \"prometheus-operator-admission-webhook-f54c54754-twlk4\" (UID: \"00db2d0b-826a-4dac-a07e-dbe92a86b17f\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-twlk4" Dec 01 08:23:59 crc kubenswrapper[5004]: E1201 08:23:59.393621 5004 secret.go:188] Couldn't get secret openshift-monitoring/prometheus-operator-admission-webhook-tls: secret "prometheus-operator-admission-webhook-tls" not found Dec 01 08:23:59 crc kubenswrapper[5004]: E1201 08:23:59.393704 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/00db2d0b-826a-4dac-a07e-dbe92a86b17f-tls-certificates podName:00db2d0b-826a-4dac-a07e-dbe92a86b17f nodeName:}" failed. No retries permitted until 2025-12-01 08:23:59.893677445 +0000 UTC m=+417.458669437 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-certificates" (UniqueName: "kubernetes.io/secret/00db2d0b-826a-4dac-a07e-dbe92a86b17f-tls-certificates") pod "prometheus-operator-admission-webhook-f54c54754-twlk4" (UID: "00db2d0b-826a-4dac-a07e-dbe92a86b17f") : secret "prometheus-operator-admission-webhook-tls" not found Dec 01 08:23:59 crc kubenswrapper[5004]: I1201 08:23:59.402649 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-kf449" podStartSLOduration=2.539062073 podStartE2EDuration="5.402626245s" podCreationTimestamp="2025-12-01 08:23:54 +0000 UTC" firstStartedPulling="2025-12-01 08:23:55.765429355 +0000 UTC m=+413.330421377" lastFinishedPulling="2025-12-01 08:23:58.628993567 +0000 UTC m=+416.193985549" observedRunningTime="2025-12-01 08:23:59.400407698 +0000 UTC m=+416.965399680" watchObservedRunningTime="2025-12-01 08:23:59.402626245 +0000 UTC m=+416.967618247" Dec 01 08:23:59 crc kubenswrapper[5004]: I1201 08:23:59.900622 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/00db2d0b-826a-4dac-a07e-dbe92a86b17f-tls-certificates\") pod \"prometheus-operator-admission-webhook-f54c54754-twlk4\" (UID: \"00db2d0b-826a-4dac-a07e-dbe92a86b17f\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-twlk4" Dec 01 08:23:59 crc kubenswrapper[5004]: I1201 08:23:59.910048 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/00db2d0b-826a-4dac-a07e-dbe92a86b17f-tls-certificates\") pod \"prometheus-operator-admission-webhook-f54c54754-twlk4\" (UID: \"00db2d0b-826a-4dac-a07e-dbe92a86b17f\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-twlk4" Dec 01 08:24:00 crc kubenswrapper[5004]: I1201 08:24:00.146628 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-twlk4" Dec 01 08:24:00 crc kubenswrapper[5004]: I1201 08:24:00.634003 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-twlk4"] Dec 01 08:24:00 crc kubenswrapper[5004]: W1201 08:24:00.645083 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod00db2d0b_826a_4dac_a07e_dbe92a86b17f.slice/crio-0a8fc59db7e24122d42e529a8ba0c2805ac7e18210b2012fe0fa21f000137056 WatchSource:0}: Error finding container 0a8fc59db7e24122d42e529a8ba0c2805ac7e18210b2012fe0fa21f000137056: Status 404 returned error can't find the container with id 0a8fc59db7e24122d42e529a8ba0c2805ac7e18210b2012fe0fa21f000137056 Dec 01 08:24:01 crc kubenswrapper[5004]: I1201 08:24:01.394300 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-twlk4" event={"ID":"00db2d0b-826a-4dac-a07e-dbe92a86b17f","Type":"ContainerStarted","Data":"0a8fc59db7e24122d42e529a8ba0c2805ac7e18210b2012fe0fa21f000137056"} Dec 01 08:24:03 crc kubenswrapper[5004]: I1201 08:24:03.407509 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-twlk4" event={"ID":"00db2d0b-826a-4dac-a07e-dbe92a86b17f","Type":"ContainerStarted","Data":"5cca8299787b9bc201f92d355f94e92ce10f3ed0266a69a8d901f1077b6f71a5"} Dec 01 08:24:03 crc kubenswrapper[5004]: I1201 08:24:03.408121 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-twlk4" Dec 01 08:24:03 crc kubenswrapper[5004]: I1201 08:24:03.416474 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-twlk4" Dec 01 08:24:03 crc 
kubenswrapper[5004]: I1201 08:24:03.431386 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-twlk4" podStartSLOduration=2.124324322 podStartE2EDuration="4.431363909s" podCreationTimestamp="2025-12-01 08:23:59 +0000 UTC" firstStartedPulling="2025-12-01 08:24:00.6484427 +0000 UTC m=+418.213434722" lastFinishedPulling="2025-12-01 08:24:02.955482317 +0000 UTC m=+420.520474309" observedRunningTime="2025-12-01 08:24:03.427372907 +0000 UTC m=+420.992364919" watchObservedRunningTime="2025-12-01 08:24:03.431363909 +0000 UTC m=+420.996355921" Dec 01 08:24:04 crc kubenswrapper[5004]: I1201 08:24:04.331918 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-db54df47d-4lbk8"] Dec 01 08:24:04 crc kubenswrapper[5004]: I1201 08:24:04.334392 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-db54df47d-4lbk8" Dec 01 08:24:04 crc kubenswrapper[5004]: I1201 08:24:04.337785 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-dockercfg-pcll5" Dec 01 08:24:04 crc kubenswrapper[5004]: I1201 08:24:04.338149 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-client-ca" Dec 01 08:24:04 crc kubenswrapper[5004]: I1201 08:24:04.338868 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-tls" Dec 01 08:24:04 crc kubenswrapper[5004]: I1201 08:24:04.339396 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-kube-rbac-proxy-config" Dec 01 08:24:04 crc kubenswrapper[5004]: I1201 08:24:04.356307 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-db54df47d-4lbk8"] Dec 01 08:24:04 crc kubenswrapper[5004]: I1201 
08:24:04.458836 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6fc2d710-6868-4033-a4ae-13eeb7572be4-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-db54df47d-4lbk8\" (UID: \"6fc2d710-6868-4033-a4ae-13eeb7572be4\") " pod="openshift-monitoring/prometheus-operator-db54df47d-4lbk8" Dec 01 08:24:04 crc kubenswrapper[5004]: I1201 08:24:04.458948 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/6fc2d710-6868-4033-a4ae-13eeb7572be4-prometheus-operator-tls\") pod \"prometheus-operator-db54df47d-4lbk8\" (UID: \"6fc2d710-6868-4033-a4ae-13eeb7572be4\") " pod="openshift-monitoring/prometheus-operator-db54df47d-4lbk8" Dec 01 08:24:04 crc kubenswrapper[5004]: I1201 08:24:04.459311 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6fc2d710-6868-4033-a4ae-13eeb7572be4-metrics-client-ca\") pod \"prometheus-operator-db54df47d-4lbk8\" (UID: \"6fc2d710-6868-4033-a4ae-13eeb7572be4\") " pod="openshift-monitoring/prometheus-operator-db54df47d-4lbk8" Dec 01 08:24:04 crc kubenswrapper[5004]: I1201 08:24:04.459639 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkk5r\" (UniqueName: \"kubernetes.io/projected/6fc2d710-6868-4033-a4ae-13eeb7572be4-kube-api-access-hkk5r\") pod \"prometheus-operator-db54df47d-4lbk8\" (UID: \"6fc2d710-6868-4033-a4ae-13eeb7572be4\") " pod="openshift-monitoring/prometheus-operator-db54df47d-4lbk8" Dec 01 08:24:04 crc kubenswrapper[5004]: I1201 08:24:04.560891 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/6fc2d710-6868-4033-a4ae-13eeb7572be4-prometheus-operator-tls\") pod \"prometheus-operator-db54df47d-4lbk8\" (UID: \"6fc2d710-6868-4033-a4ae-13eeb7572be4\") " pod="openshift-monitoring/prometheus-operator-db54df47d-4lbk8" Dec 01 08:24:04 crc kubenswrapper[5004]: I1201 08:24:04.560996 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6fc2d710-6868-4033-a4ae-13eeb7572be4-metrics-client-ca\") pod \"prometheus-operator-db54df47d-4lbk8\" (UID: \"6fc2d710-6868-4033-a4ae-13eeb7572be4\") " pod="openshift-monitoring/prometheus-operator-db54df47d-4lbk8" Dec 01 08:24:04 crc kubenswrapper[5004]: I1201 08:24:04.561046 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hkk5r\" (UniqueName: \"kubernetes.io/projected/6fc2d710-6868-4033-a4ae-13eeb7572be4-kube-api-access-hkk5r\") pod \"prometheus-operator-db54df47d-4lbk8\" (UID: \"6fc2d710-6868-4033-a4ae-13eeb7572be4\") " pod="openshift-monitoring/prometheus-operator-db54df47d-4lbk8" Dec 01 08:24:04 crc kubenswrapper[5004]: I1201 08:24:04.561186 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6fc2d710-6868-4033-a4ae-13eeb7572be4-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-db54df47d-4lbk8\" (UID: \"6fc2d710-6868-4033-a4ae-13eeb7572be4\") " pod="openshift-monitoring/prometheus-operator-db54df47d-4lbk8" Dec 01 08:24:04 crc kubenswrapper[5004]: I1201 08:24:04.562781 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6fc2d710-6868-4033-a4ae-13eeb7572be4-metrics-client-ca\") pod \"prometheus-operator-db54df47d-4lbk8\" (UID: \"6fc2d710-6868-4033-a4ae-13eeb7572be4\") " pod="openshift-monitoring/prometheus-operator-db54df47d-4lbk8" Dec 01 08:24:04 crc 
kubenswrapper[5004]: I1201 08:24:04.570017 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6fc2d710-6868-4033-a4ae-13eeb7572be4-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-db54df47d-4lbk8\" (UID: \"6fc2d710-6868-4033-a4ae-13eeb7572be4\") " pod="openshift-monitoring/prometheus-operator-db54df47d-4lbk8" Dec 01 08:24:04 crc kubenswrapper[5004]: I1201 08:24:04.570136 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/6fc2d710-6868-4033-a4ae-13eeb7572be4-prometheus-operator-tls\") pod \"prometheus-operator-db54df47d-4lbk8\" (UID: \"6fc2d710-6868-4033-a4ae-13eeb7572be4\") " pod="openshift-monitoring/prometheus-operator-db54df47d-4lbk8" Dec 01 08:24:04 crc kubenswrapper[5004]: I1201 08:24:04.598076 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkk5r\" (UniqueName: \"kubernetes.io/projected/6fc2d710-6868-4033-a4ae-13eeb7572be4-kube-api-access-hkk5r\") pod \"prometheus-operator-db54df47d-4lbk8\" (UID: \"6fc2d710-6868-4033-a4ae-13eeb7572be4\") " pod="openshift-monitoring/prometheus-operator-db54df47d-4lbk8" Dec 01 08:24:04 crc kubenswrapper[5004]: I1201 08:24:04.654555 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-db54df47d-4lbk8" Dec 01 08:24:05 crc kubenswrapper[5004]: I1201 08:24:05.144133 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-db54df47d-4lbk8"] Dec 01 08:24:05 crc kubenswrapper[5004]: I1201 08:24:05.421805 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-db54df47d-4lbk8" event={"ID":"6fc2d710-6868-4033-a4ae-13eeb7572be4","Type":"ContainerStarted","Data":"c009028feeb123883df6b97c408496e11b51a765035a385812338c84537bb1bc"} Dec 01 08:24:08 crc kubenswrapper[5004]: I1201 08:24:08.443135 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-db54df47d-4lbk8" event={"ID":"6fc2d710-6868-4033-a4ae-13eeb7572be4","Type":"ContainerStarted","Data":"c80cd79e0e0893af506692a92cc3709e121565ec946dd590ecf34a2f2aed89ef"} Dec 01 08:24:08 crc kubenswrapper[5004]: I1201 08:24:08.443969 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-db54df47d-4lbk8" event={"ID":"6fc2d710-6868-4033-a4ae-13eeb7572be4","Type":"ContainerStarted","Data":"768fc6e0874329fa0fae516e3bb6afc1ad6adc2783c7f057a7fcded4747985d3"} Dec 01 08:24:08 crc kubenswrapper[5004]: I1201 08:24:08.474925 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-db54df47d-4lbk8" podStartSLOduration=2.145494659 podStartE2EDuration="4.474896519s" podCreationTimestamp="2025-12-01 08:24:04 +0000 UTC" firstStartedPulling="2025-12-01 08:24:05.153514627 +0000 UTC m=+422.718506609" lastFinishedPulling="2025-12-01 08:24:07.482916467 +0000 UTC m=+425.047908469" observedRunningTime="2025-12-01 08:24:08.468760123 +0000 UTC m=+426.033752135" watchObservedRunningTime="2025-12-01 08:24:08.474896519 +0000 UTC m=+426.039888531" Dec 01 08:24:08 crc kubenswrapper[5004]: I1201 08:24:08.729667 5004 patch_prober.go:28] 
interesting pod/machine-config-daemon-fvdgt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 08:24:08 crc kubenswrapper[5004]: I1201 08:24:08.729766 5004 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 08:24:08 crc kubenswrapper[5004]: I1201 08:24:08.729840 5004 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" Dec 01 08:24:08 crc kubenswrapper[5004]: I1201 08:24:08.730729 5004 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"aee81d40a16962a7717cc3a5a3263157cb0e536c40bc2b3b83dfa0f852f31e2a"} pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 08:24:08 crc kubenswrapper[5004]: I1201 08:24:08.730860 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" containerName="machine-config-daemon" containerID="cri-o://aee81d40a16962a7717cc3a5a3263157cb0e536c40bc2b3b83dfa0f852f31e2a" gracePeriod=600 Dec 01 08:24:09 crc kubenswrapper[5004]: I1201 08:24:09.453925 5004 generic.go:334] "Generic (PLEG): container finished" podID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" containerID="aee81d40a16962a7717cc3a5a3263157cb0e536c40bc2b3b83dfa0f852f31e2a" exitCode=0 Dec 01 08:24:09 crc kubenswrapper[5004]: I1201 
08:24:09.453981 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" event={"ID":"f9977ebb-82de-4e96-8763-0b5a84f8d4ce","Type":"ContainerDied","Data":"aee81d40a16962a7717cc3a5a3263157cb0e536c40bc2b3b83dfa0f852f31e2a"} Dec 01 08:24:09 crc kubenswrapper[5004]: I1201 08:24:09.454398 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" event={"ID":"f9977ebb-82de-4e96-8763-0b5a84f8d4ce","Type":"ContainerStarted","Data":"f5cb4e2ac3b55859ead5c898a2b42c280b2d9fe9b770bdbbd6d9799deecd9d6a"} Dec 01 08:24:09 crc kubenswrapper[5004]: I1201 08:24:09.454458 5004 scope.go:117] "RemoveContainer" containerID="6c1e51fa92aec80e93d0fce11c743b8c4fde23fc23d8626e8d5b3e25ab42350d" Dec 01 08:24:10 crc kubenswrapper[5004]: I1201 08:24:10.691546 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-566fddb674-5hh42"] Dec 01 08:24:10 crc kubenswrapper[5004]: I1201 08:24:10.693320 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-566fddb674-5hh42" Dec 01 08:24:10 crc kubenswrapper[5004]: I1201 08:24:10.694908 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-dockercfg-4976v" Dec 01 08:24:10 crc kubenswrapper[5004]: I1201 08:24:10.695296 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-kube-rbac-proxy-config" Dec 01 08:24:10 crc kubenswrapper[5004]: I1201 08:24:10.695837 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-tls" Dec 01 08:24:10 crc kubenswrapper[5004]: I1201 08:24:10.700985 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-566fddb674-5hh42"] Dec 01 08:24:10 crc kubenswrapper[5004]: I1201 08:24:10.730104 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-g5xj9"] Dec 01 08:24:10 crc kubenswrapper[5004]: I1201 08:24:10.731367 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-g5xj9" Dec 01 08:24:10 crc kubenswrapper[5004]: I1201 08:24:10.735387 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-tls" Dec 01 08:24:10 crc kubenswrapper[5004]: I1201 08:24:10.736936 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-kube-rbac-proxy-config" Dec 01 08:24:10 crc kubenswrapper[5004]: I1201 08:24:10.737776 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-dockercfg-xzp28" Dec 01 08:24:10 crc kubenswrapper[5004]: I1201 08:24:10.740419 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-777cb5bd5d-vfdrv"] Dec 01 08:24:10 crc kubenswrapper[5004]: I1201 08:24:10.741494 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-vfdrv" Dec 01 08:24:10 crc kubenswrapper[5004]: I1201 08:24:10.747067 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-kube-rbac-proxy-config" Dec 01 08:24:10 crc kubenswrapper[5004]: I1201 08:24:10.747412 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-tls" Dec 01 08:24:10 crc kubenswrapper[5004]: I1201 08:24:10.747701 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-state-metrics-custom-resource-state-configmap" Dec 01 08:24:10 crc kubenswrapper[5004]: I1201 08:24:10.748252 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-dockercfg-tns4k" Dec 01 08:24:10 crc kubenswrapper[5004]: I1201 08:24:10.761689 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhbx6\" (UniqueName: 
\"kubernetes.io/projected/d4291cf4-2b49-471b-9526-8591c8e15bab-kube-api-access-lhbx6\") pod \"openshift-state-metrics-566fddb674-5hh42\" (UID: \"d4291cf4-2b49-471b-9526-8591c8e15bab\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-5hh42" Dec 01 08:24:10 crc kubenswrapper[5004]: I1201 08:24:10.761748 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d4291cf4-2b49-471b-9526-8591c8e15bab-metrics-client-ca\") pod \"openshift-state-metrics-566fddb674-5hh42\" (UID: \"d4291cf4-2b49-471b-9526-8591c8e15bab\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-5hh42" Dec 01 08:24:10 crc kubenswrapper[5004]: I1201 08:24:10.761776 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/d4291cf4-2b49-471b-9526-8591c8e15bab-openshift-state-metrics-tls\") pod \"openshift-state-metrics-566fddb674-5hh42\" (UID: \"d4291cf4-2b49-471b-9526-8591c8e15bab\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-5hh42" Dec 01 08:24:10 crc kubenswrapper[5004]: I1201 08:24:10.761812 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d4291cf4-2b49-471b-9526-8591c8e15bab-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-566fddb674-5hh42\" (UID: \"d4291cf4-2b49-471b-9526-8591c8e15bab\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-5hh42" Dec 01 08:24:10 crc kubenswrapper[5004]: I1201 08:24:10.775101 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-777cb5bd5d-vfdrv"] Dec 01 08:24:10 crc kubenswrapper[5004]: I1201 08:24:10.863258 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d4291cf4-2b49-471b-9526-8591c8e15bab-metrics-client-ca\") pod \"openshift-state-metrics-566fddb674-5hh42\" (UID: \"d4291cf4-2b49-471b-9526-8591c8e15bab\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-5hh42" Dec 01 08:24:10 crc kubenswrapper[5004]: I1201 08:24:10.863324 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5d20148e-b0bd-4052-a9d7-ed4064cc0f3a-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-g5xj9\" (UID: \"5d20148e-b0bd-4052-a9d7-ed4064cc0f3a\") " pod="openshift-monitoring/node-exporter-g5xj9" Dec 01 08:24:10 crc kubenswrapper[5004]: I1201 08:24:10.863347 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/621d07ba-8cd2-4aa8-8064-db1e702f8da5-kube-state-metrics-tls\") pod \"kube-state-metrics-777cb5bd5d-vfdrv\" (UID: \"621d07ba-8cd2-4aa8-8064-db1e702f8da5\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-vfdrv" Dec 01 08:24:10 crc kubenswrapper[5004]: I1201 08:24:10.863372 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/621d07ba-8cd2-4aa8-8064-db1e702f8da5-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-777cb5bd5d-vfdrv\" (UID: \"621d07ba-8cd2-4aa8-8064-db1e702f8da5\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-vfdrv" Dec 01 08:24:10 crc kubenswrapper[5004]: I1201 08:24:10.863399 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/d4291cf4-2b49-471b-9526-8591c8e15bab-openshift-state-metrics-tls\") pod 
\"openshift-state-metrics-566fddb674-5hh42\" (UID: \"d4291cf4-2b49-471b-9526-8591c8e15bab\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-5hh42" Dec 01 08:24:10 crc kubenswrapper[5004]: I1201 08:24:10.863425 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/621d07ba-8cd2-4aa8-8064-db1e702f8da5-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-777cb5bd5d-vfdrv\" (UID: \"621d07ba-8cd2-4aa8-8064-db1e702f8da5\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-vfdrv" Dec 01 08:24:10 crc kubenswrapper[5004]: I1201 08:24:10.863460 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d4291cf4-2b49-471b-9526-8591c8e15bab-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-566fddb674-5hh42\" (UID: \"d4291cf4-2b49-471b-9526-8591c8e15bab\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-5hh42" Dec 01 08:24:10 crc kubenswrapper[5004]: I1201 08:24:10.863491 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5d20148e-b0bd-4052-a9d7-ed4064cc0f3a-metrics-client-ca\") pod \"node-exporter-g5xj9\" (UID: \"5d20148e-b0bd-4052-a9d7-ed4064cc0f3a\") " pod="openshift-monitoring/node-exporter-g5xj9" Dec 01 08:24:10 crc kubenswrapper[5004]: I1201 08:24:10.863520 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/621d07ba-8cd2-4aa8-8064-db1e702f8da5-volume-directive-shadow\") pod \"kube-state-metrics-777cb5bd5d-vfdrv\" (UID: \"621d07ba-8cd2-4aa8-8064-db1e702f8da5\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-vfdrv" Dec 01 
08:24:10 crc kubenswrapper[5004]: I1201 08:24:10.863543 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/5d20148e-b0bd-4052-a9d7-ed4064cc0f3a-node-exporter-wtmp\") pod \"node-exporter-g5xj9\" (UID: \"5d20148e-b0bd-4052-a9d7-ed4064cc0f3a\") " pod="openshift-monitoring/node-exporter-g5xj9" Dec 01 08:24:10 crc kubenswrapper[5004]: I1201 08:24:10.863594 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5d20148e-b0bd-4052-a9d7-ed4064cc0f3a-sys\") pod \"node-exporter-g5xj9\" (UID: \"5d20148e-b0bd-4052-a9d7-ed4064cc0f3a\") " pod="openshift-monitoring/node-exporter-g5xj9" Dec 01 08:24:10 crc kubenswrapper[5004]: I1201 08:24:10.863615 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q46cq\" (UniqueName: \"kubernetes.io/projected/621d07ba-8cd2-4aa8-8064-db1e702f8da5-kube-api-access-q46cq\") pod \"kube-state-metrics-777cb5bd5d-vfdrv\" (UID: \"621d07ba-8cd2-4aa8-8064-db1e702f8da5\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-vfdrv" Dec 01 08:24:10 crc kubenswrapper[5004]: I1201 08:24:10.863641 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/5d20148e-b0bd-4052-a9d7-ed4064cc0f3a-node-exporter-textfile\") pod \"node-exporter-g5xj9\" (UID: \"5d20148e-b0bd-4052-a9d7-ed4064cc0f3a\") " pod="openshift-monitoring/node-exporter-g5xj9" Dec 01 08:24:10 crc kubenswrapper[5004]: I1201 08:24:10.863667 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/621d07ba-8cd2-4aa8-8064-db1e702f8da5-metrics-client-ca\") pod \"kube-state-metrics-777cb5bd5d-vfdrv\" (UID: 
\"621d07ba-8cd2-4aa8-8064-db1e702f8da5\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-vfdrv" Dec 01 08:24:10 crc kubenswrapper[5004]: I1201 08:24:10.863693 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/5d20148e-b0bd-4052-a9d7-ed4064cc0f3a-node-exporter-tls\") pod \"node-exporter-g5xj9\" (UID: \"5d20148e-b0bd-4052-a9d7-ed4064cc0f3a\") " pod="openshift-monitoring/node-exporter-g5xj9" Dec 01 08:24:10 crc kubenswrapper[5004]: I1201 08:24:10.863714 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9kbr\" (UniqueName: \"kubernetes.io/projected/5d20148e-b0bd-4052-a9d7-ed4064cc0f3a-kube-api-access-d9kbr\") pod \"node-exporter-g5xj9\" (UID: \"5d20148e-b0bd-4052-a9d7-ed4064cc0f3a\") " pod="openshift-monitoring/node-exporter-g5xj9" Dec 01 08:24:10 crc kubenswrapper[5004]: I1201 08:24:10.863745 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/5d20148e-b0bd-4052-a9d7-ed4064cc0f3a-root\") pod \"node-exporter-g5xj9\" (UID: \"5d20148e-b0bd-4052-a9d7-ed4064cc0f3a\") " pod="openshift-monitoring/node-exporter-g5xj9" Dec 01 08:24:10 crc kubenswrapper[5004]: I1201 08:24:10.863775 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhbx6\" (UniqueName: \"kubernetes.io/projected/d4291cf4-2b49-471b-9526-8591c8e15bab-kube-api-access-lhbx6\") pod \"openshift-state-metrics-566fddb674-5hh42\" (UID: \"d4291cf4-2b49-471b-9526-8591c8e15bab\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-5hh42" Dec 01 08:24:10 crc kubenswrapper[5004]: I1201 08:24:10.864925 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d4291cf4-2b49-471b-9526-8591c8e15bab-metrics-client-ca\") 
pod \"openshift-state-metrics-566fddb674-5hh42\" (UID: \"d4291cf4-2b49-471b-9526-8591c8e15bab\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-5hh42" Dec 01 08:24:10 crc kubenswrapper[5004]: E1201 08:24:10.865017 5004 secret.go:188] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found Dec 01 08:24:10 crc kubenswrapper[5004]: E1201 08:24:10.865060 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d4291cf4-2b49-471b-9526-8591c8e15bab-openshift-state-metrics-tls podName:d4291cf4-2b49-471b-9526-8591c8e15bab nodeName:}" failed. No retries permitted until 2025-12-01 08:24:11.365048493 +0000 UTC m=+428.930040475 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/d4291cf4-2b49-471b-9526-8591c8e15bab-openshift-state-metrics-tls") pod "openshift-state-metrics-566fddb674-5hh42" (UID: "d4291cf4-2b49-471b-9526-8591c8e15bab") : secret "openshift-state-metrics-tls" not found Dec 01 08:24:10 crc kubenswrapper[5004]: I1201 08:24:10.871266 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d4291cf4-2b49-471b-9526-8591c8e15bab-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-566fddb674-5hh42\" (UID: \"d4291cf4-2b49-471b-9526-8591c8e15bab\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-5hh42" Dec 01 08:24:10 crc kubenswrapper[5004]: I1201 08:24:10.879365 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhbx6\" (UniqueName: \"kubernetes.io/projected/d4291cf4-2b49-471b-9526-8591c8e15bab-kube-api-access-lhbx6\") pod \"openshift-state-metrics-566fddb674-5hh42\" (UID: \"d4291cf4-2b49-471b-9526-8591c8e15bab\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-5hh42" Dec 01 08:24:10 
crc kubenswrapper[5004]: I1201 08:24:10.964638 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5d20148e-b0bd-4052-a9d7-ed4064cc0f3a-sys\") pod \"node-exporter-g5xj9\" (UID: \"5d20148e-b0bd-4052-a9d7-ed4064cc0f3a\") " pod="openshift-monitoring/node-exporter-g5xj9" Dec 01 08:24:10 crc kubenswrapper[5004]: I1201 08:24:10.964712 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q46cq\" (UniqueName: \"kubernetes.io/projected/621d07ba-8cd2-4aa8-8064-db1e702f8da5-kube-api-access-q46cq\") pod \"kube-state-metrics-777cb5bd5d-vfdrv\" (UID: \"621d07ba-8cd2-4aa8-8064-db1e702f8da5\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-vfdrv" Dec 01 08:24:10 crc kubenswrapper[5004]: I1201 08:24:10.964746 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/5d20148e-b0bd-4052-a9d7-ed4064cc0f3a-node-exporter-textfile\") pod \"node-exporter-g5xj9\" (UID: \"5d20148e-b0bd-4052-a9d7-ed4064cc0f3a\") " pod="openshift-monitoring/node-exporter-g5xj9" Dec 01 08:24:10 crc kubenswrapper[5004]: I1201 08:24:10.964775 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/621d07ba-8cd2-4aa8-8064-db1e702f8da5-metrics-client-ca\") pod \"kube-state-metrics-777cb5bd5d-vfdrv\" (UID: \"621d07ba-8cd2-4aa8-8064-db1e702f8da5\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-vfdrv" Dec 01 08:24:10 crc kubenswrapper[5004]: I1201 08:24:10.964782 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5d20148e-b0bd-4052-a9d7-ed4064cc0f3a-sys\") pod \"node-exporter-g5xj9\" (UID: \"5d20148e-b0bd-4052-a9d7-ed4064cc0f3a\") " pod="openshift-monitoring/node-exporter-g5xj9" Dec 01 08:24:10 crc kubenswrapper[5004]: I1201 
08:24:10.964802 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/5d20148e-b0bd-4052-a9d7-ed4064cc0f3a-node-exporter-tls\") pod \"node-exporter-g5xj9\" (UID: \"5d20148e-b0bd-4052-a9d7-ed4064cc0f3a\") " pod="openshift-monitoring/node-exporter-g5xj9" Dec 01 08:24:10 crc kubenswrapper[5004]: I1201 08:24:10.964898 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9kbr\" (UniqueName: \"kubernetes.io/projected/5d20148e-b0bd-4052-a9d7-ed4064cc0f3a-kube-api-access-d9kbr\") pod \"node-exporter-g5xj9\" (UID: \"5d20148e-b0bd-4052-a9d7-ed4064cc0f3a\") " pod="openshift-monitoring/node-exporter-g5xj9" Dec 01 08:24:10 crc kubenswrapper[5004]: I1201 08:24:10.964950 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/5d20148e-b0bd-4052-a9d7-ed4064cc0f3a-root\") pod \"node-exporter-g5xj9\" (UID: \"5d20148e-b0bd-4052-a9d7-ed4064cc0f3a\") " pod="openshift-monitoring/node-exporter-g5xj9" Dec 01 08:24:10 crc kubenswrapper[5004]: I1201 08:24:10.965040 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5d20148e-b0bd-4052-a9d7-ed4064cc0f3a-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-g5xj9\" (UID: \"5d20148e-b0bd-4052-a9d7-ed4064cc0f3a\") " pod="openshift-monitoring/node-exporter-g5xj9" Dec 01 08:24:10 crc kubenswrapper[5004]: I1201 08:24:10.965057 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/621d07ba-8cd2-4aa8-8064-db1e702f8da5-kube-state-metrics-tls\") pod \"kube-state-metrics-777cb5bd5d-vfdrv\" (UID: \"621d07ba-8cd2-4aa8-8064-db1e702f8da5\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-vfdrv" Dec 01 08:24:10 crc kubenswrapper[5004]: I1201 
08:24:10.965078 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/621d07ba-8cd2-4aa8-8064-db1e702f8da5-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-777cb5bd5d-vfdrv\" (UID: \"621d07ba-8cd2-4aa8-8064-db1e702f8da5\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-vfdrv" Dec 01 08:24:10 crc kubenswrapper[5004]: I1201 08:24:10.965122 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/621d07ba-8cd2-4aa8-8064-db1e702f8da5-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-777cb5bd5d-vfdrv\" (UID: \"621d07ba-8cd2-4aa8-8064-db1e702f8da5\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-vfdrv" Dec 01 08:24:10 crc kubenswrapper[5004]: I1201 08:24:10.965185 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5d20148e-b0bd-4052-a9d7-ed4064cc0f3a-metrics-client-ca\") pod \"node-exporter-g5xj9\" (UID: \"5d20148e-b0bd-4052-a9d7-ed4064cc0f3a\") " pod="openshift-monitoring/node-exporter-g5xj9" Dec 01 08:24:10 crc kubenswrapper[5004]: I1201 08:24:10.965218 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/621d07ba-8cd2-4aa8-8064-db1e702f8da5-volume-directive-shadow\") pod \"kube-state-metrics-777cb5bd5d-vfdrv\" (UID: \"621d07ba-8cd2-4aa8-8064-db1e702f8da5\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-vfdrv" Dec 01 08:24:10 crc kubenswrapper[5004]: I1201 08:24:10.965239 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/5d20148e-b0bd-4052-a9d7-ed4064cc0f3a-node-exporter-wtmp\") pod 
\"node-exporter-g5xj9\" (UID: \"5d20148e-b0bd-4052-a9d7-ed4064cc0f3a\") " pod="openshift-monitoring/node-exporter-g5xj9" Dec 01 08:24:10 crc kubenswrapper[5004]: I1201 08:24:10.965354 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/5d20148e-b0bd-4052-a9d7-ed4064cc0f3a-node-exporter-textfile\") pod \"node-exporter-g5xj9\" (UID: \"5d20148e-b0bd-4052-a9d7-ed4064cc0f3a\") " pod="openshift-monitoring/node-exporter-g5xj9" Dec 01 08:24:10 crc kubenswrapper[5004]: I1201 08:24:10.965442 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/5d20148e-b0bd-4052-a9d7-ed4064cc0f3a-node-exporter-wtmp\") pod \"node-exporter-g5xj9\" (UID: \"5d20148e-b0bd-4052-a9d7-ed4064cc0f3a\") " pod="openshift-monitoring/node-exporter-g5xj9" Dec 01 08:24:10 crc kubenswrapper[5004]: I1201 08:24:10.965764 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/621d07ba-8cd2-4aa8-8064-db1e702f8da5-metrics-client-ca\") pod \"kube-state-metrics-777cb5bd5d-vfdrv\" (UID: \"621d07ba-8cd2-4aa8-8064-db1e702f8da5\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-vfdrv" Dec 01 08:24:10 crc kubenswrapper[5004]: I1201 08:24:10.965079 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/5d20148e-b0bd-4052-a9d7-ed4064cc0f3a-root\") pod \"node-exporter-g5xj9\" (UID: \"5d20148e-b0bd-4052-a9d7-ed4064cc0f3a\") " pod="openshift-monitoring/node-exporter-g5xj9" Dec 01 08:24:10 crc kubenswrapper[5004]: I1201 08:24:10.965846 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/621d07ba-8cd2-4aa8-8064-db1e702f8da5-kube-state-metrics-custom-resource-state-configmap\") pod 
\"kube-state-metrics-777cb5bd5d-vfdrv\" (UID: \"621d07ba-8cd2-4aa8-8064-db1e702f8da5\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-vfdrv" Dec 01 08:24:10 crc kubenswrapper[5004]: I1201 08:24:10.966088 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/621d07ba-8cd2-4aa8-8064-db1e702f8da5-volume-directive-shadow\") pod \"kube-state-metrics-777cb5bd5d-vfdrv\" (UID: \"621d07ba-8cd2-4aa8-8064-db1e702f8da5\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-vfdrv" Dec 01 08:24:10 crc kubenswrapper[5004]: I1201 08:24:10.966127 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5d20148e-b0bd-4052-a9d7-ed4064cc0f3a-metrics-client-ca\") pod \"node-exporter-g5xj9\" (UID: \"5d20148e-b0bd-4052-a9d7-ed4064cc0f3a\") " pod="openshift-monitoring/node-exporter-g5xj9" Dec 01 08:24:10 crc kubenswrapper[5004]: E1201 08:24:10.966143 5004 secret.go:188] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found Dec 01 08:24:10 crc kubenswrapper[5004]: E1201 08:24:10.966200 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/621d07ba-8cd2-4aa8-8064-db1e702f8da5-kube-state-metrics-tls podName:621d07ba-8cd2-4aa8-8064-db1e702f8da5 nodeName:}" failed. No retries permitted until 2025-12-01 08:24:11.46618207 +0000 UTC m=+429.031174052 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/621d07ba-8cd2-4aa8-8064-db1e702f8da5-kube-state-metrics-tls") pod "kube-state-metrics-777cb5bd5d-vfdrv" (UID: "621d07ba-8cd2-4aa8-8064-db1e702f8da5") : secret "kube-state-metrics-tls" not found Dec 01 08:24:10 crc kubenswrapper[5004]: I1201 08:24:10.980200 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5d20148e-b0bd-4052-a9d7-ed4064cc0f3a-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-g5xj9\" (UID: \"5d20148e-b0bd-4052-a9d7-ed4064cc0f3a\") " pod="openshift-monitoring/node-exporter-g5xj9" Dec 01 08:24:10 crc kubenswrapper[5004]: I1201 08:24:10.980254 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/5d20148e-b0bd-4052-a9d7-ed4064cc0f3a-node-exporter-tls\") pod \"node-exporter-g5xj9\" (UID: \"5d20148e-b0bd-4052-a9d7-ed4064cc0f3a\") " pod="openshift-monitoring/node-exporter-g5xj9" Dec 01 08:24:10 crc kubenswrapper[5004]: I1201 08:24:10.980380 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/621d07ba-8cd2-4aa8-8064-db1e702f8da5-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-777cb5bd5d-vfdrv\" (UID: \"621d07ba-8cd2-4aa8-8064-db1e702f8da5\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-vfdrv" Dec 01 08:24:10 crc kubenswrapper[5004]: I1201 08:24:10.988124 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q46cq\" (UniqueName: \"kubernetes.io/projected/621d07ba-8cd2-4aa8-8064-db1e702f8da5-kube-api-access-q46cq\") pod \"kube-state-metrics-777cb5bd5d-vfdrv\" (UID: \"621d07ba-8cd2-4aa8-8064-db1e702f8da5\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-vfdrv" Dec 01 08:24:10 crc 
kubenswrapper[5004]: I1201 08:24:10.993096 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9kbr\" (UniqueName: \"kubernetes.io/projected/5d20148e-b0bd-4052-a9d7-ed4064cc0f3a-kube-api-access-d9kbr\") pod \"node-exporter-g5xj9\" (UID: \"5d20148e-b0bd-4052-a9d7-ed4064cc0f3a\") " pod="openshift-monitoring/node-exporter-g5xj9" Dec 01 08:24:11 crc kubenswrapper[5004]: I1201 08:24:11.060791 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-g5xj9" Dec 01 08:24:11 crc kubenswrapper[5004]: I1201 08:24:11.369586 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/d4291cf4-2b49-471b-9526-8591c8e15bab-openshift-state-metrics-tls\") pod \"openshift-state-metrics-566fddb674-5hh42\" (UID: \"d4291cf4-2b49-471b-9526-8591c8e15bab\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-5hh42" Dec 01 08:24:11 crc kubenswrapper[5004]: I1201 08:24:11.376048 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/d4291cf4-2b49-471b-9526-8591c8e15bab-openshift-state-metrics-tls\") pod \"openshift-state-metrics-566fddb674-5hh42\" (UID: \"d4291cf4-2b49-471b-9526-8591c8e15bab\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-5hh42" Dec 01 08:24:11 crc kubenswrapper[5004]: I1201 08:24:11.471123 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/621d07ba-8cd2-4aa8-8064-db1e702f8da5-kube-state-metrics-tls\") pod \"kube-state-metrics-777cb5bd5d-vfdrv\" (UID: \"621d07ba-8cd2-4aa8-8064-db1e702f8da5\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-vfdrv" Dec 01 08:24:11 crc kubenswrapper[5004]: I1201 08:24:11.471663 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/node-exporter-g5xj9" event={"ID":"5d20148e-b0bd-4052-a9d7-ed4064cc0f3a","Type":"ContainerStarted","Data":"38deb83151aea56c1329d49aac0f916b72f5afa7f4de382fa42636ddf26034d0"} Dec 01 08:24:11 crc kubenswrapper[5004]: I1201 08:24:11.477096 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/621d07ba-8cd2-4aa8-8064-db1e702f8da5-kube-state-metrics-tls\") pod \"kube-state-metrics-777cb5bd5d-vfdrv\" (UID: \"621d07ba-8cd2-4aa8-8064-db1e702f8da5\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-vfdrv" Dec 01 08:24:11 crc kubenswrapper[5004]: I1201 08:24:11.614643 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-566fddb674-5hh42" Dec 01 08:24:11 crc kubenswrapper[5004]: I1201 08:24:11.677546 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-vfdrv" Dec 01 08:24:11 crc kubenswrapper[5004]: I1201 08:24:11.794251 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Dec 01 08:24:11 crc kubenswrapper[5004]: I1201 08:24:11.798164 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Dec 01 08:24:11 crc kubenswrapper[5004]: I1201 08:24:11.802762 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy" Dec 01 08:24:11 crc kubenswrapper[5004]: I1201 08:24:11.804415 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-web-config" Dec 01 08:24:11 crc kubenswrapper[5004]: I1201 08:24:11.805013 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls-assets-0" Dec 01 08:24:11 crc kubenswrapper[5004]: I1201 08:24:11.805282 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls" Dec 01 08:24:11 crc kubenswrapper[5004]: I1201 08:24:11.805425 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-generated" Dec 01 08:24:11 crc kubenswrapper[5004]: I1201 08:24:11.805785 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-metric" Dec 01 08:24:11 crc kubenswrapper[5004]: I1201 08:24:11.805934 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-web" Dec 01 08:24:11 crc kubenswrapper[5004]: I1201 08:24:11.807519 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-dockercfg-59q8f" Dec 01 08:24:11 crc kubenswrapper[5004]: I1201 08:24:11.811701 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"alertmanager-trusted-ca-bundle" Dec 01 08:24:11 crc kubenswrapper[5004]: I1201 08:24:11.873954 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Dec 01 08:24:11 crc kubenswrapper[5004]: I1201 08:24:11.979795 5004 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1e2dafee-52e4-4eb0-a77b-4270ed20783b-config-out\") pod \"alertmanager-main-0\" (UID: \"1e2dafee-52e4-4eb0-a77b-4270ed20783b\") " pod="openshift-monitoring/alertmanager-main-0" Dec 01 08:24:11 crc kubenswrapper[5004]: I1201 08:24:11.979856 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/1e2dafee-52e4-4eb0-a77b-4270ed20783b-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"1e2dafee-52e4-4eb0-a77b-4270ed20783b\") " pod="openshift-monitoring/alertmanager-main-0" Dec 01 08:24:11 crc kubenswrapper[5004]: I1201 08:24:11.979916 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1e2dafee-52e4-4eb0-a77b-4270ed20783b-tls-assets\") pod \"alertmanager-main-0\" (UID: \"1e2dafee-52e4-4eb0-a77b-4270ed20783b\") " pod="openshift-monitoring/alertmanager-main-0" Dec 01 08:24:11 crc kubenswrapper[5004]: I1201 08:24:11.979938 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1e2dafee-52e4-4eb0-a77b-4270ed20783b-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"1e2dafee-52e4-4eb0-a77b-4270ed20783b\") " pod="openshift-monitoring/alertmanager-main-0" Dec 01 08:24:11 crc kubenswrapper[5004]: I1201 08:24:11.979976 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1e2dafee-52e4-4eb0-a77b-4270ed20783b-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"1e2dafee-52e4-4eb0-a77b-4270ed20783b\") " pod="openshift-monitoring/alertmanager-main-0" Dec 01 
08:24:11 crc kubenswrapper[5004]: I1201 08:24:11.979993 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x82n8\" (UniqueName: \"kubernetes.io/projected/1e2dafee-52e4-4eb0-a77b-4270ed20783b-kube-api-access-x82n8\") pod \"alertmanager-main-0\" (UID: \"1e2dafee-52e4-4eb0-a77b-4270ed20783b\") " pod="openshift-monitoring/alertmanager-main-0" Dec 01 08:24:11 crc kubenswrapper[5004]: I1201 08:24:11.980008 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/1e2dafee-52e4-4eb0-a77b-4270ed20783b-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"1e2dafee-52e4-4eb0-a77b-4270ed20783b\") " pod="openshift-monitoring/alertmanager-main-0" Dec 01 08:24:11 crc kubenswrapper[5004]: I1201 08:24:11.980062 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1e2dafee-52e4-4eb0-a77b-4270ed20783b-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"1e2dafee-52e4-4eb0-a77b-4270ed20783b\") " pod="openshift-monitoring/alertmanager-main-0" Dec 01 08:24:11 crc kubenswrapper[5004]: I1201 08:24:11.980083 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/1e2dafee-52e4-4eb0-a77b-4270ed20783b-config-volume\") pod \"alertmanager-main-0\" (UID: \"1e2dafee-52e4-4eb0-a77b-4270ed20783b\") " pod="openshift-monitoring/alertmanager-main-0" Dec 01 08:24:11 crc kubenswrapper[5004]: I1201 08:24:11.980144 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/1e2dafee-52e4-4eb0-a77b-4270ed20783b-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"1e2dafee-52e4-4eb0-a77b-4270ed20783b\") " pod="openshift-monitoring/alertmanager-main-0" Dec 01 08:24:11 crc kubenswrapper[5004]: I1201 08:24:11.980165 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/1e2dafee-52e4-4eb0-a77b-4270ed20783b-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"1e2dafee-52e4-4eb0-a77b-4270ed20783b\") " pod="openshift-monitoring/alertmanager-main-0" Dec 01 08:24:11 crc kubenswrapper[5004]: I1201 08:24:11.980185 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1e2dafee-52e4-4eb0-a77b-4270ed20783b-web-config\") pod \"alertmanager-main-0\" (UID: \"1e2dafee-52e4-4eb0-a77b-4270ed20783b\") " pod="openshift-monitoring/alertmanager-main-0" Dec 01 08:24:12 crc kubenswrapper[5004]: I1201 08:24:12.042681 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-566fddb674-5hh42"] Dec 01 08:24:12 crc kubenswrapper[5004]: I1201 08:24:12.081269 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1e2dafee-52e4-4eb0-a77b-4270ed20783b-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"1e2dafee-52e4-4eb0-a77b-4270ed20783b\") " pod="openshift-monitoring/alertmanager-main-0" Dec 01 08:24:12 crc kubenswrapper[5004]: I1201 08:24:12.081316 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/1e2dafee-52e4-4eb0-a77b-4270ed20783b-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"1e2dafee-52e4-4eb0-a77b-4270ed20783b\") " 
pod="openshift-monitoring/alertmanager-main-0" Dec 01 08:24:12 crc kubenswrapper[5004]: I1201 08:24:12.081341 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1e2dafee-52e4-4eb0-a77b-4270ed20783b-web-config\") pod \"alertmanager-main-0\" (UID: \"1e2dafee-52e4-4eb0-a77b-4270ed20783b\") " pod="openshift-monitoring/alertmanager-main-0" Dec 01 08:24:12 crc kubenswrapper[5004]: I1201 08:24:12.081376 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1e2dafee-52e4-4eb0-a77b-4270ed20783b-config-out\") pod \"alertmanager-main-0\" (UID: \"1e2dafee-52e4-4eb0-a77b-4270ed20783b\") " pod="openshift-monitoring/alertmanager-main-0" Dec 01 08:24:12 crc kubenswrapper[5004]: I1201 08:24:12.081411 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/1e2dafee-52e4-4eb0-a77b-4270ed20783b-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"1e2dafee-52e4-4eb0-a77b-4270ed20783b\") " pod="openshift-monitoring/alertmanager-main-0" Dec 01 08:24:12 crc kubenswrapper[5004]: I1201 08:24:12.081439 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1e2dafee-52e4-4eb0-a77b-4270ed20783b-tls-assets\") pod \"alertmanager-main-0\" (UID: \"1e2dafee-52e4-4eb0-a77b-4270ed20783b\") " pod="openshift-monitoring/alertmanager-main-0" Dec 01 08:24:12 crc kubenswrapper[5004]: I1201 08:24:12.081455 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1e2dafee-52e4-4eb0-a77b-4270ed20783b-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"1e2dafee-52e4-4eb0-a77b-4270ed20783b\") " pod="openshift-monitoring/alertmanager-main-0" Dec 
01 08:24:12 crc kubenswrapper[5004]: I1201 08:24:12.081474 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1e2dafee-52e4-4eb0-a77b-4270ed20783b-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"1e2dafee-52e4-4eb0-a77b-4270ed20783b\") " pod="openshift-monitoring/alertmanager-main-0" Dec 01 08:24:12 crc kubenswrapper[5004]: I1201 08:24:12.081492 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x82n8\" (UniqueName: \"kubernetes.io/projected/1e2dafee-52e4-4eb0-a77b-4270ed20783b-kube-api-access-x82n8\") pod \"alertmanager-main-0\" (UID: \"1e2dafee-52e4-4eb0-a77b-4270ed20783b\") " pod="openshift-monitoring/alertmanager-main-0" Dec 01 08:24:12 crc kubenswrapper[5004]: I1201 08:24:12.081506 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/1e2dafee-52e4-4eb0-a77b-4270ed20783b-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"1e2dafee-52e4-4eb0-a77b-4270ed20783b\") " pod="openshift-monitoring/alertmanager-main-0" Dec 01 08:24:12 crc kubenswrapper[5004]: I1201 08:24:12.081524 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1e2dafee-52e4-4eb0-a77b-4270ed20783b-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"1e2dafee-52e4-4eb0-a77b-4270ed20783b\") " pod="openshift-monitoring/alertmanager-main-0" Dec 01 08:24:12 crc kubenswrapper[5004]: I1201 08:24:12.081544 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/1e2dafee-52e4-4eb0-a77b-4270ed20783b-config-volume\") pod \"alertmanager-main-0\" (UID: \"1e2dafee-52e4-4eb0-a77b-4270ed20783b\") " 
pod="openshift-monitoring/alertmanager-main-0" Dec 01 08:24:12 crc kubenswrapper[5004]: I1201 08:24:12.082417 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/1e2dafee-52e4-4eb0-a77b-4270ed20783b-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"1e2dafee-52e4-4eb0-a77b-4270ed20783b\") " pod="openshift-monitoring/alertmanager-main-0" Dec 01 08:24:12 crc kubenswrapper[5004]: I1201 08:24:12.082753 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1e2dafee-52e4-4eb0-a77b-4270ed20783b-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"1e2dafee-52e4-4eb0-a77b-4270ed20783b\") " pod="openshift-monitoring/alertmanager-main-0" Dec 01 08:24:12 crc kubenswrapper[5004]: I1201 08:24:12.082955 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1e2dafee-52e4-4eb0-a77b-4270ed20783b-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"1e2dafee-52e4-4eb0-a77b-4270ed20783b\") " pod="openshift-monitoring/alertmanager-main-0" Dec 01 08:24:12 crc kubenswrapper[5004]: I1201 08:24:12.088253 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/1e2dafee-52e4-4eb0-a77b-4270ed20783b-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"1e2dafee-52e4-4eb0-a77b-4270ed20783b\") " pod="openshift-monitoring/alertmanager-main-0" Dec 01 08:24:12 crc kubenswrapper[5004]: I1201 08:24:12.088389 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1e2dafee-52e4-4eb0-a77b-4270ed20783b-web-config\") pod \"alertmanager-main-0\" (UID: \"1e2dafee-52e4-4eb0-a77b-4270ed20783b\") " 
pod="openshift-monitoring/alertmanager-main-0" Dec 01 08:24:12 crc kubenswrapper[5004]: I1201 08:24:12.088584 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/1e2dafee-52e4-4eb0-a77b-4270ed20783b-config-volume\") pod \"alertmanager-main-0\" (UID: \"1e2dafee-52e4-4eb0-a77b-4270ed20783b\") " pod="openshift-monitoring/alertmanager-main-0" Dec 01 08:24:12 crc kubenswrapper[5004]: I1201 08:24:12.088650 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1e2dafee-52e4-4eb0-a77b-4270ed20783b-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"1e2dafee-52e4-4eb0-a77b-4270ed20783b\") " pod="openshift-monitoring/alertmanager-main-0" Dec 01 08:24:12 crc kubenswrapper[5004]: I1201 08:24:12.088767 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1e2dafee-52e4-4eb0-a77b-4270ed20783b-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"1e2dafee-52e4-4eb0-a77b-4270ed20783b\") " pod="openshift-monitoring/alertmanager-main-0" Dec 01 08:24:12 crc kubenswrapper[5004]: I1201 08:24:12.089247 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/1e2dafee-52e4-4eb0-a77b-4270ed20783b-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"1e2dafee-52e4-4eb0-a77b-4270ed20783b\") " pod="openshift-monitoring/alertmanager-main-0" Dec 01 08:24:12 crc kubenswrapper[5004]: I1201 08:24:12.090863 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1e2dafee-52e4-4eb0-a77b-4270ed20783b-tls-assets\") pod \"alertmanager-main-0\" (UID: \"1e2dafee-52e4-4eb0-a77b-4270ed20783b\") " 
pod="openshift-monitoring/alertmanager-main-0" Dec 01 08:24:12 crc kubenswrapper[5004]: I1201 08:24:12.091695 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1e2dafee-52e4-4eb0-a77b-4270ed20783b-config-out\") pod \"alertmanager-main-0\" (UID: \"1e2dafee-52e4-4eb0-a77b-4270ed20783b\") " pod="openshift-monitoring/alertmanager-main-0" Dec 01 08:24:12 crc kubenswrapper[5004]: I1201 08:24:12.104886 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x82n8\" (UniqueName: \"kubernetes.io/projected/1e2dafee-52e4-4eb0-a77b-4270ed20783b-kube-api-access-x82n8\") pod \"alertmanager-main-0\" (UID: \"1e2dafee-52e4-4eb0-a77b-4270ed20783b\") " pod="openshift-monitoring/alertmanager-main-0" Dec 01 08:24:12 crc kubenswrapper[5004]: I1201 08:24:12.135169 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Dec 01 08:24:12 crc kubenswrapper[5004]: I1201 08:24:12.145983 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-777cb5bd5d-vfdrv"] Dec 01 08:24:12 crc kubenswrapper[5004]: I1201 08:24:12.362805 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Dec 01 08:24:12 crc kubenswrapper[5004]: W1201 08:24:12.371116 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e2dafee_52e4_4eb0_a77b_4270ed20783b.slice/crio-fcdd06e1ccab24edfb47b055f2a381f353c4f26dd4b0ce492c1c753ef6a38387 WatchSource:0}: Error finding container fcdd06e1ccab24edfb47b055f2a381f353c4f26dd4b0ce492c1c753ef6a38387: Status 404 returned error can't find the container with id fcdd06e1ccab24edfb47b055f2a381f353c4f26dd4b0ce492c1c753ef6a38387 Dec 01 08:24:12 crc kubenswrapper[5004]: I1201 08:24:12.477546 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-vfdrv" event={"ID":"621d07ba-8cd2-4aa8-8064-db1e702f8da5","Type":"ContainerStarted","Data":"8e9708c111b6b9ea044316e357f019a7275e70defa0a9739b3f0d8a078691dd4"} Dec 01 08:24:12 crc kubenswrapper[5004]: I1201 08:24:12.479395 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-566fddb674-5hh42" event={"ID":"d4291cf4-2b49-471b-9526-8591c8e15bab","Type":"ContainerStarted","Data":"c31600d8794d782908333fb324b731a17ec4d99f541a45d7644300cd678e9ceb"} Dec 01 08:24:12 crc kubenswrapper[5004]: I1201 08:24:12.479640 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-566fddb674-5hh42" event={"ID":"d4291cf4-2b49-471b-9526-8591c8e15bab","Type":"ContainerStarted","Data":"bf7d2ff9e338b9af34689f70ffdb0e3fe80c112bf5d017ec0f0bfae450489556"} Dec 01 08:24:12 crc kubenswrapper[5004]: I1201 08:24:12.479657 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-566fddb674-5hh42" event={"ID":"d4291cf4-2b49-471b-9526-8591c8e15bab","Type":"ContainerStarted","Data":"f85013c1c1e5c86859ca949376482fc49697147b99bc8c8dedde6aca50627169"} Dec 01 08:24:12 crc kubenswrapper[5004]: I1201 08:24:12.481123 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1e2dafee-52e4-4eb0-a77b-4270ed20783b","Type":"ContainerStarted","Data":"fcdd06e1ccab24edfb47b055f2a381f353c4f26dd4b0ce492c1c753ef6a38387"} Dec 01 08:24:12 crc kubenswrapper[5004]: I1201 08:24:12.807111 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-556c87fb6b-jgp46"] Dec 01 08:24:12 crc kubenswrapper[5004]: I1201 08:24:12.809419 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-556c87fb6b-jgp46" Dec 01 08:24:12 crc kubenswrapper[5004]: I1201 08:24:12.813927 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-grpc-tls-6ftl1vkusl2d1" Dec 01 08:24:12 crc kubenswrapper[5004]: I1201 08:24:12.814412 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-tls" Dec 01 08:24:12 crc kubenswrapper[5004]: I1201 08:24:12.814615 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-dockercfg-qdftj" Dec 01 08:24:12 crc kubenswrapper[5004]: I1201 08:24:12.814639 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-metrics" Dec 01 08:24:12 crc kubenswrapper[5004]: I1201 08:24:12.814772 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy" Dec 01 08:24:12 crc kubenswrapper[5004]: I1201 08:24:12.814782 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-web" Dec 01 08:24:12 crc kubenswrapper[5004]: I1201 08:24:12.815266 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-rules" Dec 01 08:24:12 crc kubenswrapper[5004]: I1201 08:24:12.836768 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-556c87fb6b-jgp46"] Dec 01 08:24:13 crc kubenswrapper[5004]: I1201 08:24:13.006892 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/12a0058f-3b45-4056-bdbd-5ad4cfcb8ad3-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-556c87fb6b-jgp46\" (UID: \"12a0058f-3b45-4056-bdbd-5ad4cfcb8ad3\") " 
pod="openshift-monitoring/thanos-querier-556c87fb6b-jgp46" Dec 01 08:24:13 crc kubenswrapper[5004]: I1201 08:24:13.006971 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/12a0058f-3b45-4056-bdbd-5ad4cfcb8ad3-secret-thanos-querier-tls\") pod \"thanos-querier-556c87fb6b-jgp46\" (UID: \"12a0058f-3b45-4056-bdbd-5ad4cfcb8ad3\") " pod="openshift-monitoring/thanos-querier-556c87fb6b-jgp46" Dec 01 08:24:13 crc kubenswrapper[5004]: I1201 08:24:13.007009 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/12a0058f-3b45-4056-bdbd-5ad4cfcb8ad3-secret-grpc-tls\") pod \"thanos-querier-556c87fb6b-jgp46\" (UID: \"12a0058f-3b45-4056-bdbd-5ad4cfcb8ad3\") " pod="openshift-monitoring/thanos-querier-556c87fb6b-jgp46" Dec 01 08:24:13 crc kubenswrapper[5004]: I1201 08:24:13.007164 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/12a0058f-3b45-4056-bdbd-5ad4cfcb8ad3-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-556c87fb6b-jgp46\" (UID: \"12a0058f-3b45-4056-bdbd-5ad4cfcb8ad3\") " pod="openshift-monitoring/thanos-querier-556c87fb6b-jgp46" Dec 01 08:24:13 crc kubenswrapper[5004]: I1201 08:24:13.007238 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/12a0058f-3b45-4056-bdbd-5ad4cfcb8ad3-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-556c87fb6b-jgp46\" (UID: \"12a0058f-3b45-4056-bdbd-5ad4cfcb8ad3\") " pod="openshift-monitoring/thanos-querier-556c87fb6b-jgp46" Dec 01 08:24:13 crc kubenswrapper[5004]: I1201 08:24:13.007275 5004 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dh57w\" (UniqueName: \"kubernetes.io/projected/12a0058f-3b45-4056-bdbd-5ad4cfcb8ad3-kube-api-access-dh57w\") pod \"thanos-querier-556c87fb6b-jgp46\" (UID: \"12a0058f-3b45-4056-bdbd-5ad4cfcb8ad3\") " pod="openshift-monitoring/thanos-querier-556c87fb6b-jgp46" Dec 01 08:24:13 crc kubenswrapper[5004]: I1201 08:24:13.007415 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/12a0058f-3b45-4056-bdbd-5ad4cfcb8ad3-metrics-client-ca\") pod \"thanos-querier-556c87fb6b-jgp46\" (UID: \"12a0058f-3b45-4056-bdbd-5ad4cfcb8ad3\") " pod="openshift-monitoring/thanos-querier-556c87fb6b-jgp46" Dec 01 08:24:13 crc kubenswrapper[5004]: I1201 08:24:13.007454 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/12a0058f-3b45-4056-bdbd-5ad4cfcb8ad3-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-556c87fb6b-jgp46\" (UID: \"12a0058f-3b45-4056-bdbd-5ad4cfcb8ad3\") " pod="openshift-monitoring/thanos-querier-556c87fb6b-jgp46" Dec 01 08:24:13 crc kubenswrapper[5004]: I1201 08:24:13.108249 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/12a0058f-3b45-4056-bdbd-5ad4cfcb8ad3-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-556c87fb6b-jgp46\" (UID: \"12a0058f-3b45-4056-bdbd-5ad4cfcb8ad3\") " pod="openshift-monitoring/thanos-querier-556c87fb6b-jgp46" Dec 01 08:24:13 crc kubenswrapper[5004]: I1201 08:24:13.108318 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: 
\"kubernetes.io/secret/12a0058f-3b45-4056-bdbd-5ad4cfcb8ad3-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-556c87fb6b-jgp46\" (UID: \"12a0058f-3b45-4056-bdbd-5ad4cfcb8ad3\") " pod="openshift-monitoring/thanos-querier-556c87fb6b-jgp46" Dec 01 08:24:13 crc kubenswrapper[5004]: I1201 08:24:13.108353 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dh57w\" (UniqueName: \"kubernetes.io/projected/12a0058f-3b45-4056-bdbd-5ad4cfcb8ad3-kube-api-access-dh57w\") pod \"thanos-querier-556c87fb6b-jgp46\" (UID: \"12a0058f-3b45-4056-bdbd-5ad4cfcb8ad3\") " pod="openshift-monitoring/thanos-querier-556c87fb6b-jgp46" Dec 01 08:24:13 crc kubenswrapper[5004]: I1201 08:24:13.108416 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/12a0058f-3b45-4056-bdbd-5ad4cfcb8ad3-metrics-client-ca\") pod \"thanos-querier-556c87fb6b-jgp46\" (UID: \"12a0058f-3b45-4056-bdbd-5ad4cfcb8ad3\") " pod="openshift-monitoring/thanos-querier-556c87fb6b-jgp46" Dec 01 08:24:13 crc kubenswrapper[5004]: I1201 08:24:13.108444 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/12a0058f-3b45-4056-bdbd-5ad4cfcb8ad3-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-556c87fb6b-jgp46\" (UID: \"12a0058f-3b45-4056-bdbd-5ad4cfcb8ad3\") " pod="openshift-monitoring/thanos-querier-556c87fb6b-jgp46" Dec 01 08:24:13 crc kubenswrapper[5004]: I1201 08:24:13.108473 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/12a0058f-3b45-4056-bdbd-5ad4cfcb8ad3-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-556c87fb6b-jgp46\" (UID: \"12a0058f-3b45-4056-bdbd-5ad4cfcb8ad3\") " pod="openshift-monitoring/thanos-querier-556c87fb6b-jgp46" 
Dec 01 08:24:13 crc kubenswrapper[5004]: I1201 08:24:13.108494 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/12a0058f-3b45-4056-bdbd-5ad4cfcb8ad3-secret-thanos-querier-tls\") pod \"thanos-querier-556c87fb6b-jgp46\" (UID: \"12a0058f-3b45-4056-bdbd-5ad4cfcb8ad3\") " pod="openshift-monitoring/thanos-querier-556c87fb6b-jgp46" Dec 01 08:24:13 crc kubenswrapper[5004]: I1201 08:24:13.108519 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/12a0058f-3b45-4056-bdbd-5ad4cfcb8ad3-secret-grpc-tls\") pod \"thanos-querier-556c87fb6b-jgp46\" (UID: \"12a0058f-3b45-4056-bdbd-5ad4cfcb8ad3\") " pod="openshift-monitoring/thanos-querier-556c87fb6b-jgp46" Dec 01 08:24:13 crc kubenswrapper[5004]: I1201 08:24:13.111624 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/12a0058f-3b45-4056-bdbd-5ad4cfcb8ad3-metrics-client-ca\") pod \"thanos-querier-556c87fb6b-jgp46\" (UID: \"12a0058f-3b45-4056-bdbd-5ad4cfcb8ad3\") " pod="openshift-monitoring/thanos-querier-556c87fb6b-jgp46" Dec 01 08:24:13 crc kubenswrapper[5004]: I1201 08:24:13.117399 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/12a0058f-3b45-4056-bdbd-5ad4cfcb8ad3-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-556c87fb6b-jgp46\" (UID: \"12a0058f-3b45-4056-bdbd-5ad4cfcb8ad3\") " pod="openshift-monitoring/thanos-querier-556c87fb6b-jgp46" Dec 01 08:24:13 crc kubenswrapper[5004]: I1201 08:24:13.121237 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/12a0058f-3b45-4056-bdbd-5ad4cfcb8ad3-secret-thanos-querier-kube-rbac-proxy-metrics\") pod 
\"thanos-querier-556c87fb6b-jgp46\" (UID: \"12a0058f-3b45-4056-bdbd-5ad4cfcb8ad3\") " pod="openshift-monitoring/thanos-querier-556c87fb6b-jgp46" Dec 01 08:24:13 crc kubenswrapper[5004]: I1201 08:24:13.124359 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/12a0058f-3b45-4056-bdbd-5ad4cfcb8ad3-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-556c87fb6b-jgp46\" (UID: \"12a0058f-3b45-4056-bdbd-5ad4cfcb8ad3\") " pod="openshift-monitoring/thanos-querier-556c87fb6b-jgp46" Dec 01 08:24:13 crc kubenswrapper[5004]: I1201 08:24:13.124962 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/12a0058f-3b45-4056-bdbd-5ad4cfcb8ad3-secret-thanos-querier-tls\") pod \"thanos-querier-556c87fb6b-jgp46\" (UID: \"12a0058f-3b45-4056-bdbd-5ad4cfcb8ad3\") " pod="openshift-monitoring/thanos-querier-556c87fb6b-jgp46" Dec 01 08:24:13 crc kubenswrapper[5004]: I1201 08:24:13.128197 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/12a0058f-3b45-4056-bdbd-5ad4cfcb8ad3-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-556c87fb6b-jgp46\" (UID: \"12a0058f-3b45-4056-bdbd-5ad4cfcb8ad3\") " pod="openshift-monitoring/thanos-querier-556c87fb6b-jgp46" Dec 01 08:24:13 crc kubenswrapper[5004]: I1201 08:24:13.129489 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dh57w\" (UniqueName: \"kubernetes.io/projected/12a0058f-3b45-4056-bdbd-5ad4cfcb8ad3-kube-api-access-dh57w\") pod \"thanos-querier-556c87fb6b-jgp46\" (UID: \"12a0058f-3b45-4056-bdbd-5ad4cfcb8ad3\") " pod="openshift-monitoring/thanos-querier-556c87fb6b-jgp46" Dec 01 08:24:13 crc kubenswrapper[5004]: I1201 08:24:13.133432 5004 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/12a0058f-3b45-4056-bdbd-5ad4cfcb8ad3-secret-grpc-tls\") pod \"thanos-querier-556c87fb6b-jgp46\" (UID: \"12a0058f-3b45-4056-bdbd-5ad4cfcb8ad3\") " pod="openshift-monitoring/thanos-querier-556c87fb6b-jgp46" Dec 01 08:24:13 crc kubenswrapper[5004]: I1201 08:24:13.135940 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-556c87fb6b-jgp46" Dec 01 08:24:13 crc kubenswrapper[5004]: I1201 08:24:13.492872 5004 generic.go:334] "Generic (PLEG): container finished" podID="5d20148e-b0bd-4052-a9d7-ed4064cc0f3a" containerID="cd14525cf8f227a72201a97ca179166e47b13d54bcbef39887ffc35eb87bc95e" exitCode=0 Dec 01 08:24:13 crc kubenswrapper[5004]: I1201 08:24:13.493292 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-g5xj9" event={"ID":"5d20148e-b0bd-4052-a9d7-ed4064cc0f3a","Type":"ContainerDied","Data":"cd14525cf8f227a72201a97ca179166e47b13d54bcbef39887ffc35eb87bc95e"} Dec 01 08:24:13 crc kubenswrapper[5004]: I1201 08:24:13.602060 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-556c87fb6b-jgp46"] Dec 01 08:24:14 crc kubenswrapper[5004]: I1201 08:24:14.502523 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-g5xj9" event={"ID":"5d20148e-b0bd-4052-a9d7-ed4064cc0f3a","Type":"ContainerStarted","Data":"e5828330172d24fdd0fbaab318ceefc8afc7e243c0646eedeff28d42c115ffd0"} Dec 01 08:24:14 crc kubenswrapper[5004]: I1201 08:24:14.503735 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-556c87fb6b-jgp46" event={"ID":"12a0058f-3b45-4056-bdbd-5ad4cfcb8ad3","Type":"ContainerStarted","Data":"5b510f646cfcb73b24ddbf2cb6272cc0b001f18f132d6bb6c7d266c7fbe48875"} Dec 01 08:24:15 crc kubenswrapper[5004]: I1201 08:24:15.502420 5004 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-console/console-5d975cc89b-pdv2q"] Dec 01 08:24:15 crc kubenswrapper[5004]: I1201 08:24:15.504407 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5d975cc89b-pdv2q" Dec 01 08:24:15 crc kubenswrapper[5004]: I1201 08:24:15.513094 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-g5xj9" event={"ID":"5d20148e-b0bd-4052-a9d7-ed4064cc0f3a","Type":"ContainerStarted","Data":"44fb559a0e09e7fcb464fc9d10f62bf5143dce0be7399ab6492e180e87b5c95c"} Dec 01 08:24:15 crc kubenswrapper[5004]: I1201 08:24:15.519780 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-vfdrv" event={"ID":"621d07ba-8cd2-4aa8-8064-db1e702f8da5","Type":"ContainerStarted","Data":"de83ab1ccdbc5c2740e85e34182801759aa4f255126752363f6cc080f481915e"} Dec 01 08:24:15 crc kubenswrapper[5004]: I1201 08:24:15.519825 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-vfdrv" event={"ID":"621d07ba-8cd2-4aa8-8064-db1e702f8da5","Type":"ContainerStarted","Data":"c58ced688a5267722d10b5f2209c7d7697cccf15245cb61943ad05f57f5ff44f"} Dec 01 08:24:15 crc kubenswrapper[5004]: I1201 08:24:15.519836 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-vfdrv" event={"ID":"621d07ba-8cd2-4aa8-8064-db1e702f8da5","Type":"ContainerStarted","Data":"ac0826c6b05e62c0a49c58083767fd157d0a55183f0928dbc8e8dd8b9a77a32c"} Dec 01 08:24:15 crc kubenswrapper[5004]: I1201 08:24:15.520709 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5d975cc89b-pdv2q"] Dec 01 08:24:15 crc kubenswrapper[5004]: I1201 08:24:15.526679 5004 generic.go:334] "Generic (PLEG): container finished" podID="1e2dafee-52e4-4eb0-a77b-4270ed20783b" containerID="714c3ee29615cb28e5f23f8adedd78b6c6bec5f35d4648ab0e80b93505f68b80" exitCode=0 Dec 01 08:24:15 crc 
kubenswrapper[5004]: I1201 08:24:15.527900 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-566fddb674-5hh42" event={"ID":"d4291cf4-2b49-471b-9526-8591c8e15bab","Type":"ContainerStarted","Data":"4157af433e7d2643d9e0f04ecc80719b9c2b6153ac7b24b1a52b8d7e4aa4c5ca"} Dec 01 08:24:15 crc kubenswrapper[5004]: I1201 08:24:15.527959 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1e2dafee-52e4-4eb0-a77b-4270ed20783b","Type":"ContainerDied","Data":"714c3ee29615cb28e5f23f8adedd78b6c6bec5f35d4648ab0e80b93505f68b80"} Dec 01 08:24:15 crc kubenswrapper[5004]: I1201 08:24:15.549466 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e64c8553-0fbc-499f-a966-357925f13415-oauth-serving-cert\") pod \"console-5d975cc89b-pdv2q\" (UID: \"e64c8553-0fbc-499f-a966-357925f13415\") " pod="openshift-console/console-5d975cc89b-pdv2q" Dec 01 08:24:15 crc kubenswrapper[5004]: I1201 08:24:15.549894 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e64c8553-0fbc-499f-a966-357925f13415-service-ca\") pod \"console-5d975cc89b-pdv2q\" (UID: \"e64c8553-0fbc-499f-a966-357925f13415\") " pod="openshift-console/console-5d975cc89b-pdv2q" Dec 01 08:24:15 crc kubenswrapper[5004]: I1201 08:24:15.549987 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmrjt\" (UniqueName: \"kubernetes.io/projected/e64c8553-0fbc-499f-a966-357925f13415-kube-api-access-jmrjt\") pod \"console-5d975cc89b-pdv2q\" (UID: \"e64c8553-0fbc-499f-a966-357925f13415\") " pod="openshift-console/console-5d975cc89b-pdv2q" Dec 01 08:24:15 crc kubenswrapper[5004]: I1201 08:24:15.550065 5004 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e64c8553-0fbc-499f-a966-357925f13415-trusted-ca-bundle\") pod \"console-5d975cc89b-pdv2q\" (UID: \"e64c8553-0fbc-499f-a966-357925f13415\") " pod="openshift-console/console-5d975cc89b-pdv2q" Dec 01 08:24:15 crc kubenswrapper[5004]: I1201 08:24:15.550165 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e64c8553-0fbc-499f-a966-357925f13415-console-oauth-config\") pod \"console-5d975cc89b-pdv2q\" (UID: \"e64c8553-0fbc-499f-a966-357925f13415\") " pod="openshift-console/console-5d975cc89b-pdv2q" Dec 01 08:24:15 crc kubenswrapper[5004]: I1201 08:24:15.550192 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e64c8553-0fbc-499f-a966-357925f13415-console-config\") pod \"console-5d975cc89b-pdv2q\" (UID: \"e64c8553-0fbc-499f-a966-357925f13415\") " pod="openshift-console/console-5d975cc89b-pdv2q" Dec 01 08:24:15 crc kubenswrapper[5004]: I1201 08:24:15.550301 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e64c8553-0fbc-499f-a966-357925f13415-console-serving-cert\") pod \"console-5d975cc89b-pdv2q\" (UID: \"e64c8553-0fbc-499f-a966-357925f13415\") " pod="openshift-console/console-5d975cc89b-pdv2q" Dec 01 08:24:15 crc kubenswrapper[5004]: I1201 08:24:15.552305 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-vfdrv" podStartSLOduration=3.01954783 podStartE2EDuration="5.55228192s" podCreationTimestamp="2025-12-01 08:24:10 +0000 UTC" firstStartedPulling="2025-12-01 08:24:12.175576473 +0000 UTC m=+429.740568455" lastFinishedPulling="2025-12-01 
08:24:14.708310563 +0000 UTC m=+432.273302545" observedRunningTime="2025-12-01 08:24:15.548904034 +0000 UTC m=+433.113896016" watchObservedRunningTime="2025-12-01 08:24:15.55228192 +0000 UTC m=+433.117273902" Dec 01 08:24:15 crc kubenswrapper[5004]: I1201 08:24:15.594368 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-566fddb674-5hh42" podStartSLOduration=3.295181029 podStartE2EDuration="5.594348185s" podCreationTimestamp="2025-12-01 08:24:10 +0000 UTC" firstStartedPulling="2025-12-01 08:24:12.407656479 +0000 UTC m=+429.972648461" lastFinishedPulling="2025-12-01 08:24:14.706823635 +0000 UTC m=+432.271815617" observedRunningTime="2025-12-01 08:24:15.592494738 +0000 UTC m=+433.157486720" watchObservedRunningTime="2025-12-01 08:24:15.594348185 +0000 UTC m=+433.159340187" Dec 01 08:24:15 crc kubenswrapper[5004]: I1201 08:24:15.614627 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-g5xj9" podStartSLOduration=3.698509626 podStartE2EDuration="5.614604324s" podCreationTimestamp="2025-12-01 08:24:10 +0000 UTC" firstStartedPulling="2025-12-01 08:24:11.086664342 +0000 UTC m=+428.651656324" lastFinishedPulling="2025-12-01 08:24:13.00275904 +0000 UTC m=+430.567751022" observedRunningTime="2025-12-01 08:24:15.61252878 +0000 UTC m=+433.177520782" watchObservedRunningTime="2025-12-01 08:24:15.614604324 +0000 UTC m=+433.179596326" Dec 01 08:24:15 crc kubenswrapper[5004]: I1201 08:24:15.651734 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e64c8553-0fbc-499f-a966-357925f13415-console-oauth-config\") pod \"console-5d975cc89b-pdv2q\" (UID: \"e64c8553-0fbc-499f-a966-357925f13415\") " pod="openshift-console/console-5d975cc89b-pdv2q" Dec 01 08:24:15 crc kubenswrapper[5004]: I1201 08:24:15.652239 5004 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e64c8553-0fbc-499f-a966-357925f13415-console-config\") pod \"console-5d975cc89b-pdv2q\" (UID: \"e64c8553-0fbc-499f-a966-357925f13415\") " pod="openshift-console/console-5d975cc89b-pdv2q" Dec 01 08:24:15 crc kubenswrapper[5004]: I1201 08:24:15.653111 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e64c8553-0fbc-499f-a966-357925f13415-console-serving-cert\") pod \"console-5d975cc89b-pdv2q\" (UID: \"e64c8553-0fbc-499f-a966-357925f13415\") " pod="openshift-console/console-5d975cc89b-pdv2q" Dec 01 08:24:15 crc kubenswrapper[5004]: I1201 08:24:15.653598 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e64c8553-0fbc-499f-a966-357925f13415-oauth-serving-cert\") pod \"console-5d975cc89b-pdv2q\" (UID: \"e64c8553-0fbc-499f-a966-357925f13415\") " pod="openshift-console/console-5d975cc89b-pdv2q" Dec 01 08:24:15 crc kubenswrapper[5004]: I1201 08:24:15.653973 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e64c8553-0fbc-499f-a966-357925f13415-service-ca\") pod \"console-5d975cc89b-pdv2q\" (UID: \"e64c8553-0fbc-499f-a966-357925f13415\") " pod="openshift-console/console-5d975cc89b-pdv2q" Dec 01 08:24:15 crc kubenswrapper[5004]: I1201 08:24:15.654085 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jmrjt\" (UniqueName: \"kubernetes.io/projected/e64c8553-0fbc-499f-a966-357925f13415-kube-api-access-jmrjt\") pod \"console-5d975cc89b-pdv2q\" (UID: \"e64c8553-0fbc-499f-a966-357925f13415\") " pod="openshift-console/console-5d975cc89b-pdv2q" Dec 01 08:24:15 crc kubenswrapper[5004]: I1201 08:24:15.654181 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e64c8553-0fbc-499f-a966-357925f13415-trusted-ca-bundle\") pod \"console-5d975cc89b-pdv2q\" (UID: \"e64c8553-0fbc-499f-a966-357925f13415\") " pod="openshift-console/console-5d975cc89b-pdv2q" Dec 01 08:24:15 crc kubenswrapper[5004]: I1201 08:24:15.654229 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e64c8553-0fbc-499f-a966-357925f13415-oauth-serving-cert\") pod \"console-5d975cc89b-pdv2q\" (UID: \"e64c8553-0fbc-499f-a966-357925f13415\") " pod="openshift-console/console-5d975cc89b-pdv2q" Dec 01 08:24:15 crc kubenswrapper[5004]: I1201 08:24:15.655034 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e64c8553-0fbc-499f-a966-357925f13415-trusted-ca-bundle\") pod \"console-5d975cc89b-pdv2q\" (UID: \"e64c8553-0fbc-499f-a966-357925f13415\") " pod="openshift-console/console-5d975cc89b-pdv2q" Dec 01 08:24:15 crc kubenswrapper[5004]: I1201 08:24:15.656584 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e64c8553-0fbc-499f-a966-357925f13415-console-oauth-config\") pod \"console-5d975cc89b-pdv2q\" (UID: \"e64c8553-0fbc-499f-a966-357925f13415\") " pod="openshift-console/console-5d975cc89b-pdv2q" Dec 01 08:24:15 crc kubenswrapper[5004]: I1201 08:24:15.657265 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e64c8553-0fbc-499f-a966-357925f13415-service-ca\") pod \"console-5d975cc89b-pdv2q\" (UID: \"e64c8553-0fbc-499f-a966-357925f13415\") " pod="openshift-console/console-5d975cc89b-pdv2q" Dec 01 08:24:15 crc kubenswrapper[5004]: I1201 08:24:15.653066 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/e64c8553-0fbc-499f-a966-357925f13415-console-config\") pod \"console-5d975cc89b-pdv2q\" (UID: \"e64c8553-0fbc-499f-a966-357925f13415\") " pod="openshift-console/console-5d975cc89b-pdv2q" Dec 01 08:24:15 crc kubenswrapper[5004]: I1201 08:24:15.661453 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e64c8553-0fbc-499f-a966-357925f13415-console-serving-cert\") pod \"console-5d975cc89b-pdv2q\" (UID: \"e64c8553-0fbc-499f-a966-357925f13415\") " pod="openshift-console/console-5d975cc89b-pdv2q" Dec 01 08:24:15 crc kubenswrapper[5004]: I1201 08:24:15.674080 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmrjt\" (UniqueName: \"kubernetes.io/projected/e64c8553-0fbc-499f-a966-357925f13415-kube-api-access-jmrjt\") pod \"console-5d975cc89b-pdv2q\" (UID: \"e64c8553-0fbc-499f-a966-357925f13415\") " pod="openshift-console/console-5d975cc89b-pdv2q" Dec 01 08:24:15 crc kubenswrapper[5004]: I1201 08:24:15.822774 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5d975cc89b-pdv2q" Dec 01 08:24:16 crc kubenswrapper[5004]: I1201 08:24:16.001569 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-65d68bbd67-86r62"] Dec 01 08:24:16 crc kubenswrapper[5004]: I1201 08:24:16.002537 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-65d68bbd67-86r62" Dec 01 08:24:16 crc kubenswrapper[5004]: I1201 08:24:16.009850 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-tls" Dec 01 08:24:16 crc kubenswrapper[5004]: I1201 08:24:16.010208 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-client-certs" Dec 01 08:24:16 crc kubenswrapper[5004]: I1201 08:24:16.010695 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-6psgce27j5ec8" Dec 01 08:24:16 crc kubenswrapper[5004]: I1201 08:24:16.010701 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kubelet-serving-ca-bundle" Dec 01 08:24:16 crc kubenswrapper[5004]: I1201 08:24:16.011246 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-server-audit-profiles" Dec 01 08:24:16 crc kubenswrapper[5004]: I1201 08:24:16.011627 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-dockercfg-xmxqz" Dec 01 08:24:16 crc kubenswrapper[5004]: I1201 08:24:16.020146 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-65d68bbd67-86r62"] Dec 01 08:24:16 crc kubenswrapper[5004]: I1201 08:24:16.060010 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvnfw\" (UniqueName: \"kubernetes.io/projected/39653707-801b-4a4b-8f89-b736aedcaa55-kube-api-access-nvnfw\") pod \"metrics-server-65d68bbd67-86r62\" (UID: \"39653707-801b-4a4b-8f89-b736aedcaa55\") " pod="openshift-monitoring/metrics-server-65d68bbd67-86r62" Dec 01 08:24:16 crc kubenswrapper[5004]: I1201 08:24:16.060349 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: 
\"kubernetes.io/secret/39653707-801b-4a4b-8f89-b736aedcaa55-secret-metrics-server-tls\") pod \"metrics-server-65d68bbd67-86r62\" (UID: \"39653707-801b-4a4b-8f89-b736aedcaa55\") " pod="openshift-monitoring/metrics-server-65d68bbd67-86r62" Dec 01 08:24:16 crc kubenswrapper[5004]: I1201 08:24:16.060388 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39653707-801b-4a4b-8f89-b736aedcaa55-client-ca-bundle\") pod \"metrics-server-65d68bbd67-86r62\" (UID: \"39653707-801b-4a4b-8f89-b736aedcaa55\") " pod="openshift-monitoring/metrics-server-65d68bbd67-86r62" Dec 01 08:24:16 crc kubenswrapper[5004]: I1201 08:24:16.060463 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/39653707-801b-4a4b-8f89-b736aedcaa55-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-65d68bbd67-86r62\" (UID: \"39653707-801b-4a4b-8f89-b736aedcaa55\") " pod="openshift-monitoring/metrics-server-65d68bbd67-86r62" Dec 01 08:24:16 crc kubenswrapper[5004]: I1201 08:24:16.060577 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/39653707-801b-4a4b-8f89-b736aedcaa55-audit-log\") pod \"metrics-server-65d68bbd67-86r62\" (UID: \"39653707-801b-4a4b-8f89-b736aedcaa55\") " pod="openshift-monitoring/metrics-server-65d68bbd67-86r62" Dec 01 08:24:16 crc kubenswrapper[5004]: I1201 08:24:16.060659 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/39653707-801b-4a4b-8f89-b736aedcaa55-secret-metrics-client-certs\") pod \"metrics-server-65d68bbd67-86r62\" (UID: \"39653707-801b-4a4b-8f89-b736aedcaa55\") " pod="openshift-monitoring/metrics-server-65d68bbd67-86r62" Dec 
01 08:24:16 crc kubenswrapper[5004]: I1201 08:24:16.060678 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/39653707-801b-4a4b-8f89-b736aedcaa55-metrics-server-audit-profiles\") pod \"metrics-server-65d68bbd67-86r62\" (UID: \"39653707-801b-4a4b-8f89-b736aedcaa55\") " pod="openshift-monitoring/metrics-server-65d68bbd67-86r62" Dec 01 08:24:16 crc kubenswrapper[5004]: I1201 08:24:16.161329 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nvnfw\" (UniqueName: \"kubernetes.io/projected/39653707-801b-4a4b-8f89-b736aedcaa55-kube-api-access-nvnfw\") pod \"metrics-server-65d68bbd67-86r62\" (UID: \"39653707-801b-4a4b-8f89-b736aedcaa55\") " pod="openshift-monitoring/metrics-server-65d68bbd67-86r62" Dec 01 08:24:16 crc kubenswrapper[5004]: I1201 08:24:16.161383 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/39653707-801b-4a4b-8f89-b736aedcaa55-secret-metrics-server-tls\") pod \"metrics-server-65d68bbd67-86r62\" (UID: \"39653707-801b-4a4b-8f89-b736aedcaa55\") " pod="openshift-monitoring/metrics-server-65d68bbd67-86r62" Dec 01 08:24:16 crc kubenswrapper[5004]: I1201 08:24:16.161409 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39653707-801b-4a4b-8f89-b736aedcaa55-client-ca-bundle\") pod \"metrics-server-65d68bbd67-86r62\" (UID: \"39653707-801b-4a4b-8f89-b736aedcaa55\") " pod="openshift-monitoring/metrics-server-65d68bbd67-86r62" Dec 01 08:24:16 crc kubenswrapper[5004]: I1201 08:24:16.161464 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/39653707-801b-4a4b-8f89-b736aedcaa55-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-65d68bbd67-86r62\" (UID: \"39653707-801b-4a4b-8f89-b736aedcaa55\") " pod="openshift-monitoring/metrics-server-65d68bbd67-86r62" Dec 01 08:24:16 crc kubenswrapper[5004]: I1201 08:24:16.161484 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/39653707-801b-4a4b-8f89-b736aedcaa55-audit-log\") pod \"metrics-server-65d68bbd67-86r62\" (UID: \"39653707-801b-4a4b-8f89-b736aedcaa55\") " pod="openshift-monitoring/metrics-server-65d68bbd67-86r62" Dec 01 08:24:16 crc kubenswrapper[5004]: I1201 08:24:16.161507 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/39653707-801b-4a4b-8f89-b736aedcaa55-secret-metrics-client-certs\") pod \"metrics-server-65d68bbd67-86r62\" (UID: \"39653707-801b-4a4b-8f89-b736aedcaa55\") " pod="openshift-monitoring/metrics-server-65d68bbd67-86r62" Dec 01 08:24:16 crc kubenswrapper[5004]: I1201 08:24:16.161529 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/39653707-801b-4a4b-8f89-b736aedcaa55-metrics-server-audit-profiles\") pod \"metrics-server-65d68bbd67-86r62\" (UID: \"39653707-801b-4a4b-8f89-b736aedcaa55\") " pod="openshift-monitoring/metrics-server-65d68bbd67-86r62" Dec 01 08:24:16 crc kubenswrapper[5004]: I1201 08:24:16.162455 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/39653707-801b-4a4b-8f89-b736aedcaa55-audit-log\") pod \"metrics-server-65d68bbd67-86r62\" (UID: \"39653707-801b-4a4b-8f89-b736aedcaa55\") " pod="openshift-monitoring/metrics-server-65d68bbd67-86r62" Dec 01 08:24:16 crc kubenswrapper[5004]: I1201 08:24:16.162935 5004 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/39653707-801b-4a4b-8f89-b736aedcaa55-metrics-server-audit-profiles\") pod \"metrics-server-65d68bbd67-86r62\" (UID: \"39653707-801b-4a4b-8f89-b736aedcaa55\") " pod="openshift-monitoring/metrics-server-65d68bbd67-86r62" Dec 01 08:24:16 crc kubenswrapper[5004]: I1201 08:24:16.164632 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/39653707-801b-4a4b-8f89-b736aedcaa55-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-65d68bbd67-86r62\" (UID: \"39653707-801b-4a4b-8f89-b736aedcaa55\") " pod="openshift-monitoring/metrics-server-65d68bbd67-86r62" Dec 01 08:24:16 crc kubenswrapper[5004]: I1201 08:24:16.169554 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/39653707-801b-4a4b-8f89-b736aedcaa55-secret-metrics-client-certs\") pod \"metrics-server-65d68bbd67-86r62\" (UID: \"39653707-801b-4a4b-8f89-b736aedcaa55\") " pod="openshift-monitoring/metrics-server-65d68bbd67-86r62" Dec 01 08:24:16 crc kubenswrapper[5004]: I1201 08:24:16.179985 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39653707-801b-4a4b-8f89-b736aedcaa55-client-ca-bundle\") pod \"metrics-server-65d68bbd67-86r62\" (UID: \"39653707-801b-4a4b-8f89-b736aedcaa55\") " pod="openshift-monitoring/metrics-server-65d68bbd67-86r62" Dec 01 08:24:16 crc kubenswrapper[5004]: I1201 08:24:16.180352 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvnfw\" (UniqueName: \"kubernetes.io/projected/39653707-801b-4a4b-8f89-b736aedcaa55-kube-api-access-nvnfw\") pod \"metrics-server-65d68bbd67-86r62\" (UID: \"39653707-801b-4a4b-8f89-b736aedcaa55\") " 
pod="openshift-monitoring/metrics-server-65d68bbd67-86r62" Dec 01 08:24:16 crc kubenswrapper[5004]: I1201 08:24:16.184075 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/39653707-801b-4a4b-8f89-b736aedcaa55-secret-metrics-server-tls\") pod \"metrics-server-65d68bbd67-86r62\" (UID: \"39653707-801b-4a4b-8f89-b736aedcaa55\") " pod="openshift-monitoring/metrics-server-65d68bbd67-86r62" Dec 01 08:24:16 crc kubenswrapper[5004]: I1201 08:24:16.327659 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-65d68bbd67-86r62" Dec 01 08:24:16 crc kubenswrapper[5004]: I1201 08:24:16.487420 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-64986f45c5-xvjw6"] Dec 01 08:24:16 crc kubenswrapper[5004]: I1201 08:24:16.488580 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-64986f45c5-xvjw6" Dec 01 08:24:16 crc kubenswrapper[5004]: I1201 08:24:16.490089 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"monitoring-plugin-cert" Dec 01 08:24:16 crc kubenswrapper[5004]: I1201 08:24:16.490713 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"default-dockercfg-6tstp" Dec 01 08:24:16 crc kubenswrapper[5004]: I1201 08:24:16.494186 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-64986f45c5-xvjw6"] Dec 01 08:24:16 crc kubenswrapper[5004]: I1201 08:24:16.566627 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/a178b02c-e5bb-4e63-9c8f-a84abab3bbba-monitoring-plugin-cert\") pod \"monitoring-plugin-64986f45c5-xvjw6\" (UID: \"a178b02c-e5bb-4e63-9c8f-a84abab3bbba\") " 
pod="openshift-monitoring/monitoring-plugin-64986f45c5-xvjw6" Dec 01 08:24:16 crc kubenswrapper[5004]: I1201 08:24:16.668396 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/a178b02c-e5bb-4e63-9c8f-a84abab3bbba-monitoring-plugin-cert\") pod \"monitoring-plugin-64986f45c5-xvjw6\" (UID: \"a178b02c-e5bb-4e63-9c8f-a84abab3bbba\") " pod="openshift-monitoring/monitoring-plugin-64986f45c5-xvjw6" Dec 01 08:24:16 crc kubenswrapper[5004]: I1201 08:24:16.671652 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/a178b02c-e5bb-4e63-9c8f-a84abab3bbba-monitoring-plugin-cert\") pod \"monitoring-plugin-64986f45c5-xvjw6\" (UID: \"a178b02c-e5bb-4e63-9c8f-a84abab3bbba\") " pod="openshift-monitoring/monitoring-plugin-64986f45c5-xvjw6" Dec 01 08:24:16 crc kubenswrapper[5004]: I1201 08:24:16.806518 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-64986f45c5-xvjw6" Dec 01 08:24:16 crc kubenswrapper[5004]: I1201 08:24:16.911470 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5d975cc89b-pdv2q"] Dec 01 08:24:17 crc kubenswrapper[5004]: I1201 08:24:17.040128 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Dec 01 08:24:17 crc kubenswrapper[5004]: I1201 08:24:17.078945 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Dec 01 08:24:17 crc kubenswrapper[5004]: I1201 08:24:17.079440 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Dec 01 08:24:17 crc kubenswrapper[5004]: I1201 08:24:17.083433 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-rbac-proxy" Dec 01 08:24:17 crc kubenswrapper[5004]: I1201 08:24:17.090683 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls" Dec 01 08:24:17 crc kubenswrapper[5004]: I1201 08:24:17.090733 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-dockercfg-vdppp" Dec 01 08:24:17 crc kubenswrapper[5004]: I1201 08:24:17.091062 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-sidecar-tls" Dec 01 08:24:17 crc kubenswrapper[5004]: I1201 08:24:17.095918 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-kube-rbac-proxy-web" Dec 01 08:24:17 crc kubenswrapper[5004]: I1201 08:24:17.095982 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s" Dec 01 08:24:17 crc kubenswrapper[5004]: I1201 08:24:17.095918 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-grpc-tls-5umu304i58auq" Dec 01 08:24:17 crc kubenswrapper[5004]: I1201 08:24:17.096115 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-prometheus-http-client-file" Dec 01 08:24:17 crc kubenswrapper[5004]: I1201 08:24:17.096131 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-web-config" Dec 01 08:24:17 crc kubenswrapper[5004]: I1201 08:24:17.096169 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"serving-certs-ca-bundle" Dec 01 08:24:17 crc kubenswrapper[5004]: I1201 08:24:17.096263 5004 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls-assets-0" Dec 01 08:24:17 crc kubenswrapper[5004]: I1201 08:24:17.104299 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-k8s-rulefiles-0" Dec 01 08:24:17 crc kubenswrapper[5004]: I1201 08:24:17.105647 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-trusted-ca-bundle" Dec 01 08:24:17 crc kubenswrapper[5004]: I1201 08:24:17.279339 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c093f9cb-0826-4b1b-a9be-a6a20dd5183c-config\") pod \"prometheus-k8s-0\" (UID: \"c093f9cb-0826-4b1b-a9be-a6a20dd5183c\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 01 08:24:17 crc kubenswrapper[5004]: I1201 08:24:17.279709 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c093f9cb-0826-4b1b-a9be-a6a20dd5183c-config-out\") pod \"prometheus-k8s-0\" (UID: \"c093f9cb-0826-4b1b-a9be-a6a20dd5183c\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 01 08:24:17 crc kubenswrapper[5004]: I1201 08:24:17.279736 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/c093f9cb-0826-4b1b-a9be-a6a20dd5183c-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"c093f9cb-0826-4b1b-a9be-a6a20dd5183c\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 01 08:24:17 crc kubenswrapper[5004]: I1201 08:24:17.279758 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/c093f9cb-0826-4b1b-a9be-a6a20dd5183c-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: 
\"c093f9cb-0826-4b1b-a9be-a6a20dd5183c\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 01 08:24:17 crc kubenswrapper[5004]: I1201 08:24:17.279784 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c093f9cb-0826-4b1b-a9be-a6a20dd5183c-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"c093f9cb-0826-4b1b-a9be-a6a20dd5183c\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 01 08:24:17 crc kubenswrapper[5004]: I1201 08:24:17.279814 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c093f9cb-0826-4b1b-a9be-a6a20dd5183c-web-config\") pod \"prometheus-k8s-0\" (UID: \"c093f9cb-0826-4b1b-a9be-a6a20dd5183c\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 01 08:24:17 crc kubenswrapper[5004]: I1201 08:24:17.279836 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lsjpc\" (UniqueName: \"kubernetes.io/projected/c093f9cb-0826-4b1b-a9be-a6a20dd5183c-kube-api-access-lsjpc\") pod \"prometheus-k8s-0\" (UID: \"c093f9cb-0826-4b1b-a9be-a6a20dd5183c\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 01 08:24:17 crc kubenswrapper[5004]: I1201 08:24:17.279855 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/c093f9cb-0826-4b1b-a9be-a6a20dd5183c-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"c093f9cb-0826-4b1b-a9be-a6a20dd5183c\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 01 08:24:17 crc kubenswrapper[5004]: I1201 08:24:17.279890 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/c093f9cb-0826-4b1b-a9be-a6a20dd5183c-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c093f9cb-0826-4b1b-a9be-a6a20dd5183c\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 01 08:24:17 crc kubenswrapper[5004]: I1201 08:24:17.279918 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c093f9cb-0826-4b1b-a9be-a6a20dd5183c-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"c093f9cb-0826-4b1b-a9be-a6a20dd5183c\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 01 08:24:17 crc kubenswrapper[5004]: I1201 08:24:17.279942 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c093f9cb-0826-4b1b-a9be-a6a20dd5183c-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"c093f9cb-0826-4b1b-a9be-a6a20dd5183c\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 01 08:24:17 crc kubenswrapper[5004]: I1201 08:24:17.279981 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/c093f9cb-0826-4b1b-a9be-a6a20dd5183c-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"c093f9cb-0826-4b1b-a9be-a6a20dd5183c\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 01 08:24:17 crc kubenswrapper[5004]: I1201 08:24:17.280007 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c093f9cb-0826-4b1b-a9be-a6a20dd5183c-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"c093f9cb-0826-4b1b-a9be-a6a20dd5183c\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 01 08:24:17 crc kubenswrapper[5004]: I1201 08:24:17.280051 5004 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c093f9cb-0826-4b1b-a9be-a6a20dd5183c-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c093f9cb-0826-4b1b-a9be-a6a20dd5183c\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 01 08:24:17 crc kubenswrapper[5004]: I1201 08:24:17.280073 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/c093f9cb-0826-4b1b-a9be-a6a20dd5183c-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"c093f9cb-0826-4b1b-a9be-a6a20dd5183c\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 01 08:24:17 crc kubenswrapper[5004]: I1201 08:24:17.280098 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c093f9cb-0826-4b1b-a9be-a6a20dd5183c-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"c093f9cb-0826-4b1b-a9be-a6a20dd5183c\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 01 08:24:17 crc kubenswrapper[5004]: I1201 08:24:17.280117 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c093f9cb-0826-4b1b-a9be-a6a20dd5183c-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"c093f9cb-0826-4b1b-a9be-a6a20dd5183c\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 01 08:24:17 crc kubenswrapper[5004]: I1201 08:24:17.280143 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c093f9cb-0826-4b1b-a9be-a6a20dd5183c-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c093f9cb-0826-4b1b-a9be-a6a20dd5183c\") " 
pod="openshift-monitoring/prometheus-k8s-0" Dec 01 08:24:17 crc kubenswrapper[5004]: I1201 08:24:17.349911 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-64986f45c5-xvjw6"] Dec 01 08:24:17 crc kubenswrapper[5004]: I1201 08:24:17.381183 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c093f9cb-0826-4b1b-a9be-a6a20dd5183c-config\") pod \"prometheus-k8s-0\" (UID: \"c093f9cb-0826-4b1b-a9be-a6a20dd5183c\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 01 08:24:17 crc kubenswrapper[5004]: I1201 08:24:17.381235 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c093f9cb-0826-4b1b-a9be-a6a20dd5183c-config-out\") pod \"prometheus-k8s-0\" (UID: \"c093f9cb-0826-4b1b-a9be-a6a20dd5183c\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 01 08:24:17 crc kubenswrapper[5004]: I1201 08:24:17.381267 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/c093f9cb-0826-4b1b-a9be-a6a20dd5183c-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"c093f9cb-0826-4b1b-a9be-a6a20dd5183c\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 01 08:24:17 crc kubenswrapper[5004]: I1201 08:24:17.381293 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/c093f9cb-0826-4b1b-a9be-a6a20dd5183c-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"c093f9cb-0826-4b1b-a9be-a6a20dd5183c\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 01 08:24:17 crc kubenswrapper[5004]: I1201 08:24:17.381326 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: 
\"kubernetes.io/secret/c093f9cb-0826-4b1b-a9be-a6a20dd5183c-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"c093f9cb-0826-4b1b-a9be-a6a20dd5183c\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 01 08:24:17 crc kubenswrapper[5004]: I1201 08:24:17.381359 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c093f9cb-0826-4b1b-a9be-a6a20dd5183c-web-config\") pod \"prometheus-k8s-0\" (UID: \"c093f9cb-0826-4b1b-a9be-a6a20dd5183c\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 01 08:24:17 crc kubenswrapper[5004]: I1201 08:24:17.381380 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lsjpc\" (UniqueName: \"kubernetes.io/projected/c093f9cb-0826-4b1b-a9be-a6a20dd5183c-kube-api-access-lsjpc\") pod \"prometheus-k8s-0\" (UID: \"c093f9cb-0826-4b1b-a9be-a6a20dd5183c\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 01 08:24:17 crc kubenswrapper[5004]: I1201 08:24:17.381406 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/c093f9cb-0826-4b1b-a9be-a6a20dd5183c-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"c093f9cb-0826-4b1b-a9be-a6a20dd5183c\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 01 08:24:17 crc kubenswrapper[5004]: I1201 08:24:17.381439 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c093f9cb-0826-4b1b-a9be-a6a20dd5183c-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c093f9cb-0826-4b1b-a9be-a6a20dd5183c\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 01 08:24:17 crc kubenswrapper[5004]: I1201 08:24:17.381466 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: 
\"kubernetes.io/projected/c093f9cb-0826-4b1b-a9be-a6a20dd5183c-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"c093f9cb-0826-4b1b-a9be-a6a20dd5183c\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 01 08:24:17 crc kubenswrapper[5004]: I1201 08:24:17.381490 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c093f9cb-0826-4b1b-a9be-a6a20dd5183c-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"c093f9cb-0826-4b1b-a9be-a6a20dd5183c\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 01 08:24:17 crc kubenswrapper[5004]: I1201 08:24:17.381525 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/c093f9cb-0826-4b1b-a9be-a6a20dd5183c-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"c093f9cb-0826-4b1b-a9be-a6a20dd5183c\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 01 08:24:17 crc kubenswrapper[5004]: I1201 08:24:17.381549 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c093f9cb-0826-4b1b-a9be-a6a20dd5183c-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"c093f9cb-0826-4b1b-a9be-a6a20dd5183c\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 01 08:24:17 crc kubenswrapper[5004]: I1201 08:24:17.381614 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c093f9cb-0826-4b1b-a9be-a6a20dd5183c-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c093f9cb-0826-4b1b-a9be-a6a20dd5183c\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 01 08:24:17 crc kubenswrapper[5004]: I1201 08:24:17.381638 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" 
(UniqueName: \"kubernetes.io/empty-dir/c093f9cb-0826-4b1b-a9be-a6a20dd5183c-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"c093f9cb-0826-4b1b-a9be-a6a20dd5183c\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 01 08:24:17 crc kubenswrapper[5004]: I1201 08:24:17.381669 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c093f9cb-0826-4b1b-a9be-a6a20dd5183c-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"c093f9cb-0826-4b1b-a9be-a6a20dd5183c\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 01 08:24:17 crc kubenswrapper[5004]: I1201 08:24:17.381694 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c093f9cb-0826-4b1b-a9be-a6a20dd5183c-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"c093f9cb-0826-4b1b-a9be-a6a20dd5183c\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 01 08:24:17 crc kubenswrapper[5004]: I1201 08:24:17.381720 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c093f9cb-0826-4b1b-a9be-a6a20dd5183c-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c093f9cb-0826-4b1b-a9be-a6a20dd5183c\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 01 08:24:17 crc kubenswrapper[5004]: I1201 08:24:17.382837 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c093f9cb-0826-4b1b-a9be-a6a20dd5183c-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c093f9cb-0826-4b1b-a9be-a6a20dd5183c\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 01 08:24:17 crc kubenswrapper[5004]: I1201 08:24:17.383848 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c093f9cb-0826-4b1b-a9be-a6a20dd5183c-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c093f9cb-0826-4b1b-a9be-a6a20dd5183c\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 01 08:24:17 crc kubenswrapper[5004]: I1201 08:24:17.390741 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/c093f9cb-0826-4b1b-a9be-a6a20dd5183c-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"c093f9cb-0826-4b1b-a9be-a6a20dd5183c\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 01 08:24:17 crc kubenswrapper[5004]: I1201 08:24:17.391251 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c093f9cb-0826-4b1b-a9be-a6a20dd5183c-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c093f9cb-0826-4b1b-a9be-a6a20dd5183c\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 01 08:24:17 crc kubenswrapper[5004]: I1201 08:24:17.392211 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c093f9cb-0826-4b1b-a9be-a6a20dd5183c-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"c093f9cb-0826-4b1b-a9be-a6a20dd5183c\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 01 08:24:17 crc kubenswrapper[5004]: I1201 08:24:17.392607 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c093f9cb-0826-4b1b-a9be-a6a20dd5183c-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"c093f9cb-0826-4b1b-a9be-a6a20dd5183c\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 01 08:24:17 crc kubenswrapper[5004]: I1201 08:24:17.392958 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" 
(UniqueName: \"kubernetes.io/projected/c093f9cb-0826-4b1b-a9be-a6a20dd5183c-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"c093f9cb-0826-4b1b-a9be-a6a20dd5183c\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 01 08:24:17 crc kubenswrapper[5004]: I1201 08:24:17.393327 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c093f9cb-0826-4b1b-a9be-a6a20dd5183c-config-out\") pod \"prometheus-k8s-0\" (UID: \"c093f9cb-0826-4b1b-a9be-a6a20dd5183c\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 01 08:24:17 crc kubenswrapper[5004]: I1201 08:24:17.393430 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/c093f9cb-0826-4b1b-a9be-a6a20dd5183c-config\") pod \"prometheus-k8s-0\" (UID: \"c093f9cb-0826-4b1b-a9be-a6a20dd5183c\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 01 08:24:17 crc kubenswrapper[5004]: I1201 08:24:17.394728 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/c093f9cb-0826-4b1b-a9be-a6a20dd5183c-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"c093f9cb-0826-4b1b-a9be-a6a20dd5183c\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 01 08:24:17 crc kubenswrapper[5004]: I1201 08:24:17.395127 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/c093f9cb-0826-4b1b-a9be-a6a20dd5183c-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"c093f9cb-0826-4b1b-a9be-a6a20dd5183c\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 01 08:24:17 crc kubenswrapper[5004]: I1201 08:24:17.395321 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c093f9cb-0826-4b1b-a9be-a6a20dd5183c-secret-prometheus-k8s-kube-rbac-proxy-web\") pod 
\"prometheus-k8s-0\" (UID: \"c093f9cb-0826-4b1b-a9be-a6a20dd5183c\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 01 08:24:17 crc kubenswrapper[5004]: I1201 08:24:17.396066 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c093f9cb-0826-4b1b-a9be-a6a20dd5183c-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"c093f9cb-0826-4b1b-a9be-a6a20dd5183c\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 01 08:24:17 crc kubenswrapper[5004]: I1201 08:24:17.396350 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/c093f9cb-0826-4b1b-a9be-a6a20dd5183c-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"c093f9cb-0826-4b1b-a9be-a6a20dd5183c\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 01 08:24:17 crc kubenswrapper[5004]: I1201 08:24:17.396779 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c093f9cb-0826-4b1b-a9be-a6a20dd5183c-web-config\") pod \"prometheus-k8s-0\" (UID: \"c093f9cb-0826-4b1b-a9be-a6a20dd5183c\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 01 08:24:17 crc kubenswrapper[5004]: I1201 08:24:17.397021 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c093f9cb-0826-4b1b-a9be-a6a20dd5183c-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"c093f9cb-0826-4b1b-a9be-a6a20dd5183c\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 01 08:24:17 crc kubenswrapper[5004]: I1201 08:24:17.398426 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/c093f9cb-0826-4b1b-a9be-a6a20dd5183c-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"c093f9cb-0826-4b1b-a9be-a6a20dd5183c\") " 
pod="openshift-monitoring/prometheus-k8s-0" Dec 01 08:24:17 crc kubenswrapper[5004]: I1201 08:24:17.404409 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lsjpc\" (UniqueName: \"kubernetes.io/projected/c093f9cb-0826-4b1b-a9be-a6a20dd5183c-kube-api-access-lsjpc\") pod \"prometheus-k8s-0\" (UID: \"c093f9cb-0826-4b1b-a9be-a6a20dd5183c\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 01 08:24:17 crc kubenswrapper[5004]: I1201 08:24:17.498772 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Dec 01 08:24:17 crc kubenswrapper[5004]: I1201 08:24:17.500211 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-65d68bbd67-86r62"] Dec 01 08:24:17 crc kubenswrapper[5004]: W1201 08:24:17.504393 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod39653707_801b_4a4b_8f89_b736aedcaa55.slice/crio-8a826c4bf41518a8171984bb98b9d58ba77bc5ba81782b8c62ea0c0afdc94ed7 WatchSource:0}: Error finding container 8a826c4bf41518a8171984bb98b9d58ba77bc5ba81782b8c62ea0c0afdc94ed7: Status 404 returned error can't find the container with id 8a826c4bf41518a8171984bb98b9d58ba77bc5ba81782b8c62ea0c0afdc94ed7 Dec 01 08:24:17 crc kubenswrapper[5004]: I1201 08:24:17.541900 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5d975cc89b-pdv2q" event={"ID":"e64c8553-0fbc-499f-a966-357925f13415","Type":"ContainerStarted","Data":"2b8d091ab5d287d00b72bf6045598cc6bef004ab1539a5c5ba3edeffd47a8234"} Dec 01 08:24:17 crc kubenswrapper[5004]: I1201 08:24:17.541958 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5d975cc89b-pdv2q" event={"ID":"e64c8553-0fbc-499f-a966-357925f13415","Type":"ContainerStarted","Data":"c96443c937a5258d09e5a5a03494a8f92d42aaca69a5c2d6dbe1dca7526f736f"} Dec 01 08:24:17 crc 
kubenswrapper[5004]: I1201 08:24:17.547139 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-556c87fb6b-jgp46" event={"ID":"12a0058f-3b45-4056-bdbd-5ad4cfcb8ad3","Type":"ContainerStarted","Data":"6a196ab9daa74636d164e347d481fa45585ae315a43ece3623cd27952370808a"} Dec 01 08:24:17 crc kubenswrapper[5004]: I1201 08:24:17.547176 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-556c87fb6b-jgp46" event={"ID":"12a0058f-3b45-4056-bdbd-5ad4cfcb8ad3","Type":"ContainerStarted","Data":"e9fab5cde40439962351dac5149c4b695533bb78a831df42aa56f3d5e8edbae4"} Dec 01 08:24:17 crc kubenswrapper[5004]: I1201 08:24:17.548687 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-65d68bbd67-86r62" event={"ID":"39653707-801b-4a4b-8f89-b736aedcaa55","Type":"ContainerStarted","Data":"8a826c4bf41518a8171984bb98b9d58ba77bc5ba81782b8c62ea0c0afdc94ed7"} Dec 01 08:24:17 crc kubenswrapper[5004]: I1201 08:24:17.549852 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-64986f45c5-xvjw6" event={"ID":"a178b02c-e5bb-4e63-9c8f-a84abab3bbba","Type":"ContainerStarted","Data":"06c1b5fb1028bdb8a069524c4cc3bb405e12a7e4026ca01f2d432e6c93d2d9ec"} Dec 01 08:24:17 crc kubenswrapper[5004]: I1201 08:24:17.561932 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5d975cc89b-pdv2q" podStartSLOduration=2.56191531 podStartE2EDuration="2.56191531s" podCreationTimestamp="2025-12-01 08:24:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:24:17.559461618 +0000 UTC m=+435.124453600" watchObservedRunningTime="2025-12-01 08:24:17.56191531 +0000 UTC m=+435.126907292" Dec 01 08:24:17 crc kubenswrapper[5004]: I1201 08:24:17.918717 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-monitoring/prometheus-k8s-0"] Dec 01 08:24:17 crc kubenswrapper[5004]: W1201 08:24:17.926092 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc093f9cb_0826_4b1b_a9be_a6a20dd5183c.slice/crio-4ddd8968e632150b8ad2dbd388f8c8851d7533fb8d7d955dfd7b73f1ae8e80ea WatchSource:0}: Error finding container 4ddd8968e632150b8ad2dbd388f8c8851d7533fb8d7d955dfd7b73f1ae8e80ea: Status 404 returned error can't find the container with id 4ddd8968e632150b8ad2dbd388f8c8851d7533fb8d7d955dfd7b73f1ae8e80ea Dec 01 08:24:18 crc kubenswrapper[5004]: I1201 08:24:18.561792 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-556c87fb6b-jgp46" event={"ID":"12a0058f-3b45-4056-bdbd-5ad4cfcb8ad3","Type":"ContainerStarted","Data":"93d9105be56d285da8f7e7719117cfbb5bf3ab4ff72d732eb2362987df55fca2"} Dec 01 08:24:18 crc kubenswrapper[5004]: I1201 08:24:18.563709 5004 generic.go:334] "Generic (PLEG): container finished" podID="c093f9cb-0826-4b1b-a9be-a6a20dd5183c" containerID="37a169d604db9454fbeace85a4eef7cde91f90d8481dcc0a4ae59c43b066ae2d" exitCode=0 Dec 01 08:24:18 crc kubenswrapper[5004]: I1201 08:24:18.563766 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c093f9cb-0826-4b1b-a9be-a6a20dd5183c","Type":"ContainerDied","Data":"37a169d604db9454fbeace85a4eef7cde91f90d8481dcc0a4ae59c43b066ae2d"} Dec 01 08:24:18 crc kubenswrapper[5004]: I1201 08:24:18.563819 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c093f9cb-0826-4b1b-a9be-a6a20dd5183c","Type":"ContainerStarted","Data":"4ddd8968e632150b8ad2dbd388f8c8851d7533fb8d7d955dfd7b73f1ae8e80ea"} Dec 01 08:24:20 crc kubenswrapper[5004]: I1201 08:24:20.579795 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-65d68bbd67-86r62" 
event={"ID":"39653707-801b-4a4b-8f89-b736aedcaa55","Type":"ContainerStarted","Data":"e78415c8844833575836d939759f9b8c4f3790b83955f67cabc9664b3dd5376a"} Dec 01 08:24:20 crc kubenswrapper[5004]: I1201 08:24:20.581061 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-64986f45c5-xvjw6" event={"ID":"a178b02c-e5bb-4e63-9c8f-a84abab3bbba","Type":"ContainerStarted","Data":"e3a621ba6951a9081ea3d40023cdd58ca5d81c38f57e89a58bd6aa42cf7dbfc5"} Dec 01 08:24:20 crc kubenswrapper[5004]: I1201 08:24:20.581281 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/monitoring-plugin-64986f45c5-xvjw6" Dec 01 08:24:20 crc kubenswrapper[5004]: I1201 08:24:20.585444 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-556c87fb6b-jgp46" event={"ID":"12a0058f-3b45-4056-bdbd-5ad4cfcb8ad3","Type":"ContainerStarted","Data":"eabe75f4e7194acece15e5487b234719905fbacaa9a89896cfcad30a1af1227b"} Dec 01 08:24:20 crc kubenswrapper[5004]: I1201 08:24:20.585483 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-556c87fb6b-jgp46" event={"ID":"12a0058f-3b45-4056-bdbd-5ad4cfcb8ad3","Type":"ContainerStarted","Data":"5b3584650a6cf0dc5b9c1efa94c09e77f8153d4558e554d3a87ca0e341a23210"} Dec 01 08:24:20 crc kubenswrapper[5004]: I1201 08:24:20.595077 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-64986f45c5-xvjw6" Dec 01 08:24:20 crc kubenswrapper[5004]: I1201 08:24:20.598958 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1e2dafee-52e4-4eb0-a77b-4270ed20783b","Type":"ContainerStarted","Data":"19ef82af7a458de5b2f8e612c90b428790162a3effd886163a64999d52b97c9a"} Dec 01 08:24:20 crc kubenswrapper[5004]: I1201 08:24:20.599048 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1e2dafee-52e4-4eb0-a77b-4270ed20783b","Type":"ContainerStarted","Data":"c684896bf6cdb917f251baf699c66ecb28e3003a5cc4dd1eac64e1e1a487c97f"} Dec 01 08:24:20 crc kubenswrapper[5004]: I1201 08:24:20.625195 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-65d68bbd67-86r62" podStartSLOduration=3.080804743 podStartE2EDuration="5.625177761s" podCreationTimestamp="2025-12-01 08:24:15 +0000 UTC" firstStartedPulling="2025-12-01 08:24:17.506720079 +0000 UTC m=+435.071712061" lastFinishedPulling="2025-12-01 08:24:20.051093087 +0000 UTC m=+437.616085079" observedRunningTime="2025-12-01 08:24:20.599879014 +0000 UTC m=+438.164871006" watchObservedRunningTime="2025-12-01 08:24:20.625177761 +0000 UTC m=+438.190169743" Dec 01 08:24:20 crc kubenswrapper[5004]: I1201 08:24:20.628765 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-64986f45c5-xvjw6" podStartSLOduration=1.933453555 podStartE2EDuration="4.628754113s" podCreationTimestamp="2025-12-01 08:24:16 +0000 UTC" firstStartedPulling="2025-12-01 08:24:17.356642981 +0000 UTC m=+434.921634963" lastFinishedPulling="2025-12-01 08:24:20.051943529 +0000 UTC m=+437.616935521" observedRunningTime="2025-12-01 08:24:20.624605897 +0000 UTC m=+438.189597889" watchObservedRunningTime="2025-12-01 08:24:20.628754113 +0000 UTC m=+438.193746095" Dec 01 08:24:21 crc kubenswrapper[5004]: I1201 08:24:21.608091 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-556c87fb6b-jgp46" event={"ID":"12a0058f-3b45-4056-bdbd-5ad4cfcb8ad3","Type":"ContainerStarted","Data":"d927fab6cb06da2ece0a58aa70c09c728ae1319bcd697a3b5ef73d73fb53f97e"} Dec 01 08:24:21 crc kubenswrapper[5004]: I1201 08:24:21.608411 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/thanos-querier-556c87fb6b-jgp46" Dec 
01 08:24:21 crc kubenswrapper[5004]: I1201 08:24:21.615946 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1e2dafee-52e4-4eb0-a77b-4270ed20783b","Type":"ContainerStarted","Data":"8e344e6478bb045618075a9b696476ae19e7953b15cd8aec46798c15e7ee77e2"} Dec 01 08:24:21 crc kubenswrapper[5004]: I1201 08:24:21.616021 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1e2dafee-52e4-4eb0-a77b-4270ed20783b","Type":"ContainerStarted","Data":"1c1ffafad41d9b0f6257525cedd1433c0ef345cf1046a8efe5a24a0538786743"} Dec 01 08:24:21 crc kubenswrapper[5004]: I1201 08:24:21.616044 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1e2dafee-52e4-4eb0-a77b-4270ed20783b","Type":"ContainerStarted","Data":"c90cff4641b70b0d06edfbb289a6f5722fde05c5809ea9e81e676c562b03db41"} Dec 01 08:24:21 crc kubenswrapper[5004]: I1201 08:24:21.616061 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1e2dafee-52e4-4eb0-a77b-4270ed20783b","Type":"ContainerStarted","Data":"45fe72557a998d4c5ad47d5ef3bd454e5f0cd20faad90869da3e4bbc5e16e55d"} Dec 01 08:24:21 crc kubenswrapper[5004]: I1201 08:24:21.637059 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-556c87fb6b-jgp46" podStartSLOduration=3.240079375 podStartE2EDuration="9.637035802s" podCreationTimestamp="2025-12-01 08:24:12 +0000 UTC" firstStartedPulling="2025-12-01 08:24:13.622757208 +0000 UTC m=+431.187749190" lastFinishedPulling="2025-12-01 08:24:20.019713595 +0000 UTC m=+437.584705617" observedRunningTime="2025-12-01 08:24:21.635695247 +0000 UTC m=+439.200687249" watchObservedRunningTime="2025-12-01 08:24:21.637035802 +0000 UTC m=+439.202027784" Dec 01 08:24:22 crc kubenswrapper[5004]: I1201 08:24:22.634067 5004 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-556c87fb6b-jgp46" Dec 01 08:24:22 crc kubenswrapper[5004]: I1201 08:24:22.664222 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=4.017545145 podStartE2EDuration="11.664205944s" podCreationTimestamp="2025-12-01 08:24:11 +0000 UTC" firstStartedPulling="2025-12-01 08:24:12.373190289 +0000 UTC m=+429.938182271" lastFinishedPulling="2025-12-01 08:24:20.019851048 +0000 UTC m=+437.584843070" observedRunningTime="2025-12-01 08:24:21.666027213 +0000 UTC m=+439.231019195" watchObservedRunningTime="2025-12-01 08:24:22.664205944 +0000 UTC m=+440.229197926" Dec 01 08:24:24 crc kubenswrapper[5004]: I1201 08:24:24.644621 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c093f9cb-0826-4b1b-a9be-a6a20dd5183c","Type":"ContainerStarted","Data":"3d8d81c4a6ffe71bf7e12153583c2c87d50e8204637c9b33310b4130d2d66203"} Dec 01 08:24:24 crc kubenswrapper[5004]: I1201 08:24:24.644906 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c093f9cb-0826-4b1b-a9be-a6a20dd5183c","Type":"ContainerStarted","Data":"b4cf552714fcfee0889de51af7714fb60f9a3c1b0dd95790b56e9945b35247b6"} Dec 01 08:24:24 crc kubenswrapper[5004]: I1201 08:24:24.644919 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c093f9cb-0826-4b1b-a9be-a6a20dd5183c","Type":"ContainerStarted","Data":"f2063724342807d731a0fc595348c28e08220ccf67973a84a82ed9101533bf06"} Dec 01 08:24:24 crc kubenswrapper[5004]: I1201 08:24:24.644930 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c093f9cb-0826-4b1b-a9be-a6a20dd5183c","Type":"ContainerStarted","Data":"28782773d64cdd6c6efe6970acb0376b0eb2ce10171bbf4364ad4bd4a7c080c2"} Dec 01 08:24:24 
crc kubenswrapper[5004]: I1201 08:24:24.644943 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c093f9cb-0826-4b1b-a9be-a6a20dd5183c","Type":"ContainerStarted","Data":"a66f1aef3cd9013ffaf7926362a62618c2d0f075e5f7568d5163ea3cacca3488"} Dec 01 08:24:25 crc kubenswrapper[5004]: I1201 08:24:25.655413 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c093f9cb-0826-4b1b-a9be-a6a20dd5183c","Type":"ContainerStarted","Data":"828fcb9b63a8170de1e0bc95051d6e1f3506750fdeb126d109feaf8b971bf80e"} Dec 01 08:24:25 crc kubenswrapper[5004]: I1201 08:24:25.690372 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=3.430233374 podStartE2EDuration="8.690355015s" podCreationTimestamp="2025-12-01 08:24:17 +0000 UTC" firstStartedPulling="2025-12-01 08:24:18.566136236 +0000 UTC m=+436.131128228" lastFinishedPulling="2025-12-01 08:24:23.826257847 +0000 UTC m=+441.391249869" observedRunningTime="2025-12-01 08:24:25.686478666 +0000 UTC m=+443.251470668" watchObservedRunningTime="2025-12-01 08:24:25.690355015 +0000 UTC m=+443.255346997" Dec 01 08:24:25 crc kubenswrapper[5004]: I1201 08:24:25.823781 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5d975cc89b-pdv2q" Dec 01 08:24:25 crc kubenswrapper[5004]: I1201 08:24:25.823885 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-5d975cc89b-pdv2q" Dec 01 08:24:25 crc kubenswrapper[5004]: I1201 08:24:25.832708 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5d975cc89b-pdv2q" Dec 01 08:24:26 crc kubenswrapper[5004]: I1201 08:24:26.674535 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5d975cc89b-pdv2q" Dec 01 08:24:26 crc 
kubenswrapper[5004]: I1201 08:24:26.777629 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-th28b"] Dec 01 08:24:27 crc kubenswrapper[5004]: I1201 08:24:27.499494 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-k8s-0" Dec 01 08:24:36 crc kubenswrapper[5004]: I1201 08:24:36.328686 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/metrics-server-65d68bbd67-86r62" Dec 01 08:24:36 crc kubenswrapper[5004]: I1201 08:24:36.329357 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-65d68bbd67-86r62" Dec 01 08:24:51 crc kubenswrapper[5004]: I1201 08:24:51.845078 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-th28b" podUID="ce579b07-073d-450d-b056-1be2c7bed20f" containerName="console" containerID="cri-o://75bc034389fa490d4b05cb7cd396fac0f7ac00cd2e0e51a4f7d03dc71bc13202" gracePeriod=15 Dec 01 08:24:52 crc kubenswrapper[5004]: I1201 08:24:52.880876 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-th28b_ce579b07-073d-450d-b056-1be2c7bed20f/console/0.log" Dec 01 08:24:52 crc kubenswrapper[5004]: I1201 08:24:52.881204 5004 generic.go:334] "Generic (PLEG): container finished" podID="ce579b07-073d-450d-b056-1be2c7bed20f" containerID="75bc034389fa490d4b05cb7cd396fac0f7ac00cd2e0e51a4f7d03dc71bc13202" exitCode=2 Dec 01 08:24:52 crc kubenswrapper[5004]: I1201 08:24:52.881262 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-th28b" event={"ID":"ce579b07-073d-450d-b056-1be2c7bed20f","Type":"ContainerDied","Data":"75bc034389fa490d4b05cb7cd396fac0f7ac00cd2e0e51a4f7d03dc71bc13202"} Dec 01 08:24:55 crc kubenswrapper[5004]: I1201 08:24:55.699077 5004 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console_console-f9d7485db-th28b_ce579b07-073d-450d-b056-1be2c7bed20f/console/0.log" Dec 01 08:24:55 crc kubenswrapper[5004]: I1201 08:24:55.699337 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-th28b" Dec 01 08:24:55 crc kubenswrapper[5004]: I1201 08:24:55.836856 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lv9xz\" (UniqueName: \"kubernetes.io/projected/ce579b07-073d-450d-b056-1be2c7bed20f-kube-api-access-lv9xz\") pod \"ce579b07-073d-450d-b056-1be2c7bed20f\" (UID: \"ce579b07-073d-450d-b056-1be2c7bed20f\") " Dec 01 08:24:55 crc kubenswrapper[5004]: I1201 08:24:55.836995 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ce579b07-073d-450d-b056-1be2c7bed20f-console-config\") pod \"ce579b07-073d-450d-b056-1be2c7bed20f\" (UID: \"ce579b07-073d-450d-b056-1be2c7bed20f\") " Dec 01 08:24:55 crc kubenswrapper[5004]: I1201 08:24:55.837235 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ce579b07-073d-450d-b056-1be2c7bed20f-service-ca\") pod \"ce579b07-073d-450d-b056-1be2c7bed20f\" (UID: \"ce579b07-073d-450d-b056-1be2c7bed20f\") " Dec 01 08:24:55 crc kubenswrapper[5004]: I1201 08:24:55.837296 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ce579b07-073d-450d-b056-1be2c7bed20f-trusted-ca-bundle\") pod \"ce579b07-073d-450d-b056-1be2c7bed20f\" (UID: \"ce579b07-073d-450d-b056-1be2c7bed20f\") " Dec 01 08:24:55 crc kubenswrapper[5004]: I1201 08:24:55.837351 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/ce579b07-073d-450d-b056-1be2c7bed20f-oauth-serving-cert\") pod \"ce579b07-073d-450d-b056-1be2c7bed20f\" (UID: \"ce579b07-073d-450d-b056-1be2c7bed20f\") " Dec 01 08:24:55 crc kubenswrapper[5004]: I1201 08:24:55.837425 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ce579b07-073d-450d-b056-1be2c7bed20f-console-oauth-config\") pod \"ce579b07-073d-450d-b056-1be2c7bed20f\" (UID: \"ce579b07-073d-450d-b056-1be2c7bed20f\") " Dec 01 08:24:55 crc kubenswrapper[5004]: I1201 08:24:55.837454 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ce579b07-073d-450d-b056-1be2c7bed20f-console-serving-cert\") pod \"ce579b07-073d-450d-b056-1be2c7bed20f\" (UID: \"ce579b07-073d-450d-b056-1be2c7bed20f\") " Dec 01 08:24:55 crc kubenswrapper[5004]: I1201 08:24:55.837996 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce579b07-073d-450d-b056-1be2c7bed20f-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "ce579b07-073d-450d-b056-1be2c7bed20f" (UID: "ce579b07-073d-450d-b056-1be2c7bed20f"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:24:55 crc kubenswrapper[5004]: I1201 08:24:55.838032 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce579b07-073d-450d-b056-1be2c7bed20f-service-ca" (OuterVolumeSpecName: "service-ca") pod "ce579b07-073d-450d-b056-1be2c7bed20f" (UID: "ce579b07-073d-450d-b056-1be2c7bed20f"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:24:55 crc kubenswrapper[5004]: I1201 08:24:55.838395 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce579b07-073d-450d-b056-1be2c7bed20f-console-config" (OuterVolumeSpecName: "console-config") pod "ce579b07-073d-450d-b056-1be2c7bed20f" (UID: "ce579b07-073d-450d-b056-1be2c7bed20f"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:24:55 crc kubenswrapper[5004]: I1201 08:24:55.838677 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce579b07-073d-450d-b056-1be2c7bed20f-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "ce579b07-073d-450d-b056-1be2c7bed20f" (UID: "ce579b07-073d-450d-b056-1be2c7bed20f"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:24:55 crc kubenswrapper[5004]: I1201 08:24:55.842804 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce579b07-073d-450d-b056-1be2c7bed20f-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "ce579b07-073d-450d-b056-1be2c7bed20f" (UID: "ce579b07-073d-450d-b056-1be2c7bed20f"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:24:55 crc kubenswrapper[5004]: I1201 08:24:55.842869 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce579b07-073d-450d-b056-1be2c7bed20f-kube-api-access-lv9xz" (OuterVolumeSpecName: "kube-api-access-lv9xz") pod "ce579b07-073d-450d-b056-1be2c7bed20f" (UID: "ce579b07-073d-450d-b056-1be2c7bed20f"). InnerVolumeSpecName "kube-api-access-lv9xz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:24:55 crc kubenswrapper[5004]: I1201 08:24:55.855656 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce579b07-073d-450d-b056-1be2c7bed20f-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "ce579b07-073d-450d-b056-1be2c7bed20f" (UID: "ce579b07-073d-450d-b056-1be2c7bed20f"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:24:55 crc kubenswrapper[5004]: I1201 08:24:55.904771 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-th28b_ce579b07-073d-450d-b056-1be2c7bed20f/console/0.log" Dec 01 08:24:55 crc kubenswrapper[5004]: I1201 08:24:55.904852 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-th28b" event={"ID":"ce579b07-073d-450d-b056-1be2c7bed20f","Type":"ContainerDied","Data":"0db55e0a2ffa2d216635500ddb7467e400bd220b238f92073dbde27e60df51cb"} Dec 01 08:24:55 crc kubenswrapper[5004]: I1201 08:24:55.904904 5004 scope.go:117] "RemoveContainer" containerID="75bc034389fa490d4b05cb7cd396fac0f7ac00cd2e0e51a4f7d03dc71bc13202" Dec 01 08:24:55 crc kubenswrapper[5004]: I1201 08:24:55.904918 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-th28b" Dec 01 08:24:55 crc kubenswrapper[5004]: I1201 08:24:55.936005 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-th28b"] Dec 01 08:24:55 crc kubenswrapper[5004]: I1201 08:24:55.943875 5004 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ce579b07-073d-450d-b056-1be2c7bed20f-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 08:24:55 crc kubenswrapper[5004]: I1201 08:24:55.943904 5004 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ce579b07-073d-450d-b056-1be2c7bed20f-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 08:24:55 crc kubenswrapper[5004]: I1201 08:24:55.943916 5004 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ce579b07-073d-450d-b056-1be2c7bed20f-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 01 08:24:55 crc kubenswrapper[5004]: I1201 08:24:55.943930 5004 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ce579b07-073d-450d-b056-1be2c7bed20f-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 08:24:55 crc kubenswrapper[5004]: I1201 08:24:55.943944 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lv9xz\" (UniqueName: \"kubernetes.io/projected/ce579b07-073d-450d-b056-1be2c7bed20f-kube-api-access-lv9xz\") on node \"crc\" DevicePath \"\"" Dec 01 08:24:55 crc kubenswrapper[5004]: I1201 08:24:55.943941 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-th28b"] Dec 01 08:24:55 crc kubenswrapper[5004]: I1201 08:24:55.943956 5004 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/ce579b07-073d-450d-b056-1be2c7bed20f-console-config\") on node \"crc\" DevicePath \"\"" Dec 01 08:24:55 crc kubenswrapper[5004]: I1201 08:24:55.944044 5004 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ce579b07-073d-450d-b056-1be2c7bed20f-service-ca\") on node \"crc\" DevicePath \"\"" Dec 01 08:24:56 crc kubenswrapper[5004]: I1201 08:24:56.337519 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-65d68bbd67-86r62" Dec 01 08:24:56 crc kubenswrapper[5004]: I1201 08:24:56.343644 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-65d68bbd67-86r62" Dec 01 08:24:56 crc kubenswrapper[5004]: I1201 08:24:56.775537 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce579b07-073d-450d-b056-1be2c7bed20f" path="/var/lib/kubelet/pods/ce579b07-073d-450d-b056-1be2c7bed20f/volumes" Dec 01 08:25:17 crc kubenswrapper[5004]: I1201 08:25:17.499090 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Dec 01 08:25:17 crc kubenswrapper[5004]: I1201 08:25:17.543075 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Dec 01 08:25:18 crc kubenswrapper[5004]: I1201 08:25:18.112122 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Dec 01 08:25:32 crc kubenswrapper[5004]: I1201 08:25:32.443526 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-95c5d55ff-kpnt7"] Dec 01 08:25:32 crc kubenswrapper[5004]: E1201 08:25:32.445444 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce579b07-073d-450d-b056-1be2c7bed20f" containerName="console" Dec 01 08:25:32 crc kubenswrapper[5004]: I1201 08:25:32.445525 5004 
state_mem.go:107] "Deleted CPUSet assignment" podUID="ce579b07-073d-450d-b056-1be2c7bed20f" containerName="console" Dec 01 08:25:32 crc kubenswrapper[5004]: I1201 08:25:32.445756 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce579b07-073d-450d-b056-1be2c7bed20f" containerName="console" Dec 01 08:25:32 crc kubenswrapper[5004]: I1201 08:25:32.446318 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-95c5d55ff-kpnt7" Dec 01 08:25:32 crc kubenswrapper[5004]: I1201 08:25:32.467069 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-95c5d55ff-kpnt7"] Dec 01 08:25:32 crc kubenswrapper[5004]: I1201 08:25:32.544128 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/966ebea9-4ef2-491b-b170-b7f2788fbe9a-console-serving-cert\") pod \"console-95c5d55ff-kpnt7\" (UID: \"966ebea9-4ef2-491b-b170-b7f2788fbe9a\") " pod="openshift-console/console-95c5d55ff-kpnt7" Dec 01 08:25:32 crc kubenswrapper[5004]: I1201 08:25:32.544217 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/966ebea9-4ef2-491b-b170-b7f2788fbe9a-oauth-serving-cert\") pod \"console-95c5d55ff-kpnt7\" (UID: \"966ebea9-4ef2-491b-b170-b7f2788fbe9a\") " pod="openshift-console/console-95c5d55ff-kpnt7" Dec 01 08:25:32 crc kubenswrapper[5004]: I1201 08:25:32.544238 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/966ebea9-4ef2-491b-b170-b7f2788fbe9a-service-ca\") pod \"console-95c5d55ff-kpnt7\" (UID: \"966ebea9-4ef2-491b-b170-b7f2788fbe9a\") " pod="openshift-console/console-95c5d55ff-kpnt7" Dec 01 08:25:32 crc kubenswrapper[5004]: I1201 08:25:32.544308 5004 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/966ebea9-4ef2-491b-b170-b7f2788fbe9a-console-oauth-config\") pod \"console-95c5d55ff-kpnt7\" (UID: \"966ebea9-4ef2-491b-b170-b7f2788fbe9a\") " pod="openshift-console/console-95c5d55ff-kpnt7" Dec 01 08:25:32 crc kubenswrapper[5004]: I1201 08:25:32.544356 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/966ebea9-4ef2-491b-b170-b7f2788fbe9a-trusted-ca-bundle\") pod \"console-95c5d55ff-kpnt7\" (UID: \"966ebea9-4ef2-491b-b170-b7f2788fbe9a\") " pod="openshift-console/console-95c5d55ff-kpnt7" Dec 01 08:25:32 crc kubenswrapper[5004]: I1201 08:25:32.544429 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7x8j\" (UniqueName: \"kubernetes.io/projected/966ebea9-4ef2-491b-b170-b7f2788fbe9a-kube-api-access-g7x8j\") pod \"console-95c5d55ff-kpnt7\" (UID: \"966ebea9-4ef2-491b-b170-b7f2788fbe9a\") " pod="openshift-console/console-95c5d55ff-kpnt7" Dec 01 08:25:32 crc kubenswrapper[5004]: I1201 08:25:32.544513 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/966ebea9-4ef2-491b-b170-b7f2788fbe9a-console-config\") pod \"console-95c5d55ff-kpnt7\" (UID: \"966ebea9-4ef2-491b-b170-b7f2788fbe9a\") " pod="openshift-console/console-95c5d55ff-kpnt7" Dec 01 08:25:32 crc kubenswrapper[5004]: I1201 08:25:32.645241 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/966ebea9-4ef2-491b-b170-b7f2788fbe9a-console-config\") pod \"console-95c5d55ff-kpnt7\" (UID: \"966ebea9-4ef2-491b-b170-b7f2788fbe9a\") " pod="openshift-console/console-95c5d55ff-kpnt7" Dec 01 08:25:32 crc kubenswrapper[5004]: I1201 
08:25:32.645296 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/966ebea9-4ef2-491b-b170-b7f2788fbe9a-oauth-serving-cert\") pod \"console-95c5d55ff-kpnt7\" (UID: \"966ebea9-4ef2-491b-b170-b7f2788fbe9a\") " pod="openshift-console/console-95c5d55ff-kpnt7" Dec 01 08:25:32 crc kubenswrapper[5004]: I1201 08:25:32.645316 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/966ebea9-4ef2-491b-b170-b7f2788fbe9a-console-serving-cert\") pod \"console-95c5d55ff-kpnt7\" (UID: \"966ebea9-4ef2-491b-b170-b7f2788fbe9a\") " pod="openshift-console/console-95c5d55ff-kpnt7" Dec 01 08:25:32 crc kubenswrapper[5004]: I1201 08:25:32.645333 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/966ebea9-4ef2-491b-b170-b7f2788fbe9a-service-ca\") pod \"console-95c5d55ff-kpnt7\" (UID: \"966ebea9-4ef2-491b-b170-b7f2788fbe9a\") " pod="openshift-console/console-95c5d55ff-kpnt7" Dec 01 08:25:32 crc kubenswrapper[5004]: I1201 08:25:32.645368 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/966ebea9-4ef2-491b-b170-b7f2788fbe9a-console-oauth-config\") pod \"console-95c5d55ff-kpnt7\" (UID: \"966ebea9-4ef2-491b-b170-b7f2788fbe9a\") " pod="openshift-console/console-95c5d55ff-kpnt7" Dec 01 08:25:32 crc kubenswrapper[5004]: I1201 08:25:32.645412 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/966ebea9-4ef2-491b-b170-b7f2788fbe9a-trusted-ca-bundle\") pod \"console-95c5d55ff-kpnt7\" (UID: \"966ebea9-4ef2-491b-b170-b7f2788fbe9a\") " pod="openshift-console/console-95c5d55ff-kpnt7" Dec 01 08:25:32 crc kubenswrapper[5004]: I1201 08:25:32.645444 5004 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7x8j\" (UniqueName: \"kubernetes.io/projected/966ebea9-4ef2-491b-b170-b7f2788fbe9a-kube-api-access-g7x8j\") pod \"console-95c5d55ff-kpnt7\" (UID: \"966ebea9-4ef2-491b-b170-b7f2788fbe9a\") " pod="openshift-console/console-95c5d55ff-kpnt7" Dec 01 08:25:32 crc kubenswrapper[5004]: I1201 08:25:32.650468 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/966ebea9-4ef2-491b-b170-b7f2788fbe9a-oauth-serving-cert\") pod \"console-95c5d55ff-kpnt7\" (UID: \"966ebea9-4ef2-491b-b170-b7f2788fbe9a\") " pod="openshift-console/console-95c5d55ff-kpnt7" Dec 01 08:25:32 crc kubenswrapper[5004]: I1201 08:25:32.651118 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/966ebea9-4ef2-491b-b170-b7f2788fbe9a-trusted-ca-bundle\") pod \"console-95c5d55ff-kpnt7\" (UID: \"966ebea9-4ef2-491b-b170-b7f2788fbe9a\") " pod="openshift-console/console-95c5d55ff-kpnt7" Dec 01 08:25:32 crc kubenswrapper[5004]: I1201 08:25:32.651322 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/966ebea9-4ef2-491b-b170-b7f2788fbe9a-console-config\") pod \"console-95c5d55ff-kpnt7\" (UID: \"966ebea9-4ef2-491b-b170-b7f2788fbe9a\") " pod="openshift-console/console-95c5d55ff-kpnt7" Dec 01 08:25:32 crc kubenswrapper[5004]: I1201 08:25:32.657105 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/966ebea9-4ef2-491b-b170-b7f2788fbe9a-service-ca\") pod \"console-95c5d55ff-kpnt7\" (UID: \"966ebea9-4ef2-491b-b170-b7f2788fbe9a\") " pod="openshift-console/console-95c5d55ff-kpnt7" Dec 01 08:25:32 crc kubenswrapper[5004]: I1201 08:25:32.665006 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/966ebea9-4ef2-491b-b170-b7f2788fbe9a-console-serving-cert\") pod \"console-95c5d55ff-kpnt7\" (UID: \"966ebea9-4ef2-491b-b170-b7f2788fbe9a\") " pod="openshift-console/console-95c5d55ff-kpnt7" Dec 01 08:25:32 crc kubenswrapper[5004]: I1201 08:25:32.665712 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/966ebea9-4ef2-491b-b170-b7f2788fbe9a-console-oauth-config\") pod \"console-95c5d55ff-kpnt7\" (UID: \"966ebea9-4ef2-491b-b170-b7f2788fbe9a\") " pod="openshift-console/console-95c5d55ff-kpnt7" Dec 01 08:25:32 crc kubenswrapper[5004]: I1201 08:25:32.666671 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7x8j\" (UniqueName: \"kubernetes.io/projected/966ebea9-4ef2-491b-b170-b7f2788fbe9a-kube-api-access-g7x8j\") pod \"console-95c5d55ff-kpnt7\" (UID: \"966ebea9-4ef2-491b-b170-b7f2788fbe9a\") " pod="openshift-console/console-95c5d55ff-kpnt7" Dec 01 08:25:32 crc kubenswrapper[5004]: I1201 08:25:32.765342 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-95c5d55ff-kpnt7" Dec 01 08:25:33 crc kubenswrapper[5004]: I1201 08:25:33.029895 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-95c5d55ff-kpnt7"] Dec 01 08:25:33 crc kubenswrapper[5004]: I1201 08:25:33.163934 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-95c5d55ff-kpnt7" event={"ID":"966ebea9-4ef2-491b-b170-b7f2788fbe9a","Type":"ContainerStarted","Data":"2e947d22aa5f4710fa36eb3b60bf5a502718024255e96906e3af9570159d3f5c"} Dec 01 08:25:34 crc kubenswrapper[5004]: I1201 08:25:34.183953 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-95c5d55ff-kpnt7" event={"ID":"966ebea9-4ef2-491b-b170-b7f2788fbe9a","Type":"ContainerStarted","Data":"5bb03cb9dcc51c08fbf21c194bcf770e7ab94aeb6c7a8a38d0b6154099f069f1"} Dec 01 08:25:34 crc kubenswrapper[5004]: I1201 08:25:34.227905 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-95c5d55ff-kpnt7" podStartSLOduration=2.227880755 podStartE2EDuration="2.227880755s" podCreationTimestamp="2025-12-01 08:25:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:25:34.221768285 +0000 UTC m=+511.786760347" watchObservedRunningTime="2025-12-01 08:25:34.227880755 +0000 UTC m=+511.792872777" Dec 01 08:25:42 crc kubenswrapper[5004]: I1201 08:25:42.774471 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-95c5d55ff-kpnt7" Dec 01 08:25:42 crc kubenswrapper[5004]: I1201 08:25:42.775209 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-95c5d55ff-kpnt7" Dec 01 08:25:42 crc kubenswrapper[5004]: I1201 08:25:42.775234 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-95c5d55ff-kpnt7" 
Dec 01 08:25:42 crc kubenswrapper[5004]: I1201 08:25:42.783825 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-95c5d55ff-kpnt7" Dec 01 08:25:42 crc kubenswrapper[5004]: I1201 08:25:42.889097 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5d975cc89b-pdv2q"] Dec 01 08:26:07 crc kubenswrapper[5004]: I1201 08:26:07.960074 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-5d975cc89b-pdv2q" podUID="e64c8553-0fbc-499f-a966-357925f13415" containerName="console" containerID="cri-o://2b8d091ab5d287d00b72bf6045598cc6bef004ab1539a5c5ba3edeffd47a8234" gracePeriod=15 Dec 01 08:26:08 crc kubenswrapper[5004]: I1201 08:26:08.355190 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5d975cc89b-pdv2q_e64c8553-0fbc-499f-a966-357925f13415/console/0.log" Dec 01 08:26:08 crc kubenswrapper[5004]: I1201 08:26:08.355437 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5d975cc89b-pdv2q" Dec 01 08:26:08 crc kubenswrapper[5004]: I1201 08:26:08.436330 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5d975cc89b-pdv2q_e64c8553-0fbc-499f-a966-357925f13415/console/0.log" Dec 01 08:26:08 crc kubenswrapper[5004]: I1201 08:26:08.436383 5004 generic.go:334] "Generic (PLEG): container finished" podID="e64c8553-0fbc-499f-a966-357925f13415" containerID="2b8d091ab5d287d00b72bf6045598cc6bef004ab1539a5c5ba3edeffd47a8234" exitCode=2 Dec 01 08:26:08 crc kubenswrapper[5004]: I1201 08:26:08.436409 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5d975cc89b-pdv2q" event={"ID":"e64c8553-0fbc-499f-a966-357925f13415","Type":"ContainerDied","Data":"2b8d091ab5d287d00b72bf6045598cc6bef004ab1539a5c5ba3edeffd47a8234"} Dec 01 08:26:08 crc kubenswrapper[5004]: I1201 08:26:08.436434 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5d975cc89b-pdv2q" event={"ID":"e64c8553-0fbc-499f-a966-357925f13415","Type":"ContainerDied","Data":"c96443c937a5258d09e5a5a03494a8f92d42aaca69a5c2d6dbe1dca7526f736f"} Dec 01 08:26:08 crc kubenswrapper[5004]: I1201 08:26:08.436449 5004 scope.go:117] "RemoveContainer" containerID="2b8d091ab5d287d00b72bf6045598cc6bef004ab1539a5c5ba3edeffd47a8234" Dec 01 08:26:08 crc kubenswrapper[5004]: I1201 08:26:08.436484 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5d975cc89b-pdv2q" Dec 01 08:26:08 crc kubenswrapper[5004]: I1201 08:26:08.450259 5004 scope.go:117] "RemoveContainer" containerID="2b8d091ab5d287d00b72bf6045598cc6bef004ab1539a5c5ba3edeffd47a8234" Dec 01 08:26:08 crc kubenswrapper[5004]: E1201 08:26:08.450685 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b8d091ab5d287d00b72bf6045598cc6bef004ab1539a5c5ba3edeffd47a8234\": container with ID starting with 2b8d091ab5d287d00b72bf6045598cc6bef004ab1539a5c5ba3edeffd47a8234 not found: ID does not exist" containerID="2b8d091ab5d287d00b72bf6045598cc6bef004ab1539a5c5ba3edeffd47a8234" Dec 01 08:26:08 crc kubenswrapper[5004]: I1201 08:26:08.450717 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b8d091ab5d287d00b72bf6045598cc6bef004ab1539a5c5ba3edeffd47a8234"} err="failed to get container status \"2b8d091ab5d287d00b72bf6045598cc6bef004ab1539a5c5ba3edeffd47a8234\": rpc error: code = NotFound desc = could not find container \"2b8d091ab5d287d00b72bf6045598cc6bef004ab1539a5c5ba3edeffd47a8234\": container with ID starting with 2b8d091ab5d287d00b72bf6045598cc6bef004ab1539a5c5ba3edeffd47a8234 not found: ID does not exist" Dec 01 08:26:08 crc kubenswrapper[5004]: I1201 08:26:08.502172 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jmrjt\" (UniqueName: \"kubernetes.io/projected/e64c8553-0fbc-499f-a966-357925f13415-kube-api-access-jmrjt\") pod \"e64c8553-0fbc-499f-a966-357925f13415\" (UID: \"e64c8553-0fbc-499f-a966-357925f13415\") " Dec 01 08:26:08 crc kubenswrapper[5004]: I1201 08:26:08.502225 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e64c8553-0fbc-499f-a966-357925f13415-trusted-ca-bundle\") pod \"e64c8553-0fbc-499f-a966-357925f13415\" (UID: 
\"e64c8553-0fbc-499f-a966-357925f13415\") " Dec 01 08:26:08 crc kubenswrapper[5004]: I1201 08:26:08.502310 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e64c8553-0fbc-499f-a966-357925f13415-oauth-serving-cert\") pod \"e64c8553-0fbc-499f-a966-357925f13415\" (UID: \"e64c8553-0fbc-499f-a966-357925f13415\") " Dec 01 08:26:08 crc kubenswrapper[5004]: I1201 08:26:08.502400 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e64c8553-0fbc-499f-a966-357925f13415-console-config\") pod \"e64c8553-0fbc-499f-a966-357925f13415\" (UID: \"e64c8553-0fbc-499f-a966-357925f13415\") " Dec 01 08:26:08 crc kubenswrapper[5004]: I1201 08:26:08.502444 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e64c8553-0fbc-499f-a966-357925f13415-service-ca\") pod \"e64c8553-0fbc-499f-a966-357925f13415\" (UID: \"e64c8553-0fbc-499f-a966-357925f13415\") " Dec 01 08:26:08 crc kubenswrapper[5004]: I1201 08:26:08.502503 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e64c8553-0fbc-499f-a966-357925f13415-console-serving-cert\") pod \"e64c8553-0fbc-499f-a966-357925f13415\" (UID: \"e64c8553-0fbc-499f-a966-357925f13415\") " Dec 01 08:26:08 crc kubenswrapper[5004]: I1201 08:26:08.502526 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e64c8553-0fbc-499f-a966-357925f13415-console-oauth-config\") pod \"e64c8553-0fbc-499f-a966-357925f13415\" (UID: \"e64c8553-0fbc-499f-a966-357925f13415\") " Dec 01 08:26:08 crc kubenswrapper[5004]: I1201 08:26:08.503366 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/e64c8553-0fbc-499f-a966-357925f13415-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "e64c8553-0fbc-499f-a966-357925f13415" (UID: "e64c8553-0fbc-499f-a966-357925f13415"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:26:08 crc kubenswrapper[5004]: I1201 08:26:08.503468 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e64c8553-0fbc-499f-a966-357925f13415-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "e64c8553-0fbc-499f-a966-357925f13415" (UID: "e64c8553-0fbc-499f-a966-357925f13415"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:26:08 crc kubenswrapper[5004]: I1201 08:26:08.503910 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e64c8553-0fbc-499f-a966-357925f13415-service-ca" (OuterVolumeSpecName: "service-ca") pod "e64c8553-0fbc-499f-a966-357925f13415" (UID: "e64c8553-0fbc-499f-a966-357925f13415"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:26:08 crc kubenswrapper[5004]: I1201 08:26:08.504071 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e64c8553-0fbc-499f-a966-357925f13415-console-config" (OuterVolumeSpecName: "console-config") pod "e64c8553-0fbc-499f-a966-357925f13415" (UID: "e64c8553-0fbc-499f-a966-357925f13415"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:26:08 crc kubenswrapper[5004]: I1201 08:26:08.508112 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e64c8553-0fbc-499f-a966-357925f13415-kube-api-access-jmrjt" (OuterVolumeSpecName: "kube-api-access-jmrjt") pod "e64c8553-0fbc-499f-a966-357925f13415" (UID: "e64c8553-0fbc-499f-a966-357925f13415"). 
InnerVolumeSpecName "kube-api-access-jmrjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:26:08 crc kubenswrapper[5004]: I1201 08:26:08.508359 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e64c8553-0fbc-499f-a966-357925f13415-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "e64c8553-0fbc-499f-a966-357925f13415" (UID: "e64c8553-0fbc-499f-a966-357925f13415"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:26:08 crc kubenswrapper[5004]: I1201 08:26:08.509870 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e64c8553-0fbc-499f-a966-357925f13415-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "e64c8553-0fbc-499f-a966-357925f13415" (UID: "e64c8553-0fbc-499f-a966-357925f13415"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:26:08 crc kubenswrapper[5004]: I1201 08:26:08.604303 5004 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e64c8553-0fbc-499f-a966-357925f13415-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 08:26:08 crc kubenswrapper[5004]: I1201 08:26:08.604343 5004 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e64c8553-0fbc-499f-a966-357925f13415-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 01 08:26:08 crc kubenswrapper[5004]: I1201 08:26:08.604357 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jmrjt\" (UniqueName: \"kubernetes.io/projected/e64c8553-0fbc-499f-a966-357925f13415-kube-api-access-jmrjt\") on node \"crc\" DevicePath \"\"" Dec 01 08:26:08 crc kubenswrapper[5004]: I1201 08:26:08.604372 5004 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/e64c8553-0fbc-499f-a966-357925f13415-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 08:26:08 crc kubenswrapper[5004]: I1201 08:26:08.604384 5004 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e64c8553-0fbc-499f-a966-357925f13415-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 08:26:08 crc kubenswrapper[5004]: I1201 08:26:08.604395 5004 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e64c8553-0fbc-499f-a966-357925f13415-console-config\") on node \"crc\" DevicePath \"\"" Dec 01 08:26:08 crc kubenswrapper[5004]: I1201 08:26:08.604423 5004 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e64c8553-0fbc-499f-a966-357925f13415-service-ca\") on node \"crc\" DevicePath \"\"" Dec 01 08:26:08 crc kubenswrapper[5004]: I1201 08:26:08.783358 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5d975cc89b-pdv2q"] Dec 01 08:26:08 crc kubenswrapper[5004]: I1201 08:26:08.784453 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-5d975cc89b-pdv2q"] Dec 01 08:26:10 crc kubenswrapper[5004]: I1201 08:26:10.771492 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e64c8553-0fbc-499f-a966-357925f13415" path="/var/lib/kubelet/pods/e64c8553-0fbc-499f-a966-357925f13415/volumes" Dec 01 08:26:38 crc kubenswrapper[5004]: I1201 08:26:38.729291 5004 patch_prober.go:28] interesting pod/machine-config-daemon-fvdgt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 08:26:38 crc kubenswrapper[5004]: I1201 08:26:38.729930 5004 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 08:27:08 crc kubenswrapper[5004]: I1201 08:27:08.729719 5004 patch_prober.go:28] interesting pod/machine-config-daemon-fvdgt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 08:27:08 crc kubenswrapper[5004]: I1201 08:27:08.730250 5004 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 08:27:38 crc kubenswrapper[5004]: I1201 08:27:38.729266 5004 patch_prober.go:28] interesting pod/machine-config-daemon-fvdgt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 08:27:38 crc kubenswrapper[5004]: I1201 08:27:38.730099 5004 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 08:27:38 crc kubenswrapper[5004]: I1201 08:27:38.730171 5004 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" Dec 01 08:27:38 crc 
kubenswrapper[5004]: I1201 08:27:38.731620 5004 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f5cb4e2ac3b55859ead5c898a2b42c280b2d9fe9b770bdbbd6d9799deecd9d6a"} pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 08:27:38 crc kubenswrapper[5004]: I1201 08:27:38.731738 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" containerName="machine-config-daemon" containerID="cri-o://f5cb4e2ac3b55859ead5c898a2b42c280b2d9fe9b770bdbbd6d9799deecd9d6a" gracePeriod=600 Dec 01 08:27:39 crc kubenswrapper[5004]: I1201 08:27:39.446378 5004 generic.go:334] "Generic (PLEG): container finished" podID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" containerID="f5cb4e2ac3b55859ead5c898a2b42c280b2d9fe9b770bdbbd6d9799deecd9d6a" exitCode=0 Dec 01 08:27:39 crc kubenswrapper[5004]: I1201 08:27:39.446431 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" event={"ID":"f9977ebb-82de-4e96-8763-0b5a84f8d4ce","Type":"ContainerDied","Data":"f5cb4e2ac3b55859ead5c898a2b42c280b2d9fe9b770bdbbd6d9799deecd9d6a"} Dec 01 08:27:39 crc kubenswrapper[5004]: I1201 08:27:39.446469 5004 scope.go:117] "RemoveContainer" containerID="aee81d40a16962a7717cc3a5a3263157cb0e536c40bc2b3b83dfa0f852f31e2a" Dec 01 08:27:40 crc kubenswrapper[5004]: I1201 08:27:40.457258 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" event={"ID":"f9977ebb-82de-4e96-8763-0b5a84f8d4ce","Type":"ContainerStarted","Data":"8b94d92321b66c5263a45c381dbbdfe95975b64015e15b4b3949d9d6b2469402"} Dec 01 08:28:44 crc kubenswrapper[5004]: I1201 08:28:44.568994 5004 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210gbrgb"] Dec 01 08:28:44 crc kubenswrapper[5004]: E1201 08:28:44.569758 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e64c8553-0fbc-499f-a966-357925f13415" containerName="console" Dec 01 08:28:44 crc kubenswrapper[5004]: I1201 08:28:44.569773 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="e64c8553-0fbc-499f-a966-357925f13415" containerName="console" Dec 01 08:28:44 crc kubenswrapper[5004]: I1201 08:28:44.569892 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="e64c8553-0fbc-499f-a966-357925f13415" containerName="console" Dec 01 08:28:44 crc kubenswrapper[5004]: I1201 08:28:44.570881 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210gbrgb" Dec 01 08:28:44 crc kubenswrapper[5004]: I1201 08:28:44.573434 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 01 08:28:44 crc kubenswrapper[5004]: I1201 08:28:44.580385 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210gbrgb"] Dec 01 08:28:44 crc kubenswrapper[5004]: I1201 08:28:44.747236 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/06544def-087a-4ce3-ae2b-af6a06799add-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210gbrgb\" (UID: \"06544def-087a-4ce3-ae2b-af6a06799add\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210gbrgb" Dec 01 08:28:44 crc kubenswrapper[5004]: I1201 08:28:44.747341 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwgtl\" (UniqueName: 
\"kubernetes.io/projected/06544def-087a-4ce3-ae2b-af6a06799add-kube-api-access-rwgtl\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210gbrgb\" (UID: \"06544def-087a-4ce3-ae2b-af6a06799add\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210gbrgb" Dec 01 08:28:44 crc kubenswrapper[5004]: I1201 08:28:44.747451 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/06544def-087a-4ce3-ae2b-af6a06799add-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210gbrgb\" (UID: \"06544def-087a-4ce3-ae2b-af6a06799add\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210gbrgb" Dec 01 08:28:44 crc kubenswrapper[5004]: I1201 08:28:44.849056 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/06544def-087a-4ce3-ae2b-af6a06799add-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210gbrgb\" (UID: \"06544def-087a-4ce3-ae2b-af6a06799add\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210gbrgb" Dec 01 08:28:44 crc kubenswrapper[5004]: I1201 08:28:44.849515 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rwgtl\" (UniqueName: \"kubernetes.io/projected/06544def-087a-4ce3-ae2b-af6a06799add-kube-api-access-rwgtl\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210gbrgb\" (UID: \"06544def-087a-4ce3-ae2b-af6a06799add\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210gbrgb" Dec 01 08:28:44 crc kubenswrapper[5004]: I1201 08:28:44.849785 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/06544def-087a-4ce3-ae2b-af6a06799add-util\") pod 
\"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210gbrgb\" (UID: \"06544def-087a-4ce3-ae2b-af6a06799add\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210gbrgb" Dec 01 08:28:44 crc kubenswrapper[5004]: I1201 08:28:44.849791 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/06544def-087a-4ce3-ae2b-af6a06799add-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210gbrgb\" (UID: \"06544def-087a-4ce3-ae2b-af6a06799add\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210gbrgb" Dec 01 08:28:44 crc kubenswrapper[5004]: I1201 08:28:44.850539 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/06544def-087a-4ce3-ae2b-af6a06799add-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210gbrgb\" (UID: \"06544def-087a-4ce3-ae2b-af6a06799add\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210gbrgb" Dec 01 08:28:44 crc kubenswrapper[5004]: I1201 08:28:44.886522 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwgtl\" (UniqueName: \"kubernetes.io/projected/06544def-087a-4ce3-ae2b-af6a06799add-kube-api-access-rwgtl\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210gbrgb\" (UID: \"06544def-087a-4ce3-ae2b-af6a06799add\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210gbrgb" Dec 01 08:28:44 crc kubenswrapper[5004]: I1201 08:28:44.890588 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210gbrgb" Dec 01 08:28:45 crc kubenswrapper[5004]: I1201 08:28:45.139284 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210gbrgb"] Dec 01 08:28:45 crc kubenswrapper[5004]: W1201 08:28:45.143745 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod06544def_087a_4ce3_ae2b_af6a06799add.slice/crio-1c9672e3576383786750e3a3b4d69f7a0238fc519a4bc1872dee0f0b35f6909e WatchSource:0}: Error finding container 1c9672e3576383786750e3a3b4d69f7a0238fc519a4bc1872dee0f0b35f6909e: Status 404 returned error can't find the container with id 1c9672e3576383786750e3a3b4d69f7a0238fc519a4bc1872dee0f0b35f6909e Dec 01 08:28:45 crc kubenswrapper[5004]: I1201 08:28:45.952857 5004 generic.go:334] "Generic (PLEG): container finished" podID="06544def-087a-4ce3-ae2b-af6a06799add" containerID="1961fec90fcc69ebf113e59583945366f01f33e8855dbdc03a0c48ebf84ddd1c" exitCode=0 Dec 01 08:28:45 crc kubenswrapper[5004]: I1201 08:28:45.952974 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210gbrgb" event={"ID":"06544def-087a-4ce3-ae2b-af6a06799add","Type":"ContainerDied","Data":"1961fec90fcc69ebf113e59583945366f01f33e8855dbdc03a0c48ebf84ddd1c"} Dec 01 08:28:45 crc kubenswrapper[5004]: I1201 08:28:45.953241 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210gbrgb" event={"ID":"06544def-087a-4ce3-ae2b-af6a06799add","Type":"ContainerStarted","Data":"1c9672e3576383786750e3a3b4d69f7a0238fc519a4bc1872dee0f0b35f6909e"} Dec 01 08:28:45 crc kubenswrapper[5004]: I1201 08:28:45.955543 5004 provider.go:102] Refreshing cache for provider: 
*credentialprovider.defaultDockerConfigProvider Dec 01 08:28:47 crc kubenswrapper[5004]: I1201 08:28:47.972606 5004 generic.go:334] "Generic (PLEG): container finished" podID="06544def-087a-4ce3-ae2b-af6a06799add" containerID="6392d495ecd4bf7a3131c22c4d0d7c0eb4c8e4f936cc65e5d3c49e88f9250652" exitCode=0 Dec 01 08:28:47 crc kubenswrapper[5004]: I1201 08:28:47.972780 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210gbrgb" event={"ID":"06544def-087a-4ce3-ae2b-af6a06799add","Type":"ContainerDied","Data":"6392d495ecd4bf7a3131c22c4d0d7c0eb4c8e4f936cc65e5d3c49e88f9250652"} Dec 01 08:28:48 crc kubenswrapper[5004]: I1201 08:28:48.980608 5004 generic.go:334] "Generic (PLEG): container finished" podID="06544def-087a-4ce3-ae2b-af6a06799add" containerID="9b2f8ba50e65ddb11baa5367c66198f6b3bf44fe170c03a320f100c2693887c2" exitCode=0 Dec 01 08:28:48 crc kubenswrapper[5004]: I1201 08:28:48.980671 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210gbrgb" event={"ID":"06544def-087a-4ce3-ae2b-af6a06799add","Type":"ContainerDied","Data":"9b2f8ba50e65ddb11baa5367c66198f6b3bf44fe170c03a320f100c2693887c2"} Dec 01 08:28:50 crc kubenswrapper[5004]: I1201 08:28:50.289031 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210gbrgb" Dec 01 08:28:50 crc kubenswrapper[5004]: I1201 08:28:50.426083 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rwgtl\" (UniqueName: \"kubernetes.io/projected/06544def-087a-4ce3-ae2b-af6a06799add-kube-api-access-rwgtl\") pod \"06544def-087a-4ce3-ae2b-af6a06799add\" (UID: \"06544def-087a-4ce3-ae2b-af6a06799add\") " Dec 01 08:28:50 crc kubenswrapper[5004]: I1201 08:28:50.426176 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/06544def-087a-4ce3-ae2b-af6a06799add-bundle\") pod \"06544def-087a-4ce3-ae2b-af6a06799add\" (UID: \"06544def-087a-4ce3-ae2b-af6a06799add\") " Dec 01 08:28:50 crc kubenswrapper[5004]: I1201 08:28:50.426303 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/06544def-087a-4ce3-ae2b-af6a06799add-util\") pod \"06544def-087a-4ce3-ae2b-af6a06799add\" (UID: \"06544def-087a-4ce3-ae2b-af6a06799add\") " Dec 01 08:28:50 crc kubenswrapper[5004]: I1201 08:28:50.429901 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/06544def-087a-4ce3-ae2b-af6a06799add-bundle" (OuterVolumeSpecName: "bundle") pod "06544def-087a-4ce3-ae2b-af6a06799add" (UID: "06544def-087a-4ce3-ae2b-af6a06799add"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 08:28:50 crc kubenswrapper[5004]: I1201 08:28:50.432612 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06544def-087a-4ce3-ae2b-af6a06799add-kube-api-access-rwgtl" (OuterVolumeSpecName: "kube-api-access-rwgtl") pod "06544def-087a-4ce3-ae2b-af6a06799add" (UID: "06544def-087a-4ce3-ae2b-af6a06799add"). InnerVolumeSpecName "kube-api-access-rwgtl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:28:50 crc kubenswrapper[5004]: I1201 08:28:50.456270 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/06544def-087a-4ce3-ae2b-af6a06799add-util" (OuterVolumeSpecName: "util") pod "06544def-087a-4ce3-ae2b-af6a06799add" (UID: "06544def-087a-4ce3-ae2b-af6a06799add"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 08:28:50 crc kubenswrapper[5004]: I1201 08:28:50.528582 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rwgtl\" (UniqueName: \"kubernetes.io/projected/06544def-087a-4ce3-ae2b-af6a06799add-kube-api-access-rwgtl\") on node \"crc\" DevicePath \"\"" Dec 01 08:28:50 crc kubenswrapper[5004]: I1201 08:28:50.528636 5004 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/06544def-087a-4ce3-ae2b-af6a06799add-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 08:28:50 crc kubenswrapper[5004]: I1201 08:28:50.528648 5004 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/06544def-087a-4ce3-ae2b-af6a06799add-util\") on node \"crc\" DevicePath \"\"" Dec 01 08:28:50 crc kubenswrapper[5004]: I1201 08:28:50.999404 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210gbrgb" event={"ID":"06544def-087a-4ce3-ae2b-af6a06799add","Type":"ContainerDied","Data":"1c9672e3576383786750e3a3b4d69f7a0238fc519a4bc1872dee0f0b35f6909e"} Dec 01 08:28:50 crc kubenswrapper[5004]: I1201 08:28:50.999441 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1c9672e3576383786750e3a3b4d69f7a0238fc519a4bc1872dee0f0b35f6909e" Dec 01 08:28:50 crc kubenswrapper[5004]: I1201 08:28:50.999513 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210gbrgb" Dec 01 08:28:56 crc kubenswrapper[5004]: I1201 08:28:56.065976 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-knmdv"] Dec 01 08:28:56 crc kubenswrapper[5004]: I1201 08:28:56.066638 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" podUID="15cdec0a-5925-4966-a30b-f60c503f633e" containerName="ovn-controller" containerID="cri-o://1844a18a6109c25a25b9c8cb3ff65e69cc657f66be03dd97e883d24a774b7638" gracePeriod=30 Dec 01 08:28:56 crc kubenswrapper[5004]: I1201 08:28:56.066880 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" podUID="15cdec0a-5925-4966-a30b-f60c503f633e" containerName="kube-rbac-proxy-node" containerID="cri-o://f528b94ff12c7a804930dca4b93c23334a9ae3a53317750cd34b5695f08b4582" gracePeriod=30 Dec 01 08:28:56 crc kubenswrapper[5004]: I1201 08:28:56.066696 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" podUID="15cdec0a-5925-4966-a30b-f60c503f633e" containerName="nbdb" containerID="cri-o://e7c5c043438b515e37a6d6d5c47b92479bbb7322fe35c3a1bc57ee7b3a746544" gracePeriod=30 Dec 01 08:28:56 crc kubenswrapper[5004]: I1201 08:28:56.066776 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" podUID="15cdec0a-5925-4966-a30b-f60c503f633e" containerName="sbdb" containerID="cri-o://1f231e0092e1a85cc5d6e00fb8d3bd982223b670191ac3dc21d81ff1fab462a0" gracePeriod=30 Dec 01 08:28:56 crc kubenswrapper[5004]: I1201 08:28:56.066854 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" podUID="15cdec0a-5925-4966-a30b-f60c503f633e" containerName="northd" 
containerID="cri-o://23df33e45cc27e4d343ae6ec3fb88f2dfc125ba7da424515d4bee5ec19281832" gracePeriod=30 Dec 01 08:28:56 crc kubenswrapper[5004]: I1201 08:28:56.066796 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" podUID="15cdec0a-5925-4966-a30b-f60c503f633e" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://f001b70a83da28b1ea78a94f1cd70dfc63b3de5a21b041932d33c2ac970c986b" gracePeriod=30 Dec 01 08:28:56 crc kubenswrapper[5004]: I1201 08:28:56.067953 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" podUID="15cdec0a-5925-4966-a30b-f60c503f633e" containerName="ovn-acl-logging" containerID="cri-o://d4a7cab8e1594814a687b1433a92642a19eff7c2eb12f96fa06f08a2e270733d" gracePeriod=30 Dec 01 08:28:56 crc kubenswrapper[5004]: I1201 08:28:56.115206 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" podUID="15cdec0a-5925-4966-a30b-f60c503f633e" containerName="ovnkube-controller" containerID="cri-o://57bc8d8aca54ca6d854bd5065cc8d1346968b6531f884d87dc38f291e505419a" gracePeriod=30 Dec 01 08:28:56 crc kubenswrapper[5004]: E1201 08:28:56.782243 5004 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 57bc8d8aca54ca6d854bd5065cc8d1346968b6531f884d87dc38f291e505419a is running failed: container process not found" containerID="57bc8d8aca54ca6d854bd5065cc8d1346968b6531f884d87dc38f291e505419a" cmd=["/bin/bash","-c","#!/bin/bash\ntest -f /etc/cni/net.d/10-ovn-kubernetes.conf\n"] Dec 01 08:28:56 crc kubenswrapper[5004]: E1201 08:28:56.782662 5004 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 57bc8d8aca54ca6d854bd5065cc8d1346968b6531f884d87dc38f291e505419a 
is running failed: container process not found" containerID="57bc8d8aca54ca6d854bd5065cc8d1346968b6531f884d87dc38f291e505419a" cmd=["/bin/bash","-c","#!/bin/bash\ntest -f /etc/cni/net.d/10-ovn-kubernetes.conf\n"] Dec 01 08:28:56 crc kubenswrapper[5004]: E1201 08:28:56.782938 5004 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 57bc8d8aca54ca6d854bd5065cc8d1346968b6531f884d87dc38f291e505419a is running failed: container process not found" containerID="57bc8d8aca54ca6d854bd5065cc8d1346968b6531f884d87dc38f291e505419a" cmd=["/bin/bash","-c","#!/bin/bash\ntest -f /etc/cni/net.d/10-ovn-kubernetes.conf\n"] Dec 01 08:28:56 crc kubenswrapper[5004]: E1201 08:28:56.783019 5004 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 57bc8d8aca54ca6d854bd5065cc8d1346968b6531f884d87dc38f291e505419a is running failed: container process not found" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" podUID="15cdec0a-5925-4966-a30b-f60c503f633e" containerName="ovnkube-controller" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.045419 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zjksw_70e79009-93be-49c4-a6b3-e8a06bcea7f4/kube-multus/2.log" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.046430 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zjksw_70e79009-93be-49c4-a6b3-e8a06bcea7f4/kube-multus/1.log" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.046511 5004 generic.go:334] "Generic (PLEG): container finished" podID="70e79009-93be-49c4-a6b3-e8a06bcea7f4" containerID="c4499168a80cb7fe2301c6db0d0d9c80110f6f9bc8fc94b291f0b9b306dbb057" exitCode=2 Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.046695 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-zjksw" event={"ID":"70e79009-93be-49c4-a6b3-e8a06bcea7f4","Type":"ContainerDied","Data":"c4499168a80cb7fe2301c6db0d0d9c80110f6f9bc8fc94b291f0b9b306dbb057"} Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.046757 5004 scope.go:117] "RemoveContainer" containerID="862dd7caaf04a01f96f2dba70cd2226e85b55f172ee6a34c178f756a75832a08" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.047887 5004 scope.go:117] "RemoveContainer" containerID="c4499168a80cb7fe2301c6db0d0d9c80110f6f9bc8fc94b291f0b9b306dbb057" Dec 01 08:28:57 crc kubenswrapper[5004]: E1201 08:28:57.048614 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-zjksw_openshift-multus(70e79009-93be-49c4-a6b3-e8a06bcea7f4)\"" pod="openshift-multus/multus-zjksw" podUID="70e79009-93be-49c4-a6b3-e8a06bcea7f4" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.055096 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-knmdv_15cdec0a-5925-4966-a30b-f60c503f633e/ovnkube-controller/3.log" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.067531 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-knmdv_15cdec0a-5925-4966-a30b-f60c503f633e/ovn-acl-logging/0.log" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.070244 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-knmdv_15cdec0a-5925-4966-a30b-f60c503f633e/ovn-controller/0.log" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.070668 5004 generic.go:334] "Generic (PLEG): container finished" podID="15cdec0a-5925-4966-a30b-f60c503f633e" containerID="57bc8d8aca54ca6d854bd5065cc8d1346968b6531f884d87dc38f291e505419a" exitCode=0 Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.070694 5004 generic.go:334] "Generic 
(PLEG): container finished" podID="15cdec0a-5925-4966-a30b-f60c503f633e" containerID="1f231e0092e1a85cc5d6e00fb8d3bd982223b670191ac3dc21d81ff1fab462a0" exitCode=0 Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.070704 5004 generic.go:334] "Generic (PLEG): container finished" podID="15cdec0a-5925-4966-a30b-f60c503f633e" containerID="e7c5c043438b515e37a6d6d5c47b92479bbb7322fe35c3a1bc57ee7b3a746544" exitCode=0 Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.070712 5004 generic.go:334] "Generic (PLEG): container finished" podID="15cdec0a-5925-4966-a30b-f60c503f633e" containerID="23df33e45cc27e4d343ae6ec3fb88f2dfc125ba7da424515d4bee5ec19281832" exitCode=0 Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.070722 5004 generic.go:334] "Generic (PLEG): container finished" podID="15cdec0a-5925-4966-a30b-f60c503f633e" containerID="d4a7cab8e1594814a687b1433a92642a19eff7c2eb12f96fa06f08a2e270733d" exitCode=143 Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.070729 5004 generic.go:334] "Generic (PLEG): container finished" podID="15cdec0a-5925-4966-a30b-f60c503f633e" containerID="1844a18a6109c25a25b9c8cb3ff65e69cc657f66be03dd97e883d24a774b7638" exitCode=143 Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.070744 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" event={"ID":"15cdec0a-5925-4966-a30b-f60c503f633e","Type":"ContainerDied","Data":"57bc8d8aca54ca6d854bd5065cc8d1346968b6531f884d87dc38f291e505419a"} Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.070923 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" event={"ID":"15cdec0a-5925-4966-a30b-f60c503f633e","Type":"ContainerDied","Data":"1f231e0092e1a85cc5d6e00fb8d3bd982223b670191ac3dc21d81ff1fab462a0"} Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.071008 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" 
event={"ID":"15cdec0a-5925-4966-a30b-f60c503f633e","Type":"ContainerDied","Data":"e7c5c043438b515e37a6d6d5c47b92479bbb7322fe35c3a1bc57ee7b3a746544"} Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.071034 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" event={"ID":"15cdec0a-5925-4966-a30b-f60c503f633e","Type":"ContainerDied","Data":"23df33e45cc27e4d343ae6ec3fb88f2dfc125ba7da424515d4bee5ec19281832"} Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.071089 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" event={"ID":"15cdec0a-5925-4966-a30b-f60c503f633e","Type":"ContainerDied","Data":"d4a7cab8e1594814a687b1433a92642a19eff7c2eb12f96fa06f08a2e270733d"} Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.071119 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" event={"ID":"15cdec0a-5925-4966-a30b-f60c503f633e","Type":"ContainerDied","Data":"1844a18a6109c25a25b9c8cb3ff65e69cc657f66be03dd97e883d24a774b7638"} Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.107070 5004 scope.go:117] "RemoveContainer" containerID="06e08a797d0a4810bb4314a26c9ecefd52e0bd2f04615b2bd39c4ccf951a33c9" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.319939 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-knmdv_15cdec0a-5925-4966-a30b-f60c503f633e/ovn-acl-logging/0.log" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.320578 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-knmdv_15cdec0a-5925-4966-a30b-f60c503f633e/ovn-controller/0.log" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.321011 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.424734 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/15cdec0a-5925-4966-a30b-f60c503f633e-node-log\") pod \"15cdec0a-5925-4966-a30b-f60c503f633e\" (UID: \"15cdec0a-5925-4966-a30b-f60c503f633e\") " Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.424784 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/15cdec0a-5925-4966-a30b-f60c503f633e-host-kubelet\") pod \"15cdec0a-5925-4966-a30b-f60c503f633e\" (UID: \"15cdec0a-5925-4966-a30b-f60c503f633e\") " Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.424818 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/15cdec0a-5925-4966-a30b-f60c503f633e-log-socket\") pod \"15cdec0a-5925-4966-a30b-f60c503f633e\" (UID: \"15cdec0a-5925-4966-a30b-f60c503f633e\") " Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.424846 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/15cdec0a-5925-4966-a30b-f60c503f633e-host-cni-bin\") pod \"15cdec0a-5925-4966-a30b-f60c503f633e\" (UID: \"15cdec0a-5925-4966-a30b-f60c503f633e\") " Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.424870 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/15cdec0a-5925-4966-a30b-f60c503f633e-run-systemd\") pod \"15cdec0a-5925-4966-a30b-f60c503f633e\" (UID: \"15cdec0a-5925-4966-a30b-f60c503f633e\") " Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.424898 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/15cdec0a-5925-4966-a30b-f60c503f633e-host-run-ovn-kubernetes\") pod \"15cdec0a-5925-4966-a30b-f60c503f633e\" (UID: \"15cdec0a-5925-4966-a30b-f60c503f633e\") " Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.424928 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/15cdec0a-5925-4966-a30b-f60c503f633e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"15cdec0a-5925-4966-a30b-f60c503f633e\" (UID: \"15cdec0a-5925-4966-a30b-f60c503f633e\") " Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.424959 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/15cdec0a-5925-4966-a30b-f60c503f633e-env-overrides\") pod \"15cdec0a-5925-4966-a30b-f60c503f633e\" (UID: \"15cdec0a-5925-4966-a30b-f60c503f633e\") " Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.424980 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/15cdec0a-5925-4966-a30b-f60c503f633e-var-lib-openvswitch\") pod \"15cdec0a-5925-4966-a30b-f60c503f633e\" (UID: \"15cdec0a-5925-4966-a30b-f60c503f633e\") " Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.425021 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/15cdec0a-5925-4966-a30b-f60c503f633e-systemd-units\") pod \"15cdec0a-5925-4966-a30b-f60c503f633e\" (UID: \"15cdec0a-5925-4966-a30b-f60c503f633e\") " Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.425046 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/15cdec0a-5925-4966-a30b-f60c503f633e-host-run-netns\") pod \"15cdec0a-5925-4966-a30b-f60c503f633e\" (UID: 
\"15cdec0a-5925-4966-a30b-f60c503f633e\") " Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.425067 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/15cdec0a-5925-4966-a30b-f60c503f633e-ovnkube-script-lib\") pod \"15cdec0a-5925-4966-a30b-f60c503f633e\" (UID: \"15cdec0a-5925-4966-a30b-f60c503f633e\") " Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.425089 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/15cdec0a-5925-4966-a30b-f60c503f633e-host-slash\") pod \"15cdec0a-5925-4966-a30b-f60c503f633e\" (UID: \"15cdec0a-5925-4966-a30b-f60c503f633e\") " Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.425116 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/15cdec0a-5925-4966-a30b-f60c503f633e-etc-openvswitch\") pod \"15cdec0a-5925-4966-a30b-f60c503f633e\" (UID: \"15cdec0a-5925-4966-a30b-f60c503f633e\") " Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.425155 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/15cdec0a-5925-4966-a30b-f60c503f633e-run-ovn\") pod \"15cdec0a-5925-4966-a30b-f60c503f633e\" (UID: \"15cdec0a-5925-4966-a30b-f60c503f633e\") " Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.425175 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/15cdec0a-5925-4966-a30b-f60c503f633e-run-openvswitch\") pod \"15cdec0a-5925-4966-a30b-f60c503f633e\" (UID: \"15cdec0a-5925-4966-a30b-f60c503f633e\") " Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.425198 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/15cdec0a-5925-4966-a30b-f60c503f633e-host-cni-netd\") pod \"15cdec0a-5925-4966-a30b-f60c503f633e\" (UID: \"15cdec0a-5925-4966-a30b-f60c503f633e\") " Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.425227 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/15cdec0a-5925-4966-a30b-f60c503f633e-ovnkube-config\") pod \"15cdec0a-5925-4966-a30b-f60c503f633e\" (UID: \"15cdec0a-5925-4966-a30b-f60c503f633e\") " Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.425251 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/15cdec0a-5925-4966-a30b-f60c503f633e-ovn-node-metrics-cert\") pod \"15cdec0a-5925-4966-a30b-f60c503f633e\" (UID: \"15cdec0a-5925-4966-a30b-f60c503f633e\") " Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.425282 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f472x\" (UniqueName: \"kubernetes.io/projected/15cdec0a-5925-4966-a30b-f60c503f633e-kube-api-access-f472x\") pod \"15cdec0a-5925-4966-a30b-f60c503f633e\" (UID: \"15cdec0a-5925-4966-a30b-f60c503f633e\") " Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.426615 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/15cdec0a-5925-4966-a30b-f60c503f633e-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "15cdec0a-5925-4966-a30b-f60c503f633e" (UID: "15cdec0a-5925-4966-a30b-f60c503f633e"). InnerVolumeSpecName "systemd-units". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.426670 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/15cdec0a-5925-4966-a30b-f60c503f633e-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "15cdec0a-5925-4966-a30b-f60c503f633e" (UID: "15cdec0a-5925-4966-a30b-f60c503f633e"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.426707 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/15cdec0a-5925-4966-a30b-f60c503f633e-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "15cdec0a-5925-4966-a30b-f60c503f633e" (UID: "15cdec0a-5925-4966-a30b-f60c503f633e"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.427128 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/15cdec0a-5925-4966-a30b-f60c503f633e-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "15cdec0a-5925-4966-a30b-f60c503f633e" (UID: "15cdec0a-5925-4966-a30b-f60c503f633e"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.427170 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/15cdec0a-5925-4966-a30b-f60c503f633e-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "15cdec0a-5925-4966-a30b-f60c503f633e" (UID: "15cdec0a-5925-4966-a30b-f60c503f633e"). InnerVolumeSpecName "var-lib-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.427206 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/15cdec0a-5925-4966-a30b-f60c503f633e-node-log" (OuterVolumeSpecName: "node-log") pod "15cdec0a-5925-4966-a30b-f60c503f633e" (UID: "15cdec0a-5925-4966-a30b-f60c503f633e"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.427235 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/15cdec0a-5925-4966-a30b-f60c503f633e-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "15cdec0a-5925-4966-a30b-f60c503f633e" (UID: "15cdec0a-5925-4966-a30b-f60c503f633e"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.427232 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/15cdec0a-5925-4966-a30b-f60c503f633e-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "15cdec0a-5925-4966-a30b-f60c503f633e" (UID: "15cdec0a-5925-4966-a30b-f60c503f633e"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.427270 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/15cdec0a-5925-4966-a30b-f60c503f633e-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "15cdec0a-5925-4966-a30b-f60c503f633e" (UID: "15cdec0a-5925-4966-a30b-f60c503f633e"). InnerVolumeSpecName "host-run-netns". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.427287 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/15cdec0a-5925-4966-a30b-f60c503f633e-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "15cdec0a-5925-4966-a30b-f60c503f633e" (UID: "15cdec0a-5925-4966-a30b-f60c503f633e"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.427310 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/15cdec0a-5925-4966-a30b-f60c503f633e-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "15cdec0a-5925-4966-a30b-f60c503f633e" (UID: "15cdec0a-5925-4966-a30b-f60c503f633e"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.427328 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/15cdec0a-5925-4966-a30b-f60c503f633e-log-socket" (OuterVolumeSpecName: "log-socket") pod "15cdec0a-5925-4966-a30b-f60c503f633e" (UID: "15cdec0a-5925-4966-a30b-f60c503f633e"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.427611 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/15cdec0a-5925-4966-a30b-f60c503f633e-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "15cdec0a-5925-4966-a30b-f60c503f633e" (UID: "15cdec0a-5925-4966-a30b-f60c503f633e"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.427655 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/15cdec0a-5925-4966-a30b-f60c503f633e-host-slash" (OuterVolumeSpecName: "host-slash") pod "15cdec0a-5925-4966-a30b-f60c503f633e" (UID: "15cdec0a-5925-4966-a30b-f60c503f633e"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.427659 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/15cdec0a-5925-4966-a30b-f60c503f633e-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "15cdec0a-5925-4966-a30b-f60c503f633e" (UID: "15cdec0a-5925-4966-a30b-f60c503f633e"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.427683 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/15cdec0a-5925-4966-a30b-f60c503f633e-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "15cdec0a-5925-4966-a30b-f60c503f633e" (UID: "15cdec0a-5925-4966-a30b-f60c503f633e"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.427710 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/15cdec0a-5925-4966-a30b-f60c503f633e-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "15cdec0a-5925-4966-a30b-f60c503f633e" (UID: "15cdec0a-5925-4966-a30b-f60c503f633e"). InnerVolumeSpecName "host-cni-bin". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.431402 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-9mp7m"] Dec 01 08:28:57 crc kubenswrapper[5004]: E1201 08:28:57.433763 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15cdec0a-5925-4966-a30b-f60c503f633e" containerName="ovn-acl-logging" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.433796 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="15cdec0a-5925-4966-a30b-f60c503f633e" containerName="ovn-acl-logging" Dec 01 08:28:57 crc kubenswrapper[5004]: E1201 08:28:57.433806 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15cdec0a-5925-4966-a30b-f60c503f633e" containerName="nbdb" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.433812 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="15cdec0a-5925-4966-a30b-f60c503f633e" containerName="nbdb" Dec 01 08:28:57 crc kubenswrapper[5004]: E1201 08:28:57.433823 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15cdec0a-5925-4966-a30b-f60c503f633e" containerName="kube-rbac-proxy-node" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.433830 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="15cdec0a-5925-4966-a30b-f60c503f633e" containerName="kube-rbac-proxy-node" Dec 01 08:28:57 crc kubenswrapper[5004]: E1201 08:28:57.433838 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15cdec0a-5925-4966-a30b-f60c503f633e" containerName="ovnkube-controller" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.433844 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="15cdec0a-5925-4966-a30b-f60c503f633e" containerName="ovnkube-controller" Dec 01 08:28:57 crc kubenswrapper[5004]: E1201 08:28:57.433853 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06544def-087a-4ce3-ae2b-af6a06799add" containerName="pull" 
Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.433859 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="06544def-087a-4ce3-ae2b-af6a06799add" containerName="pull" Dec 01 08:28:57 crc kubenswrapper[5004]: E1201 08:28:57.433865 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15cdec0a-5925-4966-a30b-f60c503f633e" containerName="ovnkube-controller" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.433874 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="15cdec0a-5925-4966-a30b-f60c503f633e" containerName="ovnkube-controller" Dec 01 08:28:57 crc kubenswrapper[5004]: E1201 08:28:57.433883 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06544def-087a-4ce3-ae2b-af6a06799add" containerName="extract" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.433888 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="06544def-087a-4ce3-ae2b-af6a06799add" containerName="extract" Dec 01 08:28:57 crc kubenswrapper[5004]: E1201 08:28:57.433897 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15cdec0a-5925-4966-a30b-f60c503f633e" containerName="northd" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.433903 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="15cdec0a-5925-4966-a30b-f60c503f633e" containerName="northd" Dec 01 08:28:57 crc kubenswrapper[5004]: E1201 08:28:57.433910 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15cdec0a-5925-4966-a30b-f60c503f633e" containerName="ovnkube-controller" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.433916 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="15cdec0a-5925-4966-a30b-f60c503f633e" containerName="ovnkube-controller" Dec 01 08:28:57 crc kubenswrapper[5004]: E1201 08:28:57.433926 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15cdec0a-5925-4966-a30b-f60c503f633e" containerName="ovnkube-controller" Dec 01 08:28:57 crc kubenswrapper[5004]: 
I1201 08:28:57.433933 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="15cdec0a-5925-4966-a30b-f60c503f633e" containerName="ovnkube-controller" Dec 01 08:28:57 crc kubenswrapper[5004]: E1201 08:28:57.433940 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15cdec0a-5925-4966-a30b-f60c503f633e" containerName="kube-rbac-proxy-ovn-metrics" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.433946 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="15cdec0a-5925-4966-a30b-f60c503f633e" containerName="kube-rbac-proxy-ovn-metrics" Dec 01 08:28:57 crc kubenswrapper[5004]: E1201 08:28:57.433958 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15cdec0a-5925-4966-a30b-f60c503f633e" containerName="ovn-controller" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.433964 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="15cdec0a-5925-4966-a30b-f60c503f633e" containerName="ovn-controller" Dec 01 08:28:57 crc kubenswrapper[5004]: E1201 08:28:57.433976 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15cdec0a-5925-4966-a30b-f60c503f633e" containerName="kubecfg-setup" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.433981 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="15cdec0a-5925-4966-a30b-f60c503f633e" containerName="kubecfg-setup" Dec 01 08:28:57 crc kubenswrapper[5004]: E1201 08:28:57.433994 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06544def-087a-4ce3-ae2b-af6a06799add" containerName="util" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.434000 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="06544def-087a-4ce3-ae2b-af6a06799add" containerName="util" Dec 01 08:28:57 crc kubenswrapper[5004]: E1201 08:28:57.434010 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15cdec0a-5925-4966-a30b-f60c503f633e" containerName="sbdb" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.434015 5004 
state_mem.go:107] "Deleted CPUSet assignment" podUID="15cdec0a-5925-4966-a30b-f60c503f633e" containerName="sbdb" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.434135 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="15cdec0a-5925-4966-a30b-f60c503f633e" containerName="ovnkube-controller" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.434146 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="06544def-087a-4ce3-ae2b-af6a06799add" containerName="extract" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.434156 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="15cdec0a-5925-4966-a30b-f60c503f633e" containerName="northd" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.434165 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="15cdec0a-5925-4966-a30b-f60c503f633e" containerName="ovn-acl-logging" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.434172 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="15cdec0a-5925-4966-a30b-f60c503f633e" containerName="ovn-controller" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.434179 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="15cdec0a-5925-4966-a30b-f60c503f633e" containerName="ovnkube-controller" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.434187 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="15cdec0a-5925-4966-a30b-f60c503f633e" containerName="ovnkube-controller" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.434195 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="15cdec0a-5925-4966-a30b-f60c503f633e" containerName="nbdb" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.434203 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="15cdec0a-5925-4966-a30b-f60c503f633e" containerName="kube-rbac-proxy-ovn-metrics" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.434215 5004 
memory_manager.go:354] "RemoveStaleState removing state" podUID="15cdec0a-5925-4966-a30b-f60c503f633e" containerName="kube-rbac-proxy-node" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.434221 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="15cdec0a-5925-4966-a30b-f60c503f633e" containerName="sbdb" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.434521 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15cdec0a-5925-4966-a30b-f60c503f633e-kube-api-access-f472x" (OuterVolumeSpecName: "kube-api-access-f472x") pod "15cdec0a-5925-4966-a30b-f60c503f633e" (UID: "15cdec0a-5925-4966-a30b-f60c503f633e"). InnerVolumeSpecName "kube-api-access-f472x". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:28:57 crc kubenswrapper[5004]: E1201 08:28:57.434598 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15cdec0a-5925-4966-a30b-f60c503f633e" containerName="ovnkube-controller" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.434606 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="15cdec0a-5925-4966-a30b-f60c503f633e" containerName="ovnkube-controller" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.434705 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="15cdec0a-5925-4966-a30b-f60c503f633e" containerName="ovnkube-controller" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.434894 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="15cdec0a-5925-4966-a30b-f60c503f633e" containerName="ovnkube-controller" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.436551 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-9mp7m" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.450630 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15cdec0a-5925-4966-a30b-f60c503f633e-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "15cdec0a-5925-4966-a30b-f60c503f633e" (UID: "15cdec0a-5925-4966-a30b-f60c503f633e"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.463960 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/15cdec0a-5925-4966-a30b-f60c503f633e-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "15cdec0a-5925-4966-a30b-f60c503f633e" (UID: "15cdec0a-5925-4966-a30b-f60c503f633e"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.527263 5004 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/15cdec0a-5925-4966-a30b-f60c503f633e-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.527297 5004 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/15cdec0a-5925-4966-a30b-f60c503f633e-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.527305 5004 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/15cdec0a-5925-4966-a30b-f60c503f633e-run-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.527314 5004 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/15cdec0a-5925-4966-a30b-f60c503f633e-host-cni-netd\") on node 
\"crc\" DevicePath \"\"" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.527322 5004 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/15cdec0a-5925-4966-a30b-f60c503f633e-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.527332 5004 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/15cdec0a-5925-4966-a30b-f60c503f633e-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.527341 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f472x\" (UniqueName: \"kubernetes.io/projected/15cdec0a-5925-4966-a30b-f60c503f633e-kube-api-access-f472x\") on node \"crc\" DevicePath \"\"" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.527348 5004 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/15cdec0a-5925-4966-a30b-f60c503f633e-node-log\") on node \"crc\" DevicePath \"\"" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.527356 5004 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/15cdec0a-5925-4966-a30b-f60c503f633e-host-kubelet\") on node \"crc\" DevicePath \"\"" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.527366 5004 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/15cdec0a-5925-4966-a30b-f60c503f633e-log-socket\") on node \"crc\" DevicePath \"\"" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.527374 5004 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/15cdec0a-5925-4966-a30b-f60c503f633e-host-cni-bin\") on node \"crc\" DevicePath \"\"" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.527382 5004 reconciler_common.go:293] "Volume 
detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/15cdec0a-5925-4966-a30b-f60c503f633e-run-systemd\") on node \"crc\" DevicePath \"\"" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.527390 5004 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/15cdec0a-5925-4966-a30b-f60c503f633e-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.527398 5004 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/15cdec0a-5925-4966-a30b-f60c503f633e-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.527406 5004 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/15cdec0a-5925-4966-a30b-f60c503f633e-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.527417 5004 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/15cdec0a-5925-4966-a30b-f60c503f633e-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.527426 5004 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/15cdec0a-5925-4966-a30b-f60c503f633e-systemd-units\") on node \"crc\" DevicePath \"\"" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.527434 5004 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/15cdec0a-5925-4966-a30b-f60c503f633e-host-run-netns\") on node \"crc\" DevicePath \"\"" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.527442 5004 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/15cdec0a-5925-4966-a30b-f60c503f633e-host-slash\") on node \"crc\" DevicePath \"\"" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.527449 5004 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/15cdec0a-5925-4966-a30b-f60c503f633e-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.628107 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b973bd81-e8d9-4d98-83a5-c2fd62edc555-host-run-netns\") pod \"ovnkube-node-9mp7m\" (UID: \"b973bd81-e8d9-4d98-83a5-c2fd62edc555\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mp7m" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.628156 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b973bd81-e8d9-4d98-83a5-c2fd62edc555-systemd-units\") pod \"ovnkube-node-9mp7m\" (UID: \"b973bd81-e8d9-4d98-83a5-c2fd62edc555\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mp7m" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.628175 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b973bd81-e8d9-4d98-83a5-c2fd62edc555-log-socket\") pod \"ovnkube-node-9mp7m\" (UID: \"b973bd81-e8d9-4d98-83a5-c2fd62edc555\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mp7m" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.628187 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b973bd81-e8d9-4d98-83a5-c2fd62edc555-host-cni-bin\") pod \"ovnkube-node-9mp7m\" (UID: \"b973bd81-e8d9-4d98-83a5-c2fd62edc555\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mp7m" Dec 01 08:28:57 
crc kubenswrapper[5004]: I1201 08:28:57.628203 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b973bd81-e8d9-4d98-83a5-c2fd62edc555-ovnkube-config\") pod \"ovnkube-node-9mp7m\" (UID: \"b973bd81-e8d9-4d98-83a5-c2fd62edc555\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mp7m" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.628221 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b973bd81-e8d9-4d98-83a5-c2fd62edc555-run-openvswitch\") pod \"ovnkube-node-9mp7m\" (UID: \"b973bd81-e8d9-4d98-83a5-c2fd62edc555\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mp7m" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.628233 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b973bd81-e8d9-4d98-83a5-c2fd62edc555-node-log\") pod \"ovnkube-node-9mp7m\" (UID: \"b973bd81-e8d9-4d98-83a5-c2fd62edc555\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mp7m" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.628249 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b973bd81-e8d9-4d98-83a5-c2fd62edc555-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9mp7m\" (UID: \"b973bd81-e8d9-4d98-83a5-c2fd62edc555\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mp7m" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.628276 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b973bd81-e8d9-4d98-83a5-c2fd62edc555-host-run-ovn-kubernetes\") pod \"ovnkube-node-9mp7m\" (UID: 
\"b973bd81-e8d9-4d98-83a5-c2fd62edc555\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mp7m" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.628302 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b973bd81-e8d9-4d98-83a5-c2fd62edc555-host-slash\") pod \"ovnkube-node-9mp7m\" (UID: \"b973bd81-e8d9-4d98-83a5-c2fd62edc555\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mp7m" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.628318 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nf8qh\" (UniqueName: \"kubernetes.io/projected/b973bd81-e8d9-4d98-83a5-c2fd62edc555-kube-api-access-nf8qh\") pod \"ovnkube-node-9mp7m\" (UID: \"b973bd81-e8d9-4d98-83a5-c2fd62edc555\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mp7m" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.628335 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b973bd81-e8d9-4d98-83a5-c2fd62edc555-host-cni-netd\") pod \"ovnkube-node-9mp7m\" (UID: \"b973bd81-e8d9-4d98-83a5-c2fd62edc555\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mp7m" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.628356 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b973bd81-e8d9-4d98-83a5-c2fd62edc555-run-systemd\") pod \"ovnkube-node-9mp7m\" (UID: \"b973bd81-e8d9-4d98-83a5-c2fd62edc555\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mp7m" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.628378 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b973bd81-e8d9-4d98-83a5-c2fd62edc555-ovnkube-script-lib\") pod 
\"ovnkube-node-9mp7m\" (UID: \"b973bd81-e8d9-4d98-83a5-c2fd62edc555\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mp7m" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.628409 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b973bd81-e8d9-4d98-83a5-c2fd62edc555-var-lib-openvswitch\") pod \"ovnkube-node-9mp7m\" (UID: \"b973bd81-e8d9-4d98-83a5-c2fd62edc555\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mp7m" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.628425 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b973bd81-e8d9-4d98-83a5-c2fd62edc555-env-overrides\") pod \"ovnkube-node-9mp7m\" (UID: \"b973bd81-e8d9-4d98-83a5-c2fd62edc555\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mp7m" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.628438 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b973bd81-e8d9-4d98-83a5-c2fd62edc555-etc-openvswitch\") pod \"ovnkube-node-9mp7m\" (UID: \"b973bd81-e8d9-4d98-83a5-c2fd62edc555\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mp7m" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.628465 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b973bd81-e8d9-4d98-83a5-c2fd62edc555-run-ovn\") pod \"ovnkube-node-9mp7m\" (UID: \"b973bd81-e8d9-4d98-83a5-c2fd62edc555\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mp7m" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.628480 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/b973bd81-e8d9-4d98-83a5-c2fd62edc555-ovn-node-metrics-cert\") pod \"ovnkube-node-9mp7m\" (UID: \"b973bd81-e8d9-4d98-83a5-c2fd62edc555\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mp7m" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.628494 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b973bd81-e8d9-4d98-83a5-c2fd62edc555-host-kubelet\") pod \"ovnkube-node-9mp7m\" (UID: \"b973bd81-e8d9-4d98-83a5-c2fd62edc555\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mp7m" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.729695 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b973bd81-e8d9-4d98-83a5-c2fd62edc555-run-ovn\") pod \"ovnkube-node-9mp7m\" (UID: \"b973bd81-e8d9-4d98-83a5-c2fd62edc555\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mp7m" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.729744 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b973bd81-e8d9-4d98-83a5-c2fd62edc555-ovn-node-metrics-cert\") pod \"ovnkube-node-9mp7m\" (UID: \"b973bd81-e8d9-4d98-83a5-c2fd62edc555\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mp7m" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.729762 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b973bd81-e8d9-4d98-83a5-c2fd62edc555-host-kubelet\") pod \"ovnkube-node-9mp7m\" (UID: \"b973bd81-e8d9-4d98-83a5-c2fd62edc555\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mp7m" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.729778 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/b973bd81-e8d9-4d98-83a5-c2fd62edc555-host-run-netns\") pod \"ovnkube-node-9mp7m\" (UID: \"b973bd81-e8d9-4d98-83a5-c2fd62edc555\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mp7m" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.729798 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b973bd81-e8d9-4d98-83a5-c2fd62edc555-systemd-units\") pod \"ovnkube-node-9mp7m\" (UID: \"b973bd81-e8d9-4d98-83a5-c2fd62edc555\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mp7m" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.729813 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b973bd81-e8d9-4d98-83a5-c2fd62edc555-log-socket\") pod \"ovnkube-node-9mp7m\" (UID: \"b973bd81-e8d9-4d98-83a5-c2fd62edc555\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mp7m" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.729828 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b973bd81-e8d9-4d98-83a5-c2fd62edc555-host-cni-bin\") pod \"ovnkube-node-9mp7m\" (UID: \"b973bd81-e8d9-4d98-83a5-c2fd62edc555\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mp7m" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.729846 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b973bd81-e8d9-4d98-83a5-c2fd62edc555-ovnkube-config\") pod \"ovnkube-node-9mp7m\" (UID: \"b973bd81-e8d9-4d98-83a5-c2fd62edc555\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mp7m" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.729863 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b973bd81-e8d9-4d98-83a5-c2fd62edc555-run-openvswitch\") pod 
\"ovnkube-node-9mp7m\" (UID: \"b973bd81-e8d9-4d98-83a5-c2fd62edc555\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mp7m" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.729876 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b973bd81-e8d9-4d98-83a5-c2fd62edc555-node-log\") pod \"ovnkube-node-9mp7m\" (UID: \"b973bd81-e8d9-4d98-83a5-c2fd62edc555\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mp7m" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.729893 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b973bd81-e8d9-4d98-83a5-c2fd62edc555-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9mp7m\" (UID: \"b973bd81-e8d9-4d98-83a5-c2fd62edc555\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mp7m" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.729914 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b973bd81-e8d9-4d98-83a5-c2fd62edc555-host-run-ovn-kubernetes\") pod \"ovnkube-node-9mp7m\" (UID: \"b973bd81-e8d9-4d98-83a5-c2fd62edc555\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mp7m" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.729936 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b973bd81-e8d9-4d98-83a5-c2fd62edc555-host-slash\") pod \"ovnkube-node-9mp7m\" (UID: \"b973bd81-e8d9-4d98-83a5-c2fd62edc555\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mp7m" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.729951 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nf8qh\" (UniqueName: \"kubernetes.io/projected/b973bd81-e8d9-4d98-83a5-c2fd62edc555-kube-api-access-nf8qh\") pod 
\"ovnkube-node-9mp7m\" (UID: \"b973bd81-e8d9-4d98-83a5-c2fd62edc555\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mp7m" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.729972 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b973bd81-e8d9-4d98-83a5-c2fd62edc555-host-cni-netd\") pod \"ovnkube-node-9mp7m\" (UID: \"b973bd81-e8d9-4d98-83a5-c2fd62edc555\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mp7m" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.729995 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b973bd81-e8d9-4d98-83a5-c2fd62edc555-run-systemd\") pod \"ovnkube-node-9mp7m\" (UID: \"b973bd81-e8d9-4d98-83a5-c2fd62edc555\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mp7m" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.730014 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b973bd81-e8d9-4d98-83a5-c2fd62edc555-ovnkube-script-lib\") pod \"ovnkube-node-9mp7m\" (UID: \"b973bd81-e8d9-4d98-83a5-c2fd62edc555\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mp7m" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.730030 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b973bd81-e8d9-4d98-83a5-c2fd62edc555-var-lib-openvswitch\") pod \"ovnkube-node-9mp7m\" (UID: \"b973bd81-e8d9-4d98-83a5-c2fd62edc555\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mp7m" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.730046 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b973bd81-e8d9-4d98-83a5-c2fd62edc555-etc-openvswitch\") pod \"ovnkube-node-9mp7m\" (UID: 
\"b973bd81-e8d9-4d98-83a5-c2fd62edc555\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mp7m" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.730059 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b973bd81-e8d9-4d98-83a5-c2fd62edc555-env-overrides\") pod \"ovnkube-node-9mp7m\" (UID: \"b973bd81-e8d9-4d98-83a5-c2fd62edc555\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mp7m" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.730611 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b973bd81-e8d9-4d98-83a5-c2fd62edc555-env-overrides\") pod \"ovnkube-node-9mp7m\" (UID: \"b973bd81-e8d9-4d98-83a5-c2fd62edc555\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mp7m" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.730657 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b973bd81-e8d9-4d98-83a5-c2fd62edc555-run-ovn\") pod \"ovnkube-node-9mp7m\" (UID: \"b973bd81-e8d9-4d98-83a5-c2fd62edc555\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mp7m" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.730998 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b973bd81-e8d9-4d98-83a5-c2fd62edc555-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9mp7m\" (UID: \"b973bd81-e8d9-4d98-83a5-c2fd62edc555\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mp7m" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.731031 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b973bd81-e8d9-4d98-83a5-c2fd62edc555-host-kubelet\") pod \"ovnkube-node-9mp7m\" (UID: \"b973bd81-e8d9-4d98-83a5-c2fd62edc555\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-9mp7m" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.731036 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b973bd81-e8d9-4d98-83a5-c2fd62edc555-host-cni-netd\") pod \"ovnkube-node-9mp7m\" (UID: \"b973bd81-e8d9-4d98-83a5-c2fd62edc555\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mp7m" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.731115 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b973bd81-e8d9-4d98-83a5-c2fd62edc555-host-cni-bin\") pod \"ovnkube-node-9mp7m\" (UID: \"b973bd81-e8d9-4d98-83a5-c2fd62edc555\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mp7m" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.731166 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b973bd81-e8d9-4d98-83a5-c2fd62edc555-host-run-ovn-kubernetes\") pod \"ovnkube-node-9mp7m\" (UID: \"b973bd81-e8d9-4d98-83a5-c2fd62edc555\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mp7m" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.731196 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b973bd81-e8d9-4d98-83a5-c2fd62edc555-host-run-netns\") pod \"ovnkube-node-9mp7m\" (UID: \"b973bd81-e8d9-4d98-83a5-c2fd62edc555\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mp7m" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.731205 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b973bd81-e8d9-4d98-83a5-c2fd62edc555-run-openvswitch\") pod \"ovnkube-node-9mp7m\" (UID: \"b973bd81-e8d9-4d98-83a5-c2fd62edc555\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mp7m" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 
08:28:57.731241 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b973bd81-e8d9-4d98-83a5-c2fd62edc555-log-socket\") pod \"ovnkube-node-9mp7m\" (UID: \"b973bd81-e8d9-4d98-83a5-c2fd62edc555\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mp7m" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.731278 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b973bd81-e8d9-4d98-83a5-c2fd62edc555-run-systemd\") pod \"ovnkube-node-9mp7m\" (UID: \"b973bd81-e8d9-4d98-83a5-c2fd62edc555\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mp7m" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.731290 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b973bd81-e8d9-4d98-83a5-c2fd62edc555-node-log\") pod \"ovnkube-node-9mp7m\" (UID: \"b973bd81-e8d9-4d98-83a5-c2fd62edc555\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mp7m" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.731303 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b973bd81-e8d9-4d98-83a5-c2fd62edc555-var-lib-openvswitch\") pod \"ovnkube-node-9mp7m\" (UID: \"b973bd81-e8d9-4d98-83a5-c2fd62edc555\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mp7m" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.731223 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b973bd81-e8d9-4d98-83a5-c2fd62edc555-systemd-units\") pod \"ovnkube-node-9mp7m\" (UID: \"b973bd81-e8d9-4d98-83a5-c2fd62edc555\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mp7m" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.731329 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/b973bd81-e8d9-4d98-83a5-c2fd62edc555-etc-openvswitch\") pod \"ovnkube-node-9mp7m\" (UID: \"b973bd81-e8d9-4d98-83a5-c2fd62edc555\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mp7m" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.731461 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b973bd81-e8d9-4d98-83a5-c2fd62edc555-ovnkube-config\") pod \"ovnkube-node-9mp7m\" (UID: \"b973bd81-e8d9-4d98-83a5-c2fd62edc555\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mp7m" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.731872 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b973bd81-e8d9-4d98-83a5-c2fd62edc555-ovnkube-script-lib\") pod \"ovnkube-node-9mp7m\" (UID: \"b973bd81-e8d9-4d98-83a5-c2fd62edc555\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mp7m" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.731951 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b973bd81-e8d9-4d98-83a5-c2fd62edc555-host-slash\") pod \"ovnkube-node-9mp7m\" (UID: \"b973bd81-e8d9-4d98-83a5-c2fd62edc555\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mp7m" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.735182 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b973bd81-e8d9-4d98-83a5-c2fd62edc555-ovn-node-metrics-cert\") pod \"ovnkube-node-9mp7m\" (UID: \"b973bd81-e8d9-4d98-83a5-c2fd62edc555\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mp7m" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.782123 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nf8qh\" (UniqueName: \"kubernetes.io/projected/b973bd81-e8d9-4d98-83a5-c2fd62edc555-kube-api-access-nf8qh\") pod 
\"ovnkube-node-9mp7m\" (UID: \"b973bd81-e8d9-4d98-83a5-c2fd62edc555\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mp7m" Dec 01 08:28:57 crc kubenswrapper[5004]: I1201 08:28:57.783845 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-9mp7m" Dec 01 08:28:58 crc kubenswrapper[5004]: I1201 08:28:58.079881 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-knmdv_15cdec0a-5925-4966-a30b-f60c503f633e/ovn-acl-logging/0.log" Dec 01 08:28:58 crc kubenswrapper[5004]: I1201 08:28:58.080710 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-knmdv_15cdec0a-5925-4966-a30b-f60c503f633e/ovn-controller/0.log" Dec 01 08:28:58 crc kubenswrapper[5004]: I1201 08:28:58.081021 5004 generic.go:334] "Generic (PLEG): container finished" podID="15cdec0a-5925-4966-a30b-f60c503f633e" containerID="f001b70a83da28b1ea78a94f1cd70dfc63b3de5a21b041932d33c2ac970c986b" exitCode=0 Dec 01 08:28:58 crc kubenswrapper[5004]: I1201 08:28:58.081048 5004 generic.go:334] "Generic (PLEG): container finished" podID="15cdec0a-5925-4966-a30b-f60c503f633e" containerID="f528b94ff12c7a804930dca4b93c23334a9ae3a53317750cd34b5695f08b4582" exitCode=0 Dec 01 08:28:58 crc kubenswrapper[5004]: I1201 08:28:58.081063 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" event={"ID":"15cdec0a-5925-4966-a30b-f60c503f633e","Type":"ContainerDied","Data":"f001b70a83da28b1ea78a94f1cd70dfc63b3de5a21b041932d33c2ac970c986b"} Dec 01 08:28:58 crc kubenswrapper[5004]: I1201 08:28:58.081108 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" event={"ID":"15cdec0a-5925-4966-a30b-f60c503f633e","Type":"ContainerDied","Data":"f528b94ff12c7a804930dca4b93c23334a9ae3a53317750cd34b5695f08b4582"} Dec 01 08:28:58 crc kubenswrapper[5004]: I1201 08:28:58.081110 5004 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" Dec 01 08:28:58 crc kubenswrapper[5004]: I1201 08:28:58.081130 5004 scope.go:117] "RemoveContainer" containerID="57bc8d8aca54ca6d854bd5065cc8d1346968b6531f884d87dc38f291e505419a" Dec 01 08:28:58 crc kubenswrapper[5004]: I1201 08:28:58.081119 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-knmdv" event={"ID":"15cdec0a-5925-4966-a30b-f60c503f633e","Type":"ContainerDied","Data":"d5212583abfb5ca3c1010406306c1284a177ea12c285e6a737cc7318e7f3ffb8"} Dec 01 08:28:58 crc kubenswrapper[5004]: I1201 08:28:58.082759 5004 generic.go:334] "Generic (PLEG): container finished" podID="b973bd81-e8d9-4d98-83a5-c2fd62edc555" containerID="53b71595e2e76eb775fcd2c10d6559e5c77343f07b38f0177b7034faa9c1f46f" exitCode=0 Dec 01 08:28:58 crc kubenswrapper[5004]: I1201 08:28:58.082803 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9mp7m" event={"ID":"b973bd81-e8d9-4d98-83a5-c2fd62edc555","Type":"ContainerDied","Data":"53b71595e2e76eb775fcd2c10d6559e5c77343f07b38f0177b7034faa9c1f46f"} Dec 01 08:28:58 crc kubenswrapper[5004]: I1201 08:28:58.082817 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9mp7m" event={"ID":"b973bd81-e8d9-4d98-83a5-c2fd62edc555","Type":"ContainerStarted","Data":"cc43d144d28e8966e782051d9120c0da6ebd639228285de0accdb93f3e2f82c1"} Dec 01 08:28:58 crc kubenswrapper[5004]: I1201 08:28:58.097773 5004 scope.go:117] "RemoveContainer" containerID="1f231e0092e1a85cc5d6e00fb8d3bd982223b670191ac3dc21d81ff1fab462a0" Dec 01 08:28:58 crc kubenswrapper[5004]: I1201 08:28:58.098201 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zjksw_70e79009-93be-49c4-a6b3-e8a06bcea7f4/kube-multus/2.log" Dec 01 08:28:58 crc kubenswrapper[5004]: I1201 08:28:58.125222 5004 scope.go:117] "RemoveContainer" 
containerID="e7c5c043438b515e37a6d6d5c47b92479bbb7322fe35c3a1bc57ee7b3a746544" Dec 01 08:28:58 crc kubenswrapper[5004]: I1201 08:28:58.140197 5004 scope.go:117] "RemoveContainer" containerID="23df33e45cc27e4d343ae6ec3fb88f2dfc125ba7da424515d4bee5ec19281832" Dec 01 08:28:58 crc kubenswrapper[5004]: I1201 08:28:58.159491 5004 scope.go:117] "RemoveContainer" containerID="f001b70a83da28b1ea78a94f1cd70dfc63b3de5a21b041932d33c2ac970c986b" Dec 01 08:28:58 crc kubenswrapper[5004]: I1201 08:28:58.178498 5004 scope.go:117] "RemoveContainer" containerID="f528b94ff12c7a804930dca4b93c23334a9ae3a53317750cd34b5695f08b4582" Dec 01 08:28:58 crc kubenswrapper[5004]: I1201 08:28:58.179211 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-knmdv"] Dec 01 08:28:58 crc kubenswrapper[5004]: I1201 08:28:58.192481 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-knmdv"] Dec 01 08:28:58 crc kubenswrapper[5004]: I1201 08:28:58.202154 5004 scope.go:117] "RemoveContainer" containerID="d4a7cab8e1594814a687b1433a92642a19eff7c2eb12f96fa06f08a2e270733d" Dec 01 08:28:58 crc kubenswrapper[5004]: I1201 08:28:58.259916 5004 scope.go:117] "RemoveContainer" containerID="1844a18a6109c25a25b9c8cb3ff65e69cc657f66be03dd97e883d24a774b7638" Dec 01 08:28:58 crc kubenswrapper[5004]: I1201 08:28:58.290003 5004 scope.go:117] "RemoveContainer" containerID="97c9fab31c2655b5d579a2ed9b837229ee678973643c6a526c1241e3c84e315c" Dec 01 08:28:58 crc kubenswrapper[5004]: I1201 08:28:58.328764 5004 scope.go:117] "RemoveContainer" containerID="57bc8d8aca54ca6d854bd5065cc8d1346968b6531f884d87dc38f291e505419a" Dec 01 08:28:58 crc kubenswrapper[5004]: E1201 08:28:58.331889 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57bc8d8aca54ca6d854bd5065cc8d1346968b6531f884d87dc38f291e505419a\": container with ID starting with 
57bc8d8aca54ca6d854bd5065cc8d1346968b6531f884d87dc38f291e505419a not found: ID does not exist" containerID="57bc8d8aca54ca6d854bd5065cc8d1346968b6531f884d87dc38f291e505419a" Dec 01 08:28:58 crc kubenswrapper[5004]: I1201 08:28:58.331925 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57bc8d8aca54ca6d854bd5065cc8d1346968b6531f884d87dc38f291e505419a"} err="failed to get container status \"57bc8d8aca54ca6d854bd5065cc8d1346968b6531f884d87dc38f291e505419a\": rpc error: code = NotFound desc = could not find container \"57bc8d8aca54ca6d854bd5065cc8d1346968b6531f884d87dc38f291e505419a\": container with ID starting with 57bc8d8aca54ca6d854bd5065cc8d1346968b6531f884d87dc38f291e505419a not found: ID does not exist" Dec 01 08:28:58 crc kubenswrapper[5004]: I1201 08:28:58.331950 5004 scope.go:117] "RemoveContainer" containerID="1f231e0092e1a85cc5d6e00fb8d3bd982223b670191ac3dc21d81ff1fab462a0" Dec 01 08:28:58 crc kubenswrapper[5004]: E1201 08:28:58.337481 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f231e0092e1a85cc5d6e00fb8d3bd982223b670191ac3dc21d81ff1fab462a0\": container with ID starting with 1f231e0092e1a85cc5d6e00fb8d3bd982223b670191ac3dc21d81ff1fab462a0 not found: ID does not exist" containerID="1f231e0092e1a85cc5d6e00fb8d3bd982223b670191ac3dc21d81ff1fab462a0" Dec 01 08:28:58 crc kubenswrapper[5004]: I1201 08:28:58.337516 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f231e0092e1a85cc5d6e00fb8d3bd982223b670191ac3dc21d81ff1fab462a0"} err="failed to get container status \"1f231e0092e1a85cc5d6e00fb8d3bd982223b670191ac3dc21d81ff1fab462a0\": rpc error: code = NotFound desc = could not find container \"1f231e0092e1a85cc5d6e00fb8d3bd982223b670191ac3dc21d81ff1fab462a0\": container with ID starting with 1f231e0092e1a85cc5d6e00fb8d3bd982223b670191ac3dc21d81ff1fab462a0 not found: ID does not 
exist" Dec 01 08:28:58 crc kubenswrapper[5004]: I1201 08:28:58.337541 5004 scope.go:117] "RemoveContainer" containerID="e7c5c043438b515e37a6d6d5c47b92479bbb7322fe35c3a1bc57ee7b3a746544" Dec 01 08:28:58 crc kubenswrapper[5004]: E1201 08:28:58.340852 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7c5c043438b515e37a6d6d5c47b92479bbb7322fe35c3a1bc57ee7b3a746544\": container with ID starting with e7c5c043438b515e37a6d6d5c47b92479bbb7322fe35c3a1bc57ee7b3a746544 not found: ID does not exist" containerID="e7c5c043438b515e37a6d6d5c47b92479bbb7322fe35c3a1bc57ee7b3a746544" Dec 01 08:28:58 crc kubenswrapper[5004]: I1201 08:28:58.340884 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7c5c043438b515e37a6d6d5c47b92479bbb7322fe35c3a1bc57ee7b3a746544"} err="failed to get container status \"e7c5c043438b515e37a6d6d5c47b92479bbb7322fe35c3a1bc57ee7b3a746544\": rpc error: code = NotFound desc = could not find container \"e7c5c043438b515e37a6d6d5c47b92479bbb7322fe35c3a1bc57ee7b3a746544\": container with ID starting with e7c5c043438b515e37a6d6d5c47b92479bbb7322fe35c3a1bc57ee7b3a746544 not found: ID does not exist" Dec 01 08:28:58 crc kubenswrapper[5004]: I1201 08:28:58.340904 5004 scope.go:117] "RemoveContainer" containerID="23df33e45cc27e4d343ae6ec3fb88f2dfc125ba7da424515d4bee5ec19281832" Dec 01 08:28:58 crc kubenswrapper[5004]: E1201 08:28:58.344785 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23df33e45cc27e4d343ae6ec3fb88f2dfc125ba7da424515d4bee5ec19281832\": container with ID starting with 23df33e45cc27e4d343ae6ec3fb88f2dfc125ba7da424515d4bee5ec19281832 not found: ID does not exist" containerID="23df33e45cc27e4d343ae6ec3fb88f2dfc125ba7da424515d4bee5ec19281832" Dec 01 08:28:58 crc kubenswrapper[5004]: I1201 08:28:58.344812 5004 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23df33e45cc27e4d343ae6ec3fb88f2dfc125ba7da424515d4bee5ec19281832"} err="failed to get container status \"23df33e45cc27e4d343ae6ec3fb88f2dfc125ba7da424515d4bee5ec19281832\": rpc error: code = NotFound desc = could not find container \"23df33e45cc27e4d343ae6ec3fb88f2dfc125ba7da424515d4bee5ec19281832\": container with ID starting with 23df33e45cc27e4d343ae6ec3fb88f2dfc125ba7da424515d4bee5ec19281832 not found: ID does not exist" Dec 01 08:28:58 crc kubenswrapper[5004]: I1201 08:28:58.344830 5004 scope.go:117] "RemoveContainer" containerID="f001b70a83da28b1ea78a94f1cd70dfc63b3de5a21b041932d33c2ac970c986b" Dec 01 08:28:58 crc kubenswrapper[5004]: E1201 08:28:58.348965 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f001b70a83da28b1ea78a94f1cd70dfc63b3de5a21b041932d33c2ac970c986b\": container with ID starting with f001b70a83da28b1ea78a94f1cd70dfc63b3de5a21b041932d33c2ac970c986b not found: ID does not exist" containerID="f001b70a83da28b1ea78a94f1cd70dfc63b3de5a21b041932d33c2ac970c986b" Dec 01 08:28:58 crc kubenswrapper[5004]: I1201 08:28:58.349001 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f001b70a83da28b1ea78a94f1cd70dfc63b3de5a21b041932d33c2ac970c986b"} err="failed to get container status \"f001b70a83da28b1ea78a94f1cd70dfc63b3de5a21b041932d33c2ac970c986b\": rpc error: code = NotFound desc = could not find container \"f001b70a83da28b1ea78a94f1cd70dfc63b3de5a21b041932d33c2ac970c986b\": container with ID starting with f001b70a83da28b1ea78a94f1cd70dfc63b3de5a21b041932d33c2ac970c986b not found: ID does not exist" Dec 01 08:28:58 crc kubenswrapper[5004]: I1201 08:28:58.349026 5004 scope.go:117] "RemoveContainer" containerID="f528b94ff12c7a804930dca4b93c23334a9ae3a53317750cd34b5695f08b4582" Dec 01 08:28:58 crc kubenswrapper[5004]: E1201 08:28:58.352922 5004 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f528b94ff12c7a804930dca4b93c23334a9ae3a53317750cd34b5695f08b4582\": container with ID starting with f528b94ff12c7a804930dca4b93c23334a9ae3a53317750cd34b5695f08b4582 not found: ID does not exist" containerID="f528b94ff12c7a804930dca4b93c23334a9ae3a53317750cd34b5695f08b4582" Dec 01 08:28:58 crc kubenswrapper[5004]: I1201 08:28:58.352958 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f528b94ff12c7a804930dca4b93c23334a9ae3a53317750cd34b5695f08b4582"} err="failed to get container status \"f528b94ff12c7a804930dca4b93c23334a9ae3a53317750cd34b5695f08b4582\": rpc error: code = NotFound desc = could not find container \"f528b94ff12c7a804930dca4b93c23334a9ae3a53317750cd34b5695f08b4582\": container with ID starting with f528b94ff12c7a804930dca4b93c23334a9ae3a53317750cd34b5695f08b4582 not found: ID does not exist" Dec 01 08:28:58 crc kubenswrapper[5004]: I1201 08:28:58.352983 5004 scope.go:117] "RemoveContainer" containerID="d4a7cab8e1594814a687b1433a92642a19eff7c2eb12f96fa06f08a2e270733d" Dec 01 08:28:58 crc kubenswrapper[5004]: E1201 08:28:58.377098 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4a7cab8e1594814a687b1433a92642a19eff7c2eb12f96fa06f08a2e270733d\": container with ID starting with d4a7cab8e1594814a687b1433a92642a19eff7c2eb12f96fa06f08a2e270733d not found: ID does not exist" containerID="d4a7cab8e1594814a687b1433a92642a19eff7c2eb12f96fa06f08a2e270733d" Dec 01 08:28:58 crc kubenswrapper[5004]: I1201 08:28:58.377138 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4a7cab8e1594814a687b1433a92642a19eff7c2eb12f96fa06f08a2e270733d"} err="failed to get container status \"d4a7cab8e1594814a687b1433a92642a19eff7c2eb12f96fa06f08a2e270733d\": rpc error: code = NotFound desc = could 
not find container \"d4a7cab8e1594814a687b1433a92642a19eff7c2eb12f96fa06f08a2e270733d\": container with ID starting with d4a7cab8e1594814a687b1433a92642a19eff7c2eb12f96fa06f08a2e270733d not found: ID does not exist" Dec 01 08:28:58 crc kubenswrapper[5004]: I1201 08:28:58.377166 5004 scope.go:117] "RemoveContainer" containerID="1844a18a6109c25a25b9c8cb3ff65e69cc657f66be03dd97e883d24a774b7638" Dec 01 08:28:58 crc kubenswrapper[5004]: E1201 08:28:58.377418 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1844a18a6109c25a25b9c8cb3ff65e69cc657f66be03dd97e883d24a774b7638\": container with ID starting with 1844a18a6109c25a25b9c8cb3ff65e69cc657f66be03dd97e883d24a774b7638 not found: ID does not exist" containerID="1844a18a6109c25a25b9c8cb3ff65e69cc657f66be03dd97e883d24a774b7638" Dec 01 08:28:58 crc kubenswrapper[5004]: I1201 08:28:58.377439 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1844a18a6109c25a25b9c8cb3ff65e69cc657f66be03dd97e883d24a774b7638"} err="failed to get container status \"1844a18a6109c25a25b9c8cb3ff65e69cc657f66be03dd97e883d24a774b7638\": rpc error: code = NotFound desc = could not find container \"1844a18a6109c25a25b9c8cb3ff65e69cc657f66be03dd97e883d24a774b7638\": container with ID starting with 1844a18a6109c25a25b9c8cb3ff65e69cc657f66be03dd97e883d24a774b7638 not found: ID does not exist" Dec 01 08:28:58 crc kubenswrapper[5004]: I1201 08:28:58.377451 5004 scope.go:117] "RemoveContainer" containerID="97c9fab31c2655b5d579a2ed9b837229ee678973643c6a526c1241e3c84e315c" Dec 01 08:28:58 crc kubenswrapper[5004]: E1201 08:28:58.377896 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97c9fab31c2655b5d579a2ed9b837229ee678973643c6a526c1241e3c84e315c\": container with ID starting with 97c9fab31c2655b5d579a2ed9b837229ee678973643c6a526c1241e3c84e315c not found: 
ID does not exist" containerID="97c9fab31c2655b5d579a2ed9b837229ee678973643c6a526c1241e3c84e315c" Dec 01 08:28:58 crc kubenswrapper[5004]: I1201 08:28:58.377921 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97c9fab31c2655b5d579a2ed9b837229ee678973643c6a526c1241e3c84e315c"} err="failed to get container status \"97c9fab31c2655b5d579a2ed9b837229ee678973643c6a526c1241e3c84e315c\": rpc error: code = NotFound desc = could not find container \"97c9fab31c2655b5d579a2ed9b837229ee678973643c6a526c1241e3c84e315c\": container with ID starting with 97c9fab31c2655b5d579a2ed9b837229ee678973643c6a526c1241e3c84e315c not found: ID does not exist" Dec 01 08:28:58 crc kubenswrapper[5004]: I1201 08:28:58.377935 5004 scope.go:117] "RemoveContainer" containerID="57bc8d8aca54ca6d854bd5065cc8d1346968b6531f884d87dc38f291e505419a" Dec 01 08:28:58 crc kubenswrapper[5004]: I1201 08:28:58.378307 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57bc8d8aca54ca6d854bd5065cc8d1346968b6531f884d87dc38f291e505419a"} err="failed to get container status \"57bc8d8aca54ca6d854bd5065cc8d1346968b6531f884d87dc38f291e505419a\": rpc error: code = NotFound desc = could not find container \"57bc8d8aca54ca6d854bd5065cc8d1346968b6531f884d87dc38f291e505419a\": container with ID starting with 57bc8d8aca54ca6d854bd5065cc8d1346968b6531f884d87dc38f291e505419a not found: ID does not exist" Dec 01 08:28:58 crc kubenswrapper[5004]: I1201 08:28:58.378325 5004 scope.go:117] "RemoveContainer" containerID="1f231e0092e1a85cc5d6e00fb8d3bd982223b670191ac3dc21d81ff1fab462a0" Dec 01 08:28:58 crc kubenswrapper[5004]: I1201 08:28:58.378578 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f231e0092e1a85cc5d6e00fb8d3bd982223b670191ac3dc21d81ff1fab462a0"} err="failed to get container status \"1f231e0092e1a85cc5d6e00fb8d3bd982223b670191ac3dc21d81ff1fab462a0\": rpc error: code = 
NotFound desc = could not find container \"1f231e0092e1a85cc5d6e00fb8d3bd982223b670191ac3dc21d81ff1fab462a0\": container with ID starting with 1f231e0092e1a85cc5d6e00fb8d3bd982223b670191ac3dc21d81ff1fab462a0 not found: ID does not exist" Dec 01 08:28:58 crc kubenswrapper[5004]: I1201 08:28:58.378596 5004 scope.go:117] "RemoveContainer" containerID="e7c5c043438b515e37a6d6d5c47b92479bbb7322fe35c3a1bc57ee7b3a746544" Dec 01 08:28:58 crc kubenswrapper[5004]: I1201 08:28:58.386966 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7c5c043438b515e37a6d6d5c47b92479bbb7322fe35c3a1bc57ee7b3a746544"} err="failed to get container status \"e7c5c043438b515e37a6d6d5c47b92479bbb7322fe35c3a1bc57ee7b3a746544\": rpc error: code = NotFound desc = could not find container \"e7c5c043438b515e37a6d6d5c47b92479bbb7322fe35c3a1bc57ee7b3a746544\": container with ID starting with e7c5c043438b515e37a6d6d5c47b92479bbb7322fe35c3a1bc57ee7b3a746544 not found: ID does not exist" Dec 01 08:28:58 crc kubenswrapper[5004]: I1201 08:28:58.387000 5004 scope.go:117] "RemoveContainer" containerID="23df33e45cc27e4d343ae6ec3fb88f2dfc125ba7da424515d4bee5ec19281832" Dec 01 08:28:58 crc kubenswrapper[5004]: I1201 08:28:58.387300 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23df33e45cc27e4d343ae6ec3fb88f2dfc125ba7da424515d4bee5ec19281832"} err="failed to get container status \"23df33e45cc27e4d343ae6ec3fb88f2dfc125ba7da424515d4bee5ec19281832\": rpc error: code = NotFound desc = could not find container \"23df33e45cc27e4d343ae6ec3fb88f2dfc125ba7da424515d4bee5ec19281832\": container with ID starting with 23df33e45cc27e4d343ae6ec3fb88f2dfc125ba7da424515d4bee5ec19281832 not found: ID does not exist" Dec 01 08:28:58 crc kubenswrapper[5004]: I1201 08:28:58.387319 5004 scope.go:117] "RemoveContainer" containerID="f001b70a83da28b1ea78a94f1cd70dfc63b3de5a21b041932d33c2ac970c986b" Dec 01 08:28:58 crc 
kubenswrapper[5004]: I1201 08:28:58.387589 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f001b70a83da28b1ea78a94f1cd70dfc63b3de5a21b041932d33c2ac970c986b"} err="failed to get container status \"f001b70a83da28b1ea78a94f1cd70dfc63b3de5a21b041932d33c2ac970c986b\": rpc error: code = NotFound desc = could not find container \"f001b70a83da28b1ea78a94f1cd70dfc63b3de5a21b041932d33c2ac970c986b\": container with ID starting with f001b70a83da28b1ea78a94f1cd70dfc63b3de5a21b041932d33c2ac970c986b not found: ID does not exist" Dec 01 08:28:58 crc kubenswrapper[5004]: I1201 08:28:58.387606 5004 scope.go:117] "RemoveContainer" containerID="f528b94ff12c7a804930dca4b93c23334a9ae3a53317750cd34b5695f08b4582" Dec 01 08:28:58 crc kubenswrapper[5004]: I1201 08:28:58.389120 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f528b94ff12c7a804930dca4b93c23334a9ae3a53317750cd34b5695f08b4582"} err="failed to get container status \"f528b94ff12c7a804930dca4b93c23334a9ae3a53317750cd34b5695f08b4582\": rpc error: code = NotFound desc = could not find container \"f528b94ff12c7a804930dca4b93c23334a9ae3a53317750cd34b5695f08b4582\": container with ID starting with f528b94ff12c7a804930dca4b93c23334a9ae3a53317750cd34b5695f08b4582 not found: ID does not exist" Dec 01 08:28:58 crc kubenswrapper[5004]: I1201 08:28:58.389139 5004 scope.go:117] "RemoveContainer" containerID="d4a7cab8e1594814a687b1433a92642a19eff7c2eb12f96fa06f08a2e270733d" Dec 01 08:28:58 crc kubenswrapper[5004]: I1201 08:28:58.389861 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4a7cab8e1594814a687b1433a92642a19eff7c2eb12f96fa06f08a2e270733d"} err="failed to get container status \"d4a7cab8e1594814a687b1433a92642a19eff7c2eb12f96fa06f08a2e270733d\": rpc error: code = NotFound desc = could not find container \"d4a7cab8e1594814a687b1433a92642a19eff7c2eb12f96fa06f08a2e270733d\": container 
with ID starting with d4a7cab8e1594814a687b1433a92642a19eff7c2eb12f96fa06f08a2e270733d not found: ID does not exist" Dec 01 08:28:58 crc kubenswrapper[5004]: I1201 08:28:58.389882 5004 scope.go:117] "RemoveContainer" containerID="1844a18a6109c25a25b9c8cb3ff65e69cc657f66be03dd97e883d24a774b7638" Dec 01 08:28:58 crc kubenswrapper[5004]: I1201 08:28:58.393645 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1844a18a6109c25a25b9c8cb3ff65e69cc657f66be03dd97e883d24a774b7638"} err="failed to get container status \"1844a18a6109c25a25b9c8cb3ff65e69cc657f66be03dd97e883d24a774b7638\": rpc error: code = NotFound desc = could not find container \"1844a18a6109c25a25b9c8cb3ff65e69cc657f66be03dd97e883d24a774b7638\": container with ID starting with 1844a18a6109c25a25b9c8cb3ff65e69cc657f66be03dd97e883d24a774b7638 not found: ID does not exist" Dec 01 08:28:58 crc kubenswrapper[5004]: I1201 08:28:58.393676 5004 scope.go:117] "RemoveContainer" containerID="97c9fab31c2655b5d579a2ed9b837229ee678973643c6a526c1241e3c84e315c" Dec 01 08:28:58 crc kubenswrapper[5004]: I1201 08:28:58.394165 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97c9fab31c2655b5d579a2ed9b837229ee678973643c6a526c1241e3c84e315c"} err="failed to get container status \"97c9fab31c2655b5d579a2ed9b837229ee678973643c6a526c1241e3c84e315c\": rpc error: code = NotFound desc = could not find container \"97c9fab31c2655b5d579a2ed9b837229ee678973643c6a526c1241e3c84e315c\": container with ID starting with 97c9fab31c2655b5d579a2ed9b837229ee678973643c6a526c1241e3c84e315c not found: ID does not exist" Dec 01 08:28:58 crc kubenswrapper[5004]: I1201 08:28:58.765362 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15cdec0a-5925-4966-a30b-f60c503f633e" path="/var/lib/kubelet/pods/15cdec0a-5925-4966-a30b-f60c503f633e/volumes" Dec 01 08:28:59 crc kubenswrapper[5004]: I1201 08:28:59.107308 5004 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9mp7m" event={"ID":"b973bd81-e8d9-4d98-83a5-c2fd62edc555","Type":"ContainerStarted","Data":"495679ea0e1092eba49ee704e4858abbc1a12e66fce179d4083e9472fd018ca1"} Dec 01 08:28:59 crc kubenswrapper[5004]: I1201 08:28:59.107346 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9mp7m" event={"ID":"b973bd81-e8d9-4d98-83a5-c2fd62edc555","Type":"ContainerStarted","Data":"9cdb6524340426853f37acbab3b569433a28cb322a3ce5d4f43b1dec8def7477"} Dec 01 08:28:59 crc kubenswrapper[5004]: I1201 08:28:59.107356 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9mp7m" event={"ID":"b973bd81-e8d9-4d98-83a5-c2fd62edc555","Type":"ContainerStarted","Data":"fdfd78e0ca4432d0c17f13587e16e8d8c61b36b7431a272d14ba8edfafbb6e38"} Dec 01 08:28:59 crc kubenswrapper[5004]: I1201 08:28:59.107363 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9mp7m" event={"ID":"b973bd81-e8d9-4d98-83a5-c2fd62edc555","Type":"ContainerStarted","Data":"c8191f69b648a57f8f3073268b59208c7f876ba181fc048967e27500a0f5db3b"} Dec 01 08:28:59 crc kubenswrapper[5004]: I1201 08:28:59.107373 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9mp7m" event={"ID":"b973bd81-e8d9-4d98-83a5-c2fd62edc555","Type":"ContainerStarted","Data":"1169e3a69aefcc6849cda346dd4b891518b35a3cd9802e9397241defd1f53c66"} Dec 01 08:28:59 crc kubenswrapper[5004]: I1201 08:28:59.107383 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9mp7m" event={"ID":"b973bd81-e8d9-4d98-83a5-c2fd62edc555","Type":"ContainerStarted","Data":"a05717a2c04e5d126a20fdc42a8d9338fa4e14246c71d36c67ca084d6308b616"} Dec 01 08:29:02 crc kubenswrapper[5004]: I1201 08:29:02.129674 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-9mp7m" event={"ID":"b973bd81-e8d9-4d98-83a5-c2fd62edc555","Type":"ContainerStarted","Data":"ac07ead33993f8ff091e0e7cd48b5ea221f9405db75e0cbf8fdac3a0c8f52931"} Dec 01 08:29:02 crc kubenswrapper[5004]: I1201 08:29:02.191222 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-gwtbl"] Dec 01 08:29:02 crc kubenswrapper[5004]: I1201 08:29:02.191935 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-gwtbl" Dec 01 08:29:02 crc kubenswrapper[5004]: I1201 08:29:02.194494 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-tcfgw" Dec 01 08:29:02 crc kubenswrapper[5004]: I1201 08:29:02.194716 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Dec 01 08:29:02 crc kubenswrapper[5004]: I1201 08:29:02.196706 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Dec 01 08:29:02 crc kubenswrapper[5004]: I1201 08:29:02.244734 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-5bf4d79569-bhs7g"] Dec 01 08:29:02 crc kubenswrapper[5004]: I1201 08:29:02.245424 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5bf4d79569-bhs7g" Dec 01 08:29:02 crc kubenswrapper[5004]: W1201 08:29:02.247578 5004 reflector.go:561] object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert": failed to list *v1.Secret: secrets "obo-prometheus-operator-admission-webhook-service-cert" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-operators": no relationship found between node 'crc' and this object Dec 01 08:29:02 crc kubenswrapper[5004]: E1201 08:29:02.247625 5004 reflector.go:158] "Unhandled Error" err="object-\"openshift-operators\"/\"obo-prometheus-operator-admission-webhook-service-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"obo-prometheus-operator-admission-webhook-service-cert\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-operators\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 01 08:29:02 crc kubenswrapper[5004]: W1201 08:29:02.247680 5004 reflector.go:561] object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-shm4n": failed to list *v1.Secret: secrets "obo-prometheus-operator-admission-webhook-dockercfg-shm4n" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-operators": no relationship found between node 'crc' and this object Dec 01 08:29:02 crc kubenswrapper[5004]: E1201 08:29:02.247692 5004 reflector.go:158] "Unhandled Error" err="object-\"openshift-operators\"/\"obo-prometheus-operator-admission-webhook-dockercfg-shm4n\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"obo-prometheus-operator-admission-webhook-dockercfg-shm4n\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace 
\"openshift-operators\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 01 08:29:02 crc kubenswrapper[5004]: I1201 08:29:02.253343 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-5bf4d79569-4nf7p"] Dec 01 08:29:02 crc kubenswrapper[5004]: I1201 08:29:02.254065 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5bf4d79569-4nf7p" Dec 01 08:29:02 crc kubenswrapper[5004]: I1201 08:29:02.289867 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dm42w\" (UniqueName: \"kubernetes.io/projected/89a7714c-cec3-46e6-8cdb-016669fcf18e-kube-api-access-dm42w\") pod \"obo-prometheus-operator-668cf9dfbb-gwtbl\" (UID: \"89a7714c-cec3-46e6-8cdb-016669fcf18e\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-gwtbl" Dec 01 08:29:02 crc kubenswrapper[5004]: I1201 08:29:02.290060 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3a6db099-c25a-4d19-8fa1-8269429274fc-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-5bf4d79569-4nf7p\" (UID: \"3a6db099-c25a-4d19-8fa1-8269429274fc\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5bf4d79569-4nf7p" Dec 01 08:29:02 crc kubenswrapper[5004]: I1201 08:29:02.290083 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/386e0e23-8226-45e3-a1b5-38fc4fb44eec-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-5bf4d79569-bhs7g\" (UID: \"386e0e23-8226-45e3-a1b5-38fc4fb44eec\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5bf4d79569-bhs7g" Dec 01 08:29:02 crc kubenswrapper[5004]: I1201 
08:29:02.290205 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/386e0e23-8226-45e3-a1b5-38fc4fb44eec-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-5bf4d79569-bhs7g\" (UID: \"386e0e23-8226-45e3-a1b5-38fc4fb44eec\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5bf4d79569-bhs7g" Dec 01 08:29:02 crc kubenswrapper[5004]: I1201 08:29:02.290327 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3a6db099-c25a-4d19-8fa1-8269429274fc-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-5bf4d79569-4nf7p\" (UID: \"3a6db099-c25a-4d19-8fa1-8269429274fc\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5bf4d79569-4nf7p" Dec 01 08:29:02 crc kubenswrapper[5004]: I1201 08:29:02.391444 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dm42w\" (UniqueName: \"kubernetes.io/projected/89a7714c-cec3-46e6-8cdb-016669fcf18e-kube-api-access-dm42w\") pod \"obo-prometheus-operator-668cf9dfbb-gwtbl\" (UID: \"89a7714c-cec3-46e6-8cdb-016669fcf18e\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-gwtbl" Dec 01 08:29:02 crc kubenswrapper[5004]: I1201 08:29:02.391498 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3a6db099-c25a-4d19-8fa1-8269429274fc-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-5bf4d79569-4nf7p\" (UID: \"3a6db099-c25a-4d19-8fa1-8269429274fc\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5bf4d79569-4nf7p" Dec 01 08:29:02 crc kubenswrapper[5004]: I1201 08:29:02.391530 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/386e0e23-8226-45e3-a1b5-38fc4fb44eec-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-5bf4d79569-bhs7g\" (UID: \"386e0e23-8226-45e3-a1b5-38fc4fb44eec\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5bf4d79569-bhs7g" Dec 01 08:29:02 crc kubenswrapper[5004]: I1201 08:29:02.391651 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/386e0e23-8226-45e3-a1b5-38fc4fb44eec-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-5bf4d79569-bhs7g\" (UID: \"386e0e23-8226-45e3-a1b5-38fc4fb44eec\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5bf4d79569-bhs7g" Dec 01 08:29:02 crc kubenswrapper[5004]: I1201 08:29:02.391693 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3a6db099-c25a-4d19-8fa1-8269429274fc-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-5bf4d79569-4nf7p\" (UID: \"3a6db099-c25a-4d19-8fa1-8269429274fc\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5bf4d79569-4nf7p" Dec 01 08:29:02 crc kubenswrapper[5004]: I1201 08:29:02.416539 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dm42w\" (UniqueName: \"kubernetes.io/projected/89a7714c-cec3-46e6-8cdb-016669fcf18e-kube-api-access-dm42w\") pod \"obo-prometheus-operator-668cf9dfbb-gwtbl\" (UID: \"89a7714c-cec3-46e6-8cdb-016669fcf18e\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-gwtbl" Dec 01 08:29:02 crc kubenswrapper[5004]: I1201 08:29:02.430209 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-kfwr8"] Dec 01 08:29:02 crc kubenswrapper[5004]: I1201 08:29:02.431342 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-kfwr8" Dec 01 08:29:02 crc kubenswrapper[5004]: I1201 08:29:02.433590 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-l658k" Dec 01 08:29:02 crc kubenswrapper[5004]: I1201 08:29:02.434333 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Dec 01 08:29:02 crc kubenswrapper[5004]: I1201 08:29:02.492695 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmwbv\" (UniqueName: \"kubernetes.io/projected/c2573e8e-c093-4ddb-b02d-f4f7e270b97a-kube-api-access-kmwbv\") pod \"observability-operator-d8bb48f5d-kfwr8\" (UID: \"c2573e8e-c093-4ddb-b02d-f4f7e270b97a\") " pod="openshift-operators/observability-operator-d8bb48f5d-kfwr8" Dec 01 08:29:02 crc kubenswrapper[5004]: I1201 08:29:02.492791 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/c2573e8e-c093-4ddb-b02d-f4f7e270b97a-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-kfwr8\" (UID: \"c2573e8e-c093-4ddb-b02d-f4f7e270b97a\") " pod="openshift-operators/observability-operator-d8bb48f5d-kfwr8" Dec 01 08:29:02 crc kubenswrapper[5004]: I1201 08:29:02.508496 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-gwtbl" Dec 01 08:29:02 crc kubenswrapper[5004]: I1201 08:29:02.529168 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5446b9c989-9pms6"] Dec 01 08:29:02 crc kubenswrapper[5004]: I1201 08:29:02.530738 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-9pms6" Dec 01 08:29:02 crc kubenswrapper[5004]: I1201 08:29:02.536438 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-gdx2w" Dec 01 08:29:02 crc kubenswrapper[5004]: E1201 08:29:02.551809 5004 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-gwtbl_openshift-operators_89a7714c-cec3-46e6-8cdb-016669fcf18e_0(eeeb6ddfdaa21a770efcb5f86734237b66c0fd2fd413c6fc267aa64f944cf34d): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 01 08:29:02 crc kubenswrapper[5004]: E1201 08:29:02.551882 5004 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-gwtbl_openshift-operators_89a7714c-cec3-46e6-8cdb-016669fcf18e_0(eeeb6ddfdaa21a770efcb5f86734237b66c0fd2fd413c6fc267aa64f944cf34d): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-gwtbl" Dec 01 08:29:02 crc kubenswrapper[5004]: E1201 08:29:02.551904 5004 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-gwtbl_openshift-operators_89a7714c-cec3-46e6-8cdb-016669fcf18e_0(eeeb6ddfdaa21a770efcb5f86734237b66c0fd2fd413c6fc267aa64f944cf34d): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-gwtbl" Dec 01 08:29:02 crc kubenswrapper[5004]: E1201 08:29:02.551951 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-668cf9dfbb-gwtbl_openshift-operators(89a7714c-cec3-46e6-8cdb-016669fcf18e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-668cf9dfbb-gwtbl_openshift-operators(89a7714c-cec3-46e6-8cdb-016669fcf18e)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-gwtbl_openshift-operators_89a7714c-cec3-46e6-8cdb-016669fcf18e_0(eeeb6ddfdaa21a770efcb5f86734237b66c0fd2fd413c6fc267aa64f944cf34d): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-gwtbl" podUID="89a7714c-cec3-46e6-8cdb-016669fcf18e" Dec 01 08:29:02 crc kubenswrapper[5004]: I1201 08:29:02.593762 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/c2573e8e-c093-4ddb-b02d-f4f7e270b97a-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-kfwr8\" (UID: \"c2573e8e-c093-4ddb-b02d-f4f7e270b97a\") " pod="openshift-operators/observability-operator-d8bb48f5d-kfwr8" Dec 01 08:29:02 crc kubenswrapper[5004]: I1201 08:29:02.593897 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmwbv\" (UniqueName: \"kubernetes.io/projected/c2573e8e-c093-4ddb-b02d-f4f7e270b97a-kube-api-access-kmwbv\") pod \"observability-operator-d8bb48f5d-kfwr8\" (UID: \"c2573e8e-c093-4ddb-b02d-f4f7e270b97a\") " pod="openshift-operators/observability-operator-d8bb48f5d-kfwr8" Dec 01 08:29:02 crc kubenswrapper[5004]: I1201 08:29:02.610591 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" 
(UniqueName: \"kubernetes.io/secret/c2573e8e-c093-4ddb-b02d-f4f7e270b97a-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-kfwr8\" (UID: \"c2573e8e-c093-4ddb-b02d-f4f7e270b97a\") " pod="openshift-operators/observability-operator-d8bb48f5d-kfwr8" Dec 01 08:29:02 crc kubenswrapper[5004]: I1201 08:29:02.612024 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmwbv\" (UniqueName: \"kubernetes.io/projected/c2573e8e-c093-4ddb-b02d-f4f7e270b97a-kube-api-access-kmwbv\") pod \"observability-operator-d8bb48f5d-kfwr8\" (UID: \"c2573e8e-c093-4ddb-b02d-f4f7e270b97a\") " pod="openshift-operators/observability-operator-d8bb48f5d-kfwr8" Dec 01 08:29:02 crc kubenswrapper[5004]: I1201 08:29:02.694735 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zngc7\" (UniqueName: \"kubernetes.io/projected/1579721a-7166-4152-9703-97b893433c9a-kube-api-access-zngc7\") pod \"perses-operator-5446b9c989-9pms6\" (UID: \"1579721a-7166-4152-9703-97b893433c9a\") " pod="openshift-operators/perses-operator-5446b9c989-9pms6" Dec 01 08:29:02 crc kubenswrapper[5004]: I1201 08:29:02.695025 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/1579721a-7166-4152-9703-97b893433c9a-openshift-service-ca\") pod \"perses-operator-5446b9c989-9pms6\" (UID: \"1579721a-7166-4152-9703-97b893433c9a\") " pod="openshift-operators/perses-operator-5446b9c989-9pms6" Dec 01 08:29:02 crc kubenswrapper[5004]: I1201 08:29:02.769117 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-l658k" Dec 01 08:29:02 crc kubenswrapper[5004]: I1201 08:29:02.777504 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-kfwr8" Dec 01 08:29:02 crc kubenswrapper[5004]: I1201 08:29:02.801084 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zngc7\" (UniqueName: \"kubernetes.io/projected/1579721a-7166-4152-9703-97b893433c9a-kube-api-access-zngc7\") pod \"perses-operator-5446b9c989-9pms6\" (UID: \"1579721a-7166-4152-9703-97b893433c9a\") " pod="openshift-operators/perses-operator-5446b9c989-9pms6" Dec 01 08:29:02 crc kubenswrapper[5004]: I1201 08:29:02.801227 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/1579721a-7166-4152-9703-97b893433c9a-openshift-service-ca\") pod \"perses-operator-5446b9c989-9pms6\" (UID: \"1579721a-7166-4152-9703-97b893433c9a\") " pod="openshift-operators/perses-operator-5446b9c989-9pms6" Dec 01 08:29:02 crc kubenswrapper[5004]: I1201 08:29:02.802348 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/1579721a-7166-4152-9703-97b893433c9a-openshift-service-ca\") pod \"perses-operator-5446b9c989-9pms6\" (UID: \"1579721a-7166-4152-9703-97b893433c9a\") " pod="openshift-operators/perses-operator-5446b9c989-9pms6" Dec 01 08:29:02 crc kubenswrapper[5004]: E1201 08:29:02.806856 5004 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-kfwr8_openshift-operators_c2573e8e-c093-4ddb-b02d-f4f7e270b97a_0(72c22259f7d75e0da32f517131a8383a2eec3bcb71681962709d89a9045d1a63): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Dec 01 08:29:02 crc kubenswrapper[5004]: E1201 08:29:02.806933 5004 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-kfwr8_openshift-operators_c2573e8e-c093-4ddb-b02d-f4f7e270b97a_0(72c22259f7d75e0da32f517131a8383a2eec3bcb71681962709d89a9045d1a63): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-d8bb48f5d-kfwr8" Dec 01 08:29:02 crc kubenswrapper[5004]: E1201 08:29:02.806957 5004 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-kfwr8_openshift-operators_c2573e8e-c093-4ddb-b02d-f4f7e270b97a_0(72c22259f7d75e0da32f517131a8383a2eec3bcb71681962709d89a9045d1a63): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-d8bb48f5d-kfwr8" Dec 01 08:29:02 crc kubenswrapper[5004]: E1201 08:29:02.807007 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"observability-operator-d8bb48f5d-kfwr8_openshift-operators(c2573e8e-c093-4ddb-b02d-f4f7e270b97a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"observability-operator-d8bb48f5d-kfwr8_openshift-operators(c2573e8e-c093-4ddb-b02d-f4f7e270b97a)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-kfwr8_openshift-operators_c2573e8e-c093-4ddb-b02d-f4f7e270b97a_0(72c22259f7d75e0da32f517131a8383a2eec3bcb71681962709d89a9045d1a63): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/observability-operator-d8bb48f5d-kfwr8" podUID="c2573e8e-c093-4ddb-b02d-f4f7e270b97a" Dec 01 08:29:02 crc kubenswrapper[5004]: I1201 08:29:02.820116 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zngc7\" (UniqueName: \"kubernetes.io/projected/1579721a-7166-4152-9703-97b893433c9a-kube-api-access-zngc7\") pod \"perses-operator-5446b9c989-9pms6\" (UID: \"1579721a-7166-4152-9703-97b893433c9a\") " pod="openshift-operators/perses-operator-5446b9c989-9pms6" Dec 01 08:29:02 crc kubenswrapper[5004]: I1201 08:29:02.942359 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-gdx2w" Dec 01 08:29:02 crc kubenswrapper[5004]: I1201 08:29:02.952227 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-9pms6" Dec 01 08:29:02 crc kubenswrapper[5004]: E1201 08:29:02.979286 5004 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-9pms6_openshift-operators_1579721a-7166-4152-9703-97b893433c9a_0(583fedaadd5c1d810c30cbcec560a773c830b7a4c77cc7317387f1c1b245bd4f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 01 08:29:02 crc kubenswrapper[5004]: E1201 08:29:02.979352 5004 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-9pms6_openshift-operators_1579721a-7166-4152-9703-97b893433c9a_0(583fedaadd5c1d810c30cbcec560a773c830b7a4c77cc7317387f1c1b245bd4f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/perses-operator-5446b9c989-9pms6" Dec 01 08:29:02 crc kubenswrapper[5004]: E1201 08:29:02.979381 5004 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-9pms6_openshift-operators_1579721a-7166-4152-9703-97b893433c9a_0(583fedaadd5c1d810c30cbcec560a773c830b7a4c77cc7317387f1c1b245bd4f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5446b9c989-9pms6" Dec 01 08:29:02 crc kubenswrapper[5004]: E1201 08:29:02.979426 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"perses-operator-5446b9c989-9pms6_openshift-operators(1579721a-7166-4152-9703-97b893433c9a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"perses-operator-5446b9c989-9pms6_openshift-operators(1579721a-7166-4152-9703-97b893433c9a)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-9pms6_openshift-operators_1579721a-7166-4152-9703-97b893433c9a_0(583fedaadd5c1d810c30cbcec560a773c830b7a4c77cc7317387f1c1b245bd4f): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/perses-operator-5446b9c989-9pms6" podUID="1579721a-7166-4152-9703-97b893433c9a" Dec 01 08:29:03 crc kubenswrapper[5004]: E1201 08:29:03.392207 5004 secret.go:188] Couldn't get secret openshift-operators/obo-prometheus-operator-admission-webhook-service-cert: failed to sync secret cache: timed out waiting for the condition Dec 01 08:29:03 crc kubenswrapper[5004]: E1201 08:29:03.392245 5004 secret.go:188] Couldn't get secret openshift-operators/obo-prometheus-operator-admission-webhook-service-cert: failed to sync secret cache: timed out waiting for the condition Dec 01 08:29:03 crc kubenswrapper[5004]: E1201 08:29:03.392290 5004 secret.go:188] Couldn't get secret openshift-operators/obo-prometheus-operator-admission-webhook-service-cert: failed to sync secret cache: timed out waiting for the condition Dec 01 08:29:03 crc kubenswrapper[5004]: E1201 08:29:03.392325 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3a6db099-c25a-4d19-8fa1-8269429274fc-webhook-cert podName:3a6db099-c25a-4d19-8fa1-8269429274fc nodeName:}" failed. No retries permitted until 2025-12-01 08:29:03.892304515 +0000 UTC m=+721.457296507 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-cert" (UniqueName: "kubernetes.io/secret/3a6db099-c25a-4d19-8fa1-8269429274fc-webhook-cert") pod "obo-prometheus-operator-admission-webhook-5bf4d79569-4nf7p" (UID: "3a6db099-c25a-4d19-8fa1-8269429274fc") : failed to sync secret cache: timed out waiting for the condition Dec 01 08:29:03 crc kubenswrapper[5004]: E1201 08:29:03.392333 5004 secret.go:188] Couldn't get secret openshift-operators/obo-prometheus-operator-admission-webhook-service-cert: failed to sync secret cache: timed out waiting for the condition Dec 01 08:29:03 crc kubenswrapper[5004]: E1201 08:29:03.392352 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/386e0e23-8226-45e3-a1b5-38fc4fb44eec-apiservice-cert podName:386e0e23-8226-45e3-a1b5-38fc4fb44eec nodeName:}" failed. No retries permitted until 2025-12-01 08:29:03.892337425 +0000 UTC m=+721.457329427 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/386e0e23-8226-45e3-a1b5-38fc4fb44eec-apiservice-cert") pod "obo-prometheus-operator-admission-webhook-5bf4d79569-bhs7g" (UID: "386e0e23-8226-45e3-a1b5-38fc4fb44eec") : failed to sync secret cache: timed out waiting for the condition Dec 01 08:29:03 crc kubenswrapper[5004]: E1201 08:29:03.392443 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/386e0e23-8226-45e3-a1b5-38fc4fb44eec-webhook-cert podName:386e0e23-8226-45e3-a1b5-38fc4fb44eec nodeName:}" failed. No retries permitted until 2025-12-01 08:29:03.892418988 +0000 UTC m=+721.457411040 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-cert" (UniqueName: "kubernetes.io/secret/386e0e23-8226-45e3-a1b5-38fc4fb44eec-webhook-cert") pod "obo-prometheus-operator-admission-webhook-5bf4d79569-bhs7g" (UID: "386e0e23-8226-45e3-a1b5-38fc4fb44eec") : failed to sync secret cache: timed out waiting for the condition Dec 01 08:29:03 crc kubenswrapper[5004]: E1201 08:29:03.392460 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3a6db099-c25a-4d19-8fa1-8269429274fc-apiservice-cert podName:3a6db099-c25a-4d19-8fa1-8269429274fc nodeName:}" failed. No retries permitted until 2025-12-01 08:29:03.892450219 +0000 UTC m=+721.457442291 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/3a6db099-c25a-4d19-8fa1-8269429274fc-apiservice-cert") pod "obo-prometheus-operator-admission-webhook-5bf4d79569-4nf7p" (UID: "3a6db099-c25a-4d19-8fa1-8269429274fc") : failed to sync secret cache: timed out waiting for the condition Dec 01 08:29:03 crc kubenswrapper[5004]: I1201 08:29:03.720587 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-shm4n" Dec 01 08:29:03 crc kubenswrapper[5004]: I1201 08:29:03.833447 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Dec 01 08:29:03 crc kubenswrapper[5004]: I1201 08:29:03.915402 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3a6db099-c25a-4d19-8fa1-8269429274fc-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-5bf4d79569-4nf7p\" (UID: \"3a6db099-c25a-4d19-8fa1-8269429274fc\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5bf4d79569-4nf7p" Dec 01 08:29:03 crc kubenswrapper[5004]: I1201 08:29:03.915453 5004 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/386e0e23-8226-45e3-a1b5-38fc4fb44eec-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-5bf4d79569-bhs7g\" (UID: \"386e0e23-8226-45e3-a1b5-38fc4fb44eec\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5bf4d79569-bhs7g" Dec 01 08:29:03 crc kubenswrapper[5004]: I1201 08:29:03.915763 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/386e0e23-8226-45e3-a1b5-38fc4fb44eec-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-5bf4d79569-bhs7g\" (UID: \"386e0e23-8226-45e3-a1b5-38fc4fb44eec\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5bf4d79569-bhs7g" Dec 01 08:29:03 crc kubenswrapper[5004]: I1201 08:29:03.915801 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3a6db099-c25a-4d19-8fa1-8269429274fc-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-5bf4d79569-4nf7p\" (UID: \"3a6db099-c25a-4d19-8fa1-8269429274fc\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5bf4d79569-4nf7p" Dec 01 08:29:03 crc kubenswrapper[5004]: I1201 08:29:03.919351 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3a6db099-c25a-4d19-8fa1-8269429274fc-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-5bf4d79569-4nf7p\" (UID: \"3a6db099-c25a-4d19-8fa1-8269429274fc\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5bf4d79569-4nf7p" Dec 01 08:29:03 crc kubenswrapper[5004]: I1201 08:29:03.919427 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/386e0e23-8226-45e3-a1b5-38fc4fb44eec-webhook-cert\") pod 
\"obo-prometheus-operator-admission-webhook-5bf4d79569-bhs7g\" (UID: \"386e0e23-8226-45e3-a1b5-38fc4fb44eec\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5bf4d79569-bhs7g" Dec 01 08:29:03 crc kubenswrapper[5004]: I1201 08:29:03.921716 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/386e0e23-8226-45e3-a1b5-38fc4fb44eec-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-5bf4d79569-bhs7g\" (UID: \"386e0e23-8226-45e3-a1b5-38fc4fb44eec\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5bf4d79569-bhs7g" Dec 01 08:29:03 crc kubenswrapper[5004]: I1201 08:29:03.926144 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3a6db099-c25a-4d19-8fa1-8269429274fc-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-5bf4d79569-4nf7p\" (UID: \"3a6db099-c25a-4d19-8fa1-8269429274fc\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5bf4d79569-4nf7p" Dec 01 08:29:04 crc kubenswrapper[5004]: I1201 08:29:04.064337 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5bf4d79569-bhs7g" Dec 01 08:29:04 crc kubenswrapper[5004]: I1201 08:29:04.074235 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5bf4d79569-4nf7p" Dec 01 08:29:04 crc kubenswrapper[5004]: E1201 08:29:04.097287 5004 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-5bf4d79569-bhs7g_openshift-operators_386e0e23-8226-45e3-a1b5-38fc4fb44eec_0(75d4c402b216a7b7e80aef3d7a92a65d17160ac1b0b64ed95c7522cf6682cc7e): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" Dec 01 08:29:04 crc kubenswrapper[5004]: E1201 08:29:04.097356 5004 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-5bf4d79569-bhs7g_openshift-operators_386e0e23-8226-45e3-a1b5-38fc4fb44eec_0(75d4c402b216a7b7e80aef3d7a92a65d17160ac1b0b64ed95c7522cf6682cc7e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5bf4d79569-bhs7g" Dec 01 08:29:04 crc kubenswrapper[5004]: E1201 08:29:04.097377 5004 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-5bf4d79569-bhs7g_openshift-operators_386e0e23-8226-45e3-a1b5-38fc4fb44eec_0(75d4c402b216a7b7e80aef3d7a92a65d17160ac1b0b64ed95c7522cf6682cc7e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5bf4d79569-bhs7g" Dec 01 08:29:04 crc kubenswrapper[5004]: E1201 08:29:04.097422 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-5bf4d79569-bhs7g_openshift-operators(386e0e23-8226-45e3-a1b5-38fc4fb44eec)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-5bf4d79569-bhs7g_openshift-operators(386e0e23-8226-45e3-a1b5-38fc4fb44eec)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-5bf4d79569-bhs7g_openshift-operators_386e0e23-8226-45e3-a1b5-38fc4fb44eec_0(75d4c402b216a7b7e80aef3d7a92a65d17160ac1b0b64ed95c7522cf6682cc7e): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5bf4d79569-bhs7g" podUID="386e0e23-8226-45e3-a1b5-38fc4fb44eec" Dec 01 08:29:04 crc kubenswrapper[5004]: E1201 08:29:04.112329 5004 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-5bf4d79569-4nf7p_openshift-operators_3a6db099-c25a-4d19-8fa1-8269429274fc_0(357f38b245b0c665efa5ca8ab90acc9e1dd6768b12bba6cec1aafdd4a8af2a2e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 01 08:29:04 crc kubenswrapper[5004]: E1201 08:29:04.112384 5004 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-5bf4d79569-4nf7p_openshift-operators_3a6db099-c25a-4d19-8fa1-8269429274fc_0(357f38b245b0c665efa5ca8ab90acc9e1dd6768b12bba6cec1aafdd4a8af2a2e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5bf4d79569-4nf7p" Dec 01 08:29:04 crc kubenswrapper[5004]: E1201 08:29:04.112405 5004 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-5bf4d79569-4nf7p_openshift-operators_3a6db099-c25a-4d19-8fa1-8269429274fc_0(357f38b245b0c665efa5ca8ab90acc9e1dd6768b12bba6cec1aafdd4a8af2a2e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-5bf4d79569-4nf7p" Dec 01 08:29:04 crc kubenswrapper[5004]: E1201 08:29:04.112457 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-5bf4d79569-4nf7p_openshift-operators(3a6db099-c25a-4d19-8fa1-8269429274fc)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-5bf4d79569-4nf7p_openshift-operators(3a6db099-c25a-4d19-8fa1-8269429274fc)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-5bf4d79569-4nf7p_openshift-operators_3a6db099-c25a-4d19-8fa1-8269429274fc_0(357f38b245b0c665efa5ca8ab90acc9e1dd6768b12bba6cec1aafdd4a8af2a2e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5bf4d79569-4nf7p" podUID="3a6db099-c25a-4d19-8fa1-8269429274fc" Dec 01 08:29:04 crc kubenswrapper[5004]: I1201 08:29:04.155065 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9mp7m" event={"ID":"b973bd81-e8d9-4d98-83a5-c2fd62edc555","Type":"ContainerStarted","Data":"5a3657799e11dbfb10159f8720067374623b1393e53943c2ad443283c50f383a"} Dec 01 08:29:04 crc kubenswrapper[5004]: I1201 08:29:04.155343 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-9mp7m" Dec 01 08:29:04 crc kubenswrapper[5004]: I1201 08:29:04.155387 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-9mp7m" Dec 01 08:29:04 crc kubenswrapper[5004]: I1201 08:29:04.180371 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-9mp7m" Dec 01 08:29:04 crc kubenswrapper[5004]: I1201 08:29:04.185673 5004 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-9mp7m" podStartSLOduration=7.185655671 podStartE2EDuration="7.185655671s" podCreationTimestamp="2025-12-01 08:28:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:29:04.182724046 +0000 UTC m=+721.747716038" watchObservedRunningTime="2025-12-01 08:29:04.185655671 +0000 UTC m=+721.750647663" Dec 01 08:29:04 crc kubenswrapper[5004]: I1201 08:29:04.299215 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5446b9c989-9pms6"] Dec 01 08:29:04 crc kubenswrapper[5004]: I1201 08:29:04.299435 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-9pms6" Dec 01 08:29:04 crc kubenswrapper[5004]: I1201 08:29:04.300033 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-9pms6" Dec 01 08:29:04 crc kubenswrapper[5004]: I1201 08:29:04.304480 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-kfwr8"] Dec 01 08:29:04 crc kubenswrapper[5004]: I1201 08:29:04.304722 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-kfwr8" Dec 01 08:29:04 crc kubenswrapper[5004]: I1201 08:29:04.305241 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-kfwr8" Dec 01 08:29:04 crc kubenswrapper[5004]: I1201 08:29:04.324958 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-5bf4d79569-4nf7p"] Dec 01 08:29:04 crc kubenswrapper[5004]: I1201 08:29:04.325143 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5bf4d79569-4nf7p" Dec 01 08:29:04 crc kubenswrapper[5004]: I1201 08:29:04.325709 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5bf4d79569-4nf7p" Dec 01 08:29:04 crc kubenswrapper[5004]: E1201 08:29:04.333190 5004 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-9pms6_openshift-operators_1579721a-7166-4152-9703-97b893433c9a_0(16f7be457398acc218e3e1aa6fec7d15a26072875a5fe4ccb94adc3efbc7ad8e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 01 08:29:04 crc kubenswrapper[5004]: E1201 08:29:04.333268 5004 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-9pms6_openshift-operators_1579721a-7166-4152-9703-97b893433c9a_0(16f7be457398acc218e3e1aa6fec7d15a26072875a5fe4ccb94adc3efbc7ad8e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5446b9c989-9pms6" Dec 01 08:29:04 crc kubenswrapper[5004]: E1201 08:29:04.333302 5004 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-9pms6_openshift-operators_1579721a-7166-4152-9703-97b893433c9a_0(16f7be457398acc218e3e1aa6fec7d15a26072875a5fe4ccb94adc3efbc7ad8e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/perses-operator-5446b9c989-9pms6" Dec 01 08:29:04 crc kubenswrapper[5004]: E1201 08:29:04.333375 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"perses-operator-5446b9c989-9pms6_openshift-operators(1579721a-7166-4152-9703-97b893433c9a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"perses-operator-5446b9c989-9pms6_openshift-operators(1579721a-7166-4152-9703-97b893433c9a)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-9pms6_openshift-operators_1579721a-7166-4152-9703-97b893433c9a_0(16f7be457398acc218e3e1aa6fec7d15a26072875a5fe4ccb94adc3efbc7ad8e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/perses-operator-5446b9c989-9pms6" podUID="1579721a-7166-4152-9703-97b893433c9a" Dec 01 08:29:04 crc kubenswrapper[5004]: I1201 08:29:04.333669 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-gwtbl"] Dec 01 08:29:04 crc kubenswrapper[5004]: I1201 08:29:04.333799 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-gwtbl" Dec 01 08:29:04 crc kubenswrapper[5004]: I1201 08:29:04.334322 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-gwtbl" Dec 01 08:29:04 crc kubenswrapper[5004]: E1201 08:29:04.338226 5004 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-kfwr8_openshift-operators_c2573e8e-c093-4ddb-b02d-f4f7e270b97a_0(932bed8c2c037735e265ffccfe5f5727e272e60ca24c2d665595afdd74661de9): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Dec 01 08:29:04 crc kubenswrapper[5004]: E1201 08:29:04.338277 5004 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-kfwr8_openshift-operators_c2573e8e-c093-4ddb-b02d-f4f7e270b97a_0(932bed8c2c037735e265ffccfe5f5727e272e60ca24c2d665595afdd74661de9): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-d8bb48f5d-kfwr8" Dec 01 08:29:04 crc kubenswrapper[5004]: E1201 08:29:04.338308 5004 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-kfwr8_openshift-operators_c2573e8e-c093-4ddb-b02d-f4f7e270b97a_0(932bed8c2c037735e265ffccfe5f5727e272e60ca24c2d665595afdd74661de9): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-d8bb48f5d-kfwr8" Dec 01 08:29:04 crc kubenswrapper[5004]: E1201 08:29:04.338356 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"observability-operator-d8bb48f5d-kfwr8_openshift-operators(c2573e8e-c093-4ddb-b02d-f4f7e270b97a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"observability-operator-d8bb48f5d-kfwr8_openshift-operators(c2573e8e-c093-4ddb-b02d-f4f7e270b97a)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-kfwr8_openshift-operators_c2573e8e-c093-4ddb-b02d-f4f7e270b97a_0(932bed8c2c037735e265ffccfe5f5727e272e60ca24c2d665595afdd74661de9): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/observability-operator-d8bb48f5d-kfwr8" podUID="c2573e8e-c093-4ddb-b02d-f4f7e270b97a" Dec 01 08:29:04 crc kubenswrapper[5004]: I1201 08:29:04.353174 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-5bf4d79569-bhs7g"] Dec 01 08:29:04 crc kubenswrapper[5004]: I1201 08:29:04.353285 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5bf4d79569-bhs7g" Dec 01 08:29:04 crc kubenswrapper[5004]: I1201 08:29:04.353762 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5bf4d79569-bhs7g" Dec 01 08:29:04 crc kubenswrapper[5004]: E1201 08:29:04.372785 5004 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-5bf4d79569-4nf7p_openshift-operators_3a6db099-c25a-4d19-8fa1-8269429274fc_0(9cc992ba23af68c661ef61c8dbef6b93d810ea2d7d2d489f86d25009ec6938ce): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 01 08:29:04 crc kubenswrapper[5004]: E1201 08:29:04.372855 5004 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-5bf4d79569-4nf7p_openshift-operators_3a6db099-c25a-4d19-8fa1-8269429274fc_0(9cc992ba23af68c661ef61c8dbef6b93d810ea2d7d2d489f86d25009ec6938ce): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-5bf4d79569-4nf7p" Dec 01 08:29:04 crc kubenswrapper[5004]: E1201 08:29:04.372879 5004 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-5bf4d79569-4nf7p_openshift-operators_3a6db099-c25a-4d19-8fa1-8269429274fc_0(9cc992ba23af68c661ef61c8dbef6b93d810ea2d7d2d489f86d25009ec6938ce): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5bf4d79569-4nf7p" Dec 01 08:29:04 crc kubenswrapper[5004]: E1201 08:29:04.372927 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-5bf4d79569-4nf7p_openshift-operators(3a6db099-c25a-4d19-8fa1-8269429274fc)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-5bf4d79569-4nf7p_openshift-operators(3a6db099-c25a-4d19-8fa1-8269429274fc)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-5bf4d79569-4nf7p_openshift-operators_3a6db099-c25a-4d19-8fa1-8269429274fc_0(9cc992ba23af68c661ef61c8dbef6b93d810ea2d7d2d489f86d25009ec6938ce): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5bf4d79569-4nf7p" podUID="3a6db099-c25a-4d19-8fa1-8269429274fc" Dec 01 08:29:04 crc kubenswrapper[5004]: E1201 08:29:04.386470 5004 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-gwtbl_openshift-operators_89a7714c-cec3-46e6-8cdb-016669fcf18e_0(2c10108d3c4f545ee561439bdfd66b530d23bdef342ec8dcb0f1218d4613548a): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 01 08:29:04 crc kubenswrapper[5004]: E1201 08:29:04.386531 5004 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-gwtbl_openshift-operators_89a7714c-cec3-46e6-8cdb-016669fcf18e_0(2c10108d3c4f545ee561439bdfd66b530d23bdef342ec8dcb0f1218d4613548a): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-gwtbl" Dec 01 08:29:04 crc kubenswrapper[5004]: E1201 08:29:04.386559 5004 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-gwtbl_openshift-operators_89a7714c-cec3-46e6-8cdb-016669fcf18e_0(2c10108d3c4f545ee561439bdfd66b530d23bdef342ec8dcb0f1218d4613548a): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-gwtbl" Dec 01 08:29:04 crc kubenswrapper[5004]: E1201 08:29:04.386623 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-668cf9dfbb-gwtbl_openshift-operators(89a7714c-cec3-46e6-8cdb-016669fcf18e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-668cf9dfbb-gwtbl_openshift-operators(89a7714c-cec3-46e6-8cdb-016669fcf18e)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-gwtbl_openshift-operators_89a7714c-cec3-46e6-8cdb-016669fcf18e_0(2c10108d3c4f545ee561439bdfd66b530d23bdef342ec8dcb0f1218d4613548a): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-gwtbl" podUID="89a7714c-cec3-46e6-8cdb-016669fcf18e" Dec 01 08:29:04 crc kubenswrapper[5004]: E1201 08:29:04.400466 5004 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-5bf4d79569-bhs7g_openshift-operators_386e0e23-8226-45e3-a1b5-38fc4fb44eec_0(bdc2376197603dd18f1fdbf4dd1b2b028a079b53e8853409f9e3fc2d27a4fcd5): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 01 08:29:04 crc kubenswrapper[5004]: E1201 08:29:04.400520 5004 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-5bf4d79569-bhs7g_openshift-operators_386e0e23-8226-45e3-a1b5-38fc4fb44eec_0(bdc2376197603dd18f1fdbf4dd1b2b028a079b53e8853409f9e3fc2d27a4fcd5): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-5bf4d79569-bhs7g" Dec 01 08:29:04 crc kubenswrapper[5004]: E1201 08:29:04.400539 5004 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-5bf4d79569-bhs7g_openshift-operators_386e0e23-8226-45e3-a1b5-38fc4fb44eec_0(bdc2376197603dd18f1fdbf4dd1b2b028a079b53e8853409f9e3fc2d27a4fcd5): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5bf4d79569-bhs7g" Dec 01 08:29:04 crc kubenswrapper[5004]: E1201 08:29:04.400591 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-5bf4d79569-bhs7g_openshift-operators(386e0e23-8226-45e3-a1b5-38fc4fb44eec)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-5bf4d79569-bhs7g_openshift-operators(386e0e23-8226-45e3-a1b5-38fc4fb44eec)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-5bf4d79569-bhs7g_openshift-operators_386e0e23-8226-45e3-a1b5-38fc4fb44eec_0(bdc2376197603dd18f1fdbf4dd1b2b028a079b53e8853409f9e3fc2d27a4fcd5): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5bf4d79569-bhs7g" podUID="386e0e23-8226-45e3-a1b5-38fc4fb44eec" Dec 01 08:29:05 crc kubenswrapper[5004]: I1201 08:29:05.159911 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-9mp7m" Dec 01 08:29:05 crc kubenswrapper[5004]: I1201 08:29:05.193187 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-9mp7m" Dec 01 08:29:08 crc kubenswrapper[5004]: I1201 08:29:08.759350 5004 scope.go:117] "RemoveContainer" containerID="c4499168a80cb7fe2301c6db0d0d9c80110f6f9bc8fc94b291f0b9b306dbb057" Dec 01 08:29:08 crc kubenswrapper[5004]: E1201 08:29:08.760052 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-zjksw_openshift-multus(70e79009-93be-49c4-a6b3-e8a06bcea7f4)\"" pod="openshift-multus/multus-zjksw" podUID="70e79009-93be-49c4-a6b3-e8a06bcea7f4" Dec 01 08:29:15 crc kubenswrapper[5004]: I1201 08:29:15.758874 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5bf4d79569-4nf7p" Dec 01 08:29:15 crc kubenswrapper[5004]: I1201 08:29:15.760199 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5bf4d79569-4nf7p" Dec 01 08:29:15 crc kubenswrapper[5004]: E1201 08:29:15.807803 5004 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-5bf4d79569-4nf7p_openshift-operators_3a6db099-c25a-4d19-8fa1-8269429274fc_0(c0b3fdef2933d075a540f3fde23ac61b3097f7713ad495b884be6506cfdfa23a): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 01 08:29:15 crc kubenswrapper[5004]: E1201 08:29:15.807901 5004 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-5bf4d79569-4nf7p_openshift-operators_3a6db099-c25a-4d19-8fa1-8269429274fc_0(c0b3fdef2933d075a540f3fde23ac61b3097f7713ad495b884be6506cfdfa23a): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5bf4d79569-4nf7p" Dec 01 08:29:15 crc kubenswrapper[5004]: E1201 08:29:15.807935 5004 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-5bf4d79569-4nf7p_openshift-operators_3a6db099-c25a-4d19-8fa1-8269429274fc_0(c0b3fdef2933d075a540f3fde23ac61b3097f7713ad495b884be6506cfdfa23a): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-5bf4d79569-4nf7p" Dec 01 08:29:15 crc kubenswrapper[5004]: E1201 08:29:15.808001 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-5bf4d79569-4nf7p_openshift-operators(3a6db099-c25a-4d19-8fa1-8269429274fc)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-5bf4d79569-4nf7p_openshift-operators(3a6db099-c25a-4d19-8fa1-8269429274fc)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-5bf4d79569-4nf7p_openshift-operators_3a6db099-c25a-4d19-8fa1-8269429274fc_0(c0b3fdef2933d075a540f3fde23ac61b3097f7713ad495b884be6506cfdfa23a): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5bf4d79569-4nf7p" podUID="3a6db099-c25a-4d19-8fa1-8269429274fc" Dec 01 08:29:17 crc kubenswrapper[5004]: I1201 08:29:17.758251 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-gwtbl" Dec 01 08:29:17 crc kubenswrapper[5004]: I1201 08:29:17.759369 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-gwtbl" Dec 01 08:29:17 crc kubenswrapper[5004]: E1201 08:29:17.808781 5004 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-gwtbl_openshift-operators_89a7714c-cec3-46e6-8cdb-016669fcf18e_0(6a1dcacb7d74b6a40389f2ea0b6aec2b34a6d4f5a1a7327a824afdc363706d06): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Dec 01 08:29:17 crc kubenswrapper[5004]: E1201 08:29:17.808867 5004 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-gwtbl_openshift-operators_89a7714c-cec3-46e6-8cdb-016669fcf18e_0(6a1dcacb7d74b6a40389f2ea0b6aec2b34a6d4f5a1a7327a824afdc363706d06): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-gwtbl" Dec 01 08:29:17 crc kubenswrapper[5004]: E1201 08:29:17.808902 5004 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-gwtbl_openshift-operators_89a7714c-cec3-46e6-8cdb-016669fcf18e_0(6a1dcacb7d74b6a40389f2ea0b6aec2b34a6d4f5a1a7327a824afdc363706d06): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-gwtbl" Dec 01 08:29:17 crc kubenswrapper[5004]: E1201 08:29:17.808967 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-668cf9dfbb-gwtbl_openshift-operators(89a7714c-cec3-46e6-8cdb-016669fcf18e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-668cf9dfbb-gwtbl_openshift-operators(89a7714c-cec3-46e6-8cdb-016669fcf18e)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-gwtbl_openshift-operators_89a7714c-cec3-46e6-8cdb-016669fcf18e_0(6a1dcacb7d74b6a40389f2ea0b6aec2b34a6d4f5a1a7327a824afdc363706d06): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-gwtbl" podUID="89a7714c-cec3-46e6-8cdb-016669fcf18e" Dec 01 08:29:18 crc kubenswrapper[5004]: I1201 08:29:18.758606 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5bf4d79569-bhs7g" Dec 01 08:29:18 crc kubenswrapper[5004]: I1201 08:29:18.758606 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-kfwr8" Dec 01 08:29:18 crc kubenswrapper[5004]: I1201 08:29:18.759618 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5bf4d79569-bhs7g" Dec 01 08:29:18 crc kubenswrapper[5004]: I1201 08:29:18.759656 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-kfwr8" Dec 01 08:29:18 crc kubenswrapper[5004]: E1201 08:29:18.802289 5004 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-5bf4d79569-bhs7g_openshift-operators_386e0e23-8226-45e3-a1b5-38fc4fb44eec_0(d402dad838b9bfe52529a8e62d035a56ebce4bd192c131ec2c5cab15492cac3d): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 01 08:29:18 crc kubenswrapper[5004]: E1201 08:29:18.802365 5004 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-5bf4d79569-bhs7g_openshift-operators_386e0e23-8226-45e3-a1b5-38fc4fb44eec_0(d402dad838b9bfe52529a8e62d035a56ebce4bd192c131ec2c5cab15492cac3d): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-5bf4d79569-bhs7g" Dec 01 08:29:18 crc kubenswrapper[5004]: E1201 08:29:18.802391 5004 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-5bf4d79569-bhs7g_openshift-operators_386e0e23-8226-45e3-a1b5-38fc4fb44eec_0(d402dad838b9bfe52529a8e62d035a56ebce4bd192c131ec2c5cab15492cac3d): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5bf4d79569-bhs7g" Dec 01 08:29:18 crc kubenswrapper[5004]: E1201 08:29:18.802438 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-5bf4d79569-bhs7g_openshift-operators(386e0e23-8226-45e3-a1b5-38fc4fb44eec)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-5bf4d79569-bhs7g_openshift-operators(386e0e23-8226-45e3-a1b5-38fc4fb44eec)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-5bf4d79569-bhs7g_openshift-operators_386e0e23-8226-45e3-a1b5-38fc4fb44eec_0(d402dad838b9bfe52529a8e62d035a56ebce4bd192c131ec2c5cab15492cac3d): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5bf4d79569-bhs7g" podUID="386e0e23-8226-45e3-a1b5-38fc4fb44eec" Dec 01 08:29:18 crc kubenswrapper[5004]: E1201 08:29:18.829462 5004 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-kfwr8_openshift-operators_c2573e8e-c093-4ddb-b02d-f4f7e270b97a_0(804fbf277dabffe00578ed1b94080ea10c7b5c98816377c3dc2ccd36dfc46424): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 01 08:29:18 crc kubenswrapper[5004]: E1201 08:29:18.829532 5004 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-kfwr8_openshift-operators_c2573e8e-c093-4ddb-b02d-f4f7e270b97a_0(804fbf277dabffe00578ed1b94080ea10c7b5c98816377c3dc2ccd36dfc46424): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-d8bb48f5d-kfwr8" Dec 01 08:29:18 crc kubenswrapper[5004]: E1201 08:29:18.829554 5004 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-kfwr8_openshift-operators_c2573e8e-c093-4ddb-b02d-f4f7e270b97a_0(804fbf277dabffe00578ed1b94080ea10c7b5c98816377c3dc2ccd36dfc46424): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/observability-operator-d8bb48f5d-kfwr8" Dec 01 08:29:18 crc kubenswrapper[5004]: E1201 08:29:18.829681 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"observability-operator-d8bb48f5d-kfwr8_openshift-operators(c2573e8e-c093-4ddb-b02d-f4f7e270b97a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"observability-operator-d8bb48f5d-kfwr8_openshift-operators(c2573e8e-c093-4ddb-b02d-f4f7e270b97a)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-kfwr8_openshift-operators_c2573e8e-c093-4ddb-b02d-f4f7e270b97a_0(804fbf277dabffe00578ed1b94080ea10c7b5c98816377c3dc2ccd36dfc46424): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/observability-operator-d8bb48f5d-kfwr8" podUID="c2573e8e-c093-4ddb-b02d-f4f7e270b97a" Dec 01 08:29:19 crc kubenswrapper[5004]: I1201 08:29:19.757829 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-9pms6" Dec 01 08:29:19 crc kubenswrapper[5004]: I1201 08:29:19.758352 5004 scope.go:117] "RemoveContainer" containerID="c4499168a80cb7fe2301c6db0d0d9c80110f6f9bc8fc94b291f0b9b306dbb057" Dec 01 08:29:19 crc kubenswrapper[5004]: I1201 08:29:19.758779 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-9pms6" Dec 01 08:29:19 crc kubenswrapper[5004]: E1201 08:29:19.818974 5004 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-9pms6_openshift-operators_1579721a-7166-4152-9703-97b893433c9a_0(f3373dd0e9c0403503cff987c40b60068696fa78aad907089986071147009f2e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Dec 01 08:29:19 crc kubenswrapper[5004]: E1201 08:29:19.819040 5004 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-9pms6_openshift-operators_1579721a-7166-4152-9703-97b893433c9a_0(f3373dd0e9c0403503cff987c40b60068696fa78aad907089986071147009f2e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5446b9c989-9pms6" Dec 01 08:29:19 crc kubenswrapper[5004]: E1201 08:29:19.819061 5004 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-9pms6_openshift-operators_1579721a-7166-4152-9703-97b893433c9a_0(f3373dd0e9c0403503cff987c40b60068696fa78aad907089986071147009f2e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5446b9c989-9pms6" Dec 01 08:29:19 crc kubenswrapper[5004]: E1201 08:29:19.819104 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"perses-operator-5446b9c989-9pms6_openshift-operators(1579721a-7166-4152-9703-97b893433c9a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"perses-operator-5446b9c989-9pms6_openshift-operators(1579721a-7166-4152-9703-97b893433c9a)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-9pms6_openshift-operators_1579721a-7166-4152-9703-97b893433c9a_0(f3373dd0e9c0403503cff987c40b60068696fa78aad907089986071147009f2e): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/perses-operator-5446b9c989-9pms6" podUID="1579721a-7166-4152-9703-97b893433c9a" Dec 01 08:29:20 crc kubenswrapper[5004]: I1201 08:29:20.258634 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zjksw_70e79009-93be-49c4-a6b3-e8a06bcea7f4/kube-multus/2.log" Dec 01 08:29:20 crc kubenswrapper[5004]: I1201 08:29:20.259022 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zjksw" event={"ID":"70e79009-93be-49c4-a6b3-e8a06bcea7f4","Type":"ContainerStarted","Data":"f6e4f7d8e9bb17d2dc5f1d75168218e27055f81c31d1b639e3b041370e03e615"} Dec 01 08:29:26 crc kubenswrapper[5004]: I1201 08:29:26.758494 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5bf4d79569-4nf7p" Dec 01 08:29:26 crc kubenswrapper[5004]: I1201 08:29:26.759259 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5bf4d79569-4nf7p" Dec 01 08:29:27 crc kubenswrapper[5004]: I1201 08:29:27.177196 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-5bf4d79569-4nf7p"] Dec 01 08:29:27 crc kubenswrapper[5004]: W1201 08:29:27.190774 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3a6db099_c25a_4d19_8fa1_8269429274fc.slice/crio-536dbebbbc563214665b956cb4d448d9d8f0a318ecaec9969617dd55483df6ce WatchSource:0}: Error finding container 536dbebbbc563214665b956cb4d448d9d8f0a318ecaec9969617dd55483df6ce: Status 404 returned error can't find the container with id 536dbebbbc563214665b956cb4d448d9d8f0a318ecaec9969617dd55483df6ce Dec 01 08:29:27 crc kubenswrapper[5004]: I1201 08:29:27.300216 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-5bf4d79569-4nf7p" event={"ID":"3a6db099-c25a-4d19-8fa1-8269429274fc","Type":"ContainerStarted","Data":"536dbebbbc563214665b956cb4d448d9d8f0a318ecaec9969617dd55483df6ce"} Dec 01 08:29:27 crc kubenswrapper[5004]: I1201 08:29:27.826008 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-9mp7m" Dec 01 08:29:28 crc kubenswrapper[5004]: I1201 08:29:28.758338 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-gwtbl" Dec 01 08:29:28 crc kubenswrapper[5004]: I1201 08:29:28.759253 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-gwtbl" Dec 01 08:29:28 crc kubenswrapper[5004]: I1201 08:29:28.977658 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-gwtbl"] Dec 01 08:29:29 crc kubenswrapper[5004]: I1201 08:29:29.311778 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-gwtbl" event={"ID":"89a7714c-cec3-46e6-8cdb-016669fcf18e","Type":"ContainerStarted","Data":"ebcf69a6fc7530247b06d0d4e3354f0a5385b1ba0699ec0b0839a3f691daa46e"} Dec 01 08:29:31 crc kubenswrapper[5004]: I1201 08:29:31.460659 5004 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 01 08:29:31 crc kubenswrapper[5004]: I1201 08:29:31.758138 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5bf4d79569-bhs7g" Dec 01 08:29:31 crc kubenswrapper[5004]: I1201 08:29:31.758622 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5bf4d79569-bhs7g" Dec 01 08:29:33 crc kubenswrapper[5004]: I1201 08:29:33.759732 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-kfwr8" Dec 01 08:29:33 crc kubenswrapper[5004]: I1201 08:29:33.760258 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-9pms6" Dec 01 08:29:33 crc kubenswrapper[5004]: I1201 08:29:33.760333 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-kfwr8" Dec 01 08:29:33 crc kubenswrapper[5004]: I1201 08:29:33.760505 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-9pms6" Dec 01 08:29:35 crc kubenswrapper[5004]: I1201 08:29:35.797527 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5446b9c989-9pms6"] Dec 01 08:29:35 crc kubenswrapper[5004]: I1201 08:29:35.863226 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-kfwr8"] Dec 01 08:29:35 crc kubenswrapper[5004]: W1201 08:29:35.863392 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2573e8e_c093_4ddb_b02d_f4f7e270b97a.slice/crio-998bd0b757a55f007edff93f5d7728e1a77f34d9897a409ea9514afed7fb359e WatchSource:0}: Error finding container 998bd0b757a55f007edff93f5d7728e1a77f34d9897a409ea9514afed7fb359e: Status 404 returned error can't find the container with id 998bd0b757a55f007edff93f5d7728e1a77f34d9897a409ea9514afed7fb359e Dec 01 08:29:35 crc kubenswrapper[5004]: I1201 08:29:35.961465 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-operators/obo-prometheus-operator-admission-webhook-5bf4d79569-bhs7g"] Dec 01 08:29:35 crc kubenswrapper[5004]: W1201 08:29:35.967645 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod386e0e23_8226_45e3_a1b5_38fc4fb44eec.slice/crio-0173fd855a0323c4dfe137dafe6daf05b491229166d4b8d59068e7f29642aee7 WatchSource:0}: Error finding container 0173fd855a0323c4dfe137dafe6daf05b491229166d4b8d59068e7f29642aee7: Status 404 returned error can't find the container with id 0173fd855a0323c4dfe137dafe6daf05b491229166d4b8d59068e7f29642aee7 Dec 01 08:29:36 crc kubenswrapper[5004]: I1201 08:29:36.368728 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5446b9c989-9pms6" event={"ID":"1579721a-7166-4152-9703-97b893433c9a","Type":"ContainerStarted","Data":"22b17c3ef2d580f2cf6c07ba5dfc28b7573208b16baf8451b66e227c6fb3e5fc"} Dec 01 08:29:36 crc kubenswrapper[5004]: I1201 08:29:36.370814 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5bf4d79569-bhs7g" event={"ID":"386e0e23-8226-45e3-a1b5-38fc4fb44eec","Type":"ContainerStarted","Data":"951c0586b9a4a90551943580ec6851169dda0d6f52c0b978706f61fdd5779089"} Dec 01 08:29:36 crc kubenswrapper[5004]: I1201 08:29:36.370873 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5bf4d79569-bhs7g" event={"ID":"386e0e23-8226-45e3-a1b5-38fc4fb44eec","Type":"ContainerStarted","Data":"0173fd855a0323c4dfe137dafe6daf05b491229166d4b8d59068e7f29642aee7"} Dec 01 08:29:36 crc kubenswrapper[5004]: I1201 08:29:36.378612 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-gwtbl" 
event={"ID":"89a7714c-cec3-46e6-8cdb-016669fcf18e","Type":"ContainerStarted","Data":"594761ae0f405113201a9dace3dc3f3708b9906bfd5004616aacd26856ebebfb"} Dec 01 08:29:36 crc kubenswrapper[5004]: I1201 08:29:36.384258 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5bf4d79569-4nf7p" event={"ID":"3a6db099-c25a-4d19-8fa1-8269429274fc","Type":"ContainerStarted","Data":"4e3e200c9cb1ef0e584549cacf437e9aa2d5b8d7dec955d13161f2f4918d7bb1"} Dec 01 08:29:36 crc kubenswrapper[5004]: I1201 08:29:36.386760 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-d8bb48f5d-kfwr8" event={"ID":"c2573e8e-c093-4ddb-b02d-f4f7e270b97a","Type":"ContainerStarted","Data":"998bd0b757a55f007edff93f5d7728e1a77f34d9897a409ea9514afed7fb359e"} Dec 01 08:29:36 crc kubenswrapper[5004]: I1201 08:29:36.397250 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5bf4d79569-bhs7g" podStartSLOduration=34.397231045 podStartE2EDuration="34.397231045s" podCreationTimestamp="2025-12-01 08:29:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:29:36.39698537 +0000 UTC m=+753.961977402" watchObservedRunningTime="2025-12-01 08:29:36.397231045 +0000 UTC m=+753.962223027" Dec 01 08:29:36 crc kubenswrapper[5004]: I1201 08:29:36.440993 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-gwtbl" podStartSLOduration=27.925935051 podStartE2EDuration="34.440972909s" podCreationTimestamp="2025-12-01 08:29:02 +0000 UTC" firstStartedPulling="2025-12-01 08:29:28.997759942 +0000 UTC m=+746.562751924" lastFinishedPulling="2025-12-01 08:29:35.5127978 +0000 UTC m=+753.077789782" observedRunningTime="2025-12-01 08:29:36.434786091 +0000 UTC 
m=+753.999778113" watchObservedRunningTime="2025-12-01 08:29:36.440972909 +0000 UTC m=+754.005964911" Dec 01 08:29:36 crc kubenswrapper[5004]: I1201 08:29:36.464521 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5bf4d79569-4nf7p" podStartSLOduration=26.142879436 podStartE2EDuration="34.464498983s" podCreationTimestamp="2025-12-01 08:29:02 +0000 UTC" firstStartedPulling="2025-12-01 08:29:27.196117663 +0000 UTC m=+744.761109645" lastFinishedPulling="2025-12-01 08:29:35.51773721 +0000 UTC m=+753.082729192" observedRunningTime="2025-12-01 08:29:36.458319846 +0000 UTC m=+754.023311848" watchObservedRunningTime="2025-12-01 08:29:36.464498983 +0000 UTC m=+754.029490965" Dec 01 08:29:39 crc kubenswrapper[5004]: I1201 08:29:39.409703 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5446b9c989-9pms6" event={"ID":"1579721a-7166-4152-9703-97b893433c9a","Type":"ContainerStarted","Data":"ddc9c54675c24183698a20a47bcf66216e54e124f21ed0dffc61c412f3b1eb0c"} Dec 01 08:29:39 crc kubenswrapper[5004]: I1201 08:29:39.411356 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5446b9c989-9pms6" Dec 01 08:29:39 crc kubenswrapper[5004]: I1201 08:29:39.427163 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5446b9c989-9pms6" podStartSLOduration=34.514839944 podStartE2EDuration="37.427148436s" podCreationTimestamp="2025-12-01 08:29:02 +0000 UTC" firstStartedPulling="2025-12-01 08:29:35.812989665 +0000 UTC m=+753.377981647" lastFinishedPulling="2025-12-01 08:29:38.725298147 +0000 UTC m=+756.290290139" observedRunningTime="2025-12-01 08:29:39.423436573 +0000 UTC m=+756.988428555" watchObservedRunningTime="2025-12-01 08:29:39.427148436 +0000 UTC m=+756.992140408" Dec 01 08:29:41 crc kubenswrapper[5004]: I1201 08:29:41.424433 5004 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-d8bb48f5d-kfwr8" event={"ID":"c2573e8e-c093-4ddb-b02d-f4f7e270b97a","Type":"ContainerStarted","Data":"35b9f60b38bbd8f4561944aae7c754bab5a43abbcf48d33d50d7d987016ef954"} Dec 01 08:29:41 crc kubenswrapper[5004]: I1201 08:29:41.425224 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-d8bb48f5d-kfwr8" Dec 01 08:29:41 crc kubenswrapper[5004]: I1201 08:29:41.426693 5004 patch_prober.go:28] interesting pod/observability-operator-d8bb48f5d-kfwr8 container/operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.0.11:8081/healthz\": dial tcp 10.217.0.11:8081: connect: connection refused" start-of-body= Dec 01 08:29:41 crc kubenswrapper[5004]: I1201 08:29:41.426749 5004 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/observability-operator-d8bb48f5d-kfwr8" podUID="c2573e8e-c093-4ddb-b02d-f4f7e270b97a" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.11:8081/healthz\": dial tcp 10.217.0.11:8081: connect: connection refused" Dec 01 08:29:41 crc kubenswrapper[5004]: I1201 08:29:41.456765 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-d8bb48f5d-kfwr8" podStartSLOduration=34.12530055 podStartE2EDuration="39.456748152s" podCreationTimestamp="2025-12-01 08:29:02 +0000 UTC" firstStartedPulling="2025-12-01 08:29:35.866437945 +0000 UTC m=+753.431429937" lastFinishedPulling="2025-12-01 08:29:41.197885547 +0000 UTC m=+758.762877539" observedRunningTime="2025-12-01 08:29:41.441511962 +0000 UTC m=+759.006503954" watchObservedRunningTime="2025-12-01 08:29:41.456748152 +0000 UTC m=+759.021740134" Dec 01 08:29:42 crc kubenswrapper[5004]: I1201 08:29:42.432495 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-operators/observability-operator-d8bb48f5d-kfwr8" Dec 01 08:29:47 crc kubenswrapper[5004]: I1201 08:29:47.826067 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-nb5zh"] Dec 01 08:29:47 crc kubenswrapper[5004]: I1201 08:29:47.834094 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-nb5zh" Dec 01 08:29:47 crc kubenswrapper[5004]: I1201 08:29:47.837942 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Dec 01 08:29:47 crc kubenswrapper[5004]: I1201 08:29:47.838188 5004 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-blht8" Dec 01 08:29:47 crc kubenswrapper[5004]: I1201 08:29:47.838513 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Dec 01 08:29:47 crc kubenswrapper[5004]: I1201 08:29:47.841871 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-5b446d88c5-xgcbt"] Dec 01 08:29:47 crc kubenswrapper[5004]: I1201 08:29:47.843024 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-xgcbt" Dec 01 08:29:47 crc kubenswrapper[5004]: I1201 08:29:47.846671 5004 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-s679n" Dec 01 08:29:47 crc kubenswrapper[5004]: I1201 08:29:47.848533 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-nb5zh"] Dec 01 08:29:47 crc kubenswrapper[5004]: I1201 08:29:47.855542 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-v6wk2"] Dec 01 08:29:47 crc kubenswrapper[5004]: I1201 08:29:47.856413 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-v6wk2" Dec 01 08:29:47 crc kubenswrapper[5004]: I1201 08:29:47.859044 5004 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-46phh" Dec 01 08:29:47 crc kubenswrapper[5004]: I1201 08:29:47.870460 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-xgcbt"] Dec 01 08:29:47 crc kubenswrapper[5004]: I1201 08:29:47.876301 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-v6wk2"] Dec 01 08:29:47 crc kubenswrapper[5004]: I1201 08:29:47.914452 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftwcw\" (UniqueName: \"kubernetes.io/projected/efe385b8-0001-4bc8-9071-9c273ee27982-kube-api-access-ftwcw\") pod \"cert-manager-5b446d88c5-xgcbt\" (UID: \"efe385b8-0001-4bc8-9071-9c273ee27982\") " pod="cert-manager/cert-manager-5b446d88c5-xgcbt" Dec 01 08:29:47 crc kubenswrapper[5004]: I1201 08:29:47.914498 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nx9l\" (UniqueName: \"kubernetes.io/projected/877958bb-ae47-4876-93be-dfa4393fabca-kube-api-access-6nx9l\") pod \"cert-manager-cainjector-7f985d654d-nb5zh\" (UID: \"877958bb-ae47-4876-93be-dfa4393fabca\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-nb5zh" Dec 01 08:29:47 crc kubenswrapper[5004]: I1201 08:29:47.914547 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dchql\" (UniqueName: \"kubernetes.io/projected/6d54b10a-717e-4952-b3ac-49705bee10b5-kube-api-access-dchql\") pod \"cert-manager-webhook-5655c58dd6-v6wk2\" (UID: \"6d54b10a-717e-4952-b3ac-49705bee10b5\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-v6wk2" Dec 01 08:29:48 crc kubenswrapper[5004]: I1201 08:29:48.015273 5004 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6nx9l\" (UniqueName: \"kubernetes.io/projected/877958bb-ae47-4876-93be-dfa4393fabca-kube-api-access-6nx9l\") pod \"cert-manager-cainjector-7f985d654d-nb5zh\" (UID: \"877958bb-ae47-4876-93be-dfa4393fabca\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-nb5zh" Dec 01 08:29:48 crc kubenswrapper[5004]: I1201 08:29:48.015315 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dchql\" (UniqueName: \"kubernetes.io/projected/6d54b10a-717e-4952-b3ac-49705bee10b5-kube-api-access-dchql\") pod \"cert-manager-webhook-5655c58dd6-v6wk2\" (UID: \"6d54b10a-717e-4952-b3ac-49705bee10b5\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-v6wk2" Dec 01 08:29:48 crc kubenswrapper[5004]: I1201 08:29:48.015400 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ftwcw\" (UniqueName: \"kubernetes.io/projected/efe385b8-0001-4bc8-9071-9c273ee27982-kube-api-access-ftwcw\") pod \"cert-manager-5b446d88c5-xgcbt\" (UID: \"efe385b8-0001-4bc8-9071-9c273ee27982\") " pod="cert-manager/cert-manager-5b446d88c5-xgcbt" Dec 01 08:29:48 crc kubenswrapper[5004]: I1201 08:29:48.034605 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftwcw\" (UniqueName: \"kubernetes.io/projected/efe385b8-0001-4bc8-9071-9c273ee27982-kube-api-access-ftwcw\") pod \"cert-manager-5b446d88c5-xgcbt\" (UID: \"efe385b8-0001-4bc8-9071-9c273ee27982\") " pod="cert-manager/cert-manager-5b446d88c5-xgcbt" Dec 01 08:29:48 crc kubenswrapper[5004]: I1201 08:29:48.037838 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dchql\" (UniqueName: \"kubernetes.io/projected/6d54b10a-717e-4952-b3ac-49705bee10b5-kube-api-access-dchql\") pod \"cert-manager-webhook-5655c58dd6-v6wk2\" (UID: \"6d54b10a-717e-4952-b3ac-49705bee10b5\") " 
pod="cert-manager/cert-manager-webhook-5655c58dd6-v6wk2" Dec 01 08:29:48 crc kubenswrapper[5004]: I1201 08:29:48.042839 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nx9l\" (UniqueName: \"kubernetes.io/projected/877958bb-ae47-4876-93be-dfa4393fabca-kube-api-access-6nx9l\") pod \"cert-manager-cainjector-7f985d654d-nb5zh\" (UID: \"877958bb-ae47-4876-93be-dfa4393fabca\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-nb5zh" Dec 01 08:29:48 crc kubenswrapper[5004]: I1201 08:29:48.154810 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-nb5zh" Dec 01 08:29:48 crc kubenswrapper[5004]: I1201 08:29:48.168960 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-xgcbt" Dec 01 08:29:48 crc kubenswrapper[5004]: I1201 08:29:48.173668 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-v6wk2" Dec 01 08:29:48 crc kubenswrapper[5004]: I1201 08:29:48.603870 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-nb5zh"] Dec 01 08:29:48 crc kubenswrapper[5004]: W1201 08:29:48.611899 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod877958bb_ae47_4876_93be_dfa4393fabca.slice/crio-d5a036d0efe698eef159fe469f61358db68c6aa1f876bcc409e4c108dac411a6 WatchSource:0}: Error finding container d5a036d0efe698eef159fe469f61358db68c6aa1f876bcc409e4c108dac411a6: Status 404 returned error can't find the container with id d5a036d0efe698eef159fe469f61358db68c6aa1f876bcc409e4c108dac411a6 Dec 01 08:29:48 crc kubenswrapper[5004]: I1201 08:29:48.658413 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-xgcbt"] Dec 01 08:29:48 crc kubenswrapper[5004]: I1201 
08:29:48.664817 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-v6wk2"] Dec 01 08:29:49 crc kubenswrapper[5004]: I1201 08:29:49.479574 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-xgcbt" event={"ID":"efe385b8-0001-4bc8-9071-9c273ee27982","Type":"ContainerStarted","Data":"0ba3a4458917058e89be7e3e5b3c05e19bd78480cc9796cd502146a16423bb75"} Dec 01 08:29:49 crc kubenswrapper[5004]: I1201 08:29:49.481275 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-nb5zh" event={"ID":"877958bb-ae47-4876-93be-dfa4393fabca","Type":"ContainerStarted","Data":"d5a036d0efe698eef159fe469f61358db68c6aa1f876bcc409e4c108dac411a6"} Dec 01 08:29:49 crc kubenswrapper[5004]: I1201 08:29:49.483152 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-v6wk2" event={"ID":"6d54b10a-717e-4952-b3ac-49705bee10b5","Type":"ContainerStarted","Data":"5bbaae0d103776cf12589fb0b72b12c4eb90d0c1b963438ecd8db814d53b69bc"} Dec 01 08:29:52 crc kubenswrapper[5004]: I1201 08:29:52.959785 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5446b9c989-9pms6" Dec 01 08:29:57 crc kubenswrapper[5004]: I1201 08:29:57.531746 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-nb5zh" event={"ID":"877958bb-ae47-4876-93be-dfa4393fabca","Type":"ContainerStarted","Data":"d036ea5d2513be36f21adaac99b3fca917213b78cba73cc42bde1416783aad0b"} Dec 01 08:29:57 crc kubenswrapper[5004]: I1201 08:29:57.535730 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-v6wk2" event={"ID":"6d54b10a-717e-4952-b3ac-49705bee10b5","Type":"ContainerStarted","Data":"254599bfd2e55a7603d8705d70fe439f809b9db8caf4b76675b8ebfeee0fd3be"} Dec 01 08:29:57 crc kubenswrapper[5004]: 
I1201 08:29:57.535875 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-5655c58dd6-v6wk2" Dec 01 08:29:57 crc kubenswrapper[5004]: I1201 08:29:57.537358 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-xgcbt" event={"ID":"efe385b8-0001-4bc8-9071-9c273ee27982","Type":"ContainerStarted","Data":"d5cf444389e53432fcde9ce0965978d94ccfa170bd87ee94c7826f90e2e9b203"} Dec 01 08:29:57 crc kubenswrapper[5004]: I1201 08:29:57.546716 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7f985d654d-nb5zh" podStartSLOduration=2.2717029650000002 podStartE2EDuration="10.546694234s" podCreationTimestamp="2025-12-01 08:29:47 +0000 UTC" firstStartedPulling="2025-12-01 08:29:48.61445864 +0000 UTC m=+766.179450632" lastFinishedPulling="2025-12-01 08:29:56.889449919 +0000 UTC m=+774.454441901" observedRunningTime="2025-12-01 08:29:57.545313323 +0000 UTC m=+775.110305315" watchObservedRunningTime="2025-12-01 08:29:57.546694234 +0000 UTC m=+775.111686256" Dec 01 08:29:57 crc kubenswrapper[5004]: I1201 08:29:57.557868 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-5b446d88c5-xgcbt" podStartSLOduration=2.327399485 podStartE2EDuration="10.557851562s" podCreationTimestamp="2025-12-01 08:29:47 +0000 UTC" firstStartedPulling="2025-12-01 08:29:48.671100431 +0000 UTC m=+766.236092413" lastFinishedPulling="2025-12-01 08:29:56.901552508 +0000 UTC m=+774.466544490" observedRunningTime="2025-12-01 08:29:57.555389868 +0000 UTC m=+775.120381860" watchObservedRunningTime="2025-12-01 08:29:57.557851562 +0000 UTC m=+775.122843564" Dec 01 08:29:57 crc kubenswrapper[5004]: I1201 08:29:57.575939 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-5655c58dd6-v6wk2" podStartSLOduration=2.350650064 podStartE2EDuration="10.575915405s" 
podCreationTimestamp="2025-12-01 08:29:47 +0000 UTC" firstStartedPulling="2025-12-01 08:29:48.672468142 +0000 UTC m=+766.237460124" lastFinishedPulling="2025-12-01 08:29:56.897733483 +0000 UTC m=+774.462725465" observedRunningTime="2025-12-01 08:29:57.571023096 +0000 UTC m=+775.136015088" watchObservedRunningTime="2025-12-01 08:29:57.575915405 +0000 UTC m=+775.140907387" Dec 01 08:30:00 crc kubenswrapper[5004]: I1201 08:30:00.186874 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409630-7x9vn"] Dec 01 08:30:00 crc kubenswrapper[5004]: I1201 08:30:00.188151 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409630-7x9vn" Dec 01 08:30:00 crc kubenswrapper[5004]: I1201 08:30:00.191169 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 01 08:30:00 crc kubenswrapper[5004]: I1201 08:30:00.191504 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 01 08:30:00 crc kubenswrapper[5004]: I1201 08:30:00.199877 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409630-7x9vn"] Dec 01 08:30:00 crc kubenswrapper[5004]: I1201 08:30:00.203974 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-klv5x\" (UniqueName: \"kubernetes.io/projected/38460618-7ff3-4591-98f3-f20af1bfff60-kube-api-access-klv5x\") pod \"collect-profiles-29409630-7x9vn\" (UID: \"38460618-7ff3-4591-98f3-f20af1bfff60\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409630-7x9vn" Dec 01 08:30:00 crc kubenswrapper[5004]: I1201 08:30:00.204055 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-volume\" (UniqueName: \"kubernetes.io/configmap/38460618-7ff3-4591-98f3-f20af1bfff60-config-volume\") pod \"collect-profiles-29409630-7x9vn\" (UID: \"38460618-7ff3-4591-98f3-f20af1bfff60\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409630-7x9vn" Dec 01 08:30:00 crc kubenswrapper[5004]: I1201 08:30:00.204079 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/38460618-7ff3-4591-98f3-f20af1bfff60-secret-volume\") pod \"collect-profiles-29409630-7x9vn\" (UID: \"38460618-7ff3-4591-98f3-f20af1bfff60\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409630-7x9vn" Dec 01 08:30:00 crc kubenswrapper[5004]: I1201 08:30:00.305019 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/38460618-7ff3-4591-98f3-f20af1bfff60-secret-volume\") pod \"collect-profiles-29409630-7x9vn\" (UID: \"38460618-7ff3-4591-98f3-f20af1bfff60\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409630-7x9vn" Dec 01 08:30:00 crc kubenswrapper[5004]: I1201 08:30:00.305095 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-klv5x\" (UniqueName: \"kubernetes.io/projected/38460618-7ff3-4591-98f3-f20af1bfff60-kube-api-access-klv5x\") pod \"collect-profiles-29409630-7x9vn\" (UID: \"38460618-7ff3-4591-98f3-f20af1bfff60\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409630-7x9vn" Dec 01 08:30:00 crc kubenswrapper[5004]: I1201 08:30:00.305155 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/38460618-7ff3-4591-98f3-f20af1bfff60-config-volume\") pod \"collect-profiles-29409630-7x9vn\" (UID: \"38460618-7ff3-4591-98f3-f20af1bfff60\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409630-7x9vn" Dec 01 
08:30:00 crc kubenswrapper[5004]: I1201 08:30:00.305993 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/38460618-7ff3-4591-98f3-f20af1bfff60-config-volume\") pod \"collect-profiles-29409630-7x9vn\" (UID: \"38460618-7ff3-4591-98f3-f20af1bfff60\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409630-7x9vn" Dec 01 08:30:00 crc kubenswrapper[5004]: I1201 08:30:00.310804 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/38460618-7ff3-4591-98f3-f20af1bfff60-secret-volume\") pod \"collect-profiles-29409630-7x9vn\" (UID: \"38460618-7ff3-4591-98f3-f20af1bfff60\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409630-7x9vn" Dec 01 08:30:00 crc kubenswrapper[5004]: I1201 08:30:00.319952 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-klv5x\" (UniqueName: \"kubernetes.io/projected/38460618-7ff3-4591-98f3-f20af1bfff60-kube-api-access-klv5x\") pod \"collect-profiles-29409630-7x9vn\" (UID: \"38460618-7ff3-4591-98f3-f20af1bfff60\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409630-7x9vn" Dec 01 08:30:00 crc kubenswrapper[5004]: I1201 08:30:00.506483 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409630-7x9vn" Dec 01 08:30:00 crc kubenswrapper[5004]: W1201 08:30:00.767504 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod38460618_7ff3_4591_98f3_f20af1bfff60.slice/crio-a1e786d0b07379889c907e259d37df15e96edad1ab6e4b0c323d491610052de8 WatchSource:0}: Error finding container a1e786d0b07379889c907e259d37df15e96edad1ab6e4b0c323d491610052de8: Status 404 returned error can't find the container with id a1e786d0b07379889c907e259d37df15e96edad1ab6e4b0c323d491610052de8 Dec 01 08:30:00 crc kubenswrapper[5004]: I1201 08:30:00.769000 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409630-7x9vn"] Dec 01 08:30:01 crc kubenswrapper[5004]: I1201 08:30:01.570643 5004 generic.go:334] "Generic (PLEG): container finished" podID="38460618-7ff3-4591-98f3-f20af1bfff60" containerID="61e9fd11bbe94fabce0f763ab29f3e72e66d74ee0807111eadaadd3b22e7b12d" exitCode=0 Dec 01 08:30:01 crc kubenswrapper[5004]: I1201 08:30:01.570769 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409630-7x9vn" event={"ID":"38460618-7ff3-4591-98f3-f20af1bfff60","Type":"ContainerDied","Data":"61e9fd11bbe94fabce0f763ab29f3e72e66d74ee0807111eadaadd3b22e7b12d"} Dec 01 08:30:01 crc kubenswrapper[5004]: I1201 08:30:01.570978 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409630-7x9vn" event={"ID":"38460618-7ff3-4591-98f3-f20af1bfff60","Type":"ContainerStarted","Data":"a1e786d0b07379889c907e259d37df15e96edad1ab6e4b0c323d491610052de8"} Dec 01 08:30:02 crc kubenswrapper[5004]: I1201 08:30:02.847998 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409630-7x9vn" Dec 01 08:30:02 crc kubenswrapper[5004]: I1201 08:30:02.865365 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-klv5x\" (UniqueName: \"kubernetes.io/projected/38460618-7ff3-4591-98f3-f20af1bfff60-kube-api-access-klv5x\") pod \"38460618-7ff3-4591-98f3-f20af1bfff60\" (UID: \"38460618-7ff3-4591-98f3-f20af1bfff60\") " Dec 01 08:30:02 crc kubenswrapper[5004]: I1201 08:30:02.868768 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/38460618-7ff3-4591-98f3-f20af1bfff60-secret-volume\") pod \"38460618-7ff3-4591-98f3-f20af1bfff60\" (UID: \"38460618-7ff3-4591-98f3-f20af1bfff60\") " Dec 01 08:30:02 crc kubenswrapper[5004]: I1201 08:30:02.868926 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/38460618-7ff3-4591-98f3-f20af1bfff60-config-volume\") pod \"38460618-7ff3-4591-98f3-f20af1bfff60\" (UID: \"38460618-7ff3-4591-98f3-f20af1bfff60\") " Dec 01 08:30:02 crc kubenswrapper[5004]: I1201 08:30:02.869343 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38460618-7ff3-4591-98f3-f20af1bfff60-config-volume" (OuterVolumeSpecName: "config-volume") pod "38460618-7ff3-4591-98f3-f20af1bfff60" (UID: "38460618-7ff3-4591-98f3-f20af1bfff60"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:30:02 crc kubenswrapper[5004]: I1201 08:30:02.870012 5004 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/38460618-7ff3-4591-98f3-f20af1bfff60-config-volume\") on node \"crc\" DevicePath \"\"" Dec 01 08:30:02 crc kubenswrapper[5004]: I1201 08:30:02.871866 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38460618-7ff3-4591-98f3-f20af1bfff60-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "38460618-7ff3-4591-98f3-f20af1bfff60" (UID: "38460618-7ff3-4591-98f3-f20af1bfff60"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:30:02 crc kubenswrapper[5004]: I1201 08:30:02.873172 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38460618-7ff3-4591-98f3-f20af1bfff60-kube-api-access-klv5x" (OuterVolumeSpecName: "kube-api-access-klv5x") pod "38460618-7ff3-4591-98f3-f20af1bfff60" (UID: "38460618-7ff3-4591-98f3-f20af1bfff60"). InnerVolumeSpecName "kube-api-access-klv5x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:30:02 crc kubenswrapper[5004]: I1201 08:30:02.970717 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-klv5x\" (UniqueName: \"kubernetes.io/projected/38460618-7ff3-4591-98f3-f20af1bfff60-kube-api-access-klv5x\") on node \"crc\" DevicePath \"\"" Dec 01 08:30:02 crc kubenswrapper[5004]: I1201 08:30:02.970752 5004 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/38460618-7ff3-4591-98f3-f20af1bfff60-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 01 08:30:03 crc kubenswrapper[5004]: I1201 08:30:03.179420 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-5655c58dd6-v6wk2" Dec 01 08:30:03 crc kubenswrapper[5004]: I1201 08:30:03.586526 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409630-7x9vn" event={"ID":"38460618-7ff3-4591-98f3-f20af1bfff60","Type":"ContainerDied","Data":"a1e786d0b07379889c907e259d37df15e96edad1ab6e4b0c323d491610052de8"} Dec 01 08:30:03 crc kubenswrapper[5004]: I1201 08:30:03.586909 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a1e786d0b07379889c907e259d37df15e96edad1ab6e4b0c323d491610052de8" Dec 01 08:30:03 crc kubenswrapper[5004]: I1201 08:30:03.586658 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409630-7x9vn" Dec 01 08:30:08 crc kubenswrapper[5004]: I1201 08:30:08.729462 5004 patch_prober.go:28] interesting pod/machine-config-daemon-fvdgt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 08:30:08 crc kubenswrapper[5004]: I1201 08:30:08.729944 5004 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 08:30:31 crc kubenswrapper[5004]: I1201 08:30:31.717808 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fn57ss"] Dec 01 08:30:31 crc kubenswrapper[5004]: E1201 08:30:31.718879 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38460618-7ff3-4591-98f3-f20af1bfff60" containerName="collect-profiles" Dec 01 08:30:31 crc kubenswrapper[5004]: I1201 08:30:31.718900 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="38460618-7ff3-4591-98f3-f20af1bfff60" containerName="collect-profiles" Dec 01 08:30:31 crc kubenswrapper[5004]: I1201 08:30:31.719132 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="38460618-7ff3-4591-98f3-f20af1bfff60" containerName="collect-profiles" Dec 01 08:30:31 crc kubenswrapper[5004]: I1201 08:30:31.720838 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fn57ss" Dec 01 08:30:31 crc kubenswrapper[5004]: I1201 08:30:31.726199 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 01 08:30:31 crc kubenswrapper[5004]: I1201 08:30:31.731680 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fn57ss"] Dec 01 08:30:31 crc kubenswrapper[5004]: I1201 08:30:31.762898 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkhsn\" (UniqueName: \"kubernetes.io/projected/78cb161c-a9d6-4fd5-9144-6564ca31cd33-kube-api-access-dkhsn\") pod \"a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fn57ss\" (UID: \"78cb161c-a9d6-4fd5-9144-6564ca31cd33\") " pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fn57ss" Dec 01 08:30:31 crc kubenswrapper[5004]: I1201 08:30:31.762962 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/78cb161c-a9d6-4fd5-9144-6564ca31cd33-bundle\") pod \"a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fn57ss\" (UID: \"78cb161c-a9d6-4fd5-9144-6564ca31cd33\") " pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fn57ss" Dec 01 08:30:31 crc kubenswrapper[5004]: I1201 08:30:31.763023 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/78cb161c-a9d6-4fd5-9144-6564ca31cd33-util\") pod \"a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fn57ss\" (UID: \"78cb161c-a9d6-4fd5-9144-6564ca31cd33\") " pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fn57ss" Dec 01 08:30:31 crc kubenswrapper[5004]: 
I1201 08:30:31.864820 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkhsn\" (UniqueName: \"kubernetes.io/projected/78cb161c-a9d6-4fd5-9144-6564ca31cd33-kube-api-access-dkhsn\") pod \"a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fn57ss\" (UID: \"78cb161c-a9d6-4fd5-9144-6564ca31cd33\") " pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fn57ss" Dec 01 08:30:31 crc kubenswrapper[5004]: I1201 08:30:31.864890 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/78cb161c-a9d6-4fd5-9144-6564ca31cd33-bundle\") pod \"a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fn57ss\" (UID: \"78cb161c-a9d6-4fd5-9144-6564ca31cd33\") " pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fn57ss" Dec 01 08:30:31 crc kubenswrapper[5004]: I1201 08:30:31.864967 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/78cb161c-a9d6-4fd5-9144-6564ca31cd33-util\") pod \"a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fn57ss\" (UID: \"78cb161c-a9d6-4fd5-9144-6564ca31cd33\") " pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fn57ss" Dec 01 08:30:31 crc kubenswrapper[5004]: I1201 08:30:31.865661 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/78cb161c-a9d6-4fd5-9144-6564ca31cd33-util\") pod \"a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fn57ss\" (UID: \"78cb161c-a9d6-4fd5-9144-6564ca31cd33\") " pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fn57ss" Dec 01 08:30:31 crc kubenswrapper[5004]: I1201 08:30:31.866444 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/78cb161c-a9d6-4fd5-9144-6564ca31cd33-bundle\") pod \"a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fn57ss\" (UID: \"78cb161c-a9d6-4fd5-9144-6564ca31cd33\") " pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fn57ss" Dec 01 08:30:31 crc kubenswrapper[5004]: I1201 08:30:31.904499 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8bsmpc"] Dec 01 08:30:31 crc kubenswrapper[5004]: I1201 08:30:31.905575 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkhsn\" (UniqueName: \"kubernetes.io/projected/78cb161c-a9d6-4fd5-9144-6564ca31cd33-kube-api-access-dkhsn\") pod \"a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fn57ss\" (UID: \"78cb161c-a9d6-4fd5-9144-6564ca31cd33\") " pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fn57ss" Dec 01 08:30:31 crc kubenswrapper[5004]: I1201 08:30:31.906139 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8bsmpc" Dec 01 08:30:31 crc kubenswrapper[5004]: I1201 08:30:31.919858 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8bsmpc"] Dec 01 08:30:31 crc kubenswrapper[5004]: I1201 08:30:31.966610 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8659be81-88bd-4a0a-b117-c72f2c9e9035-bundle\") pod \"4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8bsmpc\" (UID: \"8659be81-88bd-4a0a-b117-c72f2c9e9035\") " pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8bsmpc" Dec 01 08:30:31 crc kubenswrapper[5004]: I1201 08:30:31.966923 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8659be81-88bd-4a0a-b117-c72f2c9e9035-util\") pod \"4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8bsmpc\" (UID: \"8659be81-88bd-4a0a-b117-c72f2c9e9035\") " pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8bsmpc" Dec 01 08:30:31 crc kubenswrapper[5004]: I1201 08:30:31.966945 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9f76s\" (UniqueName: \"kubernetes.io/projected/8659be81-88bd-4a0a-b117-c72f2c9e9035-kube-api-access-9f76s\") pod \"4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8bsmpc\" (UID: \"8659be81-88bd-4a0a-b117-c72f2c9e9035\") " pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8bsmpc" Dec 01 08:30:32 crc kubenswrapper[5004]: I1201 08:30:32.067939 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/8659be81-88bd-4a0a-b117-c72f2c9e9035-util\") pod \"4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8bsmpc\" (UID: \"8659be81-88bd-4a0a-b117-c72f2c9e9035\") " pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8bsmpc" Dec 01 08:30:32 crc kubenswrapper[5004]: I1201 08:30:32.067980 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9f76s\" (UniqueName: \"kubernetes.io/projected/8659be81-88bd-4a0a-b117-c72f2c9e9035-kube-api-access-9f76s\") pod \"4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8bsmpc\" (UID: \"8659be81-88bd-4a0a-b117-c72f2c9e9035\") " pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8bsmpc" Dec 01 08:30:32 crc kubenswrapper[5004]: I1201 08:30:32.068058 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8659be81-88bd-4a0a-b117-c72f2c9e9035-bundle\") pod \"4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8bsmpc\" (UID: \"8659be81-88bd-4a0a-b117-c72f2c9e9035\") " pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8bsmpc" Dec 01 08:30:32 crc kubenswrapper[5004]: I1201 08:30:32.068431 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8659be81-88bd-4a0a-b117-c72f2c9e9035-util\") pod \"4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8bsmpc\" (UID: \"8659be81-88bd-4a0a-b117-c72f2c9e9035\") " pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8bsmpc" Dec 01 08:30:32 crc kubenswrapper[5004]: I1201 08:30:32.068520 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8659be81-88bd-4a0a-b117-c72f2c9e9035-bundle\") pod \"4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8bsmpc\" (UID: 
\"8659be81-88bd-4a0a-b117-c72f2c9e9035\") " pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8bsmpc" Dec 01 08:30:32 crc kubenswrapper[5004]: I1201 08:30:32.077238 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fn57ss" Dec 01 08:30:32 crc kubenswrapper[5004]: I1201 08:30:32.101804 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9f76s\" (UniqueName: \"kubernetes.io/projected/8659be81-88bd-4a0a-b117-c72f2c9e9035-kube-api-access-9f76s\") pod \"4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8bsmpc\" (UID: \"8659be81-88bd-4a0a-b117-c72f2c9e9035\") " pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8bsmpc" Dec 01 08:30:32 crc kubenswrapper[5004]: I1201 08:30:32.246910 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8bsmpc" Dec 01 08:30:32 crc kubenswrapper[5004]: I1201 08:30:32.342030 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fn57ss"] Dec 01 08:30:32 crc kubenswrapper[5004]: I1201 08:30:32.653921 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8bsmpc"] Dec 01 08:30:32 crc kubenswrapper[5004]: I1201 08:30:32.819523 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8bsmpc" event={"ID":"8659be81-88bd-4a0a-b117-c72f2c9e9035","Type":"ContainerStarted","Data":"9cdef799fb5383599c29997724bc10acb695e8e822b0b06fd4034695cf87b87c"} Dec 01 08:30:32 crc kubenswrapper[5004]: I1201 08:30:32.819863 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8bsmpc" event={"ID":"8659be81-88bd-4a0a-b117-c72f2c9e9035","Type":"ContainerStarted","Data":"3f1d024b7706920adf1bacdeb49532645db1dce0820286f795eb65ce766c1536"} Dec 01 08:30:32 crc kubenswrapper[5004]: I1201 08:30:32.826681 5004 generic.go:334] "Generic (PLEG): container finished" podID="78cb161c-a9d6-4fd5-9144-6564ca31cd33" containerID="6fc818cd9ddf3c2ad64109f2dc58f24a5552f0ecaa2026f6a92589968d6e2fb2" exitCode=0 Dec 01 08:30:32 crc kubenswrapper[5004]: I1201 08:30:32.826777 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fn57ss" event={"ID":"78cb161c-a9d6-4fd5-9144-6564ca31cd33","Type":"ContainerDied","Data":"6fc818cd9ddf3c2ad64109f2dc58f24a5552f0ecaa2026f6a92589968d6e2fb2"} Dec 01 08:30:32 crc kubenswrapper[5004]: I1201 08:30:32.826847 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fn57ss" event={"ID":"78cb161c-a9d6-4fd5-9144-6564ca31cd33","Type":"ContainerStarted","Data":"a64b728e089f26b8da5bae5ec40f19c832d6d3547a7dac9fa859674b719965ea"} Dec 01 08:30:33 crc kubenswrapper[5004]: I1201 08:30:33.837460 5004 generic.go:334] "Generic (PLEG): container finished" podID="8659be81-88bd-4a0a-b117-c72f2c9e9035" containerID="9cdef799fb5383599c29997724bc10acb695e8e822b0b06fd4034695cf87b87c" exitCode=0 Dec 01 08:30:33 crc kubenswrapper[5004]: I1201 08:30:33.837593 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8bsmpc" event={"ID":"8659be81-88bd-4a0a-b117-c72f2c9e9035","Type":"ContainerDied","Data":"9cdef799fb5383599c29997724bc10acb695e8e822b0b06fd4034695cf87b87c"} Dec 01 08:30:34 crc kubenswrapper[5004]: I1201 08:30:34.847861 5004 generic.go:334] "Generic (PLEG): container finished" 
podID="78cb161c-a9d6-4fd5-9144-6564ca31cd33" containerID="02f9320b00c1fd4e763910ab8548c292febc6720ce6874403350cab1d29ce1d9" exitCode=0 Dec 01 08:30:34 crc kubenswrapper[5004]: I1201 08:30:34.847910 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fn57ss" event={"ID":"78cb161c-a9d6-4fd5-9144-6564ca31cd33","Type":"ContainerDied","Data":"02f9320b00c1fd4e763910ab8548c292febc6720ce6874403350cab1d29ce1d9"} Dec 01 08:30:35 crc kubenswrapper[5004]: I1201 08:30:35.451655 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-dwk28"] Dec 01 08:30:35 crc kubenswrapper[5004]: I1201 08:30:35.454285 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dwk28" Dec 01 08:30:35 crc kubenswrapper[5004]: I1201 08:30:35.466811 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dwk28"] Dec 01 08:30:35 crc kubenswrapper[5004]: I1201 08:30:35.565817 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e560049a-ccf6-4dc3-b0aa-fb5f5482ead1-catalog-content\") pod \"redhat-operators-dwk28\" (UID: \"e560049a-ccf6-4dc3-b0aa-fb5f5482ead1\") " pod="openshift-marketplace/redhat-operators-dwk28" Dec 01 08:30:35 crc kubenswrapper[5004]: I1201 08:30:35.565920 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmkv6\" (UniqueName: \"kubernetes.io/projected/e560049a-ccf6-4dc3-b0aa-fb5f5482ead1-kube-api-access-hmkv6\") pod \"redhat-operators-dwk28\" (UID: \"e560049a-ccf6-4dc3-b0aa-fb5f5482ead1\") " pod="openshift-marketplace/redhat-operators-dwk28" Dec 01 08:30:35 crc kubenswrapper[5004]: I1201 08:30:35.566124 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e560049a-ccf6-4dc3-b0aa-fb5f5482ead1-utilities\") pod \"redhat-operators-dwk28\" (UID: \"e560049a-ccf6-4dc3-b0aa-fb5f5482ead1\") " pod="openshift-marketplace/redhat-operators-dwk28" Dec 01 08:30:35 crc kubenswrapper[5004]: I1201 08:30:35.668101 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e560049a-ccf6-4dc3-b0aa-fb5f5482ead1-catalog-content\") pod \"redhat-operators-dwk28\" (UID: \"e560049a-ccf6-4dc3-b0aa-fb5f5482ead1\") " pod="openshift-marketplace/redhat-operators-dwk28" Dec 01 08:30:35 crc kubenswrapper[5004]: I1201 08:30:35.668462 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmkv6\" (UniqueName: \"kubernetes.io/projected/e560049a-ccf6-4dc3-b0aa-fb5f5482ead1-kube-api-access-hmkv6\") pod \"redhat-operators-dwk28\" (UID: \"e560049a-ccf6-4dc3-b0aa-fb5f5482ead1\") " pod="openshift-marketplace/redhat-operators-dwk28" Dec 01 08:30:35 crc kubenswrapper[5004]: I1201 08:30:35.668726 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e560049a-ccf6-4dc3-b0aa-fb5f5482ead1-utilities\") pod \"redhat-operators-dwk28\" (UID: \"e560049a-ccf6-4dc3-b0aa-fb5f5482ead1\") " pod="openshift-marketplace/redhat-operators-dwk28" Dec 01 08:30:35 crc kubenswrapper[5004]: I1201 08:30:35.668796 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e560049a-ccf6-4dc3-b0aa-fb5f5482ead1-catalog-content\") pod \"redhat-operators-dwk28\" (UID: \"e560049a-ccf6-4dc3-b0aa-fb5f5482ead1\") " pod="openshift-marketplace/redhat-operators-dwk28" Dec 01 08:30:35 crc kubenswrapper[5004]: I1201 08:30:35.669077 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/e560049a-ccf6-4dc3-b0aa-fb5f5482ead1-utilities\") pod \"redhat-operators-dwk28\" (UID: \"e560049a-ccf6-4dc3-b0aa-fb5f5482ead1\") " pod="openshift-marketplace/redhat-operators-dwk28" Dec 01 08:30:35 crc kubenswrapper[5004]: I1201 08:30:35.695339 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmkv6\" (UniqueName: \"kubernetes.io/projected/e560049a-ccf6-4dc3-b0aa-fb5f5482ead1-kube-api-access-hmkv6\") pod \"redhat-operators-dwk28\" (UID: \"e560049a-ccf6-4dc3-b0aa-fb5f5482ead1\") " pod="openshift-marketplace/redhat-operators-dwk28" Dec 01 08:30:35 crc kubenswrapper[5004]: I1201 08:30:35.800553 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dwk28" Dec 01 08:30:35 crc kubenswrapper[5004]: I1201 08:30:35.858742 5004 generic.go:334] "Generic (PLEG): container finished" podID="8659be81-88bd-4a0a-b117-c72f2c9e9035" containerID="6464afbfb53b20ce393a96abf805a26733a34503f00dc8f6e7f1bc1228cbd6a2" exitCode=0 Dec 01 08:30:35 crc kubenswrapper[5004]: I1201 08:30:35.858803 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8bsmpc" event={"ID":"8659be81-88bd-4a0a-b117-c72f2c9e9035","Type":"ContainerDied","Data":"6464afbfb53b20ce393a96abf805a26733a34503f00dc8f6e7f1bc1228cbd6a2"} Dec 01 08:30:35 crc kubenswrapper[5004]: I1201 08:30:35.870006 5004 generic.go:334] "Generic (PLEG): container finished" podID="78cb161c-a9d6-4fd5-9144-6564ca31cd33" containerID="7df851b599e9bc91ffa92afab4dc7e3ad11f6230da889ce63dc272ea248ea3cf" exitCode=0 Dec 01 08:30:35 crc kubenswrapper[5004]: I1201 08:30:35.870042 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fn57ss" 
event={"ID":"78cb161c-a9d6-4fd5-9144-6564ca31cd33","Type":"ContainerDied","Data":"7df851b599e9bc91ffa92afab4dc7e3ad11f6230da889ce63dc272ea248ea3cf"} Dec 01 08:30:36 crc kubenswrapper[5004]: I1201 08:30:36.080305 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dwk28"] Dec 01 08:30:36 crc kubenswrapper[5004]: I1201 08:30:36.880388 5004 generic.go:334] "Generic (PLEG): container finished" podID="8659be81-88bd-4a0a-b117-c72f2c9e9035" containerID="e0f2aaca4b4bbad0059d0e1f3d96ae775a2868497f07bcab4dbb237a79796e3d" exitCode=0 Dec 01 08:30:36 crc kubenswrapper[5004]: I1201 08:30:36.880462 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8bsmpc" event={"ID":"8659be81-88bd-4a0a-b117-c72f2c9e9035","Type":"ContainerDied","Data":"e0f2aaca4b4bbad0059d0e1f3d96ae775a2868497f07bcab4dbb237a79796e3d"} Dec 01 08:30:36 crc kubenswrapper[5004]: I1201 08:30:36.883170 5004 generic.go:334] "Generic (PLEG): container finished" podID="e560049a-ccf6-4dc3-b0aa-fb5f5482ead1" containerID="d0ddf21f4270b92354c64b4dcdb87d8894085b7006a4328d4b815b39babfcd2a" exitCode=0 Dec 01 08:30:36 crc kubenswrapper[5004]: I1201 08:30:36.883253 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dwk28" event={"ID":"e560049a-ccf6-4dc3-b0aa-fb5f5482ead1","Type":"ContainerDied","Data":"d0ddf21f4270b92354c64b4dcdb87d8894085b7006a4328d4b815b39babfcd2a"} Dec 01 08:30:36 crc kubenswrapper[5004]: I1201 08:30:36.883322 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dwk28" event={"ID":"e560049a-ccf6-4dc3-b0aa-fb5f5482ead1","Type":"ContainerStarted","Data":"6981db3a1799edd7bde268a4299be3a7921e59dc38a2075353cafa3311660e07"} Dec 01 08:30:37 crc kubenswrapper[5004]: I1201 08:30:37.260278 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fn57ss" Dec 01 08:30:37 crc kubenswrapper[5004]: I1201 08:30:37.391393 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/78cb161c-a9d6-4fd5-9144-6564ca31cd33-bundle\") pod \"78cb161c-a9d6-4fd5-9144-6564ca31cd33\" (UID: \"78cb161c-a9d6-4fd5-9144-6564ca31cd33\") " Dec 01 08:30:37 crc kubenswrapper[5004]: I1201 08:30:37.391617 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/78cb161c-a9d6-4fd5-9144-6564ca31cd33-util\") pod \"78cb161c-a9d6-4fd5-9144-6564ca31cd33\" (UID: \"78cb161c-a9d6-4fd5-9144-6564ca31cd33\") " Dec 01 08:30:37 crc kubenswrapper[5004]: I1201 08:30:37.391665 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dkhsn\" (UniqueName: \"kubernetes.io/projected/78cb161c-a9d6-4fd5-9144-6564ca31cd33-kube-api-access-dkhsn\") pod \"78cb161c-a9d6-4fd5-9144-6564ca31cd33\" (UID: \"78cb161c-a9d6-4fd5-9144-6564ca31cd33\") " Dec 01 08:30:37 crc kubenswrapper[5004]: I1201 08:30:37.393042 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78cb161c-a9d6-4fd5-9144-6564ca31cd33-bundle" (OuterVolumeSpecName: "bundle") pod "78cb161c-a9d6-4fd5-9144-6564ca31cd33" (UID: "78cb161c-a9d6-4fd5-9144-6564ca31cd33"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 08:30:37 crc kubenswrapper[5004]: I1201 08:30:37.403597 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78cb161c-a9d6-4fd5-9144-6564ca31cd33-kube-api-access-dkhsn" (OuterVolumeSpecName: "kube-api-access-dkhsn") pod "78cb161c-a9d6-4fd5-9144-6564ca31cd33" (UID: "78cb161c-a9d6-4fd5-9144-6564ca31cd33"). InnerVolumeSpecName "kube-api-access-dkhsn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:30:37 crc kubenswrapper[5004]: I1201 08:30:37.421919 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78cb161c-a9d6-4fd5-9144-6564ca31cd33-util" (OuterVolumeSpecName: "util") pod "78cb161c-a9d6-4fd5-9144-6564ca31cd33" (UID: "78cb161c-a9d6-4fd5-9144-6564ca31cd33"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 08:30:37 crc kubenswrapper[5004]: I1201 08:30:37.494084 5004 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/78cb161c-a9d6-4fd5-9144-6564ca31cd33-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 08:30:37 crc kubenswrapper[5004]: I1201 08:30:37.494140 5004 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/78cb161c-a9d6-4fd5-9144-6564ca31cd33-util\") on node \"crc\" DevicePath \"\"" Dec 01 08:30:37 crc kubenswrapper[5004]: I1201 08:30:37.494162 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dkhsn\" (UniqueName: \"kubernetes.io/projected/78cb161c-a9d6-4fd5-9144-6564ca31cd33-kube-api-access-dkhsn\") on node \"crc\" DevicePath \"\"" Dec 01 08:30:37 crc kubenswrapper[5004]: I1201 08:30:37.893090 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fn57ss" event={"ID":"78cb161c-a9d6-4fd5-9144-6564ca31cd33","Type":"ContainerDied","Data":"a64b728e089f26b8da5bae5ec40f19c832d6d3547a7dac9fa859674b719965ea"} Dec 01 08:30:37 crc kubenswrapper[5004]: I1201 08:30:37.893382 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a64b728e089f26b8da5bae5ec40f19c832d6d3547a7dac9fa859674b719965ea" Dec 01 08:30:37 crc kubenswrapper[5004]: I1201 08:30:37.893164 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fn57ss" Dec 01 08:30:37 crc kubenswrapper[5004]: I1201 08:30:37.896141 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dwk28" event={"ID":"e560049a-ccf6-4dc3-b0aa-fb5f5482ead1","Type":"ContainerStarted","Data":"875ba34bcaf33f3dae1a572e856740e7c97d9f6307f38988f0b0a32cdbf0825a"} Dec 01 08:30:38 crc kubenswrapper[5004]: I1201 08:30:38.281494 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8bsmpc" Dec 01 08:30:38 crc kubenswrapper[5004]: I1201 08:30:38.412429 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9f76s\" (UniqueName: \"kubernetes.io/projected/8659be81-88bd-4a0a-b117-c72f2c9e9035-kube-api-access-9f76s\") pod \"8659be81-88bd-4a0a-b117-c72f2c9e9035\" (UID: \"8659be81-88bd-4a0a-b117-c72f2c9e9035\") " Dec 01 08:30:38 crc kubenswrapper[5004]: I1201 08:30:38.412554 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8659be81-88bd-4a0a-b117-c72f2c9e9035-bundle\") pod \"8659be81-88bd-4a0a-b117-c72f2c9e9035\" (UID: \"8659be81-88bd-4a0a-b117-c72f2c9e9035\") " Dec 01 08:30:38 crc kubenswrapper[5004]: I1201 08:30:38.412629 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8659be81-88bd-4a0a-b117-c72f2c9e9035-util\") pod \"8659be81-88bd-4a0a-b117-c72f2c9e9035\" (UID: \"8659be81-88bd-4a0a-b117-c72f2c9e9035\") " Dec 01 08:30:38 crc kubenswrapper[5004]: I1201 08:30:38.413829 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8659be81-88bd-4a0a-b117-c72f2c9e9035-bundle" (OuterVolumeSpecName: "bundle") pod "8659be81-88bd-4a0a-b117-c72f2c9e9035" 
(UID: "8659be81-88bd-4a0a-b117-c72f2c9e9035"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 08:30:38 crc kubenswrapper[5004]: I1201 08:30:38.423294 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8659be81-88bd-4a0a-b117-c72f2c9e9035-util" (OuterVolumeSpecName: "util") pod "8659be81-88bd-4a0a-b117-c72f2c9e9035" (UID: "8659be81-88bd-4a0a-b117-c72f2c9e9035"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 08:30:38 crc kubenswrapper[5004]: I1201 08:30:38.427825 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8659be81-88bd-4a0a-b117-c72f2c9e9035-kube-api-access-9f76s" (OuterVolumeSpecName: "kube-api-access-9f76s") pod "8659be81-88bd-4a0a-b117-c72f2c9e9035" (UID: "8659be81-88bd-4a0a-b117-c72f2c9e9035"). InnerVolumeSpecName "kube-api-access-9f76s". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:30:38 crc kubenswrapper[5004]: I1201 08:30:38.514687 5004 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8659be81-88bd-4a0a-b117-c72f2c9e9035-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 08:30:38 crc kubenswrapper[5004]: I1201 08:30:38.514708 5004 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8659be81-88bd-4a0a-b117-c72f2c9e9035-util\") on node \"crc\" DevicePath \"\"" Dec 01 08:30:38 crc kubenswrapper[5004]: I1201 08:30:38.515162 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9f76s\" (UniqueName: \"kubernetes.io/projected/8659be81-88bd-4a0a-b117-c72f2c9e9035-kube-api-access-9f76s\") on node \"crc\" DevicePath \"\"" Dec 01 08:30:38 crc kubenswrapper[5004]: I1201 08:30:38.729011 5004 patch_prober.go:28] interesting pod/machine-config-daemon-fvdgt container/machine-config-daemon namespace/openshift-machine-config-operator: 
Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 08:30:38 crc kubenswrapper[5004]: I1201 08:30:38.729102 5004 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 08:30:38 crc kubenswrapper[5004]: I1201 08:30:38.909038 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8bsmpc" event={"ID":"8659be81-88bd-4a0a-b117-c72f2c9e9035","Type":"ContainerDied","Data":"3f1d024b7706920adf1bacdeb49532645db1dce0820286f795eb65ce766c1536"} Dec 01 08:30:38 crc kubenswrapper[5004]: I1201 08:30:38.909472 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3f1d024b7706920adf1bacdeb49532645db1dce0820286f795eb65ce766c1536" Dec 01 08:30:38 crc kubenswrapper[5004]: I1201 08:30:38.909655 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8bsmpc" Dec 01 08:30:38 crc kubenswrapper[5004]: I1201 08:30:38.913134 5004 generic.go:334] "Generic (PLEG): container finished" podID="e560049a-ccf6-4dc3-b0aa-fb5f5482ead1" containerID="875ba34bcaf33f3dae1a572e856740e7c97d9f6307f38988f0b0a32cdbf0825a" exitCode=0 Dec 01 08:30:38 crc kubenswrapper[5004]: I1201 08:30:38.913199 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dwk28" event={"ID":"e560049a-ccf6-4dc3-b0aa-fb5f5482ead1","Type":"ContainerDied","Data":"875ba34bcaf33f3dae1a572e856740e7c97d9f6307f38988f0b0a32cdbf0825a"} Dec 01 08:30:39 crc kubenswrapper[5004]: I1201 08:30:39.925389 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dwk28" event={"ID":"e560049a-ccf6-4dc3-b0aa-fb5f5482ead1","Type":"ContainerStarted","Data":"8f07ff3d137cde131464d17bec81809a6c14217533857783f08f0933486e0278"} Dec 01 08:30:39 crc kubenswrapper[5004]: I1201 08:30:39.949241 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-dwk28" podStartSLOduration=2.272314978 podStartE2EDuration="4.949189084s" podCreationTimestamp="2025-12-01 08:30:35 +0000 UTC" firstStartedPulling="2025-12-01 08:30:36.88637793 +0000 UTC m=+814.451369942" lastFinishedPulling="2025-12-01 08:30:39.563252036 +0000 UTC m=+817.128244048" observedRunningTime="2025-12-01 08:30:39.942832414 +0000 UTC m=+817.507824416" watchObservedRunningTime="2025-12-01 08:30:39.949189084 +0000 UTC m=+817.514181106" Dec 01 08:30:45 crc kubenswrapper[5004]: I1201 08:30:45.801168 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-dwk28" Dec 01 08:30:45 crc kubenswrapper[5004]: I1201 08:30:45.801658 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-operators-dwk28" Dec 01 08:30:46 crc kubenswrapper[5004]: I1201 08:30:46.849129 5004 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-dwk28" podUID="e560049a-ccf6-4dc3-b0aa-fb5f5482ead1" containerName="registry-server" probeResult="failure" output=< Dec 01 08:30:46 crc kubenswrapper[5004]: timeout: failed to connect service ":50051" within 1s Dec 01 08:30:46 crc kubenswrapper[5004]: > Dec 01 08:30:50 crc kubenswrapper[5004]: I1201 08:30:50.473626 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-56fbccf5c9-5kcrv"] Dec 01 08:30:50 crc kubenswrapper[5004]: E1201 08:30:50.474258 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78cb161c-a9d6-4fd5-9144-6564ca31cd33" containerName="util" Dec 01 08:30:50 crc kubenswrapper[5004]: I1201 08:30:50.474269 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="78cb161c-a9d6-4fd5-9144-6564ca31cd33" containerName="util" Dec 01 08:30:50 crc kubenswrapper[5004]: E1201 08:30:50.474282 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78cb161c-a9d6-4fd5-9144-6564ca31cd33" containerName="pull" Dec 01 08:30:50 crc kubenswrapper[5004]: I1201 08:30:50.474288 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="78cb161c-a9d6-4fd5-9144-6564ca31cd33" containerName="pull" Dec 01 08:30:50 crc kubenswrapper[5004]: E1201 08:30:50.474298 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8659be81-88bd-4a0a-b117-c72f2c9e9035" containerName="extract" Dec 01 08:30:50 crc kubenswrapper[5004]: I1201 08:30:50.474305 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="8659be81-88bd-4a0a-b117-c72f2c9e9035" containerName="extract" Dec 01 08:30:50 crc kubenswrapper[5004]: E1201 08:30:50.474313 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8659be81-88bd-4a0a-b117-c72f2c9e9035" containerName="pull" Dec 01 
08:30:50 crc kubenswrapper[5004]: I1201 08:30:50.474320 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="8659be81-88bd-4a0a-b117-c72f2c9e9035" containerName="pull" Dec 01 08:30:50 crc kubenswrapper[5004]: E1201 08:30:50.474332 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8659be81-88bd-4a0a-b117-c72f2c9e9035" containerName="util" Dec 01 08:30:50 crc kubenswrapper[5004]: I1201 08:30:50.474337 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="8659be81-88bd-4a0a-b117-c72f2c9e9035" containerName="util" Dec 01 08:30:50 crc kubenswrapper[5004]: E1201 08:30:50.474350 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78cb161c-a9d6-4fd5-9144-6564ca31cd33" containerName="extract" Dec 01 08:30:50 crc kubenswrapper[5004]: I1201 08:30:50.474355 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="78cb161c-a9d6-4fd5-9144-6564ca31cd33" containerName="extract" Dec 01 08:30:50 crc kubenswrapper[5004]: I1201 08:30:50.474456 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="8659be81-88bd-4a0a-b117-c72f2c9e9035" containerName="extract" Dec 01 08:30:50 crc kubenswrapper[5004]: I1201 08:30:50.474467 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="78cb161c-a9d6-4fd5-9144-6564ca31cd33" containerName="extract" Dec 01 08:30:50 crc kubenswrapper[5004]: I1201 08:30:50.475047 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators-redhat/loki-operator-controller-manager-56fbccf5c9-5kcrv" Dec 01 08:30:50 crc kubenswrapper[5004]: I1201 08:30:50.476689 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-controller-manager-service-cert" Dec 01 08:30:50 crc kubenswrapper[5004]: I1201 08:30:50.477052 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-metrics" Dec 01 08:30:50 crc kubenswrapper[5004]: I1201 08:30:50.478124 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"openshift-service-ca.crt" Dec 01 08:30:50 crc kubenswrapper[5004]: I1201 08:30:50.479418 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"kube-root-ca.crt" Dec 01 08:30:50 crc kubenswrapper[5004]: I1201 08:30:50.480120 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"loki-operator-manager-config" Dec 01 08:30:50 crc kubenswrapper[5004]: I1201 08:30:50.485879 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-controller-manager-dockercfg-5wmtx" Dec 01 08:30:50 crc kubenswrapper[5004]: I1201 08:30:50.499021 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-56fbccf5c9-5kcrv"] Dec 01 08:30:50 crc kubenswrapper[5004]: I1201 08:30:50.610372 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3ef42d0b-a102-4112-b592-aa6d481041c7-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-56fbccf5c9-5kcrv\" (UID: \"3ef42d0b-a102-4112-b592-aa6d481041c7\") " pod="openshift-operators-redhat/loki-operator-controller-manager-56fbccf5c9-5kcrv" Dec 01 08:30:50 crc 
kubenswrapper[5004]: I1201 08:30:50.610466 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/3ef42d0b-a102-4112-b592-aa6d481041c7-manager-config\") pod \"loki-operator-controller-manager-56fbccf5c9-5kcrv\" (UID: \"3ef42d0b-a102-4112-b592-aa6d481041c7\") " pod="openshift-operators-redhat/loki-operator-controller-manager-56fbccf5c9-5kcrv" Dec 01 08:30:50 crc kubenswrapper[5004]: I1201 08:30:50.610508 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3ef42d0b-a102-4112-b592-aa6d481041c7-apiservice-cert\") pod \"loki-operator-controller-manager-56fbccf5c9-5kcrv\" (UID: \"3ef42d0b-a102-4112-b592-aa6d481041c7\") " pod="openshift-operators-redhat/loki-operator-controller-manager-56fbccf5c9-5kcrv" Dec 01 08:30:50 crc kubenswrapper[5004]: I1201 08:30:50.610534 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxwgq\" (UniqueName: \"kubernetes.io/projected/3ef42d0b-a102-4112-b592-aa6d481041c7-kube-api-access-cxwgq\") pod \"loki-operator-controller-manager-56fbccf5c9-5kcrv\" (UID: \"3ef42d0b-a102-4112-b592-aa6d481041c7\") " pod="openshift-operators-redhat/loki-operator-controller-manager-56fbccf5c9-5kcrv" Dec 01 08:30:50 crc kubenswrapper[5004]: I1201 08:30:50.610629 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3ef42d0b-a102-4112-b592-aa6d481041c7-webhook-cert\") pod \"loki-operator-controller-manager-56fbccf5c9-5kcrv\" (UID: \"3ef42d0b-a102-4112-b592-aa6d481041c7\") " pod="openshift-operators-redhat/loki-operator-controller-manager-56fbccf5c9-5kcrv" Dec 01 08:30:50 crc kubenswrapper[5004]: I1201 08:30:50.712057 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3ef42d0b-a102-4112-b592-aa6d481041c7-webhook-cert\") pod \"loki-operator-controller-manager-56fbccf5c9-5kcrv\" (UID: \"3ef42d0b-a102-4112-b592-aa6d481041c7\") " pod="openshift-operators-redhat/loki-operator-controller-manager-56fbccf5c9-5kcrv" Dec 01 08:30:50 crc kubenswrapper[5004]: I1201 08:30:50.712117 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3ef42d0b-a102-4112-b592-aa6d481041c7-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-56fbccf5c9-5kcrv\" (UID: \"3ef42d0b-a102-4112-b592-aa6d481041c7\") " pod="openshift-operators-redhat/loki-operator-controller-manager-56fbccf5c9-5kcrv" Dec 01 08:30:50 crc kubenswrapper[5004]: I1201 08:30:50.712183 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/3ef42d0b-a102-4112-b592-aa6d481041c7-manager-config\") pod \"loki-operator-controller-manager-56fbccf5c9-5kcrv\" (UID: \"3ef42d0b-a102-4112-b592-aa6d481041c7\") " pod="openshift-operators-redhat/loki-operator-controller-manager-56fbccf5c9-5kcrv" Dec 01 08:30:50 crc kubenswrapper[5004]: I1201 08:30:50.712214 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3ef42d0b-a102-4112-b592-aa6d481041c7-apiservice-cert\") pod \"loki-operator-controller-manager-56fbccf5c9-5kcrv\" (UID: \"3ef42d0b-a102-4112-b592-aa6d481041c7\") " pod="openshift-operators-redhat/loki-operator-controller-manager-56fbccf5c9-5kcrv" Dec 01 08:30:50 crc kubenswrapper[5004]: I1201 08:30:50.712371 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxwgq\" (UniqueName: \"kubernetes.io/projected/3ef42d0b-a102-4112-b592-aa6d481041c7-kube-api-access-cxwgq\") pod \"loki-operator-controller-manager-56fbccf5c9-5kcrv\" (UID: 
\"3ef42d0b-a102-4112-b592-aa6d481041c7\") " pod="openshift-operators-redhat/loki-operator-controller-manager-56fbccf5c9-5kcrv" Dec 01 08:30:50 crc kubenswrapper[5004]: I1201 08:30:50.713323 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/3ef42d0b-a102-4112-b592-aa6d481041c7-manager-config\") pod \"loki-operator-controller-manager-56fbccf5c9-5kcrv\" (UID: \"3ef42d0b-a102-4112-b592-aa6d481041c7\") " pod="openshift-operators-redhat/loki-operator-controller-manager-56fbccf5c9-5kcrv" Dec 01 08:30:50 crc kubenswrapper[5004]: I1201 08:30:50.718174 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3ef42d0b-a102-4112-b592-aa6d481041c7-apiservice-cert\") pod \"loki-operator-controller-manager-56fbccf5c9-5kcrv\" (UID: \"3ef42d0b-a102-4112-b592-aa6d481041c7\") " pod="openshift-operators-redhat/loki-operator-controller-manager-56fbccf5c9-5kcrv" Dec 01 08:30:50 crc kubenswrapper[5004]: I1201 08:30:50.727325 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3ef42d0b-a102-4112-b592-aa6d481041c7-webhook-cert\") pod \"loki-operator-controller-manager-56fbccf5c9-5kcrv\" (UID: \"3ef42d0b-a102-4112-b592-aa6d481041c7\") " pod="openshift-operators-redhat/loki-operator-controller-manager-56fbccf5c9-5kcrv" Dec 01 08:30:50 crc kubenswrapper[5004]: I1201 08:30:50.727800 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3ef42d0b-a102-4112-b592-aa6d481041c7-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-56fbccf5c9-5kcrv\" (UID: \"3ef42d0b-a102-4112-b592-aa6d481041c7\") " pod="openshift-operators-redhat/loki-operator-controller-manager-56fbccf5c9-5kcrv" Dec 01 08:30:50 crc kubenswrapper[5004]: I1201 08:30:50.730994 5004 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-cxwgq\" (UniqueName: \"kubernetes.io/projected/3ef42d0b-a102-4112-b592-aa6d481041c7-kube-api-access-cxwgq\") pod \"loki-operator-controller-manager-56fbccf5c9-5kcrv\" (UID: \"3ef42d0b-a102-4112-b592-aa6d481041c7\") " pod="openshift-operators-redhat/loki-operator-controller-manager-56fbccf5c9-5kcrv" Dec 01 08:30:50 crc kubenswrapper[5004]: I1201 08:30:50.788713 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators-redhat/loki-operator-controller-manager-56fbccf5c9-5kcrv" Dec 01 08:30:51 crc kubenswrapper[5004]: I1201 08:30:51.276149 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-56fbccf5c9-5kcrv"] Dec 01 08:30:51 crc kubenswrapper[5004]: W1201 08:30:51.277737 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3ef42d0b_a102_4112_b592_aa6d481041c7.slice/crio-66f382c7f4999ebb3ae63112b34dd2c2826a0caedd14196628ee0b3313d5c219 WatchSource:0}: Error finding container 66f382c7f4999ebb3ae63112b34dd2c2826a0caedd14196628ee0b3313d5c219: Status 404 returned error can't find the container with id 66f382c7f4999ebb3ae63112b34dd2c2826a0caedd14196628ee0b3313d5c219 Dec 01 08:30:51 crc kubenswrapper[5004]: I1201 08:30:51.581705 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/cluster-logging-operator-ff9846bd-cxlpj"] Dec 01 08:30:51 crc kubenswrapper[5004]: I1201 08:30:51.582468 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/cluster-logging-operator-ff9846bd-cxlpj" Dec 01 08:30:51 crc kubenswrapper[5004]: I1201 08:30:51.586973 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"openshift-service-ca.crt" Dec 01 08:30:51 crc kubenswrapper[5004]: I1201 08:30:51.587266 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"kube-root-ca.crt" Dec 01 08:30:51 crc kubenswrapper[5004]: I1201 08:30:51.587467 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"cluster-logging-operator-dockercfg-ljgzm" Dec 01 08:30:51 crc kubenswrapper[5004]: I1201 08:30:51.605371 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/cluster-logging-operator-ff9846bd-cxlpj"] Dec 01 08:30:51 crc kubenswrapper[5004]: I1201 08:30:51.727159 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tcp6\" (UniqueName: \"kubernetes.io/projected/8ea8ea18-7ae8-44e4-9381-10948b9b47f6-kube-api-access-8tcp6\") pod \"cluster-logging-operator-ff9846bd-cxlpj\" (UID: \"8ea8ea18-7ae8-44e4-9381-10948b9b47f6\") " pod="openshift-logging/cluster-logging-operator-ff9846bd-cxlpj" Dec 01 08:30:51 crc kubenswrapper[5004]: I1201 08:30:51.829332 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8tcp6\" (UniqueName: \"kubernetes.io/projected/8ea8ea18-7ae8-44e4-9381-10948b9b47f6-kube-api-access-8tcp6\") pod \"cluster-logging-operator-ff9846bd-cxlpj\" (UID: \"8ea8ea18-7ae8-44e4-9381-10948b9b47f6\") " pod="openshift-logging/cluster-logging-operator-ff9846bd-cxlpj" Dec 01 08:30:51 crc kubenswrapper[5004]: I1201 08:30:51.848429 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8tcp6\" (UniqueName: \"kubernetes.io/projected/8ea8ea18-7ae8-44e4-9381-10948b9b47f6-kube-api-access-8tcp6\") pod 
\"cluster-logging-operator-ff9846bd-cxlpj\" (UID: \"8ea8ea18-7ae8-44e4-9381-10948b9b47f6\") " pod="openshift-logging/cluster-logging-operator-ff9846bd-cxlpj" Dec 01 08:30:51 crc kubenswrapper[5004]: I1201 08:30:51.897417 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/cluster-logging-operator-ff9846bd-cxlpj" Dec 01 08:30:52 crc kubenswrapper[5004]: I1201 08:30:52.007099 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-56fbccf5c9-5kcrv" event={"ID":"3ef42d0b-a102-4112-b592-aa6d481041c7","Type":"ContainerStarted","Data":"66f382c7f4999ebb3ae63112b34dd2c2826a0caedd14196628ee0b3313d5c219"} Dec 01 08:30:52 crc kubenswrapper[5004]: I1201 08:30:52.144498 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/cluster-logging-operator-ff9846bd-cxlpj"] Dec 01 08:30:52 crc kubenswrapper[5004]: W1201 08:30:52.145790 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8ea8ea18_7ae8_44e4_9381_10948b9b47f6.slice/crio-d47f89b5e5fa8786c0bcf6c7f62aa277dd08eae0086d113e0cf50fc07f6b8b09 WatchSource:0}: Error finding container d47f89b5e5fa8786c0bcf6c7f62aa277dd08eae0086d113e0cf50fc07f6b8b09: Status 404 returned error can't find the container with id d47f89b5e5fa8786c0bcf6c7f62aa277dd08eae0086d113e0cf50fc07f6b8b09 Dec 01 08:30:53 crc kubenswrapper[5004]: I1201 08:30:53.014542 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/cluster-logging-operator-ff9846bd-cxlpj" event={"ID":"8ea8ea18-7ae8-44e4-9381-10948b9b47f6","Type":"ContainerStarted","Data":"d47f89b5e5fa8786c0bcf6c7f62aa277dd08eae0086d113e0cf50fc07f6b8b09"} Dec 01 08:30:55 crc kubenswrapper[5004]: I1201 08:30:55.862862 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-dwk28" Dec 01 08:30:55 crc 
kubenswrapper[5004]: I1201 08:30:55.941965 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-dwk28" Dec 01 08:30:58 crc kubenswrapper[5004]: I1201 08:30:58.837718 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dwk28"] Dec 01 08:30:58 crc kubenswrapper[5004]: I1201 08:30:58.838001 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-dwk28" podUID="e560049a-ccf6-4dc3-b0aa-fb5f5482ead1" containerName="registry-server" containerID="cri-o://8f07ff3d137cde131464d17bec81809a6c14217533857783f08f0933486e0278" gracePeriod=2 Dec 01 08:30:59 crc kubenswrapper[5004]: I1201 08:30:59.062134 5004 generic.go:334] "Generic (PLEG): container finished" podID="e560049a-ccf6-4dc3-b0aa-fb5f5482ead1" containerID="8f07ff3d137cde131464d17bec81809a6c14217533857783f08f0933486e0278" exitCode=0 Dec 01 08:30:59 crc kubenswrapper[5004]: I1201 08:30:59.062312 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dwk28" event={"ID":"e560049a-ccf6-4dc3-b0aa-fb5f5482ead1","Type":"ContainerDied","Data":"8f07ff3d137cde131464d17bec81809a6c14217533857783f08f0933486e0278"} Dec 01 08:30:59 crc kubenswrapper[5004]: I1201 08:30:59.867481 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dwk28" Dec 01 08:30:59 crc kubenswrapper[5004]: I1201 08:30:59.890828 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e560049a-ccf6-4dc3-b0aa-fb5f5482ead1-utilities\") pod \"e560049a-ccf6-4dc3-b0aa-fb5f5482ead1\" (UID: \"e560049a-ccf6-4dc3-b0aa-fb5f5482ead1\") " Dec 01 08:30:59 crc kubenswrapper[5004]: I1201 08:30:59.890986 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e560049a-ccf6-4dc3-b0aa-fb5f5482ead1-catalog-content\") pod \"e560049a-ccf6-4dc3-b0aa-fb5f5482ead1\" (UID: \"e560049a-ccf6-4dc3-b0aa-fb5f5482ead1\") " Dec 01 08:30:59 crc kubenswrapper[5004]: I1201 08:30:59.891083 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hmkv6\" (UniqueName: \"kubernetes.io/projected/e560049a-ccf6-4dc3-b0aa-fb5f5482ead1-kube-api-access-hmkv6\") pod \"e560049a-ccf6-4dc3-b0aa-fb5f5482ead1\" (UID: \"e560049a-ccf6-4dc3-b0aa-fb5f5482ead1\") " Dec 01 08:30:59 crc kubenswrapper[5004]: I1201 08:30:59.891923 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e560049a-ccf6-4dc3-b0aa-fb5f5482ead1-utilities" (OuterVolumeSpecName: "utilities") pod "e560049a-ccf6-4dc3-b0aa-fb5f5482ead1" (UID: "e560049a-ccf6-4dc3-b0aa-fb5f5482ead1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 08:30:59 crc kubenswrapper[5004]: I1201 08:30:59.904497 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e560049a-ccf6-4dc3-b0aa-fb5f5482ead1-kube-api-access-hmkv6" (OuterVolumeSpecName: "kube-api-access-hmkv6") pod "e560049a-ccf6-4dc3-b0aa-fb5f5482ead1" (UID: "e560049a-ccf6-4dc3-b0aa-fb5f5482ead1"). InnerVolumeSpecName "kube-api-access-hmkv6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:30:59 crc kubenswrapper[5004]: I1201 08:30:59.992673 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hmkv6\" (UniqueName: \"kubernetes.io/projected/e560049a-ccf6-4dc3-b0aa-fb5f5482ead1-kube-api-access-hmkv6\") on node \"crc\" DevicePath \"\"" Dec 01 08:30:59 crc kubenswrapper[5004]: I1201 08:30:59.992722 5004 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e560049a-ccf6-4dc3-b0aa-fb5f5482ead1-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 08:30:59 crc kubenswrapper[5004]: I1201 08:30:59.993365 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e560049a-ccf6-4dc3-b0aa-fb5f5482ead1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e560049a-ccf6-4dc3-b0aa-fb5f5482ead1" (UID: "e560049a-ccf6-4dc3-b0aa-fb5f5482ead1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 08:31:00 crc kubenswrapper[5004]: I1201 08:31:00.069585 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/cluster-logging-operator-ff9846bd-cxlpj" event={"ID":"8ea8ea18-7ae8-44e4-9381-10948b9b47f6","Type":"ContainerStarted","Data":"030fe32ad4ae8ad1c2e50e59b6d83cf1c32f7b20b899ad871e79b846a47b4eb5"} Dec 01 08:31:00 crc kubenswrapper[5004]: I1201 08:31:00.070801 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-56fbccf5c9-5kcrv" event={"ID":"3ef42d0b-a102-4112-b592-aa6d481041c7","Type":"ContainerStarted","Data":"62b4d5ce6af782fba2377f49d9e0841d95a259b422477a3d2529688ac8aa6ed4"} Dec 01 08:31:00 crc kubenswrapper[5004]: I1201 08:31:00.072541 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dwk28" 
event={"ID":"e560049a-ccf6-4dc3-b0aa-fb5f5482ead1","Type":"ContainerDied","Data":"6981db3a1799edd7bde268a4299be3a7921e59dc38a2075353cafa3311660e07"} Dec 01 08:31:00 crc kubenswrapper[5004]: I1201 08:31:00.072595 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dwk28" Dec 01 08:31:00 crc kubenswrapper[5004]: I1201 08:31:00.072601 5004 scope.go:117] "RemoveContainer" containerID="8f07ff3d137cde131464d17bec81809a6c14217533857783f08f0933486e0278" Dec 01 08:31:00 crc kubenswrapper[5004]: I1201 08:31:00.092334 5004 scope.go:117] "RemoveContainer" containerID="875ba34bcaf33f3dae1a572e856740e7c97d9f6307f38988f0b0a32cdbf0825a" Dec 01 08:31:00 crc kubenswrapper[5004]: I1201 08:31:00.095399 5004 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e560049a-ccf6-4dc3-b0aa-fb5f5482ead1-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 08:31:00 crc kubenswrapper[5004]: I1201 08:31:00.103857 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/cluster-logging-operator-ff9846bd-cxlpj" podStartSLOduration=1.571185142 podStartE2EDuration="9.10384152s" podCreationTimestamp="2025-12-01 08:30:51 +0000 UTC" firstStartedPulling="2025-12-01 08:30:52.152651677 +0000 UTC m=+829.717643689" lastFinishedPulling="2025-12-01 08:30:59.685308075 +0000 UTC m=+837.250300067" observedRunningTime="2025-12-01 08:31:00.101846019 +0000 UTC m=+837.666838001" watchObservedRunningTime="2025-12-01 08:31:00.10384152 +0000 UTC m=+837.668833502" Dec 01 08:31:00 crc kubenswrapper[5004]: I1201 08:31:00.127803 5004 scope.go:117] "RemoveContainer" containerID="d0ddf21f4270b92354c64b4dcdb87d8894085b7006a4328d4b815b39babfcd2a" Dec 01 08:31:00 crc kubenswrapper[5004]: I1201 08:31:00.141352 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dwk28"] Dec 01 08:31:00 crc 
kubenswrapper[5004]: I1201 08:31:00.171999 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-dwk28"] Dec 01 08:31:00 crc kubenswrapper[5004]: I1201 08:31:00.768652 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e560049a-ccf6-4dc3-b0aa-fb5f5482ead1" path="/var/lib/kubelet/pods/e560049a-ccf6-4dc3-b0aa-fb5f5482ead1/volumes" Dec 01 08:31:07 crc kubenswrapper[5004]: I1201 08:31:07.133437 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-56fbccf5c9-5kcrv" event={"ID":"3ef42d0b-a102-4112-b592-aa6d481041c7","Type":"ContainerStarted","Data":"809fc7fad2bf98fb30647f125922b89023fa98e04518e663e06e3c6815678a70"} Dec 01 08:31:07 crc kubenswrapper[5004]: I1201 08:31:07.134070 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators-redhat/loki-operator-controller-manager-56fbccf5c9-5kcrv" Dec 01 08:31:07 crc kubenswrapper[5004]: I1201 08:31:07.136711 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators-redhat/loki-operator-controller-manager-56fbccf5c9-5kcrv" Dec 01 08:31:07 crc kubenswrapper[5004]: I1201 08:31:07.161830 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators-redhat/loki-operator-controller-manager-56fbccf5c9-5kcrv" podStartSLOduration=1.610460904 podStartE2EDuration="17.161808403s" podCreationTimestamp="2025-12-01 08:30:50 +0000 UTC" firstStartedPulling="2025-12-01 08:30:51.280792634 +0000 UTC m=+828.845784606" lastFinishedPulling="2025-12-01 08:31:06.832140133 +0000 UTC m=+844.397132105" observedRunningTime="2025-12-01 08:31:07.155878416 +0000 UTC m=+844.720870438" watchObservedRunningTime="2025-12-01 08:31:07.161808403 +0000 UTC m=+844.726800375" Dec 01 08:31:08 crc kubenswrapper[5004]: I1201 08:31:08.729076 5004 patch_prober.go:28] interesting pod/machine-config-daemon-fvdgt 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 08:31:08 crc kubenswrapper[5004]: I1201 08:31:08.729687 5004 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 08:31:08 crc kubenswrapper[5004]: I1201 08:31:08.729778 5004 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" Dec 01 08:31:08 crc kubenswrapper[5004]: I1201 08:31:08.734223 5004 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8b94d92321b66c5263a45c381dbbdfe95975b64015e15b4b3949d9d6b2469402"} pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 08:31:08 crc kubenswrapper[5004]: I1201 08:31:08.734355 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" containerName="machine-config-daemon" containerID="cri-o://8b94d92321b66c5263a45c381dbbdfe95975b64015e15b4b3949d9d6b2469402" gracePeriod=600 Dec 01 08:31:09 crc kubenswrapper[5004]: I1201 08:31:09.173458 5004 generic.go:334] "Generic (PLEG): container finished" podID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" containerID="8b94d92321b66c5263a45c381dbbdfe95975b64015e15b4b3949d9d6b2469402" exitCode=0 Dec 01 08:31:09 crc kubenswrapper[5004]: I1201 08:31:09.173497 5004 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" event={"ID":"f9977ebb-82de-4e96-8763-0b5a84f8d4ce","Type":"ContainerDied","Data":"8b94d92321b66c5263a45c381dbbdfe95975b64015e15b4b3949d9d6b2469402"} Dec 01 08:31:09 crc kubenswrapper[5004]: I1201 08:31:09.174193 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" event={"ID":"f9977ebb-82de-4e96-8763-0b5a84f8d4ce","Type":"ContainerStarted","Data":"69d8f022c5a4f9a84dbe3000c7f3fecc6974868815a83043bd8a0d7a4a9a2e59"} Dec 01 08:31:09 crc kubenswrapper[5004]: I1201 08:31:09.174217 5004 scope.go:117] "RemoveContainer" containerID="f5cb4e2ac3b55859ead5c898a2b42c280b2d9fe9b770bdbbd6d9799deecd9d6a" Dec 01 08:31:11 crc kubenswrapper[5004]: I1201 08:31:11.098071 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["minio-dev/minio"] Dec 01 08:31:11 crc kubenswrapper[5004]: E1201 08:31:11.098674 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e560049a-ccf6-4dc3-b0aa-fb5f5482ead1" containerName="extract-utilities" Dec 01 08:31:11 crc kubenswrapper[5004]: I1201 08:31:11.098691 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="e560049a-ccf6-4dc3-b0aa-fb5f5482ead1" containerName="extract-utilities" Dec 01 08:31:11 crc kubenswrapper[5004]: E1201 08:31:11.098719 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e560049a-ccf6-4dc3-b0aa-fb5f5482ead1" containerName="extract-content" Dec 01 08:31:11 crc kubenswrapper[5004]: I1201 08:31:11.098727 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="e560049a-ccf6-4dc3-b0aa-fb5f5482ead1" containerName="extract-content" Dec 01 08:31:11 crc kubenswrapper[5004]: E1201 08:31:11.098738 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e560049a-ccf6-4dc3-b0aa-fb5f5482ead1" containerName="registry-server" Dec 01 08:31:11 crc kubenswrapper[5004]: I1201 08:31:11.098747 5004 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="e560049a-ccf6-4dc3-b0aa-fb5f5482ead1" containerName="registry-server" Dec 01 08:31:11 crc kubenswrapper[5004]: I1201 08:31:11.098903 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="e560049a-ccf6-4dc3-b0aa-fb5f5482ead1" containerName="registry-server" Dec 01 08:31:11 crc kubenswrapper[5004]: I1201 08:31:11.099400 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="minio-dev/minio" Dec 01 08:31:11 crc kubenswrapper[5004]: I1201 08:31:11.102849 5004 reflector.go:368] Caches populated for *v1.Secret from object-"minio-dev"/"default-dockercfg-trfnd" Dec 01 08:31:11 crc kubenswrapper[5004]: I1201 08:31:11.103963 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"minio-dev"/"kube-root-ca.crt" Dec 01 08:31:11 crc kubenswrapper[5004]: I1201 08:31:11.104022 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"minio-dev"/"openshift-service-ca.crt" Dec 01 08:31:11 crc kubenswrapper[5004]: I1201 08:31:11.124846 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["minio-dev/minio"] Dec 01 08:31:11 crc kubenswrapper[5004]: I1201 08:31:11.249977 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvfql\" (UniqueName: \"kubernetes.io/projected/6544209b-d17f-42c7-b1c3-4656d242cb71-kube-api-access-nvfql\") pod \"minio\" (UID: \"6544209b-d17f-42c7-b1c3-4656d242cb71\") " pod="minio-dev/minio" Dec 01 08:31:11 crc kubenswrapper[5004]: I1201 08:31:11.250064 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-c880496d-e782-4e22-a32f-bd84447560f2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c880496d-e782-4e22-a32f-bd84447560f2\") pod \"minio\" (UID: \"6544209b-d17f-42c7-b1c3-4656d242cb71\") " pod="minio-dev/minio" Dec 01 08:31:11 crc kubenswrapper[5004]: I1201 08:31:11.351741 5004 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-nvfql\" (UniqueName: \"kubernetes.io/projected/6544209b-d17f-42c7-b1c3-4656d242cb71-kube-api-access-nvfql\") pod \"minio\" (UID: \"6544209b-d17f-42c7-b1c3-4656d242cb71\") " pod="minio-dev/minio" Dec 01 08:31:11 crc kubenswrapper[5004]: I1201 08:31:11.351932 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-c880496d-e782-4e22-a32f-bd84447560f2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c880496d-e782-4e22-a32f-bd84447560f2\") pod \"minio\" (UID: \"6544209b-d17f-42c7-b1c3-4656d242cb71\") " pod="minio-dev/minio" Dec 01 08:31:11 crc kubenswrapper[5004]: I1201 08:31:11.355952 5004 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 01 08:31:11 crc kubenswrapper[5004]: I1201 08:31:11.355977 5004 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-c880496d-e782-4e22-a32f-bd84447560f2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c880496d-e782-4e22-a32f-bd84447560f2\") pod \"minio\" (UID: \"6544209b-d17f-42c7-b1c3-4656d242cb71\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/6b6bf96c8ce323007c50a977a622243e398d13ad0fc706bd045d9f3f46c78b98/globalmount\"" pod="minio-dev/minio" Dec 01 08:31:11 crc kubenswrapper[5004]: I1201 08:31:11.378802 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvfql\" (UniqueName: \"kubernetes.io/projected/6544209b-d17f-42c7-b1c3-4656d242cb71-kube-api-access-nvfql\") pod \"minio\" (UID: \"6544209b-d17f-42c7-b1c3-4656d242cb71\") " pod="minio-dev/minio" Dec 01 08:31:11 crc kubenswrapper[5004]: I1201 08:31:11.380532 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-c880496d-e782-4e22-a32f-bd84447560f2\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c880496d-e782-4e22-a32f-bd84447560f2\") pod \"minio\" (UID: \"6544209b-d17f-42c7-b1c3-4656d242cb71\") " pod="minio-dev/minio" Dec 01 08:31:11 crc kubenswrapper[5004]: I1201 08:31:11.430434 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="minio-dev/minio" Dec 01 08:31:11 crc kubenswrapper[5004]: I1201 08:31:11.885769 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["minio-dev/minio"] Dec 01 08:31:12 crc kubenswrapper[5004]: I1201 08:31:12.202526 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="minio-dev/minio" event={"ID":"6544209b-d17f-42c7-b1c3-4656d242cb71","Type":"ContainerStarted","Data":"af94c4936b73450ea0980a2baa71b02f57ec8c74a4d9fc5e6da84f2991836160"} Dec 01 08:31:16 crc kubenswrapper[5004]: I1201 08:31:16.248298 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="minio-dev/minio" event={"ID":"6544209b-d17f-42c7-b1c3-4656d242cb71","Type":"ContainerStarted","Data":"fb3888eeb1d472185ac28f8421a6215bbc4cb9670fbc744772ebb35e375596ee"} Dec 01 08:31:16 crc kubenswrapper[5004]: I1201 08:31:16.259888 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="minio-dev/minio" podStartSLOduration=4.804929857 podStartE2EDuration="8.259873927s" podCreationTimestamp="2025-12-01 08:31:08 +0000 UTC" firstStartedPulling="2025-12-01 08:31:11.890202656 +0000 UTC m=+849.455194638" lastFinishedPulling="2025-12-01 08:31:15.345146716 +0000 UTC m=+852.910138708" observedRunningTime="2025-12-01 08:31:16.259066088 +0000 UTC m=+853.824058090" watchObservedRunningTime="2025-12-01 08:31:16.259873927 +0000 UTC m=+853.824865909" Dec 01 08:31:19 crc kubenswrapper[5004]: I1201 08:31:19.912835 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-distributor-76cc67bf56-mjgbt"] Dec 01 08:31:19 crc kubenswrapper[5004]: I1201 08:31:19.913984 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-distributor-76cc67bf56-mjgbt" Dec 01 08:31:19 crc kubenswrapper[5004]: I1201 08:31:19.916434 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"logging-loki-ca-bundle" Dec 01 08:31:19 crc kubenswrapper[5004]: I1201 08:31:19.917154 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"logging-loki-config" Dec 01 08:31:19 crc kubenswrapper[5004]: I1201 08:31:19.917404 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-distributor-http" Dec 01 08:31:19 crc kubenswrapper[5004]: I1201 08:31:19.917472 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-dockercfg-9cvzs" Dec 01 08:31:19 crc kubenswrapper[5004]: I1201 08:31:19.918402 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-distributor-grpc" Dec 01 08:31:19 crc kubenswrapper[5004]: I1201 08:31:19.935364 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-distributor-76cc67bf56-mjgbt"] Dec 01 08:31:20 crc kubenswrapper[5004]: I1201 08:31:20.060653 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-querier-5895d59bb8-wdd4k"] Dec 01 08:31:20 crc kubenswrapper[5004]: I1201 08:31:20.064676 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-querier-5895d59bb8-wdd4k" Dec 01 08:31:20 crc kubenswrapper[5004]: I1201 08:31:20.066672 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-s3" Dec 01 08:31:20 crc kubenswrapper[5004]: I1201 08:31:20.068040 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-querier-grpc" Dec 01 08:31:20 crc kubenswrapper[5004]: I1201 08:31:20.068057 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-querier-http" Dec 01 08:31:20 crc kubenswrapper[5004]: I1201 08:31:20.079868 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/462fa983-5357-44cf-afb3-4803b227bcfa-logging-loki-ca-bundle\") pod \"logging-loki-distributor-76cc67bf56-mjgbt\" (UID: \"462fa983-5357-44cf-afb3-4803b227bcfa\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-mjgbt" Dec 01 08:31:20 crc kubenswrapper[5004]: I1201 08:31:20.079997 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8q7m\" (UniqueName: \"kubernetes.io/projected/462fa983-5357-44cf-afb3-4803b227bcfa-kube-api-access-r8q7m\") pod \"logging-loki-distributor-76cc67bf56-mjgbt\" (UID: \"462fa983-5357-44cf-afb3-4803b227bcfa\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-mjgbt" Dec 01 08:31:20 crc kubenswrapper[5004]: I1201 08:31:20.080069 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/462fa983-5357-44cf-afb3-4803b227bcfa-logging-loki-distributor-grpc\") pod \"logging-loki-distributor-76cc67bf56-mjgbt\" (UID: \"462fa983-5357-44cf-afb3-4803b227bcfa\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-mjgbt" Dec 
01 08:31:20 crc kubenswrapper[5004]: I1201 08:31:20.080237 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/462fa983-5357-44cf-afb3-4803b227bcfa-config\") pod \"logging-loki-distributor-76cc67bf56-mjgbt\" (UID: \"462fa983-5357-44cf-afb3-4803b227bcfa\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-mjgbt" Dec 01 08:31:20 crc kubenswrapper[5004]: I1201 08:31:20.080543 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-distributor-http\" (UniqueName: \"kubernetes.io/secret/462fa983-5357-44cf-afb3-4803b227bcfa-logging-loki-distributor-http\") pod \"logging-loki-distributor-76cc67bf56-mjgbt\" (UID: \"462fa983-5357-44cf-afb3-4803b227bcfa\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-mjgbt" Dec 01 08:31:20 crc kubenswrapper[5004]: I1201 08:31:20.088698 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-querier-5895d59bb8-wdd4k"] Dec 01 08:31:20 crc kubenswrapper[5004]: I1201 08:31:20.135367 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-query-frontend-84558f7c9f-pkxkw"] Dec 01 08:31:20 crc kubenswrapper[5004]: I1201 08:31:20.136181 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-pkxkw" Dec 01 08:31:20 crc kubenswrapper[5004]: I1201 08:31:20.140115 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-query-frontend-http" Dec 01 08:31:20 crc kubenswrapper[5004]: I1201 08:31:20.140319 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-query-frontend-grpc" Dec 01 08:31:20 crc kubenswrapper[5004]: I1201 08:31:20.146316 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-query-frontend-84558f7c9f-pkxkw"] Dec 01 08:31:20 crc kubenswrapper[5004]: I1201 08:31:20.181973 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a551f476-d9e6-4e1c-9f48-60939bd6b6ff-config\") pod \"logging-loki-querier-5895d59bb8-wdd4k\" (UID: \"a551f476-d9e6-4e1c-9f48-60939bd6b6ff\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-wdd4k" Dec 01 08:31:20 crc kubenswrapper[5004]: I1201 08:31:20.182019 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a551f476-d9e6-4e1c-9f48-60939bd6b6ff-logging-loki-ca-bundle\") pod \"logging-loki-querier-5895d59bb8-wdd4k\" (UID: \"a551f476-d9e6-4e1c-9f48-60939bd6b6ff\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-wdd4k" Dec 01 08:31:20 crc kubenswrapper[5004]: I1201 08:31:20.182055 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-distributor-http\" (UniqueName: \"kubernetes.io/secret/462fa983-5357-44cf-afb3-4803b227bcfa-logging-loki-distributor-http\") pod \"logging-loki-distributor-76cc67bf56-mjgbt\" (UID: \"462fa983-5357-44cf-afb3-4803b227bcfa\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-mjgbt" Dec 01 08:31:20 crc 
kubenswrapper[5004]: I1201 08:31:20.182074 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/a551f476-d9e6-4e1c-9f48-60939bd6b6ff-logging-loki-s3\") pod \"logging-loki-querier-5895d59bb8-wdd4k\" (UID: \"a551f476-d9e6-4e1c-9f48-60939bd6b6ff\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-wdd4k" Dec 01 08:31:20 crc kubenswrapper[5004]: I1201 08:31:20.182101 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6vsv\" (UniqueName: \"kubernetes.io/projected/a551f476-d9e6-4e1c-9f48-60939bd6b6ff-kube-api-access-q6vsv\") pod \"logging-loki-querier-5895d59bb8-wdd4k\" (UID: \"a551f476-d9e6-4e1c-9f48-60939bd6b6ff\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-wdd4k" Dec 01 08:31:20 crc kubenswrapper[5004]: I1201 08:31:20.182127 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-querier-grpc\" (UniqueName: \"kubernetes.io/secret/a551f476-d9e6-4e1c-9f48-60939bd6b6ff-logging-loki-querier-grpc\") pod \"logging-loki-querier-5895d59bb8-wdd4k\" (UID: \"a551f476-d9e6-4e1c-9f48-60939bd6b6ff\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-wdd4k" Dec 01 08:31:20 crc kubenswrapper[5004]: I1201 08:31:20.182158 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/462fa983-5357-44cf-afb3-4803b227bcfa-logging-loki-ca-bundle\") pod \"logging-loki-distributor-76cc67bf56-mjgbt\" (UID: \"462fa983-5357-44cf-afb3-4803b227bcfa\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-mjgbt" Dec 01 08:31:20 crc kubenswrapper[5004]: I1201 08:31:20.182175 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8q7m\" (UniqueName: 
\"kubernetes.io/projected/462fa983-5357-44cf-afb3-4803b227bcfa-kube-api-access-r8q7m\") pod \"logging-loki-distributor-76cc67bf56-mjgbt\" (UID: \"462fa983-5357-44cf-afb3-4803b227bcfa\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-mjgbt" Dec 01 08:31:20 crc kubenswrapper[5004]: I1201 08:31:20.182197 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-querier-http\" (UniqueName: \"kubernetes.io/secret/a551f476-d9e6-4e1c-9f48-60939bd6b6ff-logging-loki-querier-http\") pod \"logging-loki-querier-5895d59bb8-wdd4k\" (UID: \"a551f476-d9e6-4e1c-9f48-60939bd6b6ff\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-wdd4k" Dec 01 08:31:20 crc kubenswrapper[5004]: I1201 08:31:20.182215 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/462fa983-5357-44cf-afb3-4803b227bcfa-logging-loki-distributor-grpc\") pod \"logging-loki-distributor-76cc67bf56-mjgbt\" (UID: \"462fa983-5357-44cf-afb3-4803b227bcfa\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-mjgbt" Dec 01 08:31:20 crc kubenswrapper[5004]: I1201 08:31:20.182256 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/462fa983-5357-44cf-afb3-4803b227bcfa-config\") pod \"logging-loki-distributor-76cc67bf56-mjgbt\" (UID: \"462fa983-5357-44cf-afb3-4803b227bcfa\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-mjgbt" Dec 01 08:31:20 crc kubenswrapper[5004]: I1201 08:31:20.184024 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/462fa983-5357-44cf-afb3-4803b227bcfa-config\") pod \"logging-loki-distributor-76cc67bf56-mjgbt\" (UID: \"462fa983-5357-44cf-afb3-4803b227bcfa\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-mjgbt" Dec 01 08:31:20 crc 
kubenswrapper[5004]: I1201 08:31:20.185354 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/462fa983-5357-44cf-afb3-4803b227bcfa-logging-loki-ca-bundle\") pod \"logging-loki-distributor-76cc67bf56-mjgbt\" (UID: \"462fa983-5357-44cf-afb3-4803b227bcfa\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-mjgbt" Dec 01 08:31:20 crc kubenswrapper[5004]: I1201 08:31:20.216291 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-distributor-http\" (UniqueName: \"kubernetes.io/secret/462fa983-5357-44cf-afb3-4803b227bcfa-logging-loki-distributor-http\") pod \"logging-loki-distributor-76cc67bf56-mjgbt\" (UID: \"462fa983-5357-44cf-afb3-4803b227bcfa\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-mjgbt" Dec 01 08:31:20 crc kubenswrapper[5004]: I1201 08:31:20.216965 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8q7m\" (UniqueName: \"kubernetes.io/projected/462fa983-5357-44cf-afb3-4803b227bcfa-kube-api-access-r8q7m\") pod \"logging-loki-distributor-76cc67bf56-mjgbt\" (UID: \"462fa983-5357-44cf-afb3-4803b227bcfa\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-mjgbt" Dec 01 08:31:20 crc kubenswrapper[5004]: I1201 08:31:20.218278 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/462fa983-5357-44cf-afb3-4803b227bcfa-logging-loki-distributor-grpc\") pod \"logging-loki-distributor-76cc67bf56-mjgbt\" (UID: \"462fa983-5357-44cf-afb3-4803b227bcfa\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-mjgbt" Dec 01 08:31:20 crc kubenswrapper[5004]: I1201 08:31:20.240065 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-distributor-76cc67bf56-mjgbt" Dec 01 08:31:20 crc kubenswrapper[5004]: I1201 08:31:20.284504 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6vsv\" (UniqueName: \"kubernetes.io/projected/a551f476-d9e6-4e1c-9f48-60939bd6b6ff-kube-api-access-q6vsv\") pod \"logging-loki-querier-5895d59bb8-wdd4k\" (UID: \"a551f476-d9e6-4e1c-9f48-60939bd6b6ff\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-wdd4k" Dec 01 08:31:20 crc kubenswrapper[5004]: I1201 08:31:20.284749 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1955c798-b6bd-4194-8097-889c0e86c90b-logging-loki-ca-bundle\") pod \"logging-loki-query-frontend-84558f7c9f-pkxkw\" (UID: \"1955c798-b6bd-4194-8097-889c0e86c90b\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-pkxkw" Dec 01 08:31:20 crc kubenswrapper[5004]: I1201 08:31:20.284831 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-querier-grpc\" (UniqueName: \"kubernetes.io/secret/a551f476-d9e6-4e1c-9f48-60939bd6b6ff-logging-loki-querier-grpc\") pod \"logging-loki-querier-5895d59bb8-wdd4k\" (UID: \"a551f476-d9e6-4e1c-9f48-60939bd6b6ff\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-wdd4k" Dec 01 08:31:20 crc kubenswrapper[5004]: I1201 08:31:20.284912 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-querier-http\" (UniqueName: \"kubernetes.io/secret/a551f476-d9e6-4e1c-9f48-60939bd6b6ff-logging-loki-querier-http\") pod \"logging-loki-querier-5895d59bb8-wdd4k\" (UID: \"a551f476-d9e6-4e1c-9f48-60939bd6b6ff\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-wdd4k" Dec 01 08:31:20 crc kubenswrapper[5004]: I1201 08:31:20.285045 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"logging-loki-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/1955c798-b6bd-4194-8097-889c0e86c90b-logging-loki-query-frontend-grpc\") pod \"logging-loki-query-frontend-84558f7c9f-pkxkw\" (UID: \"1955c798-b6bd-4194-8097-889c0e86c90b\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-pkxkw" Dec 01 08:31:20 crc kubenswrapper[5004]: I1201 08:31:20.285177 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzfxj\" (UniqueName: \"kubernetes.io/projected/1955c798-b6bd-4194-8097-889c0e86c90b-kube-api-access-pzfxj\") pod \"logging-loki-query-frontend-84558f7c9f-pkxkw\" (UID: \"1955c798-b6bd-4194-8097-889c0e86c90b\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-pkxkw" Dec 01 08:31:20 crc kubenswrapper[5004]: I1201 08:31:20.285266 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/1955c798-b6bd-4194-8097-889c0e86c90b-logging-loki-query-frontend-http\") pod \"logging-loki-query-frontend-84558f7c9f-pkxkw\" (UID: \"1955c798-b6bd-4194-8097-889c0e86c90b\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-pkxkw" Dec 01 08:31:20 crc kubenswrapper[5004]: I1201 08:31:20.285358 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a551f476-d9e6-4e1c-9f48-60939bd6b6ff-config\") pod \"logging-loki-querier-5895d59bb8-wdd4k\" (UID: \"a551f476-d9e6-4e1c-9f48-60939bd6b6ff\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-wdd4k" Dec 01 08:31:20 crc kubenswrapper[5004]: I1201 08:31:20.285457 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a551f476-d9e6-4e1c-9f48-60939bd6b6ff-logging-loki-ca-bundle\") pod \"logging-loki-querier-5895d59bb8-wdd4k\" 
(UID: \"a551f476-d9e6-4e1c-9f48-60939bd6b6ff\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-wdd4k" Dec 01 08:31:20 crc kubenswrapper[5004]: I1201 08:31:20.285663 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/a551f476-d9e6-4e1c-9f48-60939bd6b6ff-logging-loki-s3\") pod \"logging-loki-querier-5895d59bb8-wdd4k\" (UID: \"a551f476-d9e6-4e1c-9f48-60939bd6b6ff\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-wdd4k" Dec 01 08:31:20 crc kubenswrapper[5004]: I1201 08:31:20.285760 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1955c798-b6bd-4194-8097-889c0e86c90b-config\") pod \"logging-loki-query-frontend-84558f7c9f-pkxkw\" (UID: \"1955c798-b6bd-4194-8097-889c0e86c90b\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-pkxkw" Dec 01 08:31:20 crc kubenswrapper[5004]: I1201 08:31:20.289982 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a551f476-d9e6-4e1c-9f48-60939bd6b6ff-config\") pod \"logging-loki-querier-5895d59bb8-wdd4k\" (UID: \"a551f476-d9e6-4e1c-9f48-60939bd6b6ff\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-wdd4k" Dec 01 08:31:20 crc kubenswrapper[5004]: I1201 08:31:20.290698 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-querier-http\" (UniqueName: \"kubernetes.io/secret/a551f476-d9e6-4e1c-9f48-60939bd6b6ff-logging-loki-querier-http\") pod \"logging-loki-querier-5895d59bb8-wdd4k\" (UID: \"a551f476-d9e6-4e1c-9f48-60939bd6b6ff\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-wdd4k" Dec 01 08:31:20 crc kubenswrapper[5004]: I1201 08:31:20.291286 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/a551f476-d9e6-4e1c-9f48-60939bd6b6ff-logging-loki-ca-bundle\") pod \"logging-loki-querier-5895d59bb8-wdd4k\" (UID: \"a551f476-d9e6-4e1c-9f48-60939bd6b6ff\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-wdd4k" Dec 01 08:31:20 crc kubenswrapper[5004]: I1201 08:31:20.298390 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/a551f476-d9e6-4e1c-9f48-60939bd6b6ff-logging-loki-s3\") pod \"logging-loki-querier-5895d59bb8-wdd4k\" (UID: \"a551f476-d9e6-4e1c-9f48-60939bd6b6ff\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-wdd4k" Dec 01 08:31:20 crc kubenswrapper[5004]: I1201 08:31:20.299291 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-querier-grpc\" (UniqueName: \"kubernetes.io/secret/a551f476-d9e6-4e1c-9f48-60939bd6b6ff-logging-loki-querier-grpc\") pod \"logging-loki-querier-5895d59bb8-wdd4k\" (UID: \"a551f476-d9e6-4e1c-9f48-60939bd6b6ff\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-wdd4k" Dec 01 08:31:20 crc kubenswrapper[5004]: I1201 08:31:20.299342 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-gateway-6c96ff8676-nnfwd"] Dec 01 08:31:20 crc kubenswrapper[5004]: I1201 08:31:20.305868 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-gateway-6c96ff8676-nnfwd" Dec 01 08:31:20 crc kubenswrapper[5004]: I1201 08:31:20.308679 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-gateway" Dec 01 08:31:20 crc kubenswrapper[5004]: I1201 08:31:20.308895 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"logging-loki-gateway" Dec 01 08:31:20 crc kubenswrapper[5004]: I1201 08:31:20.309915 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-gateway-http" Dec 01 08:31:20 crc kubenswrapper[5004]: I1201 08:31:20.310151 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-gateway-client-http" Dec 01 08:31:20 crc kubenswrapper[5004]: I1201 08:31:20.310263 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"logging-loki-gateway-ca-bundle" Dec 01 08:31:20 crc kubenswrapper[5004]: I1201 08:31:20.310533 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6vsv\" (UniqueName: \"kubernetes.io/projected/a551f476-d9e6-4e1c-9f48-60939bd6b6ff-kube-api-access-q6vsv\") pod \"logging-loki-querier-5895d59bb8-wdd4k\" (UID: \"a551f476-d9e6-4e1c-9f48-60939bd6b6ff\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-wdd4k" Dec 01 08:31:20 crc kubenswrapper[5004]: I1201 08:31:20.322698 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-gateway-6c96ff8676-nnfwd"] Dec 01 08:31:20 crc kubenswrapper[5004]: I1201 08:31:20.329172 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-gateway-6c96ff8676-nph2r"] Dec 01 08:31:20 crc kubenswrapper[5004]: I1201 08:31:20.330300 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-gateway-6c96ff8676-nph2r" Dec 01 08:31:20 crc kubenswrapper[5004]: I1201 08:31:20.333928 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-gateway-dockercfg-vb6wn" Dec 01 08:31:20 crc kubenswrapper[5004]: I1201 08:31:20.337449 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-gateway-6c96ff8676-nph2r"] Dec 01 08:31:20 crc kubenswrapper[5004]: I1201 08:31:20.385476 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-querier-5895d59bb8-wdd4k" Dec 01 08:31:20 crc kubenswrapper[5004]: I1201 08:31:20.388252 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1955c798-b6bd-4194-8097-889c0e86c90b-logging-loki-ca-bundle\") pod \"logging-loki-query-frontend-84558f7c9f-pkxkw\" (UID: \"1955c798-b6bd-4194-8097-889c0e86c90b\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-pkxkw" Dec 01 08:31:20 crc kubenswrapper[5004]: I1201 08:31:20.388281 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/851b5f03-b03f-4a8b-9000-1fa733fb7465-lokistack-gateway\") pod \"logging-loki-gateway-6c96ff8676-nnfwd\" (UID: \"851b5f03-b03f-4a8b-9000-1fa733fb7465\") " pod="openshift-logging/logging-loki-gateway-6c96ff8676-nnfwd" Dec 01 08:31:20 crc kubenswrapper[5004]: I1201 08:31:20.388310 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/851b5f03-b03f-4a8b-9000-1fa733fb7465-rbac\") pod \"logging-loki-gateway-6c96ff8676-nnfwd\" (UID: \"851b5f03-b03f-4a8b-9000-1fa733fb7465\") " pod="openshift-logging/logging-loki-gateway-6c96ff8676-nnfwd" Dec 01 08:31:20 crc kubenswrapper[5004]: 
I1201 08:31:20.388341 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/851b5f03-b03f-4a8b-9000-1fa733fb7465-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-6c96ff8676-nnfwd\" (UID: \"851b5f03-b03f-4a8b-9000-1fa733fb7465\") " pod="openshift-logging/logging-loki-gateway-6c96ff8676-nnfwd" Dec 01 08:31:20 crc kubenswrapper[5004]: I1201 08:31:20.388360 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/851b5f03-b03f-4a8b-9000-1fa733fb7465-tls-secret\") pod \"logging-loki-gateway-6c96ff8676-nnfwd\" (UID: \"851b5f03-b03f-4a8b-9000-1fa733fb7465\") " pod="openshift-logging/logging-loki-gateway-6c96ff8676-nnfwd" Dec 01 08:31:20 crc kubenswrapper[5004]: I1201 08:31:20.388410 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/1955c798-b6bd-4194-8097-889c0e86c90b-logging-loki-query-frontend-grpc\") pod \"logging-loki-query-frontend-84558f7c9f-pkxkw\" (UID: \"1955c798-b6bd-4194-8097-889c0e86c90b\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-pkxkw" Dec 01 08:31:20 crc kubenswrapper[5004]: I1201 08:31:20.388438 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4nhs\" (UniqueName: \"kubernetes.io/projected/851b5f03-b03f-4a8b-9000-1fa733fb7465-kube-api-access-g4nhs\") pod \"logging-loki-gateway-6c96ff8676-nnfwd\" (UID: \"851b5f03-b03f-4a8b-9000-1fa733fb7465\") " pod="openshift-logging/logging-loki-gateway-6c96ff8676-nnfwd" Dec 01 08:31:20 crc kubenswrapper[5004]: I1201 08:31:20.388474 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-gateway-client-http\" (UniqueName: 
\"kubernetes.io/secret/851b5f03-b03f-4a8b-9000-1fa733fb7465-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-6c96ff8676-nnfwd\" (UID: \"851b5f03-b03f-4a8b-9000-1fa733fb7465\") " pod="openshift-logging/logging-loki-gateway-6c96ff8676-nnfwd" Dec 01 08:31:20 crc kubenswrapper[5004]: I1201 08:31:20.388507 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/851b5f03-b03f-4a8b-9000-1fa733fb7465-tenants\") pod \"logging-loki-gateway-6c96ff8676-nnfwd\" (UID: \"851b5f03-b03f-4a8b-9000-1fa733fb7465\") " pod="openshift-logging/logging-loki-gateway-6c96ff8676-nnfwd" Dec 01 08:31:20 crc kubenswrapper[5004]: I1201 08:31:20.388531 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/1955c798-b6bd-4194-8097-889c0e86c90b-logging-loki-query-frontend-http\") pod \"logging-loki-query-frontend-84558f7c9f-pkxkw\" (UID: \"1955c798-b6bd-4194-8097-889c0e86c90b\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-pkxkw" Dec 01 08:31:20 crc kubenswrapper[5004]: I1201 08:31:20.388549 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pzfxj\" (UniqueName: \"kubernetes.io/projected/1955c798-b6bd-4194-8097-889c0e86c90b-kube-api-access-pzfxj\") pod \"logging-loki-query-frontend-84558f7c9f-pkxkw\" (UID: \"1955c798-b6bd-4194-8097-889c0e86c90b\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-pkxkw" Dec 01 08:31:20 crc kubenswrapper[5004]: I1201 08:31:20.388619 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/851b5f03-b03f-4a8b-9000-1fa733fb7465-logging-loki-ca-bundle\") pod \"logging-loki-gateway-6c96ff8676-nnfwd\" (UID: \"851b5f03-b03f-4a8b-9000-1fa733fb7465\") " 
pod="openshift-logging/logging-loki-gateway-6c96ff8676-nnfwd" Dec 01 08:31:20 crc kubenswrapper[5004]: I1201 08:31:20.388637 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1955c798-b6bd-4194-8097-889c0e86c90b-config\") pod \"logging-loki-query-frontend-84558f7c9f-pkxkw\" (UID: \"1955c798-b6bd-4194-8097-889c0e86c90b\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-pkxkw" Dec 01 08:31:20 crc kubenswrapper[5004]: I1201 08:31:20.391357 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1955c798-b6bd-4194-8097-889c0e86c90b-config\") pod \"logging-loki-query-frontend-84558f7c9f-pkxkw\" (UID: \"1955c798-b6bd-4194-8097-889c0e86c90b\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-pkxkw" Dec 01 08:31:20 crc kubenswrapper[5004]: I1201 08:31:20.392320 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1955c798-b6bd-4194-8097-889c0e86c90b-logging-loki-ca-bundle\") pod \"logging-loki-query-frontend-84558f7c9f-pkxkw\" (UID: \"1955c798-b6bd-4194-8097-889c0e86c90b\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-pkxkw" Dec 01 08:31:20 crc kubenswrapper[5004]: I1201 08:31:20.401323 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/1955c798-b6bd-4194-8097-889c0e86c90b-logging-loki-query-frontend-http\") pod \"logging-loki-query-frontend-84558f7c9f-pkxkw\" (UID: \"1955c798-b6bd-4194-8097-889c0e86c90b\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-pkxkw" Dec 01 08:31:20 crc kubenswrapper[5004]: I1201 08:31:20.403902 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-query-frontend-grpc\" (UniqueName: 
\"kubernetes.io/secret/1955c798-b6bd-4194-8097-889c0e86c90b-logging-loki-query-frontend-grpc\") pod \"logging-loki-query-frontend-84558f7c9f-pkxkw\" (UID: \"1955c798-b6bd-4194-8097-889c0e86c90b\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-pkxkw" Dec 01 08:31:20 crc kubenswrapper[5004]: I1201 08:31:20.410463 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzfxj\" (UniqueName: \"kubernetes.io/projected/1955c798-b6bd-4194-8097-889c0e86c90b-kube-api-access-pzfxj\") pod \"logging-loki-query-frontend-84558f7c9f-pkxkw\" (UID: \"1955c798-b6bd-4194-8097-889c0e86c90b\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-pkxkw" Dec 01 08:31:20 crc kubenswrapper[5004]: I1201 08:31:20.459078 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-pkxkw" Dec 01 08:31:20 crc kubenswrapper[5004]: I1201 08:31:20.492195 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffhdk\" (UniqueName: \"kubernetes.io/projected/11a613c6-725b-4e91-867b-58b8d664dd55-kube-api-access-ffhdk\") pod \"logging-loki-gateway-6c96ff8676-nph2r\" (UID: \"11a613c6-725b-4e91-867b-58b8d664dd55\") " pod="openshift-logging/logging-loki-gateway-6c96ff8676-nph2r" Dec 01 08:31:20 crc kubenswrapper[5004]: I1201 08:31:20.492249 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/11a613c6-725b-4e91-867b-58b8d664dd55-lokistack-gateway\") pod \"logging-loki-gateway-6c96ff8676-nph2r\" (UID: \"11a613c6-725b-4e91-867b-58b8d664dd55\") " pod="openshift-logging/logging-loki-gateway-6c96ff8676-nph2r" Dec 01 08:31:20 crc kubenswrapper[5004]: I1201 08:31:20.492287 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/851b5f03-b03f-4a8b-9000-1fa733fb7465-logging-loki-ca-bundle\") pod \"logging-loki-gateway-6c96ff8676-nnfwd\" (UID: \"851b5f03-b03f-4a8b-9000-1fa733fb7465\") " pod="openshift-logging/logging-loki-gateway-6c96ff8676-nnfwd" Dec 01 08:31:20 crc kubenswrapper[5004]: I1201 08:31:20.492306 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/11a613c6-725b-4e91-867b-58b8d664dd55-tls-secret\") pod \"logging-loki-gateway-6c96ff8676-nph2r\" (UID: \"11a613c6-725b-4e91-867b-58b8d664dd55\") " pod="openshift-logging/logging-loki-gateway-6c96ff8676-nph2r" Dec 01 08:31:20 crc kubenswrapper[5004]: I1201 08:31:20.492339 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/11a613c6-725b-4e91-867b-58b8d664dd55-tenants\") pod \"logging-loki-gateway-6c96ff8676-nph2r\" (UID: \"11a613c6-725b-4e91-867b-58b8d664dd55\") " pod="openshift-logging/logging-loki-gateway-6c96ff8676-nph2r" Dec 01 08:31:20 crc kubenswrapper[5004]: I1201 08:31:20.492364 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/851b5f03-b03f-4a8b-9000-1fa733fb7465-lokistack-gateway\") pod \"logging-loki-gateway-6c96ff8676-nnfwd\" (UID: \"851b5f03-b03f-4a8b-9000-1fa733fb7465\") " pod="openshift-logging/logging-loki-gateway-6c96ff8676-nnfwd" Dec 01 08:31:20 crc kubenswrapper[5004]: I1201 08:31:20.492386 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/851b5f03-b03f-4a8b-9000-1fa733fb7465-rbac\") pod \"logging-loki-gateway-6c96ff8676-nnfwd\" (UID: \"851b5f03-b03f-4a8b-9000-1fa733fb7465\") " pod="openshift-logging/logging-loki-gateway-6c96ff8676-nnfwd" Dec 01 08:31:20 crc kubenswrapper[5004]: I1201 08:31:20.492412 5004 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/11a613c6-725b-4e91-867b-58b8d664dd55-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-6c96ff8676-nph2r\" (UID: \"11a613c6-725b-4e91-867b-58b8d664dd55\") " pod="openshift-logging/logging-loki-gateway-6c96ff8676-nph2r" Dec 01 08:31:20 crc kubenswrapper[5004]: I1201 08:31:20.492453 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/851b5f03-b03f-4a8b-9000-1fa733fb7465-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-6c96ff8676-nnfwd\" (UID: \"851b5f03-b03f-4a8b-9000-1fa733fb7465\") " pod="openshift-logging/logging-loki-gateway-6c96ff8676-nnfwd" Dec 01 08:31:20 crc kubenswrapper[5004]: I1201 08:31:20.492491 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/851b5f03-b03f-4a8b-9000-1fa733fb7465-tls-secret\") pod \"logging-loki-gateway-6c96ff8676-nnfwd\" (UID: \"851b5f03-b03f-4a8b-9000-1fa733fb7465\") " pod="openshift-logging/logging-loki-gateway-6c96ff8676-nnfwd" Dec 01 08:31:20 crc kubenswrapper[5004]: I1201 08:31:20.492652 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/11a613c6-725b-4e91-867b-58b8d664dd55-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-6c96ff8676-nph2r\" (UID: \"11a613c6-725b-4e91-867b-58b8d664dd55\") " pod="openshift-logging/logging-loki-gateway-6c96ff8676-nph2r" Dec 01 08:31:20 crc kubenswrapper[5004]: I1201 08:31:20.492685 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/11a613c6-725b-4e91-867b-58b8d664dd55-rbac\") pod \"logging-loki-gateway-6c96ff8676-nph2r\" (UID: 
\"11a613c6-725b-4e91-867b-58b8d664dd55\") " pod="openshift-logging/logging-loki-gateway-6c96ff8676-nph2r" Dec 01 08:31:20 crc kubenswrapper[5004]: I1201 08:31:20.492720 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g4nhs\" (UniqueName: \"kubernetes.io/projected/851b5f03-b03f-4a8b-9000-1fa733fb7465-kube-api-access-g4nhs\") pod \"logging-loki-gateway-6c96ff8676-nnfwd\" (UID: \"851b5f03-b03f-4a8b-9000-1fa733fb7465\") " pod="openshift-logging/logging-loki-gateway-6c96ff8676-nnfwd" Dec 01 08:31:20 crc kubenswrapper[5004]: I1201 08:31:20.492761 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/851b5f03-b03f-4a8b-9000-1fa733fb7465-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-6c96ff8676-nnfwd\" (UID: \"851b5f03-b03f-4a8b-9000-1fa733fb7465\") " pod="openshift-logging/logging-loki-gateway-6c96ff8676-nnfwd" Dec 01 08:31:20 crc kubenswrapper[5004]: I1201 08:31:20.492787 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/11a613c6-725b-4e91-867b-58b8d664dd55-logging-loki-ca-bundle\") pod \"logging-loki-gateway-6c96ff8676-nph2r\" (UID: \"11a613c6-725b-4e91-867b-58b8d664dd55\") " pod="openshift-logging/logging-loki-gateway-6c96ff8676-nph2r" Dec 01 08:31:20 crc kubenswrapper[5004]: I1201 08:31:20.492823 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/851b5f03-b03f-4a8b-9000-1fa733fb7465-tenants\") pod \"logging-loki-gateway-6c96ff8676-nnfwd\" (UID: \"851b5f03-b03f-4a8b-9000-1fa733fb7465\") " pod="openshift-logging/logging-loki-gateway-6c96ff8676-nnfwd" Dec 01 08:31:20 crc kubenswrapper[5004]: I1201 08:31:20.493692 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/851b5f03-b03f-4a8b-9000-1fa733fb7465-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-6c96ff8676-nnfwd\" (UID: \"851b5f03-b03f-4a8b-9000-1fa733fb7465\") " pod="openshift-logging/logging-loki-gateway-6c96ff8676-nnfwd" Dec 01 08:31:20 crc kubenswrapper[5004]: I1201 08:31:20.493947 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/851b5f03-b03f-4a8b-9000-1fa733fb7465-logging-loki-ca-bundle\") pod \"logging-loki-gateway-6c96ff8676-nnfwd\" (UID: \"851b5f03-b03f-4a8b-9000-1fa733fb7465\") " pod="openshift-logging/logging-loki-gateway-6c96ff8676-nnfwd" Dec 01 08:31:20 crc kubenswrapper[5004]: I1201 08:31:20.494154 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/851b5f03-b03f-4a8b-9000-1fa733fb7465-rbac\") pod \"logging-loki-gateway-6c96ff8676-nnfwd\" (UID: \"851b5f03-b03f-4a8b-9000-1fa733fb7465\") " pod="openshift-logging/logging-loki-gateway-6c96ff8676-nnfwd" Dec 01 08:31:20 crc kubenswrapper[5004]: I1201 08:31:20.494268 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/851b5f03-b03f-4a8b-9000-1fa733fb7465-lokistack-gateway\") pod \"logging-loki-gateway-6c96ff8676-nnfwd\" (UID: \"851b5f03-b03f-4a8b-9000-1fa733fb7465\") " pod="openshift-logging/logging-loki-gateway-6c96ff8676-nnfwd" Dec 01 08:31:20 crc kubenswrapper[5004]: I1201 08:31:20.497088 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/851b5f03-b03f-4a8b-9000-1fa733fb7465-tenants\") pod \"logging-loki-gateway-6c96ff8676-nnfwd\" (UID: \"851b5f03-b03f-4a8b-9000-1fa733fb7465\") " pod="openshift-logging/logging-loki-gateway-6c96ff8676-nnfwd" Dec 01 08:31:20 crc kubenswrapper[5004]: I1201 08:31:20.498322 5004 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/851b5f03-b03f-4a8b-9000-1fa733fb7465-tls-secret\") pod \"logging-loki-gateway-6c96ff8676-nnfwd\" (UID: \"851b5f03-b03f-4a8b-9000-1fa733fb7465\") " pod="openshift-logging/logging-loki-gateway-6c96ff8676-nnfwd" Dec 01 08:31:20 crc kubenswrapper[5004]: I1201 08:31:20.499426 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/851b5f03-b03f-4a8b-9000-1fa733fb7465-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-6c96ff8676-nnfwd\" (UID: \"851b5f03-b03f-4a8b-9000-1fa733fb7465\") " pod="openshift-logging/logging-loki-gateway-6c96ff8676-nnfwd" Dec 01 08:31:20 crc kubenswrapper[5004]: I1201 08:31:20.511943 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4nhs\" (UniqueName: \"kubernetes.io/projected/851b5f03-b03f-4a8b-9000-1fa733fb7465-kube-api-access-g4nhs\") pod \"logging-loki-gateway-6c96ff8676-nnfwd\" (UID: \"851b5f03-b03f-4a8b-9000-1fa733fb7465\") " pod="openshift-logging/logging-loki-gateway-6c96ff8676-nnfwd" Dec 01 08:31:20 crc kubenswrapper[5004]: I1201 08:31:20.594401 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/11a613c6-725b-4e91-867b-58b8d664dd55-rbac\") pod \"logging-loki-gateway-6c96ff8676-nph2r\" (UID: \"11a613c6-725b-4e91-867b-58b8d664dd55\") " pod="openshift-logging/logging-loki-gateway-6c96ff8676-nph2r" Dec 01 08:31:20 crc kubenswrapper[5004]: I1201 08:31:20.594496 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/11a613c6-725b-4e91-867b-58b8d664dd55-logging-loki-ca-bundle\") pod \"logging-loki-gateway-6c96ff8676-nph2r\" (UID: \"11a613c6-725b-4e91-867b-58b8d664dd55\") " 
pod="openshift-logging/logging-loki-gateway-6c96ff8676-nph2r" Dec 01 08:31:20 crc kubenswrapper[5004]: I1201 08:31:20.594532 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ffhdk\" (UniqueName: \"kubernetes.io/projected/11a613c6-725b-4e91-867b-58b8d664dd55-kube-api-access-ffhdk\") pod \"logging-loki-gateway-6c96ff8676-nph2r\" (UID: \"11a613c6-725b-4e91-867b-58b8d664dd55\") " pod="openshift-logging/logging-loki-gateway-6c96ff8676-nph2r" Dec 01 08:31:20 crc kubenswrapper[5004]: I1201 08:31:20.594763 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/11a613c6-725b-4e91-867b-58b8d664dd55-lokistack-gateway\") pod \"logging-loki-gateway-6c96ff8676-nph2r\" (UID: \"11a613c6-725b-4e91-867b-58b8d664dd55\") " pod="openshift-logging/logging-loki-gateway-6c96ff8676-nph2r" Dec 01 08:31:20 crc kubenswrapper[5004]: I1201 08:31:20.594781 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/11a613c6-725b-4e91-867b-58b8d664dd55-tls-secret\") pod \"logging-loki-gateway-6c96ff8676-nph2r\" (UID: \"11a613c6-725b-4e91-867b-58b8d664dd55\") " pod="openshift-logging/logging-loki-gateway-6c96ff8676-nph2r" Dec 01 08:31:20 crc kubenswrapper[5004]: I1201 08:31:20.594848 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/11a613c6-725b-4e91-867b-58b8d664dd55-tenants\") pod \"logging-loki-gateway-6c96ff8676-nph2r\" (UID: \"11a613c6-725b-4e91-867b-58b8d664dd55\") " pod="openshift-logging/logging-loki-gateway-6c96ff8676-nph2r" Dec 01 08:31:20 crc kubenswrapper[5004]: I1201 08:31:20.595506 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/11a613c6-725b-4e91-867b-58b8d664dd55-logging-loki-ca-bundle\") pod 
\"logging-loki-gateway-6c96ff8676-nph2r\" (UID: \"11a613c6-725b-4e91-867b-58b8d664dd55\") " pod="openshift-logging/logging-loki-gateway-6c96ff8676-nph2r" Dec 01 08:31:20 crc kubenswrapper[5004]: I1201 08:31:20.595901 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/11a613c6-725b-4e91-867b-58b8d664dd55-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-6c96ff8676-nph2r\" (UID: \"11a613c6-725b-4e91-867b-58b8d664dd55\") " pod="openshift-logging/logging-loki-gateway-6c96ff8676-nph2r" Dec 01 08:31:20 crc kubenswrapper[5004]: I1201 08:31:20.595959 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/11a613c6-725b-4e91-867b-58b8d664dd55-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-6c96ff8676-nph2r\" (UID: \"11a613c6-725b-4e91-867b-58b8d664dd55\") " pod="openshift-logging/logging-loki-gateway-6c96ff8676-nph2r" Dec 01 08:31:20 crc kubenswrapper[5004]: I1201 08:31:20.596146 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/11a613c6-725b-4e91-867b-58b8d664dd55-rbac\") pod \"logging-loki-gateway-6c96ff8676-nph2r\" (UID: \"11a613c6-725b-4e91-867b-58b8d664dd55\") " pod="openshift-logging/logging-loki-gateway-6c96ff8676-nph2r" Dec 01 08:31:20 crc kubenswrapper[5004]: I1201 08:31:20.596482 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/11a613c6-725b-4e91-867b-58b8d664dd55-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-6c96ff8676-nph2r\" (UID: \"11a613c6-725b-4e91-867b-58b8d664dd55\") " pod="openshift-logging/logging-loki-gateway-6c96ff8676-nph2r" Dec 01 08:31:20 crc kubenswrapper[5004]: I1201 08:31:20.596911 5004 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/11a613c6-725b-4e91-867b-58b8d664dd55-lokistack-gateway\") pod \"logging-loki-gateway-6c96ff8676-nph2r\" (UID: \"11a613c6-725b-4e91-867b-58b8d664dd55\") " pod="openshift-logging/logging-loki-gateway-6c96ff8676-nph2r" Dec 01 08:31:20 crc kubenswrapper[5004]: I1201 08:31:20.600901 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/11a613c6-725b-4e91-867b-58b8d664dd55-tls-secret\") pod \"logging-loki-gateway-6c96ff8676-nph2r\" (UID: \"11a613c6-725b-4e91-867b-58b8d664dd55\") " pod="openshift-logging/logging-loki-gateway-6c96ff8676-nph2r" Dec 01 08:31:20 crc kubenswrapper[5004]: I1201 08:31:20.609794 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-querier-5895d59bb8-wdd4k"] Dec 01 08:31:20 crc kubenswrapper[5004]: I1201 08:31:20.610161 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/11a613c6-725b-4e91-867b-58b8d664dd55-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-6c96ff8676-nph2r\" (UID: \"11a613c6-725b-4e91-867b-58b8d664dd55\") " pod="openshift-logging/logging-loki-gateway-6c96ff8676-nph2r" Dec 01 08:31:20 crc kubenswrapper[5004]: I1201 08:31:20.612906 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/11a613c6-725b-4e91-867b-58b8d664dd55-tenants\") pod \"logging-loki-gateway-6c96ff8676-nph2r\" (UID: \"11a613c6-725b-4e91-867b-58b8d664dd55\") " pod="openshift-logging/logging-loki-gateway-6c96ff8676-nph2r" Dec 01 08:31:20 crc kubenswrapper[5004]: I1201 08:31:20.615211 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffhdk\" (UniqueName: \"kubernetes.io/projected/11a613c6-725b-4e91-867b-58b8d664dd55-kube-api-access-ffhdk\") pod 
\"logging-loki-gateway-6c96ff8676-nph2r\" (UID: \"11a613c6-725b-4e91-867b-58b8d664dd55\") " pod="openshift-logging/logging-loki-gateway-6c96ff8676-nph2r" Dec 01 08:31:20 crc kubenswrapper[5004]: I1201 08:31:20.628330 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-gateway-6c96ff8676-nnfwd" Dec 01 08:31:20 crc kubenswrapper[5004]: I1201 08:31:20.651108 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-gateway-6c96ff8676-nph2r" Dec 01 08:31:20 crc kubenswrapper[5004]: I1201 08:31:20.715812 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-distributor-76cc67bf56-mjgbt"] Dec 01 08:31:20 crc kubenswrapper[5004]: I1201 08:31:20.914780 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-query-frontend-84558f7c9f-pkxkw"] Dec 01 08:31:21 crc kubenswrapper[5004]: I1201 08:31:21.063465 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-ingester-0"] Dec 01 08:31:21 crc kubenswrapper[5004]: I1201 08:31:21.064872 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-ingester-0" Dec 01 08:31:21 crc kubenswrapper[5004]: I1201 08:31:21.068849 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-ingester-grpc" Dec 01 08:31:21 crc kubenswrapper[5004]: I1201 08:31:21.072209 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-ingester-http" Dec 01 08:31:21 crc kubenswrapper[5004]: I1201 08:31:21.081254 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-ingester-0"] Dec 01 08:31:21 crc kubenswrapper[5004]: I1201 08:31:21.107346 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-gateway-6c96ff8676-nnfwd"] Dec 01 08:31:21 crc kubenswrapper[5004]: I1201 08:31:21.129960 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-compactor-0"] Dec 01 08:31:21 crc kubenswrapper[5004]: I1201 08:31:21.131066 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-compactor-0" Dec 01 08:31:21 crc kubenswrapper[5004]: I1201 08:31:21.137600 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-compactor-grpc" Dec 01 08:31:21 crc kubenswrapper[5004]: I1201 08:31:21.137872 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-compactor-http" Dec 01 08:31:21 crc kubenswrapper[5004]: I1201 08:31:21.152886 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-compactor-0"] Dec 01 08:31:21 crc kubenswrapper[5004]: I1201 08:31:21.178847 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-gateway-6c96ff8676-nph2r"] Dec 01 08:31:21 crc kubenswrapper[5004]: I1201 08:31:21.203304 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-index-gateway-0"] Dec 01 08:31:21 crc kubenswrapper[5004]: I1201 08:31:21.204168 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-index-gateway-0" Dec 01 08:31:21 crc kubenswrapper[5004]: I1201 08:31:21.205757 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zl5g6\" (UniqueName: \"kubernetes.io/projected/346e7dcd-bc03-4c7f-b0b0-8e5206230152-kube-api-access-zl5g6\") pod \"logging-loki-ingester-0\" (UID: \"346e7dcd-bc03-4c7f-b0b0-8e5206230152\") " pod="openshift-logging/logging-loki-ingester-0" Dec 01 08:31:21 crc kubenswrapper[5004]: I1201 08:31:21.205865 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-1a755645-8498-48aa-b23d-d6799e49570e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1a755645-8498-48aa-b23d-d6799e49570e\") pod \"logging-loki-ingester-0\" (UID: \"346e7dcd-bc03-4c7f-b0b0-8e5206230152\") " pod="openshift-logging/logging-loki-ingester-0" Dec 01 08:31:21 crc kubenswrapper[5004]: I1201 08:31:21.205898 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/346e7dcd-bc03-4c7f-b0b0-8e5206230152-logging-loki-s3\") pod \"logging-loki-ingester-0\" (UID: \"346e7dcd-bc03-4c7f-b0b0-8e5206230152\") " pod="openshift-logging/logging-loki-ingester-0" Dec 01 08:31:21 crc kubenswrapper[5004]: I1201 08:31:21.205937 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/346e7dcd-bc03-4c7f-b0b0-8e5206230152-logging-loki-ca-bundle\") pod \"logging-loki-ingester-0\" (UID: \"346e7dcd-bc03-4c7f-b0b0-8e5206230152\") " pod="openshift-logging/logging-loki-ingester-0" Dec 01 08:31:21 crc kubenswrapper[5004]: I1201 08:31:21.205968 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ingester-http\" (UniqueName: 
\"kubernetes.io/secret/346e7dcd-bc03-4c7f-b0b0-8e5206230152-logging-loki-ingester-http\") pod \"logging-loki-ingester-0\" (UID: \"346e7dcd-bc03-4c7f-b0b0-8e5206230152\") " pod="openshift-logging/logging-loki-ingester-0" Dec 01 08:31:21 crc kubenswrapper[5004]: I1201 08:31:21.205990 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/346e7dcd-bc03-4c7f-b0b0-8e5206230152-logging-loki-ingester-grpc\") pod \"logging-loki-ingester-0\" (UID: \"346e7dcd-bc03-4c7f-b0b0-8e5206230152\") " pod="openshift-logging/logging-loki-ingester-0" Dec 01 08:31:21 crc kubenswrapper[5004]: I1201 08:31:21.206013 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-cb1bb18b-4908-4415-a3cd-f155cb080a81\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cb1bb18b-4908-4415-a3cd-f155cb080a81\") pod \"logging-loki-ingester-0\" (UID: \"346e7dcd-bc03-4c7f-b0b0-8e5206230152\") " pod="openshift-logging/logging-loki-ingester-0" Dec 01 08:31:21 crc kubenswrapper[5004]: I1201 08:31:21.206041 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/346e7dcd-bc03-4c7f-b0b0-8e5206230152-config\") pod \"logging-loki-ingester-0\" (UID: \"346e7dcd-bc03-4c7f-b0b0-8e5206230152\") " pod="openshift-logging/logging-loki-ingester-0" Dec 01 08:31:21 crc kubenswrapper[5004]: I1201 08:31:21.208739 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-index-gateway-grpc" Dec 01 08:31:21 crc kubenswrapper[5004]: I1201 08:31:21.208890 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-index-gateway-http" Dec 01 08:31:21 crc kubenswrapper[5004]: I1201 08:31:21.221025 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-logging/logging-loki-index-gateway-0"] Dec 01 08:31:21 crc kubenswrapper[5004]: I1201 08:31:21.307595 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zl5g6\" (UniqueName: \"kubernetes.io/projected/346e7dcd-bc03-4c7f-b0b0-8e5206230152-kube-api-access-zl5g6\") pod \"logging-loki-ingester-0\" (UID: \"346e7dcd-bc03-4c7f-b0b0-8e5206230152\") " pod="openshift-logging/logging-loki-ingester-0" Dec 01 08:31:21 crc kubenswrapper[5004]: I1201 08:31:21.307686 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-400e0293-706d-45ac-8684-c4b63838e53a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-400e0293-706d-45ac-8684-c4b63838e53a\") pod \"logging-loki-index-gateway-0\" (UID: \"9be864a5-1434-4402-ac67-a8cc8d09090a\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 01 08:31:21 crc kubenswrapper[5004]: I1201 08:31:21.307723 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddj8k\" (UniqueName: \"kubernetes.io/projected/9be864a5-1434-4402-ac67-a8cc8d09090a-kube-api-access-ddj8k\") pod \"logging-loki-index-gateway-0\" (UID: \"9be864a5-1434-4402-ac67-a8cc8d09090a\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 01 08:31:21 crc kubenswrapper[5004]: I1201 08:31:21.307747 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/346e7dcd-bc03-4c7f-b0b0-8e5206230152-logging-loki-s3\") pod \"logging-loki-ingester-0\" (UID: \"346e7dcd-bc03-4c7f-b0b0-8e5206230152\") " pod="openshift-logging/logging-loki-ingester-0" Dec 01 08:31:21 crc kubenswrapper[5004]: I1201 08:31:21.307776 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/c1f3b2ee-067a-4887-875a-c9ca05cb65b6-logging-loki-ca-bundle\") pod \"logging-loki-compactor-0\" (UID: \"c1f3b2ee-067a-4887-875a-c9ca05cb65b6\") " pod="openshift-logging/logging-loki-compactor-0" Dec 01 08:31:21 crc kubenswrapper[5004]: I1201 08:31:21.307806 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9be864a5-1434-4402-ac67-a8cc8d09090a-logging-loki-ca-bundle\") pod \"logging-loki-index-gateway-0\" (UID: \"9be864a5-1434-4402-ac67-a8cc8d09090a\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 01 08:31:21 crc kubenswrapper[5004]: I1201 08:31:21.308946 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/c1f3b2ee-067a-4887-875a-c9ca05cb65b6-logging-loki-s3\") pod \"logging-loki-compactor-0\" (UID: \"c1f3b2ee-067a-4887-875a-c9ca05cb65b6\") " pod="openshift-logging/logging-loki-compactor-0" Dec 01 08:31:21 crc kubenswrapper[5004]: I1201 08:31:21.309955 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-95b3d6a8-3d0e-4fc1-baad-0e2895957246\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-95b3d6a8-3d0e-4fc1-baad-0e2895957246\") pod \"logging-loki-compactor-0\" (UID: \"c1f3b2ee-067a-4887-875a-c9ca05cb65b6\") " pod="openshift-logging/logging-loki-compactor-0" Dec 01 08:31:21 crc kubenswrapper[5004]: I1201 08:31:21.310004 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/9be864a5-1434-4402-ac67-a8cc8d09090a-logging-loki-index-gateway-http\") pod \"logging-loki-index-gateway-0\" (UID: \"9be864a5-1434-4402-ac67-a8cc8d09090a\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 01 08:31:21 crc kubenswrapper[5004]: 
I1201 08:31:21.310041 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9be864a5-1434-4402-ac67-a8cc8d09090a-config\") pod \"logging-loki-index-gateway-0\" (UID: \"9be864a5-1434-4402-ac67-a8cc8d09090a\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 01 08:31:21 crc kubenswrapper[5004]: I1201 08:31:21.310063 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/9be864a5-1434-4402-ac67-a8cc8d09090a-logging-loki-index-gateway-grpc\") pod \"logging-loki-index-gateway-0\" (UID: \"9be864a5-1434-4402-ac67-a8cc8d09090a\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 01 08:31:21 crc kubenswrapper[5004]: I1201 08:31:21.310484 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-1a755645-8498-48aa-b23d-d6799e49570e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1a755645-8498-48aa-b23d-d6799e49570e\") pod \"logging-loki-ingester-0\" (UID: \"346e7dcd-bc03-4c7f-b0b0-8e5206230152\") " pod="openshift-logging/logging-loki-ingester-0" Dec 01 08:31:21 crc kubenswrapper[5004]: I1201 08:31:21.310518 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1f3b2ee-067a-4887-875a-c9ca05cb65b6-config\") pod \"logging-loki-compactor-0\" (UID: \"c1f3b2ee-067a-4887-875a-c9ca05cb65b6\") " pod="openshift-logging/logging-loki-compactor-0" Dec 01 08:31:21 crc kubenswrapper[5004]: I1201 08:31:21.310539 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/346e7dcd-bc03-4c7f-b0b0-8e5206230152-logging-loki-ca-bundle\") pod \"logging-loki-ingester-0\" (UID: \"346e7dcd-bc03-4c7f-b0b0-8e5206230152\") " 
pod="openshift-logging/logging-loki-ingester-0" Dec 01 08:31:21 crc kubenswrapper[5004]: I1201 08:31:21.310577 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/c1f3b2ee-067a-4887-875a-c9ca05cb65b6-logging-loki-compactor-grpc\") pod \"logging-loki-compactor-0\" (UID: \"c1f3b2ee-067a-4887-875a-c9ca05cb65b6\") " pod="openshift-logging/logging-loki-compactor-0" Dec 01 08:31:21 crc kubenswrapper[5004]: I1201 08:31:21.310602 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ingester-http\" (UniqueName: \"kubernetes.io/secret/346e7dcd-bc03-4c7f-b0b0-8e5206230152-logging-loki-ingester-http\") pod \"logging-loki-ingester-0\" (UID: \"346e7dcd-bc03-4c7f-b0b0-8e5206230152\") " pod="openshift-logging/logging-loki-ingester-0" Dec 01 08:31:21 crc kubenswrapper[5004]: I1201 08:31:21.310629 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/346e7dcd-bc03-4c7f-b0b0-8e5206230152-logging-loki-ingester-grpc\") pod \"logging-loki-ingester-0\" (UID: \"346e7dcd-bc03-4c7f-b0b0-8e5206230152\") " pod="openshift-logging/logging-loki-ingester-0" Dec 01 08:31:21 crc kubenswrapper[5004]: I1201 08:31:21.310652 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2k4f\" (UniqueName: \"kubernetes.io/projected/c1f3b2ee-067a-4887-875a-c9ca05cb65b6-kube-api-access-t2k4f\") pod \"logging-loki-compactor-0\" (UID: \"c1f3b2ee-067a-4887-875a-c9ca05cb65b6\") " pod="openshift-logging/logging-loki-compactor-0" Dec 01 08:31:21 crc kubenswrapper[5004]: I1201 08:31:21.310679 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-cb1bb18b-4908-4415-a3cd-f155cb080a81\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cb1bb18b-4908-4415-a3cd-f155cb080a81\") pod \"logging-loki-ingester-0\" (UID: \"346e7dcd-bc03-4c7f-b0b0-8e5206230152\") " pod="openshift-logging/logging-loki-ingester-0" Dec 01 08:31:21 crc kubenswrapper[5004]: I1201 08:31:21.310706 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/346e7dcd-bc03-4c7f-b0b0-8e5206230152-config\") pod \"logging-loki-ingester-0\" (UID: \"346e7dcd-bc03-4c7f-b0b0-8e5206230152\") " pod="openshift-logging/logging-loki-ingester-0" Dec 01 08:31:21 crc kubenswrapper[5004]: I1201 08:31:21.310736 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/9be864a5-1434-4402-ac67-a8cc8d09090a-logging-loki-s3\") pod \"logging-loki-index-gateway-0\" (UID: \"9be864a5-1434-4402-ac67-a8cc8d09090a\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 01 08:31:21 crc kubenswrapper[5004]: I1201 08:31:21.310761 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-compactor-http\" (UniqueName: \"kubernetes.io/secret/c1f3b2ee-067a-4887-875a-c9ca05cb65b6-logging-loki-compactor-http\") pod \"logging-loki-compactor-0\" (UID: \"c1f3b2ee-067a-4887-875a-c9ca05cb65b6\") " pod="openshift-logging/logging-loki-compactor-0" Dec 01 08:31:21 crc kubenswrapper[5004]: I1201 08:31:21.312745 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/346e7dcd-bc03-4c7f-b0b0-8e5206230152-logging-loki-ca-bundle\") pod \"logging-loki-ingester-0\" (UID: \"346e7dcd-bc03-4c7f-b0b0-8e5206230152\") " pod="openshift-logging/logging-loki-ingester-0" Dec 01 08:31:21 crc kubenswrapper[5004]: I1201 08:31:21.313474 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/346e7dcd-bc03-4c7f-b0b0-8e5206230152-config\") pod \"logging-loki-ingester-0\" (UID: \"346e7dcd-bc03-4c7f-b0b0-8e5206230152\") " pod="openshift-logging/logging-loki-ingester-0" Dec 01 08:31:21 crc kubenswrapper[5004]: I1201 08:31:21.316844 5004 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 01 08:31:21 crc kubenswrapper[5004]: I1201 08:31:21.316899 5004 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-1a755645-8498-48aa-b23d-d6799e49570e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1a755645-8498-48aa-b23d-d6799e49570e\") pod \"logging-loki-ingester-0\" (UID: \"346e7dcd-bc03-4c7f-b0b0-8e5206230152\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/63b3a9fa3fe1bf66b49217ccc8462cb1ac7ac742b19b5d61d0a54cf35b889994/globalmount\"" pod="openshift-logging/logging-loki-ingester-0" Dec 01 08:31:21 crc kubenswrapper[5004]: I1201 08:31:21.317718 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/346e7dcd-bc03-4c7f-b0b0-8e5206230152-logging-loki-s3\") pod \"logging-loki-ingester-0\" (UID: \"346e7dcd-bc03-4c7f-b0b0-8e5206230152\") " pod="openshift-logging/logging-loki-ingester-0" Dec 01 08:31:21 crc kubenswrapper[5004]: I1201 08:31:21.319745 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/346e7dcd-bc03-4c7f-b0b0-8e5206230152-logging-loki-ingester-grpc\") pod \"logging-loki-ingester-0\" (UID: \"346e7dcd-bc03-4c7f-b0b0-8e5206230152\") " pod="openshift-logging/logging-loki-ingester-0" Dec 01 08:31:21 crc kubenswrapper[5004]: I1201 08:31:21.326851 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-distributor-76cc67bf56-mjgbt" 
event={"ID":"462fa983-5357-44cf-afb3-4803b227bcfa","Type":"ContainerStarted","Data":"7be53b8d598410dbc7a4d0261247f8c56d5866b60e0bead533aa3e552abe7561"} Dec 01 08:31:21 crc kubenswrapper[5004]: I1201 08:31:21.334271 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ingester-http\" (UniqueName: \"kubernetes.io/secret/346e7dcd-bc03-4c7f-b0b0-8e5206230152-logging-loki-ingester-http\") pod \"logging-loki-ingester-0\" (UID: \"346e7dcd-bc03-4c7f-b0b0-8e5206230152\") " pod="openshift-logging/logging-loki-ingester-0" Dec 01 08:31:21 crc kubenswrapper[5004]: I1201 08:31:21.335124 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-6c96ff8676-nph2r" event={"ID":"11a613c6-725b-4e91-867b-58b8d664dd55","Type":"ContainerStarted","Data":"da34a7dfdb48081efecad95ca3ece5120c49c606878a5de772b78385de165790"} Dec 01 08:31:21 crc kubenswrapper[5004]: I1201 08:31:21.336677 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-6c96ff8676-nnfwd" event={"ID":"851b5f03-b03f-4a8b-9000-1fa733fb7465","Type":"ContainerStarted","Data":"f540c6f0472d7ebbbe4d50ec495f6ac2f3eb8d01d9ce030e5229d5260f0dbcac"} Dec 01 08:31:21 crc kubenswrapper[5004]: I1201 08:31:21.342359 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-pkxkw" event={"ID":"1955c798-b6bd-4194-8097-889c0e86c90b","Type":"ContainerStarted","Data":"7d5942b1727d0b7ebb62c0b7773dea5c2d719840f104e8d4b92cdd2dc378acc4"} Dec 01 08:31:21 crc kubenswrapper[5004]: I1201 08:31:21.344268 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-querier-5895d59bb8-wdd4k" event={"ID":"a551f476-d9e6-4e1c-9f48-60939bd6b6ff","Type":"ContainerStarted","Data":"e2574dba1111694bdfd1f3d03946e92d14eceabed1899182c54398a81bc8ffbd"} Dec 01 08:31:21 crc kubenswrapper[5004]: I1201 08:31:21.344374 5004 csi_attacher.go:380] kubernetes.io/csi: 
attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 01 08:31:21 crc kubenswrapper[5004]: I1201 08:31:21.344413 5004 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-cb1bb18b-4908-4415-a3cd-f155cb080a81\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cb1bb18b-4908-4415-a3cd-f155cb080a81\") pod \"logging-loki-ingester-0\" (UID: \"346e7dcd-bc03-4c7f-b0b0-8e5206230152\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/c59dc70ffa996ccba59bd3fa0072b78c8278c32b6767fd806a1cc223b7b9641a/globalmount\"" pod="openshift-logging/logging-loki-ingester-0" Dec 01 08:31:21 crc kubenswrapper[5004]: I1201 08:31:21.357962 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zl5g6\" (UniqueName: \"kubernetes.io/projected/346e7dcd-bc03-4c7f-b0b0-8e5206230152-kube-api-access-zl5g6\") pod \"logging-loki-ingester-0\" (UID: \"346e7dcd-bc03-4c7f-b0b0-8e5206230152\") " pod="openshift-logging/logging-loki-ingester-0" Dec 01 08:31:21 crc kubenswrapper[5004]: I1201 08:31:21.368992 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-1a755645-8498-48aa-b23d-d6799e49570e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1a755645-8498-48aa-b23d-d6799e49570e\") pod \"logging-loki-ingester-0\" (UID: \"346e7dcd-bc03-4c7f-b0b0-8e5206230152\") " pod="openshift-logging/logging-loki-ingester-0" Dec 01 08:31:21 crc kubenswrapper[5004]: I1201 08:31:21.376578 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-cb1bb18b-4908-4415-a3cd-f155cb080a81\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cb1bb18b-4908-4415-a3cd-f155cb080a81\") pod \"logging-loki-ingester-0\" (UID: \"346e7dcd-bc03-4c7f-b0b0-8e5206230152\") " pod="openshift-logging/logging-loki-ingester-0" Dec 01 08:31:21 crc kubenswrapper[5004]: I1201 08:31:21.380943 5004 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-ingester-0" Dec 01 08:31:21 crc kubenswrapper[5004]: I1201 08:31:21.412428 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-400e0293-706d-45ac-8684-c4b63838e53a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-400e0293-706d-45ac-8684-c4b63838e53a\") pod \"logging-loki-index-gateway-0\" (UID: \"9be864a5-1434-4402-ac67-a8cc8d09090a\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 01 08:31:21 crc kubenswrapper[5004]: I1201 08:31:21.412496 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddj8k\" (UniqueName: \"kubernetes.io/projected/9be864a5-1434-4402-ac67-a8cc8d09090a-kube-api-access-ddj8k\") pod \"logging-loki-index-gateway-0\" (UID: \"9be864a5-1434-4402-ac67-a8cc8d09090a\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 01 08:31:21 crc kubenswrapper[5004]: I1201 08:31:21.412542 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c1f3b2ee-067a-4887-875a-c9ca05cb65b6-logging-loki-ca-bundle\") pod \"logging-loki-compactor-0\" (UID: \"c1f3b2ee-067a-4887-875a-c9ca05cb65b6\") " pod="openshift-logging/logging-loki-compactor-0" Dec 01 08:31:21 crc kubenswrapper[5004]: I1201 08:31:21.412628 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9be864a5-1434-4402-ac67-a8cc8d09090a-logging-loki-ca-bundle\") pod \"logging-loki-index-gateway-0\" (UID: \"9be864a5-1434-4402-ac67-a8cc8d09090a\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 01 08:31:21 crc kubenswrapper[5004]: I1201 08:31:21.412663 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-s3\" (UniqueName: 
\"kubernetes.io/secret/c1f3b2ee-067a-4887-875a-c9ca05cb65b6-logging-loki-s3\") pod \"logging-loki-compactor-0\" (UID: \"c1f3b2ee-067a-4887-875a-c9ca05cb65b6\") " pod="openshift-logging/logging-loki-compactor-0" Dec 01 08:31:21 crc kubenswrapper[5004]: I1201 08:31:21.412688 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-95b3d6a8-3d0e-4fc1-baad-0e2895957246\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-95b3d6a8-3d0e-4fc1-baad-0e2895957246\") pod \"logging-loki-compactor-0\" (UID: \"c1f3b2ee-067a-4887-875a-c9ca05cb65b6\") " pod="openshift-logging/logging-loki-compactor-0" Dec 01 08:31:21 crc kubenswrapper[5004]: I1201 08:31:21.412726 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/9be864a5-1434-4402-ac67-a8cc8d09090a-logging-loki-index-gateway-http\") pod \"logging-loki-index-gateway-0\" (UID: \"9be864a5-1434-4402-ac67-a8cc8d09090a\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 01 08:31:21 crc kubenswrapper[5004]: I1201 08:31:21.412765 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9be864a5-1434-4402-ac67-a8cc8d09090a-config\") pod \"logging-loki-index-gateway-0\" (UID: \"9be864a5-1434-4402-ac67-a8cc8d09090a\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 01 08:31:21 crc kubenswrapper[5004]: I1201 08:31:21.412787 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/9be864a5-1434-4402-ac67-a8cc8d09090a-logging-loki-index-gateway-grpc\") pod \"logging-loki-index-gateway-0\" (UID: \"9be864a5-1434-4402-ac67-a8cc8d09090a\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 01 08:31:21 crc kubenswrapper[5004]: I1201 08:31:21.412822 5004 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1f3b2ee-067a-4887-875a-c9ca05cb65b6-config\") pod \"logging-loki-compactor-0\" (UID: \"c1f3b2ee-067a-4887-875a-c9ca05cb65b6\") " pod="openshift-logging/logging-loki-compactor-0" Dec 01 08:31:21 crc kubenswrapper[5004]: I1201 08:31:21.412850 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/c1f3b2ee-067a-4887-875a-c9ca05cb65b6-logging-loki-compactor-grpc\") pod \"logging-loki-compactor-0\" (UID: \"c1f3b2ee-067a-4887-875a-c9ca05cb65b6\") " pod="openshift-logging/logging-loki-compactor-0" Dec 01 08:31:21 crc kubenswrapper[5004]: I1201 08:31:21.412874 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2k4f\" (UniqueName: \"kubernetes.io/projected/c1f3b2ee-067a-4887-875a-c9ca05cb65b6-kube-api-access-t2k4f\") pod \"logging-loki-compactor-0\" (UID: \"c1f3b2ee-067a-4887-875a-c9ca05cb65b6\") " pod="openshift-logging/logging-loki-compactor-0" Dec 01 08:31:21 crc kubenswrapper[5004]: I1201 08:31:21.412907 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/9be864a5-1434-4402-ac67-a8cc8d09090a-logging-loki-s3\") pod \"logging-loki-index-gateway-0\" (UID: \"9be864a5-1434-4402-ac67-a8cc8d09090a\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 01 08:31:21 crc kubenswrapper[5004]: I1201 08:31:21.412932 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-compactor-http\" (UniqueName: \"kubernetes.io/secret/c1f3b2ee-067a-4887-875a-c9ca05cb65b6-logging-loki-compactor-http\") pod \"logging-loki-compactor-0\" (UID: \"c1f3b2ee-067a-4887-875a-c9ca05cb65b6\") " pod="openshift-logging/logging-loki-compactor-0" Dec 01 08:31:21 crc kubenswrapper[5004]: I1201 08:31:21.413896 5004 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9be864a5-1434-4402-ac67-a8cc8d09090a-logging-loki-ca-bundle\") pod \"logging-loki-index-gateway-0\" (UID: \"9be864a5-1434-4402-ac67-a8cc8d09090a\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 01 08:31:21 crc kubenswrapper[5004]: I1201 08:31:21.414185 5004 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 01 08:31:21 crc kubenswrapper[5004]: I1201 08:31:21.414214 5004 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-400e0293-706d-45ac-8684-c4b63838e53a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-400e0293-706d-45ac-8684-c4b63838e53a\") pod \"logging-loki-index-gateway-0\" (UID: \"9be864a5-1434-4402-ac67-a8cc8d09090a\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1d3202b69269d22d03974c60b3510929eb48cc9d1df45305435546b0e78ba33a/globalmount\"" pod="openshift-logging/logging-loki-index-gateway-0" Dec 01 08:31:21 crc kubenswrapper[5004]: I1201 08:31:21.414614 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c1f3b2ee-067a-4887-875a-c9ca05cb65b6-logging-loki-ca-bundle\") pod \"logging-loki-compactor-0\" (UID: \"c1f3b2ee-067a-4887-875a-c9ca05cb65b6\") " pod="openshift-logging/logging-loki-compactor-0" Dec 01 08:31:21 crc kubenswrapper[5004]: I1201 08:31:21.414729 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1f3b2ee-067a-4887-875a-c9ca05cb65b6-config\") pod \"logging-loki-compactor-0\" (UID: \"c1f3b2ee-067a-4887-875a-c9ca05cb65b6\") " pod="openshift-logging/logging-loki-compactor-0" Dec 01 08:31:21 crc kubenswrapper[5004]: I1201 08:31:21.415909 5004 csi_attacher.go:380] kubernetes.io/csi: 
attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 01 08:31:21 crc kubenswrapper[5004]: I1201 08:31:21.415943 5004 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-95b3d6a8-3d0e-4fc1-baad-0e2895957246\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-95b3d6a8-3d0e-4fc1-baad-0e2895957246\") pod \"logging-loki-compactor-0\" (UID: \"c1f3b2ee-067a-4887-875a-c9ca05cb65b6\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/2b0935050c778fc7cd2329c772b14bbde0c8ea526fc6a9c28f3ee51bacff21d3/globalmount\"" pod="openshift-logging/logging-loki-compactor-0" Dec 01 08:31:21 crc kubenswrapper[5004]: I1201 08:31:21.418491 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/c1f3b2ee-067a-4887-875a-c9ca05cb65b6-logging-loki-s3\") pod \"logging-loki-compactor-0\" (UID: \"c1f3b2ee-067a-4887-875a-c9ca05cb65b6\") " pod="openshift-logging/logging-loki-compactor-0" Dec 01 08:31:21 crc kubenswrapper[5004]: I1201 08:31:21.418697 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9be864a5-1434-4402-ac67-a8cc8d09090a-config\") pod \"logging-loki-index-gateway-0\" (UID: \"9be864a5-1434-4402-ac67-a8cc8d09090a\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 01 08:31:21 crc kubenswrapper[5004]: I1201 08:31:21.419622 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/9be864a5-1434-4402-ac67-a8cc8d09090a-logging-loki-s3\") pod \"logging-loki-index-gateway-0\" (UID: \"9be864a5-1434-4402-ac67-a8cc8d09090a\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 01 08:31:21 crc kubenswrapper[5004]: I1201 08:31:21.419763 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"logging-loki-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/9be864a5-1434-4402-ac67-a8cc8d09090a-logging-loki-index-gateway-grpc\") pod \"logging-loki-index-gateway-0\" (UID: \"9be864a5-1434-4402-ac67-a8cc8d09090a\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 01 08:31:21 crc kubenswrapper[5004]: I1201 08:31:21.420048 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/c1f3b2ee-067a-4887-875a-c9ca05cb65b6-logging-loki-compactor-grpc\") pod \"logging-loki-compactor-0\" (UID: \"c1f3b2ee-067a-4887-875a-c9ca05cb65b6\") " pod="openshift-logging/logging-loki-compactor-0" Dec 01 08:31:21 crc kubenswrapper[5004]: I1201 08:31:21.420420 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-compactor-http\" (UniqueName: \"kubernetes.io/secret/c1f3b2ee-067a-4887-875a-c9ca05cb65b6-logging-loki-compactor-http\") pod \"logging-loki-compactor-0\" (UID: \"c1f3b2ee-067a-4887-875a-c9ca05cb65b6\") " pod="openshift-logging/logging-loki-compactor-0" Dec 01 08:31:21 crc kubenswrapper[5004]: I1201 08:31:21.425131 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/9be864a5-1434-4402-ac67-a8cc8d09090a-logging-loki-index-gateway-http\") pod \"logging-loki-index-gateway-0\" (UID: \"9be864a5-1434-4402-ac67-a8cc8d09090a\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 01 08:31:21 crc kubenswrapper[5004]: I1201 08:31:21.429820 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2k4f\" (UniqueName: \"kubernetes.io/projected/c1f3b2ee-067a-4887-875a-c9ca05cb65b6-kube-api-access-t2k4f\") pod \"logging-loki-compactor-0\" (UID: \"c1f3b2ee-067a-4887-875a-c9ca05cb65b6\") " pod="openshift-logging/logging-loki-compactor-0" Dec 01 08:31:21 crc kubenswrapper[5004]: I1201 08:31:21.432669 5004 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddj8k\" (UniqueName: \"kubernetes.io/projected/9be864a5-1434-4402-ac67-a8cc8d09090a-kube-api-access-ddj8k\") pod \"logging-loki-index-gateway-0\" (UID: \"9be864a5-1434-4402-ac67-a8cc8d09090a\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 01 08:31:21 crc kubenswrapper[5004]: I1201 08:31:21.446521 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-95b3d6a8-3d0e-4fc1-baad-0e2895957246\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-95b3d6a8-3d0e-4fc1-baad-0e2895957246\") pod \"logging-loki-compactor-0\" (UID: \"c1f3b2ee-067a-4887-875a-c9ca05cb65b6\") " pod="openshift-logging/logging-loki-compactor-0" Dec 01 08:31:21 crc kubenswrapper[5004]: I1201 08:31:21.449102 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-400e0293-706d-45ac-8684-c4b63838e53a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-400e0293-706d-45ac-8684-c4b63838e53a\") pod \"logging-loki-index-gateway-0\" (UID: \"9be864a5-1434-4402-ac67-a8cc8d09090a\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 01 08:31:21 crc kubenswrapper[5004]: I1201 08:31:21.454101 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-compactor-0" Dec 01 08:31:21 crc kubenswrapper[5004]: I1201 08:31:21.600102 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-index-gateway-0" Dec 01 08:31:21 crc kubenswrapper[5004]: I1201 08:31:21.607816 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-ingester-0"] Dec 01 08:31:21 crc kubenswrapper[5004]: W1201 08:31:21.620476 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod346e7dcd_bc03_4c7f_b0b0_8e5206230152.slice/crio-ac90939f05f63d4c22f790ffe06f0b27ac4d3fd383870d60723526c4d9279f74 WatchSource:0}: Error finding container ac90939f05f63d4c22f790ffe06f0b27ac4d3fd383870d60723526c4d9279f74: Status 404 returned error can't find the container with id ac90939f05f63d4c22f790ffe06f0b27ac4d3fd383870d60723526c4d9279f74 Dec 01 08:31:21 crc kubenswrapper[5004]: I1201 08:31:21.682446 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-compactor-0"] Dec 01 08:31:21 crc kubenswrapper[5004]: W1201 08:31:21.701857 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc1f3b2ee_067a_4887_875a_c9ca05cb65b6.slice/crio-e44f13be44d098a0720ed3a2d8a82f3d1fdf0a4abf489755c142c37b4177c089 WatchSource:0}: Error finding container e44f13be44d098a0720ed3a2d8a82f3d1fdf0a4abf489755c142c37b4177c089: Status 404 returned error can't find the container with id e44f13be44d098a0720ed3a2d8a82f3d1fdf0a4abf489755c142c37b4177c089 Dec 01 08:31:22 crc kubenswrapper[5004]: I1201 08:31:22.020844 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-index-gateway-0"] Dec 01 08:31:22 crc kubenswrapper[5004]: I1201 08:31:22.354174 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-index-gateway-0" event={"ID":"9be864a5-1434-4402-ac67-a8cc8d09090a","Type":"ContainerStarted","Data":"6617d221c70a87a20006903cd985257e5bdbacb161e631d1f5b35e795fc95127"} Dec 01 
08:31:22 crc kubenswrapper[5004]: I1201 08:31:22.356491 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-compactor-0" event={"ID":"c1f3b2ee-067a-4887-875a-c9ca05cb65b6","Type":"ContainerStarted","Data":"e44f13be44d098a0720ed3a2d8a82f3d1fdf0a4abf489755c142c37b4177c089"} Dec 01 08:31:22 crc kubenswrapper[5004]: I1201 08:31:22.358352 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-ingester-0" event={"ID":"346e7dcd-bc03-4c7f-b0b0-8e5206230152","Type":"ContainerStarted","Data":"ac90939f05f63d4c22f790ffe06f0b27ac4d3fd383870d60723526c4d9279f74"} Dec 01 08:31:25 crc kubenswrapper[5004]: I1201 08:31:25.380784 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-6c96ff8676-nnfwd" event={"ID":"851b5f03-b03f-4a8b-9000-1fa733fb7465","Type":"ContainerStarted","Data":"8b5309794ec90548b6bf932736e588e25bbaa2f3af71fc7bc27a9d92efcb58ad"} Dec 01 08:31:25 crc kubenswrapper[5004]: I1201 08:31:25.383324 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-pkxkw" event={"ID":"1955c798-b6bd-4194-8097-889c0e86c90b","Type":"ContainerStarted","Data":"ff4fa5caef08434a402182278e033af1c20ccdef883eb5cbd8f9339668bb1532"} Dec 01 08:31:25 crc kubenswrapper[5004]: I1201 08:31:25.383474 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-pkxkw" Dec 01 08:31:25 crc kubenswrapper[5004]: I1201 08:31:25.385426 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-querier-5895d59bb8-wdd4k" event={"ID":"a551f476-d9e6-4e1c-9f48-60939bd6b6ff","Type":"ContainerStarted","Data":"28a6013924428cf9e048ae1518e483f8c6be2a4c7331cd3ce3c4043e4b11aaf5"} Dec 01 08:31:25 crc kubenswrapper[5004]: I1201 08:31:25.385621 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-logging/logging-loki-querier-5895d59bb8-wdd4k" Dec 01 08:31:25 crc kubenswrapper[5004]: I1201 08:31:25.386893 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-compactor-0" event={"ID":"c1f3b2ee-067a-4887-875a-c9ca05cb65b6","Type":"ContainerStarted","Data":"153321b024d37b91f81b8f31d4afe6c3402726e6f970d83f2bcdbdb08adbda66"} Dec 01 08:31:25 crc kubenswrapper[5004]: I1201 08:31:25.386951 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-compactor-0" Dec 01 08:31:25 crc kubenswrapper[5004]: I1201 08:31:25.388970 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-ingester-0" event={"ID":"346e7dcd-bc03-4c7f-b0b0-8e5206230152","Type":"ContainerStarted","Data":"9726917782f0c763c71003f8e3507da64ae3795f08161010fb122b32b5ad12f2"} Dec 01 08:31:25 crc kubenswrapper[5004]: I1201 08:31:25.389034 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-ingester-0" Dec 01 08:31:25 crc kubenswrapper[5004]: I1201 08:31:25.390716 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-distributor-76cc67bf56-mjgbt" event={"ID":"462fa983-5357-44cf-afb3-4803b227bcfa","Type":"ContainerStarted","Data":"7417d527eeee5a3f4743b91065233bab9c50eccf0077048f9b4a842cfa239fde"} Dec 01 08:31:25 crc kubenswrapper[5004]: I1201 08:31:25.390776 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-distributor-76cc67bf56-mjgbt" Dec 01 08:31:25 crc kubenswrapper[5004]: I1201 08:31:25.391928 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-index-gateway-0" event={"ID":"9be864a5-1434-4402-ac67-a8cc8d09090a","Type":"ContainerStarted","Data":"e08f546ac644d4e2286d7b24d0635279d83b3b2134ef9fc2278e537409833a75"} Dec 01 08:31:25 crc kubenswrapper[5004]: I1201 08:31:25.392039 5004 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-index-gateway-0" Dec 01 08:31:25 crc kubenswrapper[5004]: I1201 08:31:25.401846 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-6c96ff8676-nph2r" event={"ID":"11a613c6-725b-4e91-867b-58b8d664dd55","Type":"ContainerStarted","Data":"af164ef9fc9ef12c3b84888eb2d19194d0088266d9f894f9090aa2840bcc799f"} Dec 01 08:31:25 crc kubenswrapper[5004]: I1201 08:31:25.412028 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-pkxkw" podStartSLOduration=1.802994844 podStartE2EDuration="5.412002942s" podCreationTimestamp="2025-12-01 08:31:20 +0000 UTC" firstStartedPulling="2025-12-01 08:31:20.923884655 +0000 UTC m=+858.488876637" lastFinishedPulling="2025-12-01 08:31:24.532892743 +0000 UTC m=+862.097884735" observedRunningTime="2025-12-01 08:31:25.404418534 +0000 UTC m=+862.969410526" watchObservedRunningTime="2025-12-01 08:31:25.412002942 +0000 UTC m=+862.976994944" Dec 01 08:31:25 crc kubenswrapper[5004]: I1201 08:31:25.429938 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-distributor-76cc67bf56-mjgbt" podStartSLOduration=2.617470245 podStartE2EDuration="6.429915506s" podCreationTimestamp="2025-12-01 08:31:19 +0000 UTC" firstStartedPulling="2025-12-01 08:31:20.729057235 +0000 UTC m=+858.294049217" lastFinishedPulling="2025-12-01 08:31:24.541502486 +0000 UTC m=+862.106494478" observedRunningTime="2025-12-01 08:31:25.426414619 +0000 UTC m=+862.991406621" watchObservedRunningTime="2025-12-01 08:31:25.429915506 +0000 UTC m=+862.994907488" Dec 01 08:31:25 crc kubenswrapper[5004]: I1201 08:31:25.448028 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-ingester-0" podStartSLOduration=2.493319653 podStartE2EDuration="5.448012795s" 
podCreationTimestamp="2025-12-01 08:31:20 +0000 UTC" firstStartedPulling="2025-12-01 08:31:21.624581971 +0000 UTC m=+859.189573953" lastFinishedPulling="2025-12-01 08:31:24.579275103 +0000 UTC m=+862.144267095" observedRunningTime="2025-12-01 08:31:25.442639351 +0000 UTC m=+863.007631343" watchObservedRunningTime="2025-12-01 08:31:25.448012795 +0000 UTC m=+863.013004777" Dec 01 08:31:25 crc kubenswrapper[5004]: I1201 08:31:25.465444 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-index-gateway-0" podStartSLOduration=2.932845166 podStartE2EDuration="5.465426536s" podCreationTimestamp="2025-12-01 08:31:20 +0000 UTC" firstStartedPulling="2025-12-01 08:31:22.031843545 +0000 UTC m=+859.596835577" lastFinishedPulling="2025-12-01 08:31:24.564424965 +0000 UTC m=+862.129416947" observedRunningTime="2025-12-01 08:31:25.461821587 +0000 UTC m=+863.026813569" watchObservedRunningTime="2025-12-01 08:31:25.465426536 +0000 UTC m=+863.030418528" Dec 01 08:31:25 crc kubenswrapper[5004]: I1201 08:31:25.488537 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-compactor-0" podStartSLOduration=2.57105318 podStartE2EDuration="5.488518239s" podCreationTimestamp="2025-12-01 08:31:20 +0000 UTC" firstStartedPulling="2025-12-01 08:31:21.704912132 +0000 UTC m=+859.269904124" lastFinishedPulling="2025-12-01 08:31:24.622377181 +0000 UTC m=+862.187369183" observedRunningTime="2025-12-01 08:31:25.488041876 +0000 UTC m=+863.053033878" watchObservedRunningTime="2025-12-01 08:31:25.488518239 +0000 UTC m=+863.053510241" Dec 01 08:31:25 crc kubenswrapper[5004]: I1201 08:31:25.507293 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-querier-5895d59bb8-wdd4k" podStartSLOduration=1.535768391 podStartE2EDuration="5.507276303s" podCreationTimestamp="2025-12-01 08:31:20 +0000 UTC" firstStartedPulling="2025-12-01 08:31:20.623167312 
+0000 UTC m=+858.188159294" lastFinishedPulling="2025-12-01 08:31:24.594675224 +0000 UTC m=+862.159667206" observedRunningTime="2025-12-01 08:31:25.506469493 +0000 UTC m=+863.071461495" watchObservedRunningTime="2025-12-01 08:31:25.507276303 +0000 UTC m=+863.072268295" Dec 01 08:31:28 crc kubenswrapper[5004]: I1201 08:31:28.430423 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-6c96ff8676-nph2r" event={"ID":"11a613c6-725b-4e91-867b-58b8d664dd55","Type":"ContainerStarted","Data":"db9d93ecdb0d0335aeda8824992e0b8166754552a1b28b1ac162dd655e8acb11"} Dec 01 08:31:28 crc kubenswrapper[5004]: I1201 08:31:28.431189 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-gateway-6c96ff8676-nph2r" Dec 01 08:31:28 crc kubenswrapper[5004]: I1201 08:31:28.431212 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-gateway-6c96ff8676-nph2r" Dec 01 08:31:28 crc kubenswrapper[5004]: I1201 08:31:28.432853 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-6c96ff8676-nnfwd" event={"ID":"851b5f03-b03f-4a8b-9000-1fa733fb7465","Type":"ContainerStarted","Data":"2bed3cc5acb191d34293329ba7ade80f65aa4eaff762368cba842c12ce44225d"} Dec 01 08:31:28 crc kubenswrapper[5004]: I1201 08:31:28.433186 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-gateway-6c96ff8676-nnfwd" Dec 01 08:31:28 crc kubenswrapper[5004]: I1201 08:31:28.433236 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-gateway-6c96ff8676-nnfwd" Dec 01 08:31:28 crc kubenswrapper[5004]: I1201 08:31:28.448373 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-gateway-6c96ff8676-nph2r" Dec 01 08:31:28 crc kubenswrapper[5004]: I1201 08:31:28.460153 5004 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-gateway-6c96ff8676-nph2r" Dec 01 08:31:28 crc kubenswrapper[5004]: I1201 08:31:28.460711 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-gateway-6c96ff8676-nnfwd" Dec 01 08:31:28 crc kubenswrapper[5004]: I1201 08:31:28.465666 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-gateway-6c96ff8676-nnfwd" Dec 01 08:31:28 crc kubenswrapper[5004]: I1201 08:31:28.494879 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-gateway-6c96ff8676-nph2r" podStartSLOduration=2.308078182 podStartE2EDuration="8.49484924s" podCreationTimestamp="2025-12-01 08:31:20 +0000 UTC" firstStartedPulling="2025-12-01 08:31:21.174463845 +0000 UTC m=+858.739455837" lastFinishedPulling="2025-12-01 08:31:27.361234903 +0000 UTC m=+864.926226895" observedRunningTime="2025-12-01 08:31:28.463606255 +0000 UTC m=+866.028598277" watchObservedRunningTime="2025-12-01 08:31:28.49484924 +0000 UTC m=+866.059841262" Dec 01 08:31:28 crc kubenswrapper[5004]: I1201 08:31:28.508784 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-gateway-6c96ff8676-nnfwd" podStartSLOduration=2.263163049 podStartE2EDuration="8.508749364s" podCreationTimestamp="2025-12-01 08:31:20 +0000 UTC" firstStartedPulling="2025-12-01 08:31:21.11049496 +0000 UTC m=+858.675486952" lastFinishedPulling="2025-12-01 08:31:27.356081275 +0000 UTC m=+864.921073267" observedRunningTime="2025-12-01 08:31:28.497606128 +0000 UTC m=+866.062598170" watchObservedRunningTime="2025-12-01 08:31:28.508749364 +0000 UTC m=+866.073741396" Dec 01 08:31:40 crc kubenswrapper[5004]: I1201 08:31:40.248860 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-distributor-76cc67bf56-mjgbt" Dec 01 
08:31:40 crc kubenswrapper[5004]: I1201 08:31:40.394146 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-querier-5895d59bb8-wdd4k" Dec 01 08:31:40 crc kubenswrapper[5004]: I1201 08:31:40.467041 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-pkxkw" Dec 01 08:31:41 crc kubenswrapper[5004]: I1201 08:31:41.389490 5004 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=Ingester not ready: this instance owns no tokens Dec 01 08:31:41 crc kubenswrapper[5004]: I1201 08:31:41.389615 5004 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="346e7dcd-bc03-4c7f-b0b0-8e5206230152" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Dec 01 08:31:41 crc kubenswrapper[5004]: I1201 08:31:41.462852 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-compactor-0" Dec 01 08:31:41 crc kubenswrapper[5004]: I1201 08:31:41.606187 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-index-gateway-0" Dec 01 08:31:51 crc kubenswrapper[5004]: I1201 08:31:51.390710 5004 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=Ingester not ready: this instance owns no tokens Dec 01 08:31:51 crc kubenswrapper[5004]: I1201 08:31:51.391283 5004 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="346e7dcd-bc03-4c7f-b0b0-8e5206230152" containerName="loki-ingester" probeResult="failure" 
output="HTTP probe failed with statuscode: 503" Dec 01 08:32:01 crc kubenswrapper[5004]: I1201 08:32:01.388963 5004 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=Ingester not ready: waiting for 15s after being ready Dec 01 08:32:01 crc kubenswrapper[5004]: I1201 08:32:01.389598 5004 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="346e7dcd-bc03-4c7f-b0b0-8e5206230152" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Dec 01 08:32:05 crc kubenswrapper[5004]: I1201 08:32:05.173098 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-pm8rg"] Dec 01 08:32:05 crc kubenswrapper[5004]: I1201 08:32:05.175319 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pm8rg" Dec 01 08:32:05 crc kubenswrapper[5004]: I1201 08:32:05.202363 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pm8rg"] Dec 01 08:32:05 crc kubenswrapper[5004]: I1201 08:32:05.285522 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3dc0820c-50da-40f7-8736-99133a66fb72-catalog-content\") pod \"redhat-marketplace-pm8rg\" (UID: \"3dc0820c-50da-40f7-8736-99133a66fb72\") " pod="openshift-marketplace/redhat-marketplace-pm8rg" Dec 01 08:32:05 crc kubenswrapper[5004]: I1201 08:32:05.285683 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3dc0820c-50da-40f7-8736-99133a66fb72-utilities\") pod \"redhat-marketplace-pm8rg\" (UID: \"3dc0820c-50da-40f7-8736-99133a66fb72\") " 
pod="openshift-marketplace/redhat-marketplace-pm8rg" Dec 01 08:32:05 crc kubenswrapper[5004]: I1201 08:32:05.285876 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dz6js\" (UniqueName: \"kubernetes.io/projected/3dc0820c-50da-40f7-8736-99133a66fb72-kube-api-access-dz6js\") pod \"redhat-marketplace-pm8rg\" (UID: \"3dc0820c-50da-40f7-8736-99133a66fb72\") " pod="openshift-marketplace/redhat-marketplace-pm8rg" Dec 01 08:32:05 crc kubenswrapper[5004]: I1201 08:32:05.388166 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dz6js\" (UniqueName: \"kubernetes.io/projected/3dc0820c-50da-40f7-8736-99133a66fb72-kube-api-access-dz6js\") pod \"redhat-marketplace-pm8rg\" (UID: \"3dc0820c-50da-40f7-8736-99133a66fb72\") " pod="openshift-marketplace/redhat-marketplace-pm8rg" Dec 01 08:32:05 crc kubenswrapper[5004]: I1201 08:32:05.388458 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3dc0820c-50da-40f7-8736-99133a66fb72-catalog-content\") pod \"redhat-marketplace-pm8rg\" (UID: \"3dc0820c-50da-40f7-8736-99133a66fb72\") " pod="openshift-marketplace/redhat-marketplace-pm8rg" Dec 01 08:32:05 crc kubenswrapper[5004]: I1201 08:32:05.388511 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3dc0820c-50da-40f7-8736-99133a66fb72-utilities\") pod \"redhat-marketplace-pm8rg\" (UID: \"3dc0820c-50da-40f7-8736-99133a66fb72\") " pod="openshift-marketplace/redhat-marketplace-pm8rg" Dec 01 08:32:05 crc kubenswrapper[5004]: I1201 08:32:05.389121 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3dc0820c-50da-40f7-8736-99133a66fb72-catalog-content\") pod \"redhat-marketplace-pm8rg\" (UID: \"3dc0820c-50da-40f7-8736-99133a66fb72\") " 
pod="openshift-marketplace/redhat-marketplace-pm8rg" Dec 01 08:32:05 crc kubenswrapper[5004]: I1201 08:32:05.389312 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3dc0820c-50da-40f7-8736-99133a66fb72-utilities\") pod \"redhat-marketplace-pm8rg\" (UID: \"3dc0820c-50da-40f7-8736-99133a66fb72\") " pod="openshift-marketplace/redhat-marketplace-pm8rg" Dec 01 08:32:05 crc kubenswrapper[5004]: I1201 08:32:05.411837 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dz6js\" (UniqueName: \"kubernetes.io/projected/3dc0820c-50da-40f7-8736-99133a66fb72-kube-api-access-dz6js\") pod \"redhat-marketplace-pm8rg\" (UID: \"3dc0820c-50da-40f7-8736-99133a66fb72\") " pod="openshift-marketplace/redhat-marketplace-pm8rg" Dec 01 08:32:05 crc kubenswrapper[5004]: I1201 08:32:05.499382 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pm8rg" Dec 01 08:32:05 crc kubenswrapper[5004]: I1201 08:32:05.755625 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pm8rg"] Dec 01 08:32:06 crc kubenswrapper[5004]: I1201 08:32:06.777979 5004 generic.go:334] "Generic (PLEG): container finished" podID="3dc0820c-50da-40f7-8736-99133a66fb72" containerID="28bb8d554902c78540707e40b71a7652af1d1428c644271d53c753500805d64a" exitCode=0 Dec 01 08:32:06 crc kubenswrapper[5004]: I1201 08:32:06.778081 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pm8rg" event={"ID":"3dc0820c-50da-40f7-8736-99133a66fb72","Type":"ContainerDied","Data":"28bb8d554902c78540707e40b71a7652af1d1428c644271d53c753500805d64a"} Dec 01 08:32:06 crc kubenswrapper[5004]: I1201 08:32:06.778482 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pm8rg" 
event={"ID":"3dc0820c-50da-40f7-8736-99133a66fb72","Type":"ContainerStarted","Data":"39284e9d4509363b47e05bf57e5067251e6e826e811c0f9d986af3e2b6a3b821"} Dec 01 08:32:07 crc kubenswrapper[5004]: I1201 08:32:07.796133 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pm8rg" event={"ID":"3dc0820c-50da-40f7-8736-99133a66fb72","Type":"ContainerStarted","Data":"e86a945b3aa45edbebc4d0cc360c50dfbeceae847fe85b152c9bf80432062ea3"} Dec 01 08:32:08 crc kubenswrapper[5004]: I1201 08:32:08.809644 5004 generic.go:334] "Generic (PLEG): container finished" podID="3dc0820c-50da-40f7-8736-99133a66fb72" containerID="e86a945b3aa45edbebc4d0cc360c50dfbeceae847fe85b152c9bf80432062ea3" exitCode=0 Dec 01 08:32:08 crc kubenswrapper[5004]: I1201 08:32:08.809881 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pm8rg" event={"ID":"3dc0820c-50da-40f7-8736-99133a66fb72","Type":"ContainerDied","Data":"e86a945b3aa45edbebc4d0cc360c50dfbeceae847fe85b152c9bf80432062ea3"} Dec 01 08:32:10 crc kubenswrapper[5004]: I1201 08:32:10.551198 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-k2gm5"] Dec 01 08:32:10 crc kubenswrapper[5004]: I1201 08:32:10.553234 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-k2gm5" Dec 01 08:32:10 crc kubenswrapper[5004]: I1201 08:32:10.613398 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-k2gm5"] Dec 01 08:32:10 crc kubenswrapper[5004]: I1201 08:32:10.685504 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qnzf\" (UniqueName: \"kubernetes.io/projected/d26ab356-1671-4d51-836b-d9bdc7204ea8-kube-api-access-2qnzf\") pod \"community-operators-k2gm5\" (UID: \"d26ab356-1671-4d51-836b-d9bdc7204ea8\") " pod="openshift-marketplace/community-operators-k2gm5" Dec 01 08:32:10 crc kubenswrapper[5004]: I1201 08:32:10.685770 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d26ab356-1671-4d51-836b-d9bdc7204ea8-utilities\") pod \"community-operators-k2gm5\" (UID: \"d26ab356-1671-4d51-836b-d9bdc7204ea8\") " pod="openshift-marketplace/community-operators-k2gm5" Dec 01 08:32:10 crc kubenswrapper[5004]: I1201 08:32:10.685880 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d26ab356-1671-4d51-836b-d9bdc7204ea8-catalog-content\") pod \"community-operators-k2gm5\" (UID: \"d26ab356-1671-4d51-836b-d9bdc7204ea8\") " pod="openshift-marketplace/community-operators-k2gm5" Dec 01 08:32:10 crc kubenswrapper[5004]: I1201 08:32:10.787422 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qnzf\" (UniqueName: \"kubernetes.io/projected/d26ab356-1671-4d51-836b-d9bdc7204ea8-kube-api-access-2qnzf\") pod \"community-operators-k2gm5\" (UID: \"d26ab356-1671-4d51-836b-d9bdc7204ea8\") " pod="openshift-marketplace/community-operators-k2gm5" Dec 01 08:32:10 crc kubenswrapper[5004]: I1201 08:32:10.787507 5004 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d26ab356-1671-4d51-836b-d9bdc7204ea8-utilities\") pod \"community-operators-k2gm5\" (UID: \"d26ab356-1671-4d51-836b-d9bdc7204ea8\") " pod="openshift-marketplace/community-operators-k2gm5" Dec 01 08:32:10 crc kubenswrapper[5004]: I1201 08:32:10.787606 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d26ab356-1671-4d51-836b-d9bdc7204ea8-catalog-content\") pod \"community-operators-k2gm5\" (UID: \"d26ab356-1671-4d51-836b-d9bdc7204ea8\") " pod="openshift-marketplace/community-operators-k2gm5" Dec 01 08:32:10 crc kubenswrapper[5004]: I1201 08:32:10.788228 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d26ab356-1671-4d51-836b-d9bdc7204ea8-utilities\") pod \"community-operators-k2gm5\" (UID: \"d26ab356-1671-4d51-836b-d9bdc7204ea8\") " pod="openshift-marketplace/community-operators-k2gm5" Dec 01 08:32:10 crc kubenswrapper[5004]: I1201 08:32:10.788491 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d26ab356-1671-4d51-836b-d9bdc7204ea8-catalog-content\") pod \"community-operators-k2gm5\" (UID: \"d26ab356-1671-4d51-836b-d9bdc7204ea8\") " pod="openshift-marketplace/community-operators-k2gm5" Dec 01 08:32:10 crc kubenswrapper[5004]: I1201 08:32:10.810850 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qnzf\" (UniqueName: \"kubernetes.io/projected/d26ab356-1671-4d51-836b-d9bdc7204ea8-kube-api-access-2qnzf\") pod \"community-operators-k2gm5\" (UID: \"d26ab356-1671-4d51-836b-d9bdc7204ea8\") " pod="openshift-marketplace/community-operators-k2gm5" Dec 01 08:32:10 crc kubenswrapper[5004]: I1201 08:32:10.830293 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-pm8rg" event={"ID":"3dc0820c-50da-40f7-8736-99133a66fb72","Type":"ContainerStarted","Data":"79d408c8a447c1fec3accd353f3d996c1b98ceaa4928e4228fe280ed2c564db9"} Dec 01 08:32:10 crc kubenswrapper[5004]: I1201 08:32:10.854587 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-pm8rg" podStartSLOduration=2.732190191 podStartE2EDuration="5.854542878s" podCreationTimestamp="2025-12-01 08:32:05 +0000 UTC" firstStartedPulling="2025-12-01 08:32:06.781741044 +0000 UTC m=+904.346733056" lastFinishedPulling="2025-12-01 08:32:09.904093731 +0000 UTC m=+907.469085743" observedRunningTime="2025-12-01 08:32:10.85141084 +0000 UTC m=+908.416402862" watchObservedRunningTime="2025-12-01 08:32:10.854542878 +0000 UTC m=+908.419534860" Dec 01 08:32:10 crc kubenswrapper[5004]: I1201 08:32:10.870968 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-k2gm5" Dec 01 08:32:11 crc kubenswrapper[5004]: I1201 08:32:11.386743 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-k2gm5"] Dec 01 08:32:11 crc kubenswrapper[5004]: W1201 08:32:11.392746 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd26ab356_1671_4d51_836b_d9bdc7204ea8.slice/crio-987918ba502ae5607d548bf659b886bc7519cd2545c6d86ae95398f47e609840 WatchSource:0}: Error finding container 987918ba502ae5607d548bf659b886bc7519cd2545c6d86ae95398f47e609840: Status 404 returned error can't find the container with id 987918ba502ae5607d548bf659b886bc7519cd2545c6d86ae95398f47e609840 Dec 01 08:32:11 crc kubenswrapper[5004]: I1201 08:32:11.393128 5004 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" 
start-of-body=Ingester not ready: waiting for 15s after being ready Dec 01 08:32:11 crc kubenswrapper[5004]: I1201 08:32:11.393183 5004 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="346e7dcd-bc03-4c7f-b0b0-8e5206230152" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Dec 01 08:32:11 crc kubenswrapper[5004]: I1201 08:32:11.838873 5004 generic.go:334] "Generic (PLEG): container finished" podID="d26ab356-1671-4d51-836b-d9bdc7204ea8" containerID="88dd8a9542f34886e876260459ab696e320b2932f2e62337ea312b9e01c6ec6f" exitCode=0 Dec 01 08:32:11 crc kubenswrapper[5004]: I1201 08:32:11.838966 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k2gm5" event={"ID":"d26ab356-1671-4d51-836b-d9bdc7204ea8","Type":"ContainerDied","Data":"88dd8a9542f34886e876260459ab696e320b2932f2e62337ea312b9e01c6ec6f"} Dec 01 08:32:11 crc kubenswrapper[5004]: I1201 08:32:11.839054 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k2gm5" event={"ID":"d26ab356-1671-4d51-836b-d9bdc7204ea8","Type":"ContainerStarted","Data":"987918ba502ae5607d548bf659b886bc7519cd2545c6d86ae95398f47e609840"} Dec 01 08:32:13 crc kubenswrapper[5004]: I1201 08:32:13.868982 5004 generic.go:334] "Generic (PLEG): container finished" podID="d26ab356-1671-4d51-836b-d9bdc7204ea8" containerID="187295239eeb2327d04e0cad5fe0203ff5007b0d34122f94f8c0167f42ed8141" exitCode=0 Dec 01 08:32:13 crc kubenswrapper[5004]: I1201 08:32:13.869332 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k2gm5" event={"ID":"d26ab356-1671-4d51-836b-d9bdc7204ea8","Type":"ContainerDied","Data":"187295239eeb2327d04e0cad5fe0203ff5007b0d34122f94f8c0167f42ed8141"} Dec 01 08:32:14 crc kubenswrapper[5004]: I1201 08:32:14.882485 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-k2gm5" event={"ID":"d26ab356-1671-4d51-836b-d9bdc7204ea8","Type":"ContainerStarted","Data":"58ef4ee9aa1719ddde9f646166e7a3e5a34e25df22e1b11c3965dd4b9c7edf9e"} Dec 01 08:32:14 crc kubenswrapper[5004]: I1201 08:32:14.918327 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-k2gm5" podStartSLOduration=2.24062002 podStartE2EDuration="4.918297137s" podCreationTimestamp="2025-12-01 08:32:10 +0000 UTC" firstStartedPulling="2025-12-01 08:32:11.840846023 +0000 UTC m=+909.405838005" lastFinishedPulling="2025-12-01 08:32:14.5185231 +0000 UTC m=+912.083515122" observedRunningTime="2025-12-01 08:32:14.911341515 +0000 UTC m=+912.476333537" watchObservedRunningTime="2025-12-01 08:32:14.918297137 +0000 UTC m=+912.483289179" Dec 01 08:32:15 crc kubenswrapper[5004]: I1201 08:32:15.499474 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-pm8rg" Dec 01 08:32:15 crc kubenswrapper[5004]: I1201 08:32:15.499512 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-pm8rg" Dec 01 08:32:15 crc kubenswrapper[5004]: I1201 08:32:15.543759 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-pm8rg" Dec 01 08:32:15 crc kubenswrapper[5004]: I1201 08:32:15.935324 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-pm8rg" Dec 01 08:32:17 crc kubenswrapper[5004]: I1201 08:32:17.738883 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pm8rg"] Dec 01 08:32:17 crc kubenswrapper[5004]: I1201 08:32:17.911024 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-pm8rg" podUID="3dc0820c-50da-40f7-8736-99133a66fb72" 
containerName="registry-server" containerID="cri-o://79d408c8a447c1fec3accd353f3d996c1b98ceaa4928e4228fe280ed2c564db9" gracePeriod=2 Dec 01 08:32:18 crc kubenswrapper[5004]: I1201 08:32:18.380273 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pm8rg" Dec 01 08:32:18 crc kubenswrapper[5004]: I1201 08:32:18.549169 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3dc0820c-50da-40f7-8736-99133a66fb72-utilities\") pod \"3dc0820c-50da-40f7-8736-99133a66fb72\" (UID: \"3dc0820c-50da-40f7-8736-99133a66fb72\") " Dec 01 08:32:18 crc kubenswrapper[5004]: I1201 08:32:18.549242 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dz6js\" (UniqueName: \"kubernetes.io/projected/3dc0820c-50da-40f7-8736-99133a66fb72-kube-api-access-dz6js\") pod \"3dc0820c-50da-40f7-8736-99133a66fb72\" (UID: \"3dc0820c-50da-40f7-8736-99133a66fb72\") " Dec 01 08:32:18 crc kubenswrapper[5004]: I1201 08:32:18.549316 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3dc0820c-50da-40f7-8736-99133a66fb72-catalog-content\") pod \"3dc0820c-50da-40f7-8736-99133a66fb72\" (UID: \"3dc0820c-50da-40f7-8736-99133a66fb72\") " Dec 01 08:32:18 crc kubenswrapper[5004]: I1201 08:32:18.551051 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3dc0820c-50da-40f7-8736-99133a66fb72-utilities" (OuterVolumeSpecName: "utilities") pod "3dc0820c-50da-40f7-8736-99133a66fb72" (UID: "3dc0820c-50da-40f7-8736-99133a66fb72"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 08:32:18 crc kubenswrapper[5004]: I1201 08:32:18.558754 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3dc0820c-50da-40f7-8736-99133a66fb72-kube-api-access-dz6js" (OuterVolumeSpecName: "kube-api-access-dz6js") pod "3dc0820c-50da-40f7-8736-99133a66fb72" (UID: "3dc0820c-50da-40f7-8736-99133a66fb72"). InnerVolumeSpecName "kube-api-access-dz6js". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:32:18 crc kubenswrapper[5004]: I1201 08:32:18.570657 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3dc0820c-50da-40f7-8736-99133a66fb72-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3dc0820c-50da-40f7-8736-99133a66fb72" (UID: "3dc0820c-50da-40f7-8736-99133a66fb72"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 08:32:18 crc kubenswrapper[5004]: I1201 08:32:18.651349 5004 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3dc0820c-50da-40f7-8736-99133a66fb72-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 08:32:18 crc kubenswrapper[5004]: I1201 08:32:18.651405 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dz6js\" (UniqueName: \"kubernetes.io/projected/3dc0820c-50da-40f7-8736-99133a66fb72-kube-api-access-dz6js\") on node \"crc\" DevicePath \"\"" Dec 01 08:32:18 crc kubenswrapper[5004]: I1201 08:32:18.651428 5004 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3dc0820c-50da-40f7-8736-99133a66fb72-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 08:32:18 crc kubenswrapper[5004]: I1201 08:32:18.925747 5004 generic.go:334] "Generic (PLEG): container finished" podID="3dc0820c-50da-40f7-8736-99133a66fb72" 
containerID="79d408c8a447c1fec3accd353f3d996c1b98ceaa4928e4228fe280ed2c564db9" exitCode=0 Dec 01 08:32:18 crc kubenswrapper[5004]: I1201 08:32:18.925805 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pm8rg" event={"ID":"3dc0820c-50da-40f7-8736-99133a66fb72","Type":"ContainerDied","Data":"79d408c8a447c1fec3accd353f3d996c1b98ceaa4928e4228fe280ed2c564db9"} Dec 01 08:32:18 crc kubenswrapper[5004]: I1201 08:32:18.925854 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pm8rg" Dec 01 08:32:18 crc kubenswrapper[5004]: I1201 08:32:18.925875 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pm8rg" event={"ID":"3dc0820c-50da-40f7-8736-99133a66fb72","Type":"ContainerDied","Data":"39284e9d4509363b47e05bf57e5067251e6e826e811c0f9d986af3e2b6a3b821"} Dec 01 08:32:18 crc kubenswrapper[5004]: I1201 08:32:18.925944 5004 scope.go:117] "RemoveContainer" containerID="79d408c8a447c1fec3accd353f3d996c1b98ceaa4928e4228fe280ed2c564db9" Dec 01 08:32:18 crc kubenswrapper[5004]: I1201 08:32:18.959809 5004 scope.go:117] "RemoveContainer" containerID="e86a945b3aa45edbebc4d0cc360c50dfbeceae847fe85b152c9bf80432062ea3" Dec 01 08:32:18 crc kubenswrapper[5004]: I1201 08:32:18.966878 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pm8rg"] Dec 01 08:32:18 crc kubenswrapper[5004]: I1201 08:32:18.987868 5004 scope.go:117] "RemoveContainer" containerID="28bb8d554902c78540707e40b71a7652af1d1428c644271d53c753500805d64a" Dec 01 08:32:18 crc kubenswrapper[5004]: I1201 08:32:18.988493 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-pm8rg"] Dec 01 08:32:19 crc kubenswrapper[5004]: I1201 08:32:19.027597 5004 scope.go:117] "RemoveContainer" containerID="79d408c8a447c1fec3accd353f3d996c1b98ceaa4928e4228fe280ed2c564db9" Dec 01 
08:32:19 crc kubenswrapper[5004]: E1201 08:32:19.033249 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79d408c8a447c1fec3accd353f3d996c1b98ceaa4928e4228fe280ed2c564db9\": container with ID starting with 79d408c8a447c1fec3accd353f3d996c1b98ceaa4928e4228fe280ed2c564db9 not found: ID does not exist" containerID="79d408c8a447c1fec3accd353f3d996c1b98ceaa4928e4228fe280ed2c564db9" Dec 01 08:32:19 crc kubenswrapper[5004]: I1201 08:32:19.033283 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79d408c8a447c1fec3accd353f3d996c1b98ceaa4928e4228fe280ed2c564db9"} err="failed to get container status \"79d408c8a447c1fec3accd353f3d996c1b98ceaa4928e4228fe280ed2c564db9\": rpc error: code = NotFound desc = could not find container \"79d408c8a447c1fec3accd353f3d996c1b98ceaa4928e4228fe280ed2c564db9\": container with ID starting with 79d408c8a447c1fec3accd353f3d996c1b98ceaa4928e4228fe280ed2c564db9 not found: ID does not exist" Dec 01 08:32:19 crc kubenswrapper[5004]: I1201 08:32:19.033381 5004 scope.go:117] "RemoveContainer" containerID="e86a945b3aa45edbebc4d0cc360c50dfbeceae847fe85b152c9bf80432062ea3" Dec 01 08:32:19 crc kubenswrapper[5004]: E1201 08:32:19.033936 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e86a945b3aa45edbebc4d0cc360c50dfbeceae847fe85b152c9bf80432062ea3\": container with ID starting with e86a945b3aa45edbebc4d0cc360c50dfbeceae847fe85b152c9bf80432062ea3 not found: ID does not exist" containerID="e86a945b3aa45edbebc4d0cc360c50dfbeceae847fe85b152c9bf80432062ea3" Dec 01 08:32:19 crc kubenswrapper[5004]: I1201 08:32:19.034021 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e86a945b3aa45edbebc4d0cc360c50dfbeceae847fe85b152c9bf80432062ea3"} err="failed to get container status 
\"e86a945b3aa45edbebc4d0cc360c50dfbeceae847fe85b152c9bf80432062ea3\": rpc error: code = NotFound desc = could not find container \"e86a945b3aa45edbebc4d0cc360c50dfbeceae847fe85b152c9bf80432062ea3\": container with ID starting with e86a945b3aa45edbebc4d0cc360c50dfbeceae847fe85b152c9bf80432062ea3 not found: ID does not exist" Dec 01 08:32:19 crc kubenswrapper[5004]: I1201 08:32:19.034074 5004 scope.go:117] "RemoveContainer" containerID="28bb8d554902c78540707e40b71a7652af1d1428c644271d53c753500805d64a" Dec 01 08:32:19 crc kubenswrapper[5004]: E1201 08:32:19.034788 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28bb8d554902c78540707e40b71a7652af1d1428c644271d53c753500805d64a\": container with ID starting with 28bb8d554902c78540707e40b71a7652af1d1428c644271d53c753500805d64a not found: ID does not exist" containerID="28bb8d554902c78540707e40b71a7652af1d1428c644271d53c753500805d64a" Dec 01 08:32:19 crc kubenswrapper[5004]: I1201 08:32:19.034818 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28bb8d554902c78540707e40b71a7652af1d1428c644271d53c753500805d64a"} err="failed to get container status \"28bb8d554902c78540707e40b71a7652af1d1428c644271d53c753500805d64a\": rpc error: code = NotFound desc = could not find container \"28bb8d554902c78540707e40b71a7652af1d1428c644271d53c753500805d64a\": container with ID starting with 28bb8d554902c78540707e40b71a7652af1d1428c644271d53c753500805d64a not found: ID does not exist" Dec 01 08:32:20 crc kubenswrapper[5004]: I1201 08:32:20.771994 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3dc0820c-50da-40f7-8736-99133a66fb72" path="/var/lib/kubelet/pods/3dc0820c-50da-40f7-8736-99133a66fb72/volumes" Dec 01 08:32:20 crc kubenswrapper[5004]: I1201 08:32:20.872328 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/community-operators-k2gm5" Dec 01 08:32:20 crc kubenswrapper[5004]: I1201 08:32:20.873587 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-k2gm5" Dec 01 08:32:20 crc kubenswrapper[5004]: I1201 08:32:20.955394 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-k2gm5" Dec 01 08:32:21 crc kubenswrapper[5004]: I1201 08:32:21.388405 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-ingester-0" Dec 01 08:32:22 crc kubenswrapper[5004]: I1201 08:32:22.032628 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-k2gm5" Dec 01 08:32:22 crc kubenswrapper[5004]: I1201 08:32:22.930964 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-k2gm5"] Dec 01 08:32:23 crc kubenswrapper[5004]: I1201 08:32:23.980511 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-k2gm5" podUID="d26ab356-1671-4d51-836b-d9bdc7204ea8" containerName="registry-server" containerID="cri-o://58ef4ee9aa1719ddde9f646166e7a3e5a34e25df22e1b11c3965dd4b9c7edf9e" gracePeriod=2 Dec 01 08:32:24 crc kubenswrapper[5004]: I1201 08:32:24.906340 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-k2gm5" Dec 01 08:32:24 crc kubenswrapper[5004]: I1201 08:32:24.987591 5004 generic.go:334] "Generic (PLEG): container finished" podID="d26ab356-1671-4d51-836b-d9bdc7204ea8" containerID="58ef4ee9aa1719ddde9f646166e7a3e5a34e25df22e1b11c3965dd4b9c7edf9e" exitCode=0 Dec 01 08:32:24 crc kubenswrapper[5004]: I1201 08:32:24.987651 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-k2gm5" Dec 01 08:32:24 crc kubenswrapper[5004]: I1201 08:32:24.987643 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k2gm5" event={"ID":"d26ab356-1671-4d51-836b-d9bdc7204ea8","Type":"ContainerDied","Data":"58ef4ee9aa1719ddde9f646166e7a3e5a34e25df22e1b11c3965dd4b9c7edf9e"} Dec 01 08:32:24 crc kubenswrapper[5004]: I1201 08:32:24.987704 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k2gm5" event={"ID":"d26ab356-1671-4d51-836b-d9bdc7204ea8","Type":"ContainerDied","Data":"987918ba502ae5607d548bf659b886bc7519cd2545c6d86ae95398f47e609840"} Dec 01 08:32:24 crc kubenswrapper[5004]: I1201 08:32:24.987731 5004 scope.go:117] "RemoveContainer" containerID="58ef4ee9aa1719ddde9f646166e7a3e5a34e25df22e1b11c3965dd4b9c7edf9e" Dec 01 08:32:25 crc kubenswrapper[5004]: I1201 08:32:25.013733 5004 scope.go:117] "RemoveContainer" containerID="187295239eeb2327d04e0cad5fe0203ff5007b0d34122f94f8c0167f42ed8141" Dec 01 08:32:25 crc kubenswrapper[5004]: I1201 08:32:25.036318 5004 scope.go:117] "RemoveContainer" containerID="88dd8a9542f34886e876260459ab696e320b2932f2e62337ea312b9e01c6ec6f" Dec 01 08:32:25 crc kubenswrapper[5004]: I1201 08:32:25.056606 5004 scope.go:117] "RemoveContainer" containerID="58ef4ee9aa1719ddde9f646166e7a3e5a34e25df22e1b11c3965dd4b9c7edf9e" Dec 01 08:32:25 crc kubenswrapper[5004]: E1201 08:32:25.056981 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58ef4ee9aa1719ddde9f646166e7a3e5a34e25df22e1b11c3965dd4b9c7edf9e\": container with ID starting with 58ef4ee9aa1719ddde9f646166e7a3e5a34e25df22e1b11c3965dd4b9c7edf9e not found: ID does not exist" containerID="58ef4ee9aa1719ddde9f646166e7a3e5a34e25df22e1b11c3965dd4b9c7edf9e" Dec 01 08:32:25 crc kubenswrapper[5004]: I1201 08:32:25.057014 5004 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58ef4ee9aa1719ddde9f646166e7a3e5a34e25df22e1b11c3965dd4b9c7edf9e"} err="failed to get container status \"58ef4ee9aa1719ddde9f646166e7a3e5a34e25df22e1b11c3965dd4b9c7edf9e\": rpc error: code = NotFound desc = could not find container \"58ef4ee9aa1719ddde9f646166e7a3e5a34e25df22e1b11c3965dd4b9c7edf9e\": container with ID starting with 58ef4ee9aa1719ddde9f646166e7a3e5a34e25df22e1b11c3965dd4b9c7edf9e not found: ID does not exist" Dec 01 08:32:25 crc kubenswrapper[5004]: I1201 08:32:25.057036 5004 scope.go:117] "RemoveContainer" containerID="187295239eeb2327d04e0cad5fe0203ff5007b0d34122f94f8c0167f42ed8141" Dec 01 08:32:25 crc kubenswrapper[5004]: E1201 08:32:25.057621 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"187295239eeb2327d04e0cad5fe0203ff5007b0d34122f94f8c0167f42ed8141\": container with ID starting with 187295239eeb2327d04e0cad5fe0203ff5007b0d34122f94f8c0167f42ed8141 not found: ID does not exist" containerID="187295239eeb2327d04e0cad5fe0203ff5007b0d34122f94f8c0167f42ed8141" Dec 01 08:32:25 crc kubenswrapper[5004]: I1201 08:32:25.057686 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"187295239eeb2327d04e0cad5fe0203ff5007b0d34122f94f8c0167f42ed8141"} err="failed to get container status \"187295239eeb2327d04e0cad5fe0203ff5007b0d34122f94f8c0167f42ed8141\": rpc error: code = NotFound desc = could not find container \"187295239eeb2327d04e0cad5fe0203ff5007b0d34122f94f8c0167f42ed8141\": container with ID starting with 187295239eeb2327d04e0cad5fe0203ff5007b0d34122f94f8c0167f42ed8141 not found: ID does not exist" Dec 01 08:32:25 crc kubenswrapper[5004]: I1201 08:32:25.057726 5004 scope.go:117] "RemoveContainer" containerID="88dd8a9542f34886e876260459ab696e320b2932f2e62337ea312b9e01c6ec6f" Dec 01 08:32:25 crc kubenswrapper[5004]: E1201 08:32:25.058078 5004 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88dd8a9542f34886e876260459ab696e320b2932f2e62337ea312b9e01c6ec6f\": container with ID starting with 88dd8a9542f34886e876260459ab696e320b2932f2e62337ea312b9e01c6ec6f not found: ID does not exist" containerID="88dd8a9542f34886e876260459ab696e320b2932f2e62337ea312b9e01c6ec6f" Dec 01 08:32:25 crc kubenswrapper[5004]: I1201 08:32:25.058101 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88dd8a9542f34886e876260459ab696e320b2932f2e62337ea312b9e01c6ec6f"} err="failed to get container status \"88dd8a9542f34886e876260459ab696e320b2932f2e62337ea312b9e01c6ec6f\": rpc error: code = NotFound desc = could not find container \"88dd8a9542f34886e876260459ab696e320b2932f2e62337ea312b9e01c6ec6f\": container with ID starting with 88dd8a9542f34886e876260459ab696e320b2932f2e62337ea312b9e01c6ec6f not found: ID does not exist" Dec 01 08:32:25 crc kubenswrapper[5004]: I1201 08:32:25.078486 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d26ab356-1671-4d51-836b-d9bdc7204ea8-utilities\") pod \"d26ab356-1671-4d51-836b-d9bdc7204ea8\" (UID: \"d26ab356-1671-4d51-836b-d9bdc7204ea8\") " Dec 01 08:32:25 crc kubenswrapper[5004]: I1201 08:32:25.078630 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d26ab356-1671-4d51-836b-d9bdc7204ea8-catalog-content\") pod \"d26ab356-1671-4d51-836b-d9bdc7204ea8\" (UID: \"d26ab356-1671-4d51-836b-d9bdc7204ea8\") " Dec 01 08:32:25 crc kubenswrapper[5004]: I1201 08:32:25.078773 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2qnzf\" (UniqueName: \"kubernetes.io/projected/d26ab356-1671-4d51-836b-d9bdc7204ea8-kube-api-access-2qnzf\") pod \"d26ab356-1671-4d51-836b-d9bdc7204ea8\" 
(UID: \"d26ab356-1671-4d51-836b-d9bdc7204ea8\") " Dec 01 08:32:25 crc kubenswrapper[5004]: I1201 08:32:25.080404 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d26ab356-1671-4d51-836b-d9bdc7204ea8-utilities" (OuterVolumeSpecName: "utilities") pod "d26ab356-1671-4d51-836b-d9bdc7204ea8" (UID: "d26ab356-1671-4d51-836b-d9bdc7204ea8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 08:32:25 crc kubenswrapper[5004]: I1201 08:32:25.086275 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d26ab356-1671-4d51-836b-d9bdc7204ea8-kube-api-access-2qnzf" (OuterVolumeSpecName: "kube-api-access-2qnzf") pod "d26ab356-1671-4d51-836b-d9bdc7204ea8" (UID: "d26ab356-1671-4d51-836b-d9bdc7204ea8"). InnerVolumeSpecName "kube-api-access-2qnzf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:32:25 crc kubenswrapper[5004]: I1201 08:32:25.130913 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d26ab356-1671-4d51-836b-d9bdc7204ea8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d26ab356-1671-4d51-836b-d9bdc7204ea8" (UID: "d26ab356-1671-4d51-836b-d9bdc7204ea8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 08:32:25 crc kubenswrapper[5004]: I1201 08:32:25.180785 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2qnzf\" (UniqueName: \"kubernetes.io/projected/d26ab356-1671-4d51-836b-d9bdc7204ea8-kube-api-access-2qnzf\") on node \"crc\" DevicePath \"\"" Dec 01 08:32:25 crc kubenswrapper[5004]: I1201 08:32:25.180823 5004 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d26ab356-1671-4d51-836b-d9bdc7204ea8-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 08:32:25 crc kubenswrapper[5004]: I1201 08:32:25.180832 5004 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d26ab356-1671-4d51-836b-d9bdc7204ea8-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 08:32:25 crc kubenswrapper[5004]: I1201 08:32:25.319817 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-k2gm5"] Dec 01 08:32:25 crc kubenswrapper[5004]: I1201 08:32:25.324096 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-k2gm5"] Dec 01 08:32:26 crc kubenswrapper[5004]: I1201 08:32:26.770857 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d26ab356-1671-4d51-836b-d9bdc7204ea8" path="/var/lib/kubelet/pods/d26ab356-1671-4d51-836b-d9bdc7204ea8/volumes" Dec 01 08:32:34 crc kubenswrapper[5004]: I1201 08:32:34.178214 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-xznmr"] Dec 01 08:32:34 crc kubenswrapper[5004]: E1201 08:32:34.179233 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d26ab356-1671-4d51-836b-d9bdc7204ea8" containerName="registry-server" Dec 01 08:32:34 crc kubenswrapper[5004]: I1201 08:32:34.179254 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="d26ab356-1671-4d51-836b-d9bdc7204ea8" 
containerName="registry-server" Dec 01 08:32:34 crc kubenswrapper[5004]: E1201 08:32:34.179276 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3dc0820c-50da-40f7-8736-99133a66fb72" containerName="registry-server" Dec 01 08:32:34 crc kubenswrapper[5004]: I1201 08:32:34.179286 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="3dc0820c-50da-40f7-8736-99133a66fb72" containerName="registry-server" Dec 01 08:32:34 crc kubenswrapper[5004]: E1201 08:32:34.179303 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3dc0820c-50da-40f7-8736-99133a66fb72" containerName="extract-utilities" Dec 01 08:32:34 crc kubenswrapper[5004]: I1201 08:32:34.179314 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="3dc0820c-50da-40f7-8736-99133a66fb72" containerName="extract-utilities" Dec 01 08:32:34 crc kubenswrapper[5004]: E1201 08:32:34.179325 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3dc0820c-50da-40f7-8736-99133a66fb72" containerName="extract-content" Dec 01 08:32:34 crc kubenswrapper[5004]: I1201 08:32:34.179334 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="3dc0820c-50da-40f7-8736-99133a66fb72" containerName="extract-content" Dec 01 08:32:34 crc kubenswrapper[5004]: E1201 08:32:34.179376 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d26ab356-1671-4d51-836b-d9bdc7204ea8" containerName="extract-content" Dec 01 08:32:34 crc kubenswrapper[5004]: I1201 08:32:34.179387 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="d26ab356-1671-4d51-836b-d9bdc7204ea8" containerName="extract-content" Dec 01 08:32:34 crc kubenswrapper[5004]: E1201 08:32:34.179405 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d26ab356-1671-4d51-836b-d9bdc7204ea8" containerName="extract-utilities" Dec 01 08:32:34 crc kubenswrapper[5004]: I1201 08:32:34.179415 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="d26ab356-1671-4d51-836b-d9bdc7204ea8" 
containerName="extract-utilities" Dec 01 08:32:34 crc kubenswrapper[5004]: I1201 08:32:34.179633 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="3dc0820c-50da-40f7-8736-99133a66fb72" containerName="registry-server" Dec 01 08:32:34 crc kubenswrapper[5004]: I1201 08:32:34.179661 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="d26ab356-1671-4d51-836b-d9bdc7204ea8" containerName="registry-server" Dec 01 08:32:34 crc kubenswrapper[5004]: I1201 08:32:34.181234 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xznmr" Dec 01 08:32:34 crc kubenswrapper[5004]: I1201 08:32:34.206470 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xznmr"] Dec 01 08:32:34 crc kubenswrapper[5004]: I1201 08:32:34.345179 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/293e45a0-4c15-431b-9c41-8dd181912218-catalog-content\") pod \"certified-operators-xznmr\" (UID: \"293e45a0-4c15-431b-9c41-8dd181912218\") " pod="openshift-marketplace/certified-operators-xznmr" Dec 01 08:32:34 crc kubenswrapper[5004]: I1201 08:32:34.345414 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwsvk\" (UniqueName: \"kubernetes.io/projected/293e45a0-4c15-431b-9c41-8dd181912218-kube-api-access-xwsvk\") pod \"certified-operators-xznmr\" (UID: \"293e45a0-4c15-431b-9c41-8dd181912218\") " pod="openshift-marketplace/certified-operators-xznmr" Dec 01 08:32:34 crc kubenswrapper[5004]: I1201 08:32:34.345775 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/293e45a0-4c15-431b-9c41-8dd181912218-utilities\") pod \"certified-operators-xznmr\" (UID: \"293e45a0-4c15-431b-9c41-8dd181912218\") " 
pod="openshift-marketplace/certified-operators-xznmr" Dec 01 08:32:34 crc kubenswrapper[5004]: I1201 08:32:34.447341 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/293e45a0-4c15-431b-9c41-8dd181912218-catalog-content\") pod \"certified-operators-xznmr\" (UID: \"293e45a0-4c15-431b-9c41-8dd181912218\") " pod="openshift-marketplace/certified-operators-xznmr" Dec 01 08:32:34 crc kubenswrapper[5004]: I1201 08:32:34.447450 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwsvk\" (UniqueName: \"kubernetes.io/projected/293e45a0-4c15-431b-9c41-8dd181912218-kube-api-access-xwsvk\") pod \"certified-operators-xznmr\" (UID: \"293e45a0-4c15-431b-9c41-8dd181912218\") " pod="openshift-marketplace/certified-operators-xznmr" Dec 01 08:32:34 crc kubenswrapper[5004]: I1201 08:32:34.447538 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/293e45a0-4c15-431b-9c41-8dd181912218-utilities\") pod \"certified-operators-xznmr\" (UID: \"293e45a0-4c15-431b-9c41-8dd181912218\") " pod="openshift-marketplace/certified-operators-xznmr" Dec 01 08:32:34 crc kubenswrapper[5004]: I1201 08:32:34.448169 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/293e45a0-4c15-431b-9c41-8dd181912218-utilities\") pod \"certified-operators-xznmr\" (UID: \"293e45a0-4c15-431b-9c41-8dd181912218\") " pod="openshift-marketplace/certified-operators-xznmr" Dec 01 08:32:34 crc kubenswrapper[5004]: I1201 08:32:34.448169 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/293e45a0-4c15-431b-9c41-8dd181912218-catalog-content\") pod \"certified-operators-xznmr\" (UID: \"293e45a0-4c15-431b-9c41-8dd181912218\") " 
pod="openshift-marketplace/certified-operators-xznmr" Dec 01 08:32:34 crc kubenswrapper[5004]: I1201 08:32:34.480068 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwsvk\" (UniqueName: \"kubernetes.io/projected/293e45a0-4c15-431b-9c41-8dd181912218-kube-api-access-xwsvk\") pod \"certified-operators-xznmr\" (UID: \"293e45a0-4c15-431b-9c41-8dd181912218\") " pod="openshift-marketplace/certified-operators-xznmr" Dec 01 08:32:34 crc kubenswrapper[5004]: I1201 08:32:34.517007 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xznmr" Dec 01 08:32:34 crc kubenswrapper[5004]: I1201 08:32:34.800180 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xznmr"] Dec 01 08:32:35 crc kubenswrapper[5004]: I1201 08:32:35.091100 5004 generic.go:334] "Generic (PLEG): container finished" podID="293e45a0-4c15-431b-9c41-8dd181912218" containerID="a8f074e275836b152d5faa0eb24e1e0706366446ae6de11ea07bfa1be457d35e" exitCode=0 Dec 01 08:32:35 crc kubenswrapper[5004]: I1201 08:32:35.091314 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xznmr" event={"ID":"293e45a0-4c15-431b-9c41-8dd181912218","Type":"ContainerDied","Data":"a8f074e275836b152d5faa0eb24e1e0706366446ae6de11ea07bfa1be457d35e"} Dec 01 08:32:35 crc kubenswrapper[5004]: I1201 08:32:35.091382 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xznmr" event={"ID":"293e45a0-4c15-431b-9c41-8dd181912218","Type":"ContainerStarted","Data":"96dd0f77c27f71d9bf90a967fcbcc3fbd0399315f40e5ce5f372f3dc0a0c0468"} Dec 01 08:32:36 crc kubenswrapper[5004]: I1201 08:32:36.103855 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xznmr" 
event={"ID":"293e45a0-4c15-431b-9c41-8dd181912218","Type":"ContainerStarted","Data":"1ef3bd684cea2a94d6f11d5c583cda7183cfab6256488c40de86c21556df36e5"} Dec 01 08:32:37 crc kubenswrapper[5004]: I1201 08:32:37.115472 5004 generic.go:334] "Generic (PLEG): container finished" podID="293e45a0-4c15-431b-9c41-8dd181912218" containerID="1ef3bd684cea2a94d6f11d5c583cda7183cfab6256488c40de86c21556df36e5" exitCode=0 Dec 01 08:32:37 crc kubenswrapper[5004]: I1201 08:32:37.115529 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xznmr" event={"ID":"293e45a0-4c15-431b-9c41-8dd181912218","Type":"ContainerDied","Data":"1ef3bd684cea2a94d6f11d5c583cda7183cfab6256488c40de86c21556df36e5"} Dec 01 08:32:38 crc kubenswrapper[5004]: I1201 08:32:38.125368 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xznmr" event={"ID":"293e45a0-4c15-431b-9c41-8dd181912218","Type":"ContainerStarted","Data":"f085a75809d0f757a22f941159f0782368452107978b9b7268f10cb18b7de05c"} Dec 01 08:32:38 crc kubenswrapper[5004]: I1201 08:32:38.154509 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-xznmr" podStartSLOduration=1.406818573 podStartE2EDuration="4.154484253s" podCreationTimestamp="2025-12-01 08:32:34 +0000 UTC" firstStartedPulling="2025-12-01 08:32:35.09240632 +0000 UTC m=+932.657398302" lastFinishedPulling="2025-12-01 08:32:37.84007197 +0000 UTC m=+935.405063982" observedRunningTime="2025-12-01 08:32:38.149666453 +0000 UTC m=+935.714658445" watchObservedRunningTime="2025-12-01 08:32:38.154484253 +0000 UTC m=+935.719476275" Dec 01 08:32:40 crc kubenswrapper[5004]: I1201 08:32:40.837189 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/collector-mwfcv"] Dec 01 08:32:40 crc kubenswrapper[5004]: I1201 08:32:40.839995 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/collector-mwfcv" Dec 01 08:32:40 crc kubenswrapper[5004]: I1201 08:32:40.846447 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-syslog-receiver" Dec 01 08:32:40 crc kubenswrapper[5004]: I1201 08:32:40.846878 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-dockercfg-s8k7l" Dec 01 08:32:40 crc kubenswrapper[5004]: I1201 08:32:40.847168 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-token" Dec 01 08:32:40 crc kubenswrapper[5004]: I1201 08:32:40.847322 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-metrics" Dec 01 08:32:40 crc kubenswrapper[5004]: I1201 08:32:40.848731 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"collector-config" Dec 01 08:32:40 crc kubenswrapper[5004]: I1201 08:32:40.858991 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"collector-trustbundle" Dec 01 08:32:40 crc kubenswrapper[5004]: I1201 08:32:40.872447 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/collector-mwfcv"] Dec 01 08:32:40 crc kubenswrapper[5004]: I1201 08:32:40.939307 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-logging/collector-mwfcv"] Dec 01 08:32:40 crc kubenswrapper[5004]: E1201 08:32:40.944030 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[collector-syslog-receiver collector-token config config-openshift-service-cacrt datadir entrypoint kube-api-access-fp78d metrics sa-token tmp trusted-ca], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-logging/collector-mwfcv" podUID="dc84512f-63bb-4738-a32f-ca0f53bd0e04" Dec 01 08:32:40 crc kubenswrapper[5004]: I1201 08:32:40.957759 5004 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/dc84512f-63bb-4738-a32f-ca0f53bd0e04-sa-token\") pod \"collector-mwfcv\" (UID: \"dc84512f-63bb-4738-a32f-ca0f53bd0e04\") " pod="openshift-logging/collector-mwfcv" Dec 01 08:32:40 crc kubenswrapper[5004]: I1201 08:32:40.957812 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fp78d\" (UniqueName: \"kubernetes.io/projected/dc84512f-63bb-4738-a32f-ca0f53bd0e04-kube-api-access-fp78d\") pod \"collector-mwfcv\" (UID: \"dc84512f-63bb-4738-a32f-ca0f53bd0e04\") " pod="openshift-logging/collector-mwfcv" Dec 01 08:32:40 crc kubenswrapper[5004]: I1201 08:32:40.957846 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/dc84512f-63bb-4738-a32f-ca0f53bd0e04-entrypoint\") pod \"collector-mwfcv\" (UID: \"dc84512f-63bb-4738-a32f-ca0f53bd0e04\") " pod="openshift-logging/collector-mwfcv" Dec 01 08:32:40 crc kubenswrapper[5004]: I1201 08:32:40.957872 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/dc84512f-63bb-4738-a32f-ca0f53bd0e04-datadir\") pod \"collector-mwfcv\" (UID: \"dc84512f-63bb-4738-a32f-ca0f53bd0e04\") " pod="openshift-logging/collector-mwfcv" Dec 01 08:32:40 crc kubenswrapper[5004]: I1201 08:32:40.957895 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/dc84512f-63bb-4738-a32f-ca0f53bd0e04-collector-token\") pod \"collector-mwfcv\" (UID: \"dc84512f-63bb-4738-a32f-ca0f53bd0e04\") " pod="openshift-logging/collector-mwfcv" Dec 01 08:32:40 crc kubenswrapper[5004]: I1201 08:32:40.957929 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/dc84512f-63bb-4738-a32f-ca0f53bd0e04-config\") pod \"collector-mwfcv\" (UID: \"dc84512f-63bb-4738-a32f-ca0f53bd0e04\") " pod="openshift-logging/collector-mwfcv" Dec 01 08:32:40 crc kubenswrapper[5004]: I1201 08:32:40.957953 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/dc84512f-63bb-4738-a32f-ca0f53bd0e04-metrics\") pod \"collector-mwfcv\" (UID: \"dc84512f-63bb-4738-a32f-ca0f53bd0e04\") " pod="openshift-logging/collector-mwfcv" Dec 01 08:32:40 crc kubenswrapper[5004]: I1201 08:32:40.957983 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/dc84512f-63bb-4738-a32f-ca0f53bd0e04-config-openshift-service-cacrt\") pod \"collector-mwfcv\" (UID: \"dc84512f-63bb-4738-a32f-ca0f53bd0e04\") " pod="openshift-logging/collector-mwfcv" Dec 01 08:32:40 crc kubenswrapper[5004]: I1201 08:32:40.958006 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/dc84512f-63bb-4738-a32f-ca0f53bd0e04-tmp\") pod \"collector-mwfcv\" (UID: \"dc84512f-63bb-4738-a32f-ca0f53bd0e04\") " pod="openshift-logging/collector-mwfcv" Dec 01 08:32:40 crc kubenswrapper[5004]: I1201 08:32:40.958056 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/dc84512f-63bb-4738-a32f-ca0f53bd0e04-collector-syslog-receiver\") pod \"collector-mwfcv\" (UID: \"dc84512f-63bb-4738-a32f-ca0f53bd0e04\") " pod="openshift-logging/collector-mwfcv" Dec 01 08:32:40 crc kubenswrapper[5004]: I1201 08:32:40.958091 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/dc84512f-63bb-4738-a32f-ca0f53bd0e04-trusted-ca\") pod \"collector-mwfcv\" (UID: \"dc84512f-63bb-4738-a32f-ca0f53bd0e04\") " pod="openshift-logging/collector-mwfcv" Dec 01 08:32:41 crc kubenswrapper[5004]: I1201 08:32:41.059364 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/dc84512f-63bb-4738-a32f-ca0f53bd0e04-collector-syslog-receiver\") pod \"collector-mwfcv\" (UID: \"dc84512f-63bb-4738-a32f-ca0f53bd0e04\") " pod="openshift-logging/collector-mwfcv" Dec 01 08:32:41 crc kubenswrapper[5004]: I1201 08:32:41.059415 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dc84512f-63bb-4738-a32f-ca0f53bd0e04-trusted-ca\") pod \"collector-mwfcv\" (UID: \"dc84512f-63bb-4738-a32f-ca0f53bd0e04\") " pod="openshift-logging/collector-mwfcv" Dec 01 08:32:41 crc kubenswrapper[5004]: I1201 08:32:41.059475 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/dc84512f-63bb-4738-a32f-ca0f53bd0e04-sa-token\") pod \"collector-mwfcv\" (UID: \"dc84512f-63bb-4738-a32f-ca0f53bd0e04\") " pod="openshift-logging/collector-mwfcv" Dec 01 08:32:41 crc kubenswrapper[5004]: I1201 08:32:41.059491 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fp78d\" (UniqueName: \"kubernetes.io/projected/dc84512f-63bb-4738-a32f-ca0f53bd0e04-kube-api-access-fp78d\") pod \"collector-mwfcv\" (UID: \"dc84512f-63bb-4738-a32f-ca0f53bd0e04\") " pod="openshift-logging/collector-mwfcv" Dec 01 08:32:41 crc kubenswrapper[5004]: I1201 08:32:41.059511 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/dc84512f-63bb-4738-a32f-ca0f53bd0e04-entrypoint\") pod \"collector-mwfcv\" (UID: 
\"dc84512f-63bb-4738-a32f-ca0f53bd0e04\") " pod="openshift-logging/collector-mwfcv" Dec 01 08:32:41 crc kubenswrapper[5004]: I1201 08:32:41.059530 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/dc84512f-63bb-4738-a32f-ca0f53bd0e04-datadir\") pod \"collector-mwfcv\" (UID: \"dc84512f-63bb-4738-a32f-ca0f53bd0e04\") " pod="openshift-logging/collector-mwfcv" Dec 01 08:32:41 crc kubenswrapper[5004]: E1201 08:32:41.059543 5004 secret.go:188] Couldn't get secret openshift-logging/collector-syslog-receiver: secret "collector-syslog-receiver" not found Dec 01 08:32:41 crc kubenswrapper[5004]: I1201 08:32:41.059548 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/dc84512f-63bb-4738-a32f-ca0f53bd0e04-collector-token\") pod \"collector-mwfcv\" (UID: \"dc84512f-63bb-4738-a32f-ca0f53bd0e04\") " pod="openshift-logging/collector-mwfcv" Dec 01 08:32:41 crc kubenswrapper[5004]: E1201 08:32:41.059626 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dc84512f-63bb-4738-a32f-ca0f53bd0e04-collector-syslog-receiver podName:dc84512f-63bb-4738-a32f-ca0f53bd0e04 nodeName:}" failed. No retries permitted until 2025-12-01 08:32:41.559608807 +0000 UTC m=+939.124600789 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "collector-syslog-receiver" (UniqueName: "kubernetes.io/secret/dc84512f-63bb-4738-a32f-ca0f53bd0e04-collector-syslog-receiver") pod "collector-mwfcv" (UID: "dc84512f-63bb-4738-a32f-ca0f53bd0e04") : secret "collector-syslog-receiver" not found Dec 01 08:32:41 crc kubenswrapper[5004]: I1201 08:32:41.059744 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc84512f-63bb-4738-a32f-ca0f53bd0e04-config\") pod \"collector-mwfcv\" (UID: \"dc84512f-63bb-4738-a32f-ca0f53bd0e04\") " pod="openshift-logging/collector-mwfcv" Dec 01 08:32:41 crc kubenswrapper[5004]: I1201 08:32:41.059745 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/dc84512f-63bb-4738-a32f-ca0f53bd0e04-datadir\") pod \"collector-mwfcv\" (UID: \"dc84512f-63bb-4738-a32f-ca0f53bd0e04\") " pod="openshift-logging/collector-mwfcv" Dec 01 08:32:41 crc kubenswrapper[5004]: I1201 08:32:41.059795 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/dc84512f-63bb-4738-a32f-ca0f53bd0e04-metrics\") pod \"collector-mwfcv\" (UID: \"dc84512f-63bb-4738-a32f-ca0f53bd0e04\") " pod="openshift-logging/collector-mwfcv" Dec 01 08:32:41 crc kubenswrapper[5004]: I1201 08:32:41.059870 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/dc84512f-63bb-4738-a32f-ca0f53bd0e04-config-openshift-service-cacrt\") pod \"collector-mwfcv\" (UID: \"dc84512f-63bb-4738-a32f-ca0f53bd0e04\") " pod="openshift-logging/collector-mwfcv" Dec 01 08:32:41 crc kubenswrapper[5004]: I1201 08:32:41.059918 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/dc84512f-63bb-4738-a32f-ca0f53bd0e04-tmp\") pod 
\"collector-mwfcv\" (UID: \"dc84512f-63bb-4738-a32f-ca0f53bd0e04\") " pod="openshift-logging/collector-mwfcv" Dec 01 08:32:41 crc kubenswrapper[5004]: I1201 08:32:41.060744 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/dc84512f-63bb-4738-a32f-ca0f53bd0e04-config-openshift-service-cacrt\") pod \"collector-mwfcv\" (UID: \"dc84512f-63bb-4738-a32f-ca0f53bd0e04\") " pod="openshift-logging/collector-mwfcv" Dec 01 08:32:41 crc kubenswrapper[5004]: I1201 08:32:41.060859 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc84512f-63bb-4738-a32f-ca0f53bd0e04-config\") pod \"collector-mwfcv\" (UID: \"dc84512f-63bb-4738-a32f-ca0f53bd0e04\") " pod="openshift-logging/collector-mwfcv" Dec 01 08:32:41 crc kubenswrapper[5004]: I1201 08:32:41.061079 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/dc84512f-63bb-4738-a32f-ca0f53bd0e04-entrypoint\") pod \"collector-mwfcv\" (UID: \"dc84512f-63bb-4738-a32f-ca0f53bd0e04\") " pod="openshift-logging/collector-mwfcv" Dec 01 08:32:41 crc kubenswrapper[5004]: I1201 08:32:41.061630 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dc84512f-63bb-4738-a32f-ca0f53bd0e04-trusted-ca\") pod \"collector-mwfcv\" (UID: \"dc84512f-63bb-4738-a32f-ca0f53bd0e04\") " pod="openshift-logging/collector-mwfcv" Dec 01 08:32:41 crc kubenswrapper[5004]: I1201 08:32:41.064314 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/dc84512f-63bb-4738-a32f-ca0f53bd0e04-tmp\") pod \"collector-mwfcv\" (UID: \"dc84512f-63bb-4738-a32f-ca0f53bd0e04\") " pod="openshift-logging/collector-mwfcv" Dec 01 08:32:41 crc kubenswrapper[5004]: I1201 08:32:41.065442 5004 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/dc84512f-63bb-4738-a32f-ca0f53bd0e04-metrics\") pod \"collector-mwfcv\" (UID: \"dc84512f-63bb-4738-a32f-ca0f53bd0e04\") " pod="openshift-logging/collector-mwfcv" Dec 01 08:32:41 crc kubenswrapper[5004]: I1201 08:32:41.077266 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/dc84512f-63bb-4738-a32f-ca0f53bd0e04-collector-token\") pod \"collector-mwfcv\" (UID: \"dc84512f-63bb-4738-a32f-ca0f53bd0e04\") " pod="openshift-logging/collector-mwfcv" Dec 01 08:32:41 crc kubenswrapper[5004]: I1201 08:32:41.078276 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/dc84512f-63bb-4738-a32f-ca0f53bd0e04-sa-token\") pod \"collector-mwfcv\" (UID: \"dc84512f-63bb-4738-a32f-ca0f53bd0e04\") " pod="openshift-logging/collector-mwfcv" Dec 01 08:32:41 crc kubenswrapper[5004]: I1201 08:32:41.078415 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fp78d\" (UniqueName: \"kubernetes.io/projected/dc84512f-63bb-4738-a32f-ca0f53bd0e04-kube-api-access-fp78d\") pod \"collector-mwfcv\" (UID: \"dc84512f-63bb-4738-a32f-ca0f53bd0e04\") " pod="openshift-logging/collector-mwfcv" Dec 01 08:32:41 crc kubenswrapper[5004]: I1201 08:32:41.151870 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/collector-mwfcv" Dec 01 08:32:41 crc kubenswrapper[5004]: I1201 08:32:41.160091 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/collector-mwfcv" Dec 01 08:32:41 crc kubenswrapper[5004]: I1201 08:32:41.262812 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/dc84512f-63bb-4738-a32f-ca0f53bd0e04-datadir\") pod \"dc84512f-63bb-4738-a32f-ca0f53bd0e04\" (UID: \"dc84512f-63bb-4738-a32f-ca0f53bd0e04\") " Dec 01 08:32:41 crc kubenswrapper[5004]: I1201 08:32:41.262938 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc84512f-63bb-4738-a32f-ca0f53bd0e04-config\") pod \"dc84512f-63bb-4738-a32f-ca0f53bd0e04\" (UID: \"dc84512f-63bb-4738-a32f-ca0f53bd0e04\") " Dec 01 08:32:41 crc kubenswrapper[5004]: I1201 08:32:41.262946 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dc84512f-63bb-4738-a32f-ca0f53bd0e04-datadir" (OuterVolumeSpecName: "datadir") pod "dc84512f-63bb-4738-a32f-ca0f53bd0e04" (UID: "dc84512f-63bb-4738-a32f-ca0f53bd0e04"). InnerVolumeSpecName "datadir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 08:32:41 crc kubenswrapper[5004]: I1201 08:32:41.262992 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fp78d\" (UniqueName: \"kubernetes.io/projected/dc84512f-63bb-4738-a32f-ca0f53bd0e04-kube-api-access-fp78d\") pod \"dc84512f-63bb-4738-a32f-ca0f53bd0e04\" (UID: \"dc84512f-63bb-4738-a32f-ca0f53bd0e04\") " Dec 01 08:32:41 crc kubenswrapper[5004]: I1201 08:32:41.263076 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/dc84512f-63bb-4738-a32f-ca0f53bd0e04-sa-token\") pod \"dc84512f-63bb-4738-a32f-ca0f53bd0e04\" (UID: \"dc84512f-63bb-4738-a32f-ca0f53bd0e04\") " Dec 01 08:32:41 crc kubenswrapper[5004]: I1201 08:32:41.263143 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/dc84512f-63bb-4738-a32f-ca0f53bd0e04-tmp\") pod \"dc84512f-63bb-4738-a32f-ca0f53bd0e04\" (UID: \"dc84512f-63bb-4738-a32f-ca0f53bd0e04\") " Dec 01 08:32:41 crc kubenswrapper[5004]: I1201 08:32:41.263193 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/dc84512f-63bb-4738-a32f-ca0f53bd0e04-entrypoint\") pod \"dc84512f-63bb-4738-a32f-ca0f53bd0e04\" (UID: \"dc84512f-63bb-4738-a32f-ca0f53bd0e04\") " Dec 01 08:32:41 crc kubenswrapper[5004]: I1201 08:32:41.263266 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/dc84512f-63bb-4738-a32f-ca0f53bd0e04-collector-token\") pod \"dc84512f-63bb-4738-a32f-ca0f53bd0e04\" (UID: \"dc84512f-63bb-4738-a32f-ca0f53bd0e04\") " Dec 01 08:32:41 crc kubenswrapper[5004]: I1201 08:32:41.263313 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics\" (UniqueName: 
\"kubernetes.io/secret/dc84512f-63bb-4738-a32f-ca0f53bd0e04-metrics\") pod \"dc84512f-63bb-4738-a32f-ca0f53bd0e04\" (UID: \"dc84512f-63bb-4738-a32f-ca0f53bd0e04\") " Dec 01 08:32:41 crc kubenswrapper[5004]: I1201 08:32:41.263356 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/dc84512f-63bb-4738-a32f-ca0f53bd0e04-config-openshift-service-cacrt\") pod \"dc84512f-63bb-4738-a32f-ca0f53bd0e04\" (UID: \"dc84512f-63bb-4738-a32f-ca0f53bd0e04\") " Dec 01 08:32:41 crc kubenswrapper[5004]: I1201 08:32:41.263403 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dc84512f-63bb-4738-a32f-ca0f53bd0e04-trusted-ca\") pod \"dc84512f-63bb-4738-a32f-ca0f53bd0e04\" (UID: \"dc84512f-63bb-4738-a32f-ca0f53bd0e04\") " Dec 01 08:32:41 crc kubenswrapper[5004]: I1201 08:32:41.263843 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc84512f-63bb-4738-a32f-ca0f53bd0e04-config-openshift-service-cacrt" (OuterVolumeSpecName: "config-openshift-service-cacrt") pod "dc84512f-63bb-4738-a32f-ca0f53bd0e04" (UID: "dc84512f-63bb-4738-a32f-ca0f53bd0e04"). InnerVolumeSpecName "config-openshift-service-cacrt". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:32:41 crc kubenswrapper[5004]: I1201 08:32:41.264217 5004 reconciler_common.go:293] "Volume detached for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/dc84512f-63bb-4738-a32f-ca0f53bd0e04-config-openshift-service-cacrt\") on node \"crc\" DevicePath \"\"" Dec 01 08:32:41 crc kubenswrapper[5004]: I1201 08:32:41.264245 5004 reconciler_common.go:293] "Volume detached for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/dc84512f-63bb-4738-a32f-ca0f53bd0e04-datadir\") on node \"crc\" DevicePath \"\"" Dec 01 08:32:41 crc kubenswrapper[5004]: I1201 08:32:41.264206 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc84512f-63bb-4738-a32f-ca0f53bd0e04-entrypoint" (OuterVolumeSpecName: "entrypoint") pod "dc84512f-63bb-4738-a32f-ca0f53bd0e04" (UID: "dc84512f-63bb-4738-a32f-ca0f53bd0e04"). InnerVolumeSpecName "entrypoint". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:32:41 crc kubenswrapper[5004]: I1201 08:32:41.264282 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc84512f-63bb-4738-a32f-ca0f53bd0e04-config" (OuterVolumeSpecName: "config") pod "dc84512f-63bb-4738-a32f-ca0f53bd0e04" (UID: "dc84512f-63bb-4738-a32f-ca0f53bd0e04"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:32:41 crc kubenswrapper[5004]: I1201 08:32:41.264411 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc84512f-63bb-4738-a32f-ca0f53bd0e04-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "dc84512f-63bb-4738-a32f-ca0f53bd0e04" (UID: "dc84512f-63bb-4738-a32f-ca0f53bd0e04"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:32:41 crc kubenswrapper[5004]: I1201 08:32:41.267093 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc84512f-63bb-4738-a32f-ca0f53bd0e04-tmp" (OuterVolumeSpecName: "tmp") pod "dc84512f-63bb-4738-a32f-ca0f53bd0e04" (UID: "dc84512f-63bb-4738-a32f-ca0f53bd0e04"). InnerVolumeSpecName "tmp". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 08:32:41 crc kubenswrapper[5004]: I1201 08:32:41.267797 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc84512f-63bb-4738-a32f-ca0f53bd0e04-collector-token" (OuterVolumeSpecName: "collector-token") pod "dc84512f-63bb-4738-a32f-ca0f53bd0e04" (UID: "dc84512f-63bb-4738-a32f-ca0f53bd0e04"). InnerVolumeSpecName "collector-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:32:41 crc kubenswrapper[5004]: I1201 08:32:41.268722 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc84512f-63bb-4738-a32f-ca0f53bd0e04-kube-api-access-fp78d" (OuterVolumeSpecName: "kube-api-access-fp78d") pod "dc84512f-63bb-4738-a32f-ca0f53bd0e04" (UID: "dc84512f-63bb-4738-a32f-ca0f53bd0e04"). InnerVolumeSpecName "kube-api-access-fp78d". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:32:41 crc kubenswrapper[5004]: I1201 08:32:41.276820 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc84512f-63bb-4738-a32f-ca0f53bd0e04-sa-token" (OuterVolumeSpecName: "sa-token") pod "dc84512f-63bb-4738-a32f-ca0f53bd0e04" (UID: "dc84512f-63bb-4738-a32f-ca0f53bd0e04"). InnerVolumeSpecName "sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:32:41 crc kubenswrapper[5004]: I1201 08:32:41.277764 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc84512f-63bb-4738-a32f-ca0f53bd0e04-metrics" (OuterVolumeSpecName: "metrics") pod "dc84512f-63bb-4738-a32f-ca0f53bd0e04" (UID: "dc84512f-63bb-4738-a32f-ca0f53bd0e04"). InnerVolumeSpecName "metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:32:41 crc kubenswrapper[5004]: I1201 08:32:41.365369 5004 reconciler_common.go:293] "Volume detached for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/dc84512f-63bb-4738-a32f-ca0f53bd0e04-collector-token\") on node \"crc\" DevicePath \"\"" Dec 01 08:32:41 crc kubenswrapper[5004]: I1201 08:32:41.365420 5004 reconciler_common.go:293] "Volume detached for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/dc84512f-63bb-4738-a32f-ca0f53bd0e04-metrics\") on node \"crc\" DevicePath \"\"" Dec 01 08:32:41 crc kubenswrapper[5004]: I1201 08:32:41.365429 5004 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dc84512f-63bb-4738-a32f-ca0f53bd0e04-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 01 08:32:41 crc kubenswrapper[5004]: I1201 08:32:41.365438 5004 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc84512f-63bb-4738-a32f-ca0f53bd0e04-config\") on node \"crc\" DevicePath \"\"" Dec 01 08:32:41 crc kubenswrapper[5004]: I1201 08:32:41.365447 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fp78d\" (UniqueName: \"kubernetes.io/projected/dc84512f-63bb-4738-a32f-ca0f53bd0e04-kube-api-access-fp78d\") on node \"crc\" DevicePath \"\"" Dec 01 08:32:41 crc kubenswrapper[5004]: I1201 08:32:41.365456 5004 reconciler_common.go:293] "Volume detached for volume \"sa-token\" (UniqueName: 
\"kubernetes.io/projected/dc84512f-63bb-4738-a32f-ca0f53bd0e04-sa-token\") on node \"crc\" DevicePath \"\"" Dec 01 08:32:41 crc kubenswrapper[5004]: I1201 08:32:41.365464 5004 reconciler_common.go:293] "Volume detached for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/dc84512f-63bb-4738-a32f-ca0f53bd0e04-tmp\") on node \"crc\" DevicePath \"\"" Dec 01 08:32:41 crc kubenswrapper[5004]: I1201 08:32:41.365472 5004 reconciler_common.go:293] "Volume detached for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/dc84512f-63bb-4738-a32f-ca0f53bd0e04-entrypoint\") on node \"crc\" DevicePath \"\"" Dec 01 08:32:41 crc kubenswrapper[5004]: I1201 08:32:41.569065 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/dc84512f-63bb-4738-a32f-ca0f53bd0e04-collector-syslog-receiver\") pod \"collector-mwfcv\" (UID: \"dc84512f-63bb-4738-a32f-ca0f53bd0e04\") " pod="openshift-logging/collector-mwfcv" Dec 01 08:32:41 crc kubenswrapper[5004]: I1201 08:32:41.580997 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/dc84512f-63bb-4738-a32f-ca0f53bd0e04-collector-syslog-receiver\") pod \"collector-mwfcv\" (UID: \"dc84512f-63bb-4738-a32f-ca0f53bd0e04\") " pod="openshift-logging/collector-mwfcv" Dec 01 08:32:41 crc kubenswrapper[5004]: I1201 08:32:41.772828 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/dc84512f-63bb-4738-a32f-ca0f53bd0e04-collector-syslog-receiver\") pod \"dc84512f-63bb-4738-a32f-ca0f53bd0e04\" (UID: \"dc84512f-63bb-4738-a32f-ca0f53bd0e04\") " Dec 01 08:32:41 crc kubenswrapper[5004]: I1201 08:32:41.777221 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc84512f-63bb-4738-a32f-ca0f53bd0e04-collector-syslog-receiver" 
(OuterVolumeSpecName: "collector-syslog-receiver") pod "dc84512f-63bb-4738-a32f-ca0f53bd0e04" (UID: "dc84512f-63bb-4738-a32f-ca0f53bd0e04"). InnerVolumeSpecName "collector-syslog-receiver". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:32:41 crc kubenswrapper[5004]: I1201 08:32:41.875287 5004 reconciler_common.go:293] "Volume detached for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/dc84512f-63bb-4738-a32f-ca0f53bd0e04-collector-syslog-receiver\") on node \"crc\" DevicePath \"\"" Dec 01 08:32:42 crc kubenswrapper[5004]: I1201 08:32:42.160756 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/collector-mwfcv" Dec 01 08:32:42 crc kubenswrapper[5004]: I1201 08:32:42.246466 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-logging/collector-mwfcv"] Dec 01 08:32:42 crc kubenswrapper[5004]: I1201 08:32:42.268233 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-logging/collector-mwfcv"] Dec 01 08:32:42 crc kubenswrapper[5004]: I1201 08:32:42.282990 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/collector-qdtrq"] Dec 01 08:32:42 crc kubenswrapper[5004]: I1201 08:32:42.284604 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/collector-qdtrq" Dec 01 08:32:42 crc kubenswrapper[5004]: I1201 08:32:42.286758 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-metrics" Dec 01 08:32:42 crc kubenswrapper[5004]: I1201 08:32:42.287060 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-syslog-receiver" Dec 01 08:32:42 crc kubenswrapper[5004]: I1201 08:32:42.287464 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"collector-config" Dec 01 08:32:42 crc kubenswrapper[5004]: I1201 08:32:42.287719 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-dockercfg-s8k7l" Dec 01 08:32:42 crc kubenswrapper[5004]: I1201 08:32:42.287920 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-token" Dec 01 08:32:42 crc kubenswrapper[5004]: I1201 08:32:42.294937 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/collector-qdtrq"] Dec 01 08:32:42 crc kubenswrapper[5004]: I1201 08:32:42.295195 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"collector-trustbundle" Dec 01 08:32:42 crc kubenswrapper[5004]: I1201 08:32:42.385477 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/55505d98-5690-43fc-b3a9-87f4d3d8db26-collector-token\") pod \"collector-qdtrq\" (UID: \"55505d98-5690-43fc-b3a9-87f4d3d8db26\") " pod="openshift-logging/collector-qdtrq" Dec 01 08:32:42 crc kubenswrapper[5004]: I1201 08:32:42.385533 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/55505d98-5690-43fc-b3a9-87f4d3d8db26-tmp\") pod \"collector-qdtrq\" (UID: \"55505d98-5690-43fc-b3a9-87f4d3d8db26\") " 
pod="openshift-logging/collector-qdtrq" Dec 01 08:32:42 crc kubenswrapper[5004]: I1201 08:32:42.385576 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/55505d98-5690-43fc-b3a9-87f4d3d8db26-metrics\") pod \"collector-qdtrq\" (UID: \"55505d98-5690-43fc-b3a9-87f4d3d8db26\") " pod="openshift-logging/collector-qdtrq" Dec 01 08:32:42 crc kubenswrapper[5004]: I1201 08:32:42.385598 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/55505d98-5690-43fc-b3a9-87f4d3d8db26-entrypoint\") pod \"collector-qdtrq\" (UID: \"55505d98-5690-43fc-b3a9-87f4d3d8db26\") " pod="openshift-logging/collector-qdtrq" Dec 01 08:32:42 crc kubenswrapper[5004]: I1201 08:32:42.385638 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/55505d98-5690-43fc-b3a9-87f4d3d8db26-sa-token\") pod \"collector-qdtrq\" (UID: \"55505d98-5690-43fc-b3a9-87f4d3d8db26\") " pod="openshift-logging/collector-qdtrq" Dec 01 08:32:42 crc kubenswrapper[5004]: I1201 08:32:42.385676 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/55505d98-5690-43fc-b3a9-87f4d3d8db26-collector-syslog-receiver\") pod \"collector-qdtrq\" (UID: \"55505d98-5690-43fc-b3a9-87f4d3d8db26\") " pod="openshift-logging/collector-qdtrq" Dec 01 08:32:42 crc kubenswrapper[5004]: I1201 08:32:42.385706 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/55505d98-5690-43fc-b3a9-87f4d3d8db26-config-openshift-service-cacrt\") pod \"collector-qdtrq\" (UID: \"55505d98-5690-43fc-b3a9-87f4d3d8db26\") " 
pod="openshift-logging/collector-qdtrq" Dec 01 08:32:42 crc kubenswrapper[5004]: I1201 08:32:42.385738 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55505d98-5690-43fc-b3a9-87f4d3d8db26-config\") pod \"collector-qdtrq\" (UID: \"55505d98-5690-43fc-b3a9-87f4d3d8db26\") " pod="openshift-logging/collector-qdtrq" Dec 01 08:32:42 crc kubenswrapper[5004]: I1201 08:32:42.385811 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/55505d98-5690-43fc-b3a9-87f4d3d8db26-trusted-ca\") pod \"collector-qdtrq\" (UID: \"55505d98-5690-43fc-b3a9-87f4d3d8db26\") " pod="openshift-logging/collector-qdtrq" Dec 01 08:32:42 crc kubenswrapper[5004]: I1201 08:32:42.385883 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4d8q\" (UniqueName: \"kubernetes.io/projected/55505d98-5690-43fc-b3a9-87f4d3d8db26-kube-api-access-t4d8q\") pod \"collector-qdtrq\" (UID: \"55505d98-5690-43fc-b3a9-87f4d3d8db26\") " pod="openshift-logging/collector-qdtrq" Dec 01 08:32:42 crc kubenswrapper[5004]: I1201 08:32:42.385977 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/55505d98-5690-43fc-b3a9-87f4d3d8db26-datadir\") pod \"collector-qdtrq\" (UID: \"55505d98-5690-43fc-b3a9-87f4d3d8db26\") " pod="openshift-logging/collector-qdtrq" Dec 01 08:32:42 crc kubenswrapper[5004]: I1201 08:32:42.488244 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/55505d98-5690-43fc-b3a9-87f4d3d8db26-datadir\") pod \"collector-qdtrq\" (UID: \"55505d98-5690-43fc-b3a9-87f4d3d8db26\") " pod="openshift-logging/collector-qdtrq" Dec 01 08:32:42 crc kubenswrapper[5004]: I1201 08:32:42.488327 
5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/55505d98-5690-43fc-b3a9-87f4d3d8db26-collector-token\") pod \"collector-qdtrq\" (UID: \"55505d98-5690-43fc-b3a9-87f4d3d8db26\") " pod="openshift-logging/collector-qdtrq" Dec 01 08:32:42 crc kubenswrapper[5004]: I1201 08:32:42.488374 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/55505d98-5690-43fc-b3a9-87f4d3d8db26-tmp\") pod \"collector-qdtrq\" (UID: \"55505d98-5690-43fc-b3a9-87f4d3d8db26\") " pod="openshift-logging/collector-qdtrq" Dec 01 08:32:42 crc kubenswrapper[5004]: I1201 08:32:42.488422 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/55505d98-5690-43fc-b3a9-87f4d3d8db26-datadir\") pod \"collector-qdtrq\" (UID: \"55505d98-5690-43fc-b3a9-87f4d3d8db26\") " pod="openshift-logging/collector-qdtrq" Dec 01 08:32:42 crc kubenswrapper[5004]: I1201 08:32:42.489445 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/55505d98-5690-43fc-b3a9-87f4d3d8db26-metrics\") pod \"collector-qdtrq\" (UID: \"55505d98-5690-43fc-b3a9-87f4d3d8db26\") " pod="openshift-logging/collector-qdtrq" Dec 01 08:32:42 crc kubenswrapper[5004]: I1201 08:32:42.489518 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/55505d98-5690-43fc-b3a9-87f4d3d8db26-entrypoint\") pod \"collector-qdtrq\" (UID: \"55505d98-5690-43fc-b3a9-87f4d3d8db26\") " pod="openshift-logging/collector-qdtrq" Dec 01 08:32:42 crc kubenswrapper[5004]: I1201 08:32:42.489613 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/55505d98-5690-43fc-b3a9-87f4d3d8db26-sa-token\") pod \"collector-qdtrq\" (UID: 
\"55505d98-5690-43fc-b3a9-87f4d3d8db26\") " pod="openshift-logging/collector-qdtrq" Dec 01 08:32:42 crc kubenswrapper[5004]: I1201 08:32:42.489686 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/55505d98-5690-43fc-b3a9-87f4d3d8db26-collector-syslog-receiver\") pod \"collector-qdtrq\" (UID: \"55505d98-5690-43fc-b3a9-87f4d3d8db26\") " pod="openshift-logging/collector-qdtrq" Dec 01 08:32:42 crc kubenswrapper[5004]: I1201 08:32:42.489744 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/55505d98-5690-43fc-b3a9-87f4d3d8db26-config-openshift-service-cacrt\") pod \"collector-qdtrq\" (UID: \"55505d98-5690-43fc-b3a9-87f4d3d8db26\") " pod="openshift-logging/collector-qdtrq" Dec 01 08:32:42 crc kubenswrapper[5004]: I1201 08:32:42.489792 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55505d98-5690-43fc-b3a9-87f4d3d8db26-config\") pod \"collector-qdtrq\" (UID: \"55505d98-5690-43fc-b3a9-87f4d3d8db26\") " pod="openshift-logging/collector-qdtrq" Dec 01 08:32:42 crc kubenswrapper[5004]: I1201 08:32:42.489839 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/55505d98-5690-43fc-b3a9-87f4d3d8db26-trusted-ca\") pod \"collector-qdtrq\" (UID: \"55505d98-5690-43fc-b3a9-87f4d3d8db26\") " pod="openshift-logging/collector-qdtrq" Dec 01 08:32:42 crc kubenswrapper[5004]: I1201 08:32:42.489878 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t4d8q\" (UniqueName: \"kubernetes.io/projected/55505d98-5690-43fc-b3a9-87f4d3d8db26-kube-api-access-t4d8q\") pod \"collector-qdtrq\" (UID: \"55505d98-5690-43fc-b3a9-87f4d3d8db26\") " pod="openshift-logging/collector-qdtrq" Dec 01 08:32:42 crc 
kubenswrapper[5004]: I1201 08:32:42.490307 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/55505d98-5690-43fc-b3a9-87f4d3d8db26-entrypoint\") pod \"collector-qdtrq\" (UID: \"55505d98-5690-43fc-b3a9-87f4d3d8db26\") " pod="openshift-logging/collector-qdtrq" Dec 01 08:32:42 crc kubenswrapper[5004]: I1201 08:32:42.490434 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/55505d98-5690-43fc-b3a9-87f4d3d8db26-config-openshift-service-cacrt\") pod \"collector-qdtrq\" (UID: \"55505d98-5690-43fc-b3a9-87f4d3d8db26\") " pod="openshift-logging/collector-qdtrq" Dec 01 08:32:42 crc kubenswrapper[5004]: I1201 08:32:42.492073 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55505d98-5690-43fc-b3a9-87f4d3d8db26-config\") pod \"collector-qdtrq\" (UID: \"55505d98-5690-43fc-b3a9-87f4d3d8db26\") " pod="openshift-logging/collector-qdtrq" Dec 01 08:32:42 crc kubenswrapper[5004]: I1201 08:32:42.492108 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/55505d98-5690-43fc-b3a9-87f4d3d8db26-collector-token\") pod \"collector-qdtrq\" (UID: \"55505d98-5690-43fc-b3a9-87f4d3d8db26\") " pod="openshift-logging/collector-qdtrq" Dec 01 08:32:42 crc kubenswrapper[5004]: I1201 08:32:42.492378 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/55505d98-5690-43fc-b3a9-87f4d3d8db26-collector-syslog-receiver\") pod \"collector-qdtrq\" (UID: \"55505d98-5690-43fc-b3a9-87f4d3d8db26\") " pod="openshift-logging/collector-qdtrq" Dec 01 08:32:42 crc kubenswrapper[5004]: I1201 08:32:42.492945 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/55505d98-5690-43fc-b3a9-87f4d3d8db26-tmp\") pod \"collector-qdtrq\" (UID: \"55505d98-5690-43fc-b3a9-87f4d3d8db26\") " pod="openshift-logging/collector-qdtrq" Dec 01 08:32:42 crc kubenswrapper[5004]: I1201 08:32:42.494374 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/55505d98-5690-43fc-b3a9-87f4d3d8db26-trusted-ca\") pod \"collector-qdtrq\" (UID: \"55505d98-5690-43fc-b3a9-87f4d3d8db26\") " pod="openshift-logging/collector-qdtrq" Dec 01 08:32:42 crc kubenswrapper[5004]: I1201 08:32:42.495975 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/55505d98-5690-43fc-b3a9-87f4d3d8db26-metrics\") pod \"collector-qdtrq\" (UID: \"55505d98-5690-43fc-b3a9-87f4d3d8db26\") " pod="openshift-logging/collector-qdtrq" Dec 01 08:32:42 crc kubenswrapper[5004]: I1201 08:32:42.504959 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/55505d98-5690-43fc-b3a9-87f4d3d8db26-sa-token\") pod \"collector-qdtrq\" (UID: \"55505d98-5690-43fc-b3a9-87f4d3d8db26\") " pod="openshift-logging/collector-qdtrq" Dec 01 08:32:42 crc kubenswrapper[5004]: I1201 08:32:42.505860 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4d8q\" (UniqueName: \"kubernetes.io/projected/55505d98-5690-43fc-b3a9-87f4d3d8db26-kube-api-access-t4d8q\") pod \"collector-qdtrq\" (UID: \"55505d98-5690-43fc-b3a9-87f4d3d8db26\") " pod="openshift-logging/collector-qdtrq" Dec 01 08:32:42 crc kubenswrapper[5004]: I1201 08:32:42.600988 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/collector-qdtrq" Dec 01 08:32:42 crc kubenswrapper[5004]: I1201 08:32:42.773350 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc84512f-63bb-4738-a32f-ca0f53bd0e04" path="/var/lib/kubelet/pods/dc84512f-63bb-4738-a32f-ca0f53bd0e04/volumes" Dec 01 08:32:43 crc kubenswrapper[5004]: I1201 08:32:43.026963 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/collector-qdtrq"] Dec 01 08:32:43 crc kubenswrapper[5004]: I1201 08:32:43.168932 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/collector-qdtrq" event={"ID":"55505d98-5690-43fc-b3a9-87f4d3d8db26","Type":"ContainerStarted","Data":"bb941ab2a1a9c764585c7feeac0857112fdab42ddb06d81a7bc02899e1babd06"} Dec 01 08:32:44 crc kubenswrapper[5004]: I1201 08:32:44.517661 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-xznmr" Dec 01 08:32:44 crc kubenswrapper[5004]: I1201 08:32:44.517730 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-xznmr" Dec 01 08:32:44 crc kubenswrapper[5004]: I1201 08:32:44.571104 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-xznmr" Dec 01 08:32:45 crc kubenswrapper[5004]: I1201 08:32:45.226825 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-xznmr" Dec 01 08:32:45 crc kubenswrapper[5004]: I1201 08:32:45.284721 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xznmr"] Dec 01 08:32:47 crc kubenswrapper[5004]: I1201 08:32:47.201000 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-xznmr" podUID="293e45a0-4c15-431b-9c41-8dd181912218" containerName="registry-server" 
containerID="cri-o://f085a75809d0f757a22f941159f0782368452107978b9b7268f10cb18b7de05c" gracePeriod=2 Dec 01 08:32:48 crc kubenswrapper[5004]: I1201 08:32:48.207835 5004 generic.go:334] "Generic (PLEG): container finished" podID="293e45a0-4c15-431b-9c41-8dd181912218" containerID="f085a75809d0f757a22f941159f0782368452107978b9b7268f10cb18b7de05c" exitCode=0 Dec 01 08:32:48 crc kubenswrapper[5004]: I1201 08:32:48.207911 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xznmr" event={"ID":"293e45a0-4c15-431b-9c41-8dd181912218","Type":"ContainerDied","Data":"f085a75809d0f757a22f941159f0782368452107978b9b7268f10cb18b7de05c"} Dec 01 08:32:49 crc kubenswrapper[5004]: I1201 08:32:49.502077 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xznmr" Dec 01 08:32:49 crc kubenswrapper[5004]: I1201 08:32:49.602409 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xwsvk\" (UniqueName: \"kubernetes.io/projected/293e45a0-4c15-431b-9c41-8dd181912218-kube-api-access-xwsvk\") pod \"293e45a0-4c15-431b-9c41-8dd181912218\" (UID: \"293e45a0-4c15-431b-9c41-8dd181912218\") " Dec 01 08:32:49 crc kubenswrapper[5004]: I1201 08:32:49.602498 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/293e45a0-4c15-431b-9c41-8dd181912218-catalog-content\") pod \"293e45a0-4c15-431b-9c41-8dd181912218\" (UID: \"293e45a0-4c15-431b-9c41-8dd181912218\") " Dec 01 08:32:49 crc kubenswrapper[5004]: I1201 08:32:49.602550 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/293e45a0-4c15-431b-9c41-8dd181912218-utilities\") pod \"293e45a0-4c15-431b-9c41-8dd181912218\" (UID: \"293e45a0-4c15-431b-9c41-8dd181912218\") " Dec 01 08:32:49 crc kubenswrapper[5004]: I1201 
08:32:49.603581 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/293e45a0-4c15-431b-9c41-8dd181912218-utilities" (OuterVolumeSpecName: "utilities") pod "293e45a0-4c15-431b-9c41-8dd181912218" (UID: "293e45a0-4c15-431b-9c41-8dd181912218"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 08:32:49 crc kubenswrapper[5004]: I1201 08:32:49.607180 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/293e45a0-4c15-431b-9c41-8dd181912218-kube-api-access-xwsvk" (OuterVolumeSpecName: "kube-api-access-xwsvk") pod "293e45a0-4c15-431b-9c41-8dd181912218" (UID: "293e45a0-4c15-431b-9c41-8dd181912218"). InnerVolumeSpecName "kube-api-access-xwsvk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:32:49 crc kubenswrapper[5004]: I1201 08:32:49.658835 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/293e45a0-4c15-431b-9c41-8dd181912218-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "293e45a0-4c15-431b-9c41-8dd181912218" (UID: "293e45a0-4c15-431b-9c41-8dd181912218"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 08:32:49 crc kubenswrapper[5004]: I1201 08:32:49.716224 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xwsvk\" (UniqueName: \"kubernetes.io/projected/293e45a0-4c15-431b-9c41-8dd181912218-kube-api-access-xwsvk\") on node \"crc\" DevicePath \"\"" Dec 01 08:32:49 crc kubenswrapper[5004]: I1201 08:32:49.716262 5004 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/293e45a0-4c15-431b-9c41-8dd181912218-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 08:32:49 crc kubenswrapper[5004]: I1201 08:32:49.716272 5004 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/293e45a0-4c15-431b-9c41-8dd181912218-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 08:32:50 crc kubenswrapper[5004]: I1201 08:32:50.227609 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xznmr" event={"ID":"293e45a0-4c15-431b-9c41-8dd181912218","Type":"ContainerDied","Data":"96dd0f77c27f71d9bf90a967fcbcc3fbd0399315f40e5ce5f372f3dc0a0c0468"} Dec 01 08:32:50 crc kubenswrapper[5004]: I1201 08:32:50.227958 5004 scope.go:117] "RemoveContainer" containerID="f085a75809d0f757a22f941159f0782368452107978b9b7268f10cb18b7de05c" Dec 01 08:32:50 crc kubenswrapper[5004]: I1201 08:32:50.227905 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xznmr" Dec 01 08:32:50 crc kubenswrapper[5004]: I1201 08:32:50.229920 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/collector-qdtrq" event={"ID":"55505d98-5690-43fc-b3a9-87f4d3d8db26","Type":"ContainerStarted","Data":"62f8d49cc16e00b3aad863a443884f585362168766aefceb6fc7f5e19fa0b5b2"} Dec 01 08:32:50 crc kubenswrapper[5004]: I1201 08:32:50.256310 5004 scope.go:117] "RemoveContainer" containerID="1ef3bd684cea2a94d6f11d5c583cda7183cfab6256488c40de86c21556df36e5" Dec 01 08:32:50 crc kubenswrapper[5004]: I1201 08:32:50.286208 5004 scope.go:117] "RemoveContainer" containerID="a8f074e275836b152d5faa0eb24e1e0706366446ae6de11ea07bfa1be457d35e" Dec 01 08:32:50 crc kubenswrapper[5004]: I1201 08:32:50.288544 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/collector-qdtrq" podStartSLOduration=2.060473572 podStartE2EDuration="8.288527923s" podCreationTimestamp="2025-12-01 08:32:42 +0000 UTC" firstStartedPulling="2025-12-01 08:32:43.034270518 +0000 UTC m=+940.599262500" lastFinishedPulling="2025-12-01 08:32:49.262324869 +0000 UTC m=+946.827316851" observedRunningTime="2025-12-01 08:32:50.286464672 +0000 UTC m=+947.851456664" watchObservedRunningTime="2025-12-01 08:32:50.288527923 +0000 UTC m=+947.853519905" Dec 01 08:32:50 crc kubenswrapper[5004]: I1201 08:32:50.355631 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xznmr"] Dec 01 08:32:50 crc kubenswrapper[5004]: I1201 08:32:50.366409 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-xznmr"] Dec 01 08:32:50 crc kubenswrapper[5004]: I1201 08:32:50.772291 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="293e45a0-4c15-431b-9c41-8dd181912218" path="/var/lib/kubelet/pods/293e45a0-4c15-431b-9c41-8dd181912218/volumes" Dec 01 08:33:08 crc 
kubenswrapper[5004]: I1201 08:33:08.729313 5004 patch_prober.go:28] interesting pod/machine-config-daemon-fvdgt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 08:33:08 crc kubenswrapper[5004]: I1201 08:33:08.730086 5004 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 08:33:19 crc kubenswrapper[5004]: I1201 08:33:19.930817 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fghhzz"] Dec 01 08:33:19 crc kubenswrapper[5004]: E1201 08:33:19.931568 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="293e45a0-4c15-431b-9c41-8dd181912218" containerName="registry-server" Dec 01 08:33:19 crc kubenswrapper[5004]: I1201 08:33:19.931592 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="293e45a0-4c15-431b-9c41-8dd181912218" containerName="registry-server" Dec 01 08:33:19 crc kubenswrapper[5004]: E1201 08:33:19.931615 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="293e45a0-4c15-431b-9c41-8dd181912218" containerName="extract-content" Dec 01 08:33:19 crc kubenswrapper[5004]: I1201 08:33:19.931622 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="293e45a0-4c15-431b-9c41-8dd181912218" containerName="extract-content" Dec 01 08:33:19 crc kubenswrapper[5004]: E1201 08:33:19.931644 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="293e45a0-4c15-431b-9c41-8dd181912218" containerName="extract-utilities" Dec 01 08:33:19 crc kubenswrapper[5004]: I1201 08:33:19.931650 
5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="293e45a0-4c15-431b-9c41-8dd181912218" containerName="extract-utilities" Dec 01 08:33:19 crc kubenswrapper[5004]: I1201 08:33:19.931767 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="293e45a0-4c15-431b-9c41-8dd181912218" containerName="registry-server" Dec 01 08:33:19 crc kubenswrapper[5004]: I1201 08:33:19.932900 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fghhzz" Dec 01 08:33:19 crc kubenswrapper[5004]: I1201 08:33:19.935768 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 01 08:33:19 crc kubenswrapper[5004]: I1201 08:33:19.955807 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fghhzz"] Dec 01 08:33:19 crc kubenswrapper[5004]: I1201 08:33:19.997708 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bbc10d54-27c3-4dcb-beb7-d1b675428a2c-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fghhzz\" (UID: \"bbc10d54-27c3-4dcb-beb7-d1b675428a2c\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fghhzz" Dec 01 08:33:19 crc kubenswrapper[5004]: I1201 08:33:19.997782 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bv52g\" (UniqueName: \"kubernetes.io/projected/bbc10d54-27c3-4dcb-beb7-d1b675428a2c-kube-api-access-bv52g\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fghhzz\" (UID: \"bbc10d54-27c3-4dcb-beb7-d1b675428a2c\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fghhzz" Dec 01 08:33:19 crc kubenswrapper[5004]: I1201 08:33:19.997833 5004 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bbc10d54-27c3-4dcb-beb7-d1b675428a2c-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fghhzz\" (UID: \"bbc10d54-27c3-4dcb-beb7-d1b675428a2c\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fghhzz" Dec 01 08:33:20 crc kubenswrapper[5004]: I1201 08:33:20.099412 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bbc10d54-27c3-4dcb-beb7-d1b675428a2c-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fghhzz\" (UID: \"bbc10d54-27c3-4dcb-beb7-d1b675428a2c\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fghhzz" Dec 01 08:33:20 crc kubenswrapper[5004]: I1201 08:33:20.099531 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bv52g\" (UniqueName: \"kubernetes.io/projected/bbc10d54-27c3-4dcb-beb7-d1b675428a2c-kube-api-access-bv52g\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fghhzz\" (UID: \"bbc10d54-27c3-4dcb-beb7-d1b675428a2c\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fghhzz" Dec 01 08:33:20 crc kubenswrapper[5004]: I1201 08:33:20.099604 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bbc10d54-27c3-4dcb-beb7-d1b675428a2c-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fghhzz\" (UID: \"bbc10d54-27c3-4dcb-beb7-d1b675428a2c\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fghhzz" Dec 01 08:33:20 crc kubenswrapper[5004]: I1201 08:33:20.100328 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/bbc10d54-27c3-4dcb-beb7-d1b675428a2c-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fghhzz\" (UID: \"bbc10d54-27c3-4dcb-beb7-d1b675428a2c\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fghhzz" Dec 01 08:33:20 crc kubenswrapper[5004]: I1201 08:33:20.100362 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bbc10d54-27c3-4dcb-beb7-d1b675428a2c-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fghhzz\" (UID: \"bbc10d54-27c3-4dcb-beb7-d1b675428a2c\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fghhzz" Dec 01 08:33:20 crc kubenswrapper[5004]: I1201 08:33:20.124957 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bv52g\" (UniqueName: \"kubernetes.io/projected/bbc10d54-27c3-4dcb-beb7-d1b675428a2c-kube-api-access-bv52g\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fghhzz\" (UID: \"bbc10d54-27c3-4dcb-beb7-d1b675428a2c\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fghhzz" Dec 01 08:33:20 crc kubenswrapper[5004]: I1201 08:33:20.251678 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fghhzz" Dec 01 08:33:20 crc kubenswrapper[5004]: I1201 08:33:20.657068 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fghhzz"] Dec 01 08:33:21 crc kubenswrapper[5004]: I1201 08:33:21.483003 5004 generic.go:334] "Generic (PLEG): container finished" podID="bbc10d54-27c3-4dcb-beb7-d1b675428a2c" containerID="b1efebb4bd610e1fc31c12350a2f6fc45564774e32cb2b13fd4f53b4c778d0a6" exitCode=0 Dec 01 08:33:21 crc kubenswrapper[5004]: I1201 08:33:21.483095 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fghhzz" event={"ID":"bbc10d54-27c3-4dcb-beb7-d1b675428a2c","Type":"ContainerDied","Data":"b1efebb4bd610e1fc31c12350a2f6fc45564774e32cb2b13fd4f53b4c778d0a6"} Dec 01 08:33:21 crc kubenswrapper[5004]: I1201 08:33:21.483276 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fghhzz" event={"ID":"bbc10d54-27c3-4dcb-beb7-d1b675428a2c","Type":"ContainerStarted","Data":"1e31f1f9e819f82e4ef386683cfb38d20e4da20309155d4e2b47a6b16feed42b"} Dec 01 08:33:23 crc kubenswrapper[5004]: I1201 08:33:23.503373 5004 generic.go:334] "Generic (PLEG): container finished" podID="bbc10d54-27c3-4dcb-beb7-d1b675428a2c" containerID="1a614c93e314732820c15c475041e7fc6f33069f914b50b938ed29815bb9b62e" exitCode=0 Dec 01 08:33:23 crc kubenswrapper[5004]: I1201 08:33:23.503411 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fghhzz" event={"ID":"bbc10d54-27c3-4dcb-beb7-d1b675428a2c","Type":"ContainerDied","Data":"1a614c93e314732820c15c475041e7fc6f33069f914b50b938ed29815bb9b62e"} Dec 01 08:33:24 crc kubenswrapper[5004]: I1201 08:33:24.513317 5004 
generic.go:334] "Generic (PLEG): container finished" podID="bbc10d54-27c3-4dcb-beb7-d1b675428a2c" containerID="dbdf77a27ae8ce8f8f873531b54b96b68c30adeebd96c4594107f6601ef171ea" exitCode=0 Dec 01 08:33:24 crc kubenswrapper[5004]: I1201 08:33:24.513437 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fghhzz" event={"ID":"bbc10d54-27c3-4dcb-beb7-d1b675428a2c","Type":"ContainerDied","Data":"dbdf77a27ae8ce8f8f873531b54b96b68c30adeebd96c4594107f6601ef171ea"} Dec 01 08:33:25 crc kubenswrapper[5004]: I1201 08:33:25.811972 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fghhzz" Dec 01 08:33:25 crc kubenswrapper[5004]: I1201 08:33:25.895681 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bv52g\" (UniqueName: \"kubernetes.io/projected/bbc10d54-27c3-4dcb-beb7-d1b675428a2c-kube-api-access-bv52g\") pod \"bbc10d54-27c3-4dcb-beb7-d1b675428a2c\" (UID: \"bbc10d54-27c3-4dcb-beb7-d1b675428a2c\") " Dec 01 08:33:25 crc kubenswrapper[5004]: I1201 08:33:25.895818 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bbc10d54-27c3-4dcb-beb7-d1b675428a2c-bundle\") pod \"bbc10d54-27c3-4dcb-beb7-d1b675428a2c\" (UID: \"bbc10d54-27c3-4dcb-beb7-d1b675428a2c\") " Dec 01 08:33:25 crc kubenswrapper[5004]: I1201 08:33:25.895846 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bbc10d54-27c3-4dcb-beb7-d1b675428a2c-util\") pod \"bbc10d54-27c3-4dcb-beb7-d1b675428a2c\" (UID: \"bbc10d54-27c3-4dcb-beb7-d1b675428a2c\") " Dec 01 08:33:25 crc kubenswrapper[5004]: I1201 08:33:25.897420 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/bbc10d54-27c3-4dcb-beb7-d1b675428a2c-bundle" (OuterVolumeSpecName: "bundle") pod "bbc10d54-27c3-4dcb-beb7-d1b675428a2c" (UID: "bbc10d54-27c3-4dcb-beb7-d1b675428a2c"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 08:33:25 crc kubenswrapper[5004]: I1201 08:33:25.910738 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bbc10d54-27c3-4dcb-beb7-d1b675428a2c-kube-api-access-bv52g" (OuterVolumeSpecName: "kube-api-access-bv52g") pod "bbc10d54-27c3-4dcb-beb7-d1b675428a2c" (UID: "bbc10d54-27c3-4dcb-beb7-d1b675428a2c"). InnerVolumeSpecName "kube-api-access-bv52g". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:33:25 crc kubenswrapper[5004]: I1201 08:33:25.911445 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bbc10d54-27c3-4dcb-beb7-d1b675428a2c-util" (OuterVolumeSpecName: "util") pod "bbc10d54-27c3-4dcb-beb7-d1b675428a2c" (UID: "bbc10d54-27c3-4dcb-beb7-d1b675428a2c"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 08:33:25 crc kubenswrapper[5004]: I1201 08:33:25.997815 5004 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bbc10d54-27c3-4dcb-beb7-d1b675428a2c-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 08:33:25 crc kubenswrapper[5004]: I1201 08:33:25.997865 5004 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bbc10d54-27c3-4dcb-beb7-d1b675428a2c-util\") on node \"crc\" DevicePath \"\"" Dec 01 08:33:25 crc kubenswrapper[5004]: I1201 08:33:25.997883 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bv52g\" (UniqueName: \"kubernetes.io/projected/bbc10d54-27c3-4dcb-beb7-d1b675428a2c-kube-api-access-bv52g\") on node \"crc\" DevicePath \"\"" Dec 01 08:33:26 crc kubenswrapper[5004]: I1201 08:33:26.529553 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fghhzz" event={"ID":"bbc10d54-27c3-4dcb-beb7-d1b675428a2c","Type":"ContainerDied","Data":"1e31f1f9e819f82e4ef386683cfb38d20e4da20309155d4e2b47a6b16feed42b"} Dec 01 08:33:26 crc kubenswrapper[5004]: I1201 08:33:26.529627 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fghhzz" Dec 01 08:33:26 crc kubenswrapper[5004]: I1201 08:33:26.529632 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1e31f1f9e819f82e4ef386683cfb38d20e4da20309155d4e2b47a6b16feed42b" Dec 01 08:33:31 crc kubenswrapper[5004]: I1201 08:33:31.679722 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-tvfkn"] Dec 01 08:33:31 crc kubenswrapper[5004]: E1201 08:33:31.680378 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbc10d54-27c3-4dcb-beb7-d1b675428a2c" containerName="pull" Dec 01 08:33:31 crc kubenswrapper[5004]: I1201 08:33:31.680388 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbc10d54-27c3-4dcb-beb7-d1b675428a2c" containerName="pull" Dec 01 08:33:31 crc kubenswrapper[5004]: E1201 08:33:31.680406 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbc10d54-27c3-4dcb-beb7-d1b675428a2c" containerName="extract" Dec 01 08:33:31 crc kubenswrapper[5004]: I1201 08:33:31.680411 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbc10d54-27c3-4dcb-beb7-d1b675428a2c" containerName="extract" Dec 01 08:33:31 crc kubenswrapper[5004]: E1201 08:33:31.680421 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbc10d54-27c3-4dcb-beb7-d1b675428a2c" containerName="util" Dec 01 08:33:31 crc kubenswrapper[5004]: I1201 08:33:31.680426 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbc10d54-27c3-4dcb-beb7-d1b675428a2c" containerName="util" Dec 01 08:33:31 crc kubenswrapper[5004]: I1201 08:33:31.680539 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="bbc10d54-27c3-4dcb-beb7-d1b675428a2c" containerName="extract" Dec 01 08:33:31 crc kubenswrapper[5004]: I1201 08:33:31.681059 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-tvfkn" Dec 01 08:33:31 crc kubenswrapper[5004]: I1201 08:33:31.683454 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Dec 01 08:33:31 crc kubenswrapper[5004]: I1201 08:33:31.686003 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-mv49v" Dec 01 08:33:31 crc kubenswrapper[5004]: I1201 08:33:31.689728 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-tvfkn"] Dec 01 08:33:31 crc kubenswrapper[5004]: I1201 08:33:31.690402 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Dec 01 08:33:31 crc kubenswrapper[5004]: I1201 08:33:31.791708 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfdm8\" (UniqueName: \"kubernetes.io/projected/1ac61d16-3eff-40e4-af81-79516560f041-kube-api-access-lfdm8\") pod \"nmstate-operator-5b5b58f5c8-tvfkn\" (UID: \"1ac61d16-3eff-40e4-af81-79516560f041\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-tvfkn" Dec 01 08:33:31 crc kubenswrapper[5004]: I1201 08:33:31.893277 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lfdm8\" (UniqueName: \"kubernetes.io/projected/1ac61d16-3eff-40e4-af81-79516560f041-kube-api-access-lfdm8\") pod \"nmstate-operator-5b5b58f5c8-tvfkn\" (UID: \"1ac61d16-3eff-40e4-af81-79516560f041\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-tvfkn" Dec 01 08:33:31 crc kubenswrapper[5004]: I1201 08:33:31.910159 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfdm8\" (UniqueName: \"kubernetes.io/projected/1ac61d16-3eff-40e4-af81-79516560f041-kube-api-access-lfdm8\") pod \"nmstate-operator-5b5b58f5c8-tvfkn\" (UID: 
\"1ac61d16-3eff-40e4-af81-79516560f041\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-tvfkn" Dec 01 08:33:32 crc kubenswrapper[5004]: I1201 08:33:32.027246 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-tvfkn" Dec 01 08:33:32 crc kubenswrapper[5004]: I1201 08:33:32.448721 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-tvfkn"] Dec 01 08:33:32 crc kubenswrapper[5004]: I1201 08:33:32.583322 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-tvfkn" event={"ID":"1ac61d16-3eff-40e4-af81-79516560f041","Type":"ContainerStarted","Data":"8a6c57c1a1cbcb4d4667cc4b561bf33a597f381db5c4c7b892175e3f7bb2dc2e"} Dec 01 08:33:35 crc kubenswrapper[5004]: I1201 08:33:35.611025 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-tvfkn" event={"ID":"1ac61d16-3eff-40e4-af81-79516560f041","Type":"ContainerStarted","Data":"c8e6ddd46da67c1a91e88e3f496431ad779d59fb81fe16906cefc9772fff5809"} Dec 01 08:33:35 crc kubenswrapper[5004]: I1201 08:33:35.638184 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-tvfkn" podStartSLOduration=2.190991045 podStartE2EDuration="4.63815394s" podCreationTimestamp="2025-12-01 08:33:31 +0000 UTC" firstStartedPulling="2025-12-01 08:33:32.473597996 +0000 UTC m=+990.038589978" lastFinishedPulling="2025-12-01 08:33:34.920760881 +0000 UTC m=+992.485752873" observedRunningTime="2025-12-01 08:33:35.631880186 +0000 UTC m=+993.196872198" watchObservedRunningTime="2025-12-01 08:33:35.63815394 +0000 UTC m=+993.203145962" Dec 01 08:33:38 crc kubenswrapper[5004]: I1201 08:33:38.729734 5004 patch_prober.go:28] interesting pod/machine-config-daemon-fvdgt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe 
status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 08:33:38 crc kubenswrapper[5004]: I1201 08:33:38.730586 5004 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 08:33:41 crc kubenswrapper[5004]: I1201 08:33:41.171717 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-phx6c"] Dec 01 08:33:41 crc kubenswrapper[5004]: I1201 08:33:41.173465 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-phx6c" Dec 01 08:33:41 crc kubenswrapper[5004]: I1201 08:33:41.175202 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-s8j4f" Dec 01 08:33:41 crc kubenswrapper[5004]: I1201 08:33:41.190195 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-phx6c"] Dec 01 08:33:41 crc kubenswrapper[5004]: I1201 08:33:41.195279 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-kl9rs"] Dec 01 08:33:41 crc kubenswrapper[5004]: I1201 08:33:41.196263 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-kl9rs" Dec 01 08:33:41 crc kubenswrapper[5004]: I1201 08:33:41.197635 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Dec 01 08:33:41 crc kubenswrapper[5004]: I1201 08:33:41.210466 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-jdhw7"] Dec 01 08:33:41 crc kubenswrapper[5004]: I1201 08:33:41.211265 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-jdhw7" Dec 01 08:33:41 crc kubenswrapper[5004]: I1201 08:33:41.249827 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdwnn\" (UniqueName: \"kubernetes.io/projected/1027139c-9eed-42ec-8ba6-43c330579482-kube-api-access-qdwnn\") pod \"nmstate-handler-jdhw7\" (UID: \"1027139c-9eed-42ec-8ba6-43c330579482\") " pod="openshift-nmstate/nmstate-handler-jdhw7" Dec 01 08:33:41 crc kubenswrapper[5004]: I1201 08:33:41.249878 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r96bf\" (UniqueName: \"kubernetes.io/projected/255659ba-de9a-4177-8a9f-42b2169ca1b8-kube-api-access-r96bf\") pod \"nmstate-metrics-7f946cbc9-phx6c\" (UID: \"255659ba-de9a-4177-8a9f-42b2169ca1b8\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-phx6c" Dec 01 08:33:41 crc kubenswrapper[5004]: I1201 08:33:41.249921 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9prw\" (UniqueName: \"kubernetes.io/projected/70434b12-7582-4851-b32d-034f4c21603a-kube-api-access-g9prw\") pod \"nmstate-webhook-5f6d4c5ccb-kl9rs\" (UID: \"70434b12-7582-4851-b32d-034f4c21603a\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-kl9rs" Dec 01 08:33:41 crc kubenswrapper[5004]: I1201 08:33:41.249949 5004 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/1027139c-9eed-42ec-8ba6-43c330579482-nmstate-lock\") pod \"nmstate-handler-jdhw7\" (UID: \"1027139c-9eed-42ec-8ba6-43c330579482\") " pod="openshift-nmstate/nmstate-handler-jdhw7" Dec 01 08:33:41 crc kubenswrapper[5004]: I1201 08:33:41.249979 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/70434b12-7582-4851-b32d-034f4c21603a-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-kl9rs\" (UID: \"70434b12-7582-4851-b32d-034f4c21603a\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-kl9rs" Dec 01 08:33:41 crc kubenswrapper[5004]: I1201 08:33:41.249998 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/1027139c-9eed-42ec-8ba6-43c330579482-ovs-socket\") pod \"nmstate-handler-jdhw7\" (UID: \"1027139c-9eed-42ec-8ba6-43c330579482\") " pod="openshift-nmstate/nmstate-handler-jdhw7" Dec 01 08:33:41 crc kubenswrapper[5004]: I1201 08:33:41.250013 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/1027139c-9eed-42ec-8ba6-43c330579482-dbus-socket\") pod \"nmstate-handler-jdhw7\" (UID: \"1027139c-9eed-42ec-8ba6-43c330579482\") " pod="openshift-nmstate/nmstate-handler-jdhw7" Dec 01 08:33:41 crc kubenswrapper[5004]: I1201 08:33:41.262037 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-kl9rs"] Dec 01 08:33:41 crc kubenswrapper[5004]: I1201 08:33:41.347737 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-624tf"] Dec 01 08:33:41 crc kubenswrapper[5004]: I1201 08:33:41.348981 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-624tf" Dec 01 08:33:41 crc kubenswrapper[5004]: I1201 08:33:41.351187 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r96bf\" (UniqueName: \"kubernetes.io/projected/255659ba-de9a-4177-8a9f-42b2169ca1b8-kube-api-access-r96bf\") pod \"nmstate-metrics-7f946cbc9-phx6c\" (UID: \"255659ba-de9a-4177-8a9f-42b2169ca1b8\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-phx6c" Dec 01 08:33:41 crc kubenswrapper[5004]: I1201 08:33:41.351254 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9prw\" (UniqueName: \"kubernetes.io/projected/70434b12-7582-4851-b32d-034f4c21603a-kube-api-access-g9prw\") pod \"nmstate-webhook-5f6d4c5ccb-kl9rs\" (UID: \"70434b12-7582-4851-b32d-034f4c21603a\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-kl9rs" Dec 01 08:33:41 crc kubenswrapper[5004]: I1201 08:33:41.351300 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/1027139c-9eed-42ec-8ba6-43c330579482-nmstate-lock\") pod \"nmstate-handler-jdhw7\" (UID: \"1027139c-9eed-42ec-8ba6-43c330579482\") " pod="openshift-nmstate/nmstate-handler-jdhw7" Dec 01 08:33:41 crc kubenswrapper[5004]: I1201 08:33:41.351339 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/70434b12-7582-4851-b32d-034f4c21603a-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-kl9rs\" (UID: \"70434b12-7582-4851-b32d-034f4c21603a\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-kl9rs" Dec 01 08:33:41 crc kubenswrapper[5004]: I1201 08:33:41.351364 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/1027139c-9eed-42ec-8ba6-43c330579482-ovs-socket\") pod \"nmstate-handler-jdhw7\" (UID: 
\"1027139c-9eed-42ec-8ba6-43c330579482\") " pod="openshift-nmstate/nmstate-handler-jdhw7" Dec 01 08:33:41 crc kubenswrapper[5004]: I1201 08:33:41.351388 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/1027139c-9eed-42ec-8ba6-43c330579482-dbus-socket\") pod \"nmstate-handler-jdhw7\" (UID: \"1027139c-9eed-42ec-8ba6-43c330579482\") " pod="openshift-nmstate/nmstate-handler-jdhw7" Dec 01 08:33:41 crc kubenswrapper[5004]: I1201 08:33:41.351452 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdwnn\" (UniqueName: \"kubernetes.io/projected/1027139c-9eed-42ec-8ba6-43c330579482-kube-api-access-qdwnn\") pod \"nmstate-handler-jdhw7\" (UID: \"1027139c-9eed-42ec-8ba6-43c330579482\") " pod="openshift-nmstate/nmstate-handler-jdhw7" Dec 01 08:33:41 crc kubenswrapper[5004]: I1201 08:33:41.351634 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/1027139c-9eed-42ec-8ba6-43c330579482-ovs-socket\") pod \"nmstate-handler-jdhw7\" (UID: \"1027139c-9eed-42ec-8ba6-43c330579482\") " pod="openshift-nmstate/nmstate-handler-jdhw7" Dec 01 08:33:41 crc kubenswrapper[5004]: I1201 08:33:41.351667 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/1027139c-9eed-42ec-8ba6-43c330579482-nmstate-lock\") pod \"nmstate-handler-jdhw7\" (UID: \"1027139c-9eed-42ec-8ba6-43c330579482\") " pod="openshift-nmstate/nmstate-handler-jdhw7" Dec 01 08:33:41 crc kubenswrapper[5004]: E1201 08:33:41.351726 5004 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Dec 01 08:33:41 crc kubenswrapper[5004]: E1201 08:33:41.351783 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/70434b12-7582-4851-b32d-034f4c21603a-tls-key-pair 
podName:70434b12-7582-4851-b32d-034f4c21603a nodeName:}" failed. No retries permitted until 2025-12-01 08:33:41.85176482 +0000 UTC m=+999.416756892 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/70434b12-7582-4851-b32d-034f4c21603a-tls-key-pair") pod "nmstate-webhook-5f6d4c5ccb-kl9rs" (UID: "70434b12-7582-4851-b32d-034f4c21603a") : secret "openshift-nmstate-webhook" not found Dec 01 08:33:41 crc kubenswrapper[5004]: I1201 08:33:41.351882 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/1027139c-9eed-42ec-8ba6-43c330579482-dbus-socket\") pod \"nmstate-handler-jdhw7\" (UID: \"1027139c-9eed-42ec-8ba6-43c330579482\") " pod="openshift-nmstate/nmstate-handler-jdhw7" Dec 01 08:33:41 crc kubenswrapper[5004]: I1201 08:33:41.353957 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-624tf"] Dec 01 08:33:41 crc kubenswrapper[5004]: I1201 08:33:41.356021 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-zzkpv" Dec 01 08:33:41 crc kubenswrapper[5004]: I1201 08:33:41.356144 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Dec 01 08:33:41 crc kubenswrapper[5004]: I1201 08:33:41.356395 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Dec 01 08:33:41 crc kubenswrapper[5004]: I1201 08:33:41.378435 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9prw\" (UniqueName: \"kubernetes.io/projected/70434b12-7582-4851-b32d-034f4c21603a-kube-api-access-g9prw\") pod \"nmstate-webhook-5f6d4c5ccb-kl9rs\" (UID: \"70434b12-7582-4851-b32d-034f4c21603a\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-kl9rs" Dec 01 08:33:41 crc kubenswrapper[5004]: I1201 08:33:41.379077 5004 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdwnn\" (UniqueName: \"kubernetes.io/projected/1027139c-9eed-42ec-8ba6-43c330579482-kube-api-access-qdwnn\") pod \"nmstate-handler-jdhw7\" (UID: \"1027139c-9eed-42ec-8ba6-43c330579482\") " pod="openshift-nmstate/nmstate-handler-jdhw7" Dec 01 08:33:41 crc kubenswrapper[5004]: I1201 08:33:41.385438 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r96bf\" (UniqueName: \"kubernetes.io/projected/255659ba-de9a-4177-8a9f-42b2169ca1b8-kube-api-access-r96bf\") pod \"nmstate-metrics-7f946cbc9-phx6c\" (UID: \"255659ba-de9a-4177-8a9f-42b2169ca1b8\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-phx6c" Dec 01 08:33:41 crc kubenswrapper[5004]: I1201 08:33:41.452672 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wz8w\" (UniqueName: \"kubernetes.io/projected/21e94157-2305-46da-a56e-59dce2baa4ad-kube-api-access-4wz8w\") pod \"nmstate-console-plugin-7fbb5f6569-624tf\" (UID: \"21e94157-2305-46da-a56e-59dce2baa4ad\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-624tf" Dec 01 08:33:41 crc kubenswrapper[5004]: I1201 08:33:41.452978 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/21e94157-2305-46da-a56e-59dce2baa4ad-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-624tf\" (UID: \"21e94157-2305-46da-a56e-59dce2baa4ad\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-624tf" Dec 01 08:33:41 crc kubenswrapper[5004]: I1201 08:33:41.453015 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/21e94157-2305-46da-a56e-59dce2baa4ad-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-624tf\" (UID: \"21e94157-2305-46da-a56e-59dce2baa4ad\") " 
pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-624tf" Dec 01 08:33:41 crc kubenswrapper[5004]: I1201 08:33:41.491794 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-phx6c" Dec 01 08:33:41 crc kubenswrapper[5004]: I1201 08:33:41.533944 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-df4fb84fc-flnws"] Dec 01 08:33:41 crc kubenswrapper[5004]: I1201 08:33:41.535035 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-df4fb84fc-flnws" Dec 01 08:33:41 crc kubenswrapper[5004]: I1201 08:33:41.549369 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-df4fb84fc-flnws"] Dec 01 08:33:41 crc kubenswrapper[5004]: I1201 08:33:41.554255 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wz8w\" (UniqueName: \"kubernetes.io/projected/21e94157-2305-46da-a56e-59dce2baa4ad-kube-api-access-4wz8w\") pod \"nmstate-console-plugin-7fbb5f6569-624tf\" (UID: \"21e94157-2305-46da-a56e-59dce2baa4ad\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-624tf" Dec 01 08:33:41 crc kubenswrapper[5004]: I1201 08:33:41.554293 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/21e94157-2305-46da-a56e-59dce2baa4ad-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-624tf\" (UID: \"21e94157-2305-46da-a56e-59dce2baa4ad\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-624tf" Dec 01 08:33:41 crc kubenswrapper[5004]: I1201 08:33:41.554330 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/21e94157-2305-46da-a56e-59dce2baa4ad-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-624tf\" (UID: \"21e94157-2305-46da-a56e-59dce2baa4ad\") " 
pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-624tf" Dec 01 08:33:41 crc kubenswrapper[5004]: E1201 08:33:41.554461 5004 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found Dec 01 08:33:41 crc kubenswrapper[5004]: E1201 08:33:41.554514 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/21e94157-2305-46da-a56e-59dce2baa4ad-plugin-serving-cert podName:21e94157-2305-46da-a56e-59dce2baa4ad nodeName:}" failed. No retries permitted until 2025-12-01 08:33:42.054499543 +0000 UTC m=+999.619491525 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/21e94157-2305-46da-a56e-59dce2baa4ad-plugin-serving-cert") pod "nmstate-console-plugin-7fbb5f6569-624tf" (UID: "21e94157-2305-46da-a56e-59dce2baa4ad") : secret "plugin-serving-cert" not found Dec 01 08:33:41 crc kubenswrapper[5004]: I1201 08:33:41.555417 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/21e94157-2305-46da-a56e-59dce2baa4ad-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-624tf\" (UID: \"21e94157-2305-46da-a56e-59dce2baa4ad\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-624tf" Dec 01 08:33:41 crc kubenswrapper[5004]: I1201 08:33:41.568866 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-jdhw7" Dec 01 08:33:41 crc kubenswrapper[5004]: I1201 08:33:41.583104 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wz8w\" (UniqueName: \"kubernetes.io/projected/21e94157-2305-46da-a56e-59dce2baa4ad-kube-api-access-4wz8w\") pod \"nmstate-console-plugin-7fbb5f6569-624tf\" (UID: \"21e94157-2305-46da-a56e-59dce2baa4ad\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-624tf" Dec 01 08:33:41 crc kubenswrapper[5004]: I1201 08:33:41.655757 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a5680092-beb9-4fe4-b35b-4c795980e350-console-oauth-config\") pod \"console-df4fb84fc-flnws\" (UID: \"a5680092-beb9-4fe4-b35b-4c795980e350\") " pod="openshift-console/console-df4fb84fc-flnws" Dec 01 08:33:41 crc kubenswrapper[5004]: I1201 08:33:41.656071 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a5680092-beb9-4fe4-b35b-4c795980e350-service-ca\") pod \"console-df4fb84fc-flnws\" (UID: \"a5680092-beb9-4fe4-b35b-4c795980e350\") " pod="openshift-console/console-df4fb84fc-flnws" Dec 01 08:33:41 crc kubenswrapper[5004]: I1201 08:33:41.656206 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5bqg\" (UniqueName: \"kubernetes.io/projected/a5680092-beb9-4fe4-b35b-4c795980e350-kube-api-access-l5bqg\") pod \"console-df4fb84fc-flnws\" (UID: \"a5680092-beb9-4fe4-b35b-4c795980e350\") " pod="openshift-console/console-df4fb84fc-flnws" Dec 01 08:33:41 crc kubenswrapper[5004]: I1201 08:33:41.656230 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/a5680092-beb9-4fe4-b35b-4c795980e350-console-serving-cert\") pod \"console-df4fb84fc-flnws\" (UID: \"a5680092-beb9-4fe4-b35b-4c795980e350\") " pod="openshift-console/console-df4fb84fc-flnws" Dec 01 08:33:41 crc kubenswrapper[5004]: I1201 08:33:41.656252 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a5680092-beb9-4fe4-b35b-4c795980e350-console-config\") pod \"console-df4fb84fc-flnws\" (UID: \"a5680092-beb9-4fe4-b35b-4c795980e350\") " pod="openshift-console/console-df4fb84fc-flnws" Dec 01 08:33:41 crc kubenswrapper[5004]: I1201 08:33:41.656279 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a5680092-beb9-4fe4-b35b-4c795980e350-oauth-serving-cert\") pod \"console-df4fb84fc-flnws\" (UID: \"a5680092-beb9-4fe4-b35b-4c795980e350\") " pod="openshift-console/console-df4fb84fc-flnws" Dec 01 08:33:41 crc kubenswrapper[5004]: I1201 08:33:41.656380 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a5680092-beb9-4fe4-b35b-4c795980e350-trusted-ca-bundle\") pod \"console-df4fb84fc-flnws\" (UID: \"a5680092-beb9-4fe4-b35b-4c795980e350\") " pod="openshift-console/console-df4fb84fc-flnws" Dec 01 08:33:41 crc kubenswrapper[5004]: I1201 08:33:41.667095 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-jdhw7" event={"ID":"1027139c-9eed-42ec-8ba6-43c330579482","Type":"ContainerStarted","Data":"33905e7867a5230817bcef35da5f7768eb15cefa2d34e4b8d93ccf97c24a12d8"} Dec 01 08:33:41 crc kubenswrapper[5004]: I1201 08:33:41.759497 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5bqg\" (UniqueName: 
\"kubernetes.io/projected/a5680092-beb9-4fe4-b35b-4c795980e350-kube-api-access-l5bqg\") pod \"console-df4fb84fc-flnws\" (UID: \"a5680092-beb9-4fe4-b35b-4c795980e350\") " pod="openshift-console/console-df4fb84fc-flnws" Dec 01 08:33:41 crc kubenswrapper[5004]: I1201 08:33:41.759545 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a5680092-beb9-4fe4-b35b-4c795980e350-console-serving-cert\") pod \"console-df4fb84fc-flnws\" (UID: \"a5680092-beb9-4fe4-b35b-4c795980e350\") " pod="openshift-console/console-df4fb84fc-flnws" Dec 01 08:33:41 crc kubenswrapper[5004]: I1201 08:33:41.759580 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a5680092-beb9-4fe4-b35b-4c795980e350-console-config\") pod \"console-df4fb84fc-flnws\" (UID: \"a5680092-beb9-4fe4-b35b-4c795980e350\") " pod="openshift-console/console-df4fb84fc-flnws" Dec 01 08:33:41 crc kubenswrapper[5004]: I1201 08:33:41.759599 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a5680092-beb9-4fe4-b35b-4c795980e350-oauth-serving-cert\") pod \"console-df4fb84fc-flnws\" (UID: \"a5680092-beb9-4fe4-b35b-4c795980e350\") " pod="openshift-console/console-df4fb84fc-flnws" Dec 01 08:33:41 crc kubenswrapper[5004]: I1201 08:33:41.759618 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a5680092-beb9-4fe4-b35b-4c795980e350-trusted-ca-bundle\") pod \"console-df4fb84fc-flnws\" (UID: \"a5680092-beb9-4fe4-b35b-4c795980e350\") " pod="openshift-console/console-df4fb84fc-flnws" Dec 01 08:33:41 crc kubenswrapper[5004]: I1201 08:33:41.759671 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/a5680092-beb9-4fe4-b35b-4c795980e350-console-oauth-config\") pod \"console-df4fb84fc-flnws\" (UID: \"a5680092-beb9-4fe4-b35b-4c795980e350\") " pod="openshift-console/console-df4fb84fc-flnws" Dec 01 08:33:41 crc kubenswrapper[5004]: I1201 08:33:41.759692 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a5680092-beb9-4fe4-b35b-4c795980e350-service-ca\") pod \"console-df4fb84fc-flnws\" (UID: \"a5680092-beb9-4fe4-b35b-4c795980e350\") " pod="openshift-console/console-df4fb84fc-flnws" Dec 01 08:33:41 crc kubenswrapper[5004]: I1201 08:33:41.760494 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a5680092-beb9-4fe4-b35b-4c795980e350-service-ca\") pod \"console-df4fb84fc-flnws\" (UID: \"a5680092-beb9-4fe4-b35b-4c795980e350\") " pod="openshift-console/console-df4fb84fc-flnws" Dec 01 08:33:41 crc kubenswrapper[5004]: I1201 08:33:41.761233 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a5680092-beb9-4fe4-b35b-4c795980e350-oauth-serving-cert\") pod \"console-df4fb84fc-flnws\" (UID: \"a5680092-beb9-4fe4-b35b-4c795980e350\") " pod="openshift-console/console-df4fb84fc-flnws" Dec 01 08:33:41 crc kubenswrapper[5004]: I1201 08:33:41.761439 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a5680092-beb9-4fe4-b35b-4c795980e350-trusted-ca-bundle\") pod \"console-df4fb84fc-flnws\" (UID: \"a5680092-beb9-4fe4-b35b-4c795980e350\") " pod="openshift-console/console-df4fb84fc-flnws" Dec 01 08:33:41 crc kubenswrapper[5004]: I1201 08:33:41.762030 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a5680092-beb9-4fe4-b35b-4c795980e350-console-config\") pod 
\"console-df4fb84fc-flnws\" (UID: \"a5680092-beb9-4fe4-b35b-4c795980e350\") " pod="openshift-console/console-df4fb84fc-flnws" Dec 01 08:33:41 crc kubenswrapper[5004]: I1201 08:33:41.764879 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a5680092-beb9-4fe4-b35b-4c795980e350-console-oauth-config\") pod \"console-df4fb84fc-flnws\" (UID: \"a5680092-beb9-4fe4-b35b-4c795980e350\") " pod="openshift-console/console-df4fb84fc-flnws" Dec 01 08:33:41 crc kubenswrapper[5004]: I1201 08:33:41.766475 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a5680092-beb9-4fe4-b35b-4c795980e350-console-serving-cert\") pod \"console-df4fb84fc-flnws\" (UID: \"a5680092-beb9-4fe4-b35b-4c795980e350\") " pod="openshift-console/console-df4fb84fc-flnws" Dec 01 08:33:41 crc kubenswrapper[5004]: I1201 08:33:41.778264 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5bqg\" (UniqueName: \"kubernetes.io/projected/a5680092-beb9-4fe4-b35b-4c795980e350-kube-api-access-l5bqg\") pod \"console-df4fb84fc-flnws\" (UID: \"a5680092-beb9-4fe4-b35b-4c795980e350\") " pod="openshift-console/console-df4fb84fc-flnws" Dec 01 08:33:41 crc kubenswrapper[5004]: I1201 08:33:41.860961 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/70434b12-7582-4851-b32d-034f4c21603a-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-kl9rs\" (UID: \"70434b12-7582-4851-b32d-034f4c21603a\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-kl9rs" Dec 01 08:33:41 crc kubenswrapper[5004]: I1201 08:33:41.863204 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-df4fb84fc-flnws" Dec 01 08:33:41 crc kubenswrapper[5004]: I1201 08:33:41.864079 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/70434b12-7582-4851-b32d-034f4c21603a-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-kl9rs\" (UID: \"70434b12-7582-4851-b32d-034f4c21603a\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-kl9rs" Dec 01 08:33:41 crc kubenswrapper[5004]: I1201 08:33:41.995893 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-phx6c"] Dec 01 08:33:42 crc kubenswrapper[5004]: I1201 08:33:42.064799 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/21e94157-2305-46da-a56e-59dce2baa4ad-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-624tf\" (UID: \"21e94157-2305-46da-a56e-59dce2baa4ad\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-624tf" Dec 01 08:33:42 crc kubenswrapper[5004]: I1201 08:33:42.068806 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/21e94157-2305-46da-a56e-59dce2baa4ad-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-624tf\" (UID: \"21e94157-2305-46da-a56e-59dce2baa4ad\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-624tf" Dec 01 08:33:42 crc kubenswrapper[5004]: I1201 08:33:42.157646 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-kl9rs" Dec 01 08:33:42 crc kubenswrapper[5004]: I1201 08:33:42.274676 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-624tf" Dec 01 08:33:42 crc kubenswrapper[5004]: I1201 08:33:42.294698 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-df4fb84fc-flnws"] Dec 01 08:33:42 crc kubenswrapper[5004]: W1201 08:33:42.311857 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda5680092_beb9_4fe4_b35b_4c795980e350.slice/crio-df197f448c2301ab71449ed0b2648e5d631dd3b50232f1d832fa12a4258bbe66 WatchSource:0}: Error finding container df197f448c2301ab71449ed0b2648e5d631dd3b50232f1d832fa12a4258bbe66: Status 404 returned error can't find the container with id df197f448c2301ab71449ed0b2648e5d631dd3b50232f1d832fa12a4258bbe66 Dec 01 08:33:42 crc kubenswrapper[5004]: I1201 08:33:42.591716 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-kl9rs"] Dec 01 08:33:42 crc kubenswrapper[5004]: I1201 08:33:42.673897 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-phx6c" event={"ID":"255659ba-de9a-4177-8a9f-42b2169ca1b8","Type":"ContainerStarted","Data":"eec3473d570580868548050db9673d58dd7609afa99821c15e1e2e52b0b7f699"} Dec 01 08:33:42 crc kubenswrapper[5004]: I1201 08:33:42.675430 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-df4fb84fc-flnws" event={"ID":"a5680092-beb9-4fe4-b35b-4c795980e350","Type":"ContainerStarted","Data":"e143900ac807a60c11908b700a66e156576792fed4ea3c1340002b772b6d7488"} Dec 01 08:33:42 crc kubenswrapper[5004]: I1201 08:33:42.675452 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-df4fb84fc-flnws" event={"ID":"a5680092-beb9-4fe4-b35b-4c795980e350","Type":"ContainerStarted","Data":"df197f448c2301ab71449ed0b2648e5d631dd3b50232f1d832fa12a4258bbe66"} Dec 01 08:33:42 crc kubenswrapper[5004]: I1201 08:33:42.677432 
5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-kl9rs" event={"ID":"70434b12-7582-4851-b32d-034f4c21603a","Type":"ContainerStarted","Data":"9b82c4bb870cb0cc1af1eaa56396f02fcef48e165d2b9eb2ac9ee5701ae6b006"} Dec 01 08:33:42 crc kubenswrapper[5004]: I1201 08:33:42.691543 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-df4fb84fc-flnws" podStartSLOduration=1.691528927 podStartE2EDuration="1.691528927s" podCreationTimestamp="2025-12-01 08:33:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:33:42.689783625 +0000 UTC m=+1000.254775607" watchObservedRunningTime="2025-12-01 08:33:42.691528927 +0000 UTC m=+1000.256520899" Dec 01 08:33:42 crc kubenswrapper[5004]: I1201 08:33:42.739684 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-624tf"] Dec 01 08:33:42 crc kubenswrapper[5004]: W1201 08:33:42.747398 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod21e94157_2305_46da_a56e_59dce2baa4ad.slice/crio-ab6c3acc95bb45d8436806b385ad81c7fd78e36e8c2e46d79911f38409604fc6 WatchSource:0}: Error finding container ab6c3acc95bb45d8436806b385ad81c7fd78e36e8c2e46d79911f38409604fc6: Status 404 returned error can't find the container with id ab6c3acc95bb45d8436806b385ad81c7fd78e36e8c2e46d79911f38409604fc6 Dec 01 08:33:43 crc kubenswrapper[5004]: I1201 08:33:43.689336 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-624tf" event={"ID":"21e94157-2305-46da-a56e-59dce2baa4ad","Type":"ContainerStarted","Data":"ab6c3acc95bb45d8436806b385ad81c7fd78e36e8c2e46d79911f38409604fc6"} Dec 01 08:33:44 crc kubenswrapper[5004]: I1201 08:33:44.696840 5004 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-kl9rs" event={"ID":"70434b12-7582-4851-b32d-034f4c21603a","Type":"ContainerStarted","Data":"31da70a02eb8b2533cf4d82dadbda6e1bc8491ab891d2949cb87b0ae35aadf97"} Dec 01 08:33:44 crc kubenswrapper[5004]: I1201 08:33:44.697150 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-kl9rs" Dec 01 08:33:44 crc kubenswrapper[5004]: I1201 08:33:44.698301 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-jdhw7" event={"ID":"1027139c-9eed-42ec-8ba6-43c330579482","Type":"ContainerStarted","Data":"60884c7bc9d001d823bb457bf4c0957d6d8d0c2255381f16b99fde6862094df8"} Dec 01 08:33:44 crc kubenswrapper[5004]: I1201 08:33:44.699042 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-jdhw7" Dec 01 08:33:44 crc kubenswrapper[5004]: I1201 08:33:44.701140 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-phx6c" event={"ID":"255659ba-de9a-4177-8a9f-42b2169ca1b8","Type":"ContainerStarted","Data":"47f03d63fe3f4b4c4732e2103087fa6c942a117d974a86b527a1c2bcf78c2446"} Dec 01 08:33:44 crc kubenswrapper[5004]: I1201 08:33:44.729178 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-jdhw7" podStartSLOduration=1.137601468 podStartE2EDuration="3.729161755s" podCreationTimestamp="2025-12-01 08:33:41 +0000 UTC" firstStartedPulling="2025-12-01 08:33:41.621691882 +0000 UTC m=+999.186683864" lastFinishedPulling="2025-12-01 08:33:44.213252159 +0000 UTC m=+1001.778244151" observedRunningTime="2025-12-01 08:33:44.727233098 +0000 UTC m=+1002.292225080" watchObservedRunningTime="2025-12-01 08:33:44.729161755 +0000 UTC m=+1002.294153737" Dec 01 08:33:44 crc kubenswrapper[5004]: I1201 08:33:44.730376 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-kl9rs" podStartSLOduration=2.112039343 podStartE2EDuration="3.730369055s" podCreationTimestamp="2025-12-01 08:33:41 +0000 UTC" firstStartedPulling="2025-12-01 08:33:42.600215477 +0000 UTC m=+1000.165207469" lastFinishedPulling="2025-12-01 08:33:44.218545189 +0000 UTC m=+1001.783537181" observedRunningTime="2025-12-01 08:33:44.714008203 +0000 UTC m=+1002.279000185" watchObservedRunningTime="2025-12-01 08:33:44.730369055 +0000 UTC m=+1002.295361037" Dec 01 08:33:46 crc kubenswrapper[5004]: I1201 08:33:46.725420 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-624tf" event={"ID":"21e94157-2305-46da-a56e-59dce2baa4ad","Type":"ContainerStarted","Data":"2a45f3a3dd98f8918033cfa99f8eb8432c2c34f559c456a45e8f59547ef23374"} Dec 01 08:33:46 crc kubenswrapper[5004]: I1201 08:33:46.745436 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-624tf" podStartSLOduration=2.416437431 podStartE2EDuration="5.745412949s" podCreationTimestamp="2025-12-01 08:33:41 +0000 UTC" firstStartedPulling="2025-12-01 08:33:42.750006582 +0000 UTC m=+1000.314998564" lastFinishedPulling="2025-12-01 08:33:46.07898209 +0000 UTC m=+1003.643974082" observedRunningTime="2025-12-01 08:33:46.744275411 +0000 UTC m=+1004.309267413" watchObservedRunningTime="2025-12-01 08:33:46.745412949 +0000 UTC m=+1004.310404931" Dec 01 08:33:47 crc kubenswrapper[5004]: I1201 08:33:47.754506 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-phx6c" event={"ID":"255659ba-de9a-4177-8a9f-42b2169ca1b8","Type":"ContainerStarted","Data":"cd01ef9f62b3807902ad776d6d2793ffcf2ec838983e81d70a1f3a58a86c20f8"} Dec 01 08:33:47 crc kubenswrapper[5004]: I1201 08:33:47.798013 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-phx6c" 
podStartSLOduration=1.455691503 podStartE2EDuration="6.797988032s" podCreationTimestamp="2025-12-01 08:33:41 +0000 UTC" firstStartedPulling="2025-12-01 08:33:42.001311765 +0000 UTC m=+999.566303767" lastFinishedPulling="2025-12-01 08:33:47.343608294 +0000 UTC m=+1004.908600296" observedRunningTime="2025-12-01 08:33:47.777896449 +0000 UTC m=+1005.342888461" watchObservedRunningTime="2025-12-01 08:33:47.797988032 +0000 UTC m=+1005.362980024" Dec 01 08:33:51 crc kubenswrapper[5004]: I1201 08:33:51.604156 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-jdhw7" Dec 01 08:33:51 crc kubenswrapper[5004]: I1201 08:33:51.864076 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-df4fb84fc-flnws" Dec 01 08:33:51 crc kubenswrapper[5004]: I1201 08:33:51.864123 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-df4fb84fc-flnws" Dec 01 08:33:51 crc kubenswrapper[5004]: I1201 08:33:51.869698 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-df4fb84fc-flnws" Dec 01 08:33:52 crc kubenswrapper[5004]: I1201 08:33:52.831487 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-df4fb84fc-flnws" Dec 01 08:33:52 crc kubenswrapper[5004]: I1201 08:33:52.902027 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-95c5d55ff-kpnt7"] Dec 01 08:34:02 crc kubenswrapper[5004]: I1201 08:34:02.170164 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-kl9rs" Dec 01 08:34:08 crc kubenswrapper[5004]: I1201 08:34:08.729841 5004 patch_prober.go:28] interesting pod/machine-config-daemon-fvdgt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 08:34:08 crc kubenswrapper[5004]: I1201 08:34:08.730539 5004 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 08:34:08 crc kubenswrapper[5004]: I1201 08:34:08.730642 5004 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" Dec 01 08:34:08 crc kubenswrapper[5004]: I1201 08:34:08.731727 5004 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"69d8f022c5a4f9a84dbe3000c7f3fecc6974868815a83043bd8a0d7a4a9a2e59"} pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 08:34:08 crc kubenswrapper[5004]: I1201 08:34:08.731830 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" containerName="machine-config-daemon" containerID="cri-o://69d8f022c5a4f9a84dbe3000c7f3fecc6974868815a83043bd8a0d7a4a9a2e59" gracePeriod=600 Dec 01 08:34:08 crc kubenswrapper[5004]: I1201 08:34:08.952009 5004 generic.go:334] "Generic (PLEG): container finished" podID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" containerID="69d8f022c5a4f9a84dbe3000c7f3fecc6974868815a83043bd8a0d7a4a9a2e59" exitCode=0 Dec 01 08:34:08 crc kubenswrapper[5004]: I1201 08:34:08.952056 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" 
event={"ID":"f9977ebb-82de-4e96-8763-0b5a84f8d4ce","Type":"ContainerDied","Data":"69d8f022c5a4f9a84dbe3000c7f3fecc6974868815a83043bd8a0d7a4a9a2e59"} Dec 01 08:34:08 crc kubenswrapper[5004]: I1201 08:34:08.952094 5004 scope.go:117] "RemoveContainer" containerID="8b94d92321b66c5263a45c381dbbdfe95975b64015e15b4b3949d9d6b2469402" Dec 01 08:34:09 crc kubenswrapper[5004]: I1201 08:34:09.964447 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" event={"ID":"f9977ebb-82de-4e96-8763-0b5a84f8d4ce","Type":"ContainerStarted","Data":"da4b1d9e1788dd947ac4216eff1a285666eccd0fc7594a8fc8667307c82c4fdb"} Dec 01 08:34:17 crc kubenswrapper[5004]: I1201 08:34:17.962085 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-95c5d55ff-kpnt7" podUID="966ebea9-4ef2-491b-b170-b7f2788fbe9a" containerName="console" containerID="cri-o://5bb03cb9dcc51c08fbf21c194bcf770e7ab94aeb6c7a8a38d0b6154099f069f1" gracePeriod=15 Dec 01 08:34:18 crc kubenswrapper[5004]: I1201 08:34:18.406441 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-95c5d55ff-kpnt7_966ebea9-4ef2-491b-b170-b7f2788fbe9a/console/0.log" Dec 01 08:34:18 crc kubenswrapper[5004]: I1201 08:34:18.407016 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-95c5d55ff-kpnt7" Dec 01 08:34:18 crc kubenswrapper[5004]: I1201 08:34:18.517775 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/966ebea9-4ef2-491b-b170-b7f2788fbe9a-console-serving-cert\") pod \"966ebea9-4ef2-491b-b170-b7f2788fbe9a\" (UID: \"966ebea9-4ef2-491b-b170-b7f2788fbe9a\") " Dec 01 08:34:18 crc kubenswrapper[5004]: I1201 08:34:18.517920 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/966ebea9-4ef2-491b-b170-b7f2788fbe9a-oauth-serving-cert\") pod \"966ebea9-4ef2-491b-b170-b7f2788fbe9a\" (UID: \"966ebea9-4ef2-491b-b170-b7f2788fbe9a\") " Dec 01 08:34:18 crc kubenswrapper[5004]: I1201 08:34:18.517993 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/966ebea9-4ef2-491b-b170-b7f2788fbe9a-console-config\") pod \"966ebea9-4ef2-491b-b170-b7f2788fbe9a\" (UID: \"966ebea9-4ef2-491b-b170-b7f2788fbe9a\") " Dec 01 08:34:18 crc kubenswrapper[5004]: I1201 08:34:18.518074 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/966ebea9-4ef2-491b-b170-b7f2788fbe9a-console-oauth-config\") pod \"966ebea9-4ef2-491b-b170-b7f2788fbe9a\" (UID: \"966ebea9-4ef2-491b-b170-b7f2788fbe9a\") " Dec 01 08:34:18 crc kubenswrapper[5004]: I1201 08:34:18.518134 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/966ebea9-4ef2-491b-b170-b7f2788fbe9a-service-ca\") pod \"966ebea9-4ef2-491b-b170-b7f2788fbe9a\" (UID: \"966ebea9-4ef2-491b-b170-b7f2788fbe9a\") " Dec 01 08:34:18 crc kubenswrapper[5004]: I1201 08:34:18.518205 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/966ebea9-4ef2-491b-b170-b7f2788fbe9a-trusted-ca-bundle\") pod \"966ebea9-4ef2-491b-b170-b7f2788fbe9a\" (UID: \"966ebea9-4ef2-491b-b170-b7f2788fbe9a\") " Dec 01 08:34:18 crc kubenswrapper[5004]: I1201 08:34:18.518273 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g7x8j\" (UniqueName: \"kubernetes.io/projected/966ebea9-4ef2-491b-b170-b7f2788fbe9a-kube-api-access-g7x8j\") pod \"966ebea9-4ef2-491b-b170-b7f2788fbe9a\" (UID: \"966ebea9-4ef2-491b-b170-b7f2788fbe9a\") " Dec 01 08:34:18 crc kubenswrapper[5004]: I1201 08:34:18.518765 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/966ebea9-4ef2-491b-b170-b7f2788fbe9a-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "966ebea9-4ef2-491b-b170-b7f2788fbe9a" (UID: "966ebea9-4ef2-491b-b170-b7f2788fbe9a"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:34:18 crc kubenswrapper[5004]: I1201 08:34:18.519184 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/966ebea9-4ef2-491b-b170-b7f2788fbe9a-service-ca" (OuterVolumeSpecName: "service-ca") pod "966ebea9-4ef2-491b-b170-b7f2788fbe9a" (UID: "966ebea9-4ef2-491b-b170-b7f2788fbe9a"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:34:18 crc kubenswrapper[5004]: I1201 08:34:18.519267 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/966ebea9-4ef2-491b-b170-b7f2788fbe9a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "966ebea9-4ef2-491b-b170-b7f2788fbe9a" (UID: "966ebea9-4ef2-491b-b170-b7f2788fbe9a"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:34:18 crc kubenswrapper[5004]: I1201 08:34:18.520353 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/966ebea9-4ef2-491b-b170-b7f2788fbe9a-console-config" (OuterVolumeSpecName: "console-config") pod "966ebea9-4ef2-491b-b170-b7f2788fbe9a" (UID: "966ebea9-4ef2-491b-b170-b7f2788fbe9a"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:34:18 crc kubenswrapper[5004]: I1201 08:34:18.523429 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/966ebea9-4ef2-491b-b170-b7f2788fbe9a-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "966ebea9-4ef2-491b-b170-b7f2788fbe9a" (UID: "966ebea9-4ef2-491b-b170-b7f2788fbe9a"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:34:18 crc kubenswrapper[5004]: I1201 08:34:18.523973 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/966ebea9-4ef2-491b-b170-b7f2788fbe9a-kube-api-access-g7x8j" (OuterVolumeSpecName: "kube-api-access-g7x8j") pod "966ebea9-4ef2-491b-b170-b7f2788fbe9a" (UID: "966ebea9-4ef2-491b-b170-b7f2788fbe9a"). InnerVolumeSpecName "kube-api-access-g7x8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:34:18 crc kubenswrapper[5004]: I1201 08:34:18.524425 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/966ebea9-4ef2-491b-b170-b7f2788fbe9a-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "966ebea9-4ef2-491b-b170-b7f2788fbe9a" (UID: "966ebea9-4ef2-491b-b170-b7f2788fbe9a"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:34:18 crc kubenswrapper[5004]: I1201 08:34:18.619819 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g7x8j\" (UniqueName: \"kubernetes.io/projected/966ebea9-4ef2-491b-b170-b7f2788fbe9a-kube-api-access-g7x8j\") on node \"crc\" DevicePath \"\"" Dec 01 08:34:18 crc kubenswrapper[5004]: I1201 08:34:18.619888 5004 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/966ebea9-4ef2-491b-b170-b7f2788fbe9a-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 08:34:18 crc kubenswrapper[5004]: I1201 08:34:18.619901 5004 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/966ebea9-4ef2-491b-b170-b7f2788fbe9a-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 08:34:18 crc kubenswrapper[5004]: I1201 08:34:18.619938 5004 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/966ebea9-4ef2-491b-b170-b7f2788fbe9a-console-config\") on node \"crc\" DevicePath \"\"" Dec 01 08:34:18 crc kubenswrapper[5004]: I1201 08:34:18.619951 5004 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/966ebea9-4ef2-491b-b170-b7f2788fbe9a-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 01 08:34:18 crc kubenswrapper[5004]: I1201 08:34:18.619962 5004 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/966ebea9-4ef2-491b-b170-b7f2788fbe9a-service-ca\") on node \"crc\" DevicePath \"\"" Dec 01 08:34:18 crc kubenswrapper[5004]: I1201 08:34:18.619973 5004 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/966ebea9-4ef2-491b-b170-b7f2788fbe9a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 08:34:19 crc 
kubenswrapper[5004]: I1201 08:34:19.063597 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-95c5d55ff-kpnt7_966ebea9-4ef2-491b-b170-b7f2788fbe9a/console/0.log" Dec 01 08:34:19 crc kubenswrapper[5004]: I1201 08:34:19.063648 5004 generic.go:334] "Generic (PLEG): container finished" podID="966ebea9-4ef2-491b-b170-b7f2788fbe9a" containerID="5bb03cb9dcc51c08fbf21c194bcf770e7ab94aeb6c7a8a38d0b6154099f069f1" exitCode=2 Dec 01 08:34:19 crc kubenswrapper[5004]: I1201 08:34:19.063679 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-95c5d55ff-kpnt7" event={"ID":"966ebea9-4ef2-491b-b170-b7f2788fbe9a","Type":"ContainerDied","Data":"5bb03cb9dcc51c08fbf21c194bcf770e7ab94aeb6c7a8a38d0b6154099f069f1"} Dec 01 08:34:19 crc kubenswrapper[5004]: I1201 08:34:19.063707 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-95c5d55ff-kpnt7" event={"ID":"966ebea9-4ef2-491b-b170-b7f2788fbe9a","Type":"ContainerDied","Data":"2e947d22aa5f4710fa36eb3b60bf5a502718024255e96906e3af9570159d3f5c"} Dec 01 08:34:19 crc kubenswrapper[5004]: I1201 08:34:19.063725 5004 scope.go:117] "RemoveContainer" containerID="5bb03cb9dcc51c08fbf21c194bcf770e7ab94aeb6c7a8a38d0b6154099f069f1" Dec 01 08:34:19 crc kubenswrapper[5004]: I1201 08:34:19.063861 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-95c5d55ff-kpnt7" Dec 01 08:34:19 crc kubenswrapper[5004]: I1201 08:34:19.085915 5004 scope.go:117] "RemoveContainer" containerID="5bb03cb9dcc51c08fbf21c194bcf770e7ab94aeb6c7a8a38d0b6154099f069f1" Dec 01 08:34:19 crc kubenswrapper[5004]: E1201 08:34:19.088442 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5bb03cb9dcc51c08fbf21c194bcf770e7ab94aeb6c7a8a38d0b6154099f069f1\": container with ID starting with 5bb03cb9dcc51c08fbf21c194bcf770e7ab94aeb6c7a8a38d0b6154099f069f1 not found: ID does not exist" containerID="5bb03cb9dcc51c08fbf21c194bcf770e7ab94aeb6c7a8a38d0b6154099f069f1" Dec 01 08:34:19 crc kubenswrapper[5004]: I1201 08:34:19.088499 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5bb03cb9dcc51c08fbf21c194bcf770e7ab94aeb6c7a8a38d0b6154099f069f1"} err="failed to get container status \"5bb03cb9dcc51c08fbf21c194bcf770e7ab94aeb6c7a8a38d0b6154099f069f1\": rpc error: code = NotFound desc = could not find container \"5bb03cb9dcc51c08fbf21c194bcf770e7ab94aeb6c7a8a38d0b6154099f069f1\": container with ID starting with 5bb03cb9dcc51c08fbf21c194bcf770e7ab94aeb6c7a8a38d0b6154099f069f1 not found: ID does not exist" Dec 01 08:34:19 crc kubenswrapper[5004]: I1201 08:34:19.101635 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-95c5d55ff-kpnt7"] Dec 01 08:34:19 crc kubenswrapper[5004]: I1201 08:34:19.114603 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-95c5d55ff-kpnt7"] Dec 01 08:34:20 crc kubenswrapper[5004]: I1201 08:34:20.770531 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="966ebea9-4ef2-491b-b170-b7f2788fbe9a" path="/var/lib/kubelet/pods/966ebea9-4ef2-491b-b170-b7f2788fbe9a/volumes" Dec 01 08:34:21 crc kubenswrapper[5004]: I1201 08:34:21.505944 5004 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83kl2cw"] Dec 01 08:34:21 crc kubenswrapper[5004]: E1201 08:34:21.506669 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="966ebea9-4ef2-491b-b170-b7f2788fbe9a" containerName="console" Dec 01 08:34:21 crc kubenswrapper[5004]: I1201 08:34:21.506693 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="966ebea9-4ef2-491b-b170-b7f2788fbe9a" containerName="console" Dec 01 08:34:21 crc kubenswrapper[5004]: I1201 08:34:21.506868 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="966ebea9-4ef2-491b-b170-b7f2788fbe9a" containerName="console" Dec 01 08:34:21 crc kubenswrapper[5004]: I1201 08:34:21.508136 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83kl2cw" Dec 01 08:34:21 crc kubenswrapper[5004]: I1201 08:34:21.513850 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 01 08:34:21 crc kubenswrapper[5004]: I1201 08:34:21.519027 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83kl2cw"] Dec 01 08:34:21 crc kubenswrapper[5004]: I1201 08:34:21.575498 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/749c2b48-2544-41c1-8dc8-716e9e459232-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83kl2cw\" (UID: \"749c2b48-2544-41c1-8dc8-716e9e459232\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83kl2cw" Dec 01 08:34:21 crc kubenswrapper[5004]: I1201 08:34:21.575602 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/749c2b48-2544-41c1-8dc8-716e9e459232-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83kl2cw\" (UID: \"749c2b48-2544-41c1-8dc8-716e9e459232\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83kl2cw" Dec 01 08:34:21 crc kubenswrapper[5004]: I1201 08:34:21.575696 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjcb4\" (UniqueName: \"kubernetes.io/projected/749c2b48-2544-41c1-8dc8-716e9e459232-kube-api-access-xjcb4\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83kl2cw\" (UID: \"749c2b48-2544-41c1-8dc8-716e9e459232\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83kl2cw" Dec 01 08:34:21 crc kubenswrapper[5004]: I1201 08:34:21.676839 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/749c2b48-2544-41c1-8dc8-716e9e459232-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83kl2cw\" (UID: \"749c2b48-2544-41c1-8dc8-716e9e459232\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83kl2cw" Dec 01 08:34:21 crc kubenswrapper[5004]: I1201 08:34:21.676907 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjcb4\" (UniqueName: \"kubernetes.io/projected/749c2b48-2544-41c1-8dc8-716e9e459232-kube-api-access-xjcb4\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83kl2cw\" (UID: \"749c2b48-2544-41c1-8dc8-716e9e459232\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83kl2cw" Dec 01 08:34:21 crc kubenswrapper[5004]: I1201 08:34:21.677018 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/749c2b48-2544-41c1-8dc8-716e9e459232-bundle\") pod 
\"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83kl2cw\" (UID: \"749c2b48-2544-41c1-8dc8-716e9e459232\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83kl2cw" Dec 01 08:34:21 crc kubenswrapper[5004]: I1201 08:34:21.677446 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/749c2b48-2544-41c1-8dc8-716e9e459232-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83kl2cw\" (UID: \"749c2b48-2544-41c1-8dc8-716e9e459232\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83kl2cw" Dec 01 08:34:21 crc kubenswrapper[5004]: I1201 08:34:21.677498 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/749c2b48-2544-41c1-8dc8-716e9e459232-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83kl2cw\" (UID: \"749c2b48-2544-41c1-8dc8-716e9e459232\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83kl2cw" Dec 01 08:34:21 crc kubenswrapper[5004]: I1201 08:34:21.707374 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjcb4\" (UniqueName: \"kubernetes.io/projected/749c2b48-2544-41c1-8dc8-716e9e459232-kube-api-access-xjcb4\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83kl2cw\" (UID: \"749c2b48-2544-41c1-8dc8-716e9e459232\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83kl2cw" Dec 01 08:34:21 crc kubenswrapper[5004]: I1201 08:34:21.879967 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83kl2cw" Dec 01 08:34:22 crc kubenswrapper[5004]: I1201 08:34:22.331931 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83kl2cw"] Dec 01 08:34:23 crc kubenswrapper[5004]: I1201 08:34:23.103685 5004 generic.go:334] "Generic (PLEG): container finished" podID="749c2b48-2544-41c1-8dc8-716e9e459232" containerID="b1b47f9fecfec65f5d0cda5a54ee55da88251b4f85fb69525660ee6b0076d53a" exitCode=0 Dec 01 08:34:23 crc kubenswrapper[5004]: I1201 08:34:23.104049 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83kl2cw" event={"ID":"749c2b48-2544-41c1-8dc8-716e9e459232","Type":"ContainerDied","Data":"b1b47f9fecfec65f5d0cda5a54ee55da88251b4f85fb69525660ee6b0076d53a"} Dec 01 08:34:23 crc kubenswrapper[5004]: I1201 08:34:23.104092 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83kl2cw" event={"ID":"749c2b48-2544-41c1-8dc8-716e9e459232","Type":"ContainerStarted","Data":"43d57a87786cd85cbca29cc3c537246e443e863211c7ca4fcb4adcca77ca2d32"} Dec 01 08:34:23 crc kubenswrapper[5004]: I1201 08:34:23.106427 5004 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 01 08:34:25 crc kubenswrapper[5004]: I1201 08:34:25.119404 5004 generic.go:334] "Generic (PLEG): container finished" podID="749c2b48-2544-41c1-8dc8-716e9e459232" containerID="45b110084c784ebfbd7aaa539671acfdbe5b61b24e4bf1edfedf1acd9bb67ebd" exitCode=0 Dec 01 08:34:25 crc kubenswrapper[5004]: I1201 08:34:25.119515 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83kl2cw" 
event={"ID":"749c2b48-2544-41c1-8dc8-716e9e459232","Type":"ContainerDied","Data":"45b110084c784ebfbd7aaa539671acfdbe5b61b24e4bf1edfedf1acd9bb67ebd"} Dec 01 08:34:26 crc kubenswrapper[5004]: I1201 08:34:26.134403 5004 generic.go:334] "Generic (PLEG): container finished" podID="749c2b48-2544-41c1-8dc8-716e9e459232" containerID="16ba9591c74297e724cc1fe15a178f9a7f33b46c926ec8125565b6e6c66f81ca" exitCode=0 Dec 01 08:34:26 crc kubenswrapper[5004]: I1201 08:34:26.134477 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83kl2cw" event={"ID":"749c2b48-2544-41c1-8dc8-716e9e459232","Type":"ContainerDied","Data":"16ba9591c74297e724cc1fe15a178f9a7f33b46c926ec8125565b6e6c66f81ca"} Dec 01 08:34:27 crc kubenswrapper[5004]: I1201 08:34:27.444399 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83kl2cw" Dec 01 08:34:27 crc kubenswrapper[5004]: I1201 08:34:27.479714 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/749c2b48-2544-41c1-8dc8-716e9e459232-util\") pod \"749c2b48-2544-41c1-8dc8-716e9e459232\" (UID: \"749c2b48-2544-41c1-8dc8-716e9e459232\") " Dec 01 08:34:27 crc kubenswrapper[5004]: I1201 08:34:27.479824 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xjcb4\" (UniqueName: \"kubernetes.io/projected/749c2b48-2544-41c1-8dc8-716e9e459232-kube-api-access-xjcb4\") pod \"749c2b48-2544-41c1-8dc8-716e9e459232\" (UID: \"749c2b48-2544-41c1-8dc8-716e9e459232\") " Dec 01 08:34:27 crc kubenswrapper[5004]: I1201 08:34:27.479859 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/749c2b48-2544-41c1-8dc8-716e9e459232-bundle\") pod \"749c2b48-2544-41c1-8dc8-716e9e459232\" (UID: 
\"749c2b48-2544-41c1-8dc8-716e9e459232\") " Dec 01 08:34:27 crc kubenswrapper[5004]: I1201 08:34:27.480849 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/749c2b48-2544-41c1-8dc8-716e9e459232-bundle" (OuterVolumeSpecName: "bundle") pod "749c2b48-2544-41c1-8dc8-716e9e459232" (UID: "749c2b48-2544-41c1-8dc8-716e9e459232"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 08:34:27 crc kubenswrapper[5004]: I1201 08:34:27.485387 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/749c2b48-2544-41c1-8dc8-716e9e459232-kube-api-access-xjcb4" (OuterVolumeSpecName: "kube-api-access-xjcb4") pod "749c2b48-2544-41c1-8dc8-716e9e459232" (UID: "749c2b48-2544-41c1-8dc8-716e9e459232"). InnerVolumeSpecName "kube-api-access-xjcb4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:34:27 crc kubenswrapper[5004]: I1201 08:34:27.495822 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/749c2b48-2544-41c1-8dc8-716e9e459232-util" (OuterVolumeSpecName: "util") pod "749c2b48-2544-41c1-8dc8-716e9e459232" (UID: "749c2b48-2544-41c1-8dc8-716e9e459232"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 08:34:27 crc kubenswrapper[5004]: I1201 08:34:27.580995 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xjcb4\" (UniqueName: \"kubernetes.io/projected/749c2b48-2544-41c1-8dc8-716e9e459232-kube-api-access-xjcb4\") on node \"crc\" DevicePath \"\"" Dec 01 08:34:27 crc kubenswrapper[5004]: I1201 08:34:27.581029 5004 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/749c2b48-2544-41c1-8dc8-716e9e459232-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 08:34:27 crc kubenswrapper[5004]: I1201 08:34:27.581037 5004 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/749c2b48-2544-41c1-8dc8-716e9e459232-util\") on node \"crc\" DevicePath \"\"" Dec 01 08:34:28 crc kubenswrapper[5004]: I1201 08:34:28.159772 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83kl2cw" event={"ID":"749c2b48-2544-41c1-8dc8-716e9e459232","Type":"ContainerDied","Data":"43d57a87786cd85cbca29cc3c537246e443e863211c7ca4fcb4adcca77ca2d32"} Dec 01 08:34:28 crc kubenswrapper[5004]: I1201 08:34:28.159854 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="43d57a87786cd85cbca29cc3c537246e443e863211c7ca4fcb4adcca77ca2d32" Dec 01 08:34:28 crc kubenswrapper[5004]: I1201 08:34:28.159906 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83kl2cw" Dec 01 08:34:40 crc kubenswrapper[5004]: I1201 08:34:40.756479 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-7f9c5fbb9c-ljsdl"] Dec 01 08:34:40 crc kubenswrapper[5004]: E1201 08:34:40.757190 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="749c2b48-2544-41c1-8dc8-716e9e459232" containerName="util" Dec 01 08:34:40 crc kubenswrapper[5004]: I1201 08:34:40.757202 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="749c2b48-2544-41c1-8dc8-716e9e459232" containerName="util" Dec 01 08:34:40 crc kubenswrapper[5004]: E1201 08:34:40.757218 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="749c2b48-2544-41c1-8dc8-716e9e459232" containerName="extract" Dec 01 08:34:40 crc kubenswrapper[5004]: I1201 08:34:40.757225 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="749c2b48-2544-41c1-8dc8-716e9e459232" containerName="extract" Dec 01 08:34:40 crc kubenswrapper[5004]: E1201 08:34:40.757236 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="749c2b48-2544-41c1-8dc8-716e9e459232" containerName="pull" Dec 01 08:34:40 crc kubenswrapper[5004]: I1201 08:34:40.757244 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="749c2b48-2544-41c1-8dc8-716e9e459232" containerName="pull" Dec 01 08:34:40 crc kubenswrapper[5004]: I1201 08:34:40.757364 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="749c2b48-2544-41c1-8dc8-716e9e459232" containerName="extract" Dec 01 08:34:40 crc kubenswrapper[5004]: I1201 08:34:40.757856 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-7f9c5fbb9c-ljsdl" Dec 01 08:34:40 crc kubenswrapper[5004]: I1201 08:34:40.760914 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Dec 01 08:34:40 crc kubenswrapper[5004]: I1201 08:34:40.761026 5004 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Dec 01 08:34:40 crc kubenswrapper[5004]: I1201 08:34:40.761084 5004 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-gzjjt" Dec 01 08:34:40 crc kubenswrapper[5004]: I1201 08:34:40.772080 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Dec 01 08:34:40 crc kubenswrapper[5004]: I1201 08:34:40.772364 5004 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Dec 01 08:34:40 crc kubenswrapper[5004]: I1201 08:34:40.796171 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-7f9c5fbb9c-ljsdl"] Dec 01 08:34:40 crc kubenswrapper[5004]: I1201 08:34:40.907764 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4qfh\" (UniqueName: \"kubernetes.io/projected/81957152-e7e6-490b-a819-fa6d1a57c822-kube-api-access-d4qfh\") pod \"metallb-operator-controller-manager-7f9c5fbb9c-ljsdl\" (UID: \"81957152-e7e6-490b-a819-fa6d1a57c822\") " pod="metallb-system/metallb-operator-controller-manager-7f9c5fbb9c-ljsdl" Dec 01 08:34:40 crc kubenswrapper[5004]: I1201 08:34:40.907858 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/81957152-e7e6-490b-a819-fa6d1a57c822-webhook-cert\") pod 
\"metallb-operator-controller-manager-7f9c5fbb9c-ljsdl\" (UID: \"81957152-e7e6-490b-a819-fa6d1a57c822\") " pod="metallb-system/metallb-operator-controller-manager-7f9c5fbb9c-ljsdl" Dec 01 08:34:40 crc kubenswrapper[5004]: I1201 08:34:40.907961 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/81957152-e7e6-490b-a819-fa6d1a57c822-apiservice-cert\") pod \"metallb-operator-controller-manager-7f9c5fbb9c-ljsdl\" (UID: \"81957152-e7e6-490b-a819-fa6d1a57c822\") " pod="metallb-system/metallb-operator-controller-manager-7f9c5fbb9c-ljsdl" Dec 01 08:34:40 crc kubenswrapper[5004]: I1201 08:34:40.958343 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-74999fff7b-9rfrt"] Dec 01 08:34:40 crc kubenswrapper[5004]: I1201 08:34:40.959233 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-74999fff7b-9rfrt" Dec 01 08:34:40 crc kubenswrapper[5004]: I1201 08:34:40.961648 5004 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Dec 01 08:34:40 crc kubenswrapper[5004]: I1201 08:34:40.961847 5004 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-f7hb6" Dec 01 08:34:40 crc kubenswrapper[5004]: I1201 08:34:40.961967 5004 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Dec 01 08:34:40 crc kubenswrapper[5004]: I1201 08:34:40.969259 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-74999fff7b-9rfrt"] Dec 01 08:34:41 crc kubenswrapper[5004]: I1201 08:34:41.009310 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4qfh\" (UniqueName: 
\"kubernetes.io/projected/81957152-e7e6-490b-a819-fa6d1a57c822-kube-api-access-d4qfh\") pod \"metallb-operator-controller-manager-7f9c5fbb9c-ljsdl\" (UID: \"81957152-e7e6-490b-a819-fa6d1a57c822\") " pod="metallb-system/metallb-operator-controller-manager-7f9c5fbb9c-ljsdl" Dec 01 08:34:41 crc kubenswrapper[5004]: I1201 08:34:41.009379 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1d901b5c-40a1-4f35-8f0e-b9de6884d503-apiservice-cert\") pod \"metallb-operator-webhook-server-74999fff7b-9rfrt\" (UID: \"1d901b5c-40a1-4f35-8f0e-b9de6884d503\") " pod="metallb-system/metallb-operator-webhook-server-74999fff7b-9rfrt" Dec 01 08:34:41 crc kubenswrapper[5004]: I1201 08:34:41.009416 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/81957152-e7e6-490b-a819-fa6d1a57c822-webhook-cert\") pod \"metallb-operator-controller-manager-7f9c5fbb9c-ljsdl\" (UID: \"81957152-e7e6-490b-a819-fa6d1a57c822\") " pod="metallb-system/metallb-operator-controller-manager-7f9c5fbb9c-ljsdl" Dec 01 08:34:41 crc kubenswrapper[5004]: I1201 08:34:41.009460 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1d901b5c-40a1-4f35-8f0e-b9de6884d503-webhook-cert\") pod \"metallb-operator-webhook-server-74999fff7b-9rfrt\" (UID: \"1d901b5c-40a1-4f35-8f0e-b9de6884d503\") " pod="metallb-system/metallb-operator-webhook-server-74999fff7b-9rfrt" Dec 01 08:34:41 crc kubenswrapper[5004]: I1201 08:34:41.009481 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ksqwl\" (UniqueName: \"kubernetes.io/projected/1d901b5c-40a1-4f35-8f0e-b9de6884d503-kube-api-access-ksqwl\") pod \"metallb-operator-webhook-server-74999fff7b-9rfrt\" (UID: 
\"1d901b5c-40a1-4f35-8f0e-b9de6884d503\") " pod="metallb-system/metallb-operator-webhook-server-74999fff7b-9rfrt" Dec 01 08:34:41 crc kubenswrapper[5004]: I1201 08:34:41.009513 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/81957152-e7e6-490b-a819-fa6d1a57c822-apiservice-cert\") pod \"metallb-operator-controller-manager-7f9c5fbb9c-ljsdl\" (UID: \"81957152-e7e6-490b-a819-fa6d1a57c822\") " pod="metallb-system/metallb-operator-controller-manager-7f9c5fbb9c-ljsdl" Dec 01 08:34:41 crc kubenswrapper[5004]: I1201 08:34:41.014956 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/81957152-e7e6-490b-a819-fa6d1a57c822-webhook-cert\") pod \"metallb-operator-controller-manager-7f9c5fbb9c-ljsdl\" (UID: \"81957152-e7e6-490b-a819-fa6d1a57c822\") " pod="metallb-system/metallb-operator-controller-manager-7f9c5fbb9c-ljsdl" Dec 01 08:34:41 crc kubenswrapper[5004]: I1201 08:34:41.019125 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/81957152-e7e6-490b-a819-fa6d1a57c822-apiservice-cert\") pod \"metallb-operator-controller-manager-7f9c5fbb9c-ljsdl\" (UID: \"81957152-e7e6-490b-a819-fa6d1a57c822\") " pod="metallb-system/metallb-operator-controller-manager-7f9c5fbb9c-ljsdl" Dec 01 08:34:41 crc kubenswrapper[5004]: I1201 08:34:41.025966 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4qfh\" (UniqueName: \"kubernetes.io/projected/81957152-e7e6-490b-a819-fa6d1a57c822-kube-api-access-d4qfh\") pod \"metallb-operator-controller-manager-7f9c5fbb9c-ljsdl\" (UID: \"81957152-e7e6-490b-a819-fa6d1a57c822\") " pod="metallb-system/metallb-operator-controller-manager-7f9c5fbb9c-ljsdl" Dec 01 08:34:41 crc kubenswrapper[5004]: I1201 08:34:41.075305 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-7f9c5fbb9c-ljsdl" Dec 01 08:34:41 crc kubenswrapper[5004]: I1201 08:34:41.111005 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1d901b5c-40a1-4f35-8f0e-b9de6884d503-webhook-cert\") pod \"metallb-operator-webhook-server-74999fff7b-9rfrt\" (UID: \"1d901b5c-40a1-4f35-8f0e-b9de6884d503\") " pod="metallb-system/metallb-operator-webhook-server-74999fff7b-9rfrt" Dec 01 08:34:41 crc kubenswrapper[5004]: I1201 08:34:41.111342 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ksqwl\" (UniqueName: \"kubernetes.io/projected/1d901b5c-40a1-4f35-8f0e-b9de6884d503-kube-api-access-ksqwl\") pod \"metallb-operator-webhook-server-74999fff7b-9rfrt\" (UID: \"1d901b5c-40a1-4f35-8f0e-b9de6884d503\") " pod="metallb-system/metallb-operator-webhook-server-74999fff7b-9rfrt" Dec 01 08:34:41 crc kubenswrapper[5004]: I1201 08:34:41.111444 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1d901b5c-40a1-4f35-8f0e-b9de6884d503-apiservice-cert\") pod \"metallb-operator-webhook-server-74999fff7b-9rfrt\" (UID: \"1d901b5c-40a1-4f35-8f0e-b9de6884d503\") " pod="metallb-system/metallb-operator-webhook-server-74999fff7b-9rfrt" Dec 01 08:34:41 crc kubenswrapper[5004]: I1201 08:34:41.114400 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1d901b5c-40a1-4f35-8f0e-b9de6884d503-apiservice-cert\") pod \"metallb-operator-webhook-server-74999fff7b-9rfrt\" (UID: \"1d901b5c-40a1-4f35-8f0e-b9de6884d503\") " pod="metallb-system/metallb-operator-webhook-server-74999fff7b-9rfrt" Dec 01 08:34:41 crc kubenswrapper[5004]: I1201 08:34:41.115217 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/1d901b5c-40a1-4f35-8f0e-b9de6884d503-webhook-cert\") pod \"metallb-operator-webhook-server-74999fff7b-9rfrt\" (UID: \"1d901b5c-40a1-4f35-8f0e-b9de6884d503\") " pod="metallb-system/metallb-operator-webhook-server-74999fff7b-9rfrt" Dec 01 08:34:41 crc kubenswrapper[5004]: I1201 08:34:41.131280 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ksqwl\" (UniqueName: \"kubernetes.io/projected/1d901b5c-40a1-4f35-8f0e-b9de6884d503-kube-api-access-ksqwl\") pod \"metallb-operator-webhook-server-74999fff7b-9rfrt\" (UID: \"1d901b5c-40a1-4f35-8f0e-b9de6884d503\") " pod="metallb-system/metallb-operator-webhook-server-74999fff7b-9rfrt" Dec 01 08:34:41 crc kubenswrapper[5004]: I1201 08:34:41.283246 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-74999fff7b-9rfrt" Dec 01 08:34:41 crc kubenswrapper[5004]: I1201 08:34:41.539111 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-7f9c5fbb9c-ljsdl"] Dec 01 08:34:41 crc kubenswrapper[5004]: W1201 08:34:41.539769 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod81957152_e7e6_490b_a819_fa6d1a57c822.slice/crio-65d9d45fea685a3f562375954adaf21403b939468dbe8e6263638137732b290c WatchSource:0}: Error finding container 65d9d45fea685a3f562375954adaf21403b939468dbe8e6263638137732b290c: Status 404 returned error can't find the container with id 65d9d45fea685a3f562375954adaf21403b939468dbe8e6263638137732b290c Dec 01 08:34:41 crc kubenswrapper[5004]: I1201 08:34:41.702006 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-74999fff7b-9rfrt"] Dec 01 08:34:41 crc kubenswrapper[5004]: W1201 08:34:41.710065 5004 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1d901b5c_40a1_4f35_8f0e_b9de6884d503.slice/crio-96942538df977a8040acd2d2eda399f572585376e2fc676e8798022dd82c0b81 WatchSource:0}: Error finding container 96942538df977a8040acd2d2eda399f572585376e2fc676e8798022dd82c0b81: Status 404 returned error can't find the container with id 96942538df977a8040acd2d2eda399f572585376e2fc676e8798022dd82c0b81 Dec 01 08:34:42 crc kubenswrapper[5004]: I1201 08:34:42.267121 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-7f9c5fbb9c-ljsdl" event={"ID":"81957152-e7e6-490b-a819-fa6d1a57c822","Type":"ContainerStarted","Data":"65d9d45fea685a3f562375954adaf21403b939468dbe8e6263638137732b290c"} Dec 01 08:34:42 crc kubenswrapper[5004]: I1201 08:34:42.268521 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-74999fff7b-9rfrt" event={"ID":"1d901b5c-40a1-4f35-8f0e-b9de6884d503","Type":"ContainerStarted","Data":"96942538df977a8040acd2d2eda399f572585376e2fc676e8798022dd82c0b81"} Dec 01 08:34:47 crc kubenswrapper[5004]: I1201 08:34:47.302818 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-74999fff7b-9rfrt" event={"ID":"1d901b5c-40a1-4f35-8f0e-b9de6884d503","Type":"ContainerStarted","Data":"3f5883edd868fbf01dab762c0cd193dec736254bba48787d3ce4e440ac84b9f9"} Dec 01 08:34:47 crc kubenswrapper[5004]: I1201 08:34:47.303457 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-74999fff7b-9rfrt" Dec 01 08:34:47 crc kubenswrapper[5004]: I1201 08:34:47.304249 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-7f9c5fbb9c-ljsdl" event={"ID":"81957152-e7e6-490b-a819-fa6d1a57c822","Type":"ContainerStarted","Data":"1df69887244aa32cc45dad157b0f4119eaba4e28cc764fc99d4a4fbd1dbe9e40"} Dec 01 08:34:47 
crc kubenswrapper[5004]: I1201 08:34:47.304443 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-7f9c5fbb9c-ljsdl" Dec 01 08:34:47 crc kubenswrapper[5004]: I1201 08:34:47.330436 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-74999fff7b-9rfrt" podStartSLOduration=2.037212534 podStartE2EDuration="7.330412289s" podCreationTimestamp="2025-12-01 08:34:40 +0000 UTC" firstStartedPulling="2025-12-01 08:34:41.713096132 +0000 UTC m=+1059.278088114" lastFinishedPulling="2025-12-01 08:34:47.006295887 +0000 UTC m=+1064.571287869" observedRunningTime="2025-12-01 08:34:47.322509255 +0000 UTC m=+1064.887501237" watchObservedRunningTime="2025-12-01 08:34:47.330412289 +0000 UTC m=+1064.895404281" Dec 01 08:34:47 crc kubenswrapper[5004]: I1201 08:34:47.357484 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-7f9c5fbb9c-ljsdl" podStartSLOduration=1.917540837 podStartE2EDuration="7.357466132s" podCreationTimestamp="2025-12-01 08:34:40 +0000 UTC" firstStartedPulling="2025-12-01 08:34:41.54507389 +0000 UTC m=+1059.110065862" lastFinishedPulling="2025-12-01 08:34:46.984999165 +0000 UTC m=+1064.549991157" observedRunningTime="2025-12-01 08:34:47.351336512 +0000 UTC m=+1064.916328504" watchObservedRunningTime="2025-12-01 08:34:47.357466132 +0000 UTC m=+1064.922458114" Dec 01 08:35:01 crc kubenswrapper[5004]: I1201 08:35:01.296937 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-74999fff7b-9rfrt" Dec 01 08:35:21 crc kubenswrapper[5004]: I1201 08:35:21.080245 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-7f9c5fbb9c-ljsdl" Dec 01 08:35:21 crc kubenswrapper[5004]: I1201 08:35:21.800723 5004 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-26q9v"] Dec 01 08:35:21 crc kubenswrapper[5004]: I1201 08:35:21.802011 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-26q9v" Dec 01 08:35:21 crc kubenswrapper[5004]: I1201 08:35:21.804499 5004 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-xvvwx" Dec 01 08:35:21 crc kubenswrapper[5004]: I1201 08:35:21.806246 5004 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Dec 01 08:35:21 crc kubenswrapper[5004]: I1201 08:35:21.818512 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-26q9v"] Dec 01 08:35:21 crc kubenswrapper[5004]: I1201 08:35:21.833983 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-l6x9r"] Dec 01 08:35:21 crc kubenswrapper[5004]: I1201 08:35:21.839767 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-l6x9r" Dec 01 08:35:21 crc kubenswrapper[5004]: I1201 08:35:21.842353 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Dec 01 08:35:21 crc kubenswrapper[5004]: I1201 08:35:21.842599 5004 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Dec 01 08:35:21 crc kubenswrapper[5004]: I1201 08:35:21.866651 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bebabc29-870f-4604-bda6-e77a3db6a5ed-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-26q9v\" (UID: \"bebabc29-870f-4604-bda6-e77a3db6a5ed\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-26q9v" Dec 01 08:35:21 crc kubenswrapper[5004]: I1201 08:35:21.866761 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjzqd\" (UniqueName: \"kubernetes.io/projected/bebabc29-870f-4604-bda6-e77a3db6a5ed-kube-api-access-pjzqd\") pod \"frr-k8s-webhook-server-7fcb986d4-26q9v\" (UID: \"bebabc29-870f-4604-bda6-e77a3db6a5ed\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-26q9v" Dec 01 08:35:21 crc kubenswrapper[5004]: I1201 08:35:21.902535 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-cqvjk"] Dec 01 08:35:21 crc kubenswrapper[5004]: I1201 08:35:21.903982 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-cqvjk" Dec 01 08:35:21 crc kubenswrapper[5004]: I1201 08:35:21.906229 5004 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Dec 01 08:35:21 crc kubenswrapper[5004]: I1201 08:35:21.906233 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Dec 01 08:35:21 crc kubenswrapper[5004]: I1201 08:35:21.906739 5004 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-xkl9j" Dec 01 08:35:21 crc kubenswrapper[5004]: I1201 08:35:21.906773 5004 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Dec 01 08:35:21 crc kubenswrapper[5004]: I1201 08:35:21.919899 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-f8648f98b-5t9hg"] Dec 01 08:35:21 crc kubenswrapper[5004]: I1201 08:35:21.922122 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-f8648f98b-5t9hg" Dec 01 08:35:21 crc kubenswrapper[5004]: I1201 08:35:21.924044 5004 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Dec 01 08:35:21 crc kubenswrapper[5004]: I1201 08:35:21.937162 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-5t9hg"] Dec 01 08:35:21 crc kubenswrapper[5004]: I1201 08:35:21.967924 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/53c42c23-7bb0-4e51-ab58-3355b224864c-cert\") pod \"controller-f8648f98b-5t9hg\" (UID: \"53c42c23-7bb0-4e51-ab58-3355b224864c\") " pod="metallb-system/controller-f8648f98b-5t9hg" Dec 01 08:35:21 crc kubenswrapper[5004]: I1201 08:35:21.967967 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/113ca366-80ad-475e-829f-fcbb4a67e642-metrics\") pod \"frr-k8s-l6x9r\" (UID: \"113ca366-80ad-475e-829f-fcbb4a67e642\") " pod="metallb-system/frr-k8s-l6x9r" Dec 01 08:35:21 crc kubenswrapper[5004]: I1201 08:35:21.967993 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/53c42c23-7bb0-4e51-ab58-3355b224864c-metrics-certs\") pod \"controller-f8648f98b-5t9hg\" (UID: \"53c42c23-7bb0-4e51-ab58-3355b224864c\") " pod="metallb-system/controller-f8648f98b-5t9hg" Dec 01 08:35:21 crc kubenswrapper[5004]: I1201 08:35:21.968009 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8dl7\" (UniqueName: \"kubernetes.io/projected/53c42c23-7bb0-4e51-ab58-3355b224864c-kube-api-access-v8dl7\") pod \"controller-f8648f98b-5t9hg\" (UID: \"53c42c23-7bb0-4e51-ab58-3355b224864c\") " pod="metallb-system/controller-f8648f98b-5t9hg" Dec 
01 08:35:21 crc kubenswrapper[5004]: I1201 08:35:21.968202 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjzqd\" (UniqueName: \"kubernetes.io/projected/bebabc29-870f-4604-bda6-e77a3db6a5ed-kube-api-access-pjzqd\") pod \"frr-k8s-webhook-server-7fcb986d4-26q9v\" (UID: \"bebabc29-870f-4604-bda6-e77a3db6a5ed\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-26q9v" Dec 01 08:35:21 crc kubenswrapper[5004]: I1201 08:35:21.968311 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/0a3f9662-dd78-4183-8bb4-ccc7b6b17ed3-metallb-excludel2\") pod \"speaker-cqvjk\" (UID: \"0a3f9662-dd78-4183-8bb4-ccc7b6b17ed3\") " pod="metallb-system/speaker-cqvjk" Dec 01 08:35:21 crc kubenswrapper[5004]: I1201 08:35:21.968349 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jnk9\" (UniqueName: \"kubernetes.io/projected/0a3f9662-dd78-4183-8bb4-ccc7b6b17ed3-kube-api-access-2jnk9\") pod \"speaker-cqvjk\" (UID: \"0a3f9662-dd78-4183-8bb4-ccc7b6b17ed3\") " pod="metallb-system/speaker-cqvjk" Dec 01 08:35:21 crc kubenswrapper[5004]: I1201 08:35:21.968400 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/113ca366-80ad-475e-829f-fcbb4a67e642-frr-conf\") pod \"frr-k8s-l6x9r\" (UID: \"113ca366-80ad-475e-829f-fcbb4a67e642\") " pod="metallb-system/frr-k8s-l6x9r" Dec 01 08:35:21 crc kubenswrapper[5004]: I1201 08:35:21.968542 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/113ca366-80ad-475e-829f-fcbb4a67e642-reloader\") pod \"frr-k8s-l6x9r\" (UID: \"113ca366-80ad-475e-829f-fcbb4a67e642\") " pod="metallb-system/frr-k8s-l6x9r" Dec 01 08:35:21 crc kubenswrapper[5004]: I1201 
08:35:21.968587 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/113ca366-80ad-475e-829f-fcbb4a67e642-frr-sockets\") pod \"frr-k8s-l6x9r\" (UID: \"113ca366-80ad-475e-829f-fcbb4a67e642\") " pod="metallb-system/frr-k8s-l6x9r" Dec 01 08:35:21 crc kubenswrapper[5004]: I1201 08:35:21.968608 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/0a3f9662-dd78-4183-8bb4-ccc7b6b17ed3-memberlist\") pod \"speaker-cqvjk\" (UID: \"0a3f9662-dd78-4183-8bb4-ccc7b6b17ed3\") " pod="metallb-system/speaker-cqvjk" Dec 01 08:35:21 crc kubenswrapper[5004]: I1201 08:35:21.968655 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/113ca366-80ad-475e-829f-fcbb4a67e642-metrics-certs\") pod \"frr-k8s-l6x9r\" (UID: \"113ca366-80ad-475e-829f-fcbb4a67e642\") " pod="metallb-system/frr-k8s-l6x9r" Dec 01 08:35:21 crc kubenswrapper[5004]: I1201 08:35:21.968687 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/113ca366-80ad-475e-829f-fcbb4a67e642-frr-startup\") pod \"frr-k8s-l6x9r\" (UID: \"113ca366-80ad-475e-829f-fcbb4a67e642\") " pod="metallb-system/frr-k8s-l6x9r" Dec 01 08:35:21 crc kubenswrapper[5004]: I1201 08:35:21.968718 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0a3f9662-dd78-4183-8bb4-ccc7b6b17ed3-metrics-certs\") pod \"speaker-cqvjk\" (UID: \"0a3f9662-dd78-4183-8bb4-ccc7b6b17ed3\") " pod="metallb-system/speaker-cqvjk" Dec 01 08:35:21 crc kubenswrapper[5004]: I1201 08:35:21.968751 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" 
(UniqueName: \"kubernetes.io/secret/bebabc29-870f-4604-bda6-e77a3db6a5ed-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-26q9v\" (UID: \"bebabc29-870f-4604-bda6-e77a3db6a5ed\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-26q9v" Dec 01 08:35:21 crc kubenswrapper[5004]: E1201 08:35:21.968864 5004 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Dec 01 08:35:21 crc kubenswrapper[5004]: I1201 08:35:21.968881 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5x9n\" (UniqueName: \"kubernetes.io/projected/113ca366-80ad-475e-829f-fcbb4a67e642-kube-api-access-h5x9n\") pod \"frr-k8s-l6x9r\" (UID: \"113ca366-80ad-475e-829f-fcbb4a67e642\") " pod="metallb-system/frr-k8s-l6x9r" Dec 01 08:35:21 crc kubenswrapper[5004]: E1201 08:35:21.968912 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bebabc29-870f-4604-bda6-e77a3db6a5ed-cert podName:bebabc29-870f-4604-bda6-e77a3db6a5ed nodeName:}" failed. No retries permitted until 2025-12-01 08:35:22.468895185 +0000 UTC m=+1100.033887167 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bebabc29-870f-4604-bda6-e77a3db6a5ed-cert") pod "frr-k8s-webhook-server-7fcb986d4-26q9v" (UID: "bebabc29-870f-4604-bda6-e77a3db6a5ed") : secret "frr-k8s-webhook-server-cert" not found Dec 01 08:35:21 crc kubenswrapper[5004]: I1201 08:35:21.990496 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjzqd\" (UniqueName: \"kubernetes.io/projected/bebabc29-870f-4604-bda6-e77a3db6a5ed-kube-api-access-pjzqd\") pod \"frr-k8s-webhook-server-7fcb986d4-26q9v\" (UID: \"bebabc29-870f-4604-bda6-e77a3db6a5ed\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-26q9v" Dec 01 08:35:22 crc kubenswrapper[5004]: I1201 08:35:22.069900 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/113ca366-80ad-475e-829f-fcbb4a67e642-reloader\") pod \"frr-k8s-l6x9r\" (UID: \"113ca366-80ad-475e-829f-fcbb4a67e642\") " pod="metallb-system/frr-k8s-l6x9r" Dec 01 08:35:22 crc kubenswrapper[5004]: I1201 08:35:22.069955 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/113ca366-80ad-475e-829f-fcbb4a67e642-frr-sockets\") pod \"frr-k8s-l6x9r\" (UID: \"113ca366-80ad-475e-829f-fcbb4a67e642\") " pod="metallb-system/frr-k8s-l6x9r" Dec 01 08:35:22 crc kubenswrapper[5004]: I1201 08:35:22.069979 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/0a3f9662-dd78-4183-8bb4-ccc7b6b17ed3-memberlist\") pod \"speaker-cqvjk\" (UID: \"0a3f9662-dd78-4183-8bb4-ccc7b6b17ed3\") " pod="metallb-system/speaker-cqvjk" Dec 01 08:35:22 crc kubenswrapper[5004]: I1201 08:35:22.070007 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/113ca366-80ad-475e-829f-fcbb4a67e642-metrics-certs\") pod \"frr-k8s-l6x9r\" (UID: \"113ca366-80ad-475e-829f-fcbb4a67e642\") " pod="metallb-system/frr-k8s-l6x9r" Dec 01 08:35:22 crc kubenswrapper[5004]: I1201 08:35:22.070028 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/113ca366-80ad-475e-829f-fcbb4a67e642-frr-startup\") pod \"frr-k8s-l6x9r\" (UID: \"113ca366-80ad-475e-829f-fcbb4a67e642\") " pod="metallb-system/frr-k8s-l6x9r" Dec 01 08:35:22 crc kubenswrapper[5004]: I1201 08:35:22.070067 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0a3f9662-dd78-4183-8bb4-ccc7b6b17ed3-metrics-certs\") pod \"speaker-cqvjk\" (UID: \"0a3f9662-dd78-4183-8bb4-ccc7b6b17ed3\") " pod="metallb-system/speaker-cqvjk" Dec 01 08:35:22 crc kubenswrapper[5004]: I1201 08:35:22.070136 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h5x9n\" (UniqueName: \"kubernetes.io/projected/113ca366-80ad-475e-829f-fcbb4a67e642-kube-api-access-h5x9n\") pod \"frr-k8s-l6x9r\" (UID: \"113ca366-80ad-475e-829f-fcbb4a67e642\") " pod="metallb-system/frr-k8s-l6x9r" Dec 01 08:35:22 crc kubenswrapper[5004]: I1201 08:35:22.070176 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/53c42c23-7bb0-4e51-ab58-3355b224864c-cert\") pod \"controller-f8648f98b-5t9hg\" (UID: \"53c42c23-7bb0-4e51-ab58-3355b224864c\") " pod="metallb-system/controller-f8648f98b-5t9hg" Dec 01 08:35:22 crc kubenswrapper[5004]: I1201 08:35:22.070202 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/113ca366-80ad-475e-829f-fcbb4a67e642-metrics\") pod \"frr-k8s-l6x9r\" (UID: \"113ca366-80ad-475e-829f-fcbb4a67e642\") " pod="metallb-system/frr-k8s-l6x9r" Dec 
01 08:35:22 crc kubenswrapper[5004]: E1201 08:35:22.070135 5004 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 01 08:35:22 crc kubenswrapper[5004]: E1201 08:35:22.070294 5004 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found Dec 01 08:35:22 crc kubenswrapper[5004]: I1201 08:35:22.070227 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/53c42c23-7bb0-4e51-ab58-3355b224864c-metrics-certs\") pod \"controller-f8648f98b-5t9hg\" (UID: \"53c42c23-7bb0-4e51-ab58-3355b224864c\") " pod="metallb-system/controller-f8648f98b-5t9hg" Dec 01 08:35:22 crc kubenswrapper[5004]: E1201 08:35:22.070297 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0a3f9662-dd78-4183-8bb4-ccc7b6b17ed3-memberlist podName:0a3f9662-dd78-4183-8bb4-ccc7b6b17ed3 nodeName:}" failed. No retries permitted until 2025-12-01 08:35:22.570275913 +0000 UTC m=+1100.135267925 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/0a3f9662-dd78-4183-8bb4-ccc7b6b17ed3-memberlist") pod "speaker-cqvjk" (UID: "0a3f9662-dd78-4183-8bb4-ccc7b6b17ed3") : secret "metallb-memberlist" not found Dec 01 08:35:22 crc kubenswrapper[5004]: E1201 08:35:22.070374 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/53c42c23-7bb0-4e51-ab58-3355b224864c-metrics-certs podName:53c42c23-7bb0-4e51-ab58-3355b224864c nodeName:}" failed. No retries permitted until 2025-12-01 08:35:22.570354295 +0000 UTC m=+1100.135346327 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/53c42c23-7bb0-4e51-ab58-3355b224864c-metrics-certs") pod "controller-f8648f98b-5t9hg" (UID: "53c42c23-7bb0-4e51-ab58-3355b224864c") : secret "controller-certs-secret" not found Dec 01 08:35:22 crc kubenswrapper[5004]: I1201 08:35:22.070371 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/113ca366-80ad-475e-829f-fcbb4a67e642-reloader\") pod \"frr-k8s-l6x9r\" (UID: \"113ca366-80ad-475e-829f-fcbb4a67e642\") " pod="metallb-system/frr-k8s-l6x9r" Dec 01 08:35:22 crc kubenswrapper[5004]: I1201 08:35:22.070394 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8dl7\" (UniqueName: \"kubernetes.io/projected/53c42c23-7bb0-4e51-ab58-3355b224864c-kube-api-access-v8dl7\") pod \"controller-f8648f98b-5t9hg\" (UID: \"53c42c23-7bb0-4e51-ab58-3355b224864c\") " pod="metallb-system/controller-f8648f98b-5t9hg" Dec 01 08:35:22 crc kubenswrapper[5004]: I1201 08:35:22.070447 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/0a3f9662-dd78-4183-8bb4-ccc7b6b17ed3-metallb-excludel2\") pod \"speaker-cqvjk\" (UID: \"0a3f9662-dd78-4183-8bb4-ccc7b6b17ed3\") " pod="metallb-system/speaker-cqvjk" Dec 01 08:35:22 crc kubenswrapper[5004]: I1201 08:35:22.070476 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jnk9\" (UniqueName: \"kubernetes.io/projected/0a3f9662-dd78-4183-8bb4-ccc7b6b17ed3-kube-api-access-2jnk9\") pod \"speaker-cqvjk\" (UID: \"0a3f9662-dd78-4183-8bb4-ccc7b6b17ed3\") " pod="metallb-system/speaker-cqvjk" Dec 01 08:35:22 crc kubenswrapper[5004]: I1201 08:35:22.070508 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: 
\"kubernetes.io/empty-dir/113ca366-80ad-475e-829f-fcbb4a67e642-frr-conf\") pod \"frr-k8s-l6x9r\" (UID: \"113ca366-80ad-475e-829f-fcbb4a67e642\") " pod="metallb-system/frr-k8s-l6x9r" Dec 01 08:35:22 crc kubenswrapper[5004]: I1201 08:35:22.070593 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/113ca366-80ad-475e-829f-fcbb4a67e642-frr-sockets\") pod \"frr-k8s-l6x9r\" (UID: \"113ca366-80ad-475e-829f-fcbb4a67e642\") " pod="metallb-system/frr-k8s-l6x9r" Dec 01 08:35:22 crc kubenswrapper[5004]: I1201 08:35:22.070650 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/113ca366-80ad-475e-829f-fcbb4a67e642-metrics\") pod \"frr-k8s-l6x9r\" (UID: \"113ca366-80ad-475e-829f-fcbb4a67e642\") " pod="metallb-system/frr-k8s-l6x9r" Dec 01 08:35:22 crc kubenswrapper[5004]: I1201 08:35:22.070871 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/113ca366-80ad-475e-829f-fcbb4a67e642-frr-conf\") pod \"frr-k8s-l6x9r\" (UID: \"113ca366-80ad-475e-829f-fcbb4a67e642\") " pod="metallb-system/frr-k8s-l6x9r" Dec 01 08:35:22 crc kubenswrapper[5004]: I1201 08:35:22.071079 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/0a3f9662-dd78-4183-8bb4-ccc7b6b17ed3-metallb-excludel2\") pod \"speaker-cqvjk\" (UID: \"0a3f9662-dd78-4183-8bb4-ccc7b6b17ed3\") " pod="metallb-system/speaker-cqvjk" Dec 01 08:35:22 crc kubenswrapper[5004]: I1201 08:35:22.071192 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/113ca366-80ad-475e-829f-fcbb4a67e642-frr-startup\") pod \"frr-k8s-l6x9r\" (UID: \"113ca366-80ad-475e-829f-fcbb4a67e642\") " pod="metallb-system/frr-k8s-l6x9r" Dec 01 08:35:22 crc kubenswrapper[5004]: I1201 08:35:22.072658 5004 
reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Dec 01 08:35:22 crc kubenswrapper[5004]: I1201 08:35:22.073725 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0a3f9662-dd78-4183-8bb4-ccc7b6b17ed3-metrics-certs\") pod \"speaker-cqvjk\" (UID: \"0a3f9662-dd78-4183-8bb4-ccc7b6b17ed3\") " pod="metallb-system/speaker-cqvjk" Dec 01 08:35:22 crc kubenswrapper[5004]: I1201 08:35:22.074232 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/113ca366-80ad-475e-829f-fcbb4a67e642-metrics-certs\") pod \"frr-k8s-l6x9r\" (UID: \"113ca366-80ad-475e-829f-fcbb4a67e642\") " pod="metallb-system/frr-k8s-l6x9r" Dec 01 08:35:22 crc kubenswrapper[5004]: I1201 08:35:22.083527 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/53c42c23-7bb0-4e51-ab58-3355b224864c-cert\") pod \"controller-f8648f98b-5t9hg\" (UID: \"53c42c23-7bb0-4e51-ab58-3355b224864c\") " pod="metallb-system/controller-f8648f98b-5t9hg" Dec 01 08:35:22 crc kubenswrapper[5004]: I1201 08:35:22.093232 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jnk9\" (UniqueName: \"kubernetes.io/projected/0a3f9662-dd78-4183-8bb4-ccc7b6b17ed3-kube-api-access-2jnk9\") pod \"speaker-cqvjk\" (UID: \"0a3f9662-dd78-4183-8bb4-ccc7b6b17ed3\") " pod="metallb-system/speaker-cqvjk" Dec 01 08:35:22 crc kubenswrapper[5004]: I1201 08:35:22.095828 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8dl7\" (UniqueName: \"kubernetes.io/projected/53c42c23-7bb0-4e51-ab58-3355b224864c-kube-api-access-v8dl7\") pod \"controller-f8648f98b-5t9hg\" (UID: \"53c42c23-7bb0-4e51-ab58-3355b224864c\") " pod="metallb-system/controller-f8648f98b-5t9hg" Dec 01 08:35:22 crc kubenswrapper[5004]: I1201 08:35:22.098010 5004 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5x9n\" (UniqueName: \"kubernetes.io/projected/113ca366-80ad-475e-829f-fcbb4a67e642-kube-api-access-h5x9n\") pod \"frr-k8s-l6x9r\" (UID: \"113ca366-80ad-475e-829f-fcbb4a67e642\") " pod="metallb-system/frr-k8s-l6x9r" Dec 01 08:35:22 crc kubenswrapper[5004]: I1201 08:35:22.163572 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-l6x9r" Dec 01 08:35:22 crc kubenswrapper[5004]: I1201 08:35:22.475448 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bebabc29-870f-4604-bda6-e77a3db6a5ed-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-26q9v\" (UID: \"bebabc29-870f-4604-bda6-e77a3db6a5ed\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-26q9v" Dec 01 08:35:22 crc kubenswrapper[5004]: I1201 08:35:22.479757 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bebabc29-870f-4604-bda6-e77a3db6a5ed-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-26q9v\" (UID: \"bebabc29-870f-4604-bda6-e77a3db6a5ed\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-26q9v" Dec 01 08:35:22 crc kubenswrapper[5004]: I1201 08:35:22.577000 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/53c42c23-7bb0-4e51-ab58-3355b224864c-metrics-certs\") pod \"controller-f8648f98b-5t9hg\" (UID: \"53c42c23-7bb0-4e51-ab58-3355b224864c\") " pod="metallb-system/controller-f8648f98b-5t9hg" Dec 01 08:35:22 crc kubenswrapper[5004]: I1201 08:35:22.577104 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/0a3f9662-dd78-4183-8bb4-ccc7b6b17ed3-memberlist\") pod \"speaker-cqvjk\" (UID: \"0a3f9662-dd78-4183-8bb4-ccc7b6b17ed3\") " pod="metallb-system/speaker-cqvjk" Dec 01 
08:35:22 crc kubenswrapper[5004]: E1201 08:35:22.577270 5004 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 01 08:35:22 crc kubenswrapper[5004]: E1201 08:35:22.577390 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0a3f9662-dd78-4183-8bb4-ccc7b6b17ed3-memberlist podName:0a3f9662-dd78-4183-8bb4-ccc7b6b17ed3 nodeName:}" failed. No retries permitted until 2025-12-01 08:35:23.577370093 +0000 UTC m=+1101.142362075 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/0a3f9662-dd78-4183-8bb4-ccc7b6b17ed3-memberlist") pod "speaker-cqvjk" (UID: "0a3f9662-dd78-4183-8bb4-ccc7b6b17ed3") : secret "metallb-memberlist" not found Dec 01 08:35:22 crc kubenswrapper[5004]: I1201 08:35:22.582929 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/53c42c23-7bb0-4e51-ab58-3355b224864c-metrics-certs\") pod \"controller-f8648f98b-5t9hg\" (UID: \"53c42c23-7bb0-4e51-ab58-3355b224864c\") " pod="metallb-system/controller-f8648f98b-5t9hg" Dec 01 08:35:22 crc kubenswrapper[5004]: I1201 08:35:22.623644 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-l6x9r" event={"ID":"113ca366-80ad-475e-829f-fcbb4a67e642","Type":"ContainerStarted","Data":"24695379bd317b93bc5250ac2efcbec4a9c14992035b2f1f758e7a579e5fda65"} Dec 01 08:35:22 crc kubenswrapper[5004]: I1201 08:35:22.733820 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-26q9v" Dec 01 08:35:22 crc kubenswrapper[5004]: I1201 08:35:22.837633 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-f8648f98b-5t9hg" Dec 01 08:35:23 crc kubenswrapper[5004]: I1201 08:35:23.246802 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-26q9v"] Dec 01 08:35:23 crc kubenswrapper[5004]: W1201 08:35:23.373908 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53c42c23_7bb0_4e51_ab58_3355b224864c.slice/crio-57127dde87e77d14b002401af49c60735abc7ab537b99f898493668f7b4f2166 WatchSource:0}: Error finding container 57127dde87e77d14b002401af49c60735abc7ab537b99f898493668f7b4f2166: Status 404 returned error can't find the container with id 57127dde87e77d14b002401af49c60735abc7ab537b99f898493668f7b4f2166 Dec 01 08:35:23 crc kubenswrapper[5004]: I1201 08:35:23.383198 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-5t9hg"] Dec 01 08:35:23 crc kubenswrapper[5004]: I1201 08:35:23.595475 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/0a3f9662-dd78-4183-8bb4-ccc7b6b17ed3-memberlist\") pod \"speaker-cqvjk\" (UID: \"0a3f9662-dd78-4183-8bb4-ccc7b6b17ed3\") " pod="metallb-system/speaker-cqvjk" Dec 01 08:35:23 crc kubenswrapper[5004]: E1201 08:35:23.595925 5004 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 01 08:35:23 crc kubenswrapper[5004]: E1201 08:35:23.596225 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0a3f9662-dd78-4183-8bb4-ccc7b6b17ed3-memberlist podName:0a3f9662-dd78-4183-8bb4-ccc7b6b17ed3 nodeName:}" failed. No retries permitted until 2025-12-01 08:35:25.596193417 +0000 UTC m=+1103.161185439 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/0a3f9662-dd78-4183-8bb4-ccc7b6b17ed3-memberlist") pod "speaker-cqvjk" (UID: "0a3f9662-dd78-4183-8bb4-ccc7b6b17ed3") : secret "metallb-memberlist" not found Dec 01 08:35:23 crc kubenswrapper[5004]: I1201 08:35:23.632541 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-26q9v" event={"ID":"bebabc29-870f-4604-bda6-e77a3db6a5ed","Type":"ContainerStarted","Data":"a8039582a575ca1f13d5eb8ab39b92f615aabdaec20b7abaa1f19d48ce896d8e"} Dec 01 08:35:23 crc kubenswrapper[5004]: I1201 08:35:23.633550 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-5t9hg" event={"ID":"53c42c23-7bb0-4e51-ab58-3355b224864c","Type":"ContainerStarted","Data":"57127dde87e77d14b002401af49c60735abc7ab537b99f898493668f7b4f2166"} Dec 01 08:35:24 crc kubenswrapper[5004]: I1201 08:35:24.642970 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-5t9hg" event={"ID":"53c42c23-7bb0-4e51-ab58-3355b224864c","Type":"ContainerStarted","Data":"cf0839cd4222ef25d8831f96a3a253863c75cfde042a20d11727e019ba1e2ba0"} Dec 01 08:35:24 crc kubenswrapper[5004]: I1201 08:35:24.643320 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-5t9hg" event={"ID":"53c42c23-7bb0-4e51-ab58-3355b224864c","Type":"ContainerStarted","Data":"d22b61860ac5214725d9a089cd7a4c1dbba951d91f28d54f054e767ecf1c0f9f"} Dec 01 08:35:24 crc kubenswrapper[5004]: I1201 08:35:24.643342 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-f8648f98b-5t9hg" Dec 01 08:35:24 crc kubenswrapper[5004]: I1201 08:35:24.666624 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-f8648f98b-5t9hg" podStartSLOduration=3.666542944 podStartE2EDuration="3.666542944s" podCreationTimestamp="2025-12-01 
08:35:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:35:24.665435517 +0000 UTC m=+1102.230427499" watchObservedRunningTime="2025-12-01 08:35:24.666542944 +0000 UTC m=+1102.231534926" Dec 01 08:35:25 crc kubenswrapper[5004]: I1201 08:35:25.630693 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/0a3f9662-dd78-4183-8bb4-ccc7b6b17ed3-memberlist\") pod \"speaker-cqvjk\" (UID: \"0a3f9662-dd78-4183-8bb4-ccc7b6b17ed3\") " pod="metallb-system/speaker-cqvjk" Dec 01 08:35:25 crc kubenswrapper[5004]: I1201 08:35:25.650411 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/0a3f9662-dd78-4183-8bb4-ccc7b6b17ed3-memberlist\") pod \"speaker-cqvjk\" (UID: \"0a3f9662-dd78-4183-8bb4-ccc7b6b17ed3\") " pod="metallb-system/speaker-cqvjk" Dec 01 08:35:25 crc kubenswrapper[5004]: I1201 08:35:25.819335 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-cqvjk" Dec 01 08:35:26 crc kubenswrapper[5004]: I1201 08:35:26.671873 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-cqvjk" event={"ID":"0a3f9662-dd78-4183-8bb4-ccc7b6b17ed3","Type":"ContainerStarted","Data":"63e33ba2825600f4b36ffafd9db7343f25c95571552ace23834fe1ddf21dbc3f"} Dec 01 08:35:26 crc kubenswrapper[5004]: I1201 08:35:26.672238 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-cqvjk" event={"ID":"0a3f9662-dd78-4183-8bb4-ccc7b6b17ed3","Type":"ContainerStarted","Data":"bfed4fdce97c495acfd663da93e1729343eb99ffbea3a4847567f8703681126d"} Dec 01 08:35:26 crc kubenswrapper[5004]: I1201 08:35:26.672254 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-cqvjk" event={"ID":"0a3f9662-dd78-4183-8bb4-ccc7b6b17ed3","Type":"ContainerStarted","Data":"70fc1fb15ad3c9d7febba7fa1db1979365fda15203ae451e093cb84b0ab083aa"} Dec 01 08:35:26 crc kubenswrapper[5004]: I1201 08:35:26.672451 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-cqvjk" Dec 01 08:35:26 crc kubenswrapper[5004]: I1201 08:35:26.713788 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-cqvjk" podStartSLOduration=5.713767756 podStartE2EDuration="5.713767756s" podCreationTimestamp="2025-12-01 08:35:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:35:26.708708234 +0000 UTC m=+1104.273700236" watchObservedRunningTime="2025-12-01 08:35:26.713767756 +0000 UTC m=+1104.278759738" Dec 01 08:35:30 crc kubenswrapper[5004]: I1201 08:35:30.717834 5004 generic.go:334] "Generic (PLEG): container finished" podID="113ca366-80ad-475e-829f-fcbb4a67e642" containerID="ba24c567822d4c7204785def88d7b5a592189d2ea3e4c91867685f073abe482a" exitCode=0 Dec 01 08:35:30 crc kubenswrapper[5004]: 
I1201 08:35:30.718138 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-l6x9r" event={"ID":"113ca366-80ad-475e-829f-fcbb4a67e642","Type":"ContainerDied","Data":"ba24c567822d4c7204785def88d7b5a592189d2ea3e4c91867685f073abe482a"} Dec 01 08:35:30 crc kubenswrapper[5004]: I1201 08:35:30.720956 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-26q9v" event={"ID":"bebabc29-870f-4604-bda6-e77a3db6a5ed","Type":"ContainerStarted","Data":"fa8f8e794fe53d992c9786d30b1656ba102800c6d8df21037fcadc94f75be2a8"} Dec 01 08:35:30 crc kubenswrapper[5004]: I1201 08:35:30.721928 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-26q9v" Dec 01 08:35:30 crc kubenswrapper[5004]: I1201 08:35:30.768162 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-26q9v" podStartSLOduration=2.853629552 podStartE2EDuration="9.768141841s" podCreationTimestamp="2025-12-01 08:35:21 +0000 UTC" firstStartedPulling="2025-12-01 08:35:23.265956166 +0000 UTC m=+1100.830948148" lastFinishedPulling="2025-12-01 08:35:30.180468455 +0000 UTC m=+1107.745460437" observedRunningTime="2025-12-01 08:35:30.764591945 +0000 UTC m=+1108.329583947" watchObservedRunningTime="2025-12-01 08:35:30.768141841 +0000 UTC m=+1108.333133823" Dec 01 08:35:31 crc kubenswrapper[5004]: I1201 08:35:31.733997 5004 generic.go:334] "Generic (PLEG): container finished" podID="113ca366-80ad-475e-829f-fcbb4a67e642" containerID="4363c6a38a07074abb30b26ecdfedccbfe66561b7583af51442489e1e708c477" exitCode=0 Dec 01 08:35:31 crc kubenswrapper[5004]: I1201 08:35:31.734143 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-l6x9r" event={"ID":"113ca366-80ad-475e-829f-fcbb4a67e642","Type":"ContainerDied","Data":"4363c6a38a07074abb30b26ecdfedccbfe66561b7583af51442489e1e708c477"} Dec 01 08:35:32 
crc kubenswrapper[5004]: I1201 08:35:32.747483 5004 generic.go:334] "Generic (PLEG): container finished" podID="113ca366-80ad-475e-829f-fcbb4a67e642" containerID="8e55bb642e4347d9dbdb6b8d82aa93f529e72f5e87b962a883919e25b2dbf54e" exitCode=0 Dec 01 08:35:32 crc kubenswrapper[5004]: I1201 08:35:32.747595 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-l6x9r" event={"ID":"113ca366-80ad-475e-829f-fcbb4a67e642","Type":"ContainerDied","Data":"8e55bb642e4347d9dbdb6b8d82aa93f529e72f5e87b962a883919e25b2dbf54e"} Dec 01 08:35:33 crc kubenswrapper[5004]: I1201 08:35:33.761253 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-l6x9r" event={"ID":"113ca366-80ad-475e-829f-fcbb4a67e642","Type":"ContainerStarted","Data":"c130b4f7ec919c959397f3260a45a4211590a30ef30b215a42fd3a3636594a9f"} Dec 01 08:35:33 crc kubenswrapper[5004]: I1201 08:35:33.761868 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-l6x9r" event={"ID":"113ca366-80ad-475e-829f-fcbb4a67e642","Type":"ContainerStarted","Data":"5900131e420e89c0f0d66680d4f9e866ef41feb06e4cefbb4b1923fbf73acb4c"} Dec 01 08:35:33 crc kubenswrapper[5004]: I1201 08:35:33.761908 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-l6x9r" event={"ID":"113ca366-80ad-475e-829f-fcbb4a67e642","Type":"ContainerStarted","Data":"dd988fc1a6e72579eaea5636ac4d8d21b361906ab01c83636680da07a946bf3d"} Dec 01 08:35:33 crc kubenswrapper[5004]: I1201 08:35:33.761937 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-l6x9r" event={"ID":"113ca366-80ad-475e-829f-fcbb4a67e642","Type":"ContainerStarted","Data":"31473e35e6e0bc410ffef54dcd9c43be61608fe0ceff24c5098b4bbadb30de71"} Dec 01 08:35:34 crc kubenswrapper[5004]: I1201 08:35:34.783895 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-l6x9r" 
event={"ID":"113ca366-80ad-475e-829f-fcbb4a67e642","Type":"ContainerStarted","Data":"93710987b5f0c42b84cc4e167ea3c2b7845477f5d0354150eb9de0e53e86a213"} Dec 01 08:35:34 crc kubenswrapper[5004]: I1201 08:35:34.783965 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-l6x9r" event={"ID":"113ca366-80ad-475e-829f-fcbb4a67e642","Type":"ContainerStarted","Data":"e8cd606a58e4a25de30b27b41c437cac4fd30d4bef4a0c33b4eae88b07ed7716"} Dec 01 08:35:34 crc kubenswrapper[5004]: I1201 08:35:34.784099 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-l6x9r" Dec 01 08:35:34 crc kubenswrapper[5004]: I1201 08:35:34.834335 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-l6x9r" podStartSLOduration=5.969236647 podStartE2EDuration="13.834306599s" podCreationTimestamp="2025-12-01 08:35:21 +0000 UTC" firstStartedPulling="2025-12-01 08:35:22.311926501 +0000 UTC m=+1099.876918483" lastFinishedPulling="2025-12-01 08:35:30.176996453 +0000 UTC m=+1107.741988435" observedRunningTime="2025-12-01 08:35:34.818875858 +0000 UTC m=+1112.383867880" watchObservedRunningTime="2025-12-01 08:35:34.834306599 +0000 UTC m=+1112.399298621" Dec 01 08:35:37 crc kubenswrapper[5004]: I1201 08:35:37.164746 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-l6x9r" Dec 01 08:35:37 crc kubenswrapper[5004]: I1201 08:35:37.214489 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-l6x9r" Dec 01 08:35:42 crc kubenswrapper[5004]: I1201 08:35:42.171680 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-l6x9r" Dec 01 08:35:42 crc kubenswrapper[5004]: I1201 08:35:42.741957 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-26q9v" Dec 01 08:35:42 crc kubenswrapper[5004]: 
I1201 08:35:42.841860 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-f8648f98b-5t9hg" Dec 01 08:35:45 crc kubenswrapper[5004]: I1201 08:35:45.824075 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-cqvjk" Dec 01 08:35:48 crc kubenswrapper[5004]: I1201 08:35:48.613011 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-z6h86"] Dec 01 08:35:48 crc kubenswrapper[5004]: I1201 08:35:48.615788 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-z6h86" Dec 01 08:35:48 crc kubenswrapper[5004]: I1201 08:35:48.618372 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-nn5dw" Dec 01 08:35:48 crc kubenswrapper[5004]: I1201 08:35:48.621871 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Dec 01 08:35:48 crc kubenswrapper[5004]: I1201 08:35:48.625742 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Dec 01 08:35:48 crc kubenswrapper[5004]: I1201 08:35:48.653413 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2ctt\" (UniqueName: \"kubernetes.io/projected/d349c8d4-0da8-47f3-8181-b20fe3b0f19e-kube-api-access-d2ctt\") pod \"openstack-operator-index-z6h86\" (UID: \"d349c8d4-0da8-47f3-8181-b20fe3b0f19e\") " pod="openstack-operators/openstack-operator-index-z6h86" Dec 01 08:35:48 crc kubenswrapper[5004]: I1201 08:35:48.760544 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d2ctt\" (UniqueName: \"kubernetes.io/projected/d349c8d4-0da8-47f3-8181-b20fe3b0f19e-kube-api-access-d2ctt\") pod \"openstack-operator-index-z6h86\" (UID: 
\"d349c8d4-0da8-47f3-8181-b20fe3b0f19e\") " pod="openstack-operators/openstack-operator-index-z6h86" Dec 01 08:35:48 crc kubenswrapper[5004]: I1201 08:35:48.773737 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-z6h86"] Dec 01 08:35:48 crc kubenswrapper[5004]: I1201 08:35:48.798527 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2ctt\" (UniqueName: \"kubernetes.io/projected/d349c8d4-0da8-47f3-8181-b20fe3b0f19e-kube-api-access-d2ctt\") pod \"openstack-operator-index-z6h86\" (UID: \"d349c8d4-0da8-47f3-8181-b20fe3b0f19e\") " pod="openstack-operators/openstack-operator-index-z6h86" Dec 01 08:35:49 crc kubenswrapper[5004]: I1201 08:35:49.065514 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-z6h86" Dec 01 08:35:49 crc kubenswrapper[5004]: I1201 08:35:49.517156 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-z6h86"] Dec 01 08:35:49 crc kubenswrapper[5004]: W1201 08:35:49.531063 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd349c8d4_0da8_47f3_8181_b20fe3b0f19e.slice/crio-d4b922bbfee714c9f2b7e05761639b9e0acaeeeda20f7135b7aa6c06b28998df WatchSource:0}: Error finding container d4b922bbfee714c9f2b7e05761639b9e0acaeeeda20f7135b7aa6c06b28998df: Status 404 returned error can't find the container with id d4b922bbfee714c9f2b7e05761639b9e0acaeeeda20f7135b7aa6c06b28998df Dec 01 08:35:49 crc kubenswrapper[5004]: I1201 08:35:49.913441 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-z6h86" event={"ID":"d349c8d4-0da8-47f3-8181-b20fe3b0f19e","Type":"ContainerStarted","Data":"d4b922bbfee714c9f2b7e05761639b9e0acaeeeda20f7135b7aa6c06b28998df"} Dec 01 08:35:50 crc kubenswrapper[5004]: I1201 08:35:50.983603 5004 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-z6h86"] Dec 01 08:35:51 crc kubenswrapper[5004]: I1201 08:35:51.387462 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-h2x7z"] Dec 01 08:35:51 crc kubenswrapper[5004]: I1201 08:35:51.393857 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-h2x7z" Dec 01 08:35:51 crc kubenswrapper[5004]: I1201 08:35:51.413820 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvkbj\" (UniqueName: \"kubernetes.io/projected/893e9b84-6818-4442-aad3-528d7b7f24b2-kube-api-access-pvkbj\") pod \"openstack-operator-index-h2x7z\" (UID: \"893e9b84-6818-4442-aad3-528d7b7f24b2\") " pod="openstack-operators/openstack-operator-index-h2x7z" Dec 01 08:35:51 crc kubenswrapper[5004]: I1201 08:35:51.418786 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-h2x7z"] Dec 01 08:35:51 crc kubenswrapper[5004]: I1201 08:35:51.515616 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvkbj\" (UniqueName: \"kubernetes.io/projected/893e9b84-6818-4442-aad3-528d7b7f24b2-kube-api-access-pvkbj\") pod \"openstack-operator-index-h2x7z\" (UID: \"893e9b84-6818-4442-aad3-528d7b7f24b2\") " pod="openstack-operators/openstack-operator-index-h2x7z" Dec 01 08:35:51 crc kubenswrapper[5004]: I1201 08:35:51.535872 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvkbj\" (UniqueName: \"kubernetes.io/projected/893e9b84-6818-4442-aad3-528d7b7f24b2-kube-api-access-pvkbj\") pod \"openstack-operator-index-h2x7z\" (UID: \"893e9b84-6818-4442-aad3-528d7b7f24b2\") " pod="openstack-operators/openstack-operator-index-h2x7z" Dec 01 08:35:51 crc kubenswrapper[5004]: I1201 08:35:51.739377 5004 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-h2x7z" Dec 01 08:35:53 crc kubenswrapper[5004]: I1201 08:35:53.481992 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-h2x7z"] Dec 01 08:35:53 crc kubenswrapper[5004]: I1201 08:35:53.949527 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-h2x7z" event={"ID":"893e9b84-6818-4442-aad3-528d7b7f24b2","Type":"ContainerStarted","Data":"7937a2fe9fab0c89fb387565a0ca36e58cd2dac20428c840ca0786229e4f086d"} Dec 01 08:35:53 crc kubenswrapper[5004]: I1201 08:35:53.949614 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-h2x7z" event={"ID":"893e9b84-6818-4442-aad3-528d7b7f24b2","Type":"ContainerStarted","Data":"7b23f9ba20d609ed628b72e3e608bc7653d8eeb804e4304f3657e11865481fab"} Dec 01 08:35:53 crc kubenswrapper[5004]: I1201 08:35:53.951694 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-z6h86" event={"ID":"d349c8d4-0da8-47f3-8181-b20fe3b0f19e","Type":"ContainerStarted","Data":"c8e0448069216394d65dbca6b9331a0e6e005fa5b3f28dcc8ef2de55a37bc746"} Dec 01 08:35:53 crc kubenswrapper[5004]: I1201 08:35:53.951853 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-z6h86" podUID="d349c8d4-0da8-47f3-8181-b20fe3b0f19e" containerName="registry-server" containerID="cri-o://c8e0448069216394d65dbca6b9331a0e6e005fa5b3f28dcc8ef2de55a37bc746" gracePeriod=2 Dec 01 08:35:53 crc kubenswrapper[5004]: I1201 08:35:53.983715 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-h2x7z" podStartSLOduration=2.849979952 podStartE2EDuration="2.983682538s" podCreationTimestamp="2025-12-01 08:35:51 +0000 UTC" firstStartedPulling="2025-12-01 
08:35:53.488401968 +0000 UTC m=+1131.053393950" lastFinishedPulling="2025-12-01 08:35:53.622104544 +0000 UTC m=+1131.187096536" observedRunningTime="2025-12-01 08:35:53.966113446 +0000 UTC m=+1131.531105468" watchObservedRunningTime="2025-12-01 08:35:53.983682538 +0000 UTC m=+1131.548674560" Dec 01 08:35:53 crc kubenswrapper[5004]: I1201 08:35:53.996011 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-z6h86" podStartSLOduration=2.290171918 podStartE2EDuration="5.995975212s" podCreationTimestamp="2025-12-01 08:35:48 +0000 UTC" firstStartedPulling="2025-12-01 08:35:49.534465853 +0000 UTC m=+1127.099457835" lastFinishedPulling="2025-12-01 08:35:53.240269137 +0000 UTC m=+1130.805261129" observedRunningTime="2025-12-01 08:35:53.987396416 +0000 UTC m=+1131.552388408" watchObservedRunningTime="2025-12-01 08:35:53.995975212 +0000 UTC m=+1131.560967234" Dec 01 08:35:54 crc kubenswrapper[5004]: I1201 08:35:54.539102 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-z6h86" Dec 01 08:35:54 crc kubenswrapper[5004]: I1201 08:35:54.580853 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d2ctt\" (UniqueName: \"kubernetes.io/projected/d349c8d4-0da8-47f3-8181-b20fe3b0f19e-kube-api-access-d2ctt\") pod \"d349c8d4-0da8-47f3-8181-b20fe3b0f19e\" (UID: \"d349c8d4-0da8-47f3-8181-b20fe3b0f19e\") " Dec 01 08:35:54 crc kubenswrapper[5004]: I1201 08:35:54.588394 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d349c8d4-0da8-47f3-8181-b20fe3b0f19e-kube-api-access-d2ctt" (OuterVolumeSpecName: "kube-api-access-d2ctt") pod "d349c8d4-0da8-47f3-8181-b20fe3b0f19e" (UID: "d349c8d4-0da8-47f3-8181-b20fe3b0f19e"). InnerVolumeSpecName "kube-api-access-d2ctt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:35:54 crc kubenswrapper[5004]: I1201 08:35:54.683428 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d2ctt\" (UniqueName: \"kubernetes.io/projected/d349c8d4-0da8-47f3-8181-b20fe3b0f19e-kube-api-access-d2ctt\") on node \"crc\" DevicePath \"\"" Dec 01 08:35:54 crc kubenswrapper[5004]: I1201 08:35:54.962328 5004 generic.go:334] "Generic (PLEG): container finished" podID="d349c8d4-0da8-47f3-8181-b20fe3b0f19e" containerID="c8e0448069216394d65dbca6b9331a0e6e005fa5b3f28dcc8ef2de55a37bc746" exitCode=0 Dec 01 08:35:54 crc kubenswrapper[5004]: I1201 08:35:54.962368 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-z6h86" Dec 01 08:35:54 crc kubenswrapper[5004]: I1201 08:35:54.962393 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-z6h86" event={"ID":"d349c8d4-0da8-47f3-8181-b20fe3b0f19e","Type":"ContainerDied","Data":"c8e0448069216394d65dbca6b9331a0e6e005fa5b3f28dcc8ef2de55a37bc746"} Dec 01 08:35:54 crc kubenswrapper[5004]: I1201 08:35:54.962440 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-z6h86" event={"ID":"d349c8d4-0da8-47f3-8181-b20fe3b0f19e","Type":"ContainerDied","Data":"d4b922bbfee714c9f2b7e05761639b9e0acaeeeda20f7135b7aa6c06b28998df"} Dec 01 08:35:54 crc kubenswrapper[5004]: I1201 08:35:54.962468 5004 scope.go:117] "RemoveContainer" containerID="c8e0448069216394d65dbca6b9331a0e6e005fa5b3f28dcc8ef2de55a37bc746" Dec 01 08:35:54 crc kubenswrapper[5004]: I1201 08:35:54.994876 5004 scope.go:117] "RemoveContainer" containerID="c8e0448069216394d65dbca6b9331a0e6e005fa5b3f28dcc8ef2de55a37bc746" Dec 01 08:35:54 crc kubenswrapper[5004]: E1201 08:35:54.995689 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"c8e0448069216394d65dbca6b9331a0e6e005fa5b3f28dcc8ef2de55a37bc746\": container with ID starting with c8e0448069216394d65dbca6b9331a0e6e005fa5b3f28dcc8ef2de55a37bc746 not found: ID does not exist" containerID="c8e0448069216394d65dbca6b9331a0e6e005fa5b3f28dcc8ef2de55a37bc746" Dec 01 08:35:54 crc kubenswrapper[5004]: I1201 08:35:54.995742 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8e0448069216394d65dbca6b9331a0e6e005fa5b3f28dcc8ef2de55a37bc746"} err="failed to get container status \"c8e0448069216394d65dbca6b9331a0e6e005fa5b3f28dcc8ef2de55a37bc746\": rpc error: code = NotFound desc = could not find container \"c8e0448069216394d65dbca6b9331a0e6e005fa5b3f28dcc8ef2de55a37bc746\": container with ID starting with c8e0448069216394d65dbca6b9331a0e6e005fa5b3f28dcc8ef2de55a37bc746 not found: ID does not exist" Dec 01 08:35:54 crc kubenswrapper[5004]: I1201 08:35:54.998464 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-z6h86"] Dec 01 08:35:55 crc kubenswrapper[5004]: I1201 08:35:55.008393 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-z6h86"] Dec 01 08:35:56 crc kubenswrapper[5004]: I1201 08:35:56.771406 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d349c8d4-0da8-47f3-8181-b20fe3b0f19e" path="/var/lib/kubelet/pods/d349c8d4-0da8-47f3-8181-b20fe3b0f19e/volumes" Dec 01 08:36:01 crc kubenswrapper[5004]: I1201 08:36:01.740233 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-h2x7z" Dec 01 08:36:01 crc kubenswrapper[5004]: I1201 08:36:01.740737 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-h2x7z" Dec 01 08:36:01 crc kubenswrapper[5004]: I1201 08:36:01.801744 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack-operators/openstack-operator-index-h2x7z" Dec 01 08:36:02 crc kubenswrapper[5004]: I1201 08:36:02.046762 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-h2x7z" Dec 01 08:36:09 crc kubenswrapper[5004]: I1201 08:36:09.434869 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/232b41aac13e001dc857b9795d2b854afae448fcc1728af84c1ef8a48428nq7"] Dec 01 08:36:09 crc kubenswrapper[5004]: E1201 08:36:09.436021 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d349c8d4-0da8-47f3-8181-b20fe3b0f19e" containerName="registry-server" Dec 01 08:36:09 crc kubenswrapper[5004]: I1201 08:36:09.436043 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="d349c8d4-0da8-47f3-8181-b20fe3b0f19e" containerName="registry-server" Dec 01 08:36:09 crc kubenswrapper[5004]: I1201 08:36:09.436299 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="d349c8d4-0da8-47f3-8181-b20fe3b0f19e" containerName="registry-server" Dec 01 08:36:09 crc kubenswrapper[5004]: I1201 08:36:09.437951 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/232b41aac13e001dc857b9795d2b854afae448fcc1728af84c1ef8a48428nq7" Dec 01 08:36:09 crc kubenswrapper[5004]: I1201 08:36:09.441859 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-4jkxl" Dec 01 08:36:09 crc kubenswrapper[5004]: I1201 08:36:09.443923 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/232b41aac13e001dc857b9795d2b854afae448fcc1728af84c1ef8a48428nq7"] Dec 01 08:36:09 crc kubenswrapper[5004]: I1201 08:36:09.598185 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/911b89dd-ee3e-4349-b52d-36e0199aabba-bundle\") pod \"232b41aac13e001dc857b9795d2b854afae448fcc1728af84c1ef8a48428nq7\" (UID: \"911b89dd-ee3e-4349-b52d-36e0199aabba\") " pod="openstack-operators/232b41aac13e001dc857b9795d2b854afae448fcc1728af84c1ef8a48428nq7" Dec 01 08:36:09 crc kubenswrapper[5004]: I1201 08:36:09.598661 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/911b89dd-ee3e-4349-b52d-36e0199aabba-util\") pod \"232b41aac13e001dc857b9795d2b854afae448fcc1728af84c1ef8a48428nq7\" (UID: \"911b89dd-ee3e-4349-b52d-36e0199aabba\") " pod="openstack-operators/232b41aac13e001dc857b9795d2b854afae448fcc1728af84c1ef8a48428nq7" Dec 01 08:36:09 crc kubenswrapper[5004]: I1201 08:36:09.598708 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2lsn\" (UniqueName: \"kubernetes.io/projected/911b89dd-ee3e-4349-b52d-36e0199aabba-kube-api-access-z2lsn\") pod \"232b41aac13e001dc857b9795d2b854afae448fcc1728af84c1ef8a48428nq7\" (UID: \"911b89dd-ee3e-4349-b52d-36e0199aabba\") " pod="openstack-operators/232b41aac13e001dc857b9795d2b854afae448fcc1728af84c1ef8a48428nq7" Dec 01 08:36:09 crc kubenswrapper[5004]: I1201 
08:36:09.700725 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/911b89dd-ee3e-4349-b52d-36e0199aabba-bundle\") pod \"232b41aac13e001dc857b9795d2b854afae448fcc1728af84c1ef8a48428nq7\" (UID: \"911b89dd-ee3e-4349-b52d-36e0199aabba\") " pod="openstack-operators/232b41aac13e001dc857b9795d2b854afae448fcc1728af84c1ef8a48428nq7" Dec 01 08:36:09 crc kubenswrapper[5004]: I1201 08:36:09.700828 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/911b89dd-ee3e-4349-b52d-36e0199aabba-util\") pod \"232b41aac13e001dc857b9795d2b854afae448fcc1728af84c1ef8a48428nq7\" (UID: \"911b89dd-ee3e-4349-b52d-36e0199aabba\") " pod="openstack-operators/232b41aac13e001dc857b9795d2b854afae448fcc1728af84c1ef8a48428nq7" Dec 01 08:36:09 crc kubenswrapper[5004]: I1201 08:36:09.700862 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z2lsn\" (UniqueName: \"kubernetes.io/projected/911b89dd-ee3e-4349-b52d-36e0199aabba-kube-api-access-z2lsn\") pod \"232b41aac13e001dc857b9795d2b854afae448fcc1728af84c1ef8a48428nq7\" (UID: \"911b89dd-ee3e-4349-b52d-36e0199aabba\") " pod="openstack-operators/232b41aac13e001dc857b9795d2b854afae448fcc1728af84c1ef8a48428nq7" Dec 01 08:36:09 crc kubenswrapper[5004]: I1201 08:36:09.701395 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/911b89dd-ee3e-4349-b52d-36e0199aabba-bundle\") pod \"232b41aac13e001dc857b9795d2b854afae448fcc1728af84c1ef8a48428nq7\" (UID: \"911b89dd-ee3e-4349-b52d-36e0199aabba\") " pod="openstack-operators/232b41aac13e001dc857b9795d2b854afae448fcc1728af84c1ef8a48428nq7" Dec 01 08:36:09 crc kubenswrapper[5004]: I1201 08:36:09.701955 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/911b89dd-ee3e-4349-b52d-36e0199aabba-util\") pod \"232b41aac13e001dc857b9795d2b854afae448fcc1728af84c1ef8a48428nq7\" (UID: \"911b89dd-ee3e-4349-b52d-36e0199aabba\") " pod="openstack-operators/232b41aac13e001dc857b9795d2b854afae448fcc1728af84c1ef8a48428nq7" Dec 01 08:36:09 crc kubenswrapper[5004]: I1201 08:36:09.723757 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2lsn\" (UniqueName: \"kubernetes.io/projected/911b89dd-ee3e-4349-b52d-36e0199aabba-kube-api-access-z2lsn\") pod \"232b41aac13e001dc857b9795d2b854afae448fcc1728af84c1ef8a48428nq7\" (UID: \"911b89dd-ee3e-4349-b52d-36e0199aabba\") " pod="openstack-operators/232b41aac13e001dc857b9795d2b854afae448fcc1728af84c1ef8a48428nq7" Dec 01 08:36:09 crc kubenswrapper[5004]: I1201 08:36:09.766457 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/232b41aac13e001dc857b9795d2b854afae448fcc1728af84c1ef8a48428nq7" Dec 01 08:36:10 crc kubenswrapper[5004]: W1201 08:36:10.306988 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod911b89dd_ee3e_4349_b52d_36e0199aabba.slice/crio-3d0d9eb2869acb20b536676ddbf2d7a24d1e44320a092273d096dea3164354d8 WatchSource:0}: Error finding container 3d0d9eb2869acb20b536676ddbf2d7a24d1e44320a092273d096dea3164354d8: Status 404 returned error can't find the container with id 3d0d9eb2869acb20b536676ddbf2d7a24d1e44320a092273d096dea3164354d8 Dec 01 08:36:10 crc kubenswrapper[5004]: I1201 08:36:10.319548 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/232b41aac13e001dc857b9795d2b854afae448fcc1728af84c1ef8a48428nq7"] Dec 01 08:36:11 crc kubenswrapper[5004]: I1201 08:36:11.110940 5004 generic.go:334] "Generic (PLEG): container finished" podID="911b89dd-ee3e-4349-b52d-36e0199aabba" containerID="a9c9bbdff9a44e5f16918815e3dc333300939200f558dc8a2f4c9d86731afacd" exitCode=0 Dec 01 
08:36:11 crc kubenswrapper[5004]: I1201 08:36:11.111076 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/232b41aac13e001dc857b9795d2b854afae448fcc1728af84c1ef8a48428nq7" event={"ID":"911b89dd-ee3e-4349-b52d-36e0199aabba","Type":"ContainerDied","Data":"a9c9bbdff9a44e5f16918815e3dc333300939200f558dc8a2f4c9d86731afacd"} Dec 01 08:36:11 crc kubenswrapper[5004]: I1201 08:36:11.111398 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/232b41aac13e001dc857b9795d2b854afae448fcc1728af84c1ef8a48428nq7" event={"ID":"911b89dd-ee3e-4349-b52d-36e0199aabba","Type":"ContainerStarted","Data":"3d0d9eb2869acb20b536676ddbf2d7a24d1e44320a092273d096dea3164354d8"} Dec 01 08:36:12 crc kubenswrapper[5004]: I1201 08:36:12.124344 5004 generic.go:334] "Generic (PLEG): container finished" podID="911b89dd-ee3e-4349-b52d-36e0199aabba" containerID="12658152560e0669e8ae4d72ebc1b092e5739db895e5771d76e41b4f9e7bda08" exitCode=0 Dec 01 08:36:12 crc kubenswrapper[5004]: I1201 08:36:12.124425 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/232b41aac13e001dc857b9795d2b854afae448fcc1728af84c1ef8a48428nq7" event={"ID":"911b89dd-ee3e-4349-b52d-36e0199aabba","Type":"ContainerDied","Data":"12658152560e0669e8ae4d72ebc1b092e5739db895e5771d76e41b4f9e7bda08"} Dec 01 08:36:13 crc kubenswrapper[5004]: I1201 08:36:13.143010 5004 generic.go:334] "Generic (PLEG): container finished" podID="911b89dd-ee3e-4349-b52d-36e0199aabba" containerID="a2a13652df6114ee22c044928144710f8e0256f64cc62107978925920d5874ec" exitCode=0 Dec 01 08:36:13 crc kubenswrapper[5004]: I1201 08:36:13.143092 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/232b41aac13e001dc857b9795d2b854afae448fcc1728af84c1ef8a48428nq7" event={"ID":"911b89dd-ee3e-4349-b52d-36e0199aabba","Type":"ContainerDied","Data":"a2a13652df6114ee22c044928144710f8e0256f64cc62107978925920d5874ec"} Dec 01 08:36:14 crc kubenswrapper[5004]: I1201 08:36:14.638215 
5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/232b41aac13e001dc857b9795d2b854afae448fcc1728af84c1ef8a48428nq7" Dec 01 08:36:14 crc kubenswrapper[5004]: I1201 08:36:14.713403 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/911b89dd-ee3e-4349-b52d-36e0199aabba-util\") pod \"911b89dd-ee3e-4349-b52d-36e0199aabba\" (UID: \"911b89dd-ee3e-4349-b52d-36e0199aabba\") " Dec 01 08:36:14 crc kubenswrapper[5004]: I1201 08:36:14.713806 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z2lsn\" (UniqueName: \"kubernetes.io/projected/911b89dd-ee3e-4349-b52d-36e0199aabba-kube-api-access-z2lsn\") pod \"911b89dd-ee3e-4349-b52d-36e0199aabba\" (UID: \"911b89dd-ee3e-4349-b52d-36e0199aabba\") " Dec 01 08:36:14 crc kubenswrapper[5004]: I1201 08:36:14.713918 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/911b89dd-ee3e-4349-b52d-36e0199aabba-bundle\") pod \"911b89dd-ee3e-4349-b52d-36e0199aabba\" (UID: \"911b89dd-ee3e-4349-b52d-36e0199aabba\") " Dec 01 08:36:14 crc kubenswrapper[5004]: I1201 08:36:14.715315 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/911b89dd-ee3e-4349-b52d-36e0199aabba-bundle" (OuterVolumeSpecName: "bundle") pod "911b89dd-ee3e-4349-b52d-36e0199aabba" (UID: "911b89dd-ee3e-4349-b52d-36e0199aabba"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 08:36:14 crc kubenswrapper[5004]: I1201 08:36:14.723278 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/911b89dd-ee3e-4349-b52d-36e0199aabba-kube-api-access-z2lsn" (OuterVolumeSpecName: "kube-api-access-z2lsn") pod "911b89dd-ee3e-4349-b52d-36e0199aabba" (UID: "911b89dd-ee3e-4349-b52d-36e0199aabba"). 
InnerVolumeSpecName "kube-api-access-z2lsn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:36:14 crc kubenswrapper[5004]: I1201 08:36:14.736452 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/911b89dd-ee3e-4349-b52d-36e0199aabba-util" (OuterVolumeSpecName: "util") pod "911b89dd-ee3e-4349-b52d-36e0199aabba" (UID: "911b89dd-ee3e-4349-b52d-36e0199aabba"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 08:36:14 crc kubenswrapper[5004]: I1201 08:36:14.818153 5004 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/911b89dd-ee3e-4349-b52d-36e0199aabba-util\") on node \"crc\" DevicePath \"\"" Dec 01 08:36:14 crc kubenswrapper[5004]: I1201 08:36:14.818291 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z2lsn\" (UniqueName: \"kubernetes.io/projected/911b89dd-ee3e-4349-b52d-36e0199aabba-kube-api-access-z2lsn\") on node \"crc\" DevicePath \"\"" Dec 01 08:36:14 crc kubenswrapper[5004]: I1201 08:36:14.818326 5004 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/911b89dd-ee3e-4349-b52d-36e0199aabba-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 08:36:15 crc kubenswrapper[5004]: I1201 08:36:15.165146 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/232b41aac13e001dc857b9795d2b854afae448fcc1728af84c1ef8a48428nq7" event={"ID":"911b89dd-ee3e-4349-b52d-36e0199aabba","Type":"ContainerDied","Data":"3d0d9eb2869acb20b536676ddbf2d7a24d1e44320a092273d096dea3164354d8"} Dec 01 08:36:15 crc kubenswrapper[5004]: I1201 08:36:15.165210 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3d0d9eb2869acb20b536676ddbf2d7a24d1e44320a092273d096dea3164354d8" Dec 01 08:36:15 crc kubenswrapper[5004]: I1201 08:36:15.165291 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/232b41aac13e001dc857b9795d2b854afae448fcc1728af84c1ef8a48428nq7" Dec 01 08:36:16 crc kubenswrapper[5004]: I1201 08:36:16.959160 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-688447bc77-l2kbh"] Dec 01 08:36:16 crc kubenswrapper[5004]: E1201 08:36:16.959696 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="911b89dd-ee3e-4349-b52d-36e0199aabba" containerName="extract" Dec 01 08:36:16 crc kubenswrapper[5004]: I1201 08:36:16.959709 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="911b89dd-ee3e-4349-b52d-36e0199aabba" containerName="extract" Dec 01 08:36:16 crc kubenswrapper[5004]: E1201 08:36:16.959726 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="911b89dd-ee3e-4349-b52d-36e0199aabba" containerName="util" Dec 01 08:36:16 crc kubenswrapper[5004]: I1201 08:36:16.959733 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="911b89dd-ee3e-4349-b52d-36e0199aabba" containerName="util" Dec 01 08:36:16 crc kubenswrapper[5004]: E1201 08:36:16.959756 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="911b89dd-ee3e-4349-b52d-36e0199aabba" containerName="pull" Dec 01 08:36:16 crc kubenswrapper[5004]: I1201 08:36:16.959762 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="911b89dd-ee3e-4349-b52d-36e0199aabba" containerName="pull" Dec 01 08:36:16 crc kubenswrapper[5004]: I1201 08:36:16.959906 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="911b89dd-ee3e-4349-b52d-36e0199aabba" containerName="extract" Dec 01 08:36:16 crc kubenswrapper[5004]: I1201 08:36:16.960437 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-688447bc77-l2kbh" Dec 01 08:36:16 crc kubenswrapper[5004]: I1201 08:36:16.963939 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-t9fnq" Dec 01 08:36:16 crc kubenswrapper[5004]: I1201 08:36:16.972147 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-688447bc77-l2kbh"] Dec 01 08:36:17 crc kubenswrapper[5004]: I1201 08:36:17.056703 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkgfj\" (UniqueName: \"kubernetes.io/projected/58986db9-c1c0-4caf-a239-211791e82dc2-kube-api-access-kkgfj\") pod \"openstack-operator-controller-operator-688447bc77-l2kbh\" (UID: \"58986db9-c1c0-4caf-a239-211791e82dc2\") " pod="openstack-operators/openstack-operator-controller-operator-688447bc77-l2kbh" Dec 01 08:36:17 crc kubenswrapper[5004]: I1201 08:36:17.159165 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kkgfj\" (UniqueName: \"kubernetes.io/projected/58986db9-c1c0-4caf-a239-211791e82dc2-kube-api-access-kkgfj\") pod \"openstack-operator-controller-operator-688447bc77-l2kbh\" (UID: \"58986db9-c1c0-4caf-a239-211791e82dc2\") " pod="openstack-operators/openstack-operator-controller-operator-688447bc77-l2kbh" Dec 01 08:36:17 crc kubenswrapper[5004]: I1201 08:36:17.177715 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkgfj\" (UniqueName: \"kubernetes.io/projected/58986db9-c1c0-4caf-a239-211791e82dc2-kube-api-access-kkgfj\") pod \"openstack-operator-controller-operator-688447bc77-l2kbh\" (UID: \"58986db9-c1c0-4caf-a239-211791e82dc2\") " pod="openstack-operators/openstack-operator-controller-operator-688447bc77-l2kbh" Dec 01 08:36:17 crc kubenswrapper[5004]: I1201 08:36:17.316527 5004 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-688447bc77-l2kbh" Dec 01 08:36:17 crc kubenswrapper[5004]: I1201 08:36:17.796798 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-688447bc77-l2kbh"] Dec 01 08:36:18 crc kubenswrapper[5004]: I1201 08:36:18.188488 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-688447bc77-l2kbh" event={"ID":"58986db9-c1c0-4caf-a239-211791e82dc2","Type":"ContainerStarted","Data":"d4c4a49bd2ce405cb9a664be9eede4197138123bd1df1a156adc70b85abdf1d3"} Dec 01 08:36:22 crc kubenswrapper[5004]: I1201 08:36:22.237213 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-688447bc77-l2kbh" event={"ID":"58986db9-c1c0-4caf-a239-211791e82dc2","Type":"ContainerStarted","Data":"7e308a6f421135ea0011db7bf3e2431f31921a851638d2b5bce8c5675e18445f"} Dec 01 08:36:22 crc kubenswrapper[5004]: I1201 08:36:22.237768 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-688447bc77-l2kbh" Dec 01 08:36:22 crc kubenswrapper[5004]: I1201 08:36:22.276103 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-688447bc77-l2kbh" podStartSLOduration=2.699972659 podStartE2EDuration="6.276085072s" podCreationTimestamp="2025-12-01 08:36:16 +0000 UTC" firstStartedPulling="2025-12-01 08:36:17.816227152 +0000 UTC m=+1155.381219134" lastFinishedPulling="2025-12-01 08:36:21.392339565 +0000 UTC m=+1158.957331547" observedRunningTime="2025-12-01 08:36:22.268208694 +0000 UTC m=+1159.833200706" watchObservedRunningTime="2025-12-01 08:36:22.276085072 +0000 UTC m=+1159.841077054" Dec 01 08:36:27 crc kubenswrapper[5004]: I1201 08:36:27.320309 
5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-688447bc77-l2kbh" Dec 01 08:36:38 crc kubenswrapper[5004]: I1201 08:36:38.728866 5004 patch_prober.go:28] interesting pod/machine-config-daemon-fvdgt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 08:36:38 crc kubenswrapper[5004]: I1201 08:36:38.729348 5004 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 08:36:47 crc kubenswrapper[5004]: I1201 08:36:47.205685 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-rp7vl"] Dec 01 08:36:47 crc kubenswrapper[5004]: I1201 08:36:47.207689 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-rp7vl" Dec 01 08:36:47 crc kubenswrapper[5004]: I1201 08:36:47.219762 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-6dpk9"] Dec 01 08:36:47 crc kubenswrapper[5004]: I1201 08:36:47.221530 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-25n29" Dec 01 08:36:47 crc kubenswrapper[5004]: I1201 08:36:47.242191 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-rp7vl"] Dec 01 08:36:47 crc kubenswrapper[5004]: I1201 08:36:47.242679 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-6dpk9" Dec 01 08:36:47 crc kubenswrapper[5004]: I1201 08:36:47.261715 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-nc46h" Dec 01 08:36:47 crc kubenswrapper[5004]: I1201 08:36:47.296466 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pm8ls\" (UniqueName: \"kubernetes.io/projected/9bca89aa-3367-4bff-b070-c191fcae5f2f-kube-api-access-pm8ls\") pod \"barbican-operator-controller-manager-7d9dfd778-rp7vl\" (UID: \"9bca89aa-3367-4bff-b070-c191fcae5f2f\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-rp7vl" Dec 01 08:36:47 crc kubenswrapper[5004]: I1201 08:36:47.296521 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqh2f\" (UniqueName: \"kubernetes.io/projected/166912d9-e0b0-40b8-8e26-9c86183d7952-kube-api-access-jqh2f\") pod \"cinder-operator-controller-manager-859b6ccc6-6dpk9\" (UID: \"166912d9-e0b0-40b8-8e26-9c86183d7952\") 
" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-6dpk9" Dec 01 08:36:47 crc kubenswrapper[5004]: I1201 08:36:47.298644 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-6dpk9"] Dec 01 08:36:47 crc kubenswrapper[5004]: I1201 08:36:47.309179 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-lfs45"] Dec 01 08:36:47 crc kubenswrapper[5004]: I1201 08:36:47.310914 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-lfs45" Dec 01 08:36:47 crc kubenswrapper[5004]: I1201 08:36:47.314855 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-292qd" Dec 01 08:36:47 crc kubenswrapper[5004]: I1201 08:36:47.318878 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-668d9c48b9-5c6kq"] Dec 01 08:36:47 crc kubenswrapper[5004]: I1201 08:36:47.320192 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-668d9c48b9-5c6kq" Dec 01 08:36:47 crc kubenswrapper[5004]: I1201 08:36:47.332130 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-h65sv" Dec 01 08:36:47 crc kubenswrapper[5004]: I1201 08:36:47.336827 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-zgpn6"] Dec 01 08:36:47 crc kubenswrapper[5004]: I1201 08:36:47.347817 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-zgpn6" Dec 01 08:36:47 crc kubenswrapper[5004]: I1201 08:36:47.360081 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-bt9k8" Dec 01 08:36:47 crc kubenswrapper[5004]: I1201 08:36:47.371626 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-668d9c48b9-5c6kq"] Dec 01 08:36:47 crc kubenswrapper[5004]: I1201 08:36:47.384521 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-lfs45"] Dec 01 08:36:47 crc kubenswrapper[5004]: I1201 08:36:47.399644 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqh2f\" (UniqueName: \"kubernetes.io/projected/166912d9-e0b0-40b8-8e26-9c86183d7952-kube-api-access-jqh2f\") pod \"cinder-operator-controller-manager-859b6ccc6-6dpk9\" (UID: \"166912d9-e0b0-40b8-8e26-9c86183d7952\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-6dpk9" Dec 01 08:36:47 crc kubenswrapper[5004]: I1201 08:36:47.399761 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxnpw\" (UniqueName: \"kubernetes.io/projected/c6e6ae59-9f58-4856-b200-d42d1e1e23ed-kube-api-access-kxnpw\") pod \"designate-operator-controller-manager-78b4bc895b-lfs45\" (UID: \"c6e6ae59-9f58-4856-b200-d42d1e1e23ed\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-lfs45" Dec 01 08:36:47 crc kubenswrapper[5004]: I1201 08:36:47.399816 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrsrb\" (UniqueName: \"kubernetes.io/projected/67dcdfb2-70ae-4444-b271-dd83dcb37756-kube-api-access-mrsrb\") pod \"glance-operator-controller-manager-668d9c48b9-5c6kq\" (UID: 
\"67dcdfb2-70ae-4444-b271-dd83dcb37756\") " pod="openstack-operators/glance-operator-controller-manager-668d9c48b9-5c6kq" Dec 01 08:36:47 crc kubenswrapper[5004]: I1201 08:36:47.399867 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jdxg\" (UniqueName: \"kubernetes.io/projected/b931f322-0c4e-4019-a11e-616c80d1e5f1-kube-api-access-6jdxg\") pod \"heat-operator-controller-manager-5f64f6f8bb-zgpn6\" (UID: \"b931f322-0c4e-4019-a11e-616c80d1e5f1\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-zgpn6" Dec 01 08:36:47 crc kubenswrapper[5004]: I1201 08:36:47.399905 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pm8ls\" (UniqueName: \"kubernetes.io/projected/9bca89aa-3367-4bff-b070-c191fcae5f2f-kube-api-access-pm8ls\") pod \"barbican-operator-controller-manager-7d9dfd778-rp7vl\" (UID: \"9bca89aa-3367-4bff-b070-c191fcae5f2f\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-rp7vl" Dec 01 08:36:47 crc kubenswrapper[5004]: I1201 08:36:47.425236 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-zgpn6"] Dec 01 08:36:47 crc kubenswrapper[5004]: I1201 08:36:47.458884 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pm8ls\" (UniqueName: \"kubernetes.io/projected/9bca89aa-3367-4bff-b070-c191fcae5f2f-kube-api-access-pm8ls\") pod \"barbican-operator-controller-manager-7d9dfd778-rp7vl\" (UID: \"9bca89aa-3367-4bff-b070-c191fcae5f2f\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-rp7vl" Dec 01 08:36:47 crc kubenswrapper[5004]: I1201 08:36:47.467791 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqh2f\" (UniqueName: \"kubernetes.io/projected/166912d9-e0b0-40b8-8e26-9c86183d7952-kube-api-access-jqh2f\") pod 
\"cinder-operator-controller-manager-859b6ccc6-6dpk9\" (UID: \"166912d9-e0b0-40b8-8e26-9c86183d7952\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-6dpk9" Dec 01 08:36:47 crc kubenswrapper[5004]: I1201 08:36:47.480142 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-hzgf4"] Dec 01 08:36:47 crc kubenswrapper[5004]: I1201 08:36:47.481390 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-57548d458d-hzgf4" Dec 01 08:36:47 crc kubenswrapper[5004]: I1201 08:36:47.485184 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Dec 01 08:36:47 crc kubenswrapper[5004]: I1201 08:36:47.485430 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-r7kqv" Dec 01 08:36:47 crc kubenswrapper[5004]: I1201 08:36:47.508172 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxnpw\" (UniqueName: \"kubernetes.io/projected/c6e6ae59-9f58-4856-b200-d42d1e1e23ed-kube-api-access-kxnpw\") pod \"designate-operator-controller-manager-78b4bc895b-lfs45\" (UID: \"c6e6ae59-9f58-4856-b200-d42d1e1e23ed\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-lfs45" Dec 01 08:36:47 crc kubenswrapper[5004]: I1201 08:36:47.508209 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwrgl\" (UniqueName: \"kubernetes.io/projected/f87b36f7-2558-4823-85fc-6b6e9090b1d7-kube-api-access-qwrgl\") pod \"infra-operator-controller-manager-57548d458d-hzgf4\" (UID: \"f87b36f7-2558-4823-85fc-6b6e9090b1d7\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-hzgf4" Dec 01 08:36:47 crc kubenswrapper[5004]: I1201 08:36:47.508248 5004 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrsrb\" (UniqueName: \"kubernetes.io/projected/67dcdfb2-70ae-4444-b271-dd83dcb37756-kube-api-access-mrsrb\") pod \"glance-operator-controller-manager-668d9c48b9-5c6kq\" (UID: \"67dcdfb2-70ae-4444-b271-dd83dcb37756\") " pod="openstack-operators/glance-operator-controller-manager-668d9c48b9-5c6kq" Dec 01 08:36:47 crc kubenswrapper[5004]: I1201 08:36:47.508285 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jdxg\" (UniqueName: \"kubernetes.io/projected/b931f322-0c4e-4019-a11e-616c80d1e5f1-kube-api-access-6jdxg\") pod \"heat-operator-controller-manager-5f64f6f8bb-zgpn6\" (UID: \"b931f322-0c4e-4019-a11e-616c80d1e5f1\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-zgpn6" Dec 01 08:36:47 crc kubenswrapper[5004]: I1201 08:36:47.508319 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f87b36f7-2558-4823-85fc-6b6e9090b1d7-cert\") pod \"infra-operator-controller-manager-57548d458d-hzgf4\" (UID: \"f87b36f7-2558-4823-85fc-6b6e9090b1d7\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-hzgf4" Dec 01 08:36:47 crc kubenswrapper[5004]: I1201 08:36:47.508813 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-78524"] Dec 01 08:36:47 crc kubenswrapper[5004]: I1201 08:36:47.518811 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-78524" Dec 01 08:36:47 crc kubenswrapper[5004]: I1201 08:36:47.521836 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-k85pz" Dec 01 08:36:47 crc kubenswrapper[5004]: I1201 08:36:47.566249 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jdxg\" (UniqueName: \"kubernetes.io/projected/b931f322-0c4e-4019-a11e-616c80d1e5f1-kube-api-access-6jdxg\") pod \"heat-operator-controller-manager-5f64f6f8bb-zgpn6\" (UID: \"b931f322-0c4e-4019-a11e-616c80d1e5f1\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-zgpn6" Dec 01 08:36:47 crc kubenswrapper[5004]: I1201 08:36:47.566666 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxnpw\" (UniqueName: \"kubernetes.io/projected/c6e6ae59-9f58-4856-b200-d42d1e1e23ed-kube-api-access-kxnpw\") pod \"designate-operator-controller-manager-78b4bc895b-lfs45\" (UID: \"c6e6ae59-9f58-4856-b200-d42d1e1e23ed\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-lfs45" Dec 01 08:36:47 crc kubenswrapper[5004]: I1201 08:36:47.573764 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-78524"] Dec 01 08:36:47 crc kubenswrapper[5004]: I1201 08:36:47.594491 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrsrb\" (UniqueName: \"kubernetes.io/projected/67dcdfb2-70ae-4444-b271-dd83dcb37756-kube-api-access-mrsrb\") pod \"glance-operator-controller-manager-668d9c48b9-5c6kq\" (UID: \"67dcdfb2-70ae-4444-b271-dd83dcb37756\") " pod="openstack-operators/glance-operator-controller-manager-668d9c48b9-5c6kq" Dec 01 08:36:47 crc kubenswrapper[5004]: I1201 08:36:47.597747 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-rp7vl" Dec 01 08:36:47 crc kubenswrapper[5004]: I1201 08:36:47.615461 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwrgl\" (UniqueName: \"kubernetes.io/projected/f87b36f7-2558-4823-85fc-6b6e9090b1d7-kube-api-access-qwrgl\") pod \"infra-operator-controller-manager-57548d458d-hzgf4\" (UID: \"f87b36f7-2558-4823-85fc-6b6e9090b1d7\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-hzgf4" Dec 01 08:36:47 crc kubenswrapper[5004]: I1201 08:36:47.615576 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f87b36f7-2558-4823-85fc-6b6e9090b1d7-cert\") pod \"infra-operator-controller-manager-57548d458d-hzgf4\" (UID: \"f87b36f7-2558-4823-85fc-6b6e9090b1d7\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-hzgf4" Dec 01 08:36:47 crc kubenswrapper[5004]: I1201 08:36:47.615602 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xlf2\" (UniqueName: \"kubernetes.io/projected/aa866b8d-174b-4fab-a55d-cc2bcdef5526-kube-api-access-5xlf2\") pod \"horizon-operator-controller-manager-68c6d99b8f-78524\" (UID: \"aa866b8d-174b-4fab-a55d-cc2bcdef5526\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-78524" Dec 01 08:36:47 crc kubenswrapper[5004]: E1201 08:36:47.615976 5004 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 01 08:36:47 crc kubenswrapper[5004]: E1201 08:36:47.616014 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f87b36f7-2558-4823-85fc-6b6e9090b1d7-cert podName:f87b36f7-2558-4823-85fc-6b6e9090b1d7 nodeName:}" failed. 
No retries permitted until 2025-12-01 08:36:48.115999862 +0000 UTC m=+1185.680991844 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f87b36f7-2558-4823-85fc-6b6e9090b1d7-cert") pod "infra-operator-controller-manager-57548d458d-hzgf4" (UID: "f87b36f7-2558-4823-85fc-6b6e9090b1d7") : secret "infra-operator-webhook-server-cert" not found Dec 01 08:36:47 crc kubenswrapper[5004]: I1201 08:36:47.616134 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-vv9hf"] Dec 01 08:36:47 crc kubenswrapper[5004]: I1201 08:36:47.617322 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-vv9hf" Dec 01 08:36:47 crc kubenswrapper[5004]: I1201 08:36:47.622794 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-8hw7v" Dec 01 08:36:47 crc kubenswrapper[5004]: I1201 08:36:47.641373 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-hzgf4"] Dec 01 08:36:47 crc kubenswrapper[5004]: I1201 08:36:47.641508 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-6dpk9" Dec 01 08:36:47 crc kubenswrapper[5004]: I1201 08:36:47.658008 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwrgl\" (UniqueName: \"kubernetes.io/projected/f87b36f7-2558-4823-85fc-6b6e9090b1d7-kube-api-access-qwrgl\") pod \"infra-operator-controller-manager-57548d458d-hzgf4\" (UID: \"f87b36f7-2558-4823-85fc-6b6e9090b1d7\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-hzgf4" Dec 01 08:36:47 crc kubenswrapper[5004]: I1201 08:36:47.671379 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-lfs45" Dec 01 08:36:47 crc kubenswrapper[5004]: I1201 08:36:47.702742 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-546d4bdf48-59dzd"] Dec 01 08:36:47 crc kubenswrapper[5004]: I1201 08:36:47.710700 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-668d9c48b9-5c6kq" Dec 01 08:36:47 crc kubenswrapper[5004]: I1201 08:36:47.713398 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-59dzd" Dec 01 08:36:47 crc kubenswrapper[5004]: I1201 08:36:47.715618 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-vv9hf"] Dec 01 08:36:47 crc kubenswrapper[5004]: I1201 08:36:47.718680 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k55bg\" (UniqueName: \"kubernetes.io/projected/3521a18b-f34e-4107-9e34-048a9827a2fe-kube-api-access-k55bg\") pod \"ironic-operator-controller-manager-6c548fd776-vv9hf\" (UID: \"3521a18b-f34e-4107-9e34-048a9827a2fe\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-vv9hf" Dec 01 08:36:47 crc kubenswrapper[5004]: I1201 08:36:47.718782 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xlf2\" (UniqueName: \"kubernetes.io/projected/aa866b8d-174b-4fab-a55d-cc2bcdef5526-kube-api-access-5xlf2\") pod \"horizon-operator-controller-manager-68c6d99b8f-78524\" (UID: \"aa866b8d-174b-4fab-a55d-cc2bcdef5526\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-78524" Dec 01 08:36:47 crc kubenswrapper[5004]: I1201 08:36:47.719919 5004 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-2hkvb" Dec 01 08:36:47 crc kubenswrapper[5004]: I1201 08:36:47.726095 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-6546668bfd-6l4k4"] Dec 01 08:36:47 crc kubenswrapper[5004]: I1201 08:36:47.727405 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-6546668bfd-6l4k4" Dec 01 08:36:47 crc kubenswrapper[5004]: I1201 08:36:47.729683 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-55g57"] Dec 01 08:36:47 crc kubenswrapper[5004]: I1201 08:36:47.730986 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-55g57" Dec 01 08:36:47 crc kubenswrapper[5004]: I1201 08:36:47.729697 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-nk6ts" Dec 01 08:36:47 crc kubenswrapper[5004]: I1201 08:36:47.745910 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-gfpk4" Dec 01 08:36:47 crc kubenswrapper[5004]: I1201 08:36:47.746894 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-zgpn6" Dec 01 08:36:47 crc kubenswrapper[5004]: I1201 08:36:47.753893 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-546d4bdf48-59dzd"] Dec 01 08:36:47 crc kubenswrapper[5004]: I1201 08:36:47.779715 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-6546668bfd-6l4k4"] Dec 01 08:36:47 crc kubenswrapper[5004]: I1201 08:36:47.820456 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k55bg\" (UniqueName: \"kubernetes.io/projected/3521a18b-f34e-4107-9e34-048a9827a2fe-kube-api-access-k55bg\") pod \"ironic-operator-controller-manager-6c548fd776-vv9hf\" (UID: \"3521a18b-f34e-4107-9e34-048a9827a2fe\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-vv9hf" Dec 01 08:36:47 crc kubenswrapper[5004]: I1201 08:36:47.820535 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjgcp\" (UniqueName: \"kubernetes.io/projected/8d8f2a7f-1da5-45ca-8dd0-1fa87e3d46fe-kube-api-access-kjgcp\") pod \"manila-operator-controller-manager-6546668bfd-6l4k4\" (UID: \"8d8f2a7f-1da5-45ca-8dd0-1fa87e3d46fe\") " pod="openstack-operators/manila-operator-controller-manager-6546668bfd-6l4k4" Dec 01 08:36:47 crc kubenswrapper[5004]: I1201 08:36:47.820577 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jzw8\" (UniqueName: \"kubernetes.io/projected/cb3f7f28-c99e-44d4-b534-83889924b531-kube-api-access-8jzw8\") pod \"keystone-operator-controller-manager-546d4bdf48-59dzd\" (UID: \"cb3f7f28-c99e-44d4-b534-83889924b531\") " pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-59dzd" Dec 01 08:36:47 crc kubenswrapper[5004]: I1201 08:36:47.820654 5004 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j48ct\" (UniqueName: \"kubernetes.io/projected/24fb1ec9-065a-464d-9797-8020c38f81e8-kube-api-access-j48ct\") pod \"mariadb-operator-controller-manager-56bbcc9d85-55g57\" (UID: \"24fb1ec9-065a-464d-9797-8020c38f81e8\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-55g57" Dec 01 08:36:47 crc kubenswrapper[5004]: I1201 08:36:47.822398 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-55g57"] Dec 01 08:36:47 crc kubenswrapper[5004]: I1201 08:36:47.839868 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-pk7mn"] Dec 01 08:36:47 crc kubenswrapper[5004]: I1201 08:36:47.841224 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-pk7mn" Dec 01 08:36:47 crc kubenswrapper[5004]: I1201 08:36:47.848264 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xlf2\" (UniqueName: \"kubernetes.io/projected/aa866b8d-174b-4fab-a55d-cc2bcdef5526-kube-api-access-5xlf2\") pod \"horizon-operator-controller-manager-68c6d99b8f-78524\" (UID: \"aa866b8d-174b-4fab-a55d-cc2bcdef5526\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-78524" Dec 01 08:36:47 crc kubenswrapper[5004]: I1201 08:36:47.856188 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k55bg\" (UniqueName: \"kubernetes.io/projected/3521a18b-f34e-4107-9e34-048a9827a2fe-kube-api-access-k55bg\") pod \"ironic-operator-controller-manager-6c548fd776-vv9hf\" (UID: \"3521a18b-f34e-4107-9e34-048a9827a2fe\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-vv9hf" Dec 01 08:36:47 crc kubenswrapper[5004]: I1201 08:36:47.891822 5004 
reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-56x6b" Dec 01 08:36:47 crc kubenswrapper[5004]: I1201 08:36:47.892545 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-pk7mn"] Dec 01 08:36:47 crc kubenswrapper[5004]: I1201 08:36:47.922164 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kjgcp\" (UniqueName: \"kubernetes.io/projected/8d8f2a7f-1da5-45ca-8dd0-1fa87e3d46fe-kube-api-access-kjgcp\") pod \"manila-operator-controller-manager-6546668bfd-6l4k4\" (UID: \"8d8f2a7f-1da5-45ca-8dd0-1fa87e3d46fe\") " pod="openstack-operators/manila-operator-controller-manager-6546668bfd-6l4k4" Dec 01 08:36:47 crc kubenswrapper[5004]: I1201 08:36:47.922224 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8jzw8\" (UniqueName: \"kubernetes.io/projected/cb3f7f28-c99e-44d4-b534-83889924b531-kube-api-access-8jzw8\") pod \"keystone-operator-controller-manager-546d4bdf48-59dzd\" (UID: \"cb3f7f28-c99e-44d4-b534-83889924b531\") " pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-59dzd" Dec 01 08:36:47 crc kubenswrapper[5004]: I1201 08:36:47.922251 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hw2qx\" (UniqueName: \"kubernetes.io/projected/67bfafa3-790f-4b23-8bef-8b5da60bf6dc-kube-api-access-hw2qx\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-pk7mn\" (UID: \"67bfafa3-790f-4b23-8bef-8b5da60bf6dc\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-pk7mn" Dec 01 08:36:47 crc kubenswrapper[5004]: I1201 08:36:47.922306 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j48ct\" (UniqueName: 
\"kubernetes.io/projected/24fb1ec9-065a-464d-9797-8020c38f81e8-kube-api-access-j48ct\") pod \"mariadb-operator-controller-manager-56bbcc9d85-55g57\" (UID: \"24fb1ec9-065a-464d-9797-8020c38f81e8\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-55g57" Dec 01 08:36:47 crc kubenswrapper[5004]: I1201 08:36:47.960873 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j48ct\" (UniqueName: \"kubernetes.io/projected/24fb1ec9-065a-464d-9797-8020c38f81e8-kube-api-access-j48ct\") pod \"mariadb-operator-controller-manager-56bbcc9d85-55g57\" (UID: \"24fb1ec9-065a-464d-9797-8020c38f81e8\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-55g57" Dec 01 08:36:47 crc kubenswrapper[5004]: I1201 08:36:47.965226 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjgcp\" (UniqueName: \"kubernetes.io/projected/8d8f2a7f-1da5-45ca-8dd0-1fa87e3d46fe-kube-api-access-kjgcp\") pod \"manila-operator-controller-manager-6546668bfd-6l4k4\" (UID: \"8d8f2a7f-1da5-45ca-8dd0-1fa87e3d46fe\") " pod="openstack-operators/manila-operator-controller-manager-6546668bfd-6l4k4" Dec 01 08:36:47 crc kubenswrapper[5004]: I1201 08:36:47.967149 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jzw8\" (UniqueName: \"kubernetes.io/projected/cb3f7f28-c99e-44d4-b534-83889924b531-kube-api-access-8jzw8\") pod \"keystone-operator-controller-manager-546d4bdf48-59dzd\" (UID: \"cb3f7f28-c99e-44d4-b534-83889924b531\") " pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-59dzd" Dec 01 08:36:47 crc kubenswrapper[5004]: I1201 08:36:47.968396 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-nm7jp"] Dec 01 08:36:47 crc kubenswrapper[5004]: I1201 08:36:47.975097 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-nm7jp" Dec 01 08:36:47 crc kubenswrapper[5004]: I1201 08:36:47.981293 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-l9rgg" Dec 01 08:36:47 crc kubenswrapper[5004]: I1201 08:36:47.988882 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-78524" Dec 01 08:36:47 crc kubenswrapper[5004]: I1201 08:36:47.999228 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-wqpmp"] Dec 01 08:36:48 crc kubenswrapper[5004]: I1201 08:36:48.000599 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-998648c74-wqpmp" Dec 01 08:36:48 crc kubenswrapper[5004]: I1201 08:36:48.002925 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-6546668bfd-6l4k4" Dec 01 08:36:48 crc kubenswrapper[5004]: I1201 08:36:48.003433 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-rvs8q" Dec 01 08:36:48 crc kubenswrapper[5004]: I1201 08:36:48.003884 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-vv9hf" Dec 01 08:36:48 crc kubenswrapper[5004]: I1201 08:36:48.015613 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-55g57" Dec 01 08:36:48 crc kubenswrapper[5004]: I1201 08:36:48.023319 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srjdx\" (UniqueName: \"kubernetes.io/projected/616c962a-6fda-4c1b-a377-51d721a17616-kube-api-access-srjdx\") pod \"nova-operator-controller-manager-697bc559fc-nm7jp\" (UID: \"616c962a-6fda-4c1b-a377-51d721a17616\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-nm7jp" Dec 01 08:36:48 crc kubenswrapper[5004]: I1201 08:36:48.023431 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hw2qx\" (UniqueName: \"kubernetes.io/projected/67bfafa3-790f-4b23-8bef-8b5da60bf6dc-kube-api-access-hw2qx\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-pk7mn\" (UID: \"67bfafa3-790f-4b23-8bef-8b5da60bf6dc\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-pk7mn" Dec 01 08:36:48 crc kubenswrapper[5004]: I1201 08:36:48.023491 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rn2l2\" (UniqueName: \"kubernetes.io/projected/d418620e-19a8-4171-94f7-2dba61ca8b6a-kube-api-access-rn2l2\") pod \"octavia-operator-controller-manager-998648c74-wqpmp\" (UID: \"d418620e-19a8-4171-94f7-2dba61ca8b6a\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-wqpmp" Dec 01 08:36:48 crc kubenswrapper[5004]: I1201 08:36:48.024770 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-wqpmp"] Dec 01 08:36:48 crc kubenswrapper[5004]: I1201 08:36:48.037709 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-nm7jp"] Dec 01 08:36:48 crc kubenswrapper[5004]: I1201 08:36:48.043888 5004 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hw2qx\" (UniqueName: \"kubernetes.io/projected/67bfafa3-790f-4b23-8bef-8b5da60bf6dc-kube-api-access-hw2qx\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-pk7mn\" (UID: \"67bfafa3-790f-4b23-8bef-8b5da60bf6dc\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-pk7mn" Dec 01 08:36:48 crc kubenswrapper[5004]: I1201 08:36:48.067431 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd42mdlz"] Dec 01 08:36:48 crc kubenswrapper[5004]: I1201 08:36:48.068892 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd42mdlz" Dec 01 08:36:48 crc kubenswrapper[5004]: I1201 08:36:48.077346 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd42mdlz"] Dec 01 08:36:48 crc kubenswrapper[5004]: I1201 08:36:48.079381 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Dec 01 08:36:48 crc kubenswrapper[5004]: I1201 08:36:48.079789 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-rb86h" Dec 01 08:36:48 crc kubenswrapper[5004]: I1201 08:36:48.099059 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-tnmn9"] Dec 01 08:36:48 crc kubenswrapper[5004]: I1201 08:36:48.100455 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-tnmn9" Dec 01 08:36:48 crc kubenswrapper[5004]: I1201 08:36:48.105600 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-59dzd" Dec 01 08:36:48 crc kubenswrapper[5004]: I1201 08:36:48.107791 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-p468t"] Dec 01 08:36:48 crc kubenswrapper[5004]: I1201 08:36:48.110715 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-7vqp6" Dec 01 08:36:48 crc kubenswrapper[5004]: I1201 08:36:48.124339 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rn2l2\" (UniqueName: \"kubernetes.io/projected/d418620e-19a8-4171-94f7-2dba61ca8b6a-kube-api-access-rn2l2\") pod \"octavia-operator-controller-manager-998648c74-wqpmp\" (UID: \"d418620e-19a8-4171-94f7-2dba61ca8b6a\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-wqpmp" Dec 01 08:36:48 crc kubenswrapper[5004]: I1201 08:36:48.124408 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrvxn\" (UniqueName: \"kubernetes.io/projected/fde3e479-59b7-4b8b-82c8-38b346fd3409-kube-api-access-wrvxn\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd42mdlz\" (UID: \"fde3e479-59b7-4b8b-82c8-38b346fd3409\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd42mdlz" Dec 01 08:36:48 crc kubenswrapper[5004]: I1201 08:36:48.124440 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-srjdx\" (UniqueName: \"kubernetes.io/projected/616c962a-6fda-4c1b-a377-51d721a17616-kube-api-access-srjdx\") pod \"nova-operator-controller-manager-697bc559fc-nm7jp\" (UID: \"616c962a-6fda-4c1b-a377-51d721a17616\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-nm7jp" Dec 01 08:36:48 crc kubenswrapper[5004]: I1201 08:36:48.124487 5004 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fde3e479-59b7-4b8b-82c8-38b346fd3409-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd42mdlz\" (UID: \"fde3e479-59b7-4b8b-82c8-38b346fd3409\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd42mdlz" Dec 01 08:36:48 crc kubenswrapper[5004]: I1201 08:36:48.124549 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fskwh\" (UniqueName: \"kubernetes.io/projected/bff56810-ae93-4fca-a568-cc88e971c1d8-kube-api-access-fskwh\") pod \"ovn-operator-controller-manager-b6456fdb6-tnmn9\" (UID: \"bff56810-ae93-4fca-a568-cc88e971c1d8\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-tnmn9" Dec 01 08:36:48 crc kubenswrapper[5004]: I1201 08:36:48.124590 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f87b36f7-2558-4823-85fc-6b6e9090b1d7-cert\") pod \"infra-operator-controller-manager-57548d458d-hzgf4\" (UID: \"f87b36f7-2558-4823-85fc-6b6e9090b1d7\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-hzgf4" Dec 01 08:36:48 crc kubenswrapper[5004]: E1201 08:36:48.125202 5004 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 01 08:36:48 crc kubenswrapper[5004]: E1201 08:36:48.125242 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f87b36f7-2558-4823-85fc-6b6e9090b1d7-cert podName:f87b36f7-2558-4823-85fc-6b6e9090b1d7 nodeName:}" failed. No retries permitted until 2025-12-01 08:36:49.125228025 +0000 UTC m=+1186.690220007 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f87b36f7-2558-4823-85fc-6b6e9090b1d7-cert") pod "infra-operator-controller-manager-57548d458d-hzgf4" (UID: "f87b36f7-2558-4823-85fc-6b6e9090b1d7") : secret "infra-operator-webhook-server-cert" not found Dec 01 08:36:48 crc kubenswrapper[5004]: I1201 08:36:48.127533 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-tnmn9"] Dec 01 08:36:48 crc kubenswrapper[5004]: I1201 08:36:48.127586 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-p468t"] Dec 01 08:36:48 crc kubenswrapper[5004]: I1201 08:36:48.127666 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-78f8948974-p468t" Dec 01 08:36:48 crc kubenswrapper[5004]: I1201 08:36:48.130873 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-69mhd" Dec 01 08:36:48 crc kubenswrapper[5004]: I1201 08:36:48.143154 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-c96bj"] Dec 01 08:36:48 crc kubenswrapper[5004]: I1201 08:36:48.144597 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-c96bj" Dec 01 08:36:48 crc kubenswrapper[5004]: I1201 08:36:48.146740 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-b9ghj" Dec 01 08:36:48 crc kubenswrapper[5004]: I1201 08:36:48.146883 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-srjdx\" (UniqueName: \"kubernetes.io/projected/616c962a-6fda-4c1b-a377-51d721a17616-kube-api-access-srjdx\") pod \"nova-operator-controller-manager-697bc559fc-nm7jp\" (UID: \"616c962a-6fda-4c1b-a377-51d721a17616\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-nm7jp" Dec 01 08:36:48 crc kubenswrapper[5004]: I1201 08:36:48.153471 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rn2l2\" (UniqueName: \"kubernetes.io/projected/d418620e-19a8-4171-94f7-2dba61ca8b6a-kube-api-access-rn2l2\") pod \"octavia-operator-controller-manager-998648c74-wqpmp\" (UID: \"d418620e-19a8-4171-94f7-2dba61ca8b6a\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-wqpmp" Dec 01 08:36:48 crc kubenswrapper[5004]: I1201 08:36:48.154400 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-c96bj"] Dec 01 08:36:48 crc kubenswrapper[5004]: I1201 08:36:48.183972 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-bcd9b8768-5phd6"] Dec 01 08:36:48 crc kubenswrapper[5004]: I1201 08:36:48.189303 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-bcd9b8768-5phd6" Dec 01 08:36:48 crc kubenswrapper[5004]: I1201 08:36:48.192753 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-c22f7" Dec 01 08:36:48 crc kubenswrapper[5004]: I1201 08:36:48.206465 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-8n2qh"] Dec 01 08:36:48 crc kubenswrapper[5004]: I1201 08:36:48.207807 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5854674fcc-8n2qh" Dec 01 08:36:48 crc kubenswrapper[5004]: I1201 08:36:48.213206 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-bcd9b8768-5phd6"] Dec 01 08:36:48 crc kubenswrapper[5004]: I1201 08:36:48.213705 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-bcsxq" Dec 01 08:36:48 crc kubenswrapper[5004]: I1201 08:36:48.219687 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-8n2qh"] Dec 01 08:36:48 crc kubenswrapper[5004]: I1201 08:36:48.226300 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fskwh\" (UniqueName: \"kubernetes.io/projected/bff56810-ae93-4fca-a568-cc88e971c1d8-kube-api-access-fskwh\") pod \"ovn-operator-controller-manager-b6456fdb6-tnmn9\" (UID: \"bff56810-ae93-4fca-a568-cc88e971c1d8\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-tnmn9" Dec 01 08:36:48 crc kubenswrapper[5004]: I1201 08:36:48.226388 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8xff\" (UniqueName: 
\"kubernetes.io/projected/5425cd72-5745-4b0f-ab14-b697c726d75f-kube-api-access-x8xff\") pod \"telemetry-operator-controller-manager-bcd9b8768-5phd6\" (UID: \"5425cd72-5745-4b0f-ab14-b697c726d75f\") " pod="openstack-operators/telemetry-operator-controller-manager-bcd9b8768-5phd6" Dec 01 08:36:48 crc kubenswrapper[5004]: I1201 08:36:48.226415 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fztv8\" (UniqueName: \"kubernetes.io/projected/a1d2c2cd-d1c5-490e-99ee-e0ab5c18ebc4-kube-api-access-fztv8\") pod \"swift-operator-controller-manager-5f8c65bbfc-c96bj\" (UID: \"a1d2c2cd-d1c5-490e-99ee-e0ab5c18ebc4\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-c96bj" Dec 01 08:36:48 crc kubenswrapper[5004]: I1201 08:36:48.226451 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrvxn\" (UniqueName: \"kubernetes.io/projected/fde3e479-59b7-4b8b-82c8-38b346fd3409-kube-api-access-wrvxn\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd42mdlz\" (UID: \"fde3e479-59b7-4b8b-82c8-38b346fd3409\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd42mdlz" Dec 01 08:36:48 crc kubenswrapper[5004]: I1201 08:36:48.226515 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4mrj\" (UniqueName: \"kubernetes.io/projected/9d4afad8-bd75-403d-b4d0-d7f01e1a6e5d-kube-api-access-s4mrj\") pod \"placement-operator-controller-manager-78f8948974-p468t\" (UID: \"9d4afad8-bd75-403d-b4d0-d7f01e1a6e5d\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-p468t" Dec 01 08:36:48 crc kubenswrapper[5004]: I1201 08:36:48.226542 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fde3e479-59b7-4b8b-82c8-38b346fd3409-cert\") pod 
\"openstack-baremetal-operator-controller-manager-64bc77cfd42mdlz\" (UID: \"fde3e479-59b7-4b8b-82c8-38b346fd3409\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd42mdlz" Dec 01 08:36:48 crc kubenswrapper[5004]: I1201 08:36:48.226593 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjqmm\" (UniqueName: \"kubernetes.io/projected/bfa6d181-b802-48de-8c57-4b8b7a8f1e07-kube-api-access-hjqmm\") pod \"test-operator-controller-manager-5854674fcc-8n2qh\" (UID: \"bfa6d181-b802-48de-8c57-4b8b7a8f1e07\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-8n2qh" Dec 01 08:36:48 crc kubenswrapper[5004]: E1201 08:36:48.227095 5004 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 01 08:36:48 crc kubenswrapper[5004]: E1201 08:36:48.227152 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fde3e479-59b7-4b8b-82c8-38b346fd3409-cert podName:fde3e479-59b7-4b8b-82c8-38b346fd3409 nodeName:}" failed. No retries permitted until 2025-12-01 08:36:48.72713469 +0000 UTC m=+1186.292126672 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fde3e479-59b7-4b8b-82c8-38b346fd3409-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd42mdlz" (UID: "fde3e479-59b7-4b8b-82c8-38b346fd3409") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 01 08:36:48 crc kubenswrapper[5004]: I1201 08:36:48.227484 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-769dc69bc-nfdnl"] Dec 01 08:36:48 crc kubenswrapper[5004]: I1201 08:36:48.228972 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-nfdnl" Dec 01 08:36:48 crc kubenswrapper[5004]: I1201 08:36:48.233269 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-7gm9g" Dec 01 08:36:48 crc kubenswrapper[5004]: I1201 08:36:48.247999 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-769dc69bc-nfdnl"] Dec 01 08:36:48 crc kubenswrapper[5004]: I1201 08:36:48.255645 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrvxn\" (UniqueName: \"kubernetes.io/projected/fde3e479-59b7-4b8b-82c8-38b346fd3409-kube-api-access-wrvxn\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd42mdlz\" (UID: \"fde3e479-59b7-4b8b-82c8-38b346fd3409\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd42mdlz" Dec 01 08:36:48 crc kubenswrapper[5004]: I1201 08:36:48.260684 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fskwh\" (UniqueName: \"kubernetes.io/projected/bff56810-ae93-4fca-a568-cc88e971c1d8-kube-api-access-fskwh\") pod \"ovn-operator-controller-manager-b6456fdb6-tnmn9\" (UID: \"bff56810-ae93-4fca-a568-cc88e971c1d8\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-tnmn9" Dec 01 08:36:48 crc kubenswrapper[5004]: I1201 08:36:48.274986 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6c4968b65-xg6h2"] Dec 01 08:36:48 crc kubenswrapper[5004]: I1201 08:36:48.277279 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-6c4968b65-xg6h2" Dec 01 08:36:48 crc kubenswrapper[5004]: I1201 08:36:48.296190 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-txll4" Dec 01 08:36:48 crc kubenswrapper[5004]: I1201 08:36:48.296413 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Dec 01 08:36:48 crc kubenswrapper[5004]: I1201 08:36:48.297130 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Dec 01 08:36:48 crc kubenswrapper[5004]: I1201 08:36:48.300011 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6c4968b65-xg6h2"] Dec 01 08:36:48 crc kubenswrapper[5004]: I1201 08:36:48.332768 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvwlf\" (UniqueName: \"kubernetes.io/projected/f1d1796c-7fa3-4a90-bfb2-cc257a69ba58-kube-api-access-dvwlf\") pod \"watcher-operator-controller-manager-769dc69bc-nfdnl\" (UID: \"f1d1796c-7fa3-4a90-bfb2-cc257a69ba58\") " pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-nfdnl" Dec 01 08:36:48 crc kubenswrapper[5004]: I1201 08:36:48.332848 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/38bf4275-c95e-4b2d-88fe-aeace2e41983-metrics-certs\") pod \"openstack-operator-controller-manager-6c4968b65-xg6h2\" (UID: \"38bf4275-c95e-4b2d-88fe-aeace2e41983\") " pod="openstack-operators/openstack-operator-controller-manager-6c4968b65-xg6h2" Dec 01 08:36:48 crc kubenswrapper[5004]: I1201 08:36:48.332914 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4mrj\" (UniqueName: 
\"kubernetes.io/projected/9d4afad8-bd75-403d-b4d0-d7f01e1a6e5d-kube-api-access-s4mrj\") pod \"placement-operator-controller-manager-78f8948974-p468t\" (UID: \"9d4afad8-bd75-403d-b4d0-d7f01e1a6e5d\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-p468t" Dec 01 08:36:48 crc kubenswrapper[5004]: I1201 08:36:48.332951 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/38bf4275-c95e-4b2d-88fe-aeace2e41983-webhook-certs\") pod \"openstack-operator-controller-manager-6c4968b65-xg6h2\" (UID: \"38bf4275-c95e-4b2d-88fe-aeace2e41983\") " pod="openstack-operators/openstack-operator-controller-manager-6c4968b65-xg6h2" Dec 01 08:36:48 crc kubenswrapper[5004]: I1201 08:36:48.333017 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hjqmm\" (UniqueName: \"kubernetes.io/projected/bfa6d181-b802-48de-8c57-4b8b7a8f1e07-kube-api-access-hjqmm\") pod \"test-operator-controller-manager-5854674fcc-8n2qh\" (UID: \"bfa6d181-b802-48de-8c57-4b8b7a8f1e07\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-8n2qh" Dec 01 08:36:48 crc kubenswrapper[5004]: I1201 08:36:48.333185 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x8xff\" (UniqueName: \"kubernetes.io/projected/5425cd72-5745-4b0f-ab14-b697c726d75f-kube-api-access-x8xff\") pod \"telemetry-operator-controller-manager-bcd9b8768-5phd6\" (UID: \"5425cd72-5745-4b0f-ab14-b697c726d75f\") " pod="openstack-operators/telemetry-operator-controller-manager-bcd9b8768-5phd6" Dec 01 08:36:48 crc kubenswrapper[5004]: I1201 08:36:48.333212 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fztv8\" (UniqueName: \"kubernetes.io/projected/a1d2c2cd-d1c5-490e-99ee-e0ab5c18ebc4-kube-api-access-fztv8\") pod \"swift-operator-controller-manager-5f8c65bbfc-c96bj\" (UID: 
\"a1d2c2cd-d1c5-490e-99ee-e0ab5c18ebc4\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-c96bj" Dec 01 08:36:48 crc kubenswrapper[5004]: I1201 08:36:48.333246 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xr5wj\" (UniqueName: \"kubernetes.io/projected/38bf4275-c95e-4b2d-88fe-aeace2e41983-kube-api-access-xr5wj\") pod \"openstack-operator-controller-manager-6c4968b65-xg6h2\" (UID: \"38bf4275-c95e-4b2d-88fe-aeace2e41983\") " pod="openstack-operators/openstack-operator-controller-manager-6c4968b65-xg6h2" Dec 01 08:36:48 crc kubenswrapper[5004]: I1201 08:36:48.334859 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-nm7jp" Dec 01 08:36:48 crc kubenswrapper[5004]: I1201 08:36:48.342348 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-pk7mn" Dec 01 08:36:48 crc kubenswrapper[5004]: I1201 08:36:48.357191 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4mrj\" (UniqueName: \"kubernetes.io/projected/9d4afad8-bd75-403d-b4d0-d7f01e1a6e5d-kube-api-access-s4mrj\") pod \"placement-operator-controller-manager-78f8948974-p468t\" (UID: \"9d4afad8-bd75-403d-b4d0-d7f01e1a6e5d\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-p468t" Dec 01 08:36:48 crc kubenswrapper[5004]: I1201 08:36:48.358092 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-998648c74-wqpmp" Dec 01 08:36:48 crc kubenswrapper[5004]: I1201 08:36:48.362308 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8xff\" (UniqueName: \"kubernetes.io/projected/5425cd72-5745-4b0f-ab14-b697c726d75f-kube-api-access-x8xff\") pod \"telemetry-operator-controller-manager-bcd9b8768-5phd6\" (UID: \"5425cd72-5745-4b0f-ab14-b697c726d75f\") " pod="openstack-operators/telemetry-operator-controller-manager-bcd9b8768-5phd6" Dec 01 08:36:48 crc kubenswrapper[5004]: I1201 08:36:48.362643 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjqmm\" (UniqueName: \"kubernetes.io/projected/bfa6d181-b802-48de-8c57-4b8b7a8f1e07-kube-api-access-hjqmm\") pod \"test-operator-controller-manager-5854674fcc-8n2qh\" (UID: \"bfa6d181-b802-48de-8c57-4b8b7a8f1e07\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-8n2qh" Dec 01 08:36:48 crc kubenswrapper[5004]: I1201 08:36:48.364460 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fztv8\" (UniqueName: \"kubernetes.io/projected/a1d2c2cd-d1c5-490e-99ee-e0ab5c18ebc4-kube-api-access-fztv8\") pod \"swift-operator-controller-manager-5f8c65bbfc-c96bj\" (UID: \"a1d2c2cd-d1c5-490e-99ee-e0ab5c18ebc4\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-c96bj" Dec 01 08:36:48 crc kubenswrapper[5004]: I1201 08:36:48.366144 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-bcd9b8768-5phd6" Dec 01 08:36:48 crc kubenswrapper[5004]: I1201 08:36:48.389378 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-l4rkg"] Dec 01 08:36:48 crc kubenswrapper[5004]: I1201 08:36:48.391175 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-l4rkg" Dec 01 08:36:48 crc kubenswrapper[5004]: I1201 08:36:48.394842 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-zgdl2" Dec 01 08:36:48 crc kubenswrapper[5004]: I1201 08:36:48.397116 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-l4rkg"] Dec 01 08:36:48 crc kubenswrapper[5004]: I1201 08:36:48.431499 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5854674fcc-8n2qh" Dec 01 08:36:48 crc kubenswrapper[5004]: I1201 08:36:48.448706 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g254f\" (UniqueName: \"kubernetes.io/projected/ed5bf034-cb91-4b02-97a4-c63a8506e527-kube-api-access-g254f\") pod \"rabbitmq-cluster-operator-manager-668c99d594-l4rkg\" (UID: \"ed5bf034-cb91-4b02-97a4-c63a8506e527\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-l4rkg" Dec 01 08:36:48 crc kubenswrapper[5004]: I1201 08:36:48.448907 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xr5wj\" (UniqueName: \"kubernetes.io/projected/38bf4275-c95e-4b2d-88fe-aeace2e41983-kube-api-access-xr5wj\") pod \"openstack-operator-controller-manager-6c4968b65-xg6h2\" (UID: \"38bf4275-c95e-4b2d-88fe-aeace2e41983\") " pod="openstack-operators/openstack-operator-controller-manager-6c4968b65-xg6h2" Dec 01 08:36:48 crc kubenswrapper[5004]: I1201 08:36:48.449021 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvwlf\" (UniqueName: \"kubernetes.io/projected/f1d1796c-7fa3-4a90-bfb2-cc257a69ba58-kube-api-access-dvwlf\") pod \"watcher-operator-controller-manager-769dc69bc-nfdnl\" 
(UID: \"f1d1796c-7fa3-4a90-bfb2-cc257a69ba58\") " pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-nfdnl" Dec 01 08:36:48 crc kubenswrapper[5004]: I1201 08:36:48.449108 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/38bf4275-c95e-4b2d-88fe-aeace2e41983-metrics-certs\") pod \"openstack-operator-controller-manager-6c4968b65-xg6h2\" (UID: \"38bf4275-c95e-4b2d-88fe-aeace2e41983\") " pod="openstack-operators/openstack-operator-controller-manager-6c4968b65-xg6h2" Dec 01 08:36:48 crc kubenswrapper[5004]: I1201 08:36:48.449237 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/38bf4275-c95e-4b2d-88fe-aeace2e41983-webhook-certs\") pod \"openstack-operator-controller-manager-6c4968b65-xg6h2\" (UID: \"38bf4275-c95e-4b2d-88fe-aeace2e41983\") " pod="openstack-operators/openstack-operator-controller-manager-6c4968b65-xg6h2" Dec 01 08:36:48 crc kubenswrapper[5004]: E1201 08:36:48.449450 5004 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 01 08:36:48 crc kubenswrapper[5004]: E1201 08:36:48.449550 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/38bf4275-c95e-4b2d-88fe-aeace2e41983-webhook-certs podName:38bf4275-c95e-4b2d-88fe-aeace2e41983 nodeName:}" failed. No retries permitted until 2025-12-01 08:36:48.949535294 +0000 UTC m=+1186.514527276 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/38bf4275-c95e-4b2d-88fe-aeace2e41983-webhook-certs") pod "openstack-operator-controller-manager-6c4968b65-xg6h2" (UID: "38bf4275-c95e-4b2d-88fe-aeace2e41983") : secret "webhook-server-cert" not found Dec 01 08:36:48 crc kubenswrapper[5004]: E1201 08:36:48.450034 5004 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 01 08:36:48 crc kubenswrapper[5004]: E1201 08:36:48.450185 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/38bf4275-c95e-4b2d-88fe-aeace2e41983-metrics-certs podName:38bf4275-c95e-4b2d-88fe-aeace2e41983 nodeName:}" failed. No retries permitted until 2025-12-01 08:36:48.950176349 +0000 UTC m=+1186.515168331 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/38bf4275-c95e-4b2d-88fe-aeace2e41983-metrics-certs") pod "openstack-operator-controller-manager-6c4968b65-xg6h2" (UID: "38bf4275-c95e-4b2d-88fe-aeace2e41983") : secret "metrics-server-cert" not found Dec 01 08:36:48 crc kubenswrapper[5004]: I1201 08:36:48.481450 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvwlf\" (UniqueName: \"kubernetes.io/projected/f1d1796c-7fa3-4a90-bfb2-cc257a69ba58-kube-api-access-dvwlf\") pod \"watcher-operator-controller-manager-769dc69bc-nfdnl\" (UID: \"f1d1796c-7fa3-4a90-bfb2-cc257a69ba58\") " pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-nfdnl" Dec 01 08:36:48 crc kubenswrapper[5004]: I1201 08:36:48.489013 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xr5wj\" (UniqueName: \"kubernetes.io/projected/38bf4275-c95e-4b2d-88fe-aeace2e41983-kube-api-access-xr5wj\") pod \"openstack-operator-controller-manager-6c4968b65-xg6h2\" (UID: \"38bf4275-c95e-4b2d-88fe-aeace2e41983\") " 
pod="openstack-operators/openstack-operator-controller-manager-6c4968b65-xg6h2" Dec 01 08:36:48 crc kubenswrapper[5004]: I1201 08:36:48.537001 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-tnmn9" Dec 01 08:36:48 crc kubenswrapper[5004]: I1201 08:36:48.550796 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g254f\" (UniqueName: \"kubernetes.io/projected/ed5bf034-cb91-4b02-97a4-c63a8506e527-kube-api-access-g254f\") pod \"rabbitmq-cluster-operator-manager-668c99d594-l4rkg\" (UID: \"ed5bf034-cb91-4b02-97a4-c63a8506e527\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-l4rkg" Dec 01 08:36:48 crc kubenswrapper[5004]: I1201 08:36:48.557435 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-c96bj" Dec 01 08:36:48 crc kubenswrapper[5004]: I1201 08:36:48.571472 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g254f\" (UniqueName: \"kubernetes.io/projected/ed5bf034-cb91-4b02-97a4-c63a8506e527-kube-api-access-g254f\") pod \"rabbitmq-cluster-operator-manager-668c99d594-l4rkg\" (UID: \"ed5bf034-cb91-4b02-97a4-c63a8506e527\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-l4rkg" Dec 01 08:36:48 crc kubenswrapper[5004]: I1201 08:36:48.571952 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-78f8948974-p468t" Dec 01 08:36:48 crc kubenswrapper[5004]: I1201 08:36:48.635625 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-rp7vl"] Dec 01 08:36:48 crc kubenswrapper[5004]: I1201 08:36:48.743723 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-nfdnl" Dec 01 08:36:48 crc kubenswrapper[5004]: I1201 08:36:48.755958 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fde3e479-59b7-4b8b-82c8-38b346fd3409-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd42mdlz\" (UID: \"fde3e479-59b7-4b8b-82c8-38b346fd3409\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd42mdlz" Dec 01 08:36:48 crc kubenswrapper[5004]: E1201 08:36:48.756150 5004 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 01 08:36:48 crc kubenswrapper[5004]: E1201 08:36:48.756189 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fde3e479-59b7-4b8b-82c8-38b346fd3409-cert podName:fde3e479-59b7-4b8b-82c8-38b346fd3409 nodeName:}" failed. No retries permitted until 2025-12-01 08:36:49.756176999 +0000 UTC m=+1187.321168981 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fde3e479-59b7-4b8b-82c8-38b346fd3409-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd42mdlz" (UID: "fde3e479-59b7-4b8b-82c8-38b346fd3409") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 01 08:36:48 crc kubenswrapper[5004]: I1201 08:36:48.778183 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-l4rkg" Dec 01 08:36:48 crc kubenswrapper[5004]: I1201 08:36:48.783900 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-668d9c48b9-5c6kq"] Dec 01 08:36:48 crc kubenswrapper[5004]: I1201 08:36:48.959545 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/38bf4275-c95e-4b2d-88fe-aeace2e41983-metrics-certs\") pod \"openstack-operator-controller-manager-6c4968b65-xg6h2\" (UID: \"38bf4275-c95e-4b2d-88fe-aeace2e41983\") " pod="openstack-operators/openstack-operator-controller-manager-6c4968b65-xg6h2" Dec 01 08:36:48 crc kubenswrapper[5004]: I1201 08:36:48.959914 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/38bf4275-c95e-4b2d-88fe-aeace2e41983-webhook-certs\") pod \"openstack-operator-controller-manager-6c4968b65-xg6h2\" (UID: \"38bf4275-c95e-4b2d-88fe-aeace2e41983\") " pod="openstack-operators/openstack-operator-controller-manager-6c4968b65-xg6h2" Dec 01 08:36:48 crc kubenswrapper[5004]: E1201 08:36:48.959708 5004 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 01 08:36:48 crc kubenswrapper[5004]: E1201 08:36:48.960025 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/38bf4275-c95e-4b2d-88fe-aeace2e41983-metrics-certs podName:38bf4275-c95e-4b2d-88fe-aeace2e41983 nodeName:}" failed. No retries permitted until 2025-12-01 08:36:49.960005767 +0000 UTC m=+1187.524997749 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/38bf4275-c95e-4b2d-88fe-aeace2e41983-metrics-certs") pod "openstack-operator-controller-manager-6c4968b65-xg6h2" (UID: "38bf4275-c95e-4b2d-88fe-aeace2e41983") : secret "metrics-server-cert" not found Dec 01 08:36:48 crc kubenswrapper[5004]: E1201 08:36:48.960058 5004 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 01 08:36:48 crc kubenswrapper[5004]: E1201 08:36:48.960109 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/38bf4275-c95e-4b2d-88fe-aeace2e41983-webhook-certs podName:38bf4275-c95e-4b2d-88fe-aeace2e41983 nodeName:}" failed. No retries permitted until 2025-12-01 08:36:49.96009154 +0000 UTC m=+1187.525083522 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/38bf4275-c95e-4b2d-88fe-aeace2e41983-webhook-certs") pod "openstack-operator-controller-manager-6c4968b65-xg6h2" (UID: "38bf4275-c95e-4b2d-88fe-aeace2e41983") : secret "webhook-server-cert" not found Dec 01 08:36:49 crc kubenswrapper[5004]: I1201 08:36:49.167522 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f87b36f7-2558-4823-85fc-6b6e9090b1d7-cert\") pod \"infra-operator-controller-manager-57548d458d-hzgf4\" (UID: \"f87b36f7-2558-4823-85fc-6b6e9090b1d7\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-hzgf4" Dec 01 08:36:49 crc kubenswrapper[5004]: E1201 08:36:49.167726 5004 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 01 08:36:49 crc kubenswrapper[5004]: E1201 08:36:49.167791 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f87b36f7-2558-4823-85fc-6b6e9090b1d7-cert 
podName:f87b36f7-2558-4823-85fc-6b6e9090b1d7 nodeName:}" failed. No retries permitted until 2025-12-01 08:36:51.167776111 +0000 UTC m=+1188.732768093 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f87b36f7-2558-4823-85fc-6b6e9090b1d7-cert") pod "infra-operator-controller-manager-57548d458d-hzgf4" (UID: "f87b36f7-2558-4823-85fc-6b6e9090b1d7") : secret "infra-operator-webhook-server-cert" not found Dec 01 08:36:49 crc kubenswrapper[5004]: I1201 08:36:49.407099 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-vv9hf"] Dec 01 08:36:49 crc kubenswrapper[5004]: I1201 08:36:49.414358 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-78524"] Dec 01 08:36:49 crc kubenswrapper[5004]: I1201 08:36:49.418722 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-lfs45"] Dec 01 08:36:49 crc kubenswrapper[5004]: I1201 08:36:49.435983 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-55g57"] Dec 01 08:36:49 crc kubenswrapper[5004]: I1201 08:36:49.475851 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-6546668bfd-6l4k4"] Dec 01 08:36:49 crc kubenswrapper[5004]: I1201 08:36:49.502026 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-668d9c48b9-5c6kq" event={"ID":"67dcdfb2-70ae-4444-b271-dd83dcb37756","Type":"ContainerStarted","Data":"a5b4cddcdb062e3e60adec2b924dca239ae335ad8c2ab7931c63e36b04f91f4f"} Dec 01 08:36:49 crc kubenswrapper[5004]: I1201 08:36:49.502883 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-lfs45" event={"ID":"c6e6ae59-9f58-4856-b200-d42d1e1e23ed","Type":"ContainerStarted","Data":"4bc6563c7894a4292a1569e1981a285966536536a0875d0a0a69e8a3db676e02"} Dec 01 08:36:49 crc kubenswrapper[5004]: I1201 08:36:49.503916 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-rp7vl" event={"ID":"9bca89aa-3367-4bff-b070-c191fcae5f2f","Type":"ContainerStarted","Data":"1c5bb31b09b8f846c465d45b0e2a7b394d666e20bfcfcc0c01e1a8c661a2e260"} Dec 01 08:36:49 crc kubenswrapper[5004]: I1201 08:36:49.507176 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-78524" event={"ID":"aa866b8d-174b-4fab-a55d-cc2bcdef5526","Type":"ContainerStarted","Data":"05928fade76e1c691df824d5e40e3e0c1777cd3639d8ef1bb651543882eb7084"} Dec 01 08:36:49 crc kubenswrapper[5004]: I1201 08:36:49.518239 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-vv9hf" event={"ID":"3521a18b-f34e-4107-9e34-048a9827a2fe","Type":"ContainerStarted","Data":"de9d1dd0939e313ace586f9482c869cb8e58666f9bb197121530d2bf696464b1"} Dec 01 08:36:49 crc kubenswrapper[5004]: W1201 08:36:49.526305 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8d8f2a7f_1da5_45ca_8dd0_1fa87e3d46fe.slice/crio-c052d27e2d81781e7df009603a90a7aec9537604956442cd45e45f698f9d00b6 WatchSource:0}: Error finding container c052d27e2d81781e7df009603a90a7aec9537604956442cd45e45f698f9d00b6: Status 404 returned error can't find the container with id c052d27e2d81781e7df009603a90a7aec9537604956442cd45e45f698f9d00b6 Dec 01 08:36:49 crc kubenswrapper[5004]: I1201 08:36:49.527731 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-zgpn6"] Dec 01 08:36:49 crc kubenswrapper[5004]: W1201 08:36:49.533972 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb931f322_0c4e_4019_a11e_616c80d1e5f1.slice/crio-410638eb878ca80f05ea2a3f192883caddea5d72c9cc63bde3ceb4c5423dd85d WatchSource:0}: Error finding container 410638eb878ca80f05ea2a3f192883caddea5d72c9cc63bde3ceb4c5423dd85d: Status 404 returned error can't find the container with id 410638eb878ca80f05ea2a3f192883caddea5d72c9cc63bde3ceb4c5423dd85d Dec 01 08:36:49 crc kubenswrapper[5004]: I1201 08:36:49.534264 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-6dpk9"] Dec 01 08:36:49 crc kubenswrapper[5004]: W1201 08:36:49.552257 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod166912d9_e0b0_40b8_8e26_9c86183d7952.slice/crio-c1f04272b52dc67c6ce93ec5741db9684a332c3e602980f3e537e143af89aa1b WatchSource:0}: Error finding container c1f04272b52dc67c6ce93ec5741db9684a332c3e602980f3e537e143af89aa1b: Status 404 returned error can't find the container with id c1f04272b52dc67c6ce93ec5741db9684a332c3e602980f3e537e143af89aa1b Dec 01 08:36:49 crc kubenswrapper[5004]: W1201 08:36:49.737129 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod616c962a_6fda_4c1b_a377_51d721a17616.slice/crio-71f2537cf29b2bd7b3332a3cce8e70916527227c6b370fb5a173e0cdaf6fa01e WatchSource:0}: Error finding container 71f2537cf29b2bd7b3332a3cce8e70916527227c6b370fb5a173e0cdaf6fa01e: Status 404 returned error can't find the container with id 71f2537cf29b2bd7b3332a3cce8e70916527227c6b370fb5a173e0cdaf6fa01e Dec 01 08:36:49 crc kubenswrapper[5004]: I1201 08:36:49.739081 5004 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-wqpmp"] Dec 01 08:36:49 crc kubenswrapper[5004]: W1201 08:36:49.748176 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd418620e_19a8_4171_94f7_2dba61ca8b6a.slice/crio-2687a8a0cb83327f26989226cb0139b49b451ed44316d009406ed93a5b3ec7b7 WatchSource:0}: Error finding container 2687a8a0cb83327f26989226cb0139b49b451ed44316d009406ed93a5b3ec7b7: Status 404 returned error can't find the container with id 2687a8a0cb83327f26989226cb0139b49b451ed44316d009406ed93a5b3ec7b7 Dec 01 08:36:49 crc kubenswrapper[5004]: I1201 08:36:49.749496 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-nm7jp"] Dec 01 08:36:49 crc kubenswrapper[5004]: I1201 08:36:49.761816 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-pk7mn"] Dec 01 08:36:49 crc kubenswrapper[5004]: W1201 08:36:49.766863 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod67bfafa3_790f_4b23_8bef_8b5da60bf6dc.slice/crio-5b51e96f1e9eacc2963611b32cc08626c3689df15d283d166e782e9f59431e9c WatchSource:0}: Error finding container 5b51e96f1e9eacc2963611b32cc08626c3689df15d283d166e782e9f59431e9c: Status 404 returned error can't find the container with id 5b51e96f1e9eacc2963611b32cc08626c3689df15d283d166e782e9f59431e9c Dec 01 08:36:49 crc kubenswrapper[5004]: I1201 08:36:49.770117 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-bcd9b8768-5phd6"] Dec 01 08:36:49 crc kubenswrapper[5004]: I1201 08:36:49.779234 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-546d4bdf48-59dzd"] Dec 01 08:36:49 crc 
kubenswrapper[5004]: W1201 08:36:49.780643 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcb3f7f28_c99e_44d4_b534_83889924b531.slice/crio-6fc4702a48cfb4ca3e868f3b38ffd97832a80cacabc83cf24e01f89d6b4d16e0 WatchSource:0}: Error finding container 6fc4702a48cfb4ca3e868f3b38ffd97832a80cacabc83cf24e01f89d6b4d16e0: Status 404 returned error can't find the container with id 6fc4702a48cfb4ca3e868f3b38ffd97832a80cacabc83cf24e01f89d6b4d16e0 Dec 01 08:36:49 crc kubenswrapper[5004]: I1201 08:36:49.781070 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fde3e479-59b7-4b8b-82c8-38b346fd3409-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd42mdlz\" (UID: \"fde3e479-59b7-4b8b-82c8-38b346fd3409\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd42mdlz" Dec 01 08:36:49 crc kubenswrapper[5004]: E1201 08:36:49.781947 5004 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 01 08:36:49 crc kubenswrapper[5004]: E1201 08:36:49.781995 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fde3e479-59b7-4b8b-82c8-38b346fd3409-cert podName:fde3e479-59b7-4b8b-82c8-38b346fd3409 nodeName:}" failed. No retries permitted until 2025-12-01 08:36:51.781981904 +0000 UTC m=+1189.346973876 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fde3e479-59b7-4b8b-82c8-38b346fd3409-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd42mdlz" (UID: "fde3e479-59b7-4b8b-82c8-38b346fd3409") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 01 08:36:49 crc kubenswrapper[5004]: E1201 08:36:49.789274 5004 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:38.102.83.30:5001/openstack-k8s-operators/telemetry-operator:4b9b5976885dec7b8bba09fe9749f3929a03aa17,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-x8xff,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-bcd9b8768-5phd6_openstack-operators(5425cd72-5745-4b0f-ab14-b697c726d75f): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 01 08:36:49 crc kubenswrapper[5004]: E1201 08:36:49.794553 5004 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-x8xff,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-bcd9b8768-5phd6_openstack-operators(5425cd72-5745-4b0f-ab14-b697c726d75f): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 01 08:36:49 crc kubenswrapper[5004]: E1201 08:36:49.796490 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/telemetry-operator-controller-manager-bcd9b8768-5phd6" podUID="5425cd72-5745-4b0f-ab14-b697c726d75f" Dec 01 08:36:49 crc kubenswrapper[5004]: I1201 08:36:49.974371 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-p468t"] Dec 01 08:36:49 crc kubenswrapper[5004]: I1201 08:36:49.984887 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/38bf4275-c95e-4b2d-88fe-aeace2e41983-metrics-certs\") pod 
\"openstack-operator-controller-manager-6c4968b65-xg6h2\" (UID: \"38bf4275-c95e-4b2d-88fe-aeace2e41983\") " pod="openstack-operators/openstack-operator-controller-manager-6c4968b65-xg6h2" Dec 01 08:36:49 crc kubenswrapper[5004]: I1201 08:36:49.984966 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/38bf4275-c95e-4b2d-88fe-aeace2e41983-webhook-certs\") pod \"openstack-operator-controller-manager-6c4968b65-xg6h2\" (UID: \"38bf4275-c95e-4b2d-88fe-aeace2e41983\") " pod="openstack-operators/openstack-operator-controller-manager-6c4968b65-xg6h2" Dec 01 08:36:49 crc kubenswrapper[5004]: E1201 08:36:49.985209 5004 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 01 08:36:49 crc kubenswrapper[5004]: E1201 08:36:49.985273 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/38bf4275-c95e-4b2d-88fe-aeace2e41983-webhook-certs podName:38bf4275-c95e-4b2d-88fe-aeace2e41983 nodeName:}" failed. No retries permitted until 2025-12-01 08:36:51.985251179 +0000 UTC m=+1189.550243161 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/38bf4275-c95e-4b2d-88fe-aeace2e41983-webhook-certs") pod "openstack-operator-controller-manager-6c4968b65-xg6h2" (UID: "38bf4275-c95e-4b2d-88fe-aeace2e41983") : secret "webhook-server-cert" not found Dec 01 08:36:49 crc kubenswrapper[5004]: E1201 08:36:49.985400 5004 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 01 08:36:49 crc kubenswrapper[5004]: E1201 08:36:49.985527 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/38bf4275-c95e-4b2d-88fe-aeace2e41983-metrics-certs podName:38bf4275-c95e-4b2d-88fe-aeace2e41983 nodeName:}" failed. 
No retries permitted until 2025-12-01 08:36:51.985501815 +0000 UTC m=+1189.550493867 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/38bf4275-c95e-4b2d-88fe-aeace2e41983-metrics-certs") pod "openstack-operator-controller-manager-6c4968b65-xg6h2" (UID: "38bf4275-c95e-4b2d-88fe-aeace2e41983") : secret "metrics-server-cert" not found Dec 01 08:36:49 crc kubenswrapper[5004]: W1201 08:36:49.989311 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d4afad8_bd75_403d_b4d0_d7f01e1a6e5d.slice/crio-21a4fb150900977ddc5c0c8c922dc32a33d16a69177b287915b0b9577eeee20e WatchSource:0}: Error finding container 21a4fb150900977ddc5c0c8c922dc32a33d16a69177b287915b0b9577eeee20e: Status 404 returned error can't find the container with id 21a4fb150900977ddc5c0c8c922dc32a33d16a69177b287915b0b9577eeee20e Dec 01 08:36:49 crc kubenswrapper[5004]: I1201 08:36:49.990918 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-l4rkg"] Dec 01 08:36:49 crc kubenswrapper[5004]: W1201 08:36:49.999353 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poded5bf034_cb91_4b02_97a4_c63a8506e527.slice/crio-2a93b77b14f5af825f1dd99fcf228311616ca6ec460f89384a6b7b7c3c4478cc WatchSource:0}: Error finding container 2a93b77b14f5af825f1dd99fcf228311616ca6ec460f89384a6b7b7c3c4478cc: Status 404 returned error can't find the container with id 2a93b77b14f5af825f1dd99fcf228311616ca6ec460f89384a6b7b7c3c4478cc Dec 01 08:36:50 crc kubenswrapper[5004]: E1201 08:36:50.002491 5004 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-g254f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-l4rkg_openstack-operators(ed5bf034-cb91-4b02-97a4-c63a8506e527): 
ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 01 08:36:50 crc kubenswrapper[5004]: W1201 08:36:50.002867 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbff56810_ae93_4fca_a568_cc88e971c1d8.slice/crio-9e9eba61e735fed03ddfa700796ad8891bdf11d674d01bf5a49fe41443b6305f WatchSource:0}: Error finding container 9e9eba61e735fed03ddfa700796ad8891bdf11d674d01bf5a49fe41443b6305f: Status 404 returned error can't find the container with id 9e9eba61e735fed03ddfa700796ad8891bdf11d674d01bf5a49fe41443b6305f Dec 01 08:36:50 crc kubenswrapper[5004]: E1201 08:36:50.004330 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-l4rkg" podUID="ed5bf034-cb91-4b02-97a4-c63a8506e527" Dec 01 08:36:50 crc kubenswrapper[5004]: E1201 08:36:50.006288 5004 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-fskwh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-b6456fdb6-tnmn9_openstack-operators(bff56810-ae93-4fca-a568-cc88e971c1d8): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 01 08:36:50 crc kubenswrapper[5004]: I1201 08:36:50.007140 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-tnmn9"] Dec 01 08:36:50 crc kubenswrapper[5004]: E1201 08:36:50.012781 5004 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-fskwh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-b6456fdb6-tnmn9_openstack-operators(bff56810-ae93-4fca-a568-cc88e971c1d8): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 01 08:36:50 crc kubenswrapper[5004]: E1201 08:36:50.014230 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-tnmn9" 
podUID="bff56810-ae93-4fca-a568-cc88e971c1d8" Dec 01 08:36:50 crc kubenswrapper[5004]: W1201 08:36:50.017045 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1d1796c_7fa3_4a90_bfb2_cc257a69ba58.slice/crio-5a865c04fb8509e591756ade05fe07072b87958d3252ad340525a0e05cc089f4 WatchSource:0}: Error finding container 5a865c04fb8509e591756ade05fe07072b87958d3252ad340525a0e05cc089f4: Status 404 returned error can't find the container with id 5a865c04fb8509e591756ade05fe07072b87958d3252ad340525a0e05cc089f4 Dec 01 08:36:50 crc kubenswrapper[5004]: E1201 08:36:50.027737 5004 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hjqmm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5854674fcc-8n2qh_openstack-operators(bfa6d181-b802-48de-8c57-4b8b7a8f1e07): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 01 08:36:50 crc kubenswrapper[5004]: E1201 08:36:50.028036 5004 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:9aa8c03633e4b934c57868c1660acf47e7d386ac86bcb344df262c9ad76b8621,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-dvwlf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-769dc69bc-nfdnl_openstack-operators(f1d1796c-7fa3-4a90-bfb2-cc257a69ba58): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 01 08:36:50 crc kubenswrapper[5004]: E1201 08:36:50.030470 5004 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-dvwlf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-769dc69bc-nfdnl_openstack-operators(f1d1796c-7fa3-4a90-bfb2-cc257a69ba58): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 01 08:36:50 crc kubenswrapper[5004]: E1201 08:36:50.031028 5004 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hjqmm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5854674fcc-8n2qh_openstack-operators(bfa6d181-b802-48de-8c57-4b8b7a8f1e07): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 01 08:36:50 crc kubenswrapper[5004]: E1201 08:36:50.034393 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-nfdnl" podUID="f1d1796c-7fa3-4a90-bfb2-cc257a69ba58" Dec 01 08:36:50 crc kubenswrapper[5004]: E1201 08:36:50.034482 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/test-operator-controller-manager-5854674fcc-8n2qh" podUID="bfa6d181-b802-48de-8c57-4b8b7a8f1e07" Dec 01 08:36:50 crc kubenswrapper[5004]: W1201 08:36:50.030811 5004 manager.go:1169] 
Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda1d2c2cd_d1c5_490e_99ee_e0ab5c18ebc4.slice/crio-39b4d4cf574ac882309928d830822325a3da472f7720512055150f095d33d6b6 WatchSource:0}: Error finding container 39b4d4cf574ac882309928d830822325a3da472f7720512055150f095d33d6b6: Status 404 returned error can't find the container with id 39b4d4cf574ac882309928d830822325a3da472f7720512055150f095d33d6b6 Dec 01 08:36:50 crc kubenswrapper[5004]: I1201 08:36:50.037654 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-769dc69bc-nfdnl"] Dec 01 08:36:50 crc kubenswrapper[5004]: E1201 08:36:50.038491 5004 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:2a3d21728a8bfb4e64617e63e61e2d1cb70a383ea3e8f846e0c3c3c02d2b0a9d,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-fztv8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-5f8c65bbfc-c96bj_openstack-operators(a1d2c2cd-d1c5-490e-99ee-e0ab5c18ebc4): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 01 08:36:50 crc kubenswrapper[5004]: E1201 08:36:50.041434 5004 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-fztv8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-5f8c65bbfc-c96bj_openstack-operators(a1d2c2cd-d1c5-490e-99ee-e0ab5c18ebc4): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 01 08:36:50 crc kubenswrapper[5004]: E1201 08:36:50.043854 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-c96bj" podUID="a1d2c2cd-d1c5-490e-99ee-e0ab5c18ebc4" Dec 01 08:36:50 crc kubenswrapper[5004]: I1201 08:36:50.051807 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-8n2qh"] Dec 01 08:36:50 crc kubenswrapper[5004]: I1201 08:36:50.059164 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-c96bj"] Dec 01 08:36:50 crc kubenswrapper[5004]: I1201 08:36:50.529829 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-c96bj" event={"ID":"a1d2c2cd-d1c5-490e-99ee-e0ab5c18ebc4","Type":"ContainerStarted","Data":"39b4d4cf574ac882309928d830822325a3da472f7720512055150f095d33d6b6"} Dec 01 08:36:50 crc kubenswrapper[5004]: I1201 08:36:50.532436 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-pk7mn" event={"ID":"67bfafa3-790f-4b23-8bef-8b5da60bf6dc","Type":"ContainerStarted","Data":"5b51e96f1e9eacc2963611b32cc08626c3689df15d283d166e782e9f59431e9c"} Dec 01 08:36:50 crc kubenswrapper[5004]: E1201 08:36:50.534125 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:2a3d21728a8bfb4e64617e63e61e2d1cb70a383ea3e8f846e0c3c3c02d2b0a9d\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-c96bj" podUID="a1d2c2cd-d1c5-490e-99ee-e0ab5c18ebc4" Dec 01 08:36:50 crc kubenswrapper[5004]: I1201 08:36:50.534742 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-zgpn6" event={"ID":"b931f322-0c4e-4019-a11e-616c80d1e5f1","Type":"ContainerStarted","Data":"410638eb878ca80f05ea2a3f192883caddea5d72c9cc63bde3ceb4c5423dd85d"} Dec 01 08:36:50 crc kubenswrapper[5004]: I1201 08:36:50.536510 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-l4rkg" event={"ID":"ed5bf034-cb91-4b02-97a4-c63a8506e527","Type":"ContainerStarted","Data":"2a93b77b14f5af825f1dd99fcf228311616ca6ec460f89384a6b7b7c3c4478cc"} Dec 01 08:36:50 crc kubenswrapper[5004]: E1201 08:36:50.538085 5004 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-l4rkg" podUID="ed5bf034-cb91-4b02-97a4-c63a8506e527" Dec 01 08:36:50 crc kubenswrapper[5004]: I1201 08:36:50.539506 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-55g57" event={"ID":"24fb1ec9-065a-464d-9797-8020c38f81e8","Type":"ContainerStarted","Data":"92041626f6cda3dfb61dff07b300a973278db84f15b512ada8117703cff06b35"} Dec 01 08:36:50 crc kubenswrapper[5004]: I1201 08:36:50.541039 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-8n2qh" event={"ID":"bfa6d181-b802-48de-8c57-4b8b7a8f1e07","Type":"ContainerStarted","Data":"ae9e26a5930d754d18eee60fed25e8d1410e9e1e2073289e8b11ddb8680f72e6"} Dec 01 08:36:50 crc kubenswrapper[5004]: E1201 08:36:50.542889 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/test-operator-controller-manager-5854674fcc-8n2qh" podUID="bfa6d181-b802-48de-8c57-4b8b7a8f1e07" Dec 01 08:36:50 crc kubenswrapper[5004]: I1201 08:36:50.543211 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-59dzd" 
event={"ID":"cb3f7f28-c99e-44d4-b534-83889924b531","Type":"ContainerStarted","Data":"6fc4702a48cfb4ca3e868f3b38ffd97832a80cacabc83cf24e01f89d6b4d16e0"} Dec 01 08:36:50 crc kubenswrapper[5004]: I1201 08:36:50.544435 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-6dpk9" event={"ID":"166912d9-e0b0-40b8-8e26-9c86183d7952","Type":"ContainerStarted","Data":"c1f04272b52dc67c6ce93ec5741db9684a332c3e602980f3e537e143af89aa1b"} Dec 01 08:36:50 crc kubenswrapper[5004]: I1201 08:36:50.548503 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-bcd9b8768-5phd6" event={"ID":"5425cd72-5745-4b0f-ab14-b697c726d75f","Type":"ContainerStarted","Data":"f1f658e3d526719f47534f3511460e9f4176d487513a993a8551c1ccf687167d"} Dec 01 08:36:50 crc kubenswrapper[5004]: E1201 08:36:50.552089 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.30:5001/openstack-k8s-operators/telemetry-operator:4b9b5976885dec7b8bba09fe9749f3929a03aa17\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/telemetry-operator-controller-manager-bcd9b8768-5phd6" podUID="5425cd72-5745-4b0f-ab14-b697c726d75f" Dec 01 08:36:50 crc kubenswrapper[5004]: I1201 08:36:50.553636 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-nm7jp" event={"ID":"616c962a-6fda-4c1b-a377-51d721a17616","Type":"ContainerStarted","Data":"71f2537cf29b2bd7b3332a3cce8e70916527227c6b370fb5a173e0cdaf6fa01e"} Dec 01 08:36:50 crc kubenswrapper[5004]: I1201 08:36:50.555077 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/placement-operator-controller-manager-78f8948974-p468t" event={"ID":"9d4afad8-bd75-403d-b4d0-d7f01e1a6e5d","Type":"ContainerStarted","Data":"21a4fb150900977ddc5c0c8c922dc32a33d16a69177b287915b0b9577eeee20e"} Dec 01 08:36:50 crc kubenswrapper[5004]: I1201 08:36:50.556625 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-wqpmp" event={"ID":"d418620e-19a8-4171-94f7-2dba61ca8b6a","Type":"ContainerStarted","Data":"2687a8a0cb83327f26989226cb0139b49b451ed44316d009406ed93a5b3ec7b7"} Dec 01 08:36:50 crc kubenswrapper[5004]: I1201 08:36:50.558462 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-6546668bfd-6l4k4" event={"ID":"8d8f2a7f-1da5-45ca-8dd0-1fa87e3d46fe","Type":"ContainerStarted","Data":"c052d27e2d81781e7df009603a90a7aec9537604956442cd45e45f698f9d00b6"} Dec 01 08:36:50 crc kubenswrapper[5004]: I1201 08:36:50.565480 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-nfdnl" event={"ID":"f1d1796c-7fa3-4a90-bfb2-cc257a69ba58","Type":"ContainerStarted","Data":"5a865c04fb8509e591756ade05fe07072b87958d3252ad340525a0e05cc089f4"} Dec 01 08:36:50 crc kubenswrapper[5004]: I1201 08:36:50.570861 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-tnmn9" event={"ID":"bff56810-ae93-4fca-a568-cc88e971c1d8","Type":"ContainerStarted","Data":"9e9eba61e735fed03ddfa700796ad8891bdf11d674d01bf5a49fe41443b6305f"} Dec 01 08:36:50 crc kubenswrapper[5004]: E1201 08:36:50.575377 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:9aa8c03633e4b934c57868c1660acf47e7d386ac86bcb344df262c9ad76b8621\\\"\", failed to \"StartContainer\" 
for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-nfdnl" podUID="f1d1796c-7fa3-4a90-bfb2-cc257a69ba58" Dec 01 08:36:50 crc kubenswrapper[5004]: E1201 08:36:50.588645 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-tnmn9" podUID="bff56810-ae93-4fca-a568-cc88e971c1d8" Dec 01 08:36:51 crc kubenswrapper[5004]: I1201 08:36:51.226170 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f87b36f7-2558-4823-85fc-6b6e9090b1d7-cert\") pod \"infra-operator-controller-manager-57548d458d-hzgf4\" (UID: \"f87b36f7-2558-4823-85fc-6b6e9090b1d7\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-hzgf4" Dec 01 08:36:51 crc kubenswrapper[5004]: E1201 08:36:51.226353 5004 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 01 08:36:51 crc kubenswrapper[5004]: E1201 08:36:51.226443 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f87b36f7-2558-4823-85fc-6b6e9090b1d7-cert podName:f87b36f7-2558-4823-85fc-6b6e9090b1d7 nodeName:}" failed. No retries permitted until 2025-12-01 08:36:55.226423938 +0000 UTC m=+1192.791415920 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f87b36f7-2558-4823-85fc-6b6e9090b1d7-cert") pod "infra-operator-controller-manager-57548d458d-hzgf4" (UID: "f87b36f7-2558-4823-85fc-6b6e9090b1d7") : secret "infra-operator-webhook-server-cert" not found Dec 01 08:36:51 crc kubenswrapper[5004]: E1201 08:36:51.582389 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-l4rkg" podUID="ed5bf034-cb91-4b02-97a4-c63a8506e527" Dec 01 08:36:51 crc kubenswrapper[5004]: E1201 08:36:51.582918 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:9aa8c03633e4b934c57868c1660acf47e7d386ac86bcb344df262c9ad76b8621\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-nfdnl" podUID="f1d1796c-7fa3-4a90-bfb2-cc257a69ba58" Dec 01 08:36:51 crc kubenswrapper[5004]: E1201 08:36:51.583024 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" 
pod="openstack-operators/test-operator-controller-manager-5854674fcc-8n2qh" podUID="bfa6d181-b802-48de-8c57-4b8b7a8f1e07" Dec 01 08:36:51 crc kubenswrapper[5004]: E1201 08:36:51.583017 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:2a3d21728a8bfb4e64617e63e61e2d1cb70a383ea3e8f846e0c3c3c02d2b0a9d\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-c96bj" podUID="a1d2c2cd-d1c5-490e-99ee-e0ab5c18ebc4" Dec 01 08:36:51 crc kubenswrapper[5004]: E1201 08:36:51.583088 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.30:5001/openstack-k8s-operators/telemetry-operator:4b9b5976885dec7b8bba09fe9749f3929a03aa17\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/telemetry-operator-controller-manager-bcd9b8768-5phd6" podUID="5425cd72-5745-4b0f-ab14-b697c726d75f" Dec 01 08:36:51 crc kubenswrapper[5004]: E1201 08:36:51.583733 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-tnmn9" 
podUID="bff56810-ae93-4fca-a568-cc88e971c1d8" Dec 01 08:36:51 crc kubenswrapper[5004]: I1201 08:36:51.837155 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fde3e479-59b7-4b8b-82c8-38b346fd3409-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd42mdlz\" (UID: \"fde3e479-59b7-4b8b-82c8-38b346fd3409\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd42mdlz" Dec 01 08:36:51 crc kubenswrapper[5004]: E1201 08:36:51.837372 5004 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 01 08:36:51 crc kubenswrapper[5004]: E1201 08:36:51.837927 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fde3e479-59b7-4b8b-82c8-38b346fd3409-cert podName:fde3e479-59b7-4b8b-82c8-38b346fd3409 nodeName:}" failed. No retries permitted until 2025-12-01 08:36:55.837912025 +0000 UTC m=+1193.402904007 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fde3e479-59b7-4b8b-82c8-38b346fd3409-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd42mdlz" (UID: "fde3e479-59b7-4b8b-82c8-38b346fd3409") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 01 08:36:52 crc kubenswrapper[5004]: I1201 08:36:52.040419 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/38bf4275-c95e-4b2d-88fe-aeace2e41983-metrics-certs\") pod \"openstack-operator-controller-manager-6c4968b65-xg6h2\" (UID: \"38bf4275-c95e-4b2d-88fe-aeace2e41983\") " pod="openstack-operators/openstack-operator-controller-manager-6c4968b65-xg6h2" Dec 01 08:36:52 crc kubenswrapper[5004]: I1201 08:36:52.040488 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/38bf4275-c95e-4b2d-88fe-aeace2e41983-webhook-certs\") pod \"openstack-operator-controller-manager-6c4968b65-xg6h2\" (UID: \"38bf4275-c95e-4b2d-88fe-aeace2e41983\") " pod="openstack-operators/openstack-operator-controller-manager-6c4968b65-xg6h2" Dec 01 08:36:52 crc kubenswrapper[5004]: E1201 08:36:52.040706 5004 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 01 08:36:52 crc kubenswrapper[5004]: E1201 08:36:52.040715 5004 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 01 08:36:52 crc kubenswrapper[5004]: E1201 08:36:52.040777 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/38bf4275-c95e-4b2d-88fe-aeace2e41983-webhook-certs podName:38bf4275-c95e-4b2d-88fe-aeace2e41983 nodeName:}" failed. No retries permitted until 2025-12-01 08:36:56.04075545 +0000 UTC m=+1193.605747432 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/38bf4275-c95e-4b2d-88fe-aeace2e41983-webhook-certs") pod "openstack-operator-controller-manager-6c4968b65-xg6h2" (UID: "38bf4275-c95e-4b2d-88fe-aeace2e41983") : secret "webhook-server-cert" not found Dec 01 08:36:52 crc kubenswrapper[5004]: E1201 08:36:52.040808 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/38bf4275-c95e-4b2d-88fe-aeace2e41983-metrics-certs podName:38bf4275-c95e-4b2d-88fe-aeace2e41983 nodeName:}" failed. No retries permitted until 2025-12-01 08:36:56.040786201 +0000 UTC m=+1193.605778273 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/38bf4275-c95e-4b2d-88fe-aeace2e41983-metrics-certs") pod "openstack-operator-controller-manager-6c4968b65-xg6h2" (UID: "38bf4275-c95e-4b2d-88fe-aeace2e41983") : secret "metrics-server-cert" not found Dec 01 08:36:55 crc kubenswrapper[5004]: I1201 08:36:55.299263 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f87b36f7-2558-4823-85fc-6b6e9090b1d7-cert\") pod \"infra-operator-controller-manager-57548d458d-hzgf4\" (UID: \"f87b36f7-2558-4823-85fc-6b6e9090b1d7\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-hzgf4" Dec 01 08:36:55 crc kubenswrapper[5004]: E1201 08:36:55.300393 5004 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 01 08:36:55 crc kubenswrapper[5004]: E1201 08:36:55.300551 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f87b36f7-2558-4823-85fc-6b6e9090b1d7-cert podName:f87b36f7-2558-4823-85fc-6b6e9090b1d7 nodeName:}" failed. No retries permitted until 2025-12-01 08:37:03.300510875 +0000 UTC m=+1200.865502907 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f87b36f7-2558-4823-85fc-6b6e9090b1d7-cert") pod "infra-operator-controller-manager-57548d458d-hzgf4" (UID: "f87b36f7-2558-4823-85fc-6b6e9090b1d7") : secret "infra-operator-webhook-server-cert" not found Dec 01 08:36:55 crc kubenswrapper[5004]: I1201 08:36:55.908426 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fde3e479-59b7-4b8b-82c8-38b346fd3409-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd42mdlz\" (UID: \"fde3e479-59b7-4b8b-82c8-38b346fd3409\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd42mdlz" Dec 01 08:36:55 crc kubenswrapper[5004]: E1201 08:36:55.908871 5004 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 01 08:36:55 crc kubenswrapper[5004]: E1201 08:36:55.909002 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fde3e479-59b7-4b8b-82c8-38b346fd3409-cert podName:fde3e479-59b7-4b8b-82c8-38b346fd3409 nodeName:}" failed. No retries permitted until 2025-12-01 08:37:03.908977179 +0000 UTC m=+1201.473969171 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fde3e479-59b7-4b8b-82c8-38b346fd3409-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd42mdlz" (UID: "fde3e479-59b7-4b8b-82c8-38b346fd3409") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 01 08:36:56 crc kubenswrapper[5004]: I1201 08:36:56.112011 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/38bf4275-c95e-4b2d-88fe-aeace2e41983-metrics-certs\") pod \"openstack-operator-controller-manager-6c4968b65-xg6h2\" (UID: \"38bf4275-c95e-4b2d-88fe-aeace2e41983\") " pod="openstack-operators/openstack-operator-controller-manager-6c4968b65-xg6h2" Dec 01 08:36:56 crc kubenswrapper[5004]: I1201 08:36:56.112131 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/38bf4275-c95e-4b2d-88fe-aeace2e41983-webhook-certs\") pod \"openstack-operator-controller-manager-6c4968b65-xg6h2\" (UID: \"38bf4275-c95e-4b2d-88fe-aeace2e41983\") " pod="openstack-operators/openstack-operator-controller-manager-6c4968b65-xg6h2" Dec 01 08:36:56 crc kubenswrapper[5004]: E1201 08:36:56.112242 5004 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 01 08:36:56 crc kubenswrapper[5004]: E1201 08:36:56.112341 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/38bf4275-c95e-4b2d-88fe-aeace2e41983-metrics-certs podName:38bf4275-c95e-4b2d-88fe-aeace2e41983 nodeName:}" failed. No retries permitted until 2025-12-01 08:37:04.112314727 +0000 UTC m=+1201.677306749 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/38bf4275-c95e-4b2d-88fe-aeace2e41983-metrics-certs") pod "openstack-operator-controller-manager-6c4968b65-xg6h2" (UID: "38bf4275-c95e-4b2d-88fe-aeace2e41983") : secret "metrics-server-cert" not found Dec 01 08:36:56 crc kubenswrapper[5004]: E1201 08:36:56.112350 5004 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 01 08:36:56 crc kubenswrapper[5004]: E1201 08:36:56.112595 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/38bf4275-c95e-4b2d-88fe-aeace2e41983-webhook-certs podName:38bf4275-c95e-4b2d-88fe-aeace2e41983 nodeName:}" failed. No retries permitted until 2025-12-01 08:37:04.112413759 +0000 UTC m=+1201.677405821 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/38bf4275-c95e-4b2d-88fe-aeace2e41983-webhook-certs") pod "openstack-operator-controller-manager-6c4968b65-xg6h2" (UID: "38bf4275-c95e-4b2d-88fe-aeace2e41983") : secret "webhook-server-cert" not found Dec 01 08:37:00 crc kubenswrapper[5004]: E1201 08:37:00.673523 5004 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/cinder-operator@sha256:1d60701214b39cdb0fa70bbe5710f9b131139a9f4b482c2db4058a04daefb801" Dec 01 08:37:00 crc kubenswrapper[5004]: E1201 08:37:00.674253 5004 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/cinder-operator@sha256:1d60701214b39cdb0fa70bbe5710f9b131139a9f4b482c2db4058a04daefb801,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-jqh2f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-operator-controller-manager-859b6ccc6-6dpk9_openstack-operators(166912d9-e0b0-40b8-8e26-9c86183d7952): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 08:37:03 crc kubenswrapper[5004]: I1201 08:37:03.334988 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f87b36f7-2558-4823-85fc-6b6e9090b1d7-cert\") pod \"infra-operator-controller-manager-57548d458d-hzgf4\" (UID: \"f87b36f7-2558-4823-85fc-6b6e9090b1d7\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-hzgf4" Dec 01 08:37:03 crc kubenswrapper[5004]: I1201 08:37:03.342203 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f87b36f7-2558-4823-85fc-6b6e9090b1d7-cert\") pod \"infra-operator-controller-manager-57548d458d-hzgf4\" (UID: \"f87b36f7-2558-4823-85fc-6b6e9090b1d7\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-hzgf4" Dec 01 08:37:03 crc kubenswrapper[5004]: I1201 08:37:03.542906 5004 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-r7kqv" Dec 01 08:37:03 crc kubenswrapper[5004]: I1201 08:37:03.551812 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-57548d458d-hzgf4" Dec 01 08:37:03 crc kubenswrapper[5004]: I1201 08:37:03.945859 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fde3e479-59b7-4b8b-82c8-38b346fd3409-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd42mdlz\" (UID: \"fde3e479-59b7-4b8b-82c8-38b346fd3409\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd42mdlz" Dec 01 08:37:03 crc kubenswrapper[5004]: I1201 08:37:03.950948 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fde3e479-59b7-4b8b-82c8-38b346fd3409-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd42mdlz\" (UID: \"fde3e479-59b7-4b8b-82c8-38b346fd3409\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd42mdlz" Dec 01 08:37:03 crc kubenswrapper[5004]: I1201 08:37:03.995443 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-rb86h" Dec 01 08:37:04 crc kubenswrapper[5004]: I1201 08:37:04.004653 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd42mdlz" Dec 01 08:37:04 crc kubenswrapper[5004]: I1201 08:37:04.148695 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/38bf4275-c95e-4b2d-88fe-aeace2e41983-metrics-certs\") pod \"openstack-operator-controller-manager-6c4968b65-xg6h2\" (UID: \"38bf4275-c95e-4b2d-88fe-aeace2e41983\") " pod="openstack-operators/openstack-operator-controller-manager-6c4968b65-xg6h2" Dec 01 08:37:04 crc kubenswrapper[5004]: I1201 08:37:04.148756 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/38bf4275-c95e-4b2d-88fe-aeace2e41983-webhook-certs\") pod \"openstack-operator-controller-manager-6c4968b65-xg6h2\" (UID: \"38bf4275-c95e-4b2d-88fe-aeace2e41983\") " pod="openstack-operators/openstack-operator-controller-manager-6c4968b65-xg6h2" Dec 01 08:37:04 crc kubenswrapper[5004]: E1201 08:37:04.149040 5004 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 01 08:37:04 crc kubenswrapper[5004]: E1201 08:37:04.149127 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/38bf4275-c95e-4b2d-88fe-aeace2e41983-webhook-certs podName:38bf4275-c95e-4b2d-88fe-aeace2e41983 nodeName:}" failed. No retries permitted until 2025-12-01 08:37:20.149109401 +0000 UTC m=+1217.714101383 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/38bf4275-c95e-4b2d-88fe-aeace2e41983-webhook-certs") pod "openstack-operator-controller-manager-6c4968b65-xg6h2" (UID: "38bf4275-c95e-4b2d-88fe-aeace2e41983") : secret "webhook-server-cert" not found Dec 01 08:37:04 crc kubenswrapper[5004]: I1201 08:37:04.156434 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/38bf4275-c95e-4b2d-88fe-aeace2e41983-metrics-certs\") pod \"openstack-operator-controller-manager-6c4968b65-xg6h2\" (UID: \"38bf4275-c95e-4b2d-88fe-aeace2e41983\") " pod="openstack-operators/openstack-operator-controller-manager-6c4968b65-xg6h2" Dec 01 08:37:08 crc kubenswrapper[5004]: I1201 08:37:08.731393 5004 patch_prober.go:28] interesting pod/machine-config-daemon-fvdgt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 08:37:08 crc kubenswrapper[5004]: I1201 08:37:08.732365 5004 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 08:37:16 crc kubenswrapper[5004]: E1201 08:37:16.941208 5004 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670" Dec 01 08:37:16 crc kubenswrapper[5004]: E1201 08:37:16.942418 5004 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-srjdx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-697bc559fc-nm7jp_openstack-operators(616c962a-6fda-4c1b-a377-51d721a17616): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 08:37:18 crc kubenswrapper[5004]: E1201 08:37:18.400645 5004 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = reading blob sha256:9f4bff248214d12c7254dc3c25ef82bd14ff143e2a06d159f2a8cc1c9e6ef1fd: Get \"https://quay.io/v2/openstack-k8s-operators/rabbitmq-cluster-operator/blobs/sha256:9f4bff248214d12c7254dc3c25ef82bd14ff143e2a06d159f2a8cc1c9e6ef1fd\": context canceled" image="quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2" Dec 01 08:37:18 crc kubenswrapper[5004]: E1201 08:37:18.401501 5004 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-g254f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-l4rkg_openstack-operators(ed5bf034-cb91-4b02-97a4-c63a8506e527): 
ErrImagePull: rpc error: code = Canceled desc = reading blob sha256:9f4bff248214d12c7254dc3c25ef82bd14ff143e2a06d159f2a8cc1c9e6ef1fd: Get \"https://quay.io/v2/openstack-k8s-operators/rabbitmq-cluster-operator/blobs/sha256:9f4bff248214d12c7254dc3c25ef82bd14ff143e2a06d159f2a8cc1c9e6ef1fd\": context canceled" logger="UnhandledError" Dec 01 08:37:18 crc kubenswrapper[5004]: E1201 08:37:18.402934 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = reading blob sha256:9f4bff248214d12c7254dc3c25ef82bd14ff143e2a06d159f2a8cc1c9e6ef1fd: Get \\\"https://quay.io/v2/openstack-k8s-operators/rabbitmq-cluster-operator/blobs/sha256:9f4bff248214d12c7254dc3c25ef82bd14ff143e2a06d159f2a8cc1c9e6ef1fd\\\": context canceled\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-l4rkg" podUID="ed5bf034-cb91-4b02-97a4-c63a8506e527" Dec 01 08:37:18 crc kubenswrapper[5004]: E1201 08:37:18.422013 5004 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:986861e5a0a9954f63581d9d55a30f8057883cefea489415d76257774526eea3" Dec 01 08:37:18 crc kubenswrapper[5004]: E1201 08:37:18.422291 5004 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:986861e5a0a9954f63581d9d55a30f8057883cefea489415d76257774526eea3,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-8jzw8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-546d4bdf48-59dzd_openstack-operators(cb3f7f28-c99e-44d4-b534-83889924b531): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 08:37:18 crc kubenswrapper[5004]: E1201 08:37:18.998684 5004 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/watcher-operator@sha256:9aa8c03633e4b934c57868c1660acf47e7d386ac86bcb344df262c9ad76b8621" Dec 01 08:37:18 crc kubenswrapper[5004]: E1201 08:37:18.998968 5004 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:9aa8c03633e4b934c57868c1660acf47e7d386ac86bcb344df262c9ad76b8621,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-dvwlf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-769dc69bc-nfdnl_openstack-operators(f1d1796c-7fa3-4a90-bfb2-cc257a69ba58): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 08:37:19 crc kubenswrapper[5004]: E1201 08:37:19.549455 5004 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94" Dec 01 08:37:19 crc kubenswrapper[5004]: E1201 08:37:19.549641 5004 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hjqmm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5854674fcc-8n2qh_openstack-operators(bfa6d181-b802-48de-8c57-4b8b7a8f1e07): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 08:37:20 crc kubenswrapper[5004]: E1201 08:37:20.163232 5004 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59" Dec 01 08:37:20 crc kubenswrapper[5004]: E1201 08:37:20.163452 5004 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-fskwh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-b6456fdb6-tnmn9_openstack-operators(bff56810-ae93-4fca-a568-cc88e971c1d8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 08:37:20 crc kubenswrapper[5004]: I1201 08:37:20.205637 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/38bf4275-c95e-4b2d-88fe-aeace2e41983-webhook-certs\") pod \"openstack-operator-controller-manager-6c4968b65-xg6h2\" (UID: \"38bf4275-c95e-4b2d-88fe-aeace2e41983\") " pod="openstack-operators/openstack-operator-controller-manager-6c4968b65-xg6h2" Dec 01 08:37:20 crc kubenswrapper[5004]: I1201 08:37:20.210135 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/38bf4275-c95e-4b2d-88fe-aeace2e41983-webhook-certs\") pod \"openstack-operator-controller-manager-6c4968b65-xg6h2\" (UID: \"38bf4275-c95e-4b2d-88fe-aeace2e41983\") " pod="openstack-operators/openstack-operator-controller-manager-6c4968b65-xg6h2" Dec 01 08:37:20 crc kubenswrapper[5004]: I1201 08:37:20.269428 5004 reflector.go:368] Caches 
populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-txll4" Dec 01 08:37:20 crc kubenswrapper[5004]: I1201 08:37:20.276794 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-6c4968b65-xg6h2" Dec 01 08:37:21 crc kubenswrapper[5004]: I1201 08:37:21.787371 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd42mdlz"] Dec 01 08:37:21 crc kubenswrapper[5004]: I1201 08:37:21.802532 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-hzgf4"] Dec 01 08:37:23 crc kubenswrapper[5004]: I1201 08:37:23.965798 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-55g57" event={"ID":"24fb1ec9-065a-464d-9797-8020c38f81e8","Type":"ContainerStarted","Data":"528f86fce379d2faa62612e55d079512bb33ad92d6aa18aa030b2693434f6d50"} Dec 01 08:37:23 crc kubenswrapper[5004]: I1201 08:37:23.968905 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-rp7vl" event={"ID":"9bca89aa-3367-4bff-b070-c191fcae5f2f","Type":"ContainerStarted","Data":"93cc1c3230d2aa5aa9adbc5194f01d13c098a1aa8dcae53803fdb470ae5613fd"} Dec 01 08:37:23 crc kubenswrapper[5004]: I1201 08:37:23.971444 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-6546668bfd-6l4k4" event={"ID":"8d8f2a7f-1da5-45ca-8dd0-1fa87e3d46fe","Type":"ContainerStarted","Data":"e4a41de03773b9d76e91b0dc2480b3b229c986ce19c90c66a7ba7035cfaa741f"} Dec 01 08:37:23 crc kubenswrapper[5004]: I1201 08:37:23.972573 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-hzgf4" 
event={"ID":"f87b36f7-2558-4823-85fc-6b6e9090b1d7","Type":"ContainerStarted","Data":"8fee685cce176e4824cf07f5e51e36e2ca768799e1684db7ffbaaee3acd22a7c"} Dec 01 08:37:23 crc kubenswrapper[5004]: I1201 08:37:23.980590 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-vv9hf" event={"ID":"3521a18b-f34e-4107-9e34-048a9827a2fe","Type":"ContainerStarted","Data":"21cea6bc6c8a5329359e3b6cf6eb7daa1d23f8f4dd2538727b314243450e2e4b"} Dec 01 08:37:23 crc kubenswrapper[5004]: I1201 08:37:23.981694 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-zgpn6" event={"ID":"b931f322-0c4e-4019-a11e-616c80d1e5f1","Type":"ContainerStarted","Data":"5f45b4ac7796c5ac11d0c37f35fc52f99b7ee3b4ce4a5abf4ef6a1d5837638ce"} Dec 01 08:37:23 crc kubenswrapper[5004]: I1201 08:37:23.982596 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-668d9c48b9-5c6kq" event={"ID":"67dcdfb2-70ae-4444-b271-dd83dcb37756","Type":"ContainerStarted","Data":"05d9742ed86844dde2b1fd3460d232b3428ad51bf7d5b8d632cd5df4b6a5871f"} Dec 01 08:37:23 crc kubenswrapper[5004]: I1201 08:37:23.983257 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd42mdlz" event={"ID":"fde3e479-59b7-4b8b-82c8-38b346fd3409","Type":"ContainerStarted","Data":"0b145b5d1e8295fac26426b49e906ed633e45dd5a029d68c42ed1f3b2fe19ec9"} Dec 01 08:37:24 crc kubenswrapper[5004]: I1201 08:37:24.080411 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6c4968b65-xg6h2"] Dec 01 08:37:26 crc kubenswrapper[5004]: I1201 08:37:26.003145 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-78524" 
event={"ID":"aa866b8d-174b-4fab-a55d-cc2bcdef5526","Type":"ContainerStarted","Data":"13460d3f543e8af103e8802fe8445482b31709f48ef0565149fedf091f930b54"} Dec 01 08:37:26 crc kubenswrapper[5004]: I1201 08:37:26.004933 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-pk7mn" event={"ID":"67bfafa3-790f-4b23-8bef-8b5da60bf6dc","Type":"ContainerStarted","Data":"57f564f2ad769b8c7c060a44d88db6b042286c84891eddcac42943cbf07bec53"} Dec 01 08:37:26 crc kubenswrapper[5004]: E1201 08:37:26.646211 5004 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying layer: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Dec 01 08:37:26 crc kubenswrapper[5004]: E1201 08:37:26.646631 5004 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-jqh2f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-operator-controller-manager-859b6ccc6-6dpk9_openstack-operators(166912d9-e0b0-40b8-8e26-9c86183d7952): ErrImagePull: rpc error: code = Canceled desc = copying layer: context canceled" logger="UnhandledError" Dec 01 08:37:26 crc kubenswrapper[5004]: E1201 08:37:26.648301 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying layer: context canceled\"]" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-6dpk9" podUID="166912d9-e0b0-40b8-8e26-9c86183d7952" Dec 01 08:37:27 crc kubenswrapper[5004]: I1201 08:37:27.040405 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-lfs45" event={"ID":"c6e6ae59-9f58-4856-b200-d42d1e1e23ed","Type":"ContainerStarted","Data":"6f8e224fb2b7d4d2effb1a14698ecc6462a417b5c6315cad31bcdb794d64f5bd"} Dec 01 08:37:27 crc 
kubenswrapper[5004]: I1201 08:37:27.042885 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-p468t" event={"ID":"9d4afad8-bd75-403d-b4d0-d7f01e1a6e5d","Type":"ContainerStarted","Data":"c47ad195d062e1037ba3eef92b268b5703bcca9897be60750055459dca60d84d"} Dec 01 08:37:27 crc kubenswrapper[5004]: I1201 08:37:27.044827 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-wqpmp" event={"ID":"d418620e-19a8-4171-94f7-2dba61ca8b6a","Type":"ContainerStarted","Data":"541a849fa0c94b6873d9c5f828784f23ca0319d06dcacb0baa95217a27492cb1"} Dec 01 08:37:27 crc kubenswrapper[5004]: I1201 08:37:27.047322 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6c4968b65-xg6h2" event={"ID":"38bf4275-c95e-4b2d-88fe-aeace2e41983","Type":"ContainerStarted","Data":"611ab3ecefb29bf5297ea7ff42e6f918b470526b96cf1dc8dc913439905a1748"} Dec 01 08:37:28 crc kubenswrapper[5004]: I1201 08:37:28.059749 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-c96bj" event={"ID":"a1d2c2cd-d1c5-490e-99ee-e0ab5c18ebc4","Type":"ContainerStarted","Data":"fe06ee32fe3d0aea5b7a80f2e6b1a0054fd9a9676e7d512e4c999974dee61318"} Dec 01 08:37:29 crc kubenswrapper[5004]: I1201 08:37:29.069113 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-bcd9b8768-5phd6" event={"ID":"5425cd72-5745-4b0f-ab14-b697c726d75f","Type":"ContainerStarted","Data":"0373ece3acee7e791e890fea6441c40f9d27d01a72e5323c372d4fbbac8169b4"} Dec 01 08:37:31 crc kubenswrapper[5004]: I1201 08:37:31.101216 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-6dpk9" 
event={"ID":"166912d9-e0b0-40b8-8e26-9c86183d7952","Type":"ContainerStarted","Data":"495277fcaa9c134447af459729931581f2d9797dad87c3951c65c00b1717dd99"} Dec 01 08:37:31 crc kubenswrapper[5004]: I1201 08:37:31.110700 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-hzgf4" event={"ID":"f87b36f7-2558-4823-85fc-6b6e9090b1d7","Type":"ContainerStarted","Data":"adaa67e667f24e3266bc5616bfe5ef127d3680244505e4b202fa7d0829cd9704"} Dec 01 08:37:31 crc kubenswrapper[5004]: I1201 08:37:31.115244 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6c4968b65-xg6h2" event={"ID":"38bf4275-c95e-4b2d-88fe-aeace2e41983","Type":"ContainerStarted","Data":"cd93d9a92d8ec0a8a89fd84d7d8c5e558fac0bb34e46033dbd1514de54bab5c4"} Dec 01 08:37:31 crc kubenswrapper[5004]: I1201 08:37:31.115410 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-6c4968b65-xg6h2" Dec 01 08:37:31 crc kubenswrapper[5004]: I1201 08:37:31.163688 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-6c4968b65-xg6h2" podStartSLOduration=44.163666218 podStartE2EDuration="44.163666218s" podCreationTimestamp="2025-12-01 08:36:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:37:31.155652845 +0000 UTC m=+1228.720644847" watchObservedRunningTime="2025-12-01 08:37:31.163666218 +0000 UTC m=+1228.728658200" Dec 01 08:37:31 crc kubenswrapper[5004]: E1201 08:37:31.339870 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-59dzd" podUID="cb3f7f28-c99e-44d4-b534-83889924b531" Dec 01 08:37:31 crc kubenswrapper[5004]: E1201 08:37:31.339986 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-nm7jp" podUID="616c962a-6fda-4c1b-a377-51d721a17616" Dec 01 08:37:31 crc kubenswrapper[5004]: E1201 08:37:31.391147 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-nfdnl" podUID="f1d1796c-7fa3-4a90-bfb2-cc257a69ba58" Dec 01 08:37:31 crc kubenswrapper[5004]: E1201 08:37:31.540453 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-tnmn9" podUID="bff56810-ae93-4fca-a568-cc88e971c1d8" Dec 01 08:37:31 crc kubenswrapper[5004]: E1201 08:37:31.953230 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/test-operator-controller-manager-5854674fcc-8n2qh" podUID="bfa6d181-b802-48de-8c57-4b8b7a8f1e07" Dec 01 08:37:32 crc kubenswrapper[5004]: I1201 08:37:32.159694 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd42mdlz" 
event={"ID":"fde3e479-59b7-4b8b-82c8-38b346fd3409","Type":"ContainerStarted","Data":"aab77b19fbeaceaa7a96addfbd60a56c893e1e21ec2e0a5afb75a1ba239532a7"} Dec 01 08:37:32 crc kubenswrapper[5004]: I1201 08:37:32.159745 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd42mdlz" event={"ID":"fde3e479-59b7-4b8b-82c8-38b346fd3409","Type":"ContainerStarted","Data":"187f3472cebec19cbec182d55632bbfdcc188e57838201d17d880571fa8d2ceb"} Dec 01 08:37:32 crc kubenswrapper[5004]: I1201 08:37:32.161224 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd42mdlz" Dec 01 08:37:32 crc kubenswrapper[5004]: I1201 08:37:32.163450 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-pk7mn" event={"ID":"67bfafa3-790f-4b23-8bef-8b5da60bf6dc","Type":"ContainerStarted","Data":"68a6a50a0118752fad298a25ea974c53bded724c71ae16420b9bce5dd1f46ffe"} Dec 01 08:37:32 crc kubenswrapper[5004]: I1201 08:37:32.165069 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-pk7mn" Dec 01 08:37:32 crc kubenswrapper[5004]: I1201 08:37:32.177501 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-pk7mn" Dec 01 08:37:32 crc kubenswrapper[5004]: I1201 08:37:32.178780 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-p468t" event={"ID":"9d4afad8-bd75-403d-b4d0-d7f01e1a6e5d","Type":"ContainerStarted","Data":"65f2ade2d8f8e4d3c76c26c4e3d07d0457171a5248175ed22aede64ac9d850a0"} Dec 01 08:37:32 crc kubenswrapper[5004]: I1201 08:37:32.179427 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/placement-operator-controller-manager-78f8948974-p468t" Dec 01 08:37:32 crc kubenswrapper[5004]: I1201 08:37:32.180817 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-hzgf4" event={"ID":"f87b36f7-2558-4823-85fc-6b6e9090b1d7","Type":"ContainerStarted","Data":"8a4da705ee33e4694a5976efd3c63397c1f9260af2534645cda4a1b0811bc9f1"} Dec 01 08:37:32 crc kubenswrapper[5004]: I1201 08:37:32.181232 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-57548d458d-hzgf4" Dec 01 08:37:32 crc kubenswrapper[5004]: I1201 08:37:32.182488 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-bcd9b8768-5phd6" event={"ID":"5425cd72-5745-4b0f-ab14-b697c726d75f","Type":"ContainerStarted","Data":"7f716eb09fa0186094b6a67cb4a6ff2729076f722c98abf58646006498e7bc29"} Dec 01 08:37:32 crc kubenswrapper[5004]: I1201 08:37:32.182618 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-bcd9b8768-5phd6" Dec 01 08:37:32 crc kubenswrapper[5004]: I1201 08:37:32.184711 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-78f8948974-p468t" Dec 01 08:37:32 crc kubenswrapper[5004]: I1201 08:37:32.185664 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-nm7jp" event={"ID":"616c962a-6fda-4c1b-a377-51d721a17616","Type":"ContainerStarted","Data":"62af4864667e74598ebeb5d8a412cfd1217abb39a7e9b413e0533293c6955eb4"} Dec 01 08:37:32 crc kubenswrapper[5004]: I1201 08:37:32.192543 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-c96bj" 
event={"ID":"a1d2c2cd-d1c5-490e-99ee-e0ab5c18ebc4","Type":"ContainerStarted","Data":"5dde0b14f2d7d5416071f44891ea44c1d424895ac493fa755be83d289a1153d3"} Dec 01 08:37:32 crc kubenswrapper[5004]: I1201 08:37:32.194906 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-c96bj" Dec 01 08:37:32 crc kubenswrapper[5004]: I1201 08:37:32.198413 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-c96bj" Dec 01 08:37:32 crc kubenswrapper[5004]: I1201 08:37:32.203737 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-lfs45" event={"ID":"c6e6ae59-9f58-4856-b200-d42d1e1e23ed","Type":"ContainerStarted","Data":"b9611031e08045e282be9d729c78bd381ab7a339c6c9b95bf16010b72fee660c"} Dec 01 08:37:32 crc kubenswrapper[5004]: I1201 08:37:32.205847 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-lfs45" Dec 01 08:37:32 crc kubenswrapper[5004]: I1201 08:37:32.207875 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-lfs45" Dec 01 08:37:32 crc kubenswrapper[5004]: I1201 08:37:32.210273 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd42mdlz" podStartSLOduration=37.715289923 podStartE2EDuration="45.21025628s" podCreationTimestamp="2025-12-01 08:36:47 +0000 UTC" firstStartedPulling="2025-12-01 08:37:22.971001437 +0000 UTC m=+1220.535993419" lastFinishedPulling="2025-12-01 08:37:30.465967794 +0000 UTC m=+1228.030959776" observedRunningTime="2025-12-01 08:37:32.197987776 +0000 UTC m=+1229.762979758" watchObservedRunningTime="2025-12-01 08:37:32.21025628 
+0000 UTC m=+1229.775248262" Dec 01 08:37:32 crc kubenswrapper[5004]: I1201 08:37:32.210852 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-55g57" event={"ID":"24fb1ec9-065a-464d-9797-8020c38f81e8","Type":"ContainerStarted","Data":"7cfa3ecde3ced9a86cdd806f40ee1c250b33ac332424bef2556571450fa776aa"} Dec 01 08:37:32 crc kubenswrapper[5004]: I1201 08:37:32.211616 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-55g57" Dec 01 08:37:32 crc kubenswrapper[5004]: I1201 08:37:32.215603 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-59dzd" event={"ID":"cb3f7f28-c99e-44d4-b534-83889924b531","Type":"ContainerStarted","Data":"f74252f78b17d3859937b0354c29a942b811dff01ef1c2fb2bdec86e9b7b6b75"} Dec 01 08:37:32 crc kubenswrapper[5004]: I1201 08:37:32.216235 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-55g57" Dec 01 08:37:32 crc kubenswrapper[5004]: I1201 08:37:32.221784 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-rp7vl" event={"ID":"9bca89aa-3367-4bff-b070-c191fcae5f2f","Type":"ContainerStarted","Data":"2a00cc24565e304efc473ac0174b9e4c0139d12a211c2a64324bd5fe0ea8b0d5"} Dec 01 08:37:32 crc kubenswrapper[5004]: I1201 08:37:32.224070 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-rp7vl" Dec 01 08:37:32 crc kubenswrapper[5004]: I1201 08:37:32.225301 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-rp7vl" Dec 01 08:37:32 crc kubenswrapper[5004]: I1201 08:37:32.230373 5004 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-vv9hf" event={"ID":"3521a18b-f34e-4107-9e34-048a9827a2fe","Type":"ContainerStarted","Data":"5761637b82ec42cee370bdbec8a63e5f3beb48208ba0cd2fe03196786d77333f"} Dec 01 08:37:32 crc kubenswrapper[5004]: I1201 08:37:32.231495 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-57548d458d-hzgf4" podStartSLOduration=37.713736117 podStartE2EDuration="45.23148358s" podCreationTimestamp="2025-12-01 08:36:47 +0000 UTC" firstStartedPulling="2025-12-01 08:37:22.974364857 +0000 UTC m=+1220.539356849" lastFinishedPulling="2025-12-01 08:37:30.49211233 +0000 UTC m=+1228.057104312" observedRunningTime="2025-12-01 08:37:32.230124087 +0000 UTC m=+1229.795116069" watchObservedRunningTime="2025-12-01 08:37:32.23148358 +0000 UTC m=+1229.796475552" Dec 01 08:37:32 crc kubenswrapper[5004]: I1201 08:37:32.232384 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-vv9hf" Dec 01 08:37:32 crc kubenswrapper[5004]: I1201 08:37:32.234132 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-zgpn6" event={"ID":"b931f322-0c4e-4019-a11e-616c80d1e5f1","Type":"ContainerStarted","Data":"5be87ddc24e9fcd8e97928b5bf8f82cc10ded232a7ea421cf300d2646116b681"} Dec 01 08:37:32 crc kubenswrapper[5004]: I1201 08:37:32.234511 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-zgpn6" Dec 01 08:37:32 crc kubenswrapper[5004]: I1201 08:37:32.242716 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-zgpn6" Dec 01 08:37:32 crc kubenswrapper[5004]: I1201 08:37:32.242976 5004 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-vv9hf" Dec 01 08:37:32 crc kubenswrapper[5004]: I1201 08:37:32.246708 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-nfdnl" event={"ID":"f1d1796c-7fa3-4a90-bfb2-cc257a69ba58","Type":"ContainerStarted","Data":"04bc27a867cc2e0f98d40f23c8e5bbfa5e21b50cb5075d48ce88281d56c10124"} Dec 01 08:37:32 crc kubenswrapper[5004]: E1201 08:37:32.250731 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:9aa8c03633e4b934c57868c1660acf47e7d386ac86bcb344df262c9ad76b8621\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-nfdnl" podUID="f1d1796c-7fa3-4a90-bfb2-cc257a69ba58" Dec 01 08:37:32 crc kubenswrapper[5004]: I1201 08:37:32.255023 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-6546668bfd-6l4k4" event={"ID":"8d8f2a7f-1da5-45ca-8dd0-1fa87e3d46fe","Type":"ContainerStarted","Data":"d50cedee941a6581e0a5e092c948df6c1c1c582e7a7b085b2f9635abaa50a4f7"} Dec 01 08:37:32 crc kubenswrapper[5004]: I1201 08:37:32.255520 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-c96bj" podStartSLOduration=4.606690049 podStartE2EDuration="45.255509296s" podCreationTimestamp="2025-12-01 08:36:47 +0000 UTC" firstStartedPulling="2025-12-01 08:36:50.038354033 +0000 UTC m=+1187.603346015" lastFinishedPulling="2025-12-01 08:37:30.68717328 +0000 UTC m=+1228.252165262" observedRunningTime="2025-12-01 08:37:32.254046181 +0000 UTC m=+1229.819038163" watchObservedRunningTime="2025-12-01 08:37:32.255509296 +0000 UTC m=+1229.820501278" Dec 01 08:37:32 crc kubenswrapper[5004]: 
I1201 08:37:32.256841 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-6546668bfd-6l4k4" Dec 01 08:37:32 crc kubenswrapper[5004]: I1201 08:37:32.263902 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-6546668bfd-6l4k4" Dec 01 08:37:32 crc kubenswrapper[5004]: I1201 08:37:32.264794 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-6dpk9" event={"ID":"166912d9-e0b0-40b8-8e26-9c86183d7952","Type":"ContainerStarted","Data":"220520e39e0cede1e089a9b9cdd701da076f37f763649aa8491b4a8e3edb7a49"} Dec 01 08:37:32 crc kubenswrapper[5004]: I1201 08:37:32.265031 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-6dpk9" Dec 01 08:37:32 crc kubenswrapper[5004]: I1201 08:37:32.278531 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-78524" event={"ID":"aa866b8d-174b-4fab-a55d-cc2bcdef5526","Type":"ContainerStarted","Data":"136758646e846f3cdd26a033820a7f8fef1a4324d61cf7e57a9b6c30fe2917ff"} Dec 01 08:37:32 crc kubenswrapper[5004]: I1201 08:37:32.278914 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-bcd9b8768-5phd6" podStartSLOduration=4.242479674 podStartE2EDuration="45.278886946s" podCreationTimestamp="2025-12-01 08:36:47 +0000 UTC" firstStartedPulling="2025-12-01 08:36:49.789129325 +0000 UTC m=+1187.354121307" lastFinishedPulling="2025-12-01 08:37:30.825536597 +0000 UTC m=+1228.390528579" observedRunningTime="2025-12-01 08:37:32.273117558 +0000 UTC m=+1229.838109550" watchObservedRunningTime="2025-12-01 08:37:32.278886946 +0000 UTC m=+1229.843878928" Dec 01 08:37:32 crc kubenswrapper[5004]: I1201 
08:37:32.279479 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-78524" Dec 01 08:37:32 crc kubenswrapper[5004]: I1201 08:37:32.283738 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-78524" Dec 01 08:37:32 crc kubenswrapper[5004]: I1201 08:37:32.287510 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-tnmn9" event={"ID":"bff56810-ae93-4fca-a568-cc88e971c1d8","Type":"ContainerStarted","Data":"ab135b337ac933606f675de17f469ad5cdf96887ade1a4b598d87ccfe446fa6f"} Dec 01 08:37:32 crc kubenswrapper[5004]: E1201 08:37:32.288707 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-tnmn9" podUID="bff56810-ae93-4fca-a568-cc88e971c1d8" Dec 01 08:37:32 crc kubenswrapper[5004]: I1201 08:37:32.293291 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-8n2qh" event={"ID":"bfa6d181-b802-48de-8c57-4b8b7a8f1e07","Type":"ContainerStarted","Data":"fa07612b8aa3ecbcb08e7ea81a150a80ec97792377c5eabe8a81ec1bab209124"} Dec 01 08:37:32 crc kubenswrapper[5004]: E1201 08:37:32.297650 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94\\\"\"" pod="openstack-operators/test-operator-controller-manager-5854674fcc-8n2qh" 
podUID="bfa6d181-b802-48de-8c57-4b8b7a8f1e07" Dec 01 08:37:32 crc kubenswrapper[5004]: I1201 08:37:32.299244 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-wqpmp" event={"ID":"d418620e-19a8-4171-94f7-2dba61ca8b6a","Type":"ContainerStarted","Data":"9b18dcedb0aa601cf7b8a1172ecf33cdc40de390af97c79c031cca64132ea407"} Dec 01 08:37:32 crc kubenswrapper[5004]: I1201 08:37:32.299433 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-998648c74-wqpmp" Dec 01 08:37:32 crc kubenswrapper[5004]: I1201 08:37:32.300635 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-668d9c48b9-5c6kq" event={"ID":"67dcdfb2-70ae-4444-b271-dd83dcb37756","Type":"ContainerStarted","Data":"10f3b656623a044d66f03f2a6e817980503f52c5fbc3a0bdb0e1a8cf5546f5f5"} Dec 01 08:37:32 crc kubenswrapper[5004]: I1201 08:37:32.301239 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-668d9c48b9-5c6kq" Dec 01 08:37:32 crc kubenswrapper[5004]: I1201 08:37:32.305746 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-998648c74-wqpmp" Dec 01 08:37:32 crc kubenswrapper[5004]: I1201 08:37:32.306487 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-668d9c48b9-5c6kq" Dec 01 08:37:32 crc kubenswrapper[5004]: I1201 08:37:32.321427 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-pk7mn" podStartSLOduration=4.458406132 podStartE2EDuration="45.321408436s" podCreationTimestamp="2025-12-01 08:36:47 +0000 UTC" firstStartedPulling="2025-12-01 08:36:49.76936275 +0000 UTC 
m=+1187.334354742" lastFinishedPulling="2025-12-01 08:37:30.632365044 +0000 UTC m=+1228.197357046" observedRunningTime="2025-12-01 08:37:32.315177447 +0000 UTC m=+1229.880169429" watchObservedRunningTime="2025-12-01 08:37:32.321408436 +0000 UTC m=+1229.886400418" Dec 01 08:37:32 crc kubenswrapper[5004]: I1201 08:37:32.385631 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-78f8948974-p468t" podStartSLOduration=4.632876436 podStartE2EDuration="45.385613917s" podCreationTimestamp="2025-12-01 08:36:47 +0000 UTC" firstStartedPulling="2025-12-01 08:36:49.993187688 +0000 UTC m=+1187.558179670" lastFinishedPulling="2025-12-01 08:37:30.745925159 +0000 UTC m=+1228.310917151" observedRunningTime="2025-12-01 08:37:32.382887812 +0000 UTC m=+1229.947879804" watchObservedRunningTime="2025-12-01 08:37:32.385613917 +0000 UTC m=+1229.950605899" Dec 01 08:37:32 crc kubenswrapper[5004]: I1201 08:37:32.416101 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-998648c74-wqpmp" podStartSLOduration=4.485043752 podStartE2EDuration="45.416085828s" podCreationTimestamp="2025-12-01 08:36:47 +0000 UTC" firstStartedPulling="2025-12-01 08:36:49.757923356 +0000 UTC m=+1187.322915338" lastFinishedPulling="2025-12-01 08:37:30.688965412 +0000 UTC m=+1228.253957414" observedRunningTime="2025-12-01 08:37:32.411746985 +0000 UTC m=+1229.976738967" watchObservedRunningTime="2025-12-01 08:37:32.416085828 +0000 UTC m=+1229.981077800" Dec 01 08:37:32 crc kubenswrapper[5004]: I1201 08:37:32.445137 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-668d9c48b9-5c6kq" podStartSLOduration=3.6213465080000002 podStartE2EDuration="45.445118636s" podCreationTimestamp="2025-12-01 08:36:47 +0000 UTC" firstStartedPulling="2025-12-01 08:36:48.805101973 +0000 UTC 
m=+1186.370093955" lastFinishedPulling="2025-12-01 08:37:30.628874081 +0000 UTC m=+1228.193866083" observedRunningTime="2025-12-01 08:37:32.441193623 +0000 UTC m=+1230.006185625" watchObservedRunningTime="2025-12-01 08:37:32.445118636 +0000 UTC m=+1230.010110608" Dec 01 08:37:32 crc kubenswrapper[5004]: I1201 08:37:32.476698 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-6546668bfd-6l4k4" podStartSLOduration=4.440749281 podStartE2EDuration="45.476672784s" podCreationTimestamp="2025-12-01 08:36:47 +0000 UTC" firstStartedPulling="2025-12-01 08:36:49.553446591 +0000 UTC m=+1187.118438573" lastFinishedPulling="2025-12-01 08:37:30.589370084 +0000 UTC m=+1228.154362076" observedRunningTime="2025-12-01 08:37:32.472447764 +0000 UTC m=+1230.037439766" watchObservedRunningTime="2025-12-01 08:37:32.476672784 +0000 UTC m=+1230.041664796" Dec 01 08:37:32 crc kubenswrapper[5004]: I1201 08:37:32.525637 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-zgpn6" podStartSLOduration=4.4309086220000005 podStartE2EDuration="45.525620114s" podCreationTimestamp="2025-12-01 08:36:47 +0000 UTC" firstStartedPulling="2025-12-01 08:36:49.536655029 +0000 UTC m=+1187.101647021" lastFinishedPulling="2025-12-01 08:37:30.631366501 +0000 UTC m=+1228.196358513" observedRunningTime="2025-12-01 08:37:32.522050949 +0000 UTC m=+1230.087042931" watchObservedRunningTime="2025-12-01 08:37:32.525620114 +0000 UTC m=+1230.090612086" Dec 01 08:37:32 crc kubenswrapper[5004]: I1201 08:37:32.559800 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-78524" podStartSLOduration=4.271198901 podStartE2EDuration="45.559782543s" podCreationTimestamp="2025-12-01 08:36:47 +0000 UTC" firstStartedPulling="2025-12-01 08:36:49.489757794 +0000 UTC 
m=+1187.054749776" lastFinishedPulling="2025-12-01 08:37:30.778341426 +0000 UTC m=+1228.343333418" observedRunningTime="2025-12-01 08:37:32.558705807 +0000 UTC m=+1230.123697789" watchObservedRunningTime="2025-12-01 08:37:32.559782543 +0000 UTC m=+1230.124774525" Dec 01 08:37:32 crc kubenswrapper[5004]: I1201 08:37:32.617122 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-rp7vl" podStartSLOduration=3.677961816 podStartE2EDuration="45.617104402s" podCreationTimestamp="2025-12-01 08:36:47 +0000 UTC" firstStartedPulling="2025-12-01 08:36:48.683966867 +0000 UTC m=+1186.248958849" lastFinishedPulling="2025-12-01 08:37:30.623109443 +0000 UTC m=+1228.188101435" observedRunningTime="2025-12-01 08:37:32.594692801 +0000 UTC m=+1230.159684783" watchObservedRunningTime="2025-12-01 08:37:32.617104402 +0000 UTC m=+1230.182096384" Dec 01 08:37:32 crc kubenswrapper[5004]: I1201 08:37:32.620239 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-vv9hf" podStartSLOduration=4.329276657 podStartE2EDuration="45.620231665s" podCreationTimestamp="2025-12-01 08:36:47 +0000 UTC" firstStartedPulling="2025-12-01 08:36:49.486756542 +0000 UTC m=+1187.051748534" lastFinishedPulling="2025-12-01 08:37:30.77771154 +0000 UTC m=+1228.342703542" observedRunningTime="2025-12-01 08:37:32.614549921 +0000 UTC m=+1230.179541903" watchObservedRunningTime="2025-12-01 08:37:32.620231665 +0000 UTC m=+1230.185223647" Dec 01 08:37:32 crc kubenswrapper[5004]: I1201 08:37:32.689900 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-55g57" podStartSLOduration=4.621339131 podStartE2EDuration="45.689883156s" podCreationTimestamp="2025-12-01 08:36:47 +0000 UTC" firstStartedPulling="2025-12-01 08:36:49.521182197 +0000 UTC m=+1187.086174189" 
lastFinishedPulling="2025-12-01 08:37:30.589726202 +0000 UTC m=+1228.154718214" observedRunningTime="2025-12-01 08:37:32.684110249 +0000 UTC m=+1230.249102231" watchObservedRunningTime="2025-12-01 08:37:32.689883156 +0000 UTC m=+1230.254875138" Dec 01 08:37:32 crc kubenswrapper[5004]: I1201 08:37:32.744498 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-lfs45" podStartSLOduration=4.5411539130000005 podStartE2EDuration="45.74447312s" podCreationTimestamp="2025-12-01 08:36:47 +0000 UTC" firstStartedPulling="2025-12-01 08:36:49.495597614 +0000 UTC m=+1187.060589596" lastFinishedPulling="2025-12-01 08:37:30.698916811 +0000 UTC m=+1228.263908803" observedRunningTime="2025-12-01 08:37:32.73182007 +0000 UTC m=+1230.296812052" watchObservedRunningTime="2025-12-01 08:37:32.74447312 +0000 UTC m=+1230.309465102" Dec 01 08:37:32 crc kubenswrapper[5004]: E1201 08:37:32.774106 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-l4rkg" podUID="ed5bf034-cb91-4b02-97a4-c63a8506e527" Dec 01 08:37:32 crc kubenswrapper[5004]: I1201 08:37:32.806008 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-6dpk9" podStartSLOduration=5.103530393 podStartE2EDuration="45.805991827s" podCreationTimestamp="2025-12-01 08:36:47 +0000 UTC" firstStartedPulling="2025-12-01 08:36:49.554050666 +0000 UTC m=+1187.119042648" lastFinishedPulling="2025-12-01 08:37:30.25651209 +0000 UTC m=+1227.821504082" observedRunningTime="2025-12-01 08:37:32.782538111 +0000 UTC m=+1230.347530113" 
watchObservedRunningTime="2025-12-01 08:37:32.805991827 +0000 UTC m=+1230.370983809" Dec 01 08:37:33 crc kubenswrapper[5004]: I1201 08:37:33.313491 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-59dzd" event={"ID":"cb3f7f28-c99e-44d4-b534-83889924b531","Type":"ContainerStarted","Data":"bca52bb0eae496489284b53f914959de35a5b1a8dbbf4240d29bb850aee90b20"} Dec 01 08:37:33 crc kubenswrapper[5004]: I1201 08:37:33.313902 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-59dzd" Dec 01 08:37:33 crc kubenswrapper[5004]: I1201 08:37:33.316870 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-nm7jp" event={"ID":"616c962a-6fda-4c1b-a377-51d721a17616","Type":"ContainerStarted","Data":"1194d3426fa6db2e44ee020c4f6848c145e45995966f23b884883df8d1dbea94"} Dec 01 08:37:33 crc kubenswrapper[5004]: I1201 08:37:33.322222 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-bcd9b8768-5phd6" Dec 01 08:37:33 crc kubenswrapper[5004]: I1201 08:37:33.337293 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-59dzd" podStartSLOduration=3.08440461 podStartE2EDuration="46.337276856s" podCreationTimestamp="2025-12-01 08:36:47 +0000 UTC" firstStartedPulling="2025-12-01 08:36:49.785302403 +0000 UTC m=+1187.350294385" lastFinishedPulling="2025-12-01 08:37:33.038174639 +0000 UTC m=+1230.603166631" observedRunningTime="2025-12-01 08:37:33.334703515 +0000 UTC m=+1230.899695517" watchObservedRunningTime="2025-12-01 08:37:33.337276856 +0000 UTC m=+1230.902268838" Dec 01 08:37:33 crc kubenswrapper[5004]: I1201 08:37:33.385051 5004 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-nm7jp" podStartSLOduration=3.404977802 podStartE2EDuration="46.385030427s" podCreationTimestamp="2025-12-01 08:36:47 +0000 UTC" firstStartedPulling="2025-12-01 08:36:49.739581176 +0000 UTC m=+1187.304573158" lastFinishedPulling="2025-12-01 08:37:32.719633801 +0000 UTC m=+1230.284625783" observedRunningTime="2025-12-01 08:37:33.379916087 +0000 UTC m=+1230.944908079" watchObservedRunningTime="2025-12-01 08:37:33.385030427 +0000 UTC m=+1230.950022419" Dec 01 08:37:34 crc kubenswrapper[5004]: I1201 08:37:34.324049 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-nm7jp" Dec 01 08:37:37 crc kubenswrapper[5004]: I1201 08:37:37.645058 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-6dpk9" Dec 01 08:37:38 crc kubenswrapper[5004]: I1201 08:37:38.116433 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-59dzd" Dec 01 08:37:38 crc kubenswrapper[5004]: I1201 08:37:38.338521 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-nm7jp" Dec 01 08:37:38 crc kubenswrapper[5004]: I1201 08:37:38.729782 5004 patch_prober.go:28] interesting pod/machine-config-daemon-fvdgt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 08:37:38 crc kubenswrapper[5004]: I1201 08:37:38.729984 5004 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 08:37:38 crc kubenswrapper[5004]: I1201 08:37:38.730154 5004 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" Dec 01 08:37:38 crc kubenswrapper[5004]: I1201 08:37:38.732672 5004 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"da4b1d9e1788dd947ac4216eff1a285666eccd0fc7594a8fc8667307c82c4fdb"} pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 08:37:38 crc kubenswrapper[5004]: I1201 08:37:38.733026 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" containerName="machine-config-daemon" containerID="cri-o://da4b1d9e1788dd947ac4216eff1a285666eccd0fc7594a8fc8667307c82c4fdb" gracePeriod=600 Dec 01 08:37:39 crc kubenswrapper[5004]: I1201 08:37:39.395128 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" event={"ID":"f9977ebb-82de-4e96-8763-0b5a84f8d4ce","Type":"ContainerDied","Data":"da4b1d9e1788dd947ac4216eff1a285666eccd0fc7594a8fc8667307c82c4fdb"} Dec 01 08:37:39 crc kubenswrapper[5004]: I1201 08:37:39.396004 5004 scope.go:117] "RemoveContainer" containerID="69d8f022c5a4f9a84dbe3000c7f3fecc6974868815a83043bd8a0d7a4a9a2e59" Dec 01 08:37:39 crc kubenswrapper[5004]: I1201 08:37:39.395077 5004 generic.go:334] "Generic (PLEG): container finished" podID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" containerID="da4b1d9e1788dd947ac4216eff1a285666eccd0fc7594a8fc8667307c82c4fdb" exitCode=0 Dec 01 08:37:39 crc 
kubenswrapper[5004]: I1201 08:37:39.396312 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" event={"ID":"f9977ebb-82de-4e96-8763-0b5a84f8d4ce","Type":"ContainerStarted","Data":"af0bd8ad09d4d665e418c6d76caa0150a18c17d3528d47d38f4681f4edce895d"} Dec 01 08:37:40 crc kubenswrapper[5004]: I1201 08:37:40.285605 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-6c4968b65-xg6h2" Dec 01 08:37:43 crc kubenswrapper[5004]: I1201 08:37:43.560981 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-57548d458d-hzgf4" Dec 01 08:37:44 crc kubenswrapper[5004]: I1201 08:37:44.015350 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd42mdlz" Dec 01 08:37:44 crc kubenswrapper[5004]: I1201 08:37:44.449217 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-nfdnl" event={"ID":"f1d1796c-7fa3-4a90-bfb2-cc257a69ba58","Type":"ContainerStarted","Data":"b9fccdcc400dce926a69fbb1339504ab43df7ca02f8816cfb2d75913ad614cde"} Dec 01 08:37:44 crc kubenswrapper[5004]: I1201 08:37:44.449954 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-nfdnl" Dec 01 08:37:44 crc kubenswrapper[5004]: I1201 08:37:44.466749 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-nfdnl" podStartSLOduration=3.261215059 podStartE2EDuration="57.466731767s" podCreationTimestamp="2025-12-01 08:36:47 +0000 UTC" firstStartedPulling="2025-12-01 08:36:50.027912352 +0000 UTC m=+1187.592904334" lastFinishedPulling="2025-12-01 08:37:44.23342906 
+0000 UTC m=+1241.798421042" observedRunningTime="2025-12-01 08:37:44.464444073 +0000 UTC m=+1242.029436075" watchObservedRunningTime="2025-12-01 08:37:44.466731767 +0000 UTC m=+1242.031723759" Dec 01 08:37:46 crc kubenswrapper[5004]: I1201 08:37:46.478074 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-tnmn9" event={"ID":"bff56810-ae93-4fca-a568-cc88e971c1d8","Type":"ContainerStarted","Data":"4d441cc7803cbf97e826443bc2ad99f5633319577a725cbdfa0af4a28e065551"} Dec 01 08:37:46 crc kubenswrapper[5004]: I1201 08:37:46.478634 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-tnmn9" Dec 01 08:37:46 crc kubenswrapper[5004]: I1201 08:37:46.522117 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-tnmn9" podStartSLOduration=4.163187145 podStartE2EDuration="59.522090039s" podCreationTimestamp="2025-12-01 08:36:47 +0000 UTC" firstStartedPulling="2025-12-01 08:36:50.006073798 +0000 UTC m=+1187.571065780" lastFinishedPulling="2025-12-01 08:37:45.364976692 +0000 UTC m=+1242.929968674" observedRunningTime="2025-12-01 08:37:46.49891261 +0000 UTC m=+1244.063904692" watchObservedRunningTime="2025-12-01 08:37:46.522090039 +0000 UTC m=+1244.087082041" Dec 01 08:37:51 crc kubenswrapper[5004]: I1201 08:37:51.527014 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-8n2qh" event={"ID":"bfa6d181-b802-48de-8c57-4b8b7a8f1e07","Type":"ContainerStarted","Data":"90319c69001f26730dc16605058edea6d17ef6f7e1a5591c5cd11b7dc7c512d5"} Dec 01 08:37:51 crc kubenswrapper[5004]: I1201 08:37:51.527880 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5854674fcc-8n2qh" Dec 01 08:37:51 crc kubenswrapper[5004]: I1201 
08:37:51.529164 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-l4rkg" event={"ID":"ed5bf034-cb91-4b02-97a4-c63a8506e527","Type":"ContainerStarted","Data":"c6cccc984b32a1aa799109293cae636db2397af99cf7cea7ea010cb48cf6ca4d"} Dec 01 08:37:51 crc kubenswrapper[5004]: I1201 08:37:51.551346 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5854674fcc-8n2qh" podStartSLOduration=4.295235849 podStartE2EDuration="1m4.551332657s" podCreationTimestamp="2025-12-01 08:36:47 +0000 UTC" firstStartedPulling="2025-12-01 08:36:50.024037758 +0000 UTC m=+1187.589029740" lastFinishedPulling="2025-12-01 08:37:50.280134556 +0000 UTC m=+1247.845126548" observedRunningTime="2025-12-01 08:37:51.548341396 +0000 UTC m=+1249.113333388" watchObservedRunningTime="2025-12-01 08:37:51.551332657 +0000 UTC m=+1249.116324639" Dec 01 08:37:51 crc kubenswrapper[5004]: I1201 08:37:51.574420 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-l4rkg" podStartSLOduration=3.262940039 podStartE2EDuration="1m3.574392924s" podCreationTimestamp="2025-12-01 08:36:48 +0000 UTC" firstStartedPulling="2025-12-01 08:36:50.002336328 +0000 UTC m=+1187.567328310" lastFinishedPulling="2025-12-01 08:37:50.313789203 +0000 UTC m=+1247.878781195" observedRunningTime="2025-12-01 08:37:51.56915337 +0000 UTC m=+1249.134145362" watchObservedRunningTime="2025-12-01 08:37:51.574392924 +0000 UTC m=+1249.139384946" Dec 01 08:37:58 crc kubenswrapper[5004]: I1201 08:37:58.434421 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5854674fcc-8n2qh" Dec 01 08:37:58 crc kubenswrapper[5004]: I1201 08:37:58.540639 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-tnmn9" Dec 01 08:37:58 crc kubenswrapper[5004]: I1201 08:37:58.746949 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-nfdnl" Dec 01 08:38:12 crc kubenswrapper[5004]: I1201 08:38:12.054633 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-vbwft"] Dec 01 08:38:12 crc kubenswrapper[5004]: I1201 08:38:12.056358 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-vbwft" Dec 01 08:38:12 crc kubenswrapper[5004]: I1201 08:38:12.069045 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Dec 01 08:38:12 crc kubenswrapper[5004]: I1201 08:38:12.069076 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Dec 01 08:38:12 crc kubenswrapper[5004]: I1201 08:38:12.069201 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Dec 01 08:38:12 crc kubenswrapper[5004]: I1201 08:38:12.073514 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-pkg4g" Dec 01 08:38:12 crc kubenswrapper[5004]: I1201 08:38:12.085806 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-vbwft"] Dec 01 08:38:12 crc kubenswrapper[5004]: I1201 08:38:12.111971 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-pqhzv"] Dec 01 08:38:12 crc kubenswrapper[5004]: I1201 08:38:12.114641 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-pqhzv" Dec 01 08:38:12 crc kubenswrapper[5004]: I1201 08:38:12.120180 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Dec 01 08:38:12 crc kubenswrapper[5004]: I1201 08:38:12.131941 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4k9h\" (UniqueName: \"kubernetes.io/projected/24875a21-fff6-4224-aa6c-a3264f6f0c26-kube-api-access-b4k9h\") pod \"dnsmasq-dns-675f4bcbfc-vbwft\" (UID: \"24875a21-fff6-4224-aa6c-a3264f6f0c26\") " pod="openstack/dnsmasq-dns-675f4bcbfc-vbwft" Dec 01 08:38:12 crc kubenswrapper[5004]: I1201 08:38:12.132034 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24875a21-fff6-4224-aa6c-a3264f6f0c26-config\") pod \"dnsmasq-dns-675f4bcbfc-vbwft\" (UID: \"24875a21-fff6-4224-aa6c-a3264f6f0c26\") " pod="openstack/dnsmasq-dns-675f4bcbfc-vbwft" Dec 01 08:38:12 crc kubenswrapper[5004]: I1201 08:38:12.153378 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-pqhzv"] Dec 01 08:38:12 crc kubenswrapper[5004]: I1201 08:38:12.233121 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ad2a2834-a832-452e-bd49-73656a9cde6a-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-pqhzv\" (UID: \"ad2a2834-a832-452e-bd49-73656a9cde6a\") " pod="openstack/dnsmasq-dns-78dd6ddcc-pqhzv" Dec 01 08:38:12 crc kubenswrapper[5004]: I1201 08:38:12.233163 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad2a2834-a832-452e-bd49-73656a9cde6a-config\") pod \"dnsmasq-dns-78dd6ddcc-pqhzv\" (UID: \"ad2a2834-a832-452e-bd49-73656a9cde6a\") " pod="openstack/dnsmasq-dns-78dd6ddcc-pqhzv" Dec 01 08:38:12 
crc kubenswrapper[5004]: I1201 08:38:12.233261 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4k9h\" (UniqueName: \"kubernetes.io/projected/24875a21-fff6-4224-aa6c-a3264f6f0c26-kube-api-access-b4k9h\") pod \"dnsmasq-dns-675f4bcbfc-vbwft\" (UID: \"24875a21-fff6-4224-aa6c-a3264f6f0c26\") " pod="openstack/dnsmasq-dns-675f4bcbfc-vbwft" Dec 01 08:38:12 crc kubenswrapper[5004]: I1201 08:38:12.233301 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jlsss\" (UniqueName: \"kubernetes.io/projected/ad2a2834-a832-452e-bd49-73656a9cde6a-kube-api-access-jlsss\") pod \"dnsmasq-dns-78dd6ddcc-pqhzv\" (UID: \"ad2a2834-a832-452e-bd49-73656a9cde6a\") " pod="openstack/dnsmasq-dns-78dd6ddcc-pqhzv" Dec 01 08:38:12 crc kubenswrapper[5004]: I1201 08:38:12.233345 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24875a21-fff6-4224-aa6c-a3264f6f0c26-config\") pod \"dnsmasq-dns-675f4bcbfc-vbwft\" (UID: \"24875a21-fff6-4224-aa6c-a3264f6f0c26\") " pod="openstack/dnsmasq-dns-675f4bcbfc-vbwft" Dec 01 08:38:12 crc kubenswrapper[5004]: I1201 08:38:12.234193 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24875a21-fff6-4224-aa6c-a3264f6f0c26-config\") pod \"dnsmasq-dns-675f4bcbfc-vbwft\" (UID: \"24875a21-fff6-4224-aa6c-a3264f6f0c26\") " pod="openstack/dnsmasq-dns-675f4bcbfc-vbwft" Dec 01 08:38:12 crc kubenswrapper[5004]: I1201 08:38:12.254351 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4k9h\" (UniqueName: \"kubernetes.io/projected/24875a21-fff6-4224-aa6c-a3264f6f0c26-kube-api-access-b4k9h\") pod \"dnsmasq-dns-675f4bcbfc-vbwft\" (UID: \"24875a21-fff6-4224-aa6c-a3264f6f0c26\") " pod="openstack/dnsmasq-dns-675f4bcbfc-vbwft" Dec 01 08:38:12 crc kubenswrapper[5004]: I1201 
08:38:12.334935 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ad2a2834-a832-452e-bd49-73656a9cde6a-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-pqhzv\" (UID: \"ad2a2834-a832-452e-bd49-73656a9cde6a\") " pod="openstack/dnsmasq-dns-78dd6ddcc-pqhzv" Dec 01 08:38:12 crc kubenswrapper[5004]: I1201 08:38:12.334976 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad2a2834-a832-452e-bd49-73656a9cde6a-config\") pod \"dnsmasq-dns-78dd6ddcc-pqhzv\" (UID: \"ad2a2834-a832-452e-bd49-73656a9cde6a\") " pod="openstack/dnsmasq-dns-78dd6ddcc-pqhzv" Dec 01 08:38:12 crc kubenswrapper[5004]: I1201 08:38:12.335722 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ad2a2834-a832-452e-bd49-73656a9cde6a-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-pqhzv\" (UID: \"ad2a2834-a832-452e-bd49-73656a9cde6a\") " pod="openstack/dnsmasq-dns-78dd6ddcc-pqhzv" Dec 01 08:38:12 crc kubenswrapper[5004]: I1201 08:38:12.335744 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad2a2834-a832-452e-bd49-73656a9cde6a-config\") pod \"dnsmasq-dns-78dd6ddcc-pqhzv\" (UID: \"ad2a2834-a832-452e-bd49-73656a9cde6a\") " pod="openstack/dnsmasq-dns-78dd6ddcc-pqhzv" Dec 01 08:38:12 crc kubenswrapper[5004]: I1201 08:38:12.335866 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jlsss\" (UniqueName: \"kubernetes.io/projected/ad2a2834-a832-452e-bd49-73656a9cde6a-kube-api-access-jlsss\") pod \"dnsmasq-dns-78dd6ddcc-pqhzv\" (UID: \"ad2a2834-a832-452e-bd49-73656a9cde6a\") " pod="openstack/dnsmasq-dns-78dd6ddcc-pqhzv" Dec 01 08:38:12 crc kubenswrapper[5004]: I1201 08:38:12.350922 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jlsss\" 
(UniqueName: \"kubernetes.io/projected/ad2a2834-a832-452e-bd49-73656a9cde6a-kube-api-access-jlsss\") pod \"dnsmasq-dns-78dd6ddcc-pqhzv\" (UID: \"ad2a2834-a832-452e-bd49-73656a9cde6a\") " pod="openstack/dnsmasq-dns-78dd6ddcc-pqhzv" Dec 01 08:38:12 crc kubenswrapper[5004]: I1201 08:38:12.374627 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-vbwft" Dec 01 08:38:12 crc kubenswrapper[5004]: I1201 08:38:12.435395 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-pqhzv" Dec 01 08:38:12 crc kubenswrapper[5004]: I1201 08:38:12.966286 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-vbwft"] Dec 01 08:38:13 crc kubenswrapper[5004]: I1201 08:38:13.077966 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-pqhzv"] Dec 01 08:38:13 crc kubenswrapper[5004]: I1201 08:38:13.749188 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-pqhzv" event={"ID":"ad2a2834-a832-452e-bd49-73656a9cde6a","Type":"ContainerStarted","Data":"a4172d14cb308075c7f3ebb6a4c31303beb3274a663a3f2c25529bff87227cd9"} Dec 01 08:38:13 crc kubenswrapper[5004]: I1201 08:38:13.750425 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-vbwft" event={"ID":"24875a21-fff6-4224-aa6c-a3264f6f0c26","Type":"ContainerStarted","Data":"6a0fd550170224ebe9d0d1d1d0f680b16a2ca1867996d762c08f8776ea18346c"} Dec 01 08:38:15 crc kubenswrapper[5004]: I1201 08:38:15.173206 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-vbwft"] Dec 01 08:38:15 crc kubenswrapper[5004]: I1201 08:38:15.201149 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-hrcgz"] Dec 01 08:38:15 crc kubenswrapper[5004]: I1201 08:38:15.204554 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-hrcgz" Dec 01 08:38:15 crc kubenswrapper[5004]: I1201 08:38:15.213833 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-hrcgz"] Dec 01 08:38:15 crc kubenswrapper[5004]: I1201 08:38:15.360467 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7fe1d5ea-e24c-44b9-9de2-5011b3fc04fd-config\") pod \"dnsmasq-dns-666b6646f7-hrcgz\" (UID: \"7fe1d5ea-e24c-44b9-9de2-5011b3fc04fd\") " pod="openstack/dnsmasq-dns-666b6646f7-hrcgz" Dec 01 08:38:15 crc kubenswrapper[5004]: I1201 08:38:15.360527 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vsp2g\" (UniqueName: \"kubernetes.io/projected/7fe1d5ea-e24c-44b9-9de2-5011b3fc04fd-kube-api-access-vsp2g\") pod \"dnsmasq-dns-666b6646f7-hrcgz\" (UID: \"7fe1d5ea-e24c-44b9-9de2-5011b3fc04fd\") " pod="openstack/dnsmasq-dns-666b6646f7-hrcgz" Dec 01 08:38:15 crc kubenswrapper[5004]: I1201 08:38:15.360731 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7fe1d5ea-e24c-44b9-9de2-5011b3fc04fd-dns-svc\") pod \"dnsmasq-dns-666b6646f7-hrcgz\" (UID: \"7fe1d5ea-e24c-44b9-9de2-5011b3fc04fd\") " pod="openstack/dnsmasq-dns-666b6646f7-hrcgz" Dec 01 08:38:15 crc kubenswrapper[5004]: I1201 08:38:15.462315 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vsp2g\" (UniqueName: \"kubernetes.io/projected/7fe1d5ea-e24c-44b9-9de2-5011b3fc04fd-kube-api-access-vsp2g\") pod \"dnsmasq-dns-666b6646f7-hrcgz\" (UID: \"7fe1d5ea-e24c-44b9-9de2-5011b3fc04fd\") " pod="openstack/dnsmasq-dns-666b6646f7-hrcgz" Dec 01 08:38:15 crc kubenswrapper[5004]: I1201 08:38:15.462431 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7fe1d5ea-e24c-44b9-9de2-5011b3fc04fd-dns-svc\") pod \"dnsmasq-dns-666b6646f7-hrcgz\" (UID: \"7fe1d5ea-e24c-44b9-9de2-5011b3fc04fd\") " pod="openstack/dnsmasq-dns-666b6646f7-hrcgz" Dec 01 08:38:15 crc kubenswrapper[5004]: I1201 08:38:15.462510 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7fe1d5ea-e24c-44b9-9de2-5011b3fc04fd-config\") pod \"dnsmasq-dns-666b6646f7-hrcgz\" (UID: \"7fe1d5ea-e24c-44b9-9de2-5011b3fc04fd\") " pod="openstack/dnsmasq-dns-666b6646f7-hrcgz" Dec 01 08:38:15 crc kubenswrapper[5004]: I1201 08:38:15.463354 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7fe1d5ea-e24c-44b9-9de2-5011b3fc04fd-config\") pod \"dnsmasq-dns-666b6646f7-hrcgz\" (UID: \"7fe1d5ea-e24c-44b9-9de2-5011b3fc04fd\") " pod="openstack/dnsmasq-dns-666b6646f7-hrcgz" Dec 01 08:38:15 crc kubenswrapper[5004]: I1201 08:38:15.463402 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7fe1d5ea-e24c-44b9-9de2-5011b3fc04fd-dns-svc\") pod \"dnsmasq-dns-666b6646f7-hrcgz\" (UID: \"7fe1d5ea-e24c-44b9-9de2-5011b3fc04fd\") " pod="openstack/dnsmasq-dns-666b6646f7-hrcgz" Dec 01 08:38:15 crc kubenswrapper[5004]: I1201 08:38:15.500484 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vsp2g\" (UniqueName: \"kubernetes.io/projected/7fe1d5ea-e24c-44b9-9de2-5011b3fc04fd-kube-api-access-vsp2g\") pod \"dnsmasq-dns-666b6646f7-hrcgz\" (UID: \"7fe1d5ea-e24c-44b9-9de2-5011b3fc04fd\") " pod="openstack/dnsmasq-dns-666b6646f7-hrcgz" Dec 01 08:38:15 crc kubenswrapper[5004]: I1201 08:38:15.535050 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-hrcgz" Dec 01 08:38:15 crc kubenswrapper[5004]: I1201 08:38:15.583078 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-pqhzv"] Dec 01 08:38:15 crc kubenswrapper[5004]: I1201 08:38:15.621813 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-bvrdk"] Dec 01 08:38:15 crc kubenswrapper[5004]: I1201 08:38:15.623326 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-bvrdk" Dec 01 08:38:15 crc kubenswrapper[5004]: I1201 08:38:15.634972 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-bvrdk"] Dec 01 08:38:15 crc kubenswrapper[5004]: I1201 08:38:15.670286 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bbab4428-7b77-4b11-87b3-d720250c9b77-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-bvrdk\" (UID: \"bbab4428-7b77-4b11-87b3-d720250c9b77\") " pod="openstack/dnsmasq-dns-57d769cc4f-bvrdk" Dec 01 08:38:15 crc kubenswrapper[5004]: I1201 08:38:15.670321 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pltws\" (UniqueName: \"kubernetes.io/projected/bbab4428-7b77-4b11-87b3-d720250c9b77-kube-api-access-pltws\") pod \"dnsmasq-dns-57d769cc4f-bvrdk\" (UID: \"bbab4428-7b77-4b11-87b3-d720250c9b77\") " pod="openstack/dnsmasq-dns-57d769cc4f-bvrdk" Dec 01 08:38:15 crc kubenswrapper[5004]: I1201 08:38:15.670429 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbab4428-7b77-4b11-87b3-d720250c9b77-config\") pod \"dnsmasq-dns-57d769cc4f-bvrdk\" (UID: \"bbab4428-7b77-4b11-87b3-d720250c9b77\") " pod="openstack/dnsmasq-dns-57d769cc4f-bvrdk" Dec 01 08:38:15 crc kubenswrapper[5004]: I1201 
08:38:15.774933 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbab4428-7b77-4b11-87b3-d720250c9b77-config\") pod \"dnsmasq-dns-57d769cc4f-bvrdk\" (UID: \"bbab4428-7b77-4b11-87b3-d720250c9b77\") " pod="openstack/dnsmasq-dns-57d769cc4f-bvrdk" Dec 01 08:38:15 crc kubenswrapper[5004]: I1201 08:38:15.774997 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bbab4428-7b77-4b11-87b3-d720250c9b77-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-bvrdk\" (UID: \"bbab4428-7b77-4b11-87b3-d720250c9b77\") " pod="openstack/dnsmasq-dns-57d769cc4f-bvrdk" Dec 01 08:38:15 crc kubenswrapper[5004]: I1201 08:38:15.775079 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pltws\" (UniqueName: \"kubernetes.io/projected/bbab4428-7b77-4b11-87b3-d720250c9b77-kube-api-access-pltws\") pod \"dnsmasq-dns-57d769cc4f-bvrdk\" (UID: \"bbab4428-7b77-4b11-87b3-d720250c9b77\") " pod="openstack/dnsmasq-dns-57d769cc4f-bvrdk" Dec 01 08:38:15 crc kubenswrapper[5004]: I1201 08:38:15.775990 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbab4428-7b77-4b11-87b3-d720250c9b77-config\") pod \"dnsmasq-dns-57d769cc4f-bvrdk\" (UID: \"bbab4428-7b77-4b11-87b3-d720250c9b77\") " pod="openstack/dnsmasq-dns-57d769cc4f-bvrdk" Dec 01 08:38:15 crc kubenswrapper[5004]: I1201 08:38:15.777418 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bbab4428-7b77-4b11-87b3-d720250c9b77-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-bvrdk\" (UID: \"bbab4428-7b77-4b11-87b3-d720250c9b77\") " pod="openstack/dnsmasq-dns-57d769cc4f-bvrdk" Dec 01 08:38:15 crc kubenswrapper[5004]: I1201 08:38:15.809694 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pltws\" 
(UniqueName: \"kubernetes.io/projected/bbab4428-7b77-4b11-87b3-d720250c9b77-kube-api-access-pltws\") pod \"dnsmasq-dns-57d769cc4f-bvrdk\" (UID: \"bbab4428-7b77-4b11-87b3-d720250c9b77\") " pod="openstack/dnsmasq-dns-57d769cc4f-bvrdk" Dec 01 08:38:15 crc kubenswrapper[5004]: I1201 08:38:15.993245 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-bvrdk" Dec 01 08:38:16 crc kubenswrapper[5004]: I1201 08:38:16.298635 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-hrcgz"] Dec 01 08:38:16 crc kubenswrapper[5004]: W1201 08:38:16.317043 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7fe1d5ea_e24c_44b9_9de2_5011b3fc04fd.slice/crio-1f869fa7e415961f96123299de2ed0526f9521e7cfd496f9c69b5c7e73e24d8d WatchSource:0}: Error finding container 1f869fa7e415961f96123299de2ed0526f9521e7cfd496f9c69b5c7e73e24d8d: Status 404 returned error can't find the container with id 1f869fa7e415961f96123299de2ed0526f9521e7cfd496f9c69b5c7e73e24d8d Dec 01 08:38:16 crc kubenswrapper[5004]: I1201 08:38:16.360853 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Dec 01 08:38:16 crc kubenswrapper[5004]: I1201 08:38:16.362619 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 01 08:38:16 crc kubenswrapper[5004]: I1201 08:38:16.370454 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Dec 01 08:38:16 crc kubenswrapper[5004]: I1201 08:38:16.370613 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Dec 01 08:38:16 crc kubenswrapper[5004]: I1201 08:38:16.370716 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-qpv7q" Dec 01 08:38:16 crc kubenswrapper[5004]: I1201 08:38:16.370807 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Dec 01 08:38:16 crc kubenswrapper[5004]: I1201 08:38:16.371164 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Dec 01 08:38:16 crc kubenswrapper[5004]: I1201 08:38:16.371254 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Dec 01 08:38:16 crc kubenswrapper[5004]: I1201 08:38:16.388530 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Dec 01 08:38:16 crc kubenswrapper[5004]: I1201 08:38:16.410837 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 01 08:38:16 crc kubenswrapper[5004]: I1201 08:38:16.503723 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e17a426c-0069-4c51-91ad-e5fbf6e0bb2a-config-data\") pod \"rabbitmq-server-0\" (UID: \"e17a426c-0069-4c51-91ad-e5fbf6e0bb2a\") " pod="openstack/rabbitmq-server-0" Dec 01 08:38:16 crc kubenswrapper[5004]: I1201 08:38:16.503764 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/e17a426c-0069-4c51-91ad-e5fbf6e0bb2a-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"e17a426c-0069-4c51-91ad-e5fbf6e0bb2a\") " pod="openstack/rabbitmq-server-0" Dec 01 08:38:16 crc kubenswrapper[5004]: I1201 08:38:16.503790 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e17a426c-0069-4c51-91ad-e5fbf6e0bb2a-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"e17a426c-0069-4c51-91ad-e5fbf6e0bb2a\") " pod="openstack/rabbitmq-server-0" Dec 01 08:38:16 crc kubenswrapper[5004]: I1201 08:38:16.503812 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e17a426c-0069-4c51-91ad-e5fbf6e0bb2a-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"e17a426c-0069-4c51-91ad-e5fbf6e0bb2a\") " pod="openstack/rabbitmq-server-0" Dec 01 08:38:16 crc kubenswrapper[5004]: I1201 08:38:16.503839 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e17a426c-0069-4c51-91ad-e5fbf6e0bb2a-pod-info\") pod \"rabbitmq-server-0\" (UID: \"e17a426c-0069-4c51-91ad-e5fbf6e0bb2a\") " pod="openstack/rabbitmq-server-0" Dec 01 08:38:16 crc kubenswrapper[5004]: I1201 08:38:16.503856 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e17a426c-0069-4c51-91ad-e5fbf6e0bb2a-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"e17a426c-0069-4c51-91ad-e5fbf6e0bb2a\") " pod="openstack/rabbitmq-server-0" Dec 01 08:38:16 crc kubenswrapper[5004]: I1201 08:38:16.503884 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ph2g8\" (UniqueName: 
\"kubernetes.io/projected/e17a426c-0069-4c51-91ad-e5fbf6e0bb2a-kube-api-access-ph2g8\") pod \"rabbitmq-server-0\" (UID: \"e17a426c-0069-4c51-91ad-e5fbf6e0bb2a\") " pod="openstack/rabbitmq-server-0" Dec 01 08:38:16 crc kubenswrapper[5004]: I1201 08:38:16.503904 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e17a426c-0069-4c51-91ad-e5fbf6e0bb2a-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"e17a426c-0069-4c51-91ad-e5fbf6e0bb2a\") " pod="openstack/rabbitmq-server-0" Dec 01 08:38:16 crc kubenswrapper[5004]: I1201 08:38:16.503932 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"e17a426c-0069-4c51-91ad-e5fbf6e0bb2a\") " pod="openstack/rabbitmq-server-0" Dec 01 08:38:16 crc kubenswrapper[5004]: I1201 08:38:16.503946 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e17a426c-0069-4c51-91ad-e5fbf6e0bb2a-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"e17a426c-0069-4c51-91ad-e5fbf6e0bb2a\") " pod="openstack/rabbitmq-server-0" Dec 01 08:38:16 crc kubenswrapper[5004]: I1201 08:38:16.503965 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e17a426c-0069-4c51-91ad-e5fbf6e0bb2a-server-conf\") pod \"rabbitmq-server-0\" (UID: \"e17a426c-0069-4c51-91ad-e5fbf6e0bb2a\") " pod="openstack/rabbitmq-server-0" Dec 01 08:38:16 crc kubenswrapper[5004]: I1201 08:38:16.542814 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-bvrdk"] Dec 01 08:38:16 crc kubenswrapper[5004]: I1201 08:38:16.605815 5004 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"e17a426c-0069-4c51-91ad-e5fbf6e0bb2a\") " pod="openstack/rabbitmq-server-0" Dec 01 08:38:16 crc kubenswrapper[5004]: I1201 08:38:16.606142 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e17a426c-0069-4c51-91ad-e5fbf6e0bb2a-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"e17a426c-0069-4c51-91ad-e5fbf6e0bb2a\") " pod="openstack/rabbitmq-server-0" Dec 01 08:38:16 crc kubenswrapper[5004]: I1201 08:38:16.606166 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e17a426c-0069-4c51-91ad-e5fbf6e0bb2a-server-conf\") pod \"rabbitmq-server-0\" (UID: \"e17a426c-0069-4c51-91ad-e5fbf6e0bb2a\") " pod="openstack/rabbitmq-server-0" Dec 01 08:38:16 crc kubenswrapper[5004]: I1201 08:38:16.606255 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e17a426c-0069-4c51-91ad-e5fbf6e0bb2a-config-data\") pod \"rabbitmq-server-0\" (UID: \"e17a426c-0069-4c51-91ad-e5fbf6e0bb2a\") " pod="openstack/rabbitmq-server-0" Dec 01 08:38:16 crc kubenswrapper[5004]: I1201 08:38:16.606257 5004 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"e17a426c-0069-4c51-91ad-e5fbf6e0bb2a\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/rabbitmq-server-0" Dec 01 08:38:16 crc kubenswrapper[5004]: I1201 08:38:16.606279 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e17a426c-0069-4c51-91ad-e5fbf6e0bb2a-rabbitmq-tls\") pod 
\"rabbitmq-server-0\" (UID: \"e17a426c-0069-4c51-91ad-e5fbf6e0bb2a\") " pod="openstack/rabbitmq-server-0" Dec 01 08:38:16 crc kubenswrapper[5004]: I1201 08:38:16.606301 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e17a426c-0069-4c51-91ad-e5fbf6e0bb2a-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"e17a426c-0069-4c51-91ad-e5fbf6e0bb2a\") " pod="openstack/rabbitmq-server-0" Dec 01 08:38:16 crc kubenswrapper[5004]: I1201 08:38:16.606325 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e17a426c-0069-4c51-91ad-e5fbf6e0bb2a-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"e17a426c-0069-4c51-91ad-e5fbf6e0bb2a\") " pod="openstack/rabbitmq-server-0" Dec 01 08:38:16 crc kubenswrapper[5004]: I1201 08:38:16.606346 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e17a426c-0069-4c51-91ad-e5fbf6e0bb2a-pod-info\") pod \"rabbitmq-server-0\" (UID: \"e17a426c-0069-4c51-91ad-e5fbf6e0bb2a\") " pod="openstack/rabbitmq-server-0" Dec 01 08:38:16 crc kubenswrapper[5004]: I1201 08:38:16.606367 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e17a426c-0069-4c51-91ad-e5fbf6e0bb2a-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"e17a426c-0069-4c51-91ad-e5fbf6e0bb2a\") " pod="openstack/rabbitmq-server-0" Dec 01 08:38:16 crc kubenswrapper[5004]: I1201 08:38:16.606395 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ph2g8\" (UniqueName: \"kubernetes.io/projected/e17a426c-0069-4c51-91ad-e5fbf6e0bb2a-kube-api-access-ph2g8\") pod \"rabbitmq-server-0\" (UID: \"e17a426c-0069-4c51-91ad-e5fbf6e0bb2a\") " pod="openstack/rabbitmq-server-0" Dec 01 08:38:16 crc 
kubenswrapper[5004]: I1201 08:38:16.606424 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e17a426c-0069-4c51-91ad-e5fbf6e0bb2a-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"e17a426c-0069-4c51-91ad-e5fbf6e0bb2a\") " pod="openstack/rabbitmq-server-0" Dec 01 08:38:16 crc kubenswrapper[5004]: I1201 08:38:16.607509 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e17a426c-0069-4c51-91ad-e5fbf6e0bb2a-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"e17a426c-0069-4c51-91ad-e5fbf6e0bb2a\") " pod="openstack/rabbitmq-server-0" Dec 01 08:38:16 crc kubenswrapper[5004]: I1201 08:38:16.607787 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e17a426c-0069-4c51-91ad-e5fbf6e0bb2a-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"e17a426c-0069-4c51-91ad-e5fbf6e0bb2a\") " pod="openstack/rabbitmq-server-0" Dec 01 08:38:16 crc kubenswrapper[5004]: I1201 08:38:16.612479 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e17a426c-0069-4c51-91ad-e5fbf6e0bb2a-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"e17a426c-0069-4c51-91ad-e5fbf6e0bb2a\") " pod="openstack/rabbitmq-server-0" Dec 01 08:38:16 crc kubenswrapper[5004]: I1201 08:38:16.612740 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e17a426c-0069-4c51-91ad-e5fbf6e0bb2a-server-conf\") pod \"rabbitmq-server-0\" (UID: \"e17a426c-0069-4c51-91ad-e5fbf6e0bb2a\") " pod="openstack/rabbitmq-server-0" Dec 01 08:38:16 crc kubenswrapper[5004]: I1201 08:38:16.613126 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/e17a426c-0069-4c51-91ad-e5fbf6e0bb2a-config-data\") pod \"rabbitmq-server-0\" (UID: \"e17a426c-0069-4c51-91ad-e5fbf6e0bb2a\") " pod="openstack/rabbitmq-server-0" Dec 01 08:38:16 crc kubenswrapper[5004]: I1201 08:38:16.614113 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e17a426c-0069-4c51-91ad-e5fbf6e0bb2a-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"e17a426c-0069-4c51-91ad-e5fbf6e0bb2a\") " pod="openstack/rabbitmq-server-0" Dec 01 08:38:16 crc kubenswrapper[5004]: I1201 08:38:16.614340 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e17a426c-0069-4c51-91ad-e5fbf6e0bb2a-pod-info\") pod \"rabbitmq-server-0\" (UID: \"e17a426c-0069-4c51-91ad-e5fbf6e0bb2a\") " pod="openstack/rabbitmq-server-0" Dec 01 08:38:16 crc kubenswrapper[5004]: I1201 08:38:16.616606 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e17a426c-0069-4c51-91ad-e5fbf6e0bb2a-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"e17a426c-0069-4c51-91ad-e5fbf6e0bb2a\") " pod="openstack/rabbitmq-server-0" Dec 01 08:38:16 crc kubenswrapper[5004]: I1201 08:38:16.616700 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e17a426c-0069-4c51-91ad-e5fbf6e0bb2a-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"e17a426c-0069-4c51-91ad-e5fbf6e0bb2a\") " pod="openstack/rabbitmq-server-0" Dec 01 08:38:16 crc kubenswrapper[5004]: I1201 08:38:16.633412 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ph2g8\" (UniqueName: \"kubernetes.io/projected/e17a426c-0069-4c51-91ad-e5fbf6e0bb2a-kube-api-access-ph2g8\") pod \"rabbitmq-server-0\" (UID: \"e17a426c-0069-4c51-91ad-e5fbf6e0bb2a\") " pod="openstack/rabbitmq-server-0" Dec 01 
08:38:16 crc kubenswrapper[5004]: I1201 08:38:16.636088 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"e17a426c-0069-4c51-91ad-e5fbf6e0bb2a\") " pod="openstack/rabbitmq-server-0" Dec 01 08:38:16 crc kubenswrapper[5004]: I1201 08:38:16.689080 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 01 08:38:16 crc kubenswrapper[5004]: I1201 08:38:16.746651 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 01 08:38:16 crc kubenswrapper[5004]: I1201 08:38:16.748051 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 01 08:38:16 crc kubenswrapper[5004]: I1201 08:38:16.750967 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Dec 01 08:38:16 crc kubenswrapper[5004]: I1201 08:38:16.751153 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-p9vkw" Dec 01 08:38:16 crc kubenswrapper[5004]: I1201 08:38:16.759171 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Dec 01 08:38:16 crc kubenswrapper[5004]: I1201 08:38:16.759427 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Dec 01 08:38:16 crc kubenswrapper[5004]: I1201 08:38:16.759935 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Dec 01 08:38:16 crc kubenswrapper[5004]: I1201 08:38:16.760081 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Dec 01 08:38:16 crc kubenswrapper[5004]: I1201 08:38:16.760201 5004 reflector.go:368] Caches populated for *v1.Secret 
from object-"openstack"/"rabbitmq-cell1-default-user" Dec 01 08:38:16 crc kubenswrapper[5004]: I1201 08:38:16.783436 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 01 08:38:16 crc kubenswrapper[5004]: I1201 08:38:16.788264 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-bvrdk" event={"ID":"bbab4428-7b77-4b11-87b3-d720250c9b77","Type":"ContainerStarted","Data":"de65a58fd6478b999c51e4d7295ab1840ebf1aba822aa704f9a9dea79943fec1"} Dec 01 08:38:16 crc kubenswrapper[5004]: I1201 08:38:16.789482 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-hrcgz" event={"ID":"7fe1d5ea-e24c-44b9-9de2-5011b3fc04fd","Type":"ContainerStarted","Data":"1f869fa7e415961f96123299de2ed0526f9521e7cfd496f9c69b5c7e73e24d8d"} Dec 01 08:38:16 crc kubenswrapper[5004]: I1201 08:38:16.916930 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"b571e2e5-2a78-45af-83aa-3d874b2569b3\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 08:38:16 crc kubenswrapper[5004]: I1201 08:38:16.916978 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b571e2e5-2a78-45af-83aa-3d874b2569b3-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"b571e2e5-2a78-45af-83aa-3d874b2569b3\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 08:38:16 crc kubenswrapper[5004]: I1201 08:38:16.917001 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b571e2e5-2a78-45af-83aa-3d874b2569b3-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"b571e2e5-2a78-45af-83aa-3d874b2569b3\") " 
pod="openstack/rabbitmq-cell1-server-0" Dec 01 08:38:16 crc kubenswrapper[5004]: I1201 08:38:16.917020 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b571e2e5-2a78-45af-83aa-3d874b2569b3-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"b571e2e5-2a78-45af-83aa-3d874b2569b3\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 08:38:16 crc kubenswrapper[5004]: I1201 08:38:16.917081 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b571e2e5-2a78-45af-83aa-3d874b2569b3-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"b571e2e5-2a78-45af-83aa-3d874b2569b3\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 08:38:16 crc kubenswrapper[5004]: I1201 08:38:16.917159 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b571e2e5-2a78-45af-83aa-3d874b2569b3-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"b571e2e5-2a78-45af-83aa-3d874b2569b3\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 08:38:16 crc kubenswrapper[5004]: I1201 08:38:16.917190 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b571e2e5-2a78-45af-83aa-3d874b2569b3-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"b571e2e5-2a78-45af-83aa-3d874b2569b3\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 08:38:16 crc kubenswrapper[5004]: I1201 08:38:16.917251 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b571e2e5-2a78-45af-83aa-3d874b2569b3-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b571e2e5-2a78-45af-83aa-3d874b2569b3\") " 
pod="openstack/rabbitmq-cell1-server-0" Dec 01 08:38:16 crc kubenswrapper[5004]: I1201 08:38:16.917272 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b571e2e5-2a78-45af-83aa-3d874b2569b3-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b571e2e5-2a78-45af-83aa-3d874b2569b3\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 08:38:16 crc kubenswrapper[5004]: I1201 08:38:16.917298 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b571e2e5-2a78-45af-83aa-3d874b2569b3-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"b571e2e5-2a78-45af-83aa-3d874b2569b3\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 08:38:16 crc kubenswrapper[5004]: I1201 08:38:16.917388 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vsg4t\" (UniqueName: \"kubernetes.io/projected/b571e2e5-2a78-45af-83aa-3d874b2569b3-kube-api-access-vsg4t\") pod \"rabbitmq-cell1-server-0\" (UID: \"b571e2e5-2a78-45af-83aa-3d874b2569b3\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 08:38:17 crc kubenswrapper[5004]: I1201 08:38:17.022617 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b571e2e5-2a78-45af-83aa-3d874b2569b3-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"b571e2e5-2a78-45af-83aa-3d874b2569b3\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 08:38:17 crc kubenswrapper[5004]: I1201 08:38:17.022675 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b571e2e5-2a78-45af-83aa-3d874b2569b3-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"b571e2e5-2a78-45af-83aa-3d874b2569b3\") " 
pod="openstack/rabbitmq-cell1-server-0" Dec 01 08:38:17 crc kubenswrapper[5004]: I1201 08:38:17.022715 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b571e2e5-2a78-45af-83aa-3d874b2569b3-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b571e2e5-2a78-45af-83aa-3d874b2569b3\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 08:38:17 crc kubenswrapper[5004]: I1201 08:38:17.022738 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b571e2e5-2a78-45af-83aa-3d874b2569b3-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b571e2e5-2a78-45af-83aa-3d874b2569b3\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 08:38:17 crc kubenswrapper[5004]: I1201 08:38:17.022765 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b571e2e5-2a78-45af-83aa-3d874b2569b3-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"b571e2e5-2a78-45af-83aa-3d874b2569b3\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 08:38:17 crc kubenswrapper[5004]: I1201 08:38:17.022827 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vsg4t\" (UniqueName: \"kubernetes.io/projected/b571e2e5-2a78-45af-83aa-3d874b2569b3-kube-api-access-vsg4t\") pod \"rabbitmq-cell1-server-0\" (UID: \"b571e2e5-2a78-45af-83aa-3d874b2569b3\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 08:38:17 crc kubenswrapper[5004]: I1201 08:38:17.022886 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"b571e2e5-2a78-45af-83aa-3d874b2569b3\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 08:38:17 crc kubenswrapper[5004]: I1201 08:38:17.022915 5004 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b571e2e5-2a78-45af-83aa-3d874b2569b3-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"b571e2e5-2a78-45af-83aa-3d874b2569b3\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 08:38:17 crc kubenswrapper[5004]: I1201 08:38:17.022942 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b571e2e5-2a78-45af-83aa-3d874b2569b3-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"b571e2e5-2a78-45af-83aa-3d874b2569b3\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 08:38:17 crc kubenswrapper[5004]: I1201 08:38:17.022961 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b571e2e5-2a78-45af-83aa-3d874b2569b3-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"b571e2e5-2a78-45af-83aa-3d874b2569b3\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 08:38:17 crc kubenswrapper[5004]: I1201 08:38:17.023004 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b571e2e5-2a78-45af-83aa-3d874b2569b3-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"b571e2e5-2a78-45af-83aa-3d874b2569b3\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 08:38:17 crc kubenswrapper[5004]: I1201 08:38:17.023451 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b571e2e5-2a78-45af-83aa-3d874b2569b3-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"b571e2e5-2a78-45af-83aa-3d874b2569b3\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 08:38:17 crc kubenswrapper[5004]: I1201 08:38:17.027475 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/b571e2e5-2a78-45af-83aa-3d874b2569b3-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"b571e2e5-2a78-45af-83aa-3d874b2569b3\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 08:38:17 crc kubenswrapper[5004]: I1201 08:38:17.027626 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b571e2e5-2a78-45af-83aa-3d874b2569b3-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b571e2e5-2a78-45af-83aa-3d874b2569b3\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 08:38:17 crc kubenswrapper[5004]: I1201 08:38:17.027724 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b571e2e5-2a78-45af-83aa-3d874b2569b3-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"b571e2e5-2a78-45af-83aa-3d874b2569b3\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 08:38:17 crc kubenswrapper[5004]: I1201 08:38:17.027977 5004 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"b571e2e5-2a78-45af-83aa-3d874b2569b3\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/rabbitmq-cell1-server-0" Dec 01 08:38:17 crc kubenswrapper[5004]: I1201 08:38:17.028088 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b571e2e5-2a78-45af-83aa-3d874b2569b3-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"b571e2e5-2a78-45af-83aa-3d874b2569b3\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 08:38:17 crc kubenswrapper[5004]: I1201 08:38:17.028581 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b571e2e5-2a78-45af-83aa-3d874b2569b3-server-conf\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"b571e2e5-2a78-45af-83aa-3d874b2569b3\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 08:38:17 crc kubenswrapper[5004]: I1201 08:38:17.029056 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b571e2e5-2a78-45af-83aa-3d874b2569b3-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"b571e2e5-2a78-45af-83aa-3d874b2569b3\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 08:38:17 crc kubenswrapper[5004]: I1201 08:38:17.032175 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b571e2e5-2a78-45af-83aa-3d874b2569b3-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"b571e2e5-2a78-45af-83aa-3d874b2569b3\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 08:38:17 crc kubenswrapper[5004]: I1201 08:38:17.032512 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b571e2e5-2a78-45af-83aa-3d874b2569b3-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"b571e2e5-2a78-45af-83aa-3d874b2569b3\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 08:38:17 crc kubenswrapper[5004]: I1201 08:38:17.043352 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vsg4t\" (UniqueName: \"kubernetes.io/projected/b571e2e5-2a78-45af-83aa-3d874b2569b3-kube-api-access-vsg4t\") pod \"rabbitmq-cell1-server-0\" (UID: \"b571e2e5-2a78-45af-83aa-3d874b2569b3\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 08:38:17 crc kubenswrapper[5004]: I1201 08:38:17.060828 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"b571e2e5-2a78-45af-83aa-3d874b2569b3\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 08:38:17 crc kubenswrapper[5004]: I1201 
08:38:17.078580 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 01 08:38:18 crc kubenswrapper[5004]: I1201 08:38:18.191394 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Dec 01 08:38:18 crc kubenswrapper[5004]: I1201 08:38:18.194264 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Dec 01 08:38:18 crc kubenswrapper[5004]: I1201 08:38:18.197546 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-nspmm" Dec 01 08:38:18 crc kubenswrapper[5004]: I1201 08:38:18.197808 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Dec 01 08:38:18 crc kubenswrapper[5004]: I1201 08:38:18.197936 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Dec 01 08:38:18 crc kubenswrapper[5004]: I1201 08:38:18.199868 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Dec 01 08:38:18 crc kubenswrapper[5004]: I1201 08:38:18.204183 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 01 08:38:18 crc kubenswrapper[5004]: I1201 08:38:18.209811 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Dec 01 08:38:18 crc kubenswrapper[5004]: I1201 08:38:18.356223 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/04a6dd3a-f297-40b9-b480-0239383b9460-config-data-default\") pod \"openstack-galera-0\" (UID: \"04a6dd3a-f297-40b9-b480-0239383b9460\") " pod="openstack/openstack-galera-0" Dec 01 08:38:18 crc kubenswrapper[5004]: I1201 08:38:18.356606 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04a6dd3a-f297-40b9-b480-0239383b9460-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"04a6dd3a-f297-40b9-b480-0239383b9460\") " pod="openstack/openstack-galera-0" Dec 01 08:38:18 crc kubenswrapper[5004]: I1201 08:38:18.356649 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/04a6dd3a-f297-40b9-b480-0239383b9460-config-data-generated\") pod \"openstack-galera-0\" (UID: \"04a6dd3a-f297-40b9-b480-0239383b9460\") " pod="openstack/openstack-galera-0" Dec 01 08:38:18 crc kubenswrapper[5004]: I1201 08:38:18.356696 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vptmm\" (UniqueName: \"kubernetes.io/projected/04a6dd3a-f297-40b9-b480-0239383b9460-kube-api-access-vptmm\") pod \"openstack-galera-0\" (UID: \"04a6dd3a-f297-40b9-b480-0239383b9460\") " pod="openstack/openstack-galera-0" Dec 01 08:38:18 crc kubenswrapper[5004]: I1201 08:38:18.356742 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/04a6dd3a-f297-40b9-b480-0239383b9460-kolla-config\") pod \"openstack-galera-0\" (UID: \"04a6dd3a-f297-40b9-b480-0239383b9460\") " pod="openstack/openstack-galera-0" Dec 01 08:38:18 crc kubenswrapper[5004]: I1201 08:38:18.356767 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/04a6dd3a-f297-40b9-b480-0239383b9460-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"04a6dd3a-f297-40b9-b480-0239383b9460\") " pod="openstack/openstack-galera-0" Dec 01 08:38:18 crc kubenswrapper[5004]: I1201 08:38:18.356800 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/04a6dd3a-f297-40b9-b480-0239383b9460-operator-scripts\") pod \"openstack-galera-0\" (UID: \"04a6dd3a-f297-40b9-b480-0239383b9460\") " pod="openstack/openstack-galera-0" Dec 01 08:38:18 crc kubenswrapper[5004]: I1201 08:38:18.356824 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-galera-0\" (UID: \"04a6dd3a-f297-40b9-b480-0239383b9460\") " pod="openstack/openstack-galera-0" Dec 01 08:38:18 crc kubenswrapper[5004]: I1201 08:38:18.458401 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/04a6dd3a-f297-40b9-b480-0239383b9460-config-data-default\") pod \"openstack-galera-0\" (UID: \"04a6dd3a-f297-40b9-b480-0239383b9460\") " pod="openstack/openstack-galera-0" Dec 01 08:38:18 crc kubenswrapper[5004]: I1201 08:38:18.458473 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04a6dd3a-f297-40b9-b480-0239383b9460-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"04a6dd3a-f297-40b9-b480-0239383b9460\") " pod="openstack/openstack-galera-0" Dec 01 08:38:18 crc kubenswrapper[5004]: I1201 08:38:18.458510 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/04a6dd3a-f297-40b9-b480-0239383b9460-config-data-generated\") pod \"openstack-galera-0\" (UID: \"04a6dd3a-f297-40b9-b480-0239383b9460\") " pod="openstack/openstack-galera-0" Dec 01 08:38:18 crc kubenswrapper[5004]: I1201 08:38:18.458548 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vptmm\" (UniqueName: \"kubernetes.io/projected/04a6dd3a-f297-40b9-b480-0239383b9460-kube-api-access-vptmm\") 
pod \"openstack-galera-0\" (UID: \"04a6dd3a-f297-40b9-b480-0239383b9460\") " pod="openstack/openstack-galera-0" Dec 01 08:38:18 crc kubenswrapper[5004]: I1201 08:38:18.458586 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/04a6dd3a-f297-40b9-b480-0239383b9460-kolla-config\") pod \"openstack-galera-0\" (UID: \"04a6dd3a-f297-40b9-b480-0239383b9460\") " pod="openstack/openstack-galera-0" Dec 01 08:38:18 crc kubenswrapper[5004]: I1201 08:38:18.458605 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/04a6dd3a-f297-40b9-b480-0239383b9460-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"04a6dd3a-f297-40b9-b480-0239383b9460\") " pod="openstack/openstack-galera-0" Dec 01 08:38:18 crc kubenswrapper[5004]: I1201 08:38:18.458660 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/04a6dd3a-f297-40b9-b480-0239383b9460-operator-scripts\") pod \"openstack-galera-0\" (UID: \"04a6dd3a-f297-40b9-b480-0239383b9460\") " pod="openstack/openstack-galera-0" Dec 01 08:38:18 crc kubenswrapper[5004]: I1201 08:38:18.458680 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-galera-0\" (UID: \"04a6dd3a-f297-40b9-b480-0239383b9460\") " pod="openstack/openstack-galera-0" Dec 01 08:38:18 crc kubenswrapper[5004]: I1201 08:38:18.459016 5004 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-galera-0\" (UID: \"04a6dd3a-f297-40b9-b480-0239383b9460\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/openstack-galera-0" Dec 01 08:38:18 crc kubenswrapper[5004]: I1201 
08:38:18.459442 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/04a6dd3a-f297-40b9-b480-0239383b9460-config-data-default\") pod \"openstack-galera-0\" (UID: \"04a6dd3a-f297-40b9-b480-0239383b9460\") " pod="openstack/openstack-galera-0" Dec 01 08:38:18 crc kubenswrapper[5004]: I1201 08:38:18.460099 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/04a6dd3a-f297-40b9-b480-0239383b9460-config-data-generated\") pod \"openstack-galera-0\" (UID: \"04a6dd3a-f297-40b9-b480-0239383b9460\") " pod="openstack/openstack-galera-0" Dec 01 08:38:18 crc kubenswrapper[5004]: I1201 08:38:18.460306 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/04a6dd3a-f297-40b9-b480-0239383b9460-kolla-config\") pod \"openstack-galera-0\" (UID: \"04a6dd3a-f297-40b9-b480-0239383b9460\") " pod="openstack/openstack-galera-0" Dec 01 08:38:18 crc kubenswrapper[5004]: I1201 08:38:18.460907 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/04a6dd3a-f297-40b9-b480-0239383b9460-operator-scripts\") pod \"openstack-galera-0\" (UID: \"04a6dd3a-f297-40b9-b480-0239383b9460\") " pod="openstack/openstack-galera-0" Dec 01 08:38:18 crc kubenswrapper[5004]: I1201 08:38:18.470614 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/04a6dd3a-f297-40b9-b480-0239383b9460-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"04a6dd3a-f297-40b9-b480-0239383b9460\") " pod="openstack/openstack-galera-0" Dec 01 08:38:18 crc kubenswrapper[5004]: I1201 08:38:18.472226 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/04a6dd3a-f297-40b9-b480-0239383b9460-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"04a6dd3a-f297-40b9-b480-0239383b9460\") " pod="openstack/openstack-galera-0" Dec 01 08:38:18 crc kubenswrapper[5004]: I1201 08:38:18.480695 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vptmm\" (UniqueName: \"kubernetes.io/projected/04a6dd3a-f297-40b9-b480-0239383b9460-kube-api-access-vptmm\") pod \"openstack-galera-0\" (UID: \"04a6dd3a-f297-40b9-b480-0239383b9460\") " pod="openstack/openstack-galera-0" Dec 01 08:38:18 crc kubenswrapper[5004]: I1201 08:38:18.527267 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-galera-0\" (UID: \"04a6dd3a-f297-40b9-b480-0239383b9460\") " pod="openstack/openstack-galera-0" Dec 01 08:38:18 crc kubenswrapper[5004]: I1201 08:38:18.822900 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Dec 01 08:38:19 crc kubenswrapper[5004]: I1201 08:38:19.557337 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 01 08:38:19 crc kubenswrapper[5004]: I1201 08:38:19.559864 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 01 08:38:19 crc kubenswrapper[5004]: I1201 08:38:19.562212 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Dec 01 08:38:19 crc kubenswrapper[5004]: I1201 08:38:19.562535 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-fn8mb" Dec 01 08:38:19 crc kubenswrapper[5004]: I1201 08:38:19.562617 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Dec 01 08:38:19 crc kubenswrapper[5004]: I1201 08:38:19.562738 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Dec 01 08:38:19 crc kubenswrapper[5004]: I1201 08:38:19.586378 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 01 08:38:19 crc kubenswrapper[5004]: I1201 08:38:19.687472 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/bb30b7a7-42e1-421b-8673-7f3c8f5cfae3-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"bb30b7a7-42e1-421b-8673-7f3c8f5cfae3\") " pod="openstack/openstack-cell1-galera-0" Dec 01 08:38:19 crc kubenswrapper[5004]: I1201 08:38:19.688039 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5rc5\" (UniqueName: \"kubernetes.io/projected/bb30b7a7-42e1-421b-8673-7f3c8f5cfae3-kube-api-access-n5rc5\") pod \"openstack-cell1-galera-0\" (UID: \"bb30b7a7-42e1-421b-8673-7f3c8f5cfae3\") " pod="openstack/openstack-cell1-galera-0" Dec 01 08:38:19 crc kubenswrapper[5004]: I1201 08:38:19.688128 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/bb30b7a7-42e1-421b-8673-7f3c8f5cfae3-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"bb30b7a7-42e1-421b-8673-7f3c8f5cfae3\") " pod="openstack/openstack-cell1-galera-0" Dec 01 08:38:19 crc kubenswrapper[5004]: I1201 08:38:19.688242 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bb30b7a7-42e1-421b-8673-7f3c8f5cfae3-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"bb30b7a7-42e1-421b-8673-7f3c8f5cfae3\") " pod="openstack/openstack-cell1-galera-0" Dec 01 08:38:19 crc kubenswrapper[5004]: I1201 08:38:19.688363 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-cell1-galera-0\" (UID: \"bb30b7a7-42e1-421b-8673-7f3c8f5cfae3\") " pod="openstack/openstack-cell1-galera-0" Dec 01 08:38:19 crc kubenswrapper[5004]: I1201 08:38:19.688435 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/bb30b7a7-42e1-421b-8673-7f3c8f5cfae3-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"bb30b7a7-42e1-421b-8673-7f3c8f5cfae3\") " pod="openstack/openstack-cell1-galera-0" Dec 01 08:38:19 crc kubenswrapper[5004]: I1201 08:38:19.688474 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb30b7a7-42e1-421b-8673-7f3c8f5cfae3-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"bb30b7a7-42e1-421b-8673-7f3c8f5cfae3\") " pod="openstack/openstack-cell1-galera-0" Dec 01 08:38:19 crc kubenswrapper[5004]: I1201 08:38:19.688519 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: 
\"kubernetes.io/configmap/bb30b7a7-42e1-421b-8673-7f3c8f5cfae3-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"bb30b7a7-42e1-421b-8673-7f3c8f5cfae3\") " pod="openstack/openstack-cell1-galera-0" Dec 01 08:38:19 crc kubenswrapper[5004]: I1201 08:38:19.785039 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Dec 01 08:38:19 crc kubenswrapper[5004]: I1201 08:38:19.786351 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Dec 01 08:38:19 crc kubenswrapper[5004]: I1201 08:38:19.789637 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-cell1-galera-0\" (UID: \"bb30b7a7-42e1-421b-8673-7f3c8f5cfae3\") " pod="openstack/openstack-cell1-galera-0" Dec 01 08:38:19 crc kubenswrapper[5004]: I1201 08:38:19.789688 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/bb30b7a7-42e1-421b-8673-7f3c8f5cfae3-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"bb30b7a7-42e1-421b-8673-7f3c8f5cfae3\") " pod="openstack/openstack-cell1-galera-0" Dec 01 08:38:19 crc kubenswrapper[5004]: I1201 08:38:19.789735 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb30b7a7-42e1-421b-8673-7f3c8f5cfae3-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"bb30b7a7-42e1-421b-8673-7f3c8f5cfae3\") " pod="openstack/openstack-cell1-galera-0" Dec 01 08:38:19 crc kubenswrapper[5004]: I1201 08:38:19.789764 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/bb30b7a7-42e1-421b-8673-7f3c8f5cfae3-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"bb30b7a7-42e1-421b-8673-7f3c8f5cfae3\") " 
pod="openstack/openstack-cell1-galera-0" Dec 01 08:38:19 crc kubenswrapper[5004]: I1201 08:38:19.789828 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/bb30b7a7-42e1-421b-8673-7f3c8f5cfae3-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"bb30b7a7-42e1-421b-8673-7f3c8f5cfae3\") " pod="openstack/openstack-cell1-galera-0" Dec 01 08:38:19 crc kubenswrapper[5004]: I1201 08:38:19.789952 5004 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-cell1-galera-0\" (UID: \"bb30b7a7-42e1-421b-8673-7f3c8f5cfae3\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/openstack-cell1-galera-0" Dec 01 08:38:19 crc kubenswrapper[5004]: I1201 08:38:19.790499 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/bb30b7a7-42e1-421b-8673-7f3c8f5cfae3-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"bb30b7a7-42e1-421b-8673-7f3c8f5cfae3\") " pod="openstack/openstack-cell1-galera-0" Dec 01 08:38:19 crc kubenswrapper[5004]: I1201 08:38:19.790696 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/bb30b7a7-42e1-421b-8673-7f3c8f5cfae3-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"bb30b7a7-42e1-421b-8673-7f3c8f5cfae3\") " pod="openstack/openstack-cell1-galera-0" Dec 01 08:38:19 crc kubenswrapper[5004]: I1201 08:38:19.791069 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/bb30b7a7-42e1-421b-8673-7f3c8f5cfae3-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"bb30b7a7-42e1-421b-8673-7f3c8f5cfae3\") " pod="openstack/openstack-cell1-galera-0" Dec 01 08:38:19 crc 
kubenswrapper[5004]: I1201 08:38:19.797251 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Dec 01 08:38:19 crc kubenswrapper[5004]: I1201 08:38:19.797427 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-pcklt" Dec 01 08:38:19 crc kubenswrapper[5004]: I1201 08:38:19.797507 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb30b7a7-42e1-421b-8673-7f3c8f5cfae3-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"bb30b7a7-42e1-421b-8673-7f3c8f5cfae3\") " pod="openstack/openstack-cell1-galera-0" Dec 01 08:38:19 crc kubenswrapper[5004]: I1201 08:38:19.797789 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Dec 01 08:38:19 crc kubenswrapper[5004]: I1201 08:38:19.800512 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5rc5\" (UniqueName: \"kubernetes.io/projected/bb30b7a7-42e1-421b-8673-7f3c8f5cfae3-kube-api-access-n5rc5\") pod \"openstack-cell1-galera-0\" (UID: \"bb30b7a7-42e1-421b-8673-7f3c8f5cfae3\") " pod="openstack/openstack-cell1-galera-0" Dec 01 08:38:19 crc kubenswrapper[5004]: I1201 08:38:19.800618 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb30b7a7-42e1-421b-8673-7f3c8f5cfae3-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"bb30b7a7-42e1-421b-8673-7f3c8f5cfae3\") " pod="openstack/openstack-cell1-galera-0" Dec 01 08:38:19 crc kubenswrapper[5004]: I1201 08:38:19.800707 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bb30b7a7-42e1-421b-8673-7f3c8f5cfae3-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"bb30b7a7-42e1-421b-8673-7f3c8f5cfae3\") " 
pod="openstack/openstack-cell1-galera-0" Dec 01 08:38:19 crc kubenswrapper[5004]: I1201 08:38:19.805881 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 01 08:38:19 crc kubenswrapper[5004]: I1201 08:38:19.806401 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb30b7a7-42e1-421b-8673-7f3c8f5cfae3-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"bb30b7a7-42e1-421b-8673-7f3c8f5cfae3\") " pod="openstack/openstack-cell1-galera-0" Dec 01 08:38:19 crc kubenswrapper[5004]: I1201 08:38:19.827004 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bb30b7a7-42e1-421b-8673-7f3c8f5cfae3-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"bb30b7a7-42e1-421b-8673-7f3c8f5cfae3\") " pod="openstack/openstack-cell1-galera-0" Dec 01 08:38:19 crc kubenswrapper[5004]: I1201 08:38:19.837520 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5rc5\" (UniqueName: \"kubernetes.io/projected/bb30b7a7-42e1-421b-8673-7f3c8f5cfae3-kube-api-access-n5rc5\") pod \"openstack-cell1-galera-0\" (UID: \"bb30b7a7-42e1-421b-8673-7f3c8f5cfae3\") " pod="openstack/openstack-cell1-galera-0" Dec 01 08:38:19 crc kubenswrapper[5004]: I1201 08:38:19.862148 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-cell1-galera-0\" (UID: \"bb30b7a7-42e1-421b-8673-7f3c8f5cfae3\") " pod="openstack/openstack-cell1-galera-0" Dec 01 08:38:19 crc kubenswrapper[5004]: I1201 08:38:19.882789 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 01 08:38:19 crc kubenswrapper[5004]: I1201 08:38:19.902223 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/08dfc50d-80b8-4885-826d-4a8314b46234-config-data\") pod \"memcached-0\" (UID: \"08dfc50d-80b8-4885-826d-4a8314b46234\") " pod="openstack/memcached-0" Dec 01 08:38:19 crc kubenswrapper[5004]: I1201 08:38:19.902327 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08dfc50d-80b8-4885-826d-4a8314b46234-combined-ca-bundle\") pod \"memcached-0\" (UID: \"08dfc50d-80b8-4885-826d-4a8314b46234\") " pod="openstack/memcached-0" Dec 01 08:38:19 crc kubenswrapper[5004]: I1201 08:38:19.902368 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/08dfc50d-80b8-4885-826d-4a8314b46234-memcached-tls-certs\") pod \"memcached-0\" (UID: \"08dfc50d-80b8-4885-826d-4a8314b46234\") " pod="openstack/memcached-0" Dec 01 08:38:19 crc kubenswrapper[5004]: I1201 08:38:19.902421 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jztqr\" (UniqueName: \"kubernetes.io/projected/08dfc50d-80b8-4885-826d-4a8314b46234-kube-api-access-jztqr\") pod \"memcached-0\" (UID: \"08dfc50d-80b8-4885-826d-4a8314b46234\") " pod="openstack/memcached-0" Dec 01 08:38:19 crc kubenswrapper[5004]: I1201 08:38:19.902493 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/08dfc50d-80b8-4885-826d-4a8314b46234-kolla-config\") pod \"memcached-0\" (UID: \"08dfc50d-80b8-4885-826d-4a8314b46234\") " pod="openstack/memcached-0" Dec 01 08:38:20 crc kubenswrapper[5004]: I1201 
08:38:20.004725 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08dfc50d-80b8-4885-826d-4a8314b46234-combined-ca-bundle\") pod \"memcached-0\" (UID: \"08dfc50d-80b8-4885-826d-4a8314b46234\") " pod="openstack/memcached-0" Dec 01 08:38:20 crc kubenswrapper[5004]: I1201 08:38:20.004780 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/08dfc50d-80b8-4885-826d-4a8314b46234-memcached-tls-certs\") pod \"memcached-0\" (UID: \"08dfc50d-80b8-4885-826d-4a8314b46234\") " pod="openstack/memcached-0" Dec 01 08:38:20 crc kubenswrapper[5004]: I1201 08:38:20.004826 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jztqr\" (UniqueName: \"kubernetes.io/projected/08dfc50d-80b8-4885-826d-4a8314b46234-kube-api-access-jztqr\") pod \"memcached-0\" (UID: \"08dfc50d-80b8-4885-826d-4a8314b46234\") " pod="openstack/memcached-0" Dec 01 08:38:20 crc kubenswrapper[5004]: I1201 08:38:20.004887 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/08dfc50d-80b8-4885-826d-4a8314b46234-kolla-config\") pod \"memcached-0\" (UID: \"08dfc50d-80b8-4885-826d-4a8314b46234\") " pod="openstack/memcached-0" Dec 01 08:38:20 crc kubenswrapper[5004]: I1201 08:38:20.004932 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/08dfc50d-80b8-4885-826d-4a8314b46234-config-data\") pod \"memcached-0\" (UID: \"08dfc50d-80b8-4885-826d-4a8314b46234\") " pod="openstack/memcached-0" Dec 01 08:38:20 crc kubenswrapper[5004]: I1201 08:38:20.005777 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/08dfc50d-80b8-4885-826d-4a8314b46234-config-data\") pod 
\"memcached-0\" (UID: \"08dfc50d-80b8-4885-826d-4a8314b46234\") " pod="openstack/memcached-0" Dec 01 08:38:20 crc kubenswrapper[5004]: I1201 08:38:20.006989 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/08dfc50d-80b8-4885-826d-4a8314b46234-kolla-config\") pod \"memcached-0\" (UID: \"08dfc50d-80b8-4885-826d-4a8314b46234\") " pod="openstack/memcached-0" Dec 01 08:38:20 crc kubenswrapper[5004]: I1201 08:38:20.008540 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08dfc50d-80b8-4885-826d-4a8314b46234-combined-ca-bundle\") pod \"memcached-0\" (UID: \"08dfc50d-80b8-4885-826d-4a8314b46234\") " pod="openstack/memcached-0" Dec 01 08:38:20 crc kubenswrapper[5004]: I1201 08:38:20.018817 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/08dfc50d-80b8-4885-826d-4a8314b46234-memcached-tls-certs\") pod \"memcached-0\" (UID: \"08dfc50d-80b8-4885-826d-4a8314b46234\") " pod="openstack/memcached-0" Dec 01 08:38:20 crc kubenswrapper[5004]: I1201 08:38:20.026511 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jztqr\" (UniqueName: \"kubernetes.io/projected/08dfc50d-80b8-4885-826d-4a8314b46234-kube-api-access-jztqr\") pod \"memcached-0\" (UID: \"08dfc50d-80b8-4885-826d-4a8314b46234\") " pod="openstack/memcached-0" Dec 01 08:38:20 crc kubenswrapper[5004]: I1201 08:38:20.209725 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Dec 01 08:38:21 crc kubenswrapper[5004]: I1201 08:38:21.816988 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 01 08:38:21 crc kubenswrapper[5004]: I1201 08:38:21.829411 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 01 08:38:21 crc kubenswrapper[5004]: I1201 08:38:21.829610 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 01 08:38:21 crc kubenswrapper[5004]: I1201 08:38:21.866219 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-7jvm2" Dec 01 08:38:21 crc kubenswrapper[5004]: I1201 08:38:21.972185 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6swp\" (UniqueName: \"kubernetes.io/projected/5fcd0c1b-820e-4bc1-b3fd-22ac13415e3c-kube-api-access-c6swp\") pod \"kube-state-metrics-0\" (UID: \"5fcd0c1b-820e-4bc1-b3fd-22ac13415e3c\") " pod="openstack/kube-state-metrics-0" Dec 01 08:38:22 crc kubenswrapper[5004]: I1201 08:38:22.074473 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6swp\" (UniqueName: \"kubernetes.io/projected/5fcd0c1b-820e-4bc1-b3fd-22ac13415e3c-kube-api-access-c6swp\") pod \"kube-state-metrics-0\" (UID: \"5fcd0c1b-820e-4bc1-b3fd-22ac13415e3c\") " pod="openstack/kube-state-metrics-0" Dec 01 08:38:22 crc kubenswrapper[5004]: I1201 08:38:22.107813 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6swp\" (UniqueName: \"kubernetes.io/projected/5fcd0c1b-820e-4bc1-b3fd-22ac13415e3c-kube-api-access-c6swp\") pod \"kube-state-metrics-0\" (UID: \"5fcd0c1b-820e-4bc1-b3fd-22ac13415e3c\") " pod="openstack/kube-state-metrics-0" Dec 01 08:38:22 crc kubenswrapper[5004]: I1201 08:38:22.191117 5004 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 01 08:38:22 crc kubenswrapper[5004]: I1201 08:38:22.603078 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-ui-dashboards-7d5fb4cbfb-5htt9"] Dec 01 08:38:22 crc kubenswrapper[5004]: I1201 08:38:22.605118 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-5htt9" Dec 01 08:38:22 crc kubenswrapper[5004]: I1201 08:38:22.610427 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-ui-dashboards" Dec 01 08:38:22 crc kubenswrapper[5004]: I1201 08:38:22.611384 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-ui-dashboards-sa-dockercfg-2fnvk" Dec 01 08:38:22 crc kubenswrapper[5004]: I1201 08:38:22.623261 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-ui-dashboards-7d5fb4cbfb-5htt9"] Dec 01 08:38:22 crc kubenswrapper[5004]: I1201 08:38:22.789607 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1ff890d7-d00c-4b87-86d6-3eb403821ee3-serving-cert\") pod \"observability-ui-dashboards-7d5fb4cbfb-5htt9\" (UID: \"1ff890d7-d00c-4b87-86d6-3eb403821ee3\") " pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-5htt9" Dec 01 08:38:22 crc kubenswrapper[5004]: I1201 08:38:22.789727 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6q2d8\" (UniqueName: \"kubernetes.io/projected/1ff890d7-d00c-4b87-86d6-3eb403821ee3-kube-api-access-6q2d8\") pod \"observability-ui-dashboards-7d5fb4cbfb-5htt9\" (UID: \"1ff890d7-d00c-4b87-86d6-3eb403821ee3\") " pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-5htt9" Dec 01 08:38:22 crc kubenswrapper[5004]: I1201 
08:38:22.891418 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1ff890d7-d00c-4b87-86d6-3eb403821ee3-serving-cert\") pod \"observability-ui-dashboards-7d5fb4cbfb-5htt9\" (UID: \"1ff890d7-d00c-4b87-86d6-3eb403821ee3\") " pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-5htt9" Dec 01 08:38:22 crc kubenswrapper[5004]: I1201 08:38:22.891670 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6q2d8\" (UniqueName: \"kubernetes.io/projected/1ff890d7-d00c-4b87-86d6-3eb403821ee3-kube-api-access-6q2d8\") pod \"observability-ui-dashboards-7d5fb4cbfb-5htt9\" (UID: \"1ff890d7-d00c-4b87-86d6-3eb403821ee3\") " pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-5htt9" Dec 01 08:38:22 crc kubenswrapper[5004]: I1201 08:38:22.899210 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1ff890d7-d00c-4b87-86d6-3eb403821ee3-serving-cert\") pod \"observability-ui-dashboards-7d5fb4cbfb-5htt9\" (UID: \"1ff890d7-d00c-4b87-86d6-3eb403821ee3\") " pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-5htt9" Dec 01 08:38:22 crc kubenswrapper[5004]: I1201 08:38:22.924620 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-7cc4f8f495-54q5l"] Dec 01 08:38:22 crc kubenswrapper[5004]: I1201 08:38:22.925940 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7cc4f8f495-54q5l" Dec 01 08:38:22 crc kubenswrapper[5004]: I1201 08:38:22.936450 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7cc4f8f495-54q5l"] Dec 01 08:38:22 crc kubenswrapper[5004]: I1201 08:38:22.949623 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6q2d8\" (UniqueName: \"kubernetes.io/projected/1ff890d7-d00c-4b87-86d6-3eb403821ee3-kube-api-access-6q2d8\") pod \"observability-ui-dashboards-7d5fb4cbfb-5htt9\" (UID: \"1ff890d7-d00c-4b87-86d6-3eb403821ee3\") " pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-5htt9" Dec 01 08:38:23 crc kubenswrapper[5004]: I1201 08:38:23.103323 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e4e3584c-a3b8-4fef-99cf-fd56218b1299-console-oauth-config\") pod \"console-7cc4f8f495-54q5l\" (UID: \"e4e3584c-a3b8-4fef-99cf-fd56218b1299\") " pod="openshift-console/console-7cc4f8f495-54q5l" Dec 01 08:38:23 crc kubenswrapper[5004]: I1201 08:38:23.103951 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e4e3584c-a3b8-4fef-99cf-fd56218b1299-oauth-serving-cert\") pod \"console-7cc4f8f495-54q5l\" (UID: \"e4e3584c-a3b8-4fef-99cf-fd56218b1299\") " pod="openshift-console/console-7cc4f8f495-54q5l" Dec 01 08:38:23 crc kubenswrapper[5004]: I1201 08:38:23.104119 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e4e3584c-a3b8-4fef-99cf-fd56218b1299-trusted-ca-bundle\") pod \"console-7cc4f8f495-54q5l\" (UID: \"e4e3584c-a3b8-4fef-99cf-fd56218b1299\") " pod="openshift-console/console-7cc4f8f495-54q5l" Dec 01 08:38:23 crc kubenswrapper[5004]: I1201 08:38:23.104197 5004 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e4e3584c-a3b8-4fef-99cf-fd56218b1299-console-serving-cert\") pod \"console-7cc4f8f495-54q5l\" (UID: \"e4e3584c-a3b8-4fef-99cf-fd56218b1299\") " pod="openshift-console/console-7cc4f8f495-54q5l" Dec 01 08:38:23 crc kubenswrapper[5004]: I1201 08:38:23.104275 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g54kh\" (UniqueName: \"kubernetes.io/projected/e4e3584c-a3b8-4fef-99cf-fd56218b1299-kube-api-access-g54kh\") pod \"console-7cc4f8f495-54q5l\" (UID: \"e4e3584c-a3b8-4fef-99cf-fd56218b1299\") " pod="openshift-console/console-7cc4f8f495-54q5l" Dec 01 08:38:23 crc kubenswrapper[5004]: I1201 08:38:23.104367 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e4e3584c-a3b8-4fef-99cf-fd56218b1299-console-config\") pod \"console-7cc4f8f495-54q5l\" (UID: \"e4e3584c-a3b8-4fef-99cf-fd56218b1299\") " pod="openshift-console/console-7cc4f8f495-54q5l" Dec 01 08:38:23 crc kubenswrapper[5004]: I1201 08:38:23.104450 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e4e3584c-a3b8-4fef-99cf-fd56218b1299-service-ca\") pod \"console-7cc4f8f495-54q5l\" (UID: \"e4e3584c-a3b8-4fef-99cf-fd56218b1299\") " pod="openshift-console/console-7cc4f8f495-54q5l" Dec 01 08:38:23 crc kubenswrapper[5004]: I1201 08:38:23.187015 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 01 08:38:23 crc kubenswrapper[5004]: I1201 08:38:23.189337 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 01 08:38:23 crc kubenswrapper[5004]: I1201 08:38:23.195653 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Dec 01 08:38:23 crc kubenswrapper[5004]: I1201 08:38:23.195704 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Dec 01 08:38:23 crc kubenswrapper[5004]: I1201 08:38:23.195979 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Dec 01 08:38:23 crc kubenswrapper[5004]: I1201 08:38:23.196202 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Dec 01 08:38:23 crc kubenswrapper[5004]: I1201 08:38:23.205883 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e4e3584c-a3b8-4fef-99cf-fd56218b1299-console-oauth-config\") pod \"console-7cc4f8f495-54q5l\" (UID: \"e4e3584c-a3b8-4fef-99cf-fd56218b1299\") " pod="openshift-console/console-7cc4f8f495-54q5l" Dec 01 08:38:23 crc kubenswrapper[5004]: I1201 08:38:23.205951 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e4e3584c-a3b8-4fef-99cf-fd56218b1299-oauth-serving-cert\") pod \"console-7cc4f8f495-54q5l\" (UID: \"e4e3584c-a3b8-4fef-99cf-fd56218b1299\") " pod="openshift-console/console-7cc4f8f495-54q5l" Dec 01 08:38:23 crc kubenswrapper[5004]: I1201 08:38:23.206022 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e4e3584c-a3b8-4fef-99cf-fd56218b1299-trusted-ca-bundle\") pod \"console-7cc4f8f495-54q5l\" (UID: \"e4e3584c-a3b8-4fef-99cf-fd56218b1299\") " pod="openshift-console/console-7cc4f8f495-54q5l" 
Dec 01 08:38:23 crc kubenswrapper[5004]: I1201 08:38:23.206042 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e4e3584c-a3b8-4fef-99cf-fd56218b1299-console-serving-cert\") pod \"console-7cc4f8f495-54q5l\" (UID: \"e4e3584c-a3b8-4fef-99cf-fd56218b1299\") " pod="openshift-console/console-7cc4f8f495-54q5l" Dec 01 08:38:23 crc kubenswrapper[5004]: I1201 08:38:23.206059 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g54kh\" (UniqueName: \"kubernetes.io/projected/e4e3584c-a3b8-4fef-99cf-fd56218b1299-kube-api-access-g54kh\") pod \"console-7cc4f8f495-54q5l\" (UID: \"e4e3584c-a3b8-4fef-99cf-fd56218b1299\") " pod="openshift-console/console-7cc4f8f495-54q5l" Dec 01 08:38:23 crc kubenswrapper[5004]: I1201 08:38:23.206093 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e4e3584c-a3b8-4fef-99cf-fd56218b1299-console-config\") pod \"console-7cc4f8f495-54q5l\" (UID: \"e4e3584c-a3b8-4fef-99cf-fd56218b1299\") " pod="openshift-console/console-7cc4f8f495-54q5l" Dec 01 08:38:23 crc kubenswrapper[5004]: I1201 08:38:23.206111 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e4e3584c-a3b8-4fef-99cf-fd56218b1299-service-ca\") pod \"console-7cc4f8f495-54q5l\" (UID: \"e4e3584c-a3b8-4fef-99cf-fd56218b1299\") " pod="openshift-console/console-7cc4f8f495-54q5l" Dec 01 08:38:23 crc kubenswrapper[5004]: I1201 08:38:23.206983 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e4e3584c-a3b8-4fef-99cf-fd56218b1299-service-ca\") pod \"console-7cc4f8f495-54q5l\" (UID: \"e4e3584c-a3b8-4fef-99cf-fd56218b1299\") " pod="openshift-console/console-7cc4f8f495-54q5l" Dec 01 08:38:23 crc kubenswrapper[5004]: I1201 
08:38:23.207248 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e4e3584c-a3b8-4fef-99cf-fd56218b1299-trusted-ca-bundle\") pod \"console-7cc4f8f495-54q5l\" (UID: \"e4e3584c-a3b8-4fef-99cf-fd56218b1299\") " pod="openshift-console/console-7cc4f8f495-54q5l" Dec 01 08:38:23 crc kubenswrapper[5004]: I1201 08:38:23.207478 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Dec 01 08:38:23 crc kubenswrapper[5004]: I1201 08:38:23.207592 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e4e3584c-a3b8-4fef-99cf-fd56218b1299-console-config\") pod \"console-7cc4f8f495-54q5l\" (UID: \"e4e3584c-a3b8-4fef-99cf-fd56218b1299\") " pod="openshift-console/console-7cc4f8f495-54q5l" Dec 01 08:38:23 crc kubenswrapper[5004]: I1201 08:38:23.208102 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e4e3584c-a3b8-4fef-99cf-fd56218b1299-oauth-serving-cert\") pod \"console-7cc4f8f495-54q5l\" (UID: \"e4e3584c-a3b8-4fef-99cf-fd56218b1299\") " pod="openshift-console/console-7cc4f8f495-54q5l" Dec 01 08:38:23 crc kubenswrapper[5004]: I1201 08:38:23.223289 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e4e3584c-a3b8-4fef-99cf-fd56218b1299-console-serving-cert\") pod \"console-7cc4f8f495-54q5l\" (UID: \"e4e3584c-a3b8-4fef-99cf-fd56218b1299\") " pod="openshift-console/console-7cc4f8f495-54q5l" Dec 01 08:38:23 crc kubenswrapper[5004]: I1201 08:38:23.231783 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-5htt9" Dec 01 08:38:23 crc kubenswrapper[5004]: I1201 08:38:23.264224 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e4e3584c-a3b8-4fef-99cf-fd56218b1299-console-oauth-config\") pod \"console-7cc4f8f495-54q5l\" (UID: \"e4e3584c-a3b8-4fef-99cf-fd56218b1299\") " pod="openshift-console/console-7cc4f8f495-54q5l" Dec 01 08:38:23 crc kubenswrapper[5004]: I1201 08:38:23.267519 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 01 08:38:23 crc kubenswrapper[5004]: I1201 08:38:23.268197 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-z6dvc" Dec 01 08:38:23 crc kubenswrapper[5004]: I1201 08:38:23.291501 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g54kh\" (UniqueName: \"kubernetes.io/projected/e4e3584c-a3b8-4fef-99cf-fd56218b1299-kube-api-access-g54kh\") pod \"console-7cc4f8f495-54q5l\" (UID: \"e4e3584c-a3b8-4fef-99cf-fd56218b1299\") " pod="openshift-console/console-7cc4f8f495-54q5l" Dec 01 08:38:23 crc kubenswrapper[5004]: I1201 08:38:23.309429 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c74b987d-43c4-49d9-92fc-f13f3b7b7dd2-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"c74b987d-43c4-49d9-92fc-f13f3b7b7dd2\") " pod="openstack/prometheus-metric-storage-0" Dec 01 08:38:23 crc kubenswrapper[5004]: I1201 08:38:23.309477 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxgx2\" (UniqueName: \"kubernetes.io/projected/c74b987d-43c4-49d9-92fc-f13f3b7b7dd2-kube-api-access-kxgx2\") pod \"prometheus-metric-storage-0\" (UID: 
\"c74b987d-43c4-49d9-92fc-f13f3b7b7dd2\") " pod="openstack/prometheus-metric-storage-0" Dec 01 08:38:23 crc kubenswrapper[5004]: I1201 08:38:23.309609 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"prometheus-metric-storage-0\" (UID: \"c74b987d-43c4-49d9-92fc-f13f3b7b7dd2\") " pod="openstack/prometheus-metric-storage-0" Dec 01 08:38:23 crc kubenswrapper[5004]: I1201 08:38:23.309697 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c74b987d-43c4-49d9-92fc-f13f3b7b7dd2-config\") pod \"prometheus-metric-storage-0\" (UID: \"c74b987d-43c4-49d9-92fc-f13f3b7b7dd2\") " pod="openstack/prometheus-metric-storage-0" Dec 01 08:38:23 crc kubenswrapper[5004]: I1201 08:38:23.309745 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c74b987d-43c4-49d9-92fc-f13f3b7b7dd2-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"c74b987d-43c4-49d9-92fc-f13f3b7b7dd2\") " pod="openstack/prometheus-metric-storage-0" Dec 01 08:38:23 crc kubenswrapper[5004]: I1201 08:38:23.309790 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c74b987d-43c4-49d9-92fc-f13f3b7b7dd2-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"c74b987d-43c4-49d9-92fc-f13f3b7b7dd2\") " pod="openstack/prometheus-metric-storage-0" Dec 01 08:38:23 crc kubenswrapper[5004]: I1201 08:38:23.309836 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c74b987d-43c4-49d9-92fc-f13f3b7b7dd2-web-config\") pod \"prometheus-metric-storage-0\" 
(UID: \"c74b987d-43c4-49d9-92fc-f13f3b7b7dd2\") " pod="openstack/prometheus-metric-storage-0" Dec 01 08:38:23 crc kubenswrapper[5004]: I1201 08:38:23.309884 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c74b987d-43c4-49d9-92fc-f13f3b7b7dd2-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"c74b987d-43c4-49d9-92fc-f13f3b7b7dd2\") " pod="openstack/prometheus-metric-storage-0" Dec 01 08:38:23 crc kubenswrapper[5004]: I1201 08:38:23.411257 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c74b987d-43c4-49d9-92fc-f13f3b7b7dd2-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"c74b987d-43c4-49d9-92fc-f13f3b7b7dd2\") " pod="openstack/prometheus-metric-storage-0" Dec 01 08:38:23 crc kubenswrapper[5004]: I1201 08:38:23.411328 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c74b987d-43c4-49d9-92fc-f13f3b7b7dd2-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"c74b987d-43c4-49d9-92fc-f13f3b7b7dd2\") " pod="openstack/prometheus-metric-storage-0" Dec 01 08:38:23 crc kubenswrapper[5004]: I1201 08:38:23.411358 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c74b987d-43c4-49d9-92fc-f13f3b7b7dd2-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"c74b987d-43c4-49d9-92fc-f13f3b7b7dd2\") " pod="openstack/prometheus-metric-storage-0" Dec 01 08:38:23 crc kubenswrapper[5004]: I1201 08:38:23.411439 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c74b987d-43c4-49d9-92fc-f13f3b7b7dd2-config-out\") pod 
\"prometheus-metric-storage-0\" (UID: \"c74b987d-43c4-49d9-92fc-f13f3b7b7dd2\") " pod="openstack/prometheus-metric-storage-0" Dec 01 08:38:23 crc kubenswrapper[5004]: I1201 08:38:23.411466 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxgx2\" (UniqueName: \"kubernetes.io/projected/c74b987d-43c4-49d9-92fc-f13f3b7b7dd2-kube-api-access-kxgx2\") pod \"prometheus-metric-storage-0\" (UID: \"c74b987d-43c4-49d9-92fc-f13f3b7b7dd2\") " pod="openstack/prometheus-metric-storage-0" Dec 01 08:38:23 crc kubenswrapper[5004]: I1201 08:38:23.411509 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"prometheus-metric-storage-0\" (UID: \"c74b987d-43c4-49d9-92fc-f13f3b7b7dd2\") " pod="openstack/prometheus-metric-storage-0" Dec 01 08:38:23 crc kubenswrapper[5004]: I1201 08:38:23.411540 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c74b987d-43c4-49d9-92fc-f13f3b7b7dd2-config\") pod \"prometheus-metric-storage-0\" (UID: \"c74b987d-43c4-49d9-92fc-f13f3b7b7dd2\") " pod="openstack/prometheus-metric-storage-0" Dec 01 08:38:23 crc kubenswrapper[5004]: I1201 08:38:23.411576 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c74b987d-43c4-49d9-92fc-f13f3b7b7dd2-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"c74b987d-43c4-49d9-92fc-f13f3b7b7dd2\") " pod="openstack/prometheus-metric-storage-0" Dec 01 08:38:23 crc kubenswrapper[5004]: I1201 08:38:23.411927 5004 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"prometheus-metric-storage-0\" (UID: \"c74b987d-43c4-49d9-92fc-f13f3b7b7dd2\") 
device mount path \"/mnt/openstack/pv07\"" pod="openstack/prometheus-metric-storage-0" Dec 01 08:38:23 crc kubenswrapper[5004]: I1201 08:38:23.412506 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c74b987d-43c4-49d9-92fc-f13f3b7b7dd2-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"c74b987d-43c4-49d9-92fc-f13f3b7b7dd2\") " pod="openstack/prometheus-metric-storage-0" Dec 01 08:38:23 crc kubenswrapper[5004]: I1201 08:38:23.415441 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c74b987d-43c4-49d9-92fc-f13f3b7b7dd2-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"c74b987d-43c4-49d9-92fc-f13f3b7b7dd2\") " pod="openstack/prometheus-metric-storage-0" Dec 01 08:38:23 crc kubenswrapper[5004]: I1201 08:38:23.415835 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c74b987d-43c4-49d9-92fc-f13f3b7b7dd2-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"c74b987d-43c4-49d9-92fc-f13f3b7b7dd2\") " pod="openstack/prometheus-metric-storage-0" Dec 01 08:38:23 crc kubenswrapper[5004]: I1201 08:38:23.416272 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/c74b987d-43c4-49d9-92fc-f13f3b7b7dd2-config\") pod \"prometheus-metric-storage-0\" (UID: \"c74b987d-43c4-49d9-92fc-f13f3b7b7dd2\") " pod="openstack/prometheus-metric-storage-0" Dec 01 08:38:23 crc kubenswrapper[5004]: I1201 08:38:23.428169 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c74b987d-43c4-49d9-92fc-f13f3b7b7dd2-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"c74b987d-43c4-49d9-92fc-f13f3b7b7dd2\") " pod="openstack/prometheus-metric-storage-0" Dec 01 08:38:23 
crc kubenswrapper[5004]: I1201 08:38:23.428235 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c74b987d-43c4-49d9-92fc-f13f3b7b7dd2-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"c74b987d-43c4-49d9-92fc-f13f3b7b7dd2\") " pod="openstack/prometheus-metric-storage-0" Dec 01 08:38:23 crc kubenswrapper[5004]: I1201 08:38:23.430247 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxgx2\" (UniqueName: \"kubernetes.io/projected/c74b987d-43c4-49d9-92fc-f13f3b7b7dd2-kube-api-access-kxgx2\") pod \"prometheus-metric-storage-0\" (UID: \"c74b987d-43c4-49d9-92fc-f13f3b7b7dd2\") " pod="openstack/prometheus-metric-storage-0" Dec 01 08:38:23 crc kubenswrapper[5004]: I1201 08:38:23.437716 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"prometheus-metric-storage-0\" (UID: \"c74b987d-43c4-49d9-92fc-f13f3b7b7dd2\") " pod="openstack/prometheus-metric-storage-0" Dec 01 08:38:23 crc kubenswrapper[5004]: I1201 08:38:23.516144 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 01 08:38:23 crc kubenswrapper[5004]: I1201 08:38:23.588208 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7cc4f8f495-54q5l" Dec 01 08:38:25 crc kubenswrapper[5004]: I1201 08:38:25.124152 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 01 08:38:25 crc kubenswrapper[5004]: I1201 08:38:25.129838 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 01 08:38:25 crc kubenswrapper[5004]: I1201 08:38:25.133393 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Dec 01 08:38:25 crc kubenswrapper[5004]: I1201 08:38:25.133908 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Dec 01 08:38:25 crc kubenswrapper[5004]: I1201 08:38:25.134235 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Dec 01 08:38:25 crc kubenswrapper[5004]: I1201 08:38:25.134445 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-pz2p6" Dec 01 08:38:25 crc kubenswrapper[5004]: I1201 08:38:25.134632 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Dec 01 08:38:25 crc kubenswrapper[5004]: I1201 08:38:25.152978 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 01 08:38:25 crc kubenswrapper[5004]: I1201 08:38:25.257964 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c5ea92c9-9b0a-473a-872f-a78f27946432-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"c5ea92c9-9b0a-473a-872f-a78f27946432\") " pod="openstack/ovsdbserver-nb-0" Dec 01 08:38:25 crc kubenswrapper[5004]: I1201 08:38:25.257998 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c5ea92c9-9b0a-473a-872f-a78f27946432-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"c5ea92c9-9b0a-473a-872f-a78f27946432\") " pod="openstack/ovsdbserver-nb-0" Dec 01 08:38:25 crc kubenswrapper[5004]: I1201 08:38:25.258029 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5ea92c9-9b0a-473a-872f-a78f27946432-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"c5ea92c9-9b0a-473a-872f-a78f27946432\") " pod="openstack/ovsdbserver-nb-0" Dec 01 08:38:25 crc kubenswrapper[5004]: I1201 08:38:25.258049 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5ea92c9-9b0a-473a-872f-a78f27946432-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"c5ea92c9-9b0a-473a-872f-a78f27946432\") " pod="openstack/ovsdbserver-nb-0" Dec 01 08:38:25 crc kubenswrapper[5004]: I1201 08:38:25.258068 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fw7d4\" (UniqueName: \"kubernetes.io/projected/c5ea92c9-9b0a-473a-872f-a78f27946432-kube-api-access-fw7d4\") pod \"ovsdbserver-nb-0\" (UID: \"c5ea92c9-9b0a-473a-872f-a78f27946432\") " pod="openstack/ovsdbserver-nb-0" Dec 01 08:38:25 crc kubenswrapper[5004]: I1201 08:38:25.258338 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5ea92c9-9b0a-473a-872f-a78f27946432-config\") pod \"ovsdbserver-nb-0\" (UID: \"c5ea92c9-9b0a-473a-872f-a78f27946432\") " pod="openstack/ovsdbserver-nb-0" Dec 01 08:38:25 crc kubenswrapper[5004]: I1201 08:38:25.258675 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-nb-0\" (UID: \"c5ea92c9-9b0a-473a-872f-a78f27946432\") " pod="openstack/ovsdbserver-nb-0" Dec 01 08:38:25 crc kubenswrapper[5004]: I1201 08:38:25.258708 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/c5ea92c9-9b0a-473a-872f-a78f27946432-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"c5ea92c9-9b0a-473a-872f-a78f27946432\") " pod="openstack/ovsdbserver-nb-0" Dec 01 08:38:25 crc kubenswrapper[5004]: I1201 08:38:25.360177 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fw7d4\" (UniqueName: \"kubernetes.io/projected/c5ea92c9-9b0a-473a-872f-a78f27946432-kube-api-access-fw7d4\") pod \"ovsdbserver-nb-0\" (UID: \"c5ea92c9-9b0a-473a-872f-a78f27946432\") " pod="openstack/ovsdbserver-nb-0" Dec 01 08:38:25 crc kubenswrapper[5004]: I1201 08:38:25.360424 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5ea92c9-9b0a-473a-872f-a78f27946432-config\") pod \"ovsdbserver-nb-0\" (UID: \"c5ea92c9-9b0a-473a-872f-a78f27946432\") " pod="openstack/ovsdbserver-nb-0" Dec 01 08:38:25 crc kubenswrapper[5004]: I1201 08:38:25.360493 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-nb-0\" (UID: \"c5ea92c9-9b0a-473a-872f-a78f27946432\") " pod="openstack/ovsdbserver-nb-0" Dec 01 08:38:25 crc kubenswrapper[5004]: I1201 08:38:25.360510 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5ea92c9-9b0a-473a-872f-a78f27946432-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"c5ea92c9-9b0a-473a-872f-a78f27946432\") " pod="openstack/ovsdbserver-nb-0" Dec 01 08:38:25 crc kubenswrapper[5004]: I1201 08:38:25.360592 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c5ea92c9-9b0a-473a-872f-a78f27946432-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"c5ea92c9-9b0a-473a-872f-a78f27946432\") " pod="openstack/ovsdbserver-nb-0" 
Dec 01 08:38:25 crc kubenswrapper[5004]: I1201 08:38:25.360610 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c5ea92c9-9b0a-473a-872f-a78f27946432-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"c5ea92c9-9b0a-473a-872f-a78f27946432\") " pod="openstack/ovsdbserver-nb-0" Dec 01 08:38:25 crc kubenswrapper[5004]: I1201 08:38:25.360637 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5ea92c9-9b0a-473a-872f-a78f27946432-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"c5ea92c9-9b0a-473a-872f-a78f27946432\") " pod="openstack/ovsdbserver-nb-0" Dec 01 08:38:25 crc kubenswrapper[5004]: I1201 08:38:25.360658 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5ea92c9-9b0a-473a-872f-a78f27946432-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"c5ea92c9-9b0a-473a-872f-a78f27946432\") " pod="openstack/ovsdbserver-nb-0" Dec 01 08:38:25 crc kubenswrapper[5004]: I1201 08:38:25.361133 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c5ea92c9-9b0a-473a-872f-a78f27946432-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"c5ea92c9-9b0a-473a-872f-a78f27946432\") " pod="openstack/ovsdbserver-nb-0" Dec 01 08:38:25 crc kubenswrapper[5004]: I1201 08:38:25.361240 5004 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-nb-0\" (UID: \"c5ea92c9-9b0a-473a-872f-a78f27946432\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/ovsdbserver-nb-0" Dec 01 08:38:25 crc kubenswrapper[5004]: I1201 08:38:25.361478 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/c5ea92c9-9b0a-473a-872f-a78f27946432-config\") pod \"ovsdbserver-nb-0\" (UID: \"c5ea92c9-9b0a-473a-872f-a78f27946432\") " pod="openstack/ovsdbserver-nb-0" Dec 01 08:38:25 crc kubenswrapper[5004]: I1201 08:38:25.361915 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c5ea92c9-9b0a-473a-872f-a78f27946432-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"c5ea92c9-9b0a-473a-872f-a78f27946432\") " pod="openstack/ovsdbserver-nb-0" Dec 01 08:38:25 crc kubenswrapper[5004]: I1201 08:38:25.366593 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5ea92c9-9b0a-473a-872f-a78f27946432-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"c5ea92c9-9b0a-473a-872f-a78f27946432\") " pod="openstack/ovsdbserver-nb-0" Dec 01 08:38:25 crc kubenswrapper[5004]: I1201 08:38:25.366673 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5ea92c9-9b0a-473a-872f-a78f27946432-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"c5ea92c9-9b0a-473a-872f-a78f27946432\") " pod="openstack/ovsdbserver-nb-0" Dec 01 08:38:25 crc kubenswrapper[5004]: I1201 08:38:25.373863 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5ea92c9-9b0a-473a-872f-a78f27946432-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"c5ea92c9-9b0a-473a-872f-a78f27946432\") " pod="openstack/ovsdbserver-nb-0" Dec 01 08:38:25 crc kubenswrapper[5004]: I1201 08:38:25.377385 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fw7d4\" (UniqueName: \"kubernetes.io/projected/c5ea92c9-9b0a-473a-872f-a78f27946432-kube-api-access-fw7d4\") pod \"ovsdbserver-nb-0\" (UID: \"c5ea92c9-9b0a-473a-872f-a78f27946432\") " 
pod="openstack/ovsdbserver-nb-0" Dec 01 08:38:25 crc kubenswrapper[5004]: I1201 08:38:25.387747 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-nb-0\" (UID: \"c5ea92c9-9b0a-473a-872f-a78f27946432\") " pod="openstack/ovsdbserver-nb-0" Dec 01 08:38:25 crc kubenswrapper[5004]: I1201 08:38:25.466551 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 01 08:38:26 crc kubenswrapper[5004]: I1201 08:38:26.068329 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-r8x2b"] Dec 01 08:38:26 crc kubenswrapper[5004]: I1201 08:38:26.069691 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-r8x2b" Dec 01 08:38:26 crc kubenswrapper[5004]: I1201 08:38:26.071975 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-kqn5z" Dec 01 08:38:26 crc kubenswrapper[5004]: I1201 08:38:26.072187 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Dec 01 08:38:26 crc kubenswrapper[5004]: I1201 08:38:26.075380 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Dec 01 08:38:26 crc kubenswrapper[5004]: I1201 08:38:26.088594 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-r8x2b"] Dec 01 08:38:26 crc kubenswrapper[5004]: I1201 08:38:26.127088 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-8qrgn"] Dec 01 08:38:26 crc kubenswrapper[5004]: I1201 08:38:26.129247 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-8qrgn" Dec 01 08:38:26 crc kubenswrapper[5004]: I1201 08:38:26.142861 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-8qrgn"] Dec 01 08:38:26 crc kubenswrapper[5004]: I1201 08:38:26.180731 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/effd853b-0b95-4749-8119-88fcfaf8b0c0-var-run-ovn\") pod \"ovn-controller-r8x2b\" (UID: \"effd853b-0b95-4749-8119-88fcfaf8b0c0\") " pod="openstack/ovn-controller-r8x2b" Dec 01 08:38:26 crc kubenswrapper[5004]: I1201 08:38:26.180806 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/0b64b7d6-2fe8-43b0-9632-84e70a749fe9-etc-ovs\") pod \"ovn-controller-ovs-8qrgn\" (UID: \"0b64b7d6-2fe8-43b0-9632-84e70a749fe9\") " pod="openstack/ovn-controller-ovs-8qrgn" Dec 01 08:38:26 crc kubenswrapper[5004]: I1201 08:38:26.180828 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/effd853b-0b95-4749-8119-88fcfaf8b0c0-var-run\") pod \"ovn-controller-r8x2b\" (UID: \"effd853b-0b95-4749-8119-88fcfaf8b0c0\") " pod="openstack/ovn-controller-r8x2b" Dec 01 08:38:26 crc kubenswrapper[5004]: I1201 08:38:26.180849 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/effd853b-0b95-4749-8119-88fcfaf8b0c0-scripts\") pod \"ovn-controller-r8x2b\" (UID: \"effd853b-0b95-4749-8119-88fcfaf8b0c0\") " pod="openstack/ovn-controller-r8x2b" Dec 01 08:38:26 crc kubenswrapper[5004]: I1201 08:38:26.180865 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: 
\"kubernetes.io/host-path/0b64b7d6-2fe8-43b0-9632-84e70a749fe9-var-log\") pod \"ovn-controller-ovs-8qrgn\" (UID: \"0b64b7d6-2fe8-43b0-9632-84e70a749fe9\") " pod="openstack/ovn-controller-ovs-8qrgn" Dec 01 08:38:26 crc kubenswrapper[5004]: I1201 08:38:26.181080 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0b64b7d6-2fe8-43b0-9632-84e70a749fe9-scripts\") pod \"ovn-controller-ovs-8qrgn\" (UID: \"0b64b7d6-2fe8-43b0-9632-84e70a749fe9\") " pod="openstack/ovn-controller-ovs-8qrgn" Dec 01 08:38:26 crc kubenswrapper[5004]: I1201 08:38:26.181136 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swlt6\" (UniqueName: \"kubernetes.io/projected/0b64b7d6-2fe8-43b0-9632-84e70a749fe9-kube-api-access-swlt6\") pod \"ovn-controller-ovs-8qrgn\" (UID: \"0b64b7d6-2fe8-43b0-9632-84e70a749fe9\") " pod="openstack/ovn-controller-ovs-8qrgn" Dec 01 08:38:26 crc kubenswrapper[5004]: I1201 08:38:26.181208 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/effd853b-0b95-4749-8119-88fcfaf8b0c0-ovn-controller-tls-certs\") pod \"ovn-controller-r8x2b\" (UID: \"effd853b-0b95-4749-8119-88fcfaf8b0c0\") " pod="openstack/ovn-controller-r8x2b" Dec 01 08:38:26 crc kubenswrapper[5004]: I1201 08:38:26.181239 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/0b64b7d6-2fe8-43b0-9632-84e70a749fe9-var-run\") pod \"ovn-controller-ovs-8qrgn\" (UID: \"0b64b7d6-2fe8-43b0-9632-84e70a749fe9\") " pod="openstack/ovn-controller-ovs-8qrgn" Dec 01 08:38:26 crc kubenswrapper[5004]: I1201 08:38:26.181276 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jz4pw\" (UniqueName: 
\"kubernetes.io/projected/effd853b-0b95-4749-8119-88fcfaf8b0c0-kube-api-access-jz4pw\") pod \"ovn-controller-r8x2b\" (UID: \"effd853b-0b95-4749-8119-88fcfaf8b0c0\") " pod="openstack/ovn-controller-r8x2b" Dec 01 08:38:26 crc kubenswrapper[5004]: I1201 08:38:26.181302 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/effd853b-0b95-4749-8119-88fcfaf8b0c0-combined-ca-bundle\") pod \"ovn-controller-r8x2b\" (UID: \"effd853b-0b95-4749-8119-88fcfaf8b0c0\") " pod="openstack/ovn-controller-r8x2b" Dec 01 08:38:26 crc kubenswrapper[5004]: I1201 08:38:26.181394 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/effd853b-0b95-4749-8119-88fcfaf8b0c0-var-log-ovn\") pod \"ovn-controller-r8x2b\" (UID: \"effd853b-0b95-4749-8119-88fcfaf8b0c0\") " pod="openstack/ovn-controller-r8x2b" Dec 01 08:38:26 crc kubenswrapper[5004]: I1201 08:38:26.181431 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/0b64b7d6-2fe8-43b0-9632-84e70a749fe9-var-lib\") pod \"ovn-controller-ovs-8qrgn\" (UID: \"0b64b7d6-2fe8-43b0-9632-84e70a749fe9\") " pod="openstack/ovn-controller-ovs-8qrgn" Dec 01 08:38:26 crc kubenswrapper[5004]: I1201 08:38:26.283542 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/effd853b-0b95-4749-8119-88fcfaf8b0c0-combined-ca-bundle\") pod \"ovn-controller-r8x2b\" (UID: \"effd853b-0b95-4749-8119-88fcfaf8b0c0\") " pod="openstack/ovn-controller-r8x2b" Dec 01 08:38:26 crc kubenswrapper[5004]: I1201 08:38:26.283636 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: 
\"kubernetes.io/host-path/effd853b-0b95-4749-8119-88fcfaf8b0c0-var-log-ovn\") pod \"ovn-controller-r8x2b\" (UID: \"effd853b-0b95-4749-8119-88fcfaf8b0c0\") " pod="openstack/ovn-controller-r8x2b" Dec 01 08:38:26 crc kubenswrapper[5004]: I1201 08:38:26.283661 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/0b64b7d6-2fe8-43b0-9632-84e70a749fe9-var-lib\") pod \"ovn-controller-ovs-8qrgn\" (UID: \"0b64b7d6-2fe8-43b0-9632-84e70a749fe9\") " pod="openstack/ovn-controller-ovs-8qrgn" Dec 01 08:38:26 crc kubenswrapper[5004]: I1201 08:38:26.284057 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/effd853b-0b95-4749-8119-88fcfaf8b0c0-var-log-ovn\") pod \"ovn-controller-r8x2b\" (UID: \"effd853b-0b95-4749-8119-88fcfaf8b0c0\") " pod="openstack/ovn-controller-r8x2b" Dec 01 08:38:26 crc kubenswrapper[5004]: I1201 08:38:26.284106 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/effd853b-0b95-4749-8119-88fcfaf8b0c0-var-run-ovn\") pod \"ovn-controller-r8x2b\" (UID: \"effd853b-0b95-4749-8119-88fcfaf8b0c0\") " pod="openstack/ovn-controller-r8x2b" Dec 01 08:38:26 crc kubenswrapper[5004]: I1201 08:38:26.284149 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/0b64b7d6-2fe8-43b0-9632-84e70a749fe9-etc-ovs\") pod \"ovn-controller-ovs-8qrgn\" (UID: \"0b64b7d6-2fe8-43b0-9632-84e70a749fe9\") " pod="openstack/ovn-controller-ovs-8qrgn" Dec 01 08:38:26 crc kubenswrapper[5004]: I1201 08:38:26.284163 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/effd853b-0b95-4749-8119-88fcfaf8b0c0-var-run\") pod \"ovn-controller-r8x2b\" (UID: \"effd853b-0b95-4749-8119-88fcfaf8b0c0\") " 
pod="openstack/ovn-controller-r8x2b" Dec 01 08:38:26 crc kubenswrapper[5004]: I1201 08:38:26.284178 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/effd853b-0b95-4749-8119-88fcfaf8b0c0-scripts\") pod \"ovn-controller-r8x2b\" (UID: \"effd853b-0b95-4749-8119-88fcfaf8b0c0\") " pod="openstack/ovn-controller-r8x2b" Dec 01 08:38:26 crc kubenswrapper[5004]: I1201 08:38:26.284194 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/0b64b7d6-2fe8-43b0-9632-84e70a749fe9-var-log\") pod \"ovn-controller-ovs-8qrgn\" (UID: \"0b64b7d6-2fe8-43b0-9632-84e70a749fe9\") " pod="openstack/ovn-controller-ovs-8qrgn" Dec 01 08:38:26 crc kubenswrapper[5004]: I1201 08:38:26.284200 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/0b64b7d6-2fe8-43b0-9632-84e70a749fe9-var-lib\") pod \"ovn-controller-ovs-8qrgn\" (UID: \"0b64b7d6-2fe8-43b0-9632-84e70a749fe9\") " pod="openstack/ovn-controller-ovs-8qrgn" Dec 01 08:38:26 crc kubenswrapper[5004]: I1201 08:38:26.284310 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/0b64b7d6-2fe8-43b0-9632-84e70a749fe9-var-log\") pod \"ovn-controller-ovs-8qrgn\" (UID: \"0b64b7d6-2fe8-43b0-9632-84e70a749fe9\") " pod="openstack/ovn-controller-ovs-8qrgn" Dec 01 08:38:26 crc kubenswrapper[5004]: I1201 08:38:26.284310 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0b64b7d6-2fe8-43b0-9632-84e70a749fe9-scripts\") pod \"ovn-controller-ovs-8qrgn\" (UID: \"0b64b7d6-2fe8-43b0-9632-84e70a749fe9\") " pod="openstack/ovn-controller-ovs-8qrgn" Dec 01 08:38:26 crc kubenswrapper[5004]: I1201 08:38:26.284349 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-swlt6\" (UniqueName: \"kubernetes.io/projected/0b64b7d6-2fe8-43b0-9632-84e70a749fe9-kube-api-access-swlt6\") pod \"ovn-controller-ovs-8qrgn\" (UID: \"0b64b7d6-2fe8-43b0-9632-84e70a749fe9\") " pod="openstack/ovn-controller-ovs-8qrgn" Dec 01 08:38:26 crc kubenswrapper[5004]: I1201 08:38:26.284379 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/effd853b-0b95-4749-8119-88fcfaf8b0c0-ovn-controller-tls-certs\") pod \"ovn-controller-r8x2b\" (UID: \"effd853b-0b95-4749-8119-88fcfaf8b0c0\") " pod="openstack/ovn-controller-r8x2b" Dec 01 08:38:26 crc kubenswrapper[5004]: I1201 08:38:26.284397 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/0b64b7d6-2fe8-43b0-9632-84e70a749fe9-var-run\") pod \"ovn-controller-ovs-8qrgn\" (UID: \"0b64b7d6-2fe8-43b0-9632-84e70a749fe9\") " pod="openstack/ovn-controller-ovs-8qrgn" Dec 01 08:38:26 crc kubenswrapper[5004]: I1201 08:38:26.284419 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jz4pw\" (UniqueName: \"kubernetes.io/projected/effd853b-0b95-4749-8119-88fcfaf8b0c0-kube-api-access-jz4pw\") pod \"ovn-controller-r8x2b\" (UID: \"effd853b-0b95-4749-8119-88fcfaf8b0c0\") " pod="openstack/ovn-controller-r8x2b" Dec 01 08:38:26 crc kubenswrapper[5004]: I1201 08:38:26.284779 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/effd853b-0b95-4749-8119-88fcfaf8b0c0-var-run-ovn\") pod \"ovn-controller-r8x2b\" (UID: \"effd853b-0b95-4749-8119-88fcfaf8b0c0\") " pod="openstack/ovn-controller-r8x2b" Dec 01 08:38:26 crc kubenswrapper[5004]: I1201 08:38:26.284882 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/0b64b7d6-2fe8-43b0-9632-84e70a749fe9-etc-ovs\") pod 
\"ovn-controller-ovs-8qrgn\" (UID: \"0b64b7d6-2fe8-43b0-9632-84e70a749fe9\") " pod="openstack/ovn-controller-ovs-8qrgn" Dec 01 08:38:26 crc kubenswrapper[5004]: I1201 08:38:26.284949 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/effd853b-0b95-4749-8119-88fcfaf8b0c0-var-run\") pod \"ovn-controller-r8x2b\" (UID: \"effd853b-0b95-4749-8119-88fcfaf8b0c0\") " pod="openstack/ovn-controller-r8x2b" Dec 01 08:38:26 crc kubenswrapper[5004]: I1201 08:38:26.286291 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/0b64b7d6-2fe8-43b0-9632-84e70a749fe9-var-run\") pod \"ovn-controller-ovs-8qrgn\" (UID: \"0b64b7d6-2fe8-43b0-9632-84e70a749fe9\") " pod="openstack/ovn-controller-ovs-8qrgn" Dec 01 08:38:26 crc kubenswrapper[5004]: I1201 08:38:26.287426 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0b64b7d6-2fe8-43b0-9632-84e70a749fe9-scripts\") pod \"ovn-controller-ovs-8qrgn\" (UID: \"0b64b7d6-2fe8-43b0-9632-84e70a749fe9\") " pod="openstack/ovn-controller-ovs-8qrgn" Dec 01 08:38:26 crc kubenswrapper[5004]: I1201 08:38:26.287658 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/effd853b-0b95-4749-8119-88fcfaf8b0c0-scripts\") pod \"ovn-controller-r8x2b\" (UID: \"effd853b-0b95-4749-8119-88fcfaf8b0c0\") " pod="openstack/ovn-controller-r8x2b" Dec 01 08:38:26 crc kubenswrapper[5004]: I1201 08:38:26.289244 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/effd853b-0b95-4749-8119-88fcfaf8b0c0-ovn-controller-tls-certs\") pod \"ovn-controller-r8x2b\" (UID: \"effd853b-0b95-4749-8119-88fcfaf8b0c0\") " pod="openstack/ovn-controller-r8x2b" Dec 01 08:38:26 crc kubenswrapper[5004]: I1201 08:38:26.300969 5004 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/effd853b-0b95-4749-8119-88fcfaf8b0c0-combined-ca-bundle\") pod \"ovn-controller-r8x2b\" (UID: \"effd853b-0b95-4749-8119-88fcfaf8b0c0\") " pod="openstack/ovn-controller-r8x2b" Dec 01 08:38:26 crc kubenswrapper[5004]: I1201 08:38:26.301734 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jz4pw\" (UniqueName: \"kubernetes.io/projected/effd853b-0b95-4749-8119-88fcfaf8b0c0-kube-api-access-jz4pw\") pod \"ovn-controller-r8x2b\" (UID: \"effd853b-0b95-4749-8119-88fcfaf8b0c0\") " pod="openstack/ovn-controller-r8x2b" Dec 01 08:38:26 crc kubenswrapper[5004]: I1201 08:38:26.311524 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swlt6\" (UniqueName: \"kubernetes.io/projected/0b64b7d6-2fe8-43b0-9632-84e70a749fe9-kube-api-access-swlt6\") pod \"ovn-controller-ovs-8qrgn\" (UID: \"0b64b7d6-2fe8-43b0-9632-84e70a749fe9\") " pod="openstack/ovn-controller-ovs-8qrgn" Dec 01 08:38:26 crc kubenswrapper[5004]: I1201 08:38:26.398680 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-r8x2b" Dec 01 08:38:26 crc kubenswrapper[5004]: I1201 08:38:26.443845 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-8qrgn" Dec 01 08:38:29 crc kubenswrapper[5004]: I1201 08:38:29.639378 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 01 08:38:29 crc kubenswrapper[5004]: I1201 08:38:29.641860 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 01 08:38:29 crc kubenswrapper[5004]: I1201 08:38:29.644810 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Dec 01 08:38:29 crc kubenswrapper[5004]: I1201 08:38:29.644984 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Dec 01 08:38:29 crc kubenswrapper[5004]: I1201 08:38:29.645677 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-gwv7w" Dec 01 08:38:29 crc kubenswrapper[5004]: I1201 08:38:29.648539 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 01 08:38:29 crc kubenswrapper[5004]: I1201 08:38:29.655866 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Dec 01 08:38:29 crc kubenswrapper[5004]: I1201 08:38:29.758122 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6362259b-bac4-4df3-ad0c-d76511731aae-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"6362259b-bac4-4df3-ad0c-d76511731aae\") " pod="openstack/ovsdbserver-sb-0" Dec 01 08:38:29 crc kubenswrapper[5004]: I1201 08:38:29.758195 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6362259b-bac4-4df3-ad0c-d76511731aae-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"6362259b-bac4-4df3-ad0c-d76511731aae\") " pod="openstack/ovsdbserver-sb-0" Dec 01 08:38:29 crc kubenswrapper[5004]: I1201 08:38:29.758494 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6362259b-bac4-4df3-ad0c-d76511731aae-ovsdbserver-sb-tls-certs\") pod 
\"ovsdbserver-sb-0\" (UID: \"6362259b-bac4-4df3-ad0c-d76511731aae\") " pod="openstack/ovsdbserver-sb-0" Dec 01 08:38:29 crc kubenswrapper[5004]: I1201 08:38:29.758651 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6362259b-bac4-4df3-ad0c-d76511731aae-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"6362259b-bac4-4df3-ad0c-d76511731aae\") " pod="openstack/ovsdbserver-sb-0" Dec 01 08:38:29 crc kubenswrapper[5004]: I1201 08:38:29.758754 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6362259b-bac4-4df3-ad0c-d76511731aae-config\") pod \"ovsdbserver-sb-0\" (UID: \"6362259b-bac4-4df3-ad0c-d76511731aae\") " pod="openstack/ovsdbserver-sb-0" Dec 01 08:38:29 crc kubenswrapper[5004]: I1201 08:38:29.759271 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6362259b-bac4-4df3-ad0c-d76511731aae-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"6362259b-bac4-4df3-ad0c-d76511731aae\") " pod="openstack/ovsdbserver-sb-0" Dec 01 08:38:29 crc kubenswrapper[5004]: I1201 08:38:29.759325 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ldp6\" (UniqueName: \"kubernetes.io/projected/6362259b-bac4-4df3-ad0c-d76511731aae-kube-api-access-2ldp6\") pod \"ovsdbserver-sb-0\" (UID: \"6362259b-bac4-4df3-ad0c-d76511731aae\") " pod="openstack/ovsdbserver-sb-0" Dec 01 08:38:29 crc kubenswrapper[5004]: I1201 08:38:29.759960 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-sb-0\" (UID: \"6362259b-bac4-4df3-ad0c-d76511731aae\") " pod="openstack/ovsdbserver-sb-0" Dec 01 08:38:29 crc 
kubenswrapper[5004]: I1201 08:38:29.862707 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2ldp6\" (UniqueName: \"kubernetes.io/projected/6362259b-bac4-4df3-ad0c-d76511731aae-kube-api-access-2ldp6\") pod \"ovsdbserver-sb-0\" (UID: \"6362259b-bac4-4df3-ad0c-d76511731aae\") " pod="openstack/ovsdbserver-sb-0" Dec 01 08:38:29 crc kubenswrapper[5004]: I1201 08:38:29.862842 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-sb-0\" (UID: \"6362259b-bac4-4df3-ad0c-d76511731aae\") " pod="openstack/ovsdbserver-sb-0" Dec 01 08:38:29 crc kubenswrapper[5004]: I1201 08:38:29.862988 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6362259b-bac4-4df3-ad0c-d76511731aae-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"6362259b-bac4-4df3-ad0c-d76511731aae\") " pod="openstack/ovsdbserver-sb-0" Dec 01 08:38:29 crc kubenswrapper[5004]: I1201 08:38:29.863037 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6362259b-bac4-4df3-ad0c-d76511731aae-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"6362259b-bac4-4df3-ad0c-d76511731aae\") " pod="openstack/ovsdbserver-sb-0" Dec 01 08:38:29 crc kubenswrapper[5004]: I1201 08:38:29.863104 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6362259b-bac4-4df3-ad0c-d76511731aae-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"6362259b-bac4-4df3-ad0c-d76511731aae\") " pod="openstack/ovsdbserver-sb-0" Dec 01 08:38:29 crc kubenswrapper[5004]: I1201 08:38:29.863140 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/configmap/6362259b-bac4-4df3-ad0c-d76511731aae-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"6362259b-bac4-4df3-ad0c-d76511731aae\") " pod="openstack/ovsdbserver-sb-0" Dec 01 08:38:29 crc kubenswrapper[5004]: I1201 08:38:29.863187 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6362259b-bac4-4df3-ad0c-d76511731aae-config\") pod \"ovsdbserver-sb-0\" (UID: \"6362259b-bac4-4df3-ad0c-d76511731aae\") " pod="openstack/ovsdbserver-sb-0" Dec 01 08:38:29 crc kubenswrapper[5004]: I1201 08:38:29.863215 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6362259b-bac4-4df3-ad0c-d76511731aae-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"6362259b-bac4-4df3-ad0c-d76511731aae\") " pod="openstack/ovsdbserver-sb-0" Dec 01 08:38:29 crc kubenswrapper[5004]: I1201 08:38:29.863646 5004 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-sb-0\" (UID: \"6362259b-bac4-4df3-ad0c-d76511731aae\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/ovsdbserver-sb-0" Dec 01 08:38:29 crc kubenswrapper[5004]: I1201 08:38:29.863956 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6362259b-bac4-4df3-ad0c-d76511731aae-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"6362259b-bac4-4df3-ad0c-d76511731aae\") " pod="openstack/ovsdbserver-sb-0" Dec 01 08:38:29 crc kubenswrapper[5004]: I1201 08:38:29.865157 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6362259b-bac4-4df3-ad0c-d76511731aae-config\") pod \"ovsdbserver-sb-0\" (UID: \"6362259b-bac4-4df3-ad0c-d76511731aae\") " pod="openstack/ovsdbserver-sb-0" Dec 01 08:38:29 crc 
kubenswrapper[5004]: I1201 08:38:29.865714 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6362259b-bac4-4df3-ad0c-d76511731aae-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"6362259b-bac4-4df3-ad0c-d76511731aae\") " pod="openstack/ovsdbserver-sb-0" Dec 01 08:38:29 crc kubenswrapper[5004]: I1201 08:38:29.877546 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6362259b-bac4-4df3-ad0c-d76511731aae-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"6362259b-bac4-4df3-ad0c-d76511731aae\") " pod="openstack/ovsdbserver-sb-0" Dec 01 08:38:29 crc kubenswrapper[5004]: I1201 08:38:29.878077 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6362259b-bac4-4df3-ad0c-d76511731aae-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"6362259b-bac4-4df3-ad0c-d76511731aae\") " pod="openstack/ovsdbserver-sb-0" Dec 01 08:38:29 crc kubenswrapper[5004]: I1201 08:38:29.897071 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6362259b-bac4-4df3-ad0c-d76511731aae-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"6362259b-bac4-4df3-ad0c-d76511731aae\") " pod="openstack/ovsdbserver-sb-0" Dec 01 08:38:29 crc kubenswrapper[5004]: I1201 08:38:29.905726 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ldp6\" (UniqueName: \"kubernetes.io/projected/6362259b-bac4-4df3-ad0c-d76511731aae-kube-api-access-2ldp6\") pod \"ovsdbserver-sb-0\" (UID: \"6362259b-bac4-4df3-ad0c-d76511731aae\") " pod="openstack/ovsdbserver-sb-0" Dec 01 08:38:29 crc kubenswrapper[5004]: I1201 08:38:29.933541 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-sb-0\" (UID: \"6362259b-bac4-4df3-ad0c-d76511731aae\") " pod="openstack/ovsdbserver-sb-0" Dec 01 08:38:29 crc kubenswrapper[5004]: I1201 08:38:29.969687 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 01 08:38:30 crc kubenswrapper[5004]: I1201 08:38:30.493197 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 01 08:38:30 crc kubenswrapper[5004]: I1201 08:38:30.610704 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 01 08:38:30 crc kubenswrapper[5004]: E1201 08:38:30.993553 5004 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 01 08:38:30 crc kubenswrapper[5004]: E1201 08:38:30.994090 5004 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jlsss,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-pqhzv_openstack(ad2a2834-a832-452e-bd49-73656a9cde6a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 08:38:30 crc kubenswrapper[5004]: E1201 08:38:30.995350 5004 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-pqhzv" podUID="ad2a2834-a832-452e-bd49-73656a9cde6a" Dec 01 08:38:30 crc kubenswrapper[5004]: W1201 08:38:30.998063 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbb30b7a7_42e1_421b_8673_7f3c8f5cfae3.slice/crio-fe10a10502b3c352c6129078fd1b5848292fcffb9a770017770220ba8daa3e8e WatchSource:0}: Error finding container fe10a10502b3c352c6129078fd1b5848292fcffb9a770017770220ba8daa3e8e: Status 404 returned error can't find the container with id fe10a10502b3c352c6129078fd1b5848292fcffb9a770017770220ba8daa3e8e Dec 01 08:38:31 crc kubenswrapper[5004]: W1201 08:38:31.000344 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod04a6dd3a_f297_40b9_b480_0239383b9460.slice/crio-08a24e1b2c44cce6f05bd38dda62c9c41978f0e7f37ada306d76c5a557964b24 WatchSource:0}: Error finding container 08a24e1b2c44cce6f05bd38dda62c9c41978f0e7f37ada306d76c5a557964b24: Status 404 returned error can't find the container with id 08a24e1b2c44cce6f05bd38dda62c9c41978f0e7f37ada306d76c5a557964b24 Dec 01 08:38:31 crc kubenswrapper[5004]: E1201 08:38:31.005695 5004 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 01 08:38:31 crc kubenswrapper[5004]: E1201 08:38:31.005888 5004 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug 
--bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-b4k9h,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-vbwft_openstack(24875a21-fff6-4224-aa6c-a3264f6f0c26): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 08:38:31 crc kubenswrapper[5004]: E1201 08:38:31.007253 5004 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-675f4bcbfc-vbwft" podUID="24875a21-fff6-4224-aa6c-a3264f6f0c26" Dec 01 08:38:31 crc kubenswrapper[5004]: I1201 08:38:31.524098 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 01 08:38:31 crc kubenswrapper[5004]: W1201 08:38:31.528349 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode17a426c_0069_4c51_91ad_e5fbf6e0bb2a.slice/crio-207ade0bc554b41d9f45fe1cb0ddb6600e0af82e2ae314fd7caaf67248a79fa6 WatchSource:0}: Error finding container 207ade0bc554b41d9f45fe1cb0ddb6600e0af82e2ae314fd7caaf67248a79fa6: Status 404 returned error can't find the container with id 207ade0bc554b41d9f45fe1cb0ddb6600e0af82e2ae314fd7caaf67248a79fa6 Dec 01 08:38:31 crc kubenswrapper[5004]: I1201 08:38:31.962294 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-ui-dashboards-7d5fb4cbfb-5htt9"] Dec 01 08:38:31 crc kubenswrapper[5004]: I1201 08:38:31.974169 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 01 08:38:31 crc kubenswrapper[5004]: W1201 08:38:31.986094 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc74b987d_43c4_49d9_92fc_f13f3b7b7dd2.slice/crio-3d2a125fdb9901a662791555122bad5681e504991ed1ee3567418b3d384274a2 WatchSource:0}: Error finding container 3d2a125fdb9901a662791555122bad5681e504991ed1ee3567418b3d384274a2: Status 404 returned error can't find the container with id 3d2a125fdb9901a662791555122bad5681e504991ed1ee3567418b3d384274a2 Dec 01 08:38:31 crc kubenswrapper[5004]: W1201 08:38:31.986757 5004 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1ff890d7_d00c_4b87_86d6_3eb403821ee3.slice/crio-1a7fb0e76a1ff8784ae21b2565cf42fe47e6f90c5820d7d7bd9b940fa3b8344b WatchSource:0}: Error finding container 1a7fb0e76a1ff8784ae21b2565cf42fe47e6f90c5820d7d7bd9b940fa3b8344b: Status 404 returned error can't find the container with id 1a7fb0e76a1ff8784ae21b2565cf42fe47e6f90c5820d7d7bd9b940fa3b8344b Dec 01 08:38:31 crc kubenswrapper[5004]: I1201 08:38:31.995295 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e17a426c-0069-4c51-91ad-e5fbf6e0bb2a","Type":"ContainerStarted","Data":"207ade0bc554b41d9f45fe1cb0ddb6600e0af82e2ae314fd7caaf67248a79fa6"} Dec 01 08:38:32 crc kubenswrapper[5004]: I1201 08:38:32.004064 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 01 08:38:32 crc kubenswrapper[5004]: I1201 08:38:32.015358 5004 generic.go:334] "Generic (PLEG): container finished" podID="bbab4428-7b77-4b11-87b3-d720250c9b77" containerID="4a986e73a4741cdd92ca2dd2ca35677a113b10773a209e823827a6e0c8f57d38" exitCode=0 Dec 01 08:38:32 crc kubenswrapper[5004]: I1201 08:38:32.015507 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-bvrdk" event={"ID":"bbab4428-7b77-4b11-87b3-d720250c9b77","Type":"ContainerDied","Data":"4a986e73a4741cdd92ca2dd2ca35677a113b10773a209e823827a6e0c8f57d38"} Dec 01 08:38:32 crc kubenswrapper[5004]: I1201 08:38:32.021610 5004 generic.go:334] "Generic (PLEG): container finished" podID="7fe1d5ea-e24c-44b9-9de2-5011b3fc04fd" containerID="11f560d5ee547328e29fe33ff5aac6d254796633ea0b39b787ba0fce806e015f" exitCode=0 Dec 01 08:38:32 crc kubenswrapper[5004]: I1201 08:38:32.021931 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-hrcgz" 
event={"ID":"7fe1d5ea-e24c-44b9-9de2-5011b3fc04fd","Type":"ContainerDied","Data":"11f560d5ee547328e29fe33ff5aac6d254796633ea0b39b787ba0fce806e015f"} Dec 01 08:38:32 crc kubenswrapper[5004]: I1201 08:38:32.023194 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 01 08:38:32 crc kubenswrapper[5004]: I1201 08:38:32.030543 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"bb30b7a7-42e1-421b-8673-7f3c8f5cfae3","Type":"ContainerStarted","Data":"fe10a10502b3c352c6129078fd1b5848292fcffb9a770017770220ba8daa3e8e"} Dec 01 08:38:32 crc kubenswrapper[5004]: I1201 08:38:32.031834 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"04a6dd3a-f297-40b9-b480-0239383b9460","Type":"ContainerStarted","Data":"08a24e1b2c44cce6f05bd38dda62c9c41978f0e7f37ada306d76c5a557964b24"} Dec 01 08:38:32 crc kubenswrapper[5004]: I1201 08:38:32.213543 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-8qrgn"] Dec 01 08:38:32 crc kubenswrapper[5004]: W1201 08:38:32.221014 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0b64b7d6_2fe8_43b0_9632_84e70a749fe9.slice/crio-9ca8229c76fc89ae50cd559c40c8794db4f662f82f42e2cc55ec2b089a4a8239 WatchSource:0}: Error finding container 9ca8229c76fc89ae50cd559c40c8794db4f662f82f42e2cc55ec2b089a4a8239: Status 404 returned error can't find the container with id 9ca8229c76fc89ae50cd559c40c8794db4f662f82f42e2cc55ec2b089a4a8239 Dec 01 08:38:32 crc kubenswrapper[5004]: I1201 08:38:32.612920 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-vbwft" Dec 01 08:38:32 crc kubenswrapper[5004]: I1201 08:38:32.624693 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-pqhzv" Dec 01 08:38:32 crc kubenswrapper[5004]: I1201 08:38:32.771903 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24875a21-fff6-4224-aa6c-a3264f6f0c26-config\") pod \"24875a21-fff6-4224-aa6c-a3264f6f0c26\" (UID: \"24875a21-fff6-4224-aa6c-a3264f6f0c26\") " Dec 01 08:38:32 crc kubenswrapper[5004]: I1201 08:38:32.772065 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ad2a2834-a832-452e-bd49-73656a9cde6a-dns-svc\") pod \"ad2a2834-a832-452e-bd49-73656a9cde6a\" (UID: \"ad2a2834-a832-452e-bd49-73656a9cde6a\") " Dec 01 08:38:32 crc kubenswrapper[5004]: I1201 08:38:32.772278 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jlsss\" (UniqueName: \"kubernetes.io/projected/ad2a2834-a832-452e-bd49-73656a9cde6a-kube-api-access-jlsss\") pod \"ad2a2834-a832-452e-bd49-73656a9cde6a\" (UID: \"ad2a2834-a832-452e-bd49-73656a9cde6a\") " Dec 01 08:38:32 crc kubenswrapper[5004]: I1201 08:38:32.772395 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b4k9h\" (UniqueName: \"kubernetes.io/projected/24875a21-fff6-4224-aa6c-a3264f6f0c26-kube-api-access-b4k9h\") pod \"24875a21-fff6-4224-aa6c-a3264f6f0c26\" (UID: \"24875a21-fff6-4224-aa6c-a3264f6f0c26\") " Dec 01 08:38:32 crc kubenswrapper[5004]: I1201 08:38:32.772527 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad2a2834-a832-452e-bd49-73656a9cde6a-config\") pod \"ad2a2834-a832-452e-bd49-73656a9cde6a\" (UID: \"ad2a2834-a832-452e-bd49-73656a9cde6a\") " Dec 01 08:38:32 crc kubenswrapper[5004]: I1201 08:38:32.773423 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/ad2a2834-a832-452e-bd49-73656a9cde6a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ad2a2834-a832-452e-bd49-73656a9cde6a" (UID: "ad2a2834-a832-452e-bd49-73656a9cde6a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:38:32 crc kubenswrapper[5004]: I1201 08:38:32.773831 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24875a21-fff6-4224-aa6c-a3264f6f0c26-config" (OuterVolumeSpecName: "config") pod "24875a21-fff6-4224-aa6c-a3264f6f0c26" (UID: "24875a21-fff6-4224-aa6c-a3264f6f0c26"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:38:32 crc kubenswrapper[5004]: I1201 08:38:32.776083 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad2a2834-a832-452e-bd49-73656a9cde6a-config" (OuterVolumeSpecName: "config") pod "ad2a2834-a832-452e-bd49-73656a9cde6a" (UID: "ad2a2834-a832-452e-bd49-73656a9cde6a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:38:32 crc kubenswrapper[5004]: I1201 08:38:32.780224 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24875a21-fff6-4224-aa6c-a3264f6f0c26-kube-api-access-b4k9h" (OuterVolumeSpecName: "kube-api-access-b4k9h") pod "24875a21-fff6-4224-aa6c-a3264f6f0c26" (UID: "24875a21-fff6-4224-aa6c-a3264f6f0c26"). InnerVolumeSpecName "kube-api-access-b4k9h". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:38:32 crc kubenswrapper[5004]: I1201 08:38:32.784681 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad2a2834-a832-452e-bd49-73656a9cde6a-kube-api-access-jlsss" (OuterVolumeSpecName: "kube-api-access-jlsss") pod "ad2a2834-a832-452e-bd49-73656a9cde6a" (UID: "ad2a2834-a832-452e-bd49-73656a9cde6a"). InnerVolumeSpecName "kube-api-access-jlsss". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:38:32 crc kubenswrapper[5004]: I1201 08:38:32.832016 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-r8x2b"] Dec 01 08:38:32 crc kubenswrapper[5004]: I1201 08:38:32.838269 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7cc4f8f495-54q5l"] Dec 01 08:38:32 crc kubenswrapper[5004]: I1201 08:38:32.861110 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 01 08:38:32 crc kubenswrapper[5004]: I1201 08:38:32.880385 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jlsss\" (UniqueName: \"kubernetes.io/projected/ad2a2834-a832-452e-bd49-73656a9cde6a-kube-api-access-jlsss\") on node \"crc\" DevicePath \"\"" Dec 01 08:38:32 crc kubenswrapper[5004]: I1201 08:38:32.880417 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b4k9h\" (UniqueName: \"kubernetes.io/projected/24875a21-fff6-4224-aa6c-a3264f6f0c26-kube-api-access-b4k9h\") on node \"crc\" DevicePath \"\"" Dec 01 08:38:32 crc kubenswrapper[5004]: I1201 08:38:32.880427 5004 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad2a2834-a832-452e-bd49-73656a9cde6a-config\") on node \"crc\" DevicePath \"\"" Dec 01 08:38:32 crc kubenswrapper[5004]: I1201 08:38:32.880437 5004 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24875a21-fff6-4224-aa6c-a3264f6f0c26-config\") on node \"crc\" DevicePath \"\"" Dec 01 08:38:32 crc kubenswrapper[5004]: I1201 08:38:32.880446 5004 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ad2a2834-a832-452e-bd49-73656a9cde6a-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 08:38:32 crc kubenswrapper[5004]: I1201 08:38:32.912592 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/ovsdbserver-sb-0"] Dec 01 08:38:33 crc kubenswrapper[5004]: I1201 08:38:33.044066 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-bvrdk" event={"ID":"bbab4428-7b77-4b11-87b3-d720250c9b77","Type":"ContainerStarted","Data":"d93e4648a465a94411d3c539eb3775cf8b01b0fe9c299cfe89390de91e851d9e"} Dec 01 08:38:33 crc kubenswrapper[5004]: I1201 08:38:33.045042 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-bvrdk" Dec 01 08:38:33 crc kubenswrapper[5004]: I1201 08:38:33.046081 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"c74b987d-43c4-49d9-92fc-f13f3b7b7dd2","Type":"ContainerStarted","Data":"3d2a125fdb9901a662791555122bad5681e504991ed1ee3567418b3d384274a2"} Dec 01 08:38:33 crc kubenswrapper[5004]: I1201 08:38:33.048218 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b571e2e5-2a78-45af-83aa-3d874b2569b3","Type":"ContainerStarted","Data":"09c5010b95ba396d08e791d342ba34fd773f6961f449432d7a61ef4a0f3ba58d"} Dec 01 08:38:33 crc kubenswrapper[5004]: I1201 08:38:33.049397 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-pqhzv" event={"ID":"ad2a2834-a832-452e-bd49-73656a9cde6a","Type":"ContainerDied","Data":"a4172d14cb308075c7f3ebb6a4c31303beb3274a663a3f2c25529bff87227cd9"} Dec 01 08:38:33 crc kubenswrapper[5004]: I1201 08:38:33.049473 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-pqhzv" Dec 01 08:38:33 crc kubenswrapper[5004]: I1201 08:38:33.055422 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-5htt9" event={"ID":"1ff890d7-d00c-4b87-86d6-3eb403821ee3","Type":"ContainerStarted","Data":"1a7fb0e76a1ff8784ae21b2565cf42fe47e6f90c5820d7d7bd9b940fa3b8344b"} Dec 01 08:38:33 crc kubenswrapper[5004]: I1201 08:38:33.057460 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"5fcd0c1b-820e-4bc1-b3fd-22ac13415e3c","Type":"ContainerStarted","Data":"2402adbdc1bb8af82f683494045bbbfce5c45c16acc1dd2ccf1770c5ec08bdac"} Dec 01 08:38:33 crc kubenswrapper[5004]: I1201 08:38:33.060935 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-vbwft" event={"ID":"24875a21-fff6-4224-aa6c-a3264f6f0c26","Type":"ContainerDied","Data":"6a0fd550170224ebe9d0d1d1d0f680b16a2ca1867996d762c08f8776ea18346c"} Dec 01 08:38:33 crc kubenswrapper[5004]: I1201 08:38:33.060941 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-vbwft" Dec 01 08:38:33 crc kubenswrapper[5004]: I1201 08:38:33.062157 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-8qrgn" event={"ID":"0b64b7d6-2fe8-43b0-9632-84e70a749fe9","Type":"ContainerStarted","Data":"9ca8229c76fc89ae50cd559c40c8794db4f662f82f42e2cc55ec2b089a4a8239"} Dec 01 08:38:33 crc kubenswrapper[5004]: I1201 08:38:33.066780 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-bvrdk" podStartSLOduration=3.399546135 podStartE2EDuration="18.066767355s" podCreationTimestamp="2025-12-01 08:38:15 +0000 UTC" firstStartedPulling="2025-12-01 08:38:16.56475194 +0000 UTC m=+1274.129743922" lastFinishedPulling="2025-12-01 08:38:31.23197316 +0000 UTC m=+1288.796965142" observedRunningTime="2025-12-01 08:38:33.057879145 +0000 UTC m=+1290.622871127" watchObservedRunningTime="2025-12-01 08:38:33.066767355 +0000 UTC m=+1290.631759337" Dec 01 08:38:33 crc kubenswrapper[5004]: I1201 08:38:33.067295 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-hrcgz" event={"ID":"7fe1d5ea-e24c-44b9-9de2-5011b3fc04fd","Type":"ContainerStarted","Data":"3df8d11558a7bc0eb49a8052c0f22d46a69e0172109f9fd14bf498c269b2e7ad"} Dec 01 08:38:33 crc kubenswrapper[5004]: I1201 08:38:33.067522 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-666b6646f7-hrcgz" Dec 01 08:38:33 crc kubenswrapper[5004]: I1201 08:38:33.109646 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-vbwft"] Dec 01 08:38:33 crc kubenswrapper[5004]: I1201 08:38:33.116380 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-vbwft"] Dec 01 08:38:33 crc kubenswrapper[5004]: I1201 08:38:33.119269 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/dnsmasq-dns-666b6646f7-hrcgz" podStartSLOduration=3.274644006 podStartE2EDuration="18.119257558s" podCreationTimestamp="2025-12-01 08:38:15 +0000 UTC" firstStartedPulling="2025-12-01 08:38:16.322698745 +0000 UTC m=+1273.887690727" lastFinishedPulling="2025-12-01 08:38:31.167312287 +0000 UTC m=+1288.732304279" observedRunningTime="2025-12-01 08:38:33.103757061 +0000 UTC m=+1290.668749063" watchObservedRunningTime="2025-12-01 08:38:33.119257558 +0000 UTC m=+1290.684249540" Dec 01 08:38:33 crc kubenswrapper[5004]: I1201 08:38:33.143508 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-pqhzv"] Dec 01 08:38:33 crc kubenswrapper[5004]: I1201 08:38:33.156500 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-pqhzv"] Dec 01 08:38:33 crc kubenswrapper[5004]: W1201 08:38:33.288131 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod08dfc50d_80b8_4885_826d_4a8314b46234.slice/crio-d876d6a6c08dfa2604965630cfdc6b8116aaa088cb4dc9fe5c9b553bb7b13095 WatchSource:0}: Error finding container d876d6a6c08dfa2604965630cfdc6b8116aaa088cb4dc9fe5c9b553bb7b13095: Status 404 returned error can't find the container with id d876d6a6c08dfa2604965630cfdc6b8116aaa088cb4dc9fe5c9b553bb7b13095 Dec 01 08:38:33 crc kubenswrapper[5004]: W1201 08:38:33.289865 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6362259b_bac4_4df3_ad0c_d76511731aae.slice/crio-05cba64d91fef4ee32f75a2339ad68e9b4cc1a7966f64e9d5366263619052ded WatchSource:0}: Error finding container 05cba64d91fef4ee32f75a2339ad68e9b4cc1a7966f64e9d5366263619052ded: Status 404 returned error can't find the container with id 05cba64d91fef4ee32f75a2339ad68e9b4cc1a7966f64e9d5366263619052ded Dec 01 08:38:33 crc kubenswrapper[5004]: W1201 08:38:33.293232 5004 manager.go:1169] Failed to 
process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode4e3584c_a3b8_4fef_99cf_fd56218b1299.slice/crio-64dfd091a65f51ec22bad9956fcfc1ef1090b6ed5bd0881f6645d054c138fa0f WatchSource:0}: Error finding container 64dfd091a65f51ec22bad9956fcfc1ef1090b6ed5bd0881f6645d054c138fa0f: Status 404 returned error can't find the container with id 64dfd091a65f51ec22bad9956fcfc1ef1090b6ed5bd0881f6645d054c138fa0f Dec 01 08:38:33 crc kubenswrapper[5004]: W1201 08:38:33.296173 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeffd853b_0b95_4749_8119_88fcfaf8b0c0.slice/crio-68ca055ee4bfb82a4a167d3d34a64c2e4bcb88c1c0674db15e3fd2519f298370 WatchSource:0}: Error finding container 68ca055ee4bfb82a4a167d3d34a64c2e4bcb88c1c0674db15e3fd2519f298370: Status 404 returned error can't find the container with id 68ca055ee4bfb82a4a167d3d34a64c2e4bcb88c1c0674db15e3fd2519f298370 Dec 01 08:38:33 crc kubenswrapper[5004]: I1201 08:38:33.713973 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 01 08:38:33 crc kubenswrapper[5004]: W1201 08:38:33.865353 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc5ea92c9_9b0a_473a_872f_a78f27946432.slice/crio-d763fd0d66bd48cb85f83a7cc8407958cc4d5fc7721b057a897f55b6753469e3 WatchSource:0}: Error finding container d763fd0d66bd48cb85f83a7cc8407958cc4d5fc7721b057a897f55b6753469e3: Status 404 returned error can't find the container with id d763fd0d66bd48cb85f83a7cc8407958cc4d5fc7721b057a897f55b6753469e3 Dec 01 08:38:34 crc kubenswrapper[5004]: I1201 08:38:34.076919 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"c5ea92c9-9b0a-473a-872f-a78f27946432","Type":"ContainerStarted","Data":"d763fd0d66bd48cb85f83a7cc8407958cc4d5fc7721b057a897f55b6753469e3"} Dec 01 08:38:34 
crc kubenswrapper[5004]: I1201 08:38:34.078980 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-r8x2b" event={"ID":"effd853b-0b95-4749-8119-88fcfaf8b0c0","Type":"ContainerStarted","Data":"68ca055ee4bfb82a4a167d3d34a64c2e4bcb88c1c0674db15e3fd2519f298370"} Dec 01 08:38:34 crc kubenswrapper[5004]: I1201 08:38:34.080688 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7cc4f8f495-54q5l" event={"ID":"e4e3584c-a3b8-4fef-99cf-fd56218b1299","Type":"ContainerStarted","Data":"64dfd091a65f51ec22bad9956fcfc1ef1090b6ed5bd0881f6645d054c138fa0f"} Dec 01 08:38:34 crc kubenswrapper[5004]: I1201 08:38:34.081868 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"6362259b-bac4-4df3-ad0c-d76511731aae","Type":"ContainerStarted","Data":"05cba64d91fef4ee32f75a2339ad68e9b4cc1a7966f64e9d5366263619052ded"} Dec 01 08:38:34 crc kubenswrapper[5004]: I1201 08:38:34.083485 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"08dfc50d-80b8-4885-826d-4a8314b46234","Type":"ContainerStarted","Data":"d876d6a6c08dfa2604965630cfdc6b8116aaa088cb4dc9fe5c9b553bb7b13095"} Dec 01 08:38:34 crc kubenswrapper[5004]: I1201 08:38:34.789789 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24875a21-fff6-4224-aa6c-a3264f6f0c26" path="/var/lib/kubelet/pods/24875a21-fff6-4224-aa6c-a3264f6f0c26/volumes" Dec 01 08:38:34 crc kubenswrapper[5004]: I1201 08:38:34.790485 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad2a2834-a832-452e-bd49-73656a9cde6a" path="/var/lib/kubelet/pods/ad2a2834-a832-452e-bd49-73656a9cde6a/volumes" Dec 01 08:38:36 crc kubenswrapper[5004]: I1201 08:38:36.108844 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7cc4f8f495-54q5l" 
event={"ID":"e4e3584c-a3b8-4fef-99cf-fd56218b1299","Type":"ContainerStarted","Data":"25d3f590bb2a8729216df274e70fc24de55ff8c00d79cd1fff2ff81e65062faf"} Dec 01 08:38:36 crc kubenswrapper[5004]: I1201 08:38:36.135389 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7cc4f8f495-54q5l" podStartSLOduration=14.135365816 podStartE2EDuration="14.135365816s" podCreationTimestamp="2025-12-01 08:38:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:38:36.129585548 +0000 UTC m=+1293.694577560" watchObservedRunningTime="2025-12-01 08:38:36.135365816 +0000 UTC m=+1293.700357818" Dec 01 08:38:38 crc kubenswrapper[5004]: I1201 08:38:38.933105 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-tbwk4"] Dec 01 08:38:38 crc kubenswrapper[5004]: I1201 08:38:38.935360 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-tbwk4" Dec 01 08:38:38 crc kubenswrapper[5004]: I1201 08:38:38.938222 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Dec 01 08:38:38 crc kubenswrapper[5004]: I1201 08:38:38.942756 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-tbwk4"] Dec 01 08:38:39 crc kubenswrapper[5004]: I1201 08:38:39.051162 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cfb808bc-b59f-492b-a3aa-d817263501a5-config\") pod \"ovn-controller-metrics-tbwk4\" (UID: \"cfb808bc-b59f-492b-a3aa-d817263501a5\") " pod="openstack/ovn-controller-metrics-tbwk4" Dec 01 08:38:39 crc kubenswrapper[5004]: I1201 08:38:39.051247 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/cfb808bc-b59f-492b-a3aa-d817263501a5-ovs-rundir\") pod \"ovn-controller-metrics-tbwk4\" (UID: \"cfb808bc-b59f-492b-a3aa-d817263501a5\") " pod="openstack/ovn-controller-metrics-tbwk4" Dec 01 08:38:39 crc kubenswrapper[5004]: I1201 08:38:39.051318 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfb808bc-b59f-492b-a3aa-d817263501a5-combined-ca-bundle\") pod \"ovn-controller-metrics-tbwk4\" (UID: \"cfb808bc-b59f-492b-a3aa-d817263501a5\") " pod="openstack/ovn-controller-metrics-tbwk4" Dec 01 08:38:39 crc kubenswrapper[5004]: I1201 08:38:39.051383 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/cfb808bc-b59f-492b-a3aa-d817263501a5-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-tbwk4\" (UID: \"cfb808bc-b59f-492b-a3aa-d817263501a5\") " 
pod="openstack/ovn-controller-metrics-tbwk4" Dec 01 08:38:39 crc kubenswrapper[5004]: I1201 08:38:39.051402 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8nmt\" (UniqueName: \"kubernetes.io/projected/cfb808bc-b59f-492b-a3aa-d817263501a5-kube-api-access-s8nmt\") pod \"ovn-controller-metrics-tbwk4\" (UID: \"cfb808bc-b59f-492b-a3aa-d817263501a5\") " pod="openstack/ovn-controller-metrics-tbwk4" Dec 01 08:38:39 crc kubenswrapper[5004]: I1201 08:38:39.051420 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/cfb808bc-b59f-492b-a3aa-d817263501a5-ovn-rundir\") pod \"ovn-controller-metrics-tbwk4\" (UID: \"cfb808bc-b59f-492b-a3aa-d817263501a5\") " pod="openstack/ovn-controller-metrics-tbwk4" Dec 01 08:38:39 crc kubenswrapper[5004]: I1201 08:38:39.073427 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-bvrdk"] Dec 01 08:38:39 crc kubenswrapper[5004]: I1201 08:38:39.073651 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-bvrdk" podUID="bbab4428-7b77-4b11-87b3-d720250c9b77" containerName="dnsmasq-dns" containerID="cri-o://d93e4648a465a94411d3c539eb3775cf8b01b0fe9c299cfe89390de91e851d9e" gracePeriod=10 Dec 01 08:38:39 crc kubenswrapper[5004]: I1201 08:38:39.076742 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57d769cc4f-bvrdk" Dec 01 08:38:39 crc kubenswrapper[5004]: I1201 08:38:39.098717 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-wl2nn"] Dec 01 08:38:39 crc kubenswrapper[5004]: I1201 08:38:39.100303 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-wl2nn" Dec 01 08:38:39 crc kubenswrapper[5004]: I1201 08:38:39.112347 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Dec 01 08:38:39 crc kubenswrapper[5004]: I1201 08:38:39.116676 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-wl2nn"] Dec 01 08:38:39 crc kubenswrapper[5004]: I1201 08:38:39.157647 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/cfb808bc-b59f-492b-a3aa-d817263501a5-ovn-rundir\") pod \"ovn-controller-metrics-tbwk4\" (UID: \"cfb808bc-b59f-492b-a3aa-d817263501a5\") " pod="openstack/ovn-controller-metrics-tbwk4" Dec 01 08:38:39 crc kubenswrapper[5004]: I1201 08:38:39.157749 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cfb808bc-b59f-492b-a3aa-d817263501a5-config\") pod \"ovn-controller-metrics-tbwk4\" (UID: \"cfb808bc-b59f-492b-a3aa-d817263501a5\") " pod="openstack/ovn-controller-metrics-tbwk4" Dec 01 08:38:39 crc kubenswrapper[5004]: I1201 08:38:39.157776 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/cfb808bc-b59f-492b-a3aa-d817263501a5-ovs-rundir\") pod \"ovn-controller-metrics-tbwk4\" (UID: \"cfb808bc-b59f-492b-a3aa-d817263501a5\") " pod="openstack/ovn-controller-metrics-tbwk4" Dec 01 08:38:39 crc kubenswrapper[5004]: I1201 08:38:39.157834 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfb808bc-b59f-492b-a3aa-d817263501a5-combined-ca-bundle\") pod \"ovn-controller-metrics-tbwk4\" (UID: \"cfb808bc-b59f-492b-a3aa-d817263501a5\") " pod="openstack/ovn-controller-metrics-tbwk4" Dec 01 08:38:39 crc kubenswrapper[5004]: I1201 08:38:39.157894 5004 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/cfb808bc-b59f-492b-a3aa-d817263501a5-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-tbwk4\" (UID: \"cfb808bc-b59f-492b-a3aa-d817263501a5\") " pod="openstack/ovn-controller-metrics-tbwk4" Dec 01 08:38:39 crc kubenswrapper[5004]: I1201 08:38:39.157912 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8nmt\" (UniqueName: \"kubernetes.io/projected/cfb808bc-b59f-492b-a3aa-d817263501a5-kube-api-access-s8nmt\") pod \"ovn-controller-metrics-tbwk4\" (UID: \"cfb808bc-b59f-492b-a3aa-d817263501a5\") " pod="openstack/ovn-controller-metrics-tbwk4" Dec 01 08:38:39 crc kubenswrapper[5004]: I1201 08:38:39.158455 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/cfb808bc-b59f-492b-a3aa-d817263501a5-ovn-rundir\") pod \"ovn-controller-metrics-tbwk4\" (UID: \"cfb808bc-b59f-492b-a3aa-d817263501a5\") " pod="openstack/ovn-controller-metrics-tbwk4" Dec 01 08:38:39 crc kubenswrapper[5004]: I1201 08:38:39.159067 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cfb808bc-b59f-492b-a3aa-d817263501a5-config\") pod \"ovn-controller-metrics-tbwk4\" (UID: \"cfb808bc-b59f-492b-a3aa-d817263501a5\") " pod="openstack/ovn-controller-metrics-tbwk4" Dec 01 08:38:39 crc kubenswrapper[5004]: I1201 08:38:39.159122 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/cfb808bc-b59f-492b-a3aa-d817263501a5-ovs-rundir\") pod \"ovn-controller-metrics-tbwk4\" (UID: \"cfb808bc-b59f-492b-a3aa-d817263501a5\") " pod="openstack/ovn-controller-metrics-tbwk4" Dec 01 08:38:39 crc kubenswrapper[5004]: I1201 08:38:39.169084 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/cfb808bc-b59f-492b-a3aa-d817263501a5-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-tbwk4\" (UID: \"cfb808bc-b59f-492b-a3aa-d817263501a5\") " pod="openstack/ovn-controller-metrics-tbwk4" Dec 01 08:38:39 crc kubenswrapper[5004]: I1201 08:38:39.175473 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfb808bc-b59f-492b-a3aa-d817263501a5-combined-ca-bundle\") pod \"ovn-controller-metrics-tbwk4\" (UID: \"cfb808bc-b59f-492b-a3aa-d817263501a5\") " pod="openstack/ovn-controller-metrics-tbwk4" Dec 01 08:38:39 crc kubenswrapper[5004]: I1201 08:38:39.188707 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8nmt\" (UniqueName: \"kubernetes.io/projected/cfb808bc-b59f-492b-a3aa-d817263501a5-kube-api-access-s8nmt\") pod \"ovn-controller-metrics-tbwk4\" (UID: \"cfb808bc-b59f-492b-a3aa-d817263501a5\") " pod="openstack/ovn-controller-metrics-tbwk4" Dec 01 08:38:39 crc kubenswrapper[5004]: I1201 08:38:39.258438 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-hrcgz"] Dec 01 08:38:39 crc kubenswrapper[5004]: I1201 08:38:39.259936 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-666b6646f7-hrcgz" podUID="7fe1d5ea-e24c-44b9-9de2-5011b3fc04fd" containerName="dnsmasq-dns" containerID="cri-o://3df8d11558a7bc0eb49a8052c0f22d46a69e0172109f9fd14bf498c269b2e7ad" gracePeriod=10 Dec 01 08:38:39 crc kubenswrapper[5004]: I1201 08:38:39.259713 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7ce540a3-feaa-469f-85cf-ec800dd6b1bc-dns-svc\") pod \"dnsmasq-dns-5bf47b49b7-wl2nn\" (UID: \"7ce540a3-feaa-469f-85cf-ec800dd6b1bc\") " pod="openstack/dnsmasq-dns-5bf47b49b7-wl2nn" Dec 01 08:38:39 crc kubenswrapper[5004]: 
I1201 08:38:39.260128 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97s6f\" (UniqueName: \"kubernetes.io/projected/7ce540a3-feaa-469f-85cf-ec800dd6b1bc-kube-api-access-97s6f\") pod \"dnsmasq-dns-5bf47b49b7-wl2nn\" (UID: \"7ce540a3-feaa-469f-85cf-ec800dd6b1bc\") " pod="openstack/dnsmasq-dns-5bf47b49b7-wl2nn" Dec 01 08:38:39 crc kubenswrapper[5004]: I1201 08:38:39.260157 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ce540a3-feaa-469f-85cf-ec800dd6b1bc-config\") pod \"dnsmasq-dns-5bf47b49b7-wl2nn\" (UID: \"7ce540a3-feaa-469f-85cf-ec800dd6b1bc\") " pod="openstack/dnsmasq-dns-5bf47b49b7-wl2nn" Dec 01 08:38:39 crc kubenswrapper[5004]: I1201 08:38:39.260199 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7ce540a3-feaa-469f-85cf-ec800dd6b1bc-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf47b49b7-wl2nn\" (UID: \"7ce540a3-feaa-469f-85cf-ec800dd6b1bc\") " pod="openstack/dnsmasq-dns-5bf47b49b7-wl2nn" Dec 01 08:38:39 crc kubenswrapper[5004]: I1201 08:38:39.260903 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-666b6646f7-hrcgz" Dec 01 08:38:39 crc kubenswrapper[5004]: I1201 08:38:39.270034 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-tbwk4" Dec 01 08:38:39 crc kubenswrapper[5004]: I1201 08:38:39.293671 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8554648995-4vb89"] Dec 01 08:38:39 crc kubenswrapper[5004]: I1201 08:38:39.295361 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-4vb89" Dec 01 08:38:39 crc kubenswrapper[5004]: I1201 08:38:39.302419 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Dec 01 08:38:39 crc kubenswrapper[5004]: I1201 08:38:39.325419 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-4vb89"] Dec 01 08:38:39 crc kubenswrapper[5004]: I1201 08:38:39.361592 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7ce540a3-feaa-469f-85cf-ec800dd6b1bc-dns-svc\") pod \"dnsmasq-dns-5bf47b49b7-wl2nn\" (UID: \"7ce540a3-feaa-469f-85cf-ec800dd6b1bc\") " pod="openstack/dnsmasq-dns-5bf47b49b7-wl2nn" Dec 01 08:38:39 crc kubenswrapper[5004]: I1201 08:38:39.361726 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-97s6f\" (UniqueName: \"kubernetes.io/projected/7ce540a3-feaa-469f-85cf-ec800dd6b1bc-kube-api-access-97s6f\") pod \"dnsmasq-dns-5bf47b49b7-wl2nn\" (UID: \"7ce540a3-feaa-469f-85cf-ec800dd6b1bc\") " pod="openstack/dnsmasq-dns-5bf47b49b7-wl2nn" Dec 01 08:38:39 crc kubenswrapper[5004]: I1201 08:38:39.361751 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ce540a3-feaa-469f-85cf-ec800dd6b1bc-config\") pod \"dnsmasq-dns-5bf47b49b7-wl2nn\" (UID: \"7ce540a3-feaa-469f-85cf-ec800dd6b1bc\") " pod="openstack/dnsmasq-dns-5bf47b49b7-wl2nn" Dec 01 08:38:39 crc kubenswrapper[5004]: I1201 08:38:39.361791 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7ce540a3-feaa-469f-85cf-ec800dd6b1bc-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf47b49b7-wl2nn\" (UID: \"7ce540a3-feaa-469f-85cf-ec800dd6b1bc\") " pod="openstack/dnsmasq-dns-5bf47b49b7-wl2nn" Dec 01 08:38:39 crc kubenswrapper[5004]: I1201 08:38:39.362602 
5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7ce540a3-feaa-469f-85cf-ec800dd6b1bc-dns-svc\") pod \"dnsmasq-dns-5bf47b49b7-wl2nn\" (UID: \"7ce540a3-feaa-469f-85cf-ec800dd6b1bc\") " pod="openstack/dnsmasq-dns-5bf47b49b7-wl2nn" Dec 01 08:38:39 crc kubenswrapper[5004]: I1201 08:38:39.362641 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ce540a3-feaa-469f-85cf-ec800dd6b1bc-config\") pod \"dnsmasq-dns-5bf47b49b7-wl2nn\" (UID: \"7ce540a3-feaa-469f-85cf-ec800dd6b1bc\") " pod="openstack/dnsmasq-dns-5bf47b49b7-wl2nn" Dec 01 08:38:39 crc kubenswrapper[5004]: I1201 08:38:39.362671 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7ce540a3-feaa-469f-85cf-ec800dd6b1bc-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf47b49b7-wl2nn\" (UID: \"7ce540a3-feaa-469f-85cf-ec800dd6b1bc\") " pod="openstack/dnsmasq-dns-5bf47b49b7-wl2nn" Dec 01 08:38:39 crc kubenswrapper[5004]: I1201 08:38:39.378289 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-97s6f\" (UniqueName: \"kubernetes.io/projected/7ce540a3-feaa-469f-85cf-ec800dd6b1bc-kube-api-access-97s6f\") pod \"dnsmasq-dns-5bf47b49b7-wl2nn\" (UID: \"7ce540a3-feaa-469f-85cf-ec800dd6b1bc\") " pod="openstack/dnsmasq-dns-5bf47b49b7-wl2nn" Dec 01 08:38:39 crc kubenswrapper[5004]: I1201 08:38:39.452871 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-wl2nn" Dec 01 08:38:39 crc kubenswrapper[5004]: I1201 08:38:39.466541 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/abf31963-3bf7-4b6e-adaa-8605634a9530-dns-svc\") pod \"dnsmasq-dns-8554648995-4vb89\" (UID: \"abf31963-3bf7-4b6e-adaa-8605634a9530\") " pod="openstack/dnsmasq-dns-8554648995-4vb89" Dec 01 08:38:39 crc kubenswrapper[5004]: I1201 08:38:39.466664 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/abf31963-3bf7-4b6e-adaa-8605634a9530-config\") pod \"dnsmasq-dns-8554648995-4vb89\" (UID: \"abf31963-3bf7-4b6e-adaa-8605634a9530\") " pod="openstack/dnsmasq-dns-8554648995-4vb89" Dec 01 08:38:39 crc kubenswrapper[5004]: I1201 08:38:39.466831 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/abf31963-3bf7-4b6e-adaa-8605634a9530-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-4vb89\" (UID: \"abf31963-3bf7-4b6e-adaa-8605634a9530\") " pod="openstack/dnsmasq-dns-8554648995-4vb89" Dec 01 08:38:39 crc kubenswrapper[5004]: I1201 08:38:39.466897 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gcqr\" (UniqueName: \"kubernetes.io/projected/abf31963-3bf7-4b6e-adaa-8605634a9530-kube-api-access-9gcqr\") pod \"dnsmasq-dns-8554648995-4vb89\" (UID: \"abf31963-3bf7-4b6e-adaa-8605634a9530\") " pod="openstack/dnsmasq-dns-8554648995-4vb89" Dec 01 08:38:39 crc kubenswrapper[5004]: I1201 08:38:39.466988 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/abf31963-3bf7-4b6e-adaa-8605634a9530-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-4vb89\" 
(UID: \"abf31963-3bf7-4b6e-adaa-8605634a9530\") " pod="openstack/dnsmasq-dns-8554648995-4vb89" Dec 01 08:38:39 crc kubenswrapper[5004]: I1201 08:38:39.568465 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/abf31963-3bf7-4b6e-adaa-8605634a9530-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-4vb89\" (UID: \"abf31963-3bf7-4b6e-adaa-8605634a9530\") " pod="openstack/dnsmasq-dns-8554648995-4vb89" Dec 01 08:38:39 crc kubenswrapper[5004]: I1201 08:38:39.568735 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/abf31963-3bf7-4b6e-adaa-8605634a9530-dns-svc\") pod \"dnsmasq-dns-8554648995-4vb89\" (UID: \"abf31963-3bf7-4b6e-adaa-8605634a9530\") " pod="openstack/dnsmasq-dns-8554648995-4vb89" Dec 01 08:38:39 crc kubenswrapper[5004]: I1201 08:38:39.569308 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/abf31963-3bf7-4b6e-adaa-8605634a9530-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-4vb89\" (UID: \"abf31963-3bf7-4b6e-adaa-8605634a9530\") " pod="openstack/dnsmasq-dns-8554648995-4vb89" Dec 01 08:38:39 crc kubenswrapper[5004]: I1201 08:38:39.569916 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/abf31963-3bf7-4b6e-adaa-8605634a9530-dns-svc\") pod \"dnsmasq-dns-8554648995-4vb89\" (UID: \"abf31963-3bf7-4b6e-adaa-8605634a9530\") " pod="openstack/dnsmasq-dns-8554648995-4vb89" Dec 01 08:38:39 crc kubenswrapper[5004]: I1201 08:38:39.570078 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/abf31963-3bf7-4b6e-adaa-8605634a9530-config\") pod \"dnsmasq-dns-8554648995-4vb89\" (UID: \"abf31963-3bf7-4b6e-adaa-8605634a9530\") " pod="openstack/dnsmasq-dns-8554648995-4vb89" Dec 01 08:38:39 crc 
kubenswrapper[5004]: I1201 08:38:39.570189 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/abf31963-3bf7-4b6e-adaa-8605634a9530-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-4vb89\" (UID: \"abf31963-3bf7-4b6e-adaa-8605634a9530\") " pod="openstack/dnsmasq-dns-8554648995-4vb89" Dec 01 08:38:39 crc kubenswrapper[5004]: I1201 08:38:39.570244 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9gcqr\" (UniqueName: \"kubernetes.io/projected/abf31963-3bf7-4b6e-adaa-8605634a9530-kube-api-access-9gcqr\") pod \"dnsmasq-dns-8554648995-4vb89\" (UID: \"abf31963-3bf7-4b6e-adaa-8605634a9530\") " pod="openstack/dnsmasq-dns-8554648995-4vb89" Dec 01 08:38:39 crc kubenswrapper[5004]: I1201 08:38:39.570790 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/abf31963-3bf7-4b6e-adaa-8605634a9530-config\") pod \"dnsmasq-dns-8554648995-4vb89\" (UID: \"abf31963-3bf7-4b6e-adaa-8605634a9530\") " pod="openstack/dnsmasq-dns-8554648995-4vb89" Dec 01 08:38:39 crc kubenswrapper[5004]: I1201 08:38:39.571167 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/abf31963-3bf7-4b6e-adaa-8605634a9530-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-4vb89\" (UID: \"abf31963-3bf7-4b6e-adaa-8605634a9530\") " pod="openstack/dnsmasq-dns-8554648995-4vb89" Dec 01 08:38:39 crc kubenswrapper[5004]: I1201 08:38:39.588312 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gcqr\" (UniqueName: \"kubernetes.io/projected/abf31963-3bf7-4b6e-adaa-8605634a9530-kube-api-access-9gcqr\") pod \"dnsmasq-dns-8554648995-4vb89\" (UID: \"abf31963-3bf7-4b6e-adaa-8605634a9530\") " pod="openstack/dnsmasq-dns-8554648995-4vb89" Dec 01 08:38:39 crc kubenswrapper[5004]: I1201 08:38:39.626255 5004 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-4vb89" Dec 01 08:38:40 crc kubenswrapper[5004]: I1201 08:38:40.154777 5004 generic.go:334] "Generic (PLEG): container finished" podID="bbab4428-7b77-4b11-87b3-d720250c9b77" containerID="d93e4648a465a94411d3c539eb3775cf8b01b0fe9c299cfe89390de91e851d9e" exitCode=0 Dec 01 08:38:40 crc kubenswrapper[5004]: I1201 08:38:40.154925 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-bvrdk" event={"ID":"bbab4428-7b77-4b11-87b3-d720250c9b77","Type":"ContainerDied","Data":"d93e4648a465a94411d3c539eb3775cf8b01b0fe9c299cfe89390de91e851d9e"} Dec 01 08:38:40 crc kubenswrapper[5004]: I1201 08:38:40.158288 5004 generic.go:334] "Generic (PLEG): container finished" podID="7fe1d5ea-e24c-44b9-9de2-5011b3fc04fd" containerID="3df8d11558a7bc0eb49a8052c0f22d46a69e0172109f9fd14bf498c269b2e7ad" exitCode=0 Dec 01 08:38:40 crc kubenswrapper[5004]: I1201 08:38:40.158341 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-hrcgz" event={"ID":"7fe1d5ea-e24c-44b9-9de2-5011b3fc04fd","Type":"ContainerDied","Data":"3df8d11558a7bc0eb49a8052c0f22d46a69e0172109f9fd14bf498c269b2e7ad"} Dec 01 08:38:40 crc kubenswrapper[5004]: I1201 08:38:40.536608 5004 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-666b6646f7-hrcgz" podUID="7fe1d5ea-e24c-44b9-9de2-5011b3fc04fd" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.127:5353: connect: connection refused" Dec 01 08:38:40 crc kubenswrapper[5004]: I1201 08:38:40.995616 5004 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-57d769cc4f-bvrdk" podUID="bbab4428-7b77-4b11-87b3-d720250c9b77" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.128:5353: connect: connection refused" Dec 01 08:38:43 crc kubenswrapper[5004]: I1201 08:38:43.589548 5004 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-console/console-7cc4f8f495-54q5l" Dec 01 08:38:43 crc kubenswrapper[5004]: I1201 08:38:43.590926 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7cc4f8f495-54q5l" Dec 01 08:38:43 crc kubenswrapper[5004]: I1201 08:38:43.599729 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7cc4f8f495-54q5l" Dec 01 08:38:44 crc kubenswrapper[5004]: I1201 08:38:44.200716 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7cc4f8f495-54q5l" Dec 01 08:38:44 crc kubenswrapper[5004]: I1201 08:38:44.267256 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-df4fb84fc-flnws"] Dec 01 08:38:46 crc kubenswrapper[5004]: I1201 08:38:46.419294 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-hrcgz" Dec 01 08:38:46 crc kubenswrapper[5004]: I1201 08:38:46.429140 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-bvrdk" Dec 01 08:38:46 crc kubenswrapper[5004]: I1201 08:38:46.561229 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vsp2g\" (UniqueName: \"kubernetes.io/projected/7fe1d5ea-e24c-44b9-9de2-5011b3fc04fd-kube-api-access-vsp2g\") pod \"7fe1d5ea-e24c-44b9-9de2-5011b3fc04fd\" (UID: \"7fe1d5ea-e24c-44b9-9de2-5011b3fc04fd\") " Dec 01 08:38:46 crc kubenswrapper[5004]: I1201 08:38:46.561527 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pltws\" (UniqueName: \"kubernetes.io/projected/bbab4428-7b77-4b11-87b3-d720250c9b77-kube-api-access-pltws\") pod \"bbab4428-7b77-4b11-87b3-d720250c9b77\" (UID: \"bbab4428-7b77-4b11-87b3-d720250c9b77\") " Dec 01 08:38:46 crc kubenswrapper[5004]: I1201 08:38:46.561629 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bbab4428-7b77-4b11-87b3-d720250c9b77-dns-svc\") pod \"bbab4428-7b77-4b11-87b3-d720250c9b77\" (UID: \"bbab4428-7b77-4b11-87b3-d720250c9b77\") " Dec 01 08:38:46 crc kubenswrapper[5004]: I1201 08:38:46.561674 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbab4428-7b77-4b11-87b3-d720250c9b77-config\") pod \"bbab4428-7b77-4b11-87b3-d720250c9b77\" (UID: \"bbab4428-7b77-4b11-87b3-d720250c9b77\") " Dec 01 08:38:46 crc kubenswrapper[5004]: I1201 08:38:46.561695 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7fe1d5ea-e24c-44b9-9de2-5011b3fc04fd-dns-svc\") pod \"7fe1d5ea-e24c-44b9-9de2-5011b3fc04fd\" (UID: \"7fe1d5ea-e24c-44b9-9de2-5011b3fc04fd\") " Dec 01 08:38:46 crc kubenswrapper[5004]: I1201 08:38:46.561794 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/7fe1d5ea-e24c-44b9-9de2-5011b3fc04fd-config\") pod \"7fe1d5ea-e24c-44b9-9de2-5011b3fc04fd\" (UID: \"7fe1d5ea-e24c-44b9-9de2-5011b3fc04fd\") " Dec 01 08:38:46 crc kubenswrapper[5004]: I1201 08:38:46.578319 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7fe1d5ea-e24c-44b9-9de2-5011b3fc04fd-kube-api-access-vsp2g" (OuterVolumeSpecName: "kube-api-access-vsp2g") pod "7fe1d5ea-e24c-44b9-9de2-5011b3fc04fd" (UID: "7fe1d5ea-e24c-44b9-9de2-5011b3fc04fd"). InnerVolumeSpecName "kube-api-access-vsp2g". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:38:46 crc kubenswrapper[5004]: I1201 08:38:46.578822 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bbab4428-7b77-4b11-87b3-d720250c9b77-kube-api-access-pltws" (OuterVolumeSpecName: "kube-api-access-pltws") pod "bbab4428-7b77-4b11-87b3-d720250c9b77" (UID: "bbab4428-7b77-4b11-87b3-d720250c9b77"). InnerVolumeSpecName "kube-api-access-pltws". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:38:46 crc kubenswrapper[5004]: I1201 08:38:46.627218 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7fe1d5ea-e24c-44b9-9de2-5011b3fc04fd-config" (OuterVolumeSpecName: "config") pod "7fe1d5ea-e24c-44b9-9de2-5011b3fc04fd" (UID: "7fe1d5ea-e24c-44b9-9de2-5011b3fc04fd"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:38:46 crc kubenswrapper[5004]: I1201 08:38:46.665688 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vsp2g\" (UniqueName: \"kubernetes.io/projected/7fe1d5ea-e24c-44b9-9de2-5011b3fc04fd-kube-api-access-vsp2g\") on node \"crc\" DevicePath \"\"" Dec 01 08:38:46 crc kubenswrapper[5004]: I1201 08:38:46.665713 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pltws\" (UniqueName: \"kubernetes.io/projected/bbab4428-7b77-4b11-87b3-d720250c9b77-kube-api-access-pltws\") on node \"crc\" DevicePath \"\"" Dec 01 08:38:46 crc kubenswrapper[5004]: I1201 08:38:46.665722 5004 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7fe1d5ea-e24c-44b9-9de2-5011b3fc04fd-config\") on node \"crc\" DevicePath \"\"" Dec 01 08:38:46 crc kubenswrapper[5004]: I1201 08:38:46.671190 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bbab4428-7b77-4b11-87b3-d720250c9b77-config" (OuterVolumeSpecName: "config") pod "bbab4428-7b77-4b11-87b3-d720250c9b77" (UID: "bbab4428-7b77-4b11-87b3-d720250c9b77"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:38:46 crc kubenswrapper[5004]: I1201 08:38:46.684612 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bbab4428-7b77-4b11-87b3-d720250c9b77-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "bbab4428-7b77-4b11-87b3-d720250c9b77" (UID: "bbab4428-7b77-4b11-87b3-d720250c9b77"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:38:46 crc kubenswrapper[5004]: I1201 08:38:46.767208 5004 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbab4428-7b77-4b11-87b3-d720250c9b77-config\") on node \"crc\" DevicePath \"\"" Dec 01 08:38:46 crc kubenswrapper[5004]: I1201 08:38:46.767236 5004 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bbab4428-7b77-4b11-87b3-d720250c9b77-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 08:38:46 crc kubenswrapper[5004]: I1201 08:38:46.791733 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-tbwk4"] Dec 01 08:38:46 crc kubenswrapper[5004]: I1201 08:38:46.886462 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-wl2nn"] Dec 01 08:38:46 crc kubenswrapper[5004]: W1201 08:38:46.965213 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7ce540a3_feaa_469f_85cf_ec800dd6b1bc.slice/crio-a89e8377f4b01b5ed6d4cbe04ed093c316cd9ff455c524df77f84864c669d4ef WatchSource:0}: Error finding container a89e8377f4b01b5ed6d4cbe04ed093c316cd9ff455c524df77f84864c669d4ef: Status 404 returned error can't find the container with id a89e8377f4b01b5ed6d4cbe04ed093c316cd9ff455c524df77f84864c669d4ef Dec 01 08:38:47 crc kubenswrapper[5004]: I1201 08:38:47.136119 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-4vb89"] Dec 01 08:38:47 crc kubenswrapper[5004]: I1201 08:38:47.225626 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-bvrdk" event={"ID":"bbab4428-7b77-4b11-87b3-d720250c9b77","Type":"ContainerDied","Data":"de65a58fd6478b999c51e4d7295ab1840ebf1aba822aa704f9a9dea79943fec1"} Dec 01 08:38:47 crc kubenswrapper[5004]: I1201 08:38:47.225704 5004 scope.go:117] 
"RemoveContainer" containerID="d93e4648a465a94411d3c539eb3775cf8b01b0fe9c299cfe89390de91e851d9e" Dec 01 08:38:47 crc kubenswrapper[5004]: I1201 08:38:47.225651 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-bvrdk" Dec 01 08:38:47 crc kubenswrapper[5004]: I1201 08:38:47.229593 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-hrcgz" event={"ID":"7fe1d5ea-e24c-44b9-9de2-5011b3fc04fd","Type":"ContainerDied","Data":"1f869fa7e415961f96123299de2ed0526f9521e7cfd496f9c69b5c7e73e24d8d"} Dec 01 08:38:47 crc kubenswrapper[5004]: I1201 08:38:47.229695 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-hrcgz" Dec 01 08:38:47 crc kubenswrapper[5004]: I1201 08:38:47.233727 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"04a6dd3a-f297-40b9-b480-0239383b9460","Type":"ContainerStarted","Data":"50ca48ebeb17aa1774a9773b52976dd8a57699e6cdf266ee73a5624b000e185d"} Dec 01 08:38:47 crc kubenswrapper[5004]: I1201 08:38:47.235786 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-tbwk4" event={"ID":"cfb808bc-b59f-492b-a3aa-d817263501a5","Type":"ContainerStarted","Data":"c31688e40439c2b213dabfbf1c7595b3ede6ab55fe542d57c6fbd1b6fd3e000d"} Dec 01 08:38:47 crc kubenswrapper[5004]: I1201 08:38:47.238330 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf47b49b7-wl2nn" event={"ID":"7ce540a3-feaa-469f-85cf-ec800dd6b1bc","Type":"ContainerStarted","Data":"a89e8377f4b01b5ed6d4cbe04ed093c316cd9ff455c524df77f84864c669d4ef"} Dec 01 08:38:47 crc kubenswrapper[5004]: I1201 08:38:47.240040 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-8qrgn" 
event={"ID":"0b64b7d6-2fe8-43b0-9632-84e70a749fe9","Type":"ContainerStarted","Data":"0363417b7abf7d86f188da842ba7e3f52e91f50cf37f70b816a42ac5c0e7d081"} Dec 01 08:38:47 crc kubenswrapper[5004]: W1201 08:38:47.306980 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podabf31963_3bf7_4b6e_adaa_8605634a9530.slice/crio-f0738a447292676b4bb7a8c51cb318e06595e9bdd94da6807b66fb438a2a5d73 WatchSource:0}: Error finding container f0738a447292676b4bb7a8c51cb318e06595e9bdd94da6807b66fb438a2a5d73: Status 404 returned error can't find the container with id f0738a447292676b4bb7a8c51cb318e06595e9bdd94da6807b66fb438a2a5d73 Dec 01 08:38:47 crc kubenswrapper[5004]: I1201 08:38:47.738812 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7fe1d5ea-e24c-44b9-9de2-5011b3fc04fd-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7fe1d5ea-e24c-44b9-9de2-5011b3fc04fd" (UID: "7fe1d5ea-e24c-44b9-9de2-5011b3fc04fd"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:38:47 crc kubenswrapper[5004]: I1201 08:38:47.792291 5004 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7fe1d5ea-e24c-44b9-9de2-5011b3fc04fd-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 08:38:48 crc kubenswrapper[5004]: I1201 08:38:48.294096 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"08dfc50d-80b8-4885-826d-4a8314b46234","Type":"ContainerStarted","Data":"551a6091d025161f656f4b4c8d72e24cb5c2a734dc8bbeebc280ab8a97b61997"} Dec 01 08:38:48 crc kubenswrapper[5004]: I1201 08:38:48.295599 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Dec 01 08:38:48 crc kubenswrapper[5004]: I1201 08:38:48.298282 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e17a426c-0069-4c51-91ad-e5fbf6e0bb2a","Type":"ContainerStarted","Data":"ccde5507419ccdb6bb1307ad7276e94115097f8e8b951d4a4f702511b46356d2"} Dec 01 08:38:48 crc kubenswrapper[5004]: I1201 08:38:48.301585 5004 generic.go:334] "Generic (PLEG): container finished" podID="0b64b7d6-2fe8-43b0-9632-84e70a749fe9" containerID="0363417b7abf7d86f188da842ba7e3f52e91f50cf37f70b816a42ac5c0e7d081" exitCode=0 Dec 01 08:38:48 crc kubenswrapper[5004]: I1201 08:38:48.301633 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-8qrgn" event={"ID":"0b64b7d6-2fe8-43b0-9632-84e70a749fe9","Type":"ContainerDied","Data":"0363417b7abf7d86f188da842ba7e3f52e91f50cf37f70b816a42ac5c0e7d081"} Dec 01 08:38:48 crc kubenswrapper[5004]: I1201 08:38:48.304313 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-4vb89" event={"ID":"abf31963-3bf7-4b6e-adaa-8605634a9530","Type":"ContainerStarted","Data":"f0738a447292676b4bb7a8c51cb318e06595e9bdd94da6807b66fb438a2a5d73"} Dec 01 08:38:48 crc kubenswrapper[5004]: I1201 
08:38:48.323000 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=16.304627951 podStartE2EDuration="29.322984131s" podCreationTimestamp="2025-12-01 08:38:19 +0000 UTC" firstStartedPulling="2025-12-01 08:38:33.297873301 +0000 UTC m=+1290.862865283" lastFinishedPulling="2025-12-01 08:38:46.316229481 +0000 UTC m=+1303.881221463" observedRunningTime="2025-12-01 08:38:48.322363607 +0000 UTC m=+1305.887355589" watchObservedRunningTime="2025-12-01 08:38:48.322984131 +0000 UTC m=+1305.887976113" Dec 01 08:38:48 crc kubenswrapper[5004]: I1201 08:38:48.382704 5004 scope.go:117] "RemoveContainer" containerID="4a986e73a4741cdd92ca2dd2ca35677a113b10773a209e823827a6e0c8f57d38" Dec 01 08:38:48 crc kubenswrapper[5004]: I1201 08:38:48.407857 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-bvrdk"] Dec 01 08:38:48 crc kubenswrapper[5004]: I1201 08:38:48.421393 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-bvrdk"] Dec 01 08:38:48 crc kubenswrapper[5004]: I1201 08:38:48.432861 5004 scope.go:117] "RemoveContainer" containerID="3df8d11558a7bc0eb49a8052c0f22d46a69e0172109f9fd14bf498c269b2e7ad" Dec 01 08:38:48 crc kubenswrapper[5004]: I1201 08:38:48.434513 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-hrcgz"] Dec 01 08:38:48 crc kubenswrapper[5004]: I1201 08:38:48.443729 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-hrcgz"] Dec 01 08:38:48 crc kubenswrapper[5004]: I1201 08:38:48.696132 5004 scope.go:117] "RemoveContainer" containerID="11f560d5ee547328e29fe33ff5aac6d254796633ea0b39b787ba0fce806e015f" Dec 01 08:38:48 crc kubenswrapper[5004]: I1201 08:38:48.772139 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7fe1d5ea-e24c-44b9-9de2-5011b3fc04fd" 
path="/var/lib/kubelet/pods/7fe1d5ea-e24c-44b9-9de2-5011b3fc04fd/volumes" Dec 01 08:38:49 crc kubenswrapper[5004]: I1201 08:38:48.772909 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bbab4428-7b77-4b11-87b3-d720250c9b77" path="/var/lib/kubelet/pods/bbab4428-7b77-4b11-87b3-d720250c9b77/volumes" Dec 01 08:38:49 crc kubenswrapper[5004]: I1201 08:38:49.327196 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"bb30b7a7-42e1-421b-8673-7f3c8f5cfae3","Type":"ContainerStarted","Data":"99aea3f8161b028b3915e2c0d66375add82d5cf3a0bd8662ab97ca8898b982b4"} Dec 01 08:38:49 crc kubenswrapper[5004]: I1201 08:38:49.330508 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b571e2e5-2a78-45af-83aa-3d874b2569b3","Type":"ContainerStarted","Data":"17c1164a1a9ddf12e0f7bb16f0fda29c357c85933a430ff5b061e9d6f7746e89"} Dec 01 08:38:50 crc kubenswrapper[5004]: I1201 08:38:50.344968 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-r8x2b" event={"ID":"effd853b-0b95-4749-8119-88fcfaf8b0c0","Type":"ContainerStarted","Data":"ffcad7e1358a7bc74a3598b9654332eb91570c78cdb49282187c5d14934c9b23"} Dec 01 08:38:50 crc kubenswrapper[5004]: I1201 08:38:50.345427 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-r8x2b" Dec 01 08:38:50 crc kubenswrapper[5004]: I1201 08:38:50.346483 5004 generic.go:334] "Generic (PLEG): container finished" podID="7ce540a3-feaa-469f-85cf-ec800dd6b1bc" containerID="913ed74d61b950181258d4d507f390352ca9adcfdabb5124622e6e16451c41b1" exitCode=0 Dec 01 08:38:50 crc kubenswrapper[5004]: I1201 08:38:50.346544 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf47b49b7-wl2nn" event={"ID":"7ce540a3-feaa-469f-85cf-ec800dd6b1bc","Type":"ContainerDied","Data":"913ed74d61b950181258d4d507f390352ca9adcfdabb5124622e6e16451c41b1"} Dec 01 
08:38:50 crc kubenswrapper[5004]: I1201 08:38:50.348772 5004 generic.go:334] "Generic (PLEG): container finished" podID="abf31963-3bf7-4b6e-adaa-8605634a9530" containerID="a01d23049691a77af21d859af56ed16e7ec48e3ec9e1c532027480d15cdf4b5b" exitCode=0 Dec 01 08:38:50 crc kubenswrapper[5004]: I1201 08:38:50.348815 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-4vb89" event={"ID":"abf31963-3bf7-4b6e-adaa-8605634a9530","Type":"ContainerDied","Data":"a01d23049691a77af21d859af56ed16e7ec48e3ec9e1c532027480d15cdf4b5b"} Dec 01 08:38:50 crc kubenswrapper[5004]: I1201 08:38:50.350648 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-5htt9" event={"ID":"1ff890d7-d00c-4b87-86d6-3eb403821ee3","Type":"ContainerStarted","Data":"a3322f9ba9936edd537afc51c4d7a41468fe340e539bbdafc123988bca0e5046"} Dec 01 08:38:50 crc kubenswrapper[5004]: I1201 08:38:50.353146 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"5fcd0c1b-820e-4bc1-b3fd-22ac13415e3c","Type":"ContainerStarted","Data":"a3c90aa3298773adaa0bf995f7b18049ea6e437f21b642b2f55dbc4a2b40eb07"} Dec 01 08:38:50 crc kubenswrapper[5004]: I1201 08:38:50.353864 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Dec 01 08:38:50 crc kubenswrapper[5004]: I1201 08:38:50.358908 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-8qrgn" event={"ID":"0b64b7d6-2fe8-43b0-9632-84e70a749fe9","Type":"ContainerStarted","Data":"09034b00a78844d078b5e22dfd935373dd46c98a5efb3f26994415882c6cba9d"} Dec 01 08:38:50 crc kubenswrapper[5004]: I1201 08:38:50.360549 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"6362259b-bac4-4df3-ad0c-d76511731aae","Type":"ContainerStarted","Data":"5cd1c7134250309988a56ca8118c4b5c485de0c8b707692b39ac695edd61319c"} Dec 01 
08:38:50 crc kubenswrapper[5004]: I1201 08:38:50.365910 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-r8x2b" podStartSLOduration=11.356214104 podStartE2EDuration="24.365897748s" podCreationTimestamp="2025-12-01 08:38:26 +0000 UTC" firstStartedPulling="2025-12-01 08:38:33.297281937 +0000 UTC m=+1290.862273909" lastFinishedPulling="2025-12-01 08:38:46.306965571 +0000 UTC m=+1303.871957553" observedRunningTime="2025-12-01 08:38:50.361961294 +0000 UTC m=+1307.926953276" watchObservedRunningTime="2025-12-01 08:38:50.365897748 +0000 UTC m=+1307.930889730" Dec 01 08:38:50 crc kubenswrapper[5004]: I1201 08:38:50.368323 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"c74b987d-43c4-49d9-92fc-f13f3b7b7dd2","Type":"ContainerStarted","Data":"a6b3fce395401c09921bbca6309f7298ab2c3a95b03a4fc68906e05780c8c3de"} Dec 01 08:38:50 crc kubenswrapper[5004]: I1201 08:38:50.371549 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"c5ea92c9-9b0a-473a-872f-a78f27946432","Type":"ContainerStarted","Data":"434f692a5a3b646469b6f4178019a127d25d960fb96d4511b9eb02bc63c4c90a"} Dec 01 08:38:50 crc kubenswrapper[5004]: I1201 08:38:50.382093 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=15.058959365 podStartE2EDuration="29.382079232s" podCreationTimestamp="2025-12-01 08:38:21 +0000 UTC" firstStartedPulling="2025-12-01 08:38:32.013061087 +0000 UTC m=+1289.578053069" lastFinishedPulling="2025-12-01 08:38:46.336180954 +0000 UTC m=+1303.901172936" observedRunningTime="2025-12-01 08:38:50.377912343 +0000 UTC m=+1307.942904385" watchObservedRunningTime="2025-12-01 08:38:50.382079232 +0000 UTC m=+1307.947071214" Dec 01 08:38:50 crc kubenswrapper[5004]: I1201 08:38:50.444814 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-5htt9" podStartSLOduration=14.117792439 podStartE2EDuration="28.444793848s" podCreationTimestamp="2025-12-01 08:38:22 +0000 UTC" firstStartedPulling="2025-12-01 08:38:32.00980151 +0000 UTC m=+1289.574793492" lastFinishedPulling="2025-12-01 08:38:46.336802919 +0000 UTC m=+1303.901794901" observedRunningTime="2025-12-01 08:38:50.431242507 +0000 UTC m=+1307.996234499" watchObservedRunningTime="2025-12-01 08:38:50.444793848 +0000 UTC m=+1308.009785840" Dec 01 08:38:50 crc kubenswrapper[5004]: I1201 08:38:50.535664 5004 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-666b6646f7-hrcgz" podUID="7fe1d5ea-e24c-44b9-9de2-5011b3fc04fd" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.127:5353: i/o timeout" Dec 01 08:38:50 crc kubenswrapper[5004]: I1201 08:38:50.994738 5004 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-57d769cc4f-bvrdk" podUID="bbab4428-7b77-4b11-87b3-d720250c9b77" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.128:5353: i/o timeout" Dec 01 08:38:51 crc kubenswrapper[5004]: I1201 08:38:51.393167 5004 generic.go:334] "Generic (PLEG): container finished" podID="04a6dd3a-f297-40b9-b480-0239383b9460" containerID="50ca48ebeb17aa1774a9773b52976dd8a57699e6cdf266ee73a5624b000e185d" exitCode=0 Dec 01 08:38:51 crc kubenswrapper[5004]: I1201 08:38:51.393984 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"04a6dd3a-f297-40b9-b480-0239383b9460","Type":"ContainerDied","Data":"50ca48ebeb17aa1774a9773b52976dd8a57699e6cdf266ee73a5624b000e185d"} Dec 01 08:38:52 crc kubenswrapper[5004]: I1201 08:38:52.421282 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"c5ea92c9-9b0a-473a-872f-a78f27946432","Type":"ContainerStarted","Data":"8dbd70d4afc7ceed1cb7786bb7247b4eea8d46935786e18e7dcb167fe7561dcf"} 
Dec 01 08:38:52 crc kubenswrapper[5004]: I1201 08:38:52.424938 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf47b49b7-wl2nn" event={"ID":"7ce540a3-feaa-469f-85cf-ec800dd6b1bc","Type":"ContainerStarted","Data":"585d7ecc59736bdb2b1b8bfa3d3b43cd270853d71ec3122117fbf8cd85b4bd42"} Dec 01 08:38:52 crc kubenswrapper[5004]: I1201 08:38:52.425165 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5bf47b49b7-wl2nn" Dec 01 08:38:52 crc kubenswrapper[5004]: I1201 08:38:52.430931 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-8qrgn" event={"ID":"0b64b7d6-2fe8-43b0-9632-84e70a749fe9","Type":"ContainerStarted","Data":"9b10586032b9b044a5d32e8642822bfd81d677e2b59fb9b47fe60b8552317254"} Dec 01 08:38:52 crc kubenswrapper[5004]: I1201 08:38:52.431170 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-8qrgn" Dec 01 08:38:52 crc kubenswrapper[5004]: I1201 08:38:52.431248 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-8qrgn" Dec 01 08:38:52 crc kubenswrapper[5004]: I1201 08:38:52.434518 5004 generic.go:334] "Generic (PLEG): container finished" podID="bb30b7a7-42e1-421b-8673-7f3c8f5cfae3" containerID="99aea3f8161b028b3915e2c0d66375add82d5cf3a0bd8662ab97ca8898b982b4" exitCode=0 Dec 01 08:38:52 crc kubenswrapper[5004]: I1201 08:38:52.434735 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"bb30b7a7-42e1-421b-8673-7f3c8f5cfae3","Type":"ContainerDied","Data":"99aea3f8161b028b3915e2c0d66375add82d5cf3a0bd8662ab97ca8898b982b4"} Dec 01 08:38:52 crc kubenswrapper[5004]: I1201 08:38:52.441408 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" 
event={"ID":"04a6dd3a-f297-40b9-b480-0239383b9460","Type":"ContainerStarted","Data":"ca9feaceffe1912e458d7188b6aa6334073d7367f02b49203eb05797f2a1f5cf"} Dec 01 08:38:52 crc kubenswrapper[5004]: I1201 08:38:52.447961 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-4vb89" event={"ID":"abf31963-3bf7-4b6e-adaa-8605634a9530","Type":"ContainerStarted","Data":"b06071e0f6127f7f903da48eaf50e37a9897f2c96e95fe95a19e061fad7210f3"} Dec 01 08:38:52 crc kubenswrapper[5004]: I1201 08:38:52.450874 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8554648995-4vb89" Dec 01 08:38:52 crc kubenswrapper[5004]: I1201 08:38:52.460093 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=10.77024249 podStartE2EDuration="28.46006927s" podCreationTimestamp="2025-12-01 08:38:24 +0000 UTC" firstStartedPulling="2025-12-01 08:38:33.877176188 +0000 UTC m=+1291.442168170" lastFinishedPulling="2025-12-01 08:38:51.567002968 +0000 UTC m=+1309.131994950" observedRunningTime="2025-12-01 08:38:52.443689601 +0000 UTC m=+1310.008681663" watchObservedRunningTime="2025-12-01 08:38:52.46006927 +0000 UTC m=+1310.025061252" Dec 01 08:38:52 crc kubenswrapper[5004]: I1201 08:38:52.467068 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Dec 01 08:38:52 crc kubenswrapper[5004]: I1201 08:38:52.468966 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"6362259b-bac4-4df3-ad0c-d76511731aae","Type":"ContainerStarted","Data":"d00a70bb807c864773b6ffa0feea1ac9b31e16737e5900633fb1eab28c6a2508"} Dec 01 08:38:52 crc kubenswrapper[5004]: I1201 08:38:52.480964 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-tbwk4" 
event={"ID":"cfb808bc-b59f-492b-a3aa-d817263501a5","Type":"ContainerStarted","Data":"d5c8cfefe1fc12f6bbe44f6cc6f42104b2caf6a181a7d08424efb1a470959ca3"} Dec 01 08:38:52 crc kubenswrapper[5004]: I1201 08:38:52.533860 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Dec 01 08:38:52 crc kubenswrapper[5004]: I1201 08:38:52.534011 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5bf47b49b7-wl2nn" podStartSLOduration=13.53391457 podStartE2EDuration="13.53391457s" podCreationTimestamp="2025-12-01 08:38:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:38:52.521844293 +0000 UTC m=+1310.086836285" watchObservedRunningTime="2025-12-01 08:38:52.53391457 +0000 UTC m=+1310.098906562" Dec 01 08:38:52 crc kubenswrapper[5004]: I1201 08:38:52.557155 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-8qrgn" podStartSLOduration=12.821898822 podStartE2EDuration="26.557138839s" podCreationTimestamp="2025-12-01 08:38:26 +0000 UTC" firstStartedPulling="2025-12-01 08:38:32.261546695 +0000 UTC m=+1289.826538687" lastFinishedPulling="2025-12-01 08:38:45.996786682 +0000 UTC m=+1303.561778704" observedRunningTime="2025-12-01 08:38:52.548979206 +0000 UTC m=+1310.113971198" watchObservedRunningTime="2025-12-01 08:38:52.557138839 +0000 UTC m=+1310.122130821" Dec 01 08:38:52 crc kubenswrapper[5004]: I1201 08:38:52.580345 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=20.690176267 podStartE2EDuration="35.580326069s" podCreationTimestamp="2025-12-01 08:38:17 +0000 UTC" firstStartedPulling="2025-12-01 08:38:31.04375377 +0000 UTC m=+1288.608745792" lastFinishedPulling="2025-12-01 08:38:45.933903612 +0000 UTC m=+1303.498895594" observedRunningTime="2025-12-01 
08:38:52.572326599 +0000 UTC m=+1310.137318591" watchObservedRunningTime="2025-12-01 08:38:52.580326069 +0000 UTC m=+1310.145318051" Dec 01 08:38:52 crc kubenswrapper[5004]: I1201 08:38:52.626765 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8554648995-4vb89" podStartSLOduration=13.626748169 podStartE2EDuration="13.626748169s" podCreationTimestamp="2025-12-01 08:38:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:38:52.621017893 +0000 UTC m=+1310.186009875" watchObservedRunningTime="2025-12-01 08:38:52.626748169 +0000 UTC m=+1310.191740151" Dec 01 08:38:52 crc kubenswrapper[5004]: I1201 08:38:52.640508 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=6.28996876 podStartE2EDuration="24.640491205s" podCreationTimestamp="2025-12-01 08:38:28 +0000 UTC" firstStartedPulling="2025-12-01 08:38:33.292295379 +0000 UTC m=+1290.857287361" lastFinishedPulling="2025-12-01 08:38:51.642817814 +0000 UTC m=+1309.207809806" observedRunningTime="2025-12-01 08:38:52.639174054 +0000 UTC m=+1310.204166026" watchObservedRunningTime="2025-12-01 08:38:52.640491205 +0000 UTC m=+1310.205483187" Dec 01 08:38:52 crc kubenswrapper[5004]: I1201 08:38:52.653804 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-tbwk4" podStartSLOduration=9.983131537 podStartE2EDuration="14.653784889s" podCreationTimestamp="2025-12-01 08:38:38 +0000 UTC" firstStartedPulling="2025-12-01 08:38:46.81799251 +0000 UTC m=+1304.382984492" lastFinishedPulling="2025-12-01 08:38:51.488645862 +0000 UTC m=+1309.053637844" observedRunningTime="2025-12-01 08:38:52.653538134 +0000 UTC m=+1310.218530126" watchObservedRunningTime="2025-12-01 08:38:52.653784889 +0000 UTC m=+1310.218776871" Dec 01 08:38:53 crc kubenswrapper[5004]: I1201 
08:38:53.496831 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"bb30b7a7-42e1-421b-8673-7f3c8f5cfae3","Type":"ContainerStarted","Data":"86488a1c914ebbab15d641c9862b2d95b9a274a5b6661a1890cf021d388a4410"} Dec 01 08:38:53 crc kubenswrapper[5004]: I1201 08:38:53.497400 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Dec 01 08:38:53 crc kubenswrapper[5004]: I1201 08:38:53.543742 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=20.263429701 podStartE2EDuration="35.543721257s" podCreationTimestamp="2025-12-01 08:38:18 +0000 UTC" firstStartedPulling="2025-12-01 08:38:31.035066214 +0000 UTC m=+1288.600058206" lastFinishedPulling="2025-12-01 08:38:46.31535778 +0000 UTC m=+1303.880349762" observedRunningTime="2025-12-01 08:38:53.521003149 +0000 UTC m=+1311.085995171" watchObservedRunningTime="2025-12-01 08:38:53.543721257 +0000 UTC m=+1311.108713249" Dec 01 08:38:53 crc kubenswrapper[5004]: I1201 08:38:53.970636 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Dec 01 08:38:54 crc kubenswrapper[5004]: I1201 08:38:54.042854 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Dec 01 08:38:54 crc kubenswrapper[5004]: I1201 08:38:54.505483 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Dec 01 08:38:54 crc kubenswrapper[5004]: I1201 08:38:54.657188 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Dec 01 08:38:54 crc kubenswrapper[5004]: I1201 08:38:54.660155 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Dec 01 08:38:55 crc kubenswrapper[5004]: I1201 08:38:55.061229 5004 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/ovn-northd-0"] Dec 01 08:38:55 crc kubenswrapper[5004]: E1201 08:38:55.061715 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbab4428-7b77-4b11-87b3-d720250c9b77" containerName="init" Dec 01 08:38:55 crc kubenswrapper[5004]: I1201 08:38:55.061734 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbab4428-7b77-4b11-87b3-d720250c9b77" containerName="init" Dec 01 08:38:55 crc kubenswrapper[5004]: E1201 08:38:55.061775 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fe1d5ea-e24c-44b9-9de2-5011b3fc04fd" containerName="dnsmasq-dns" Dec 01 08:38:55 crc kubenswrapper[5004]: I1201 08:38:55.061784 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fe1d5ea-e24c-44b9-9de2-5011b3fc04fd" containerName="dnsmasq-dns" Dec 01 08:38:55 crc kubenswrapper[5004]: E1201 08:38:55.061804 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbab4428-7b77-4b11-87b3-d720250c9b77" containerName="dnsmasq-dns" Dec 01 08:38:55 crc kubenswrapper[5004]: I1201 08:38:55.061813 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbab4428-7b77-4b11-87b3-d720250c9b77" containerName="dnsmasq-dns" Dec 01 08:38:55 crc kubenswrapper[5004]: E1201 08:38:55.061834 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fe1d5ea-e24c-44b9-9de2-5011b3fc04fd" containerName="init" Dec 01 08:38:55 crc kubenswrapper[5004]: I1201 08:38:55.061842 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fe1d5ea-e24c-44b9-9de2-5011b3fc04fd" containerName="init" Dec 01 08:38:55 crc kubenswrapper[5004]: I1201 08:38:55.062078 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="7fe1d5ea-e24c-44b9-9de2-5011b3fc04fd" containerName="dnsmasq-dns" Dec 01 08:38:55 crc kubenswrapper[5004]: I1201 08:38:55.062107 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="bbab4428-7b77-4b11-87b3-d720250c9b77" containerName="dnsmasq-dns" Dec 01 08:38:55 crc 
kubenswrapper[5004]: I1201 08:38:55.063398 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Dec 01 08:38:55 crc kubenswrapper[5004]: I1201 08:38:55.069741 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Dec 01 08:38:55 crc kubenswrapper[5004]: I1201 08:38:55.070005 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-2rwzd" Dec 01 08:38:55 crc kubenswrapper[5004]: I1201 08:38:55.070065 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Dec 01 08:38:55 crc kubenswrapper[5004]: I1201 08:38:55.070157 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Dec 01 08:38:55 crc kubenswrapper[5004]: I1201 08:38:55.086156 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 01 08:38:55 crc kubenswrapper[5004]: I1201 08:38:55.137753 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e7d0d76d-60c9-4ba6-b29b-8c6bc8a4d499-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"e7d0d76d-60c9-4ba6-b29b-8c6bc8a4d499\") " pod="openstack/ovn-northd-0" Dec 01 08:38:55 crc kubenswrapper[5004]: I1201 08:38:55.137821 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tkk98\" (UniqueName: \"kubernetes.io/projected/e7d0d76d-60c9-4ba6-b29b-8c6bc8a4d499-kube-api-access-tkk98\") pod \"ovn-northd-0\" (UID: \"e7d0d76d-60c9-4ba6-b29b-8c6bc8a4d499\") " pod="openstack/ovn-northd-0" Dec 01 08:38:55 crc kubenswrapper[5004]: I1201 08:38:55.138027 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7d0d76d-60c9-4ba6-b29b-8c6bc8a4d499-config\") pod 
\"ovn-northd-0\" (UID: \"e7d0d76d-60c9-4ba6-b29b-8c6bc8a4d499\") " pod="openstack/ovn-northd-0" Dec 01 08:38:55 crc kubenswrapper[5004]: I1201 08:38:55.138079 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7d0d76d-60c9-4ba6-b29b-8c6bc8a4d499-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"e7d0d76d-60c9-4ba6-b29b-8c6bc8a4d499\") " pod="openstack/ovn-northd-0" Dec 01 08:38:55 crc kubenswrapper[5004]: I1201 08:38:55.138139 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e7d0d76d-60c9-4ba6-b29b-8c6bc8a4d499-scripts\") pod \"ovn-northd-0\" (UID: \"e7d0d76d-60c9-4ba6-b29b-8c6bc8a4d499\") " pod="openstack/ovn-northd-0" Dec 01 08:38:55 crc kubenswrapper[5004]: I1201 08:38:55.138289 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7d0d76d-60c9-4ba6-b29b-8c6bc8a4d499-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"e7d0d76d-60c9-4ba6-b29b-8c6bc8a4d499\") " pod="openstack/ovn-northd-0" Dec 01 08:38:55 crc kubenswrapper[5004]: I1201 08:38:55.138436 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7d0d76d-60c9-4ba6-b29b-8c6bc8a4d499-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"e7d0d76d-60c9-4ba6-b29b-8c6bc8a4d499\") " pod="openstack/ovn-northd-0" Dec 01 08:38:55 crc kubenswrapper[5004]: I1201 08:38:55.212253 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Dec 01 08:38:55 crc kubenswrapper[5004]: I1201 08:38:55.240880 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/e7d0d76d-60c9-4ba6-b29b-8c6bc8a4d499-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"e7d0d76d-60c9-4ba6-b29b-8c6bc8a4d499\") " pod="openstack/ovn-northd-0" Dec 01 08:38:55 crc kubenswrapper[5004]: I1201 08:38:55.240957 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7d0d76d-60c9-4ba6-b29b-8c6bc8a4d499-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"e7d0d76d-60c9-4ba6-b29b-8c6bc8a4d499\") " pod="openstack/ovn-northd-0" Dec 01 08:38:55 crc kubenswrapper[5004]: I1201 08:38:55.241025 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e7d0d76d-60c9-4ba6-b29b-8c6bc8a4d499-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"e7d0d76d-60c9-4ba6-b29b-8c6bc8a4d499\") " pod="openstack/ovn-northd-0" Dec 01 08:38:55 crc kubenswrapper[5004]: I1201 08:38:55.241050 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tkk98\" (UniqueName: \"kubernetes.io/projected/e7d0d76d-60c9-4ba6-b29b-8c6bc8a4d499-kube-api-access-tkk98\") pod \"ovn-northd-0\" (UID: \"e7d0d76d-60c9-4ba6-b29b-8c6bc8a4d499\") " pod="openstack/ovn-northd-0" Dec 01 08:38:55 crc kubenswrapper[5004]: I1201 08:38:55.241098 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7d0d76d-60c9-4ba6-b29b-8c6bc8a4d499-config\") pod \"ovn-northd-0\" (UID: \"e7d0d76d-60c9-4ba6-b29b-8c6bc8a4d499\") " pod="openstack/ovn-northd-0" Dec 01 08:38:55 crc kubenswrapper[5004]: I1201 08:38:55.241120 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7d0d76d-60c9-4ba6-b29b-8c6bc8a4d499-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"e7d0d76d-60c9-4ba6-b29b-8c6bc8a4d499\") " pod="openstack/ovn-northd-0" Dec 01 08:38:55 crc 
kubenswrapper[5004]: I1201 08:38:55.241145 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e7d0d76d-60c9-4ba6-b29b-8c6bc8a4d499-scripts\") pod \"ovn-northd-0\" (UID: \"e7d0d76d-60c9-4ba6-b29b-8c6bc8a4d499\") " pod="openstack/ovn-northd-0" Dec 01 08:38:55 crc kubenswrapper[5004]: I1201 08:38:55.241900 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e7d0d76d-60c9-4ba6-b29b-8c6bc8a4d499-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"e7d0d76d-60c9-4ba6-b29b-8c6bc8a4d499\") " pod="openstack/ovn-northd-0" Dec 01 08:38:55 crc kubenswrapper[5004]: I1201 08:38:55.242157 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7d0d76d-60c9-4ba6-b29b-8c6bc8a4d499-config\") pod \"ovn-northd-0\" (UID: \"e7d0d76d-60c9-4ba6-b29b-8c6bc8a4d499\") " pod="openstack/ovn-northd-0" Dec 01 08:38:55 crc kubenswrapper[5004]: I1201 08:38:55.242246 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e7d0d76d-60c9-4ba6-b29b-8c6bc8a4d499-scripts\") pod \"ovn-northd-0\" (UID: \"e7d0d76d-60c9-4ba6-b29b-8c6bc8a4d499\") " pod="openstack/ovn-northd-0" Dec 01 08:38:55 crc kubenswrapper[5004]: I1201 08:38:55.247217 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7d0d76d-60c9-4ba6-b29b-8c6bc8a4d499-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"e7d0d76d-60c9-4ba6-b29b-8c6bc8a4d499\") " pod="openstack/ovn-northd-0" Dec 01 08:38:55 crc kubenswrapper[5004]: I1201 08:38:55.247824 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7d0d76d-60c9-4ba6-b29b-8c6bc8a4d499-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: 
\"e7d0d76d-60c9-4ba6-b29b-8c6bc8a4d499\") " pod="openstack/ovn-northd-0" Dec 01 08:38:55 crc kubenswrapper[5004]: I1201 08:38:55.253111 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7d0d76d-60c9-4ba6-b29b-8c6bc8a4d499-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"e7d0d76d-60c9-4ba6-b29b-8c6bc8a4d499\") " pod="openstack/ovn-northd-0" Dec 01 08:38:55 crc kubenswrapper[5004]: I1201 08:38:55.260892 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tkk98\" (UniqueName: \"kubernetes.io/projected/e7d0d76d-60c9-4ba6-b29b-8c6bc8a4d499-kube-api-access-tkk98\") pod \"ovn-northd-0\" (UID: \"e7d0d76d-60c9-4ba6-b29b-8c6bc8a4d499\") " pod="openstack/ovn-northd-0" Dec 01 08:38:55 crc kubenswrapper[5004]: I1201 08:38:55.391969 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Dec 01 08:38:55 crc kubenswrapper[5004]: I1201 08:38:55.907214 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 01 08:38:55 crc kubenswrapper[5004]: W1201 08:38:55.910517 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode7d0d76d_60c9_4ba6_b29b_8c6bc8a4d499.slice/crio-e5456fff12c8a61ca939a5fe6a663716e47bc40974962e05c4f37e0ea252365c WatchSource:0}: Error finding container e5456fff12c8a61ca939a5fe6a663716e47bc40974962e05c4f37e0ea252365c: Status 404 returned error can't find the container with id e5456fff12c8a61ca939a5fe6a663716e47bc40974962e05c4f37e0ea252365c Dec 01 08:38:56 crc kubenswrapper[5004]: I1201 08:38:56.536141 5004 generic.go:334] "Generic (PLEG): container finished" podID="c74b987d-43c4-49d9-92fc-f13f3b7b7dd2" containerID="a6b3fce395401c09921bbca6309f7298ab2c3a95b03a4fc68906e05780c8c3de" exitCode=0 Dec 01 08:38:56 crc kubenswrapper[5004]: I1201 08:38:56.536223 5004 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"c74b987d-43c4-49d9-92fc-f13f3b7b7dd2","Type":"ContainerDied","Data":"a6b3fce395401c09921bbca6309f7298ab2c3a95b03a4fc68906e05780c8c3de"} Dec 01 08:38:56 crc kubenswrapper[5004]: I1201 08:38:56.538339 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"e7d0d76d-60c9-4ba6-b29b-8c6bc8a4d499","Type":"ContainerStarted","Data":"e5456fff12c8a61ca939a5fe6a663716e47bc40974962e05c4f37e0ea252365c"} Dec 01 08:38:58 crc kubenswrapper[5004]: I1201 08:38:58.823753 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Dec 01 08:38:58 crc kubenswrapper[5004]: I1201 08:38:58.825923 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Dec 01 08:38:58 crc kubenswrapper[5004]: I1201 08:38:58.984366 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Dec 01 08:38:59 crc kubenswrapper[5004]: I1201 08:38:59.455985 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5bf47b49b7-wl2nn" Dec 01 08:38:59 crc kubenswrapper[5004]: I1201 08:38:59.627808 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8554648995-4vb89" Dec 01 08:38:59 crc kubenswrapper[5004]: I1201 08:38:59.678745 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-wl2nn"] Dec 01 08:38:59 crc kubenswrapper[5004]: I1201 08:38:59.678955 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5bf47b49b7-wl2nn" podUID="7ce540a3-feaa-469f-85cf-ec800dd6b1bc" containerName="dnsmasq-dns" containerID="cri-o://585d7ecc59736bdb2b1b8bfa3d3b43cd270853d71ec3122117fbf8cd85b4bd42" gracePeriod=10 Dec 01 08:38:59 crc kubenswrapper[5004]: I1201 08:38:59.777736 5004 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Dec 01 08:38:59 crc kubenswrapper[5004]: I1201 08:38:59.884449 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Dec 01 08:38:59 crc kubenswrapper[5004]: I1201 08:38:59.884485 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Dec 01 08:39:00 crc kubenswrapper[5004]: I1201 08:39:00.038298 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Dec 01 08:39:00 crc kubenswrapper[5004]: I1201 08:39:00.135841 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-3f6b-account-create-update-2gfdj"] Dec 01 08:39:00 crc kubenswrapper[5004]: I1201 08:39:00.137278 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-3f6b-account-create-update-2gfdj" Dec 01 08:39:00 crc kubenswrapper[5004]: I1201 08:39:00.139197 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Dec 01 08:39:00 crc kubenswrapper[5004]: I1201 08:39:00.156717 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-3f6b-account-create-update-2gfdj"] Dec 01 08:39:00 crc kubenswrapper[5004]: I1201 08:39:00.173108 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-d5gjb"] Dec 01 08:39:00 crc kubenswrapper[5004]: I1201 08:39:00.174512 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-d5gjb" Dec 01 08:39:00 crc kubenswrapper[5004]: I1201 08:39:00.189538 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-d5gjb"] Dec 01 08:39:00 crc kubenswrapper[5004]: I1201 08:39:00.267164 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmqzk\" (UniqueName: \"kubernetes.io/projected/9f70e608-5f44-45d3-9c98-d22ed21cf952-kube-api-access-xmqzk\") pod \"placement-db-create-d5gjb\" (UID: \"9f70e608-5f44-45d3-9c98-d22ed21cf952\") " pod="openstack/placement-db-create-d5gjb" Dec 01 08:39:00 crc kubenswrapper[5004]: I1201 08:39:00.267223 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b61fea80-031c-4ee8-b99f-562a8bb879eb-operator-scripts\") pod \"placement-3f6b-account-create-update-2gfdj\" (UID: \"b61fea80-031c-4ee8-b99f-562a8bb879eb\") " pod="openstack/placement-3f6b-account-create-update-2gfdj" Dec 01 08:39:00 crc kubenswrapper[5004]: I1201 08:39:00.267420 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9f70e608-5f44-45d3-9c98-d22ed21cf952-operator-scripts\") pod \"placement-db-create-d5gjb\" (UID: \"9f70e608-5f44-45d3-9c98-d22ed21cf952\") " pod="openstack/placement-db-create-d5gjb" Dec 01 08:39:00 crc kubenswrapper[5004]: I1201 08:39:00.267515 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tcr2k\" (UniqueName: \"kubernetes.io/projected/b61fea80-031c-4ee8-b99f-562a8bb879eb-kube-api-access-tcr2k\") pod \"placement-3f6b-account-create-update-2gfdj\" (UID: \"b61fea80-031c-4ee8-b99f-562a8bb879eb\") " pod="openstack/placement-3f6b-account-create-update-2gfdj" Dec 01 08:39:00 crc kubenswrapper[5004]: I1201 
08:39:00.369296 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xmqzk\" (UniqueName: \"kubernetes.io/projected/9f70e608-5f44-45d3-9c98-d22ed21cf952-kube-api-access-xmqzk\") pod \"placement-db-create-d5gjb\" (UID: \"9f70e608-5f44-45d3-9c98-d22ed21cf952\") " pod="openstack/placement-db-create-d5gjb" Dec 01 08:39:00 crc kubenswrapper[5004]: I1201 08:39:00.369389 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b61fea80-031c-4ee8-b99f-562a8bb879eb-operator-scripts\") pod \"placement-3f6b-account-create-update-2gfdj\" (UID: \"b61fea80-031c-4ee8-b99f-562a8bb879eb\") " pod="openstack/placement-3f6b-account-create-update-2gfdj" Dec 01 08:39:00 crc kubenswrapper[5004]: I1201 08:39:00.369473 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9f70e608-5f44-45d3-9c98-d22ed21cf952-operator-scripts\") pod \"placement-db-create-d5gjb\" (UID: \"9f70e608-5f44-45d3-9c98-d22ed21cf952\") " pod="openstack/placement-db-create-d5gjb" Dec 01 08:39:00 crc kubenswrapper[5004]: I1201 08:39:00.369515 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tcr2k\" (UniqueName: \"kubernetes.io/projected/b61fea80-031c-4ee8-b99f-562a8bb879eb-kube-api-access-tcr2k\") pod \"placement-3f6b-account-create-update-2gfdj\" (UID: \"b61fea80-031c-4ee8-b99f-562a8bb879eb\") " pod="openstack/placement-3f6b-account-create-update-2gfdj" Dec 01 08:39:00 crc kubenswrapper[5004]: I1201 08:39:00.371342 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9f70e608-5f44-45d3-9c98-d22ed21cf952-operator-scripts\") pod \"placement-db-create-d5gjb\" (UID: \"9f70e608-5f44-45d3-9c98-d22ed21cf952\") " pod="openstack/placement-db-create-d5gjb" Dec 01 08:39:00 crc 
kubenswrapper[5004]: I1201 08:39:00.372341 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b61fea80-031c-4ee8-b99f-562a8bb879eb-operator-scripts\") pod \"placement-3f6b-account-create-update-2gfdj\" (UID: \"b61fea80-031c-4ee8-b99f-562a8bb879eb\") " pod="openstack/placement-3f6b-account-create-update-2gfdj" Dec 01 08:39:00 crc kubenswrapper[5004]: I1201 08:39:00.387796 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tcr2k\" (UniqueName: \"kubernetes.io/projected/b61fea80-031c-4ee8-b99f-562a8bb879eb-kube-api-access-tcr2k\") pod \"placement-3f6b-account-create-update-2gfdj\" (UID: \"b61fea80-031c-4ee8-b99f-562a8bb879eb\") " pod="openstack/placement-3f6b-account-create-update-2gfdj" Dec 01 08:39:00 crc kubenswrapper[5004]: I1201 08:39:00.392156 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmqzk\" (UniqueName: \"kubernetes.io/projected/9f70e608-5f44-45d3-9c98-d22ed21cf952-kube-api-access-xmqzk\") pod \"placement-db-create-d5gjb\" (UID: \"9f70e608-5f44-45d3-9c98-d22ed21cf952\") " pod="openstack/placement-db-create-d5gjb" Dec 01 08:39:00 crc kubenswrapper[5004]: I1201 08:39:00.457605 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-3f6b-account-create-update-2gfdj" Dec 01 08:39:00 crc kubenswrapper[5004]: I1201 08:39:00.492456 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-d5gjb" Dec 01 08:39:00 crc kubenswrapper[5004]: I1201 08:39:00.636574 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-wl2nn" Dec 01 08:39:00 crc kubenswrapper[5004]: I1201 08:39:00.674538 5004 generic.go:334] "Generic (PLEG): container finished" podID="7ce540a3-feaa-469f-85cf-ec800dd6b1bc" containerID="585d7ecc59736bdb2b1b8bfa3d3b43cd270853d71ec3122117fbf8cd85b4bd42" exitCode=0 Dec 01 08:39:00 crc kubenswrapper[5004]: I1201 08:39:00.674648 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf47b49b7-wl2nn" event={"ID":"7ce540a3-feaa-469f-85cf-ec800dd6b1bc","Type":"ContainerDied","Data":"585d7ecc59736bdb2b1b8bfa3d3b43cd270853d71ec3122117fbf8cd85b4bd42"} Dec 01 08:39:00 crc kubenswrapper[5004]: I1201 08:39:00.674705 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf47b49b7-wl2nn" event={"ID":"7ce540a3-feaa-469f-85cf-ec800dd6b1bc","Type":"ContainerDied","Data":"a89e8377f4b01b5ed6d4cbe04ed093c316cd9ff455c524df77f84864c669d4ef"} Dec 01 08:39:00 crc kubenswrapper[5004]: I1201 08:39:00.674723 5004 scope.go:117] "RemoveContainer" containerID="585d7ecc59736bdb2b1b8bfa3d3b43cd270853d71ec3122117fbf8cd85b4bd42" Dec 01 08:39:00 crc kubenswrapper[5004]: I1201 08:39:00.674910 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-wl2nn" Dec 01 08:39:00 crc kubenswrapper[5004]: I1201 08:39:00.681279 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7ce540a3-feaa-469f-85cf-ec800dd6b1bc-ovsdbserver-nb\") pod \"7ce540a3-feaa-469f-85cf-ec800dd6b1bc\" (UID: \"7ce540a3-feaa-469f-85cf-ec800dd6b1bc\") " Dec 01 08:39:00 crc kubenswrapper[5004]: I1201 08:39:00.681382 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7ce540a3-feaa-469f-85cf-ec800dd6b1bc-dns-svc\") pod \"7ce540a3-feaa-469f-85cf-ec800dd6b1bc\" (UID: \"7ce540a3-feaa-469f-85cf-ec800dd6b1bc\") " Dec 01 08:39:00 crc kubenswrapper[5004]: I1201 08:39:00.681571 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ce540a3-feaa-469f-85cf-ec800dd6b1bc-config\") pod \"7ce540a3-feaa-469f-85cf-ec800dd6b1bc\" (UID: \"7ce540a3-feaa-469f-85cf-ec800dd6b1bc\") " Dec 01 08:39:00 crc kubenswrapper[5004]: I1201 08:39:00.681673 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-97s6f\" (UniqueName: \"kubernetes.io/projected/7ce540a3-feaa-469f-85cf-ec800dd6b1bc-kube-api-access-97s6f\") pod \"7ce540a3-feaa-469f-85cf-ec800dd6b1bc\" (UID: \"7ce540a3-feaa-469f-85cf-ec800dd6b1bc\") " Dec 01 08:39:00 crc kubenswrapper[5004]: I1201 08:39:00.688595 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ce540a3-feaa-469f-85cf-ec800dd6b1bc-kube-api-access-97s6f" (OuterVolumeSpecName: "kube-api-access-97s6f") pod "7ce540a3-feaa-469f-85cf-ec800dd6b1bc" (UID: "7ce540a3-feaa-469f-85cf-ec800dd6b1bc"). InnerVolumeSpecName "kube-api-access-97s6f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:39:00 crc kubenswrapper[5004]: I1201 08:39:00.719858 5004 scope.go:117] "RemoveContainer" containerID="913ed74d61b950181258d4d507f390352ca9adcfdabb5124622e6e16451c41b1" Dec 01 08:39:00 crc kubenswrapper[5004]: I1201 08:39:00.760164 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ce540a3-feaa-469f-85cf-ec800dd6b1bc-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7ce540a3-feaa-469f-85cf-ec800dd6b1bc" (UID: "7ce540a3-feaa-469f-85cf-ec800dd6b1bc"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:39:00 crc kubenswrapper[5004]: I1201 08:39:00.765210 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ce540a3-feaa-469f-85cf-ec800dd6b1bc-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7ce540a3-feaa-469f-85cf-ec800dd6b1bc" (UID: "7ce540a3-feaa-469f-85cf-ec800dd6b1bc"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:39:00 crc kubenswrapper[5004]: I1201 08:39:00.770190 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ce540a3-feaa-469f-85cf-ec800dd6b1bc-config" (OuterVolumeSpecName: "config") pod "7ce540a3-feaa-469f-85cf-ec800dd6b1bc" (UID: "7ce540a3-feaa-469f-85cf-ec800dd6b1bc"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:39:00 crc kubenswrapper[5004]: I1201 08:39:00.783523 5004 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7ce540a3-feaa-469f-85cf-ec800dd6b1bc-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 08:39:00 crc kubenswrapper[5004]: I1201 08:39:00.783550 5004 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ce540a3-feaa-469f-85cf-ec800dd6b1bc-config\") on node \"crc\" DevicePath \"\"" Dec 01 08:39:00 crc kubenswrapper[5004]: I1201 08:39:00.783576 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-97s6f\" (UniqueName: \"kubernetes.io/projected/7ce540a3-feaa-469f-85cf-ec800dd6b1bc-kube-api-access-97s6f\") on node \"crc\" DevicePath \"\"" Dec 01 08:39:00 crc kubenswrapper[5004]: I1201 08:39:00.783585 5004 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7ce540a3-feaa-469f-85cf-ec800dd6b1bc-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 01 08:39:00 crc kubenswrapper[5004]: I1201 08:39:00.784814 5004 scope.go:117] "RemoveContainer" containerID="585d7ecc59736bdb2b1b8bfa3d3b43cd270853d71ec3122117fbf8cd85b4bd42" Dec 01 08:39:00 crc kubenswrapper[5004]: E1201 08:39:00.788642 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"585d7ecc59736bdb2b1b8bfa3d3b43cd270853d71ec3122117fbf8cd85b4bd42\": container with ID starting with 585d7ecc59736bdb2b1b8bfa3d3b43cd270853d71ec3122117fbf8cd85b4bd42 not found: ID does not exist" containerID="585d7ecc59736bdb2b1b8bfa3d3b43cd270853d71ec3122117fbf8cd85b4bd42" Dec 01 08:39:00 crc kubenswrapper[5004]: I1201 08:39:00.788701 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"585d7ecc59736bdb2b1b8bfa3d3b43cd270853d71ec3122117fbf8cd85b4bd42"} err="failed 
to get container status \"585d7ecc59736bdb2b1b8bfa3d3b43cd270853d71ec3122117fbf8cd85b4bd42\": rpc error: code = NotFound desc = could not find container \"585d7ecc59736bdb2b1b8bfa3d3b43cd270853d71ec3122117fbf8cd85b4bd42\": container with ID starting with 585d7ecc59736bdb2b1b8bfa3d3b43cd270853d71ec3122117fbf8cd85b4bd42 not found: ID does not exist" Dec 01 08:39:00 crc kubenswrapper[5004]: I1201 08:39:00.788730 5004 scope.go:117] "RemoveContainer" containerID="913ed74d61b950181258d4d507f390352ca9adcfdabb5124622e6e16451c41b1" Dec 01 08:39:00 crc kubenswrapper[5004]: E1201 08:39:00.789583 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"913ed74d61b950181258d4d507f390352ca9adcfdabb5124622e6e16451c41b1\": container with ID starting with 913ed74d61b950181258d4d507f390352ca9adcfdabb5124622e6e16451c41b1 not found: ID does not exist" containerID="913ed74d61b950181258d4d507f390352ca9adcfdabb5124622e6e16451c41b1" Dec 01 08:39:00 crc kubenswrapper[5004]: I1201 08:39:00.789674 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"913ed74d61b950181258d4d507f390352ca9adcfdabb5124622e6e16451c41b1"} err="failed to get container status \"913ed74d61b950181258d4d507f390352ca9adcfdabb5124622e6e16451c41b1\": rpc error: code = NotFound desc = could not find container \"913ed74d61b950181258d4d507f390352ca9adcfdabb5124622e6e16451c41b1\": container with ID starting with 913ed74d61b950181258d4d507f390352ca9adcfdabb5124622e6e16451c41b1 not found: ID does not exist" Dec 01 08:39:00 crc kubenswrapper[5004]: I1201 08:39:00.813446 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Dec 01 08:39:01 crc kubenswrapper[5004]: I1201 08:39:01.001829 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-wl2nn"] Dec 01 08:39:01 crc kubenswrapper[5004]: I1201 08:39:01.011323 5004 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-wl2nn"] Dec 01 08:39:01 crc kubenswrapper[5004]: W1201 08:39:01.020929 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb61fea80_031c_4ee8_b99f_562a8bb879eb.slice/crio-e79d233210c1dcb5f1728da5f19536426d477f29931eec1245ecb9e6a5b31cef WatchSource:0}: Error finding container e79d233210c1dcb5f1728da5f19536426d477f29931eec1245ecb9e6a5b31cef: Status 404 returned error can't find the container with id e79d233210c1dcb5f1728da5f19536426d477f29931eec1245ecb9e6a5b31cef Dec 01 08:39:01 crc kubenswrapper[5004]: I1201 08:39:01.020971 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-3f6b-account-create-update-2gfdj"] Dec 01 08:39:01 crc kubenswrapper[5004]: W1201 08:39:01.022635 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9f70e608_5f44_45d3_9c98_d22ed21cf952.slice/crio-965dd8bcee771dbaa362faf500020899b91b0cf46ad3b221bcdab9d6821b9d8e WatchSource:0}: Error finding container 965dd8bcee771dbaa362faf500020899b91b0cf46ad3b221bcdab9d6821b9d8e: Status 404 returned error can't find the container with id 965dd8bcee771dbaa362faf500020899b91b0cf46ad3b221bcdab9d6821b9d8e Dec 01 08:39:01 crc kubenswrapper[5004]: I1201 08:39:01.028805 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-d5gjb"] Dec 01 08:39:01 crc kubenswrapper[5004]: I1201 08:39:01.695192 5004 generic.go:334] "Generic (PLEG): container finished" podID="9f70e608-5f44-45d3-9c98-d22ed21cf952" containerID="f23a85d1a241230784e372b2b68e777a885643c89da240dbda07441ec0857e39" exitCode=0 Dec 01 08:39:01 crc kubenswrapper[5004]: I1201 08:39:01.695291 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-d5gjb" 
event={"ID":"9f70e608-5f44-45d3-9c98-d22ed21cf952","Type":"ContainerDied","Data":"f23a85d1a241230784e372b2b68e777a885643c89da240dbda07441ec0857e39"} Dec 01 08:39:01 crc kubenswrapper[5004]: I1201 08:39:01.695653 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-d5gjb" event={"ID":"9f70e608-5f44-45d3-9c98-d22ed21cf952","Type":"ContainerStarted","Data":"965dd8bcee771dbaa362faf500020899b91b0cf46ad3b221bcdab9d6821b9d8e"} Dec 01 08:39:01 crc kubenswrapper[5004]: I1201 08:39:01.701323 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"e7d0d76d-60c9-4ba6-b29b-8c6bc8a4d499","Type":"ContainerStarted","Data":"21127abe44c8431171ad8fdd9c31cf1089758fbb14baac3a89f74abca303269b"} Dec 01 08:39:01 crc kubenswrapper[5004]: I1201 08:39:01.701368 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"e7d0d76d-60c9-4ba6-b29b-8c6bc8a4d499","Type":"ContainerStarted","Data":"34a9c129ca21236010ab6f926863a0a2ff7a5cecf30d9c785ae3ab0d215868b0"} Dec 01 08:39:01 crc kubenswrapper[5004]: I1201 08:39:01.701484 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Dec 01 08:39:01 crc kubenswrapper[5004]: I1201 08:39:01.705058 5004 generic.go:334] "Generic (PLEG): container finished" podID="b61fea80-031c-4ee8-b99f-562a8bb879eb" containerID="13338ddb68fdaed9d4d35936b6e07985146beb7009af833b95f044b50a8f906c" exitCode=0 Dec 01 08:39:01 crc kubenswrapper[5004]: I1201 08:39:01.705106 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-3f6b-account-create-update-2gfdj" event={"ID":"b61fea80-031c-4ee8-b99f-562a8bb879eb","Type":"ContainerDied","Data":"13338ddb68fdaed9d4d35936b6e07985146beb7009af833b95f044b50a8f906c"} Dec 01 08:39:01 crc kubenswrapper[5004]: I1201 08:39:01.705155 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-3f6b-account-create-update-2gfdj" 
event={"ID":"b61fea80-031c-4ee8-b99f-562a8bb879eb","Type":"ContainerStarted","Data":"e79d233210c1dcb5f1728da5f19536426d477f29931eec1245ecb9e6a5b31cef"} Dec 01 08:39:01 crc kubenswrapper[5004]: I1201 08:39:01.745479 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.352630909 podStartE2EDuration="6.745450817s" podCreationTimestamp="2025-12-01 08:38:55 +0000 UTC" firstStartedPulling="2025-12-01 08:38:55.912406603 +0000 UTC m=+1313.477398585" lastFinishedPulling="2025-12-01 08:39:00.305226511 +0000 UTC m=+1317.870218493" observedRunningTime="2025-12-01 08:39:01.734205451 +0000 UTC m=+1319.299197463" watchObservedRunningTime="2025-12-01 08:39:01.745450817 +0000 UTC m=+1319.310442809" Dec 01 08:39:01 crc kubenswrapper[5004]: I1201 08:39:01.974956 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-c26nm"] Dec 01 08:39:01 crc kubenswrapper[5004]: E1201 08:39:01.975419 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ce540a3-feaa-469f-85cf-ec800dd6b1bc" containerName="init" Dec 01 08:39:01 crc kubenswrapper[5004]: I1201 08:39:01.975431 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ce540a3-feaa-469f-85cf-ec800dd6b1bc" containerName="init" Dec 01 08:39:01 crc kubenswrapper[5004]: E1201 08:39:01.975445 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ce540a3-feaa-469f-85cf-ec800dd6b1bc" containerName="dnsmasq-dns" Dec 01 08:39:01 crc kubenswrapper[5004]: I1201 08:39:01.975452 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ce540a3-feaa-469f-85cf-ec800dd6b1bc" containerName="dnsmasq-dns" Dec 01 08:39:01 crc kubenswrapper[5004]: I1201 08:39:01.975696 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ce540a3-feaa-469f-85cf-ec800dd6b1bc" containerName="dnsmasq-dns" Dec 01 08:39:01 crc kubenswrapper[5004]: I1201 08:39:01.976451 5004 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openstack/mysqld-exporter-openstack-db-create-c26nm" Dec 01 08:39:01 crc kubenswrapper[5004]: I1201 08:39:01.980805 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-c26nm"] Dec 01 08:39:02 crc kubenswrapper[5004]: I1201 08:39:02.023853 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cd186cbb-dea0-4dd5-9328-935a4041a137-operator-scripts\") pod \"mysqld-exporter-openstack-db-create-c26nm\" (UID: \"cd186cbb-dea0-4dd5-9328-935a4041a137\") " pod="openstack/mysqld-exporter-openstack-db-create-c26nm" Dec 01 08:39:02 crc kubenswrapper[5004]: I1201 08:39:02.024050 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkq52\" (UniqueName: \"kubernetes.io/projected/cd186cbb-dea0-4dd5-9328-935a4041a137-kube-api-access-wkq52\") pod \"mysqld-exporter-openstack-db-create-c26nm\" (UID: \"cd186cbb-dea0-4dd5-9328-935a4041a137\") " pod="openstack/mysqld-exporter-openstack-db-create-c26nm" Dec 01 08:39:02 crc kubenswrapper[5004]: I1201 08:39:02.126691 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cd186cbb-dea0-4dd5-9328-935a4041a137-operator-scripts\") pod \"mysqld-exporter-openstack-db-create-c26nm\" (UID: \"cd186cbb-dea0-4dd5-9328-935a4041a137\") " pod="openstack/mysqld-exporter-openstack-db-create-c26nm" Dec 01 08:39:02 crc kubenswrapper[5004]: I1201 08:39:02.127042 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wkq52\" (UniqueName: \"kubernetes.io/projected/cd186cbb-dea0-4dd5-9328-935a4041a137-kube-api-access-wkq52\") pod \"mysqld-exporter-openstack-db-create-c26nm\" (UID: \"cd186cbb-dea0-4dd5-9328-935a4041a137\") " 
pod="openstack/mysqld-exporter-openstack-db-create-c26nm" Dec 01 08:39:02 crc kubenswrapper[5004]: I1201 08:39:02.128202 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cd186cbb-dea0-4dd5-9328-935a4041a137-operator-scripts\") pod \"mysqld-exporter-openstack-db-create-c26nm\" (UID: \"cd186cbb-dea0-4dd5-9328-935a4041a137\") " pod="openstack/mysqld-exporter-openstack-db-create-c26nm" Dec 01 08:39:02 crc kubenswrapper[5004]: I1201 08:39:02.147273 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-2cda-account-create-update-gzdd5"] Dec 01 08:39:02 crc kubenswrapper[5004]: I1201 08:39:02.148473 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-2cda-account-create-update-gzdd5" Dec 01 08:39:02 crc kubenswrapper[5004]: I1201 08:39:02.161695 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkq52\" (UniqueName: \"kubernetes.io/projected/cd186cbb-dea0-4dd5-9328-935a4041a137-kube-api-access-wkq52\") pod \"mysqld-exporter-openstack-db-create-c26nm\" (UID: \"cd186cbb-dea0-4dd5-9328-935a4041a137\") " pod="openstack/mysqld-exporter-openstack-db-create-c26nm" Dec 01 08:39:02 crc kubenswrapper[5004]: I1201 08:39:02.161722 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"mysqld-exporter-openstack-db-secret" Dec 01 08:39:02 crc kubenswrapper[5004]: I1201 08:39:02.169806 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-2cda-account-create-update-gzdd5"] Dec 01 08:39:02 crc kubenswrapper[5004]: I1201 08:39:02.226766 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-f2sxs"] Dec 01 08:39:02 crc kubenswrapper[5004]: I1201 08:39:02.228898 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-f2sxs" Dec 01 08:39:02 crc kubenswrapper[5004]: I1201 08:39:02.228962 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q55rt\" (UniqueName: \"kubernetes.io/projected/803e9984-2114-48e5-8d49-4536bd3cc6ef-kube-api-access-q55rt\") pod \"mysqld-exporter-2cda-account-create-update-gzdd5\" (UID: \"803e9984-2114-48e5-8d49-4536bd3cc6ef\") " pod="openstack/mysqld-exporter-2cda-account-create-update-gzdd5" Dec 01 08:39:02 crc kubenswrapper[5004]: I1201 08:39:02.229015 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/803e9984-2114-48e5-8d49-4536bd3cc6ef-operator-scripts\") pod \"mysqld-exporter-2cda-account-create-update-gzdd5\" (UID: \"803e9984-2114-48e5-8d49-4536bd3cc6ef\") " pod="openstack/mysqld-exporter-2cda-account-create-update-gzdd5" Dec 01 08:39:02 crc kubenswrapper[5004]: I1201 08:39:02.231377 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 01 08:39:02 crc kubenswrapper[5004]: I1201 08:39:02.280070 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-f2sxs"] Dec 01 08:39:02 crc kubenswrapper[5004]: I1201 08:39:02.316722 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-openstack-db-create-c26nm" Dec 01 08:39:02 crc kubenswrapper[5004]: I1201 08:39:02.371281 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9w2h\" (UniqueName: \"kubernetes.io/projected/4e59b4e1-4729-4161-ad22-e11718c0c6fe-kube-api-access-m9w2h\") pod \"dnsmasq-dns-b8fbc5445-f2sxs\" (UID: \"4e59b4e1-4729-4161-ad22-e11718c0c6fe\") " pod="openstack/dnsmasq-dns-b8fbc5445-f2sxs" Dec 01 08:39:02 crc kubenswrapper[5004]: I1201 08:39:02.371397 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4e59b4e1-4729-4161-ad22-e11718c0c6fe-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-f2sxs\" (UID: \"4e59b4e1-4729-4161-ad22-e11718c0c6fe\") " pod="openstack/dnsmasq-dns-b8fbc5445-f2sxs" Dec 01 08:39:02 crc kubenswrapper[5004]: I1201 08:39:02.371436 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q55rt\" (UniqueName: \"kubernetes.io/projected/803e9984-2114-48e5-8d49-4536bd3cc6ef-kube-api-access-q55rt\") pod \"mysqld-exporter-2cda-account-create-update-gzdd5\" (UID: \"803e9984-2114-48e5-8d49-4536bd3cc6ef\") " pod="openstack/mysqld-exporter-2cda-account-create-update-gzdd5" Dec 01 08:39:02 crc kubenswrapper[5004]: I1201 08:39:02.371511 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/803e9984-2114-48e5-8d49-4536bd3cc6ef-operator-scripts\") pod \"mysqld-exporter-2cda-account-create-update-gzdd5\" (UID: \"803e9984-2114-48e5-8d49-4536bd3cc6ef\") " pod="openstack/mysqld-exporter-2cda-account-create-update-gzdd5" Dec 01 08:39:02 crc kubenswrapper[5004]: I1201 08:39:02.371611 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/4e59b4e1-4729-4161-ad22-e11718c0c6fe-config\") pod \"dnsmasq-dns-b8fbc5445-f2sxs\" (UID: \"4e59b4e1-4729-4161-ad22-e11718c0c6fe\") " pod="openstack/dnsmasq-dns-b8fbc5445-f2sxs" Dec 01 08:39:02 crc kubenswrapper[5004]: I1201 08:39:02.371714 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4e59b4e1-4729-4161-ad22-e11718c0c6fe-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-f2sxs\" (UID: \"4e59b4e1-4729-4161-ad22-e11718c0c6fe\") " pod="openstack/dnsmasq-dns-b8fbc5445-f2sxs" Dec 01 08:39:02 crc kubenswrapper[5004]: I1201 08:39:02.371804 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4e59b4e1-4729-4161-ad22-e11718c0c6fe-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-f2sxs\" (UID: \"4e59b4e1-4729-4161-ad22-e11718c0c6fe\") " pod="openstack/dnsmasq-dns-b8fbc5445-f2sxs" Dec 01 08:39:02 crc kubenswrapper[5004]: I1201 08:39:02.372765 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/803e9984-2114-48e5-8d49-4536bd3cc6ef-operator-scripts\") pod \"mysqld-exporter-2cda-account-create-update-gzdd5\" (UID: \"803e9984-2114-48e5-8d49-4536bd3cc6ef\") " pod="openstack/mysqld-exporter-2cda-account-create-update-gzdd5" Dec 01 08:39:02 crc kubenswrapper[5004]: I1201 08:39:02.403437 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q55rt\" (UniqueName: \"kubernetes.io/projected/803e9984-2114-48e5-8d49-4536bd3cc6ef-kube-api-access-q55rt\") pod \"mysqld-exporter-2cda-account-create-update-gzdd5\" (UID: \"803e9984-2114-48e5-8d49-4536bd3cc6ef\") " pod="openstack/mysqld-exporter-2cda-account-create-update-gzdd5" Dec 01 08:39:02 crc kubenswrapper[5004]: I1201 08:39:02.474695 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-m9w2h\" (UniqueName: \"kubernetes.io/projected/4e59b4e1-4729-4161-ad22-e11718c0c6fe-kube-api-access-m9w2h\") pod \"dnsmasq-dns-b8fbc5445-f2sxs\" (UID: \"4e59b4e1-4729-4161-ad22-e11718c0c6fe\") " pod="openstack/dnsmasq-dns-b8fbc5445-f2sxs" Dec 01 08:39:02 crc kubenswrapper[5004]: I1201 08:39:02.474800 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4e59b4e1-4729-4161-ad22-e11718c0c6fe-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-f2sxs\" (UID: \"4e59b4e1-4729-4161-ad22-e11718c0c6fe\") " pod="openstack/dnsmasq-dns-b8fbc5445-f2sxs" Dec 01 08:39:02 crc kubenswrapper[5004]: I1201 08:39:02.474878 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e59b4e1-4729-4161-ad22-e11718c0c6fe-config\") pod \"dnsmasq-dns-b8fbc5445-f2sxs\" (UID: \"4e59b4e1-4729-4161-ad22-e11718c0c6fe\") " pod="openstack/dnsmasq-dns-b8fbc5445-f2sxs" Dec 01 08:39:02 crc kubenswrapper[5004]: I1201 08:39:02.474941 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4e59b4e1-4729-4161-ad22-e11718c0c6fe-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-f2sxs\" (UID: \"4e59b4e1-4729-4161-ad22-e11718c0c6fe\") " pod="openstack/dnsmasq-dns-b8fbc5445-f2sxs" Dec 01 08:39:02 crc kubenswrapper[5004]: I1201 08:39:02.474998 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4e59b4e1-4729-4161-ad22-e11718c0c6fe-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-f2sxs\" (UID: \"4e59b4e1-4729-4161-ad22-e11718c0c6fe\") " pod="openstack/dnsmasq-dns-b8fbc5445-f2sxs" Dec 01 08:39:02 crc kubenswrapper[5004]: I1201 08:39:02.475815 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/4e59b4e1-4729-4161-ad22-e11718c0c6fe-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-f2sxs\" (UID: \"4e59b4e1-4729-4161-ad22-e11718c0c6fe\") " pod="openstack/dnsmasq-dns-b8fbc5445-f2sxs" Dec 01 08:39:02 crc kubenswrapper[5004]: I1201 08:39:02.476261 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e59b4e1-4729-4161-ad22-e11718c0c6fe-config\") pod \"dnsmasq-dns-b8fbc5445-f2sxs\" (UID: \"4e59b4e1-4729-4161-ad22-e11718c0c6fe\") " pod="openstack/dnsmasq-dns-b8fbc5445-f2sxs" Dec 01 08:39:02 crc kubenswrapper[5004]: I1201 08:39:02.476420 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4e59b4e1-4729-4161-ad22-e11718c0c6fe-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-f2sxs\" (UID: \"4e59b4e1-4729-4161-ad22-e11718c0c6fe\") " pod="openstack/dnsmasq-dns-b8fbc5445-f2sxs" Dec 01 08:39:02 crc kubenswrapper[5004]: I1201 08:39:02.479281 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4e59b4e1-4729-4161-ad22-e11718c0c6fe-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-f2sxs\" (UID: \"4e59b4e1-4729-4161-ad22-e11718c0c6fe\") " pod="openstack/dnsmasq-dns-b8fbc5445-f2sxs" Dec 01 08:39:02 crc kubenswrapper[5004]: I1201 08:39:02.517500 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-2cda-account-create-update-gzdd5" Dec 01 08:39:02 crc kubenswrapper[5004]: I1201 08:39:02.550649 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9w2h\" (UniqueName: \"kubernetes.io/projected/4e59b4e1-4729-4161-ad22-e11718c0c6fe-kube-api-access-m9w2h\") pod \"dnsmasq-dns-b8fbc5445-f2sxs\" (UID: \"4e59b4e1-4729-4161-ad22-e11718c0c6fe\") " pod="openstack/dnsmasq-dns-b8fbc5445-f2sxs" Dec 01 08:39:02 crc kubenswrapper[5004]: I1201 08:39:02.562200 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-f2sxs" Dec 01 08:39:02 crc kubenswrapper[5004]: I1201 08:39:02.809229 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ce540a3-feaa-469f-85cf-ec800dd6b1bc" path="/var/lib/kubelet/pods/7ce540a3-feaa-469f-85cf-ec800dd6b1bc/volumes" Dec 01 08:39:03 crc kubenswrapper[5004]: I1201 08:39:03.395216 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-d5gjb" Dec 01 08:39:03 crc kubenswrapper[5004]: I1201 08:39:03.397458 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-3f6b-account-create-update-2gfdj" Dec 01 08:39:03 crc kubenswrapper[5004]: I1201 08:39:03.465835 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Dec 01 08:39:03 crc kubenswrapper[5004]: E1201 08:39:03.469102 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b61fea80-031c-4ee8-b99f-562a8bb879eb" containerName="mariadb-account-create-update" Dec 01 08:39:03 crc kubenswrapper[5004]: I1201 08:39:03.469124 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="b61fea80-031c-4ee8-b99f-562a8bb879eb" containerName="mariadb-account-create-update" Dec 01 08:39:03 crc kubenswrapper[5004]: E1201 08:39:03.469162 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f70e608-5f44-45d3-9c98-d22ed21cf952" containerName="mariadb-database-create" Dec 01 08:39:03 crc kubenswrapper[5004]: I1201 08:39:03.469169 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f70e608-5f44-45d3-9c98-d22ed21cf952" containerName="mariadb-database-create" Dec 01 08:39:03 crc kubenswrapper[5004]: I1201 08:39:03.469368 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f70e608-5f44-45d3-9c98-d22ed21cf952" containerName="mariadb-database-create" Dec 01 08:39:03 crc kubenswrapper[5004]: I1201 08:39:03.469385 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="b61fea80-031c-4ee8-b99f-562a8bb879eb" containerName="mariadb-account-create-update" Dec 01 08:39:03 crc kubenswrapper[5004]: I1201 08:39:03.489370 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Dec 01 08:39:03 crc kubenswrapper[5004]: I1201 08:39:03.493617 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-22w9c" Dec 01 08:39:03 crc kubenswrapper[5004]: I1201 08:39:03.493828 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Dec 01 08:39:03 crc kubenswrapper[5004]: I1201 08:39:03.493984 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Dec 01 08:39:03 crc kubenswrapper[5004]: I1201 08:39:03.494110 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Dec 01 08:39:03 crc kubenswrapper[5004]: I1201 08:39:03.494309 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-c26nm"] Dec 01 08:39:03 crc kubenswrapper[5004]: I1201 08:39:03.508408 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Dec 01 08:39:03 crc kubenswrapper[5004]: I1201 08:39:03.517456 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tcr2k\" (UniqueName: \"kubernetes.io/projected/b61fea80-031c-4ee8-b99f-562a8bb879eb-kube-api-access-tcr2k\") pod \"b61fea80-031c-4ee8-b99f-562a8bb879eb\" (UID: \"b61fea80-031c-4ee8-b99f-562a8bb879eb\") " Dec 01 08:39:03 crc kubenswrapper[5004]: I1201 08:39:03.517517 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b61fea80-031c-4ee8-b99f-562a8bb879eb-operator-scripts\") pod \"b61fea80-031c-4ee8-b99f-562a8bb879eb\" (UID: \"b61fea80-031c-4ee8-b99f-562a8bb879eb\") " Dec 01 08:39:03 crc kubenswrapper[5004]: I1201 08:39:03.517618 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xmqzk\" (UniqueName: 
\"kubernetes.io/projected/9f70e608-5f44-45d3-9c98-d22ed21cf952-kube-api-access-xmqzk\") pod \"9f70e608-5f44-45d3-9c98-d22ed21cf952\" (UID: \"9f70e608-5f44-45d3-9c98-d22ed21cf952\") " Dec 01 08:39:03 crc kubenswrapper[5004]: I1201 08:39:03.517730 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9f70e608-5f44-45d3-9c98-d22ed21cf952-operator-scripts\") pod \"9f70e608-5f44-45d3-9c98-d22ed21cf952\" (UID: \"9f70e608-5f44-45d3-9c98-d22ed21cf952\") " Dec 01 08:39:03 crc kubenswrapper[5004]: I1201 08:39:03.518462 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b61fea80-031c-4ee8-b99f-562a8bb879eb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b61fea80-031c-4ee8-b99f-562a8bb879eb" (UID: "b61fea80-031c-4ee8-b99f-562a8bb879eb"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:39:03 crc kubenswrapper[5004]: I1201 08:39:03.518599 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f70e608-5f44-45d3-9c98-d22ed21cf952-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9f70e608-5f44-45d3-9c98-d22ed21cf952" (UID: "9f70e608-5f44-45d3-9c98-d22ed21cf952"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:39:03 crc kubenswrapper[5004]: I1201 08:39:03.522194 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-clktj"] Dec 01 08:39:04 crc kubenswrapper[5004]: I1201 08:39:03.536446 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b61fea80-031c-4ee8-b99f-562a8bb879eb-kube-api-access-tcr2k" (OuterVolumeSpecName: "kube-api-access-tcr2k") pod "b61fea80-031c-4ee8-b99f-562a8bb879eb" (UID: "b61fea80-031c-4ee8-b99f-562a8bb879eb"). 
InnerVolumeSpecName "kube-api-access-tcr2k". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:39:04 crc kubenswrapper[5004]: I1201 08:39:03.537667 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-clktj" Dec 01 08:39:04 crc kubenswrapper[5004]: I1201 08:39:03.544692 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f70e608-5f44-45d3-9c98-d22ed21cf952-kube-api-access-xmqzk" (OuterVolumeSpecName: "kube-api-access-xmqzk") pod "9f70e608-5f44-45d3-9c98-d22ed21cf952" (UID: "9f70e608-5f44-45d3-9c98-d22ed21cf952"). InnerVolumeSpecName "kube-api-access-xmqzk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:39:04 crc kubenswrapper[5004]: I1201 08:39:03.576025 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Dec 01 08:39:04 crc kubenswrapper[5004]: I1201 08:39:03.578728 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Dec 01 08:39:04 crc kubenswrapper[5004]: I1201 08:39:03.595426 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Dec 01 08:39:04 crc kubenswrapper[5004]: I1201 08:39:03.625092 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/64792f42-dd08-4537-bce9-a632e644cf5a-etc-swift\") pod \"swift-storage-0\" (UID: \"64792f42-dd08-4537-bce9-a632e644cf5a\") " pod="openstack/swift-storage-0" Dec 01 08:39:04 crc kubenswrapper[5004]: I1201 08:39:03.625254 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e5a97177-1085-4eab-a646-c3b849dc73b5-dispersionconf\") pod \"swift-ring-rebalance-clktj\" (UID: \"e5a97177-1085-4eab-a646-c3b849dc73b5\") " 
pod="openstack/swift-ring-rebalance-clktj" Dec 01 08:39:04 crc kubenswrapper[5004]: I1201 08:39:03.625396 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/64792f42-dd08-4537-bce9-a632e644cf5a-lock\") pod \"swift-storage-0\" (UID: \"64792f42-dd08-4537-bce9-a632e644cf5a\") " pod="openstack/swift-storage-0" Dec 01 08:39:04 crc kubenswrapper[5004]: I1201 08:39:03.625667 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/64792f42-dd08-4537-bce9-a632e644cf5a-cache\") pod \"swift-storage-0\" (UID: \"64792f42-dd08-4537-bce9-a632e644cf5a\") " pod="openstack/swift-storage-0" Dec 01 08:39:04 crc kubenswrapper[5004]: I1201 08:39:03.626109 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbgkf\" (UniqueName: \"kubernetes.io/projected/e5a97177-1085-4eab-a646-c3b849dc73b5-kube-api-access-nbgkf\") pod \"swift-ring-rebalance-clktj\" (UID: \"e5a97177-1085-4eab-a646-c3b849dc73b5\") " pod="openstack/swift-ring-rebalance-clktj" Dec 01 08:39:04 crc kubenswrapper[5004]: I1201 08:39:03.626138 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fx5b\" (UniqueName: \"kubernetes.io/projected/64792f42-dd08-4537-bce9-a632e644cf5a-kube-api-access-4fx5b\") pod \"swift-storage-0\" (UID: \"64792f42-dd08-4537-bce9-a632e644cf5a\") " pod="openstack/swift-storage-0" Dec 01 08:39:04 crc kubenswrapper[5004]: I1201 08:39:03.626309 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5a97177-1085-4eab-a646-c3b849dc73b5-combined-ca-bundle\") pod \"swift-ring-rebalance-clktj\" (UID: \"e5a97177-1085-4eab-a646-c3b849dc73b5\") " pod="openstack/swift-ring-rebalance-clktj" Dec 01 
08:39:04 crc kubenswrapper[5004]: I1201 08:39:03.626445 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"swift-storage-0\" (UID: \"64792f42-dd08-4537-bce9-a632e644cf5a\") " pod="openstack/swift-storage-0" Dec 01 08:39:04 crc kubenswrapper[5004]: I1201 08:39:03.626607 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e5a97177-1085-4eab-a646-c3b849dc73b5-ring-data-devices\") pod \"swift-ring-rebalance-clktj\" (UID: \"e5a97177-1085-4eab-a646-c3b849dc73b5\") " pod="openstack/swift-ring-rebalance-clktj" Dec 01 08:39:04 crc kubenswrapper[5004]: I1201 08:39:03.626632 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e5a97177-1085-4eab-a646-c3b849dc73b5-swiftconf\") pod \"swift-ring-rebalance-clktj\" (UID: \"e5a97177-1085-4eab-a646-c3b849dc73b5\") " pod="openstack/swift-ring-rebalance-clktj" Dec 01 08:39:04 crc kubenswrapper[5004]: I1201 08:39:03.626711 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e5a97177-1085-4eab-a646-c3b849dc73b5-etc-swift\") pod \"swift-ring-rebalance-clktj\" (UID: \"e5a97177-1085-4eab-a646-c3b849dc73b5\") " pod="openstack/swift-ring-rebalance-clktj" Dec 01 08:39:04 crc kubenswrapper[5004]: I1201 08:39:03.626947 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e5a97177-1085-4eab-a646-c3b849dc73b5-scripts\") pod \"swift-ring-rebalance-clktj\" (UID: \"e5a97177-1085-4eab-a646-c3b849dc73b5\") " pod="openstack/swift-ring-rebalance-clktj" Dec 01 08:39:04 crc kubenswrapper[5004]: I1201 08:39:03.627370 
5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tcr2k\" (UniqueName: \"kubernetes.io/projected/b61fea80-031c-4ee8-b99f-562a8bb879eb-kube-api-access-tcr2k\") on node \"crc\" DevicePath \"\"" Dec 01 08:39:04 crc kubenswrapper[5004]: I1201 08:39:03.627385 5004 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b61fea80-031c-4ee8-b99f-562a8bb879eb-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 08:39:04 crc kubenswrapper[5004]: I1201 08:39:03.627395 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xmqzk\" (UniqueName: \"kubernetes.io/projected/9f70e608-5f44-45d3-9c98-d22ed21cf952-kube-api-access-xmqzk\") on node \"crc\" DevicePath \"\"" Dec 01 08:39:04 crc kubenswrapper[5004]: I1201 08:39:03.627405 5004 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9f70e608-5f44-45d3-9c98-d22ed21cf952-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 08:39:04 crc kubenswrapper[5004]: I1201 08:39:03.634901 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"mysqld-exporter-openstack-db-secret" Dec 01 08:39:04 crc kubenswrapper[5004]: I1201 08:39:03.646830 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-clktj"] Dec 01 08:39:04 crc kubenswrapper[5004]: I1201 08:39:03.674367 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-2cda-account-create-update-gzdd5"] Dec 01 08:39:04 crc kubenswrapper[5004]: E1201 08:39:03.731473 5004 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 01 08:39:04 crc kubenswrapper[5004]: E1201 08:39:03.731495 5004 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 01 08:39:04 crc kubenswrapper[5004]: 
E1201 08:39:03.731550 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/64792f42-dd08-4537-bce9-a632e644cf5a-etc-swift podName:64792f42-dd08-4537-bce9-a632e644cf5a nodeName:}" failed. No retries permitted until 2025-12-01 08:39:04.231533757 +0000 UTC m=+1321.796525739 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/64792f42-dd08-4537-bce9-a632e644cf5a-etc-swift") pod "swift-storage-0" (UID: "64792f42-dd08-4537-bce9-a632e644cf5a") : configmap "swift-ring-files" not found Dec 01 08:39:04 crc kubenswrapper[5004]: I1201 08:39:03.733448 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/64792f42-dd08-4537-bce9-a632e644cf5a-etc-swift\") pod \"swift-storage-0\" (UID: \"64792f42-dd08-4537-bce9-a632e644cf5a\") " pod="openstack/swift-storage-0" Dec 01 08:39:04 crc kubenswrapper[5004]: I1201 08:39:03.733905 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e5a97177-1085-4eab-a646-c3b849dc73b5-dispersionconf\") pod \"swift-ring-rebalance-clktj\" (UID: \"e5a97177-1085-4eab-a646-c3b849dc73b5\") " pod="openstack/swift-ring-rebalance-clktj" Dec 01 08:39:04 crc kubenswrapper[5004]: I1201 08:39:03.734494 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/64792f42-dd08-4537-bce9-a632e644cf5a-lock\") pod \"swift-storage-0\" (UID: \"64792f42-dd08-4537-bce9-a632e644cf5a\") " pod="openstack/swift-storage-0" Dec 01 08:39:04 crc kubenswrapper[5004]: I1201 08:39:03.734967 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/64792f42-dd08-4537-bce9-a632e644cf5a-cache\") pod \"swift-storage-0\" (UID: \"64792f42-dd08-4537-bce9-a632e644cf5a\") " 
pod="openstack/swift-storage-0" Dec 01 08:39:04 crc kubenswrapper[5004]: I1201 08:39:03.735167 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nbgkf\" (UniqueName: \"kubernetes.io/projected/e5a97177-1085-4eab-a646-c3b849dc73b5-kube-api-access-nbgkf\") pod \"swift-ring-rebalance-clktj\" (UID: \"e5a97177-1085-4eab-a646-c3b849dc73b5\") " pod="openstack/swift-ring-rebalance-clktj" Dec 01 08:39:04 crc kubenswrapper[5004]: I1201 08:39:03.735861 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4fx5b\" (UniqueName: \"kubernetes.io/projected/64792f42-dd08-4537-bce9-a632e644cf5a-kube-api-access-4fx5b\") pod \"swift-storage-0\" (UID: \"64792f42-dd08-4537-bce9-a632e644cf5a\") " pod="openstack/swift-storage-0" Dec 01 08:39:04 crc kubenswrapper[5004]: I1201 08:39:03.735913 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/64792f42-dd08-4537-bce9-a632e644cf5a-cache\") pod \"swift-storage-0\" (UID: \"64792f42-dd08-4537-bce9-a632e644cf5a\") " pod="openstack/swift-storage-0" Dec 01 08:39:04 crc kubenswrapper[5004]: I1201 08:39:03.736067 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5a97177-1085-4eab-a646-c3b849dc73b5-combined-ca-bundle\") pod \"swift-ring-rebalance-clktj\" (UID: \"e5a97177-1085-4eab-a646-c3b849dc73b5\") " pod="openstack/swift-ring-rebalance-clktj" Dec 01 08:39:04 crc kubenswrapper[5004]: I1201 08:39:03.736082 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/64792f42-dd08-4537-bce9-a632e644cf5a-lock\") pod \"swift-storage-0\" (UID: \"64792f42-dd08-4537-bce9-a632e644cf5a\") " pod="openstack/swift-storage-0" Dec 01 08:39:04 crc kubenswrapper[5004]: I1201 08:39:03.736108 5004 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"swift-storage-0\" (UID: \"64792f42-dd08-4537-bce9-a632e644cf5a\") " pod="openstack/swift-storage-0" Dec 01 08:39:04 crc kubenswrapper[5004]: I1201 08:39:03.736145 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e5a97177-1085-4eab-a646-c3b849dc73b5-ring-data-devices\") pod \"swift-ring-rebalance-clktj\" (UID: \"e5a97177-1085-4eab-a646-c3b849dc73b5\") " pod="openstack/swift-ring-rebalance-clktj" Dec 01 08:39:04 crc kubenswrapper[5004]: I1201 08:39:03.736173 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e5a97177-1085-4eab-a646-c3b849dc73b5-swiftconf\") pod \"swift-ring-rebalance-clktj\" (UID: \"e5a97177-1085-4eab-a646-c3b849dc73b5\") " pod="openstack/swift-ring-rebalance-clktj" Dec 01 08:39:04 crc kubenswrapper[5004]: I1201 08:39:03.736230 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e5a97177-1085-4eab-a646-c3b849dc73b5-etc-swift\") pod \"swift-ring-rebalance-clktj\" (UID: \"e5a97177-1085-4eab-a646-c3b849dc73b5\") " pod="openstack/swift-ring-rebalance-clktj" Dec 01 08:39:04 crc kubenswrapper[5004]: I1201 08:39:03.736444 5004 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"swift-storage-0\" (UID: \"64792f42-dd08-4537-bce9-a632e644cf5a\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/swift-storage-0" Dec 01 08:39:04 crc kubenswrapper[5004]: I1201 08:39:03.736762 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e5a97177-1085-4eab-a646-c3b849dc73b5-dispersionconf\") pod 
\"swift-ring-rebalance-clktj\" (UID: \"e5a97177-1085-4eab-a646-c3b849dc73b5\") " pod="openstack/swift-ring-rebalance-clktj" Dec 01 08:39:04 crc kubenswrapper[5004]: I1201 08:39:03.736931 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e5a97177-1085-4eab-a646-c3b849dc73b5-ring-data-devices\") pod \"swift-ring-rebalance-clktj\" (UID: \"e5a97177-1085-4eab-a646-c3b849dc73b5\") " pod="openstack/swift-ring-rebalance-clktj" Dec 01 08:39:04 crc kubenswrapper[5004]: I1201 08:39:03.738125 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e5a97177-1085-4eab-a646-c3b849dc73b5-etc-swift\") pod \"swift-ring-rebalance-clktj\" (UID: \"e5a97177-1085-4eab-a646-c3b849dc73b5\") " pod="openstack/swift-ring-rebalance-clktj" Dec 01 08:39:04 crc kubenswrapper[5004]: I1201 08:39:03.738524 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e5a97177-1085-4eab-a646-c3b849dc73b5-scripts\") pod \"swift-ring-rebalance-clktj\" (UID: \"e5a97177-1085-4eab-a646-c3b849dc73b5\") " pod="openstack/swift-ring-rebalance-clktj" Dec 01 08:39:04 crc kubenswrapper[5004]: I1201 08:39:03.739079 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5a97177-1085-4eab-a646-c3b849dc73b5-combined-ca-bundle\") pod \"swift-ring-rebalance-clktj\" (UID: \"e5a97177-1085-4eab-a646-c3b849dc73b5\") " pod="openstack/swift-ring-rebalance-clktj" Dec 01 08:39:04 crc kubenswrapper[5004]: I1201 08:39:03.739655 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e5a97177-1085-4eab-a646-c3b849dc73b5-scripts\") pod \"swift-ring-rebalance-clktj\" (UID: \"e5a97177-1085-4eab-a646-c3b849dc73b5\") " pod="openstack/swift-ring-rebalance-clktj" Dec 01 
08:39:04 crc kubenswrapper[5004]: I1201 08:39:03.740879 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e5a97177-1085-4eab-a646-c3b849dc73b5-swiftconf\") pod \"swift-ring-rebalance-clktj\" (UID: \"e5a97177-1085-4eab-a646-c3b849dc73b5\") " pod="openstack/swift-ring-rebalance-clktj" Dec 01 08:39:04 crc kubenswrapper[5004]: I1201 08:39:03.749784 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-f2sxs"] Dec 01 08:39:04 crc kubenswrapper[5004]: I1201 08:39:03.752925 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-f2sxs" event={"ID":"4e59b4e1-4729-4161-ad22-e11718c0c6fe","Type":"ContainerStarted","Data":"c139241a209375b7c7ce0fa66ad09eb4f88af8e6252175a970edc88d41e70319"} Dec 01 08:39:04 crc kubenswrapper[5004]: I1201 08:39:03.760617 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbgkf\" (UniqueName: \"kubernetes.io/projected/e5a97177-1085-4eab-a646-c3b849dc73b5-kube-api-access-nbgkf\") pod \"swift-ring-rebalance-clktj\" (UID: \"e5a97177-1085-4eab-a646-c3b849dc73b5\") " pod="openstack/swift-ring-rebalance-clktj" Dec 01 08:39:04 crc kubenswrapper[5004]: I1201 08:39:03.761028 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-2cda-account-create-update-gzdd5" event={"ID":"803e9984-2114-48e5-8d49-4536bd3cc6ef","Type":"ContainerStarted","Data":"647b423e6f1221cbf75a78fe0c1ee746d4de7a1e35718f392714fbe3e1658dd2"} Dec 01 08:39:04 crc kubenswrapper[5004]: I1201 08:39:03.789373 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fx5b\" (UniqueName: \"kubernetes.io/projected/64792f42-dd08-4537-bce9-a632e644cf5a-kube-api-access-4fx5b\") pod \"swift-storage-0\" (UID: \"64792f42-dd08-4537-bce9-a632e644cf5a\") " pod="openstack/swift-storage-0" Dec 01 08:39:04 crc kubenswrapper[5004]: I1201 08:39:03.801619 5004 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"swift-storage-0\" (UID: \"64792f42-dd08-4537-bce9-a632e644cf5a\") " pod="openstack/swift-storage-0" Dec 01 08:39:04 crc kubenswrapper[5004]: I1201 08:39:03.829087 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-3f6b-account-create-update-2gfdj" event={"ID":"b61fea80-031c-4ee8-b99f-562a8bb879eb","Type":"ContainerDied","Data":"e79d233210c1dcb5f1728da5f19536426d477f29931eec1245ecb9e6a5b31cef"} Dec 01 08:39:04 crc kubenswrapper[5004]: I1201 08:39:03.829142 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e79d233210c1dcb5f1728da5f19536426d477f29931eec1245ecb9e6a5b31cef" Dec 01 08:39:04 crc kubenswrapper[5004]: I1201 08:39:03.829261 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-3f6b-account-create-update-2gfdj" Dec 01 08:39:04 crc kubenswrapper[5004]: I1201 08:39:03.849597 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-db-create-c26nm" event={"ID":"cd186cbb-dea0-4dd5-9328-935a4041a137","Type":"ContainerStarted","Data":"4dc6096f39c1faedcb879d414f107f7c4c75ea8ce9196db4d16df937ced6b98e"} Dec 01 08:39:04 crc kubenswrapper[5004]: I1201 08:39:03.854880 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-d5gjb" event={"ID":"9f70e608-5f44-45d3-9c98-d22ed21cf952","Type":"ContainerDied","Data":"965dd8bcee771dbaa362faf500020899b91b0cf46ad3b221bcdab9d6821b9d8e"} Dec 01 08:39:04 crc kubenswrapper[5004]: I1201 08:39:03.854915 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="965dd8bcee771dbaa362faf500020899b91b0cf46ad3b221bcdab9d6821b9d8e" Dec 01 08:39:04 crc kubenswrapper[5004]: I1201 08:39:03.854991 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-d5gjb" Dec 01 08:39:04 crc kubenswrapper[5004]: I1201 08:39:03.951902 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-clktj" Dec 01 08:39:04 crc kubenswrapper[5004]: I1201 08:39:04.264960 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/64792f42-dd08-4537-bce9-a632e644cf5a-etc-swift\") pod \"swift-storage-0\" (UID: \"64792f42-dd08-4537-bce9-a632e644cf5a\") " pod="openstack/swift-storage-0" Dec 01 08:39:04 crc kubenswrapper[5004]: E1201 08:39:04.265708 5004 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 01 08:39:04 crc kubenswrapper[5004]: E1201 08:39:04.265737 5004 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 01 08:39:04 crc kubenswrapper[5004]: E1201 08:39:04.265804 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/64792f42-dd08-4537-bce9-a632e644cf5a-etc-swift podName:64792f42-dd08-4537-bce9-a632e644cf5a nodeName:}" failed. No retries permitted until 2025-12-01 08:39:05.265779126 +0000 UTC m=+1322.830771128 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/64792f42-dd08-4537-bce9-a632e644cf5a-etc-swift") pod "swift-storage-0" (UID: "64792f42-dd08-4537-bce9-a632e644cf5a") : configmap "swift-ring-files" not found Dec 01 08:39:04 crc kubenswrapper[5004]: W1201 08:39:04.800787 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode5a97177_1085_4eab_a646_c3b849dc73b5.slice/crio-4f1610ece8cf7f124d14bf143b111bc057655b6f52d40cd61a992a01f2e8fcf4 WatchSource:0}: Error finding container 4f1610ece8cf7f124d14bf143b111bc057655b6f52d40cd61a992a01f2e8fcf4: Status 404 returned error can't find the container with id 4f1610ece8cf7f124d14bf143b111bc057655b6f52d40cd61a992a01f2e8fcf4 Dec 01 08:39:04 crc kubenswrapper[5004]: I1201 08:39:04.803785 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-clktj"] Dec 01 08:39:04 crc kubenswrapper[5004]: I1201 08:39:04.870937 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-clktj" event={"ID":"e5a97177-1085-4eab-a646-c3b849dc73b5","Type":"ContainerStarted","Data":"4f1610ece8cf7f124d14bf143b111bc057655b6f52d40cd61a992a01f2e8fcf4"} Dec 01 08:39:04 crc kubenswrapper[5004]: I1201 08:39:04.873932 5004 generic.go:334] "Generic (PLEG): container finished" podID="cd186cbb-dea0-4dd5-9328-935a4041a137" containerID="b49bfa958360807f66ead1b9a529493cd21242e17329f00d236357a6bfa29197" exitCode=0 Dec 01 08:39:04 crc kubenswrapper[5004]: I1201 08:39:04.873999 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-db-create-c26nm" event={"ID":"cd186cbb-dea0-4dd5-9328-935a4041a137","Type":"ContainerDied","Data":"b49bfa958360807f66ead1b9a529493cd21242e17329f00d236357a6bfa29197"} Dec 01 08:39:04 crc kubenswrapper[5004]: I1201 08:39:04.875873 5004 generic.go:334] "Generic (PLEG): container finished" 
podID="4e59b4e1-4729-4161-ad22-e11718c0c6fe" containerID="21701a04ad6f4449f87cea6a10e39fc1f1d0e7b777aef1e2de0843de3e97d9b1" exitCode=0 Dec 01 08:39:04 crc kubenswrapper[5004]: I1201 08:39:04.875942 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-f2sxs" event={"ID":"4e59b4e1-4729-4161-ad22-e11718c0c6fe","Type":"ContainerDied","Data":"21701a04ad6f4449f87cea6a10e39fc1f1d0e7b777aef1e2de0843de3e97d9b1"} Dec 01 08:39:04 crc kubenswrapper[5004]: I1201 08:39:04.878769 5004 generic.go:334] "Generic (PLEG): container finished" podID="803e9984-2114-48e5-8d49-4536bd3cc6ef" containerID="aa15d35533926d66b511361f8c464ae977f0281b05407483211edc0c7ab555d9" exitCode=0 Dec 01 08:39:04 crc kubenswrapper[5004]: I1201 08:39:04.878807 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-2cda-account-create-update-gzdd5" event={"ID":"803e9984-2114-48e5-8d49-4536bd3cc6ef","Type":"ContainerDied","Data":"aa15d35533926d66b511361f8c464ae977f0281b05407483211edc0c7ab555d9"} Dec 01 08:39:05 crc kubenswrapper[5004]: I1201 08:39:05.292860 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/64792f42-dd08-4537-bce9-a632e644cf5a-etc-swift\") pod \"swift-storage-0\" (UID: \"64792f42-dd08-4537-bce9-a632e644cf5a\") " pod="openstack/swift-storage-0" Dec 01 08:39:05 crc kubenswrapper[5004]: E1201 08:39:05.293154 5004 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 01 08:39:05 crc kubenswrapper[5004]: E1201 08:39:05.293187 5004 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 01 08:39:05 crc kubenswrapper[5004]: E1201 08:39:05.293239 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/64792f42-dd08-4537-bce9-a632e644cf5a-etc-swift 
podName:64792f42-dd08-4537-bce9-a632e644cf5a nodeName:}" failed. No retries permitted until 2025-12-01 08:39:07.293220082 +0000 UTC m=+1324.858212064 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/64792f42-dd08-4537-bce9-a632e644cf5a-etc-swift") pod "swift-storage-0" (UID: "64792f42-dd08-4537-bce9-a632e644cf5a") : configmap "swift-ring-files" not found Dec 01 08:39:05 crc kubenswrapper[5004]: I1201 08:39:05.489403 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-88qsh"] Dec 01 08:39:05 crc kubenswrapper[5004]: I1201 08:39:05.490871 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-88qsh" Dec 01 08:39:05 crc kubenswrapper[5004]: I1201 08:39:05.506748 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-b5b4-account-create-update-4c6cr"] Dec 01 08:39:05 crc kubenswrapper[5004]: I1201 08:39:05.508922 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-b5b4-account-create-update-4c6cr" Dec 01 08:39:05 crc kubenswrapper[5004]: I1201 08:39:05.514636 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Dec 01 08:39:05 crc kubenswrapper[5004]: I1201 08:39:05.525134 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-88qsh"] Dec 01 08:39:05 crc kubenswrapper[5004]: I1201 08:39:05.534593 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-b5b4-account-create-update-4c6cr"] Dec 01 08:39:05 crc kubenswrapper[5004]: I1201 08:39:05.598342 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpsbc\" (UniqueName: \"kubernetes.io/projected/a5e4ccfc-1b99-4ee0-ab46-8e3e53669634-kube-api-access-hpsbc\") pod \"glance-db-create-88qsh\" (UID: \"a5e4ccfc-1b99-4ee0-ab46-8e3e53669634\") " pod="openstack/glance-db-create-88qsh" Dec 01 08:39:05 crc kubenswrapper[5004]: I1201 08:39:05.598393 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxg6h\" (UniqueName: \"kubernetes.io/projected/c0817a2b-0491-4c64-a2d8-0c03a938dd4a-kube-api-access-pxg6h\") pod \"glance-b5b4-account-create-update-4c6cr\" (UID: \"c0817a2b-0491-4c64-a2d8-0c03a938dd4a\") " pod="openstack/glance-b5b4-account-create-update-4c6cr" Dec 01 08:39:05 crc kubenswrapper[5004]: I1201 08:39:05.598505 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c0817a2b-0491-4c64-a2d8-0c03a938dd4a-operator-scripts\") pod \"glance-b5b4-account-create-update-4c6cr\" (UID: \"c0817a2b-0491-4c64-a2d8-0c03a938dd4a\") " pod="openstack/glance-b5b4-account-create-update-4c6cr" Dec 01 08:39:05 crc kubenswrapper[5004]: I1201 08:39:05.598593 5004 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a5e4ccfc-1b99-4ee0-ab46-8e3e53669634-operator-scripts\") pod \"glance-db-create-88qsh\" (UID: \"a5e4ccfc-1b99-4ee0-ab46-8e3e53669634\") " pod="openstack/glance-db-create-88qsh" Dec 01 08:39:05 crc kubenswrapper[5004]: I1201 08:39:05.700113 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a5e4ccfc-1b99-4ee0-ab46-8e3e53669634-operator-scripts\") pod \"glance-db-create-88qsh\" (UID: \"a5e4ccfc-1b99-4ee0-ab46-8e3e53669634\") " pod="openstack/glance-db-create-88qsh" Dec 01 08:39:05 crc kubenswrapper[5004]: I1201 08:39:05.700281 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hpsbc\" (UniqueName: \"kubernetes.io/projected/a5e4ccfc-1b99-4ee0-ab46-8e3e53669634-kube-api-access-hpsbc\") pod \"glance-db-create-88qsh\" (UID: \"a5e4ccfc-1b99-4ee0-ab46-8e3e53669634\") " pod="openstack/glance-db-create-88qsh" Dec 01 08:39:05 crc kubenswrapper[5004]: I1201 08:39:05.700336 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pxg6h\" (UniqueName: \"kubernetes.io/projected/c0817a2b-0491-4c64-a2d8-0c03a938dd4a-kube-api-access-pxg6h\") pod \"glance-b5b4-account-create-update-4c6cr\" (UID: \"c0817a2b-0491-4c64-a2d8-0c03a938dd4a\") " pod="openstack/glance-b5b4-account-create-update-4c6cr" Dec 01 08:39:05 crc kubenswrapper[5004]: I1201 08:39:05.700373 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c0817a2b-0491-4c64-a2d8-0c03a938dd4a-operator-scripts\") pod \"glance-b5b4-account-create-update-4c6cr\" (UID: \"c0817a2b-0491-4c64-a2d8-0c03a938dd4a\") " pod="openstack/glance-b5b4-account-create-update-4c6cr" Dec 01 08:39:05 crc kubenswrapper[5004]: I1201 08:39:05.701097 5004 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a5e4ccfc-1b99-4ee0-ab46-8e3e53669634-operator-scripts\") pod \"glance-db-create-88qsh\" (UID: \"a5e4ccfc-1b99-4ee0-ab46-8e3e53669634\") " pod="openstack/glance-db-create-88qsh" Dec 01 08:39:05 crc kubenswrapper[5004]: I1201 08:39:05.701134 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c0817a2b-0491-4c64-a2d8-0c03a938dd4a-operator-scripts\") pod \"glance-b5b4-account-create-update-4c6cr\" (UID: \"c0817a2b-0491-4c64-a2d8-0c03a938dd4a\") " pod="openstack/glance-b5b4-account-create-update-4c6cr" Dec 01 08:39:05 crc kubenswrapper[5004]: I1201 08:39:05.718523 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxg6h\" (UniqueName: \"kubernetes.io/projected/c0817a2b-0491-4c64-a2d8-0c03a938dd4a-kube-api-access-pxg6h\") pod \"glance-b5b4-account-create-update-4c6cr\" (UID: \"c0817a2b-0491-4c64-a2d8-0c03a938dd4a\") " pod="openstack/glance-b5b4-account-create-update-4c6cr" Dec 01 08:39:05 crc kubenswrapper[5004]: I1201 08:39:05.741416 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpsbc\" (UniqueName: \"kubernetes.io/projected/a5e4ccfc-1b99-4ee0-ab46-8e3e53669634-kube-api-access-hpsbc\") pod \"glance-db-create-88qsh\" (UID: \"a5e4ccfc-1b99-4ee0-ab46-8e3e53669634\") " pod="openstack/glance-db-create-88qsh" Dec 01 08:39:05 crc kubenswrapper[5004]: I1201 08:39:05.816344 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-88qsh" Dec 01 08:39:05 crc kubenswrapper[5004]: I1201 08:39:05.836397 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-b5b4-account-create-update-4c6cr" Dec 01 08:39:05 crc kubenswrapper[5004]: I1201 08:39:05.892269 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-f2sxs" event={"ID":"4e59b4e1-4729-4161-ad22-e11718c0c6fe","Type":"ContainerStarted","Data":"e03e643932ac005453b74c50822abc37bafc358c6daaabb3f00e8cae6e89c37f"} Dec 01 08:39:05 crc kubenswrapper[5004]: I1201 08:39:05.910433 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-b8fbc5445-f2sxs" podStartSLOduration=3.910391475 podStartE2EDuration="3.910391475s" podCreationTimestamp="2025-12-01 08:39:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:39:05.906749959 +0000 UTC m=+1323.471741941" watchObservedRunningTime="2025-12-01 08:39:05.910391475 +0000 UTC m=+1323.475383457" Dec 01 08:39:06 crc kubenswrapper[5004]: I1201 08:39:06.903648 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b8fbc5445-f2sxs" Dec 01 08:39:07 crc kubenswrapper[5004]: I1201 08:39:07.342931 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/64792f42-dd08-4537-bce9-a632e644cf5a-etc-swift\") pod \"swift-storage-0\" (UID: \"64792f42-dd08-4537-bce9-a632e644cf5a\") " pod="openstack/swift-storage-0" Dec 01 08:39:07 crc kubenswrapper[5004]: E1201 08:39:07.343284 5004 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 01 08:39:07 crc kubenswrapper[5004]: E1201 08:39:07.343317 5004 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 01 08:39:07 crc kubenswrapper[5004]: E1201 08:39:07.343372 5004 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/64792f42-dd08-4537-bce9-a632e644cf5a-etc-swift podName:64792f42-dd08-4537-bce9-a632e644cf5a nodeName:}" failed. No retries permitted until 2025-12-01 08:39:11.343351519 +0000 UTC m=+1328.908343521 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/64792f42-dd08-4537-bce9-a632e644cf5a-etc-swift") pod "swift-storage-0" (UID: "64792f42-dd08-4537-bce9-a632e644cf5a") : configmap "swift-ring-files" not found Dec 01 08:39:09 crc kubenswrapper[5004]: I1201 08:39:09.129730 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-openstack-db-create-c26nm" Dec 01 08:39:09 crc kubenswrapper[5004]: I1201 08:39:09.134067 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-2cda-account-create-update-gzdd5" Dec 01 08:39:09 crc kubenswrapper[5004]: I1201 08:39:09.181509 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q55rt\" (UniqueName: \"kubernetes.io/projected/803e9984-2114-48e5-8d49-4536bd3cc6ef-kube-api-access-q55rt\") pod \"803e9984-2114-48e5-8d49-4536bd3cc6ef\" (UID: \"803e9984-2114-48e5-8d49-4536bd3cc6ef\") " Dec 01 08:39:09 crc kubenswrapper[5004]: I1201 08:39:09.181698 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cd186cbb-dea0-4dd5-9328-935a4041a137-operator-scripts\") pod \"cd186cbb-dea0-4dd5-9328-935a4041a137\" (UID: \"cd186cbb-dea0-4dd5-9328-935a4041a137\") " Dec 01 08:39:09 crc kubenswrapper[5004]: I1201 08:39:09.181761 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wkq52\" (UniqueName: \"kubernetes.io/projected/cd186cbb-dea0-4dd5-9328-935a4041a137-kube-api-access-wkq52\") pod \"cd186cbb-dea0-4dd5-9328-935a4041a137\" (UID: 
\"cd186cbb-dea0-4dd5-9328-935a4041a137\") " Dec 01 08:39:09 crc kubenswrapper[5004]: I1201 08:39:09.181863 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/803e9984-2114-48e5-8d49-4536bd3cc6ef-operator-scripts\") pod \"803e9984-2114-48e5-8d49-4536bd3cc6ef\" (UID: \"803e9984-2114-48e5-8d49-4536bd3cc6ef\") " Dec 01 08:39:09 crc kubenswrapper[5004]: I1201 08:39:09.182911 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd186cbb-dea0-4dd5-9328-935a4041a137-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "cd186cbb-dea0-4dd5-9328-935a4041a137" (UID: "cd186cbb-dea0-4dd5-9328-935a4041a137"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:39:09 crc kubenswrapper[5004]: I1201 08:39:09.183009 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/803e9984-2114-48e5-8d49-4536bd3cc6ef-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "803e9984-2114-48e5-8d49-4536bd3cc6ef" (UID: "803e9984-2114-48e5-8d49-4536bd3cc6ef"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:39:09 crc kubenswrapper[5004]: I1201 08:39:09.190973 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/803e9984-2114-48e5-8d49-4536bd3cc6ef-kube-api-access-q55rt" (OuterVolumeSpecName: "kube-api-access-q55rt") pod "803e9984-2114-48e5-8d49-4536bd3cc6ef" (UID: "803e9984-2114-48e5-8d49-4536bd3cc6ef"). InnerVolumeSpecName "kube-api-access-q55rt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:39:09 crc kubenswrapper[5004]: I1201 08:39:09.191687 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd186cbb-dea0-4dd5-9328-935a4041a137-kube-api-access-wkq52" (OuterVolumeSpecName: "kube-api-access-wkq52") pod "cd186cbb-dea0-4dd5-9328-935a4041a137" (UID: "cd186cbb-dea0-4dd5-9328-935a4041a137"). InnerVolumeSpecName "kube-api-access-wkq52". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:39:09 crc kubenswrapper[5004]: I1201 08:39:09.284217 5004 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/803e9984-2114-48e5-8d49-4536bd3cc6ef-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 08:39:09 crc kubenswrapper[5004]: I1201 08:39:09.284251 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q55rt\" (UniqueName: \"kubernetes.io/projected/803e9984-2114-48e5-8d49-4536bd3cc6ef-kube-api-access-q55rt\") on node \"crc\" DevicePath \"\"" Dec 01 08:39:09 crc kubenswrapper[5004]: I1201 08:39:09.284262 5004 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cd186cbb-dea0-4dd5-9328-935a4041a137-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 08:39:09 crc kubenswrapper[5004]: I1201 08:39:09.284270 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wkq52\" (UniqueName: \"kubernetes.io/projected/cd186cbb-dea0-4dd5-9328-935a4041a137-kube-api-access-wkq52\") on node \"crc\" DevicePath \"\"" Dec 01 08:39:09 crc kubenswrapper[5004]: I1201 08:39:09.310307 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-df4fb84fc-flnws" podUID="a5680092-beb9-4fe4-b35b-4c795980e350" containerName="console" containerID="cri-o://e143900ac807a60c11908b700a66e156576792fed4ea3c1340002b772b6d7488" gracePeriod=15 Dec 01 
08:39:09 crc kubenswrapper[5004]: I1201 08:39:09.731996 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-dwhqn"] Dec 01 08:39:09 crc kubenswrapper[5004]: E1201 08:39:09.732620 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd186cbb-dea0-4dd5-9328-935a4041a137" containerName="mariadb-database-create" Dec 01 08:39:09 crc kubenswrapper[5004]: I1201 08:39:09.732658 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd186cbb-dea0-4dd5-9328-935a4041a137" containerName="mariadb-database-create" Dec 01 08:39:09 crc kubenswrapper[5004]: E1201 08:39:09.732758 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="803e9984-2114-48e5-8d49-4536bd3cc6ef" containerName="mariadb-account-create-update" Dec 01 08:39:09 crc kubenswrapper[5004]: I1201 08:39:09.732776 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="803e9984-2114-48e5-8d49-4536bd3cc6ef" containerName="mariadb-account-create-update" Dec 01 08:39:09 crc kubenswrapper[5004]: I1201 08:39:09.733108 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="803e9984-2114-48e5-8d49-4536bd3cc6ef" containerName="mariadb-account-create-update" Dec 01 08:39:09 crc kubenswrapper[5004]: I1201 08:39:09.733146 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd186cbb-dea0-4dd5-9328-935a4041a137" containerName="mariadb-database-create" Dec 01 08:39:09 crc kubenswrapper[5004]: I1201 08:39:09.734220 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-dwhqn" Dec 01 08:39:09 crc kubenswrapper[5004]: I1201 08:39:09.741645 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-dwhqn"] Dec 01 08:39:09 crc kubenswrapper[5004]: I1201 08:39:09.795627 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f716a201-3ee3-486c-ba54-785c9e603805-operator-scripts\") pod \"keystone-db-create-dwhqn\" (UID: \"f716a201-3ee3-486c-ba54-785c9e603805\") " pod="openstack/keystone-db-create-dwhqn" Dec 01 08:39:09 crc kubenswrapper[5004]: I1201 08:39:09.796001 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44n7h\" (UniqueName: \"kubernetes.io/projected/f716a201-3ee3-486c-ba54-785c9e603805-kube-api-access-44n7h\") pod \"keystone-db-create-dwhqn\" (UID: \"f716a201-3ee3-486c-ba54-785c9e603805\") " pod="openstack/keystone-db-create-dwhqn" Dec 01 08:39:09 crc kubenswrapper[5004]: I1201 08:39:09.828189 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-d3bc-account-create-update-d2dfq"] Dec 01 08:39:09 crc kubenswrapper[5004]: I1201 08:39:09.829811 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-d3bc-account-create-update-d2dfq" Dec 01 08:39:09 crc kubenswrapper[5004]: I1201 08:39:09.838934 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-d3bc-account-create-update-d2dfq"] Dec 01 08:39:09 crc kubenswrapper[5004]: I1201 08:39:09.843439 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Dec 01 08:39:09 crc kubenswrapper[5004]: I1201 08:39:09.897543 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44n7h\" (UniqueName: \"kubernetes.io/projected/f716a201-3ee3-486c-ba54-785c9e603805-kube-api-access-44n7h\") pod \"keystone-db-create-dwhqn\" (UID: \"f716a201-3ee3-486c-ba54-785c9e603805\") " pod="openstack/keystone-db-create-dwhqn" Dec 01 08:39:09 crc kubenswrapper[5004]: I1201 08:39:09.897708 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f716a201-3ee3-486c-ba54-785c9e603805-operator-scripts\") pod \"keystone-db-create-dwhqn\" (UID: \"f716a201-3ee3-486c-ba54-785c9e603805\") " pod="openstack/keystone-db-create-dwhqn" Dec 01 08:39:09 crc kubenswrapper[5004]: I1201 08:39:09.897760 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvxdp\" (UniqueName: \"kubernetes.io/projected/cf702a13-23f3-471e-b896-03b6e58d429d-kube-api-access-cvxdp\") pod \"keystone-d3bc-account-create-update-d2dfq\" (UID: \"cf702a13-23f3-471e-b896-03b6e58d429d\") " pod="openstack/keystone-d3bc-account-create-update-d2dfq" Dec 01 08:39:09 crc kubenswrapper[5004]: I1201 08:39:09.898072 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cf702a13-23f3-471e-b896-03b6e58d429d-operator-scripts\") pod \"keystone-d3bc-account-create-update-d2dfq\" (UID: 
\"cf702a13-23f3-471e-b896-03b6e58d429d\") " pod="openstack/keystone-d3bc-account-create-update-d2dfq" Dec 01 08:39:09 crc kubenswrapper[5004]: I1201 08:39:09.899110 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f716a201-3ee3-486c-ba54-785c9e603805-operator-scripts\") pod \"keystone-db-create-dwhqn\" (UID: \"f716a201-3ee3-486c-ba54-785c9e603805\") " pod="openstack/keystone-db-create-dwhqn" Dec 01 08:39:09 crc kubenswrapper[5004]: I1201 08:39:09.924737 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44n7h\" (UniqueName: \"kubernetes.io/projected/f716a201-3ee3-486c-ba54-785c9e603805-kube-api-access-44n7h\") pod \"keystone-db-create-dwhqn\" (UID: \"f716a201-3ee3-486c-ba54-785c9e603805\") " pod="openstack/keystone-db-create-dwhqn" Dec 01 08:39:09 crc kubenswrapper[5004]: I1201 08:39:09.945098 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-db-create-c26nm" event={"ID":"cd186cbb-dea0-4dd5-9328-935a4041a137","Type":"ContainerDied","Data":"4dc6096f39c1faedcb879d414f107f7c4c75ea8ce9196db4d16df937ced6b98e"} Dec 01 08:39:09 crc kubenswrapper[5004]: I1201 08:39:09.945134 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4dc6096f39c1faedcb879d414f107f7c4c75ea8ce9196db4d16df937ced6b98e" Dec 01 08:39:09 crc kubenswrapper[5004]: I1201 08:39:09.945136 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-openstack-db-create-c26nm" Dec 01 08:39:09 crc kubenswrapper[5004]: I1201 08:39:09.947604 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-2cda-account-create-update-gzdd5" event={"ID":"803e9984-2114-48e5-8d49-4536bd3cc6ef","Type":"ContainerDied","Data":"647b423e6f1221cbf75a78fe0c1ee746d4de7a1e35718f392714fbe3e1658dd2"} Dec 01 08:39:09 crc kubenswrapper[5004]: I1201 08:39:09.947626 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="647b423e6f1221cbf75a78fe0c1ee746d4de7a1e35718f392714fbe3e1658dd2" Dec 01 08:39:09 crc kubenswrapper[5004]: I1201 08:39:09.947625 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-2cda-account-create-update-gzdd5" Dec 01 08:39:09 crc kubenswrapper[5004]: I1201 08:39:09.950125 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-df4fb84fc-flnws_a5680092-beb9-4fe4-b35b-4c795980e350/console/0.log" Dec 01 08:39:09 crc kubenswrapper[5004]: I1201 08:39:09.950172 5004 generic.go:334] "Generic (PLEG): container finished" podID="a5680092-beb9-4fe4-b35b-4c795980e350" containerID="e143900ac807a60c11908b700a66e156576792fed4ea3c1340002b772b6d7488" exitCode=2 Dec 01 08:39:09 crc kubenswrapper[5004]: I1201 08:39:09.950201 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-df4fb84fc-flnws" event={"ID":"a5680092-beb9-4fe4-b35b-4c795980e350","Type":"ContainerDied","Data":"e143900ac807a60c11908b700a66e156576792fed4ea3c1340002b772b6d7488"} Dec 01 08:39:10 crc kubenswrapper[5004]: I1201 08:39:10.003025 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvxdp\" (UniqueName: \"kubernetes.io/projected/cf702a13-23f3-471e-b896-03b6e58d429d-kube-api-access-cvxdp\") pod \"keystone-d3bc-account-create-update-d2dfq\" (UID: 
\"cf702a13-23f3-471e-b896-03b6e58d429d\") " pod="openstack/keystone-d3bc-account-create-update-d2dfq" Dec 01 08:39:10 crc kubenswrapper[5004]: I1201 08:39:10.003485 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cf702a13-23f3-471e-b896-03b6e58d429d-operator-scripts\") pod \"keystone-d3bc-account-create-update-d2dfq\" (UID: \"cf702a13-23f3-471e-b896-03b6e58d429d\") " pod="openstack/keystone-d3bc-account-create-update-d2dfq" Dec 01 08:39:10 crc kubenswrapper[5004]: I1201 08:39:10.004176 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cf702a13-23f3-471e-b896-03b6e58d429d-operator-scripts\") pod \"keystone-d3bc-account-create-update-d2dfq\" (UID: \"cf702a13-23f3-471e-b896-03b6e58d429d\") " pod="openstack/keystone-d3bc-account-create-update-d2dfq" Dec 01 08:39:10 crc kubenswrapper[5004]: I1201 08:39:10.023933 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvxdp\" (UniqueName: \"kubernetes.io/projected/cf702a13-23f3-471e-b896-03b6e58d429d-kube-api-access-cvxdp\") pod \"keystone-d3bc-account-create-update-d2dfq\" (UID: \"cf702a13-23f3-471e-b896-03b6e58d429d\") " pod="openstack/keystone-d3bc-account-create-update-d2dfq" Dec 01 08:39:10 crc kubenswrapper[5004]: I1201 08:39:10.060538 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-dwhqn" Dec 01 08:39:10 crc kubenswrapper[5004]: I1201 08:39:10.150815 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-d3bc-account-create-update-d2dfq" Dec 01 08:39:10 crc kubenswrapper[5004]: I1201 08:39:10.479200 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Dec 01 08:39:11 crc kubenswrapper[5004]: I1201 08:39:11.437595 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/64792f42-dd08-4537-bce9-a632e644cf5a-etc-swift\") pod \"swift-storage-0\" (UID: \"64792f42-dd08-4537-bce9-a632e644cf5a\") " pod="openstack/swift-storage-0" Dec 01 08:39:11 crc kubenswrapper[5004]: E1201 08:39:11.437765 5004 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 01 08:39:11 crc kubenswrapper[5004]: E1201 08:39:11.438011 5004 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 01 08:39:11 crc kubenswrapper[5004]: E1201 08:39:11.438065 5004 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/64792f42-dd08-4537-bce9-a632e644cf5a-etc-swift podName:64792f42-dd08-4537-bce9-a632e644cf5a nodeName:}" failed. No retries permitted until 2025-12-01 08:39:19.438050812 +0000 UTC m=+1337.003042794 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/64792f42-dd08-4537-bce9-a632e644cf5a-etc-swift") pod "swift-storage-0" (UID: "64792f42-dd08-4537-bce9-a632e644cf5a") : configmap "swift-ring-files" not found Dec 01 08:39:12 crc kubenswrapper[5004]: I1201 08:39:12.065532 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-df4fb84fc-flnws_a5680092-beb9-4fe4-b35b-4c795980e350/console/0.log" Dec 01 08:39:12 crc kubenswrapper[5004]: I1201 08:39:12.066042 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-df4fb84fc-flnws" Dec 01 08:39:12 crc kubenswrapper[5004]: I1201 08:39:12.157537 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a5680092-beb9-4fe4-b35b-4c795980e350-service-ca\") pod \"a5680092-beb9-4fe4-b35b-4c795980e350\" (UID: \"a5680092-beb9-4fe4-b35b-4c795980e350\") " Dec 01 08:39:12 crc kubenswrapper[5004]: I1201 08:39:12.157815 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a5680092-beb9-4fe4-b35b-4c795980e350-console-serving-cert\") pod \"a5680092-beb9-4fe4-b35b-4c795980e350\" (UID: \"a5680092-beb9-4fe4-b35b-4c795980e350\") " Dec 01 08:39:12 crc kubenswrapper[5004]: I1201 08:39:12.157856 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a5680092-beb9-4fe4-b35b-4c795980e350-console-config\") pod \"a5680092-beb9-4fe4-b35b-4c795980e350\" (UID: \"a5680092-beb9-4fe4-b35b-4c795980e350\") " Dec 01 08:39:12 crc kubenswrapper[5004]: I1201 08:39:12.157934 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a5680092-beb9-4fe4-b35b-4c795980e350-trusted-ca-bundle\") pod \"a5680092-beb9-4fe4-b35b-4c795980e350\" (UID: \"a5680092-beb9-4fe4-b35b-4c795980e350\") " Dec 01 08:39:12 crc kubenswrapper[5004]: I1201 08:39:12.157984 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a5680092-beb9-4fe4-b35b-4c795980e350-oauth-serving-cert\") pod \"a5680092-beb9-4fe4-b35b-4c795980e350\" (UID: \"a5680092-beb9-4fe4-b35b-4c795980e350\") " Dec 01 08:39:12 crc kubenswrapper[5004]: I1201 08:39:12.158046 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a5680092-beb9-4fe4-b35b-4c795980e350-console-oauth-config\") pod \"a5680092-beb9-4fe4-b35b-4c795980e350\" (UID: \"a5680092-beb9-4fe4-b35b-4c795980e350\") " Dec 01 08:39:12 crc kubenswrapper[5004]: I1201 08:39:12.158104 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l5bqg\" (UniqueName: \"kubernetes.io/projected/a5680092-beb9-4fe4-b35b-4c795980e350-kube-api-access-l5bqg\") pod \"a5680092-beb9-4fe4-b35b-4c795980e350\" (UID: \"a5680092-beb9-4fe4-b35b-4c795980e350\") " Dec 01 08:39:12 crc kubenswrapper[5004]: I1201 08:39:12.159742 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a5680092-beb9-4fe4-b35b-4c795980e350-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "a5680092-beb9-4fe4-b35b-4c795980e350" (UID: "a5680092-beb9-4fe4-b35b-4c795980e350"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:39:12 crc kubenswrapper[5004]: I1201 08:39:12.159852 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a5680092-beb9-4fe4-b35b-4c795980e350-console-config" (OuterVolumeSpecName: "console-config") pod "a5680092-beb9-4fe4-b35b-4c795980e350" (UID: "a5680092-beb9-4fe4-b35b-4c795980e350"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:39:12 crc kubenswrapper[5004]: I1201 08:39:12.159888 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a5680092-beb9-4fe4-b35b-4c795980e350-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "a5680092-beb9-4fe4-b35b-4c795980e350" (UID: "a5680092-beb9-4fe4-b35b-4c795980e350"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:39:12 crc kubenswrapper[5004]: I1201 08:39:12.159902 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a5680092-beb9-4fe4-b35b-4c795980e350-service-ca" (OuterVolumeSpecName: "service-ca") pod "a5680092-beb9-4fe4-b35b-4c795980e350" (UID: "a5680092-beb9-4fe4-b35b-4c795980e350"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:39:12 crc kubenswrapper[5004]: I1201 08:39:12.171762 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5680092-beb9-4fe4-b35b-4c795980e350-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "a5680092-beb9-4fe4-b35b-4c795980e350" (UID: "a5680092-beb9-4fe4-b35b-4c795980e350"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:39:12 crc kubenswrapper[5004]: I1201 08:39:12.171776 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5680092-beb9-4fe4-b35b-4c795980e350-kube-api-access-l5bqg" (OuterVolumeSpecName: "kube-api-access-l5bqg") pod "a5680092-beb9-4fe4-b35b-4c795980e350" (UID: "a5680092-beb9-4fe4-b35b-4c795980e350"). InnerVolumeSpecName "kube-api-access-l5bqg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:39:12 crc kubenswrapper[5004]: I1201 08:39:12.171809 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5680092-beb9-4fe4-b35b-4c795980e350-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "a5680092-beb9-4fe4-b35b-4c795980e350" (UID: "a5680092-beb9-4fe4-b35b-4c795980e350"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:39:12 crc kubenswrapper[5004]: I1201 08:39:12.260190 5004 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a5680092-beb9-4fe4-b35b-4c795980e350-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 08:39:12 crc kubenswrapper[5004]: I1201 08:39:12.260633 5004 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a5680092-beb9-4fe4-b35b-4c795980e350-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 01 08:39:12 crc kubenswrapper[5004]: I1201 08:39:12.260692 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l5bqg\" (UniqueName: \"kubernetes.io/projected/a5680092-beb9-4fe4-b35b-4c795980e350-kube-api-access-l5bqg\") on node \"crc\" DevicePath \"\"" Dec 01 08:39:12 crc kubenswrapper[5004]: I1201 08:39:12.260741 5004 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a5680092-beb9-4fe4-b35b-4c795980e350-service-ca\") on node \"crc\" DevicePath \"\"" Dec 01 08:39:12 crc kubenswrapper[5004]: I1201 08:39:12.260791 5004 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a5680092-beb9-4fe4-b35b-4c795980e350-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 08:39:12 crc kubenswrapper[5004]: I1201 08:39:12.260839 5004 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a5680092-beb9-4fe4-b35b-4c795980e350-console-config\") on node \"crc\" DevicePath \"\"" Dec 01 08:39:12 crc kubenswrapper[5004]: I1201 08:39:12.260895 5004 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a5680092-beb9-4fe4-b35b-4c795980e350-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 08:39:12 crc 
kubenswrapper[5004]: I1201 08:39:12.274480 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-88qsh"] Dec 01 08:39:12 crc kubenswrapper[5004]: I1201 08:39:12.331866 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-dvjqn"] Dec 01 08:39:12 crc kubenswrapper[5004]: E1201 08:39:12.332418 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5680092-beb9-4fe4-b35b-4c795980e350" containerName="console" Dec 01 08:39:12 crc kubenswrapper[5004]: I1201 08:39:12.332493 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5680092-beb9-4fe4-b35b-4c795980e350" containerName="console" Dec 01 08:39:12 crc kubenswrapper[5004]: I1201 08:39:12.332751 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5680092-beb9-4fe4-b35b-4c795980e350" containerName="console" Dec 01 08:39:12 crc kubenswrapper[5004]: I1201 08:39:12.333694 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-openstack-cell1-db-create-dvjqn" Dec 01 08:39:12 crc kubenswrapper[5004]: I1201 08:39:12.345174 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-dvjqn"] Dec 01 08:39:12 crc kubenswrapper[5004]: I1201 08:39:12.463984 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2j2qx\" (UniqueName: \"kubernetes.io/projected/5facd201-c917-4a20-86f7-1a95b6604bd1-kube-api-access-2j2qx\") pod \"mysqld-exporter-openstack-cell1-db-create-dvjqn\" (UID: \"5facd201-c917-4a20-86f7-1a95b6604bd1\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-dvjqn" Dec 01 08:39:12 crc kubenswrapper[5004]: I1201 08:39:12.464504 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5facd201-c917-4a20-86f7-1a95b6604bd1-operator-scripts\") pod \"mysqld-exporter-openstack-cell1-db-create-dvjqn\" (UID: \"5facd201-c917-4a20-86f7-1a95b6604bd1\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-dvjqn" Dec 01 08:39:12 crc kubenswrapper[5004]: I1201 08:39:12.565725 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-b8fbc5445-f2sxs" Dec 01 08:39:12 crc kubenswrapper[5004]: I1201 08:39:12.566981 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5facd201-c917-4a20-86f7-1a95b6604bd1-operator-scripts\") pod \"mysqld-exporter-openstack-cell1-db-create-dvjqn\" (UID: \"5facd201-c917-4a20-86f7-1a95b6604bd1\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-dvjqn" Dec 01 08:39:12 crc kubenswrapper[5004]: I1201 08:39:12.567109 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2j2qx\" (UniqueName: 
\"kubernetes.io/projected/5facd201-c917-4a20-86f7-1a95b6604bd1-kube-api-access-2j2qx\") pod \"mysqld-exporter-openstack-cell1-db-create-dvjqn\" (UID: \"5facd201-c917-4a20-86f7-1a95b6604bd1\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-dvjqn" Dec 01 08:39:12 crc kubenswrapper[5004]: I1201 08:39:12.567999 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5facd201-c917-4a20-86f7-1a95b6604bd1-operator-scripts\") pod \"mysqld-exporter-openstack-cell1-db-create-dvjqn\" (UID: \"5facd201-c917-4a20-86f7-1a95b6604bd1\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-dvjqn" Dec 01 08:39:12 crc kubenswrapper[5004]: I1201 08:39:12.816191 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2j2qx\" (UniqueName: \"kubernetes.io/projected/5facd201-c917-4a20-86f7-1a95b6604bd1-kube-api-access-2j2qx\") pod \"mysqld-exporter-openstack-cell1-db-create-dvjqn\" (UID: \"5facd201-c917-4a20-86f7-1a95b6604bd1\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-dvjqn" Dec 01 08:39:12 crc kubenswrapper[5004]: I1201 08:39:12.864958 5004 patch_prober.go:28] interesting pod/console-df4fb84fc-flnws container/console namespace/openshift-console: Readiness probe status=failure output="Get \"https://10.217.0.91:8443/health\": dial tcp 10.217.0.91:8443: i/o timeout" start-of-body= Dec 01 08:39:12 crc kubenswrapper[5004]: I1201 08:39:12.865332 5004 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/console-df4fb84fc-flnws" podUID="a5680092-beb9-4fe4-b35b-4c795980e350" containerName="console" probeResult="failure" output="Get \"https://10.217.0.91:8443/health\": dial tcp 10.217.0.91:8443: i/o timeout" Dec 01 08:39:12 crc kubenswrapper[5004]: I1201 08:39:12.866320 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-bd92-account-create-update-zhnsd"] Dec 01 08:39:12 crc kubenswrapper[5004]: 
I1201 08:39:12.868203 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-bd92-account-create-update-zhnsd" Dec 01 08:39:12 crc kubenswrapper[5004]: I1201 08:39:12.872828 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"mysqld-exporter-openstack-cell1-db-secret" Dec 01 08:39:12 crc kubenswrapper[5004]: I1201 08:39:12.914627 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-bd92-account-create-update-zhnsd"] Dec 01 08:39:12 crc kubenswrapper[5004]: I1201 08:39:12.955545 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-openstack-cell1-db-create-dvjqn" Dec 01 08:39:12 crc kubenswrapper[5004]: I1201 08:39:12.963478 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-b5b4-account-create-update-4c6cr"] Dec 01 08:39:12 crc kubenswrapper[5004]: I1201 08:39:12.976453 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zcrdg\" (UniqueName: \"kubernetes.io/projected/1d4469a9-2f5c-4a4f-a6cf-aa8d58aa8da4-kube-api-access-zcrdg\") pod \"mysqld-exporter-bd92-account-create-update-zhnsd\" (UID: \"1d4469a9-2f5c-4a4f-a6cf-aa8d58aa8da4\") " pod="openstack/mysqld-exporter-bd92-account-create-update-zhnsd" Dec 01 08:39:12 crc kubenswrapper[5004]: I1201 08:39:12.976516 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1d4469a9-2f5c-4a4f-a6cf-aa8d58aa8da4-operator-scripts\") pod \"mysqld-exporter-bd92-account-create-update-zhnsd\" (UID: \"1d4469a9-2f5c-4a4f-a6cf-aa8d58aa8da4\") " pod="openstack/mysqld-exporter-bd92-account-create-update-zhnsd" Dec 01 08:39:13 crc kubenswrapper[5004]: I1201 08:39:13.003609 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-dwhqn"] Dec 01 
08:39:13 crc kubenswrapper[5004]: I1201 08:39:13.039222 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-b5b4-account-create-update-4c6cr" event={"ID":"c0817a2b-0491-4c64-a2d8-0c03a938dd4a","Type":"ContainerStarted","Data":"cec9b4e9898d69687f4442ecc4686660cccdf0893afc97a3fb2984622cafe9c6"} Dec 01 08:39:13 crc kubenswrapper[5004]: I1201 08:39:13.043828 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-4vb89"] Dec 01 08:39:13 crc kubenswrapper[5004]: I1201 08:39:13.044139 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8554648995-4vb89" podUID="abf31963-3bf7-4b6e-adaa-8605634a9530" containerName="dnsmasq-dns" containerID="cri-o://b06071e0f6127f7f903da48eaf50e37a9897f2c96e95fe95a19e061fad7210f3" gracePeriod=10 Dec 01 08:39:13 crc kubenswrapper[5004]: I1201 08:39:13.059425 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-d3bc-account-create-update-d2dfq"] Dec 01 08:39:13 crc kubenswrapper[5004]: I1201 08:39:13.066090 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-clktj" event={"ID":"e5a97177-1085-4eab-a646-c3b849dc73b5","Type":"ContainerStarted","Data":"d5654ead08ebc53482c2c426493beaca3ba2b4868056f8efc786ebf26bc0f01d"} Dec 01 08:39:13 crc kubenswrapper[5004]: I1201 08:39:13.072303 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"c74b987d-43c4-49d9-92fc-f13f3b7b7dd2","Type":"ContainerStarted","Data":"c41b3a9d677c658eecfa2ea7bc9bd4172a1e4f97b3def3ea8e66714dc66f870c"} Dec 01 08:39:13 crc kubenswrapper[5004]: I1201 08:39:13.078228 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-dwhqn" event={"ID":"f716a201-3ee3-486c-ba54-785c9e603805","Type":"ContainerStarted","Data":"91e72775daae17a9fb194af3ffc7a7de0b58cc607820f255ecdd01c7c3ce9466"} Dec 01 08:39:13 crc kubenswrapper[5004]: 
I1201 08:39:13.083225 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zcrdg\" (UniqueName: \"kubernetes.io/projected/1d4469a9-2f5c-4a4f-a6cf-aa8d58aa8da4-kube-api-access-zcrdg\") pod \"mysqld-exporter-bd92-account-create-update-zhnsd\" (UID: \"1d4469a9-2f5c-4a4f-a6cf-aa8d58aa8da4\") " pod="openstack/mysqld-exporter-bd92-account-create-update-zhnsd" Dec 01 08:39:13 crc kubenswrapper[5004]: I1201 08:39:13.083299 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1d4469a9-2f5c-4a4f-a6cf-aa8d58aa8da4-operator-scripts\") pod \"mysqld-exporter-bd92-account-create-update-zhnsd\" (UID: \"1d4469a9-2f5c-4a4f-a6cf-aa8d58aa8da4\") " pod="openstack/mysqld-exporter-bd92-account-create-update-zhnsd" Dec 01 08:39:13 crc kubenswrapper[5004]: I1201 08:39:13.084297 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1d4469a9-2f5c-4a4f-a6cf-aa8d58aa8da4-operator-scripts\") pod \"mysqld-exporter-bd92-account-create-update-zhnsd\" (UID: \"1d4469a9-2f5c-4a4f-a6cf-aa8d58aa8da4\") " pod="openstack/mysqld-exporter-bd92-account-create-update-zhnsd" Dec 01 08:39:13 crc kubenswrapper[5004]: I1201 08:39:13.087142 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-88qsh" event={"ID":"a5e4ccfc-1b99-4ee0-ab46-8e3e53669634","Type":"ContainerStarted","Data":"7d7480261e115075222fa0617873257b35057ada4729aab52996815fb4c5fe07"} Dec 01 08:39:13 crc kubenswrapper[5004]: I1201 08:39:13.087179 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-88qsh" event={"ID":"a5e4ccfc-1b99-4ee0-ab46-8e3e53669634","Type":"ContainerStarted","Data":"a7ea00139a82c3c90525c39c3f1f6b6109b250140097e5e83e09db1a79921cbd"} Dec 01 08:39:13 crc kubenswrapper[5004]: I1201 08:39:13.096784 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/keystone-d3bc-account-create-update-d2dfq" event={"ID":"cf702a13-23f3-471e-b896-03b6e58d429d","Type":"ContainerStarted","Data":"5f423ff57b4d5d42b2f22609ae251d6703d21262500e1a294c9690293d8d7095"} Dec 01 08:39:13 crc kubenswrapper[5004]: I1201 08:39:13.097262 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-clktj" podStartSLOduration=3.129363895 podStartE2EDuration="10.097249627s" podCreationTimestamp="2025-12-01 08:39:03 +0000 UTC" firstStartedPulling="2025-12-01 08:39:04.804769408 +0000 UTC m=+1322.369761400" lastFinishedPulling="2025-12-01 08:39:11.77265515 +0000 UTC m=+1329.337647132" observedRunningTime="2025-12-01 08:39:13.084937836 +0000 UTC m=+1330.649929818" watchObservedRunningTime="2025-12-01 08:39:13.097249627 +0000 UTC m=+1330.662241609" Dec 01 08:39:13 crc kubenswrapper[5004]: I1201 08:39:13.101365 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zcrdg\" (UniqueName: \"kubernetes.io/projected/1d4469a9-2f5c-4a4f-a6cf-aa8d58aa8da4-kube-api-access-zcrdg\") pod \"mysqld-exporter-bd92-account-create-update-zhnsd\" (UID: \"1d4469a9-2f5c-4a4f-a6cf-aa8d58aa8da4\") " pod="openstack/mysqld-exporter-bd92-account-create-update-zhnsd" Dec 01 08:39:13 crc kubenswrapper[5004]: I1201 08:39:13.113996 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-df4fb84fc-flnws_a5680092-beb9-4fe4-b35b-4c795980e350/console/0.log" Dec 01 08:39:13 crc kubenswrapper[5004]: I1201 08:39:13.114065 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-df4fb84fc-flnws" event={"ID":"a5680092-beb9-4fe4-b35b-4c795980e350","Type":"ContainerDied","Data":"df197f448c2301ab71449ed0b2648e5d631dd3b50232f1d832fa12a4258bbe66"} Dec 01 08:39:13 crc kubenswrapper[5004]: I1201 08:39:13.114101 5004 scope.go:117] "RemoveContainer" containerID="e143900ac807a60c11908b700a66e156576792fed4ea3c1340002b772b6d7488" Dec 01 08:39:13 
crc kubenswrapper[5004]: I1201 08:39:13.114173 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-df4fb84fc-flnws" Dec 01 08:39:13 crc kubenswrapper[5004]: I1201 08:39:13.202801 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-bd92-account-create-update-zhnsd" Dec 01 08:39:13 crc kubenswrapper[5004]: I1201 08:39:13.246932 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-df4fb84fc-flnws"] Dec 01 08:39:13 crc kubenswrapper[5004]: I1201 08:39:13.256012 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-df4fb84fc-flnws"] Dec 01 08:39:13 crc kubenswrapper[5004]: I1201 08:39:13.536604 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-4vb89" Dec 01 08:39:13 crc kubenswrapper[5004]: I1201 08:39:13.592975 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/abf31963-3bf7-4b6e-adaa-8605634a9530-config\") pod \"abf31963-3bf7-4b6e-adaa-8605634a9530\" (UID: \"abf31963-3bf7-4b6e-adaa-8605634a9530\") " Dec 01 08:39:13 crc kubenswrapper[5004]: I1201 08:39:13.593173 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/abf31963-3bf7-4b6e-adaa-8605634a9530-ovsdbserver-nb\") pod \"abf31963-3bf7-4b6e-adaa-8605634a9530\" (UID: \"abf31963-3bf7-4b6e-adaa-8605634a9530\") " Dec 01 08:39:13 crc kubenswrapper[5004]: I1201 08:39:13.593204 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/abf31963-3bf7-4b6e-adaa-8605634a9530-ovsdbserver-sb\") pod \"abf31963-3bf7-4b6e-adaa-8605634a9530\" (UID: \"abf31963-3bf7-4b6e-adaa-8605634a9530\") " Dec 01 08:39:13 crc kubenswrapper[5004]: I1201 
08:39:13.593268 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9gcqr\" (UniqueName: \"kubernetes.io/projected/abf31963-3bf7-4b6e-adaa-8605634a9530-kube-api-access-9gcqr\") pod \"abf31963-3bf7-4b6e-adaa-8605634a9530\" (UID: \"abf31963-3bf7-4b6e-adaa-8605634a9530\") " Dec 01 08:39:13 crc kubenswrapper[5004]: I1201 08:39:13.593342 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/abf31963-3bf7-4b6e-adaa-8605634a9530-dns-svc\") pod \"abf31963-3bf7-4b6e-adaa-8605634a9530\" (UID: \"abf31963-3bf7-4b6e-adaa-8605634a9530\") " Dec 01 08:39:13 crc kubenswrapper[5004]: I1201 08:39:13.601520 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/abf31963-3bf7-4b6e-adaa-8605634a9530-kube-api-access-9gcqr" (OuterVolumeSpecName: "kube-api-access-9gcqr") pod "abf31963-3bf7-4b6e-adaa-8605634a9530" (UID: "abf31963-3bf7-4b6e-adaa-8605634a9530"). InnerVolumeSpecName "kube-api-access-9gcqr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:39:13 crc kubenswrapper[5004]: I1201 08:39:13.661480 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-dvjqn"] Dec 01 08:39:13 crc kubenswrapper[5004]: I1201 08:39:13.696428 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9gcqr\" (UniqueName: \"kubernetes.io/projected/abf31963-3bf7-4b6e-adaa-8605634a9530-kube-api-access-9gcqr\") on node \"crc\" DevicePath \"\"" Dec 01 08:39:13 crc kubenswrapper[5004]: I1201 08:39:13.766370 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/abf31963-3bf7-4b6e-adaa-8605634a9530-config" (OuterVolumeSpecName: "config") pod "abf31963-3bf7-4b6e-adaa-8605634a9530" (UID: "abf31963-3bf7-4b6e-adaa-8605634a9530"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:39:13 crc kubenswrapper[5004]: I1201 08:39:13.799511 5004 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/abf31963-3bf7-4b6e-adaa-8605634a9530-config\") on node \"crc\" DevicePath \"\"" Dec 01 08:39:13 crc kubenswrapper[5004]: I1201 08:39:13.814715 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-bd92-account-create-update-zhnsd"] Dec 01 08:39:13 crc kubenswrapper[5004]: W1201 08:39:13.911809 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5facd201_c917_4a20_86f7_1a95b6604bd1.slice/crio-efe559438d2a717ec712e9b842cd0761959ac4f7676bc03eb859c1a1540ee167 WatchSource:0}: Error finding container efe559438d2a717ec712e9b842cd0761959ac4f7676bc03eb859c1a1540ee167: Status 404 returned error can't find the container with id efe559438d2a717ec712e9b842cd0761959ac4f7676bc03eb859c1a1540ee167 Dec 01 08:39:13 crc kubenswrapper[5004]: W1201 08:39:13.916287 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1d4469a9_2f5c_4a4f_a6cf_aa8d58aa8da4.slice/crio-d540756bd15177a711464c45456c7bb761dba669b725c845c0ab9a347b1096e8 WatchSource:0}: Error finding container d540756bd15177a711464c45456c7bb761dba669b725c845c0ab9a347b1096e8: Status 404 returned error can't find the container with id d540756bd15177a711464c45456c7bb761dba669b725c845c0ab9a347b1096e8 Dec 01 08:39:14 crc kubenswrapper[5004]: I1201 08:39:14.140803 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-bd92-account-create-update-zhnsd" event={"ID":"1d4469a9-2f5c-4a4f-a6cf-aa8d58aa8da4","Type":"ContainerStarted","Data":"d540756bd15177a711464c45456c7bb761dba669b725c845c0ab9a347b1096e8"} Dec 01 08:39:14 crc kubenswrapper[5004]: I1201 08:39:14.143103 5004 generic.go:334] 
"Generic (PLEG): container finished" podID="cf702a13-23f3-471e-b896-03b6e58d429d" containerID="95e7525308b38af4759baae3c397c557cb54a7e90e0123a85a516e3116e0c0f5" exitCode=0 Dec 01 08:39:14 crc kubenswrapper[5004]: I1201 08:39:14.143153 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-d3bc-account-create-update-d2dfq" event={"ID":"cf702a13-23f3-471e-b896-03b6e58d429d","Type":"ContainerDied","Data":"95e7525308b38af4759baae3c397c557cb54a7e90e0123a85a516e3116e0c0f5"} Dec 01 08:39:14 crc kubenswrapper[5004]: I1201 08:39:14.146119 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-cell1-db-create-dvjqn" event={"ID":"5facd201-c917-4a20-86f7-1a95b6604bd1","Type":"ContainerStarted","Data":"efe559438d2a717ec712e9b842cd0761959ac4f7676bc03eb859c1a1540ee167"} Dec 01 08:39:14 crc kubenswrapper[5004]: I1201 08:39:14.151640 5004 generic.go:334] "Generic (PLEG): container finished" podID="abf31963-3bf7-4b6e-adaa-8605634a9530" containerID="b06071e0f6127f7f903da48eaf50e37a9897f2c96e95fe95a19e061fad7210f3" exitCode=0 Dec 01 08:39:14 crc kubenswrapper[5004]: I1201 08:39:14.151689 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-4vb89" event={"ID":"abf31963-3bf7-4b6e-adaa-8605634a9530","Type":"ContainerDied","Data":"b06071e0f6127f7f903da48eaf50e37a9897f2c96e95fe95a19e061fad7210f3"} Dec 01 08:39:14 crc kubenswrapper[5004]: I1201 08:39:14.151716 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-4vb89" event={"ID":"abf31963-3bf7-4b6e-adaa-8605634a9530","Type":"ContainerDied","Data":"f0738a447292676b4bb7a8c51cb318e06595e9bdd94da6807b66fb438a2a5d73"} Dec 01 08:39:14 crc kubenswrapper[5004]: I1201 08:39:14.151732 5004 scope.go:117] "RemoveContainer" containerID="b06071e0f6127f7f903da48eaf50e37a9897f2c96e95fe95a19e061fad7210f3" Dec 01 08:39:14 crc kubenswrapper[5004]: I1201 08:39:14.151770 5004 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-4vb89" Dec 01 08:39:14 crc kubenswrapper[5004]: I1201 08:39:14.166003 5004 generic.go:334] "Generic (PLEG): container finished" podID="c0817a2b-0491-4c64-a2d8-0c03a938dd4a" containerID="cdb5df3a5cc301e4c812d5825d2ce15c2b16d6a8caa39f1c5fd496e98e12085f" exitCode=0 Dec 01 08:39:14 crc kubenswrapper[5004]: I1201 08:39:14.166139 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-b5b4-account-create-update-4c6cr" event={"ID":"c0817a2b-0491-4c64-a2d8-0c03a938dd4a","Type":"ContainerDied","Data":"cdb5df3a5cc301e4c812d5825d2ce15c2b16d6a8caa39f1c5fd496e98e12085f"} Dec 01 08:39:14 crc kubenswrapper[5004]: I1201 08:39:14.167974 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/abf31963-3bf7-4b6e-adaa-8605634a9530-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "abf31963-3bf7-4b6e-adaa-8605634a9530" (UID: "abf31963-3bf7-4b6e-adaa-8605634a9530"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:39:14 crc kubenswrapper[5004]: I1201 08:39:14.174516 5004 generic.go:334] "Generic (PLEG): container finished" podID="f716a201-3ee3-486c-ba54-785c9e603805" containerID="fbef74624a42737b16db9753cb53083ffb17a94a36748a8290bcd3c4a7d868a5" exitCode=0 Dec 01 08:39:14 crc kubenswrapper[5004]: I1201 08:39:14.174631 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-dwhqn" event={"ID":"f716a201-3ee3-486c-ba54-785c9e603805","Type":"ContainerDied","Data":"fbef74624a42737b16db9753cb53083ffb17a94a36748a8290bcd3c4a7d868a5"} Dec 01 08:39:14 crc kubenswrapper[5004]: I1201 08:39:14.177468 5004 generic.go:334] "Generic (PLEG): container finished" podID="a5e4ccfc-1b99-4ee0-ab46-8e3e53669634" containerID="7d7480261e115075222fa0617873257b35057ada4729aab52996815fb4c5fe07" exitCode=0 Dec 01 08:39:14 crc kubenswrapper[5004]: I1201 08:39:14.178213 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-88qsh" event={"ID":"a5e4ccfc-1b99-4ee0-ab46-8e3e53669634","Type":"ContainerDied","Data":"7d7480261e115075222fa0617873257b35057ada4729aab52996815fb4c5fe07"} Dec 01 08:39:14 crc kubenswrapper[5004]: I1201 08:39:14.218460 5004 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/abf31963-3bf7-4b6e-adaa-8605634a9530-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 01 08:39:14 crc kubenswrapper[5004]: I1201 08:39:14.393688 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/abf31963-3bf7-4b6e-adaa-8605634a9530-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "abf31963-3bf7-4b6e-adaa-8605634a9530" (UID: "abf31963-3bf7-4b6e-adaa-8605634a9530"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:39:14 crc kubenswrapper[5004]: I1201 08:39:14.410544 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/abf31963-3bf7-4b6e-adaa-8605634a9530-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "abf31963-3bf7-4b6e-adaa-8605634a9530" (UID: "abf31963-3bf7-4b6e-adaa-8605634a9530"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:39:14 crc kubenswrapper[5004]: I1201 08:39:14.446842 5004 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/abf31963-3bf7-4b6e-adaa-8605634a9530-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 01 08:39:14 crc kubenswrapper[5004]: I1201 08:39:14.446876 5004 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/abf31963-3bf7-4b6e-adaa-8605634a9530-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 08:39:14 crc kubenswrapper[5004]: I1201 08:39:14.770206 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5680092-beb9-4fe4-b35b-4c795980e350" path="/var/lib/kubelet/pods/a5680092-beb9-4fe4-b35b-4c795980e350/volumes" Dec 01 08:39:14 crc kubenswrapper[5004]: I1201 08:39:14.891967 5004 scope.go:117] "RemoveContainer" containerID="a01d23049691a77af21d859af56ed16e7ec48e3ec9e1c532027480d15cdf4b5b" Dec 01 08:39:15 crc kubenswrapper[5004]: I1201 08:39:15.060064 5004 scope.go:117] "RemoveContainer" containerID="b06071e0f6127f7f903da48eaf50e37a9897f2c96e95fe95a19e061fad7210f3" Dec 01 08:39:15 crc kubenswrapper[5004]: E1201 08:39:15.061969 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b06071e0f6127f7f903da48eaf50e37a9897f2c96e95fe95a19e061fad7210f3\": container with ID starting with b06071e0f6127f7f903da48eaf50e37a9897f2c96e95fe95a19e061fad7210f3 not found: ID 
does not exist" containerID="b06071e0f6127f7f903da48eaf50e37a9897f2c96e95fe95a19e061fad7210f3" Dec 01 08:39:15 crc kubenswrapper[5004]: I1201 08:39:15.062019 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b06071e0f6127f7f903da48eaf50e37a9897f2c96e95fe95a19e061fad7210f3"} err="failed to get container status \"b06071e0f6127f7f903da48eaf50e37a9897f2c96e95fe95a19e061fad7210f3\": rpc error: code = NotFound desc = could not find container \"b06071e0f6127f7f903da48eaf50e37a9897f2c96e95fe95a19e061fad7210f3\": container with ID starting with b06071e0f6127f7f903da48eaf50e37a9897f2c96e95fe95a19e061fad7210f3 not found: ID does not exist" Dec 01 08:39:15 crc kubenswrapper[5004]: I1201 08:39:15.062061 5004 scope.go:117] "RemoveContainer" containerID="a01d23049691a77af21d859af56ed16e7ec48e3ec9e1c532027480d15cdf4b5b" Dec 01 08:39:15 crc kubenswrapper[5004]: E1201 08:39:15.062381 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a01d23049691a77af21d859af56ed16e7ec48e3ec9e1c532027480d15cdf4b5b\": container with ID starting with a01d23049691a77af21d859af56ed16e7ec48e3ec9e1c532027480d15cdf4b5b not found: ID does not exist" containerID="a01d23049691a77af21d859af56ed16e7ec48e3ec9e1c532027480d15cdf4b5b" Dec 01 08:39:15 crc kubenswrapper[5004]: I1201 08:39:15.062420 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a01d23049691a77af21d859af56ed16e7ec48e3ec9e1c532027480d15cdf4b5b"} err="failed to get container status \"a01d23049691a77af21d859af56ed16e7ec48e3ec9e1c532027480d15cdf4b5b\": rpc error: code = NotFound desc = could not find container \"a01d23049691a77af21d859af56ed16e7ec48e3ec9e1c532027480d15cdf4b5b\": container with ID starting with a01d23049691a77af21d859af56ed16e7ec48e3ec9e1c532027480d15cdf4b5b not found: ID does not exist" Dec 01 08:39:15 crc kubenswrapper[5004]: I1201 08:39:15.112422 5004 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-88qsh" Dec 01 08:39:15 crc kubenswrapper[5004]: I1201 08:39:15.161083 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hpsbc\" (UniqueName: \"kubernetes.io/projected/a5e4ccfc-1b99-4ee0-ab46-8e3e53669634-kube-api-access-hpsbc\") pod \"a5e4ccfc-1b99-4ee0-ab46-8e3e53669634\" (UID: \"a5e4ccfc-1b99-4ee0-ab46-8e3e53669634\") " Dec 01 08:39:15 crc kubenswrapper[5004]: I1201 08:39:15.161220 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a5e4ccfc-1b99-4ee0-ab46-8e3e53669634-operator-scripts\") pod \"a5e4ccfc-1b99-4ee0-ab46-8e3e53669634\" (UID: \"a5e4ccfc-1b99-4ee0-ab46-8e3e53669634\") " Dec 01 08:39:15 crc kubenswrapper[5004]: I1201 08:39:15.163215 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a5e4ccfc-1b99-4ee0-ab46-8e3e53669634-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a5e4ccfc-1b99-4ee0-ab46-8e3e53669634" (UID: "a5e4ccfc-1b99-4ee0-ab46-8e3e53669634"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:39:15 crc kubenswrapper[5004]: I1201 08:39:15.168736 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5e4ccfc-1b99-4ee0-ab46-8e3e53669634-kube-api-access-hpsbc" (OuterVolumeSpecName: "kube-api-access-hpsbc") pod "a5e4ccfc-1b99-4ee0-ab46-8e3e53669634" (UID: "a5e4ccfc-1b99-4ee0-ab46-8e3e53669634"). InnerVolumeSpecName "kube-api-access-hpsbc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:39:15 crc kubenswrapper[5004]: I1201 08:39:15.196603 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"c74b987d-43c4-49d9-92fc-f13f3b7b7dd2","Type":"ContainerStarted","Data":"b57d2b5c7d6ed26f2e6a9db85deba20ccb98973f439f391253510bce5949ec96"} Dec 01 08:39:15 crc kubenswrapper[5004]: I1201 08:39:15.198921 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-88qsh" Dec 01 08:39:15 crc kubenswrapper[5004]: I1201 08:39:15.199422 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-88qsh" event={"ID":"a5e4ccfc-1b99-4ee0-ab46-8e3e53669634","Type":"ContainerDied","Data":"a7ea00139a82c3c90525c39c3f1f6b6109b250140097e5e83e09db1a79921cbd"} Dec 01 08:39:15 crc kubenswrapper[5004]: I1201 08:39:15.199704 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a7ea00139a82c3c90525c39c3f1f6b6109b250140097e5e83e09db1a79921cbd" Dec 01 08:39:15 crc kubenswrapper[5004]: I1201 08:39:15.264006 5004 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a5e4ccfc-1b99-4ee0-ab46-8e3e53669634-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 08:39:15 crc kubenswrapper[5004]: I1201 08:39:15.264043 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hpsbc\" (UniqueName: \"kubernetes.io/projected/a5e4ccfc-1b99-4ee0-ab46-8e3e53669634-kube-api-access-hpsbc\") on node \"crc\" DevicePath \"\"" Dec 01 08:39:15 crc kubenswrapper[5004]: I1201 08:39:15.671517 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-d3bc-account-create-update-d2dfq" Dec 01 08:39:15 crc kubenswrapper[5004]: I1201 08:39:15.726189 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-dwhqn" Dec 01 08:39:15 crc kubenswrapper[5004]: I1201 08:39:15.738060 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-b5b4-account-create-update-4c6cr" Dec 01 08:39:15 crc kubenswrapper[5004]: I1201 08:39:15.779711 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-44n7h\" (UniqueName: \"kubernetes.io/projected/f716a201-3ee3-486c-ba54-785c9e603805-kube-api-access-44n7h\") pod \"f716a201-3ee3-486c-ba54-785c9e603805\" (UID: \"f716a201-3ee3-486c-ba54-785c9e603805\") " Dec 01 08:39:15 crc kubenswrapper[5004]: I1201 08:39:15.779793 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cvxdp\" (UniqueName: \"kubernetes.io/projected/cf702a13-23f3-471e-b896-03b6e58d429d-kube-api-access-cvxdp\") pod \"cf702a13-23f3-471e-b896-03b6e58d429d\" (UID: \"cf702a13-23f3-471e-b896-03b6e58d429d\") " Dec 01 08:39:15 crc kubenswrapper[5004]: I1201 08:39:15.779839 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c0817a2b-0491-4c64-a2d8-0c03a938dd4a-operator-scripts\") pod \"c0817a2b-0491-4c64-a2d8-0c03a938dd4a\" (UID: \"c0817a2b-0491-4c64-a2d8-0c03a938dd4a\") " Dec 01 08:39:15 crc kubenswrapper[5004]: I1201 08:39:15.779864 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f716a201-3ee3-486c-ba54-785c9e603805-operator-scripts\") pod \"f716a201-3ee3-486c-ba54-785c9e603805\" (UID: \"f716a201-3ee3-486c-ba54-785c9e603805\") " Dec 01 08:39:15 crc kubenswrapper[5004]: I1201 08:39:15.780271 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cf702a13-23f3-471e-b896-03b6e58d429d-operator-scripts\") pod 
\"cf702a13-23f3-471e-b896-03b6e58d429d\" (UID: \"cf702a13-23f3-471e-b896-03b6e58d429d\") " Dec 01 08:39:15 crc kubenswrapper[5004]: I1201 08:39:15.780300 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pxg6h\" (UniqueName: \"kubernetes.io/projected/c0817a2b-0491-4c64-a2d8-0c03a938dd4a-kube-api-access-pxg6h\") pod \"c0817a2b-0491-4c64-a2d8-0c03a938dd4a\" (UID: \"c0817a2b-0491-4c64-a2d8-0c03a938dd4a\") " Dec 01 08:39:15 crc kubenswrapper[5004]: I1201 08:39:15.781030 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0817a2b-0491-4c64-a2d8-0c03a938dd4a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c0817a2b-0491-4c64-a2d8-0c03a938dd4a" (UID: "c0817a2b-0491-4c64-a2d8-0c03a938dd4a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:39:15 crc kubenswrapper[5004]: I1201 08:39:15.781277 5004 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c0817a2b-0491-4c64-a2d8-0c03a938dd4a-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 08:39:15 crc kubenswrapper[5004]: I1201 08:39:15.781306 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf702a13-23f3-471e-b896-03b6e58d429d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "cf702a13-23f3-471e-b896-03b6e58d429d" (UID: "cf702a13-23f3-471e-b896-03b6e58d429d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:39:15 crc kubenswrapper[5004]: I1201 08:39:15.781340 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f716a201-3ee3-486c-ba54-785c9e603805-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f716a201-3ee3-486c-ba54-785c9e603805" (UID: "f716a201-3ee3-486c-ba54-785c9e603805"). 
InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:39:15 crc kubenswrapper[5004]: I1201 08:39:15.788723 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0817a2b-0491-4c64-a2d8-0c03a938dd4a-kube-api-access-pxg6h" (OuterVolumeSpecName: "kube-api-access-pxg6h") pod "c0817a2b-0491-4c64-a2d8-0c03a938dd4a" (UID: "c0817a2b-0491-4c64-a2d8-0c03a938dd4a"). InnerVolumeSpecName "kube-api-access-pxg6h". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:39:15 crc kubenswrapper[5004]: I1201 08:39:15.788858 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f716a201-3ee3-486c-ba54-785c9e603805-kube-api-access-44n7h" (OuterVolumeSpecName: "kube-api-access-44n7h") pod "f716a201-3ee3-486c-ba54-785c9e603805" (UID: "f716a201-3ee3-486c-ba54-785c9e603805"). InnerVolumeSpecName "kube-api-access-44n7h". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:39:15 crc kubenswrapper[5004]: I1201 08:39:15.789678 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf702a13-23f3-471e-b896-03b6e58d429d-kube-api-access-cvxdp" (OuterVolumeSpecName: "kube-api-access-cvxdp") pod "cf702a13-23f3-471e-b896-03b6e58d429d" (UID: "cf702a13-23f3-471e-b896-03b6e58d429d"). InnerVolumeSpecName "kube-api-access-cvxdp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:39:15 crc kubenswrapper[5004]: I1201 08:39:15.884717 5004 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cf702a13-23f3-471e-b896-03b6e58d429d-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 08:39:15 crc kubenswrapper[5004]: I1201 08:39:15.884990 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pxg6h\" (UniqueName: \"kubernetes.io/projected/c0817a2b-0491-4c64-a2d8-0c03a938dd4a-kube-api-access-pxg6h\") on node \"crc\" DevicePath \"\"" Dec 01 08:39:15 crc kubenswrapper[5004]: I1201 08:39:15.885017 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-44n7h\" (UniqueName: \"kubernetes.io/projected/f716a201-3ee3-486c-ba54-785c9e603805-kube-api-access-44n7h\") on node \"crc\" DevicePath \"\"" Dec 01 08:39:15 crc kubenswrapper[5004]: I1201 08:39:15.885031 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cvxdp\" (UniqueName: \"kubernetes.io/projected/cf702a13-23f3-471e-b896-03b6e58d429d-kube-api-access-cvxdp\") on node \"crc\" DevicePath \"\"" Dec 01 08:39:15 crc kubenswrapper[5004]: I1201 08:39:15.885043 5004 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f716a201-3ee3-486c-ba54-785c9e603805-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 08:39:16 crc kubenswrapper[5004]: I1201 08:39:16.213388 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-b5b4-account-create-update-4c6cr" event={"ID":"c0817a2b-0491-4c64-a2d8-0c03a938dd4a","Type":"ContainerDied","Data":"cec9b4e9898d69687f4442ecc4686660cccdf0893afc97a3fb2984622cafe9c6"} Dec 01 08:39:16 crc kubenswrapper[5004]: I1201 08:39:16.213446 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cec9b4e9898d69687f4442ecc4686660cccdf0893afc97a3fb2984622cafe9c6" Dec 01 
08:39:16 crc kubenswrapper[5004]: I1201 08:39:16.213413 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-b5b4-account-create-update-4c6cr" Dec 01 08:39:16 crc kubenswrapper[5004]: I1201 08:39:16.216180 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-dwhqn" Dec 01 08:39:16 crc kubenswrapper[5004]: I1201 08:39:16.216165 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-dwhqn" event={"ID":"f716a201-3ee3-486c-ba54-785c9e603805","Type":"ContainerDied","Data":"91e72775daae17a9fb194af3ffc7a7de0b58cc607820f255ecdd01c7c3ce9466"} Dec 01 08:39:16 crc kubenswrapper[5004]: I1201 08:39:16.216495 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="91e72775daae17a9fb194af3ffc7a7de0b58cc607820f255ecdd01c7c3ce9466" Dec 01 08:39:16 crc kubenswrapper[5004]: I1201 08:39:16.217883 5004 generic.go:334] "Generic (PLEG): container finished" podID="1d4469a9-2f5c-4a4f-a6cf-aa8d58aa8da4" containerID="247a94e6863d939428ba1456ca3b6a08df9811c1d6d875563e57d4221da23638" exitCode=0 Dec 01 08:39:16 crc kubenswrapper[5004]: I1201 08:39:16.217966 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-bd92-account-create-update-zhnsd" event={"ID":"1d4469a9-2f5c-4a4f-a6cf-aa8d58aa8da4","Type":"ContainerDied","Data":"247a94e6863d939428ba1456ca3b6a08df9811c1d6d875563e57d4221da23638"} Dec 01 08:39:16 crc kubenswrapper[5004]: I1201 08:39:16.223417 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-d3bc-account-create-update-d2dfq" event={"ID":"cf702a13-23f3-471e-b896-03b6e58d429d","Type":"ContainerDied","Data":"5f423ff57b4d5d42b2f22609ae251d6703d21262500e1a294c9690293d8d7095"} Dec 01 08:39:16 crc kubenswrapper[5004]: I1201 08:39:16.223702 5004 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="5f423ff57b4d5d42b2f22609ae251d6703d21262500e1a294c9690293d8d7095" Dec 01 08:39:16 crc kubenswrapper[5004]: I1201 08:39:16.223525 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-d3bc-account-create-update-d2dfq" Dec 01 08:39:16 crc kubenswrapper[5004]: I1201 08:39:16.246770 5004 generic.go:334] "Generic (PLEG): container finished" podID="5facd201-c917-4a20-86f7-1a95b6604bd1" containerID="d0a59018b3622bb84896710d76b6dff37ddcd7c9d8cf9f4f30a8b43dc3695768" exitCode=0 Dec 01 08:39:16 crc kubenswrapper[5004]: I1201 08:39:16.246820 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-cell1-db-create-dvjqn" event={"ID":"5facd201-c917-4a20-86f7-1a95b6604bd1","Type":"ContainerDied","Data":"d0a59018b3622bb84896710d76b6dff37ddcd7c9d8cf9f4f30a8b43dc3695768"} Dec 01 08:39:18 crc kubenswrapper[5004]: I1201 08:39:18.545333 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-openstack-cell1-db-create-dvjqn" Dec 01 08:39:18 crc kubenswrapper[5004]: I1201 08:39:18.552109 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-bd92-account-create-update-zhnsd" Dec 01 08:39:18 crc kubenswrapper[5004]: I1201 08:39:18.646220 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5facd201-c917-4a20-86f7-1a95b6604bd1-operator-scripts\") pod \"5facd201-c917-4a20-86f7-1a95b6604bd1\" (UID: \"5facd201-c917-4a20-86f7-1a95b6604bd1\") " Dec 01 08:39:18 crc kubenswrapper[5004]: I1201 08:39:18.646617 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2j2qx\" (UniqueName: \"kubernetes.io/projected/5facd201-c917-4a20-86f7-1a95b6604bd1-kube-api-access-2j2qx\") pod \"5facd201-c917-4a20-86f7-1a95b6604bd1\" (UID: \"5facd201-c917-4a20-86f7-1a95b6604bd1\") " Dec 01 08:39:18 crc kubenswrapper[5004]: I1201 08:39:18.646926 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1d4469a9-2f5c-4a4f-a6cf-aa8d58aa8da4-operator-scripts\") pod \"1d4469a9-2f5c-4a4f-a6cf-aa8d58aa8da4\" (UID: \"1d4469a9-2f5c-4a4f-a6cf-aa8d58aa8da4\") " Dec 01 08:39:18 crc kubenswrapper[5004]: I1201 08:39:18.646987 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zcrdg\" (UniqueName: \"kubernetes.io/projected/1d4469a9-2f5c-4a4f-a6cf-aa8d58aa8da4-kube-api-access-zcrdg\") pod \"1d4469a9-2f5c-4a4f-a6cf-aa8d58aa8da4\" (UID: \"1d4469a9-2f5c-4a4f-a6cf-aa8d58aa8da4\") " Dec 01 08:39:18 crc kubenswrapper[5004]: I1201 08:39:18.647524 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5facd201-c917-4a20-86f7-1a95b6604bd1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5facd201-c917-4a20-86f7-1a95b6604bd1" (UID: "5facd201-c917-4a20-86f7-1a95b6604bd1"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:39:18 crc kubenswrapper[5004]: I1201 08:39:18.647657 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d4469a9-2f5c-4a4f-a6cf-aa8d58aa8da4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1d4469a9-2f5c-4a4f-a6cf-aa8d58aa8da4" (UID: "1d4469a9-2f5c-4a4f-a6cf-aa8d58aa8da4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:39:18 crc kubenswrapper[5004]: I1201 08:39:18.648624 5004 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1d4469a9-2f5c-4a4f-a6cf-aa8d58aa8da4-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 08:39:18 crc kubenswrapper[5004]: I1201 08:39:18.648667 5004 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5facd201-c917-4a20-86f7-1a95b6604bd1-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 08:39:18 crc kubenswrapper[5004]: I1201 08:39:18.652806 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5facd201-c917-4a20-86f7-1a95b6604bd1-kube-api-access-2j2qx" (OuterVolumeSpecName: "kube-api-access-2j2qx") pod "5facd201-c917-4a20-86f7-1a95b6604bd1" (UID: "5facd201-c917-4a20-86f7-1a95b6604bd1"). InnerVolumeSpecName "kube-api-access-2j2qx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:39:18 crc kubenswrapper[5004]: I1201 08:39:18.656004 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d4469a9-2f5c-4a4f-a6cf-aa8d58aa8da4-kube-api-access-zcrdg" (OuterVolumeSpecName: "kube-api-access-zcrdg") pod "1d4469a9-2f5c-4a4f-a6cf-aa8d58aa8da4" (UID: "1d4469a9-2f5c-4a4f-a6cf-aa8d58aa8da4"). InnerVolumeSpecName "kube-api-access-zcrdg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:39:18 crc kubenswrapper[5004]: I1201 08:39:18.750810 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zcrdg\" (UniqueName: \"kubernetes.io/projected/1d4469a9-2f5c-4a4f-a6cf-aa8d58aa8da4-kube-api-access-zcrdg\") on node \"crc\" DevicePath \"\"" Dec 01 08:39:18 crc kubenswrapper[5004]: I1201 08:39:18.750853 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2j2qx\" (UniqueName: \"kubernetes.io/projected/5facd201-c917-4a20-86f7-1a95b6604bd1-kube-api-access-2j2qx\") on node \"crc\" DevicePath \"\"" Dec 01 08:39:19 crc kubenswrapper[5004]: I1201 08:39:19.280631 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-bd92-account-create-update-zhnsd" event={"ID":"1d4469a9-2f5c-4a4f-a6cf-aa8d58aa8da4","Type":"ContainerDied","Data":"d540756bd15177a711464c45456c7bb761dba669b725c845c0ab9a347b1096e8"} Dec 01 08:39:19 crc kubenswrapper[5004]: I1201 08:39:19.280695 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d540756bd15177a711464c45456c7bb761dba669b725c845c0ab9a347b1096e8" Dec 01 08:39:19 crc kubenswrapper[5004]: I1201 08:39:19.280731 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-bd92-account-create-update-zhnsd" Dec 01 08:39:19 crc kubenswrapper[5004]: I1201 08:39:19.281652 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-cell1-db-create-dvjqn" event={"ID":"5facd201-c917-4a20-86f7-1a95b6604bd1","Type":"ContainerDied","Data":"efe559438d2a717ec712e9b842cd0761959ac4f7676bc03eb859c1a1540ee167"} Dec 01 08:39:19 crc kubenswrapper[5004]: I1201 08:39:19.281675 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="efe559438d2a717ec712e9b842cd0761959ac4f7676bc03eb859c1a1540ee167" Dec 01 08:39:19 crc kubenswrapper[5004]: I1201 08:39:19.281682 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-openstack-cell1-db-create-dvjqn" Dec 01 08:39:19 crc kubenswrapper[5004]: I1201 08:39:19.284630 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"c74b987d-43c4-49d9-92fc-f13f3b7b7dd2","Type":"ContainerStarted","Data":"f5e4833da82e765e056046bb7bdddc857cd7685f25b5126b4083f490bdf5907e"} Dec 01 08:39:19 crc kubenswrapper[5004]: I1201 08:39:19.330956 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=10.882825307 podStartE2EDuration="57.330927385s" podCreationTimestamp="2025-12-01 08:38:22 +0000 UTC" firstStartedPulling="2025-12-01 08:38:31.991478506 +0000 UTC m=+1289.556470488" lastFinishedPulling="2025-12-01 08:39:18.439580574 +0000 UTC m=+1336.004572566" observedRunningTime="2025-12-01 08:39:19.317489856 +0000 UTC m=+1336.882481858" watchObservedRunningTime="2025-12-01 08:39:19.330927385 +0000 UTC m=+1336.895919397" Dec 01 08:39:19 crc kubenswrapper[5004]: I1201 08:39:19.485950 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/projected/64792f42-dd08-4537-bce9-a632e644cf5a-etc-swift\") pod \"swift-storage-0\" (UID: \"64792f42-dd08-4537-bce9-a632e644cf5a\") " pod="openstack/swift-storage-0" Dec 01 08:39:19 crc kubenswrapper[5004]: I1201 08:39:19.493988 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/64792f42-dd08-4537-bce9-a632e644cf5a-etc-swift\") pod \"swift-storage-0\" (UID: \"64792f42-dd08-4537-bce9-a632e644cf5a\") " pod="openstack/swift-storage-0" Dec 01 08:39:19 crc kubenswrapper[5004]: I1201 08:39:19.538622 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Dec 01 08:39:20 crc kubenswrapper[5004]: I1201 08:39:20.214263 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Dec 01 08:39:20 crc kubenswrapper[5004]: W1201 08:39:20.215810 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod64792f42_dd08_4537_bce9_a632e644cf5a.slice/crio-950721c324b2312c2cda71363c4064bc31f6ec0f1c80693dbfce1c081e3c5d4e WatchSource:0}: Error finding container 950721c324b2312c2cda71363c4064bc31f6ec0f1c80693dbfce1c081e3c5d4e: Status 404 returned error can't find the container with id 950721c324b2312c2cda71363c4064bc31f6ec0f1c80693dbfce1c081e3c5d4e Dec 01 08:39:20 crc kubenswrapper[5004]: I1201 08:39:20.294174 5004 generic.go:334] "Generic (PLEG): container finished" podID="e5a97177-1085-4eab-a646-c3b849dc73b5" containerID="d5654ead08ebc53482c2c426493beaca3ba2b4868056f8efc786ebf26bc0f01d" exitCode=0 Dec 01 08:39:20 crc kubenswrapper[5004]: I1201 08:39:20.294252 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-clktj" event={"ID":"e5a97177-1085-4eab-a646-c3b849dc73b5","Type":"ContainerDied","Data":"d5654ead08ebc53482c2c426493beaca3ba2b4868056f8efc786ebf26bc0f01d"} Dec 01 08:39:20 crc kubenswrapper[5004]: 
I1201 08:39:20.297358 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"64792f42-dd08-4537-bce9-a632e644cf5a","Type":"ContainerStarted","Data":"950721c324b2312c2cda71363c4064bc31f6ec0f1c80693dbfce1c081e3c5d4e"} Dec 01 08:39:20 crc kubenswrapper[5004]: I1201 08:39:20.299908 5004 generic.go:334] "Generic (PLEG): container finished" podID="e17a426c-0069-4c51-91ad-e5fbf6e0bb2a" containerID="ccde5507419ccdb6bb1307ad7276e94115097f8e8b951d4a4f702511b46356d2" exitCode=0 Dec 01 08:39:20 crc kubenswrapper[5004]: I1201 08:39:20.299996 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e17a426c-0069-4c51-91ad-e5fbf6e0bb2a","Type":"ContainerDied","Data":"ccde5507419ccdb6bb1307ad7276e94115097f8e8b951d4a4f702511b46356d2"} Dec 01 08:39:20 crc kubenswrapper[5004]: I1201 08:39:20.678092 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-vp28t"] Dec 01 08:39:20 crc kubenswrapper[5004]: E1201 08:39:20.678728 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0817a2b-0491-4c64-a2d8-0c03a938dd4a" containerName="mariadb-account-create-update" Dec 01 08:39:20 crc kubenswrapper[5004]: I1201 08:39:20.678746 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0817a2b-0491-4c64-a2d8-0c03a938dd4a" containerName="mariadb-account-create-update" Dec 01 08:39:20 crc kubenswrapper[5004]: E1201 08:39:20.678762 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf702a13-23f3-471e-b896-03b6e58d429d" containerName="mariadb-account-create-update" Dec 01 08:39:20 crc kubenswrapper[5004]: I1201 08:39:20.678770 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf702a13-23f3-471e-b896-03b6e58d429d" containerName="mariadb-account-create-update" Dec 01 08:39:20 crc kubenswrapper[5004]: E1201 08:39:20.678800 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abf31963-3bf7-4b6e-adaa-8605634a9530" 
containerName="dnsmasq-dns" Dec 01 08:39:20 crc kubenswrapper[5004]: I1201 08:39:20.678805 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="abf31963-3bf7-4b6e-adaa-8605634a9530" containerName="dnsmasq-dns" Dec 01 08:39:20 crc kubenswrapper[5004]: E1201 08:39:20.678819 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f716a201-3ee3-486c-ba54-785c9e603805" containerName="mariadb-database-create" Dec 01 08:39:20 crc kubenswrapper[5004]: I1201 08:39:20.678826 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="f716a201-3ee3-486c-ba54-785c9e603805" containerName="mariadb-database-create" Dec 01 08:39:20 crc kubenswrapper[5004]: E1201 08:39:20.678843 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abf31963-3bf7-4b6e-adaa-8605634a9530" containerName="init" Dec 01 08:39:20 crc kubenswrapper[5004]: I1201 08:39:20.678848 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="abf31963-3bf7-4b6e-adaa-8605634a9530" containerName="init" Dec 01 08:39:20 crc kubenswrapper[5004]: E1201 08:39:20.678859 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d4469a9-2f5c-4a4f-a6cf-aa8d58aa8da4" containerName="mariadb-account-create-update" Dec 01 08:39:20 crc kubenswrapper[5004]: I1201 08:39:20.678865 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d4469a9-2f5c-4a4f-a6cf-aa8d58aa8da4" containerName="mariadb-account-create-update" Dec 01 08:39:20 crc kubenswrapper[5004]: E1201 08:39:20.678876 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5e4ccfc-1b99-4ee0-ab46-8e3e53669634" containerName="mariadb-database-create" Dec 01 08:39:20 crc kubenswrapper[5004]: I1201 08:39:20.678881 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5e4ccfc-1b99-4ee0-ab46-8e3e53669634" containerName="mariadb-database-create" Dec 01 08:39:20 crc kubenswrapper[5004]: E1201 08:39:20.678893 5004 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="5facd201-c917-4a20-86f7-1a95b6604bd1" containerName="mariadb-database-create" Dec 01 08:39:20 crc kubenswrapper[5004]: I1201 08:39:20.678898 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="5facd201-c917-4a20-86f7-1a95b6604bd1" containerName="mariadb-database-create" Dec 01 08:39:20 crc kubenswrapper[5004]: I1201 08:39:20.679065 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="abf31963-3bf7-4b6e-adaa-8605634a9530" containerName="dnsmasq-dns" Dec 01 08:39:20 crc kubenswrapper[5004]: I1201 08:39:20.679082 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="f716a201-3ee3-486c-ba54-785c9e603805" containerName="mariadb-database-create" Dec 01 08:39:20 crc kubenswrapper[5004]: I1201 08:39:20.679117 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d4469a9-2f5c-4a4f-a6cf-aa8d58aa8da4" containerName="mariadb-account-create-update" Dec 01 08:39:20 crc kubenswrapper[5004]: I1201 08:39:20.679129 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5e4ccfc-1b99-4ee0-ab46-8e3e53669634" containerName="mariadb-database-create" Dec 01 08:39:20 crc kubenswrapper[5004]: I1201 08:39:20.679139 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0817a2b-0491-4c64-a2d8-0c03a938dd4a" containerName="mariadb-account-create-update" Dec 01 08:39:20 crc kubenswrapper[5004]: I1201 08:39:20.679155 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="5facd201-c917-4a20-86f7-1a95b6604bd1" containerName="mariadb-database-create" Dec 01 08:39:20 crc kubenswrapper[5004]: I1201 08:39:20.679167 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf702a13-23f3-471e-b896-03b6e58d429d" containerName="mariadb-account-create-update" Dec 01 08:39:20 crc kubenswrapper[5004]: I1201 08:39:20.679829 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-vp28t" Dec 01 08:39:20 crc kubenswrapper[5004]: I1201 08:39:20.681724 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Dec 01 08:39:20 crc kubenswrapper[5004]: I1201 08:39:20.682245 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-fldwv" Dec 01 08:39:20 crc kubenswrapper[5004]: I1201 08:39:20.706537 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-vp28t"] Dec 01 08:39:20 crc kubenswrapper[5004]: I1201 08:39:20.741266 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96f41ff9-6619-4262-8ecb-0a577f611f68-combined-ca-bundle\") pod \"glance-db-sync-vp28t\" (UID: \"96f41ff9-6619-4262-8ecb-0a577f611f68\") " pod="openstack/glance-db-sync-vp28t" Dec 01 08:39:20 crc kubenswrapper[5004]: I1201 08:39:20.741320 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96f41ff9-6619-4262-8ecb-0a577f611f68-config-data\") pod \"glance-db-sync-vp28t\" (UID: \"96f41ff9-6619-4262-8ecb-0a577f611f68\") " pod="openstack/glance-db-sync-vp28t" Dec 01 08:39:20 crc kubenswrapper[5004]: I1201 08:39:20.741347 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/96f41ff9-6619-4262-8ecb-0a577f611f68-db-sync-config-data\") pod \"glance-db-sync-vp28t\" (UID: \"96f41ff9-6619-4262-8ecb-0a577f611f68\") " pod="openstack/glance-db-sync-vp28t" Dec 01 08:39:20 crc kubenswrapper[5004]: I1201 08:39:20.741492 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8drr\" (UniqueName: 
\"kubernetes.io/projected/96f41ff9-6619-4262-8ecb-0a577f611f68-kube-api-access-g8drr\") pod \"glance-db-sync-vp28t\" (UID: \"96f41ff9-6619-4262-8ecb-0a577f611f68\") " pod="openstack/glance-db-sync-vp28t" Dec 01 08:39:20 crc kubenswrapper[5004]: I1201 08:39:20.843146 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96f41ff9-6619-4262-8ecb-0a577f611f68-combined-ca-bundle\") pod \"glance-db-sync-vp28t\" (UID: \"96f41ff9-6619-4262-8ecb-0a577f611f68\") " pod="openstack/glance-db-sync-vp28t" Dec 01 08:39:20 crc kubenswrapper[5004]: I1201 08:39:20.843430 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96f41ff9-6619-4262-8ecb-0a577f611f68-config-data\") pod \"glance-db-sync-vp28t\" (UID: \"96f41ff9-6619-4262-8ecb-0a577f611f68\") " pod="openstack/glance-db-sync-vp28t" Dec 01 08:39:20 crc kubenswrapper[5004]: I1201 08:39:20.843455 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/96f41ff9-6619-4262-8ecb-0a577f611f68-db-sync-config-data\") pod \"glance-db-sync-vp28t\" (UID: \"96f41ff9-6619-4262-8ecb-0a577f611f68\") " pod="openstack/glance-db-sync-vp28t" Dec 01 08:39:20 crc kubenswrapper[5004]: I1201 08:39:20.843545 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8drr\" (UniqueName: \"kubernetes.io/projected/96f41ff9-6619-4262-8ecb-0a577f611f68-kube-api-access-g8drr\") pod \"glance-db-sync-vp28t\" (UID: \"96f41ff9-6619-4262-8ecb-0a577f611f68\") " pod="openstack/glance-db-sync-vp28t" Dec 01 08:39:20 crc kubenswrapper[5004]: I1201 08:39:20.848619 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96f41ff9-6619-4262-8ecb-0a577f611f68-combined-ca-bundle\") pod \"glance-db-sync-vp28t\" 
(UID: \"96f41ff9-6619-4262-8ecb-0a577f611f68\") " pod="openstack/glance-db-sync-vp28t" Dec 01 08:39:20 crc kubenswrapper[5004]: I1201 08:39:20.951481 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/96f41ff9-6619-4262-8ecb-0a577f611f68-db-sync-config-data\") pod \"glance-db-sync-vp28t\" (UID: \"96f41ff9-6619-4262-8ecb-0a577f611f68\") " pod="openstack/glance-db-sync-vp28t" Dec 01 08:39:20 crc kubenswrapper[5004]: I1201 08:39:20.951892 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96f41ff9-6619-4262-8ecb-0a577f611f68-config-data\") pod \"glance-db-sync-vp28t\" (UID: \"96f41ff9-6619-4262-8ecb-0a577f611f68\") " pod="openstack/glance-db-sync-vp28t" Dec 01 08:39:20 crc kubenswrapper[5004]: I1201 08:39:20.954342 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8drr\" (UniqueName: \"kubernetes.io/projected/96f41ff9-6619-4262-8ecb-0a577f611f68-kube-api-access-g8drr\") pod \"glance-db-sync-vp28t\" (UID: \"96f41ff9-6619-4262-8ecb-0a577f611f68\") " pod="openstack/glance-db-sync-vp28t" Dec 01 08:39:21 crc kubenswrapper[5004]: I1201 08:39:21.000153 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-vp28t" Dec 01 08:39:21 crc kubenswrapper[5004]: I1201 08:39:21.325291 5004 generic.go:334] "Generic (PLEG): container finished" podID="b571e2e5-2a78-45af-83aa-3d874b2569b3" containerID="17c1164a1a9ddf12e0f7bb16f0fda29c357c85933a430ff5b061e9d6f7746e89" exitCode=0 Dec 01 08:39:21 crc kubenswrapper[5004]: I1201 08:39:21.325833 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b571e2e5-2a78-45af-83aa-3d874b2569b3","Type":"ContainerDied","Data":"17c1164a1a9ddf12e0f7bb16f0fda29c357c85933a430ff5b061e9d6f7746e89"} Dec 01 08:39:21 crc kubenswrapper[5004]: I1201 08:39:21.332900 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e17a426c-0069-4c51-91ad-e5fbf6e0bb2a","Type":"ContainerStarted","Data":"5c7f10624a418d25374fc5f6d483787b7f89ecdac57f34c3dd622c1e3de143e9"} Dec 01 08:39:21 crc kubenswrapper[5004]: I1201 08:39:21.333899 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Dec 01 08:39:21 crc kubenswrapper[5004]: I1201 08:39:21.379953 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=53.111046799 podStartE2EDuration="1m6.379934815s" podCreationTimestamp="2025-12-01 08:38:15 +0000 UTC" firstStartedPulling="2025-12-01 08:38:31.531161679 +0000 UTC m=+1289.096153671" lastFinishedPulling="2025-12-01 08:38:44.800049665 +0000 UTC m=+1302.365041687" observedRunningTime="2025-12-01 08:39:21.374896636 +0000 UTC m=+1338.939888618" watchObservedRunningTime="2025-12-01 08:39:21.379934815 +0000 UTC m=+1338.944926797" Dec 01 08:39:21 crc kubenswrapper[5004]: I1201 08:39:21.466217 5004 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-r8x2b" podUID="effd853b-0b95-4749-8119-88fcfaf8b0c0" containerName="ovn-controller" probeResult="failure" output=< Dec 01 08:39:21 crc 
kubenswrapper[5004]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Dec 01 08:39:21 crc kubenswrapper[5004]: > Dec 01 08:39:21 crc kubenswrapper[5004]: I1201 08:39:21.494447 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-8qrgn" Dec 01 08:39:21 crc kubenswrapper[5004]: I1201 08:39:21.555265 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-8qrgn" Dec 01 08:39:21 crc kubenswrapper[5004]: I1201 08:39:21.650325 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-vp28t"] Dec 01 08:39:21 crc kubenswrapper[5004]: I1201 08:39:21.820386 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-r8x2b-config-t8m5h"] Dec 01 08:39:21 crc kubenswrapper[5004]: I1201 08:39:21.821755 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-r8x2b-config-t8m5h" Dec 01 08:39:21 crc kubenswrapper[5004]: I1201 08:39:21.827979 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Dec 01 08:39:21 crc kubenswrapper[5004]: I1201 08:39:21.859306 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-r8x2b-config-t8m5h"] Dec 01 08:39:21 crc kubenswrapper[5004]: I1201 08:39:21.870891 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e9a74158-8447-40e0-91ae-556ce9096972-var-run-ovn\") pod \"ovn-controller-r8x2b-config-t8m5h\" (UID: \"e9a74158-8447-40e0-91ae-556ce9096972\") " pod="openstack/ovn-controller-r8x2b-config-t8m5h" Dec 01 08:39:21 crc kubenswrapper[5004]: I1201 08:39:21.870967 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlzcx\" (UniqueName: 
\"kubernetes.io/projected/e9a74158-8447-40e0-91ae-556ce9096972-kube-api-access-rlzcx\") pod \"ovn-controller-r8x2b-config-t8m5h\" (UID: \"e9a74158-8447-40e0-91ae-556ce9096972\") " pod="openstack/ovn-controller-r8x2b-config-t8m5h" Dec 01 08:39:21 crc kubenswrapper[5004]: I1201 08:39:21.870987 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e9a74158-8447-40e0-91ae-556ce9096972-var-run\") pod \"ovn-controller-r8x2b-config-t8m5h\" (UID: \"e9a74158-8447-40e0-91ae-556ce9096972\") " pod="openstack/ovn-controller-r8x2b-config-t8m5h" Dec 01 08:39:21 crc kubenswrapper[5004]: I1201 08:39:21.871013 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/e9a74158-8447-40e0-91ae-556ce9096972-additional-scripts\") pod \"ovn-controller-r8x2b-config-t8m5h\" (UID: \"e9a74158-8447-40e0-91ae-556ce9096972\") " pod="openstack/ovn-controller-r8x2b-config-t8m5h" Dec 01 08:39:21 crc kubenswrapper[5004]: I1201 08:39:21.871068 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e9a74158-8447-40e0-91ae-556ce9096972-scripts\") pod \"ovn-controller-r8x2b-config-t8m5h\" (UID: \"e9a74158-8447-40e0-91ae-556ce9096972\") " pod="openstack/ovn-controller-r8x2b-config-t8m5h" Dec 01 08:39:21 crc kubenswrapper[5004]: I1201 08:39:21.871108 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e9a74158-8447-40e0-91ae-556ce9096972-var-log-ovn\") pod \"ovn-controller-r8x2b-config-t8m5h\" (UID: \"e9a74158-8447-40e0-91ae-556ce9096972\") " pod="openstack/ovn-controller-r8x2b-config-t8m5h" Dec 01 08:39:21 crc kubenswrapper[5004]: I1201 08:39:21.971725 5004 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e9a74158-8447-40e0-91ae-556ce9096972-scripts\") pod \"ovn-controller-r8x2b-config-t8m5h\" (UID: \"e9a74158-8447-40e0-91ae-556ce9096972\") " pod="openstack/ovn-controller-r8x2b-config-t8m5h" Dec 01 08:39:21 crc kubenswrapper[5004]: I1201 08:39:21.971812 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e9a74158-8447-40e0-91ae-556ce9096972-var-log-ovn\") pod \"ovn-controller-r8x2b-config-t8m5h\" (UID: \"e9a74158-8447-40e0-91ae-556ce9096972\") " pod="openstack/ovn-controller-r8x2b-config-t8m5h" Dec 01 08:39:21 crc kubenswrapper[5004]: I1201 08:39:21.971859 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e9a74158-8447-40e0-91ae-556ce9096972-var-run-ovn\") pod \"ovn-controller-r8x2b-config-t8m5h\" (UID: \"e9a74158-8447-40e0-91ae-556ce9096972\") " pod="openstack/ovn-controller-r8x2b-config-t8m5h" Dec 01 08:39:21 crc kubenswrapper[5004]: I1201 08:39:21.971929 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rlzcx\" (UniqueName: \"kubernetes.io/projected/e9a74158-8447-40e0-91ae-556ce9096972-kube-api-access-rlzcx\") pod \"ovn-controller-r8x2b-config-t8m5h\" (UID: \"e9a74158-8447-40e0-91ae-556ce9096972\") " pod="openstack/ovn-controller-r8x2b-config-t8m5h" Dec 01 08:39:21 crc kubenswrapper[5004]: I1201 08:39:21.971951 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e9a74158-8447-40e0-91ae-556ce9096972-var-run\") pod \"ovn-controller-r8x2b-config-t8m5h\" (UID: \"e9a74158-8447-40e0-91ae-556ce9096972\") " pod="openstack/ovn-controller-r8x2b-config-t8m5h" Dec 01 08:39:21 crc kubenswrapper[5004]: I1201 08:39:21.971974 5004 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/e9a74158-8447-40e0-91ae-556ce9096972-additional-scripts\") pod \"ovn-controller-r8x2b-config-t8m5h\" (UID: \"e9a74158-8447-40e0-91ae-556ce9096972\") " pod="openstack/ovn-controller-r8x2b-config-t8m5h" Dec 01 08:39:21 crc kubenswrapper[5004]: I1201 08:39:21.972660 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/e9a74158-8447-40e0-91ae-556ce9096972-additional-scripts\") pod \"ovn-controller-r8x2b-config-t8m5h\" (UID: \"e9a74158-8447-40e0-91ae-556ce9096972\") " pod="openstack/ovn-controller-r8x2b-config-t8m5h" Dec 01 08:39:21 crc kubenswrapper[5004]: I1201 08:39:21.972920 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e9a74158-8447-40e0-91ae-556ce9096972-var-log-ovn\") pod \"ovn-controller-r8x2b-config-t8m5h\" (UID: \"e9a74158-8447-40e0-91ae-556ce9096972\") " pod="openstack/ovn-controller-r8x2b-config-t8m5h" Dec 01 08:39:21 crc kubenswrapper[5004]: I1201 08:39:21.972961 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e9a74158-8447-40e0-91ae-556ce9096972-var-run-ovn\") pod \"ovn-controller-r8x2b-config-t8m5h\" (UID: \"e9a74158-8447-40e0-91ae-556ce9096972\") " pod="openstack/ovn-controller-r8x2b-config-t8m5h" Dec 01 08:39:21 crc kubenswrapper[5004]: I1201 08:39:21.973225 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e9a74158-8447-40e0-91ae-556ce9096972-var-run\") pod \"ovn-controller-r8x2b-config-t8m5h\" (UID: \"e9a74158-8447-40e0-91ae-556ce9096972\") " pod="openstack/ovn-controller-r8x2b-config-t8m5h" Dec 01 08:39:21 crc kubenswrapper[5004]: I1201 08:39:21.973529 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/e9a74158-8447-40e0-91ae-556ce9096972-scripts\") pod \"ovn-controller-r8x2b-config-t8m5h\" (UID: \"e9a74158-8447-40e0-91ae-556ce9096972\") " pod="openstack/ovn-controller-r8x2b-config-t8m5h" Dec 01 08:39:22 crc kubenswrapper[5004]: I1201 08:39:22.024412 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlzcx\" (UniqueName: \"kubernetes.io/projected/e9a74158-8447-40e0-91ae-556ce9096972-kube-api-access-rlzcx\") pod \"ovn-controller-r8x2b-config-t8m5h\" (UID: \"e9a74158-8447-40e0-91ae-556ce9096972\") " pod="openstack/ovn-controller-r8x2b-config-t8m5h" Dec 01 08:39:22 crc kubenswrapper[5004]: I1201 08:39:22.138973 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-r8x2b-config-t8m5h" Dec 01 08:39:22 crc kubenswrapper[5004]: W1201 08:39:22.271853 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod96f41ff9_6619_4262_8ecb_0a577f611f68.slice/crio-c16361b2a3792b7c2aa5d59a9818c300496c7aa7c644ca47d0869d6854577c83 WatchSource:0}: Error finding container c16361b2a3792b7c2aa5d59a9818c300496c7aa7c644ca47d0869d6854577c83: Status 404 returned error can't find the container with id c16361b2a3792b7c2aa5d59a9818c300496c7aa7c644ca47d0869d6854577c83 Dec 01 08:39:22 crc kubenswrapper[5004]: I1201 08:39:22.369145 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-vp28t" event={"ID":"96f41ff9-6619-4262-8ecb-0a577f611f68","Type":"ContainerStarted","Data":"c16361b2a3792b7c2aa5d59a9818c300496c7aa7c644ca47d0869d6854577c83"} Dec 01 08:39:22 crc kubenswrapper[5004]: I1201 08:39:22.371091 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-clktj" event={"ID":"e5a97177-1085-4eab-a646-c3b849dc73b5","Type":"ContainerDied","Data":"4f1610ece8cf7f124d14bf143b111bc057655b6f52d40cd61a992a01f2e8fcf4"} Dec 01 08:39:22 crc 
kubenswrapper[5004]: I1201 08:39:22.371118 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4f1610ece8cf7f124d14bf143b111bc057655b6f52d40cd61a992a01f2e8fcf4" Dec 01 08:39:22 crc kubenswrapper[5004]: I1201 08:39:22.392850 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b571e2e5-2a78-45af-83aa-3d874b2569b3","Type":"ContainerStarted","Data":"8781fe91187bcd78fc6abd4d08c3d2787079b1137a12a60f1c7b65f90c2a6635"} Dec 01 08:39:22 crc kubenswrapper[5004]: I1201 08:39:22.394004 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Dec 01 08:39:22 crc kubenswrapper[5004]: I1201 08:39:22.396030 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-clktj" Dec 01 08:39:22 crc kubenswrapper[5004]: I1201 08:39:22.430764 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=53.195035749 podStartE2EDuration="1m7.430744494s" podCreationTimestamp="2025-12-01 08:38:15 +0000 UTC" firstStartedPulling="2025-12-01 08:38:32.016329795 +0000 UTC m=+1289.581321777" lastFinishedPulling="2025-12-01 08:38:46.25203853 +0000 UTC m=+1303.817030522" observedRunningTime="2025-12-01 08:39:22.425258105 +0000 UTC m=+1339.990250127" watchObservedRunningTime="2025-12-01 08:39:22.430744494 +0000 UTC m=+1339.995736476" Dec 01 08:39:22 crc kubenswrapper[5004]: I1201 08:39:22.586169 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nbgkf\" (UniqueName: \"kubernetes.io/projected/e5a97177-1085-4eab-a646-c3b849dc73b5-kube-api-access-nbgkf\") pod \"e5a97177-1085-4eab-a646-c3b849dc73b5\" (UID: \"e5a97177-1085-4eab-a646-c3b849dc73b5\") " Dec 01 08:39:22 crc kubenswrapper[5004]: I1201 08:39:22.586685 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e5a97177-1085-4eab-a646-c3b849dc73b5-dispersionconf\") pod \"e5a97177-1085-4eab-a646-c3b849dc73b5\" (UID: \"e5a97177-1085-4eab-a646-c3b849dc73b5\") " Dec 01 08:39:22 crc kubenswrapper[5004]: I1201 08:39:22.586760 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e5a97177-1085-4eab-a646-c3b849dc73b5-ring-data-devices\") pod \"e5a97177-1085-4eab-a646-c3b849dc73b5\" (UID: \"e5a97177-1085-4eab-a646-c3b849dc73b5\") " Dec 01 08:39:22 crc kubenswrapper[5004]: I1201 08:39:22.586831 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e5a97177-1085-4eab-a646-c3b849dc73b5-scripts\") pod \"e5a97177-1085-4eab-a646-c3b849dc73b5\" (UID: \"e5a97177-1085-4eab-a646-c3b849dc73b5\") " Dec 01 08:39:22 crc kubenswrapper[5004]: I1201 08:39:22.586962 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e5a97177-1085-4eab-a646-c3b849dc73b5-swiftconf\") pod \"e5a97177-1085-4eab-a646-c3b849dc73b5\" (UID: \"e5a97177-1085-4eab-a646-c3b849dc73b5\") " Dec 01 08:39:22 crc kubenswrapper[5004]: I1201 08:39:22.587042 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5a97177-1085-4eab-a646-c3b849dc73b5-combined-ca-bundle\") pod \"e5a97177-1085-4eab-a646-c3b849dc73b5\" (UID: \"e5a97177-1085-4eab-a646-c3b849dc73b5\") " Dec 01 08:39:22 crc kubenswrapper[5004]: I1201 08:39:22.587084 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e5a97177-1085-4eab-a646-c3b849dc73b5-etc-swift\") pod \"e5a97177-1085-4eab-a646-c3b849dc73b5\" (UID: \"e5a97177-1085-4eab-a646-c3b849dc73b5\") " Dec 01 08:39:22 crc kubenswrapper[5004]: 
I1201 08:39:22.591010 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5a97177-1085-4eab-a646-c3b849dc73b5-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "e5a97177-1085-4eab-a646-c3b849dc73b5" (UID: "e5a97177-1085-4eab-a646-c3b849dc73b5"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:39:22 crc kubenswrapper[5004]: I1201 08:39:22.591548 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5a97177-1085-4eab-a646-c3b849dc73b5-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "e5a97177-1085-4eab-a646-c3b849dc73b5" (UID: "e5a97177-1085-4eab-a646-c3b849dc73b5"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 08:39:22 crc kubenswrapper[5004]: I1201 08:39:22.593202 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5a97177-1085-4eab-a646-c3b849dc73b5-kube-api-access-nbgkf" (OuterVolumeSpecName: "kube-api-access-nbgkf") pod "e5a97177-1085-4eab-a646-c3b849dc73b5" (UID: "e5a97177-1085-4eab-a646-c3b849dc73b5"). InnerVolumeSpecName "kube-api-access-nbgkf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:39:22 crc kubenswrapper[5004]: I1201 08:39:22.602695 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5a97177-1085-4eab-a646-c3b849dc73b5-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "e5a97177-1085-4eab-a646-c3b849dc73b5" (UID: "e5a97177-1085-4eab-a646-c3b849dc73b5"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:39:22 crc kubenswrapper[5004]: I1201 08:39:22.623949 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5a97177-1085-4eab-a646-c3b849dc73b5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e5a97177-1085-4eab-a646-c3b849dc73b5" (UID: "e5a97177-1085-4eab-a646-c3b849dc73b5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:39:22 crc kubenswrapper[5004]: I1201 08:39:22.625702 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5a97177-1085-4eab-a646-c3b849dc73b5-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "e5a97177-1085-4eab-a646-c3b849dc73b5" (UID: "e5a97177-1085-4eab-a646-c3b849dc73b5"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:39:22 crc kubenswrapper[5004]: I1201 08:39:22.646975 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5a97177-1085-4eab-a646-c3b849dc73b5-scripts" (OuterVolumeSpecName: "scripts") pod "e5a97177-1085-4eab-a646-c3b849dc73b5" (UID: "e5a97177-1085-4eab-a646-c3b849dc73b5"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:39:22 crc kubenswrapper[5004]: I1201 08:39:22.689819 5004 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5a97177-1085-4eab-a646-c3b849dc73b5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 08:39:22 crc kubenswrapper[5004]: I1201 08:39:22.689853 5004 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e5a97177-1085-4eab-a646-c3b849dc73b5-etc-swift\") on node \"crc\" DevicePath \"\"" Dec 01 08:39:22 crc kubenswrapper[5004]: I1201 08:39:22.689863 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nbgkf\" (UniqueName: \"kubernetes.io/projected/e5a97177-1085-4eab-a646-c3b849dc73b5-kube-api-access-nbgkf\") on node \"crc\" DevicePath \"\"" Dec 01 08:39:22 crc kubenswrapper[5004]: I1201 08:39:22.689872 5004 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e5a97177-1085-4eab-a646-c3b849dc73b5-dispersionconf\") on node \"crc\" DevicePath \"\"" Dec 01 08:39:22 crc kubenswrapper[5004]: I1201 08:39:22.689881 5004 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e5a97177-1085-4eab-a646-c3b849dc73b5-ring-data-devices\") on node \"crc\" DevicePath \"\"" Dec 01 08:39:22 crc kubenswrapper[5004]: I1201 08:39:22.689889 5004 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e5a97177-1085-4eab-a646-c3b849dc73b5-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 08:39:22 crc kubenswrapper[5004]: I1201 08:39:22.689900 5004 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e5a97177-1085-4eab-a646-c3b849dc73b5-swiftconf\") on node \"crc\" DevicePath \"\"" Dec 01 08:39:22 crc kubenswrapper[5004]: I1201 08:39:22.841879 5004 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-r8x2b-config-t8m5h"] Dec 01 08:39:23 crc kubenswrapper[5004]: I1201 08:39:23.017840 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-0"] Dec 01 08:39:23 crc kubenswrapper[5004]: E1201 08:39:23.019387 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5a97177-1085-4eab-a646-c3b849dc73b5" containerName="swift-ring-rebalance" Dec 01 08:39:23 crc kubenswrapper[5004]: I1201 08:39:23.019413 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5a97177-1085-4eab-a646-c3b849dc73b5" containerName="swift-ring-rebalance" Dec 01 08:39:23 crc kubenswrapper[5004]: I1201 08:39:23.024096 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5a97177-1085-4eab-a646-c3b849dc73b5" containerName="swift-ring-rebalance" Dec 01 08:39:23 crc kubenswrapper[5004]: I1201 08:39:23.025252 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-0" Dec 01 08:39:23 crc kubenswrapper[5004]: I1201 08:39:23.029995 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"mysqld-exporter-config-data" Dec 01 08:39:23 crc kubenswrapper[5004]: I1201 08:39:23.055655 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-0"] Dec 01 08:39:23 crc kubenswrapper[5004]: I1201 08:39:23.204230 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/110193dd-2d1c-4ae1-86b3-985b039a16f0-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"110193dd-2d1c-4ae1-86b3-985b039a16f0\") " pod="openstack/mysqld-exporter-0" Dec 01 08:39:23 crc kubenswrapper[5004]: I1201 08:39:23.204476 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4xhx\" (UniqueName: 
\"kubernetes.io/projected/110193dd-2d1c-4ae1-86b3-985b039a16f0-kube-api-access-q4xhx\") pod \"mysqld-exporter-0\" (UID: \"110193dd-2d1c-4ae1-86b3-985b039a16f0\") " pod="openstack/mysqld-exporter-0" Dec 01 08:39:23 crc kubenswrapper[5004]: I1201 08:39:23.204711 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/110193dd-2d1c-4ae1-86b3-985b039a16f0-config-data\") pod \"mysqld-exporter-0\" (UID: \"110193dd-2d1c-4ae1-86b3-985b039a16f0\") " pod="openstack/mysqld-exporter-0" Dec 01 08:39:23 crc kubenswrapper[5004]: I1201 08:39:23.306506 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/110193dd-2d1c-4ae1-86b3-985b039a16f0-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"110193dd-2d1c-4ae1-86b3-985b039a16f0\") " pod="openstack/mysqld-exporter-0" Dec 01 08:39:23 crc kubenswrapper[5004]: I1201 08:39:23.306659 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4xhx\" (UniqueName: \"kubernetes.io/projected/110193dd-2d1c-4ae1-86b3-985b039a16f0-kube-api-access-q4xhx\") pod \"mysqld-exporter-0\" (UID: \"110193dd-2d1c-4ae1-86b3-985b039a16f0\") " pod="openstack/mysqld-exporter-0" Dec 01 08:39:23 crc kubenswrapper[5004]: I1201 08:39:23.306734 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/110193dd-2d1c-4ae1-86b3-985b039a16f0-config-data\") pod \"mysqld-exporter-0\" (UID: \"110193dd-2d1c-4ae1-86b3-985b039a16f0\") " pod="openstack/mysqld-exporter-0" Dec 01 08:39:23 crc kubenswrapper[5004]: I1201 08:39:23.312708 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/110193dd-2d1c-4ae1-86b3-985b039a16f0-config-data\") pod \"mysqld-exporter-0\" (UID: \"110193dd-2d1c-4ae1-86b3-985b039a16f0\") 
" pod="openstack/mysqld-exporter-0" Dec 01 08:39:23 crc kubenswrapper[5004]: I1201 08:39:23.312740 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/110193dd-2d1c-4ae1-86b3-985b039a16f0-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"110193dd-2d1c-4ae1-86b3-985b039a16f0\") " pod="openstack/mysqld-exporter-0" Dec 01 08:39:23 crc kubenswrapper[5004]: I1201 08:39:23.329120 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4xhx\" (UniqueName: \"kubernetes.io/projected/110193dd-2d1c-4ae1-86b3-985b039a16f0-kube-api-access-q4xhx\") pod \"mysqld-exporter-0\" (UID: \"110193dd-2d1c-4ae1-86b3-985b039a16f0\") " pod="openstack/mysqld-exporter-0" Dec 01 08:39:23 crc kubenswrapper[5004]: I1201 08:39:23.391173 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-0" Dec 01 08:39:23 crc kubenswrapper[5004]: I1201 08:39:23.404791 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-r8x2b-config-t8m5h" event={"ID":"e9a74158-8447-40e0-91ae-556ce9096972","Type":"ContainerStarted","Data":"ec680bb32d9915bd23ca29120151d3cfa36003f257b9d23c42450708b4ce7365"} Dec 01 08:39:23 crc kubenswrapper[5004]: I1201 08:39:23.407446 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"64792f42-dd08-4537-bce9-a632e644cf5a","Type":"ContainerStarted","Data":"7981b54f8465851727e2e4b1b9d4b1988481ba19a30e1de29e6a406e9131cf5d"} Dec 01 08:39:23 crc kubenswrapper[5004]: I1201 08:39:23.407483 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"64792f42-dd08-4537-bce9-a632e644cf5a","Type":"ContainerStarted","Data":"212bce7c4243482023791b86cadad0a94a3ed0273081fc603684474f77ed0300"} Dec 01 08:39:23 crc kubenswrapper[5004]: I1201 08:39:23.407551 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-clktj" Dec 01 08:39:23 crc kubenswrapper[5004]: I1201 08:39:23.516376 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Dec 01 08:39:23 crc kubenswrapper[5004]: I1201 08:39:23.516735 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Dec 01 08:39:23 crc kubenswrapper[5004]: I1201 08:39:23.519020 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Dec 01 08:39:24 crc kubenswrapper[5004]: I1201 08:39:24.048543 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-0"] Dec 01 08:39:24 crc kubenswrapper[5004]: W1201 08:39:24.051486 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod110193dd_2d1c_4ae1_86b3_985b039a16f0.slice/crio-4e9ac53ce4f5cae561b79ba8b1c2d86a347f4912461ebf56af1b8c2be1c9d7eb WatchSource:0}: Error finding container 4e9ac53ce4f5cae561b79ba8b1c2d86a347f4912461ebf56af1b8c2be1c9d7eb: Status 404 returned error can't find the container with id 4e9ac53ce4f5cae561b79ba8b1c2d86a347f4912461ebf56af1b8c2be1c9d7eb Dec 01 08:39:24 crc kubenswrapper[5004]: I1201 08:39:24.056068 5004 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 01 08:39:24 crc kubenswrapper[5004]: I1201 08:39:24.417237 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"110193dd-2d1c-4ae1-86b3-985b039a16f0","Type":"ContainerStarted","Data":"4e9ac53ce4f5cae561b79ba8b1c2d86a347f4912461ebf56af1b8c2be1c9d7eb"} Dec 01 08:39:24 crc kubenswrapper[5004]: I1201 08:39:24.420212 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Dec 01 08:39:25 crc kubenswrapper[5004]: I1201 
08:39:25.469892 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-r8x2b-config-t8m5h" event={"ID":"e9a74158-8447-40e0-91ae-556ce9096972","Type":"ContainerStarted","Data":"fd6b84e7f410a429e25a4656ba7585b1a342e9cf42fc1d50125a3d8802012aa0"} Dec 01 08:39:25 crc kubenswrapper[5004]: I1201 08:39:25.473735 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"64792f42-dd08-4537-bce9-a632e644cf5a","Type":"ContainerStarted","Data":"ed053f5c2718db06659c6e5f9e75b8e9a5ddddf2058eb71586a0d179560ae818"} Dec 01 08:39:25 crc kubenswrapper[5004]: I1201 08:39:25.502731 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-r8x2b-config-t8m5h" podStartSLOduration=4.502705174 podStartE2EDuration="4.502705174s" podCreationTimestamp="2025-12-01 08:39:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:39:25.492918252 +0000 UTC m=+1343.057910244" watchObservedRunningTime="2025-12-01 08:39:25.502705174 +0000 UTC m=+1343.067697146" Dec 01 08:39:26 crc kubenswrapper[5004]: I1201 08:39:26.478132 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-r8x2b" Dec 01 08:39:26 crc kubenswrapper[5004]: I1201 08:39:26.492013 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"64792f42-dd08-4537-bce9-a632e644cf5a","Type":"ContainerStarted","Data":"e5438f01f76931fcdf5b57d8ee96d262696f3bca62f7e370dd46c2bcfea590ac"} Dec 01 08:39:26 crc kubenswrapper[5004]: I1201 08:39:26.495011 5004 generic.go:334] "Generic (PLEG): container finished" podID="e9a74158-8447-40e0-91ae-556ce9096972" containerID="fd6b84e7f410a429e25a4656ba7585b1a342e9cf42fc1d50125a3d8802012aa0" exitCode=0 Dec 01 08:39:26 crc kubenswrapper[5004]: I1201 08:39:26.495041 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-controller-r8x2b-config-t8m5h" event={"ID":"e9a74158-8447-40e0-91ae-556ce9096972","Type":"ContainerDied","Data":"fd6b84e7f410a429e25a4656ba7585b1a342e9cf42fc1d50125a3d8802012aa0"} Dec 01 08:39:27 crc kubenswrapper[5004]: I1201 08:39:27.094758 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 01 08:39:27 crc kubenswrapper[5004]: I1201 08:39:27.095011 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="c74b987d-43c4-49d9-92fc-f13f3b7b7dd2" containerName="prometheus" containerID="cri-o://c41b3a9d677c658eecfa2ea7bc9bd4172a1e4f97b3def3ea8e66714dc66f870c" gracePeriod=600 Dec 01 08:39:27 crc kubenswrapper[5004]: I1201 08:39:27.095154 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="c74b987d-43c4-49d9-92fc-f13f3b7b7dd2" containerName="thanos-sidecar" containerID="cri-o://f5e4833da82e765e056046bb7bdddc857cd7685f25b5126b4083f490bdf5907e" gracePeriod=600 Dec 01 08:39:27 crc kubenswrapper[5004]: I1201 08:39:27.095414 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="c74b987d-43c4-49d9-92fc-f13f3b7b7dd2" containerName="config-reloader" containerID="cri-o://b57d2b5c7d6ed26f2e6a9db85deba20ccb98973f439f391253510bce5949ec96" gracePeriod=600 Dec 01 08:39:27 crc kubenswrapper[5004]: I1201 08:39:27.526355 5004 generic.go:334] "Generic (PLEG): container finished" podID="c74b987d-43c4-49d9-92fc-f13f3b7b7dd2" containerID="f5e4833da82e765e056046bb7bdddc857cd7685f25b5126b4083f490bdf5907e" exitCode=0 Dec 01 08:39:27 crc kubenswrapper[5004]: I1201 08:39:27.526683 5004 generic.go:334] "Generic (PLEG): container finished" podID="c74b987d-43c4-49d9-92fc-f13f3b7b7dd2" containerID="c41b3a9d677c658eecfa2ea7bc9bd4172a1e4f97b3def3ea8e66714dc66f870c" exitCode=0 Dec 01 08:39:27 crc kubenswrapper[5004]: 
I1201 08:39:27.526832 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"c74b987d-43c4-49d9-92fc-f13f3b7b7dd2","Type":"ContainerDied","Data":"f5e4833da82e765e056046bb7bdddc857cd7685f25b5126b4083f490bdf5907e"} Dec 01 08:39:27 crc kubenswrapper[5004]: I1201 08:39:27.526856 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"c74b987d-43c4-49d9-92fc-f13f3b7b7dd2","Type":"ContainerDied","Data":"c41b3a9d677c658eecfa2ea7bc9bd4172a1e4f97b3def3ea8e66714dc66f870c"} Dec 01 08:39:27 crc kubenswrapper[5004]: I1201 08:39:27.926738 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-r8x2b-config-t8m5h" Dec 01 08:39:28 crc kubenswrapper[5004]: I1201 08:39:28.012875 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e9a74158-8447-40e0-91ae-556ce9096972-scripts\") pod \"e9a74158-8447-40e0-91ae-556ce9096972\" (UID: \"e9a74158-8447-40e0-91ae-556ce9096972\") " Dec 01 08:39:28 crc kubenswrapper[5004]: I1201 08:39:28.012961 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rlzcx\" (UniqueName: \"kubernetes.io/projected/e9a74158-8447-40e0-91ae-556ce9096972-kube-api-access-rlzcx\") pod \"e9a74158-8447-40e0-91ae-556ce9096972\" (UID: \"e9a74158-8447-40e0-91ae-556ce9096972\") " Dec 01 08:39:28 crc kubenswrapper[5004]: I1201 08:39:28.013061 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e9a74158-8447-40e0-91ae-556ce9096972-var-run\") pod \"e9a74158-8447-40e0-91ae-556ce9096972\" (UID: \"e9a74158-8447-40e0-91ae-556ce9096972\") " Dec 01 08:39:28 crc kubenswrapper[5004]: I1201 08:39:28.013087 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: 
\"kubernetes.io/host-path/e9a74158-8447-40e0-91ae-556ce9096972-var-log-ovn\") pod \"e9a74158-8447-40e0-91ae-556ce9096972\" (UID: \"e9a74158-8447-40e0-91ae-556ce9096972\") " Dec 01 08:39:28 crc kubenswrapper[5004]: I1201 08:39:28.013123 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/e9a74158-8447-40e0-91ae-556ce9096972-additional-scripts\") pod \"e9a74158-8447-40e0-91ae-556ce9096972\" (UID: \"e9a74158-8447-40e0-91ae-556ce9096972\") " Dec 01 08:39:28 crc kubenswrapper[5004]: I1201 08:39:28.013230 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e9a74158-8447-40e0-91ae-556ce9096972-var-run-ovn\") pod \"e9a74158-8447-40e0-91ae-556ce9096972\" (UID: \"e9a74158-8447-40e0-91ae-556ce9096972\") " Dec 01 08:39:28 crc kubenswrapper[5004]: I1201 08:39:28.013648 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e9a74158-8447-40e0-91ae-556ce9096972-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "e9a74158-8447-40e0-91ae-556ce9096972" (UID: "e9a74158-8447-40e0-91ae-556ce9096972"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 08:39:28 crc kubenswrapper[5004]: I1201 08:39:28.013677 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e9a74158-8447-40e0-91ae-556ce9096972-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "e9a74158-8447-40e0-91ae-556ce9096972" (UID: "e9a74158-8447-40e0-91ae-556ce9096972"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 08:39:28 crc kubenswrapper[5004]: I1201 08:39:28.013670 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e9a74158-8447-40e0-91ae-556ce9096972-var-run" (OuterVolumeSpecName: "var-run") pod "e9a74158-8447-40e0-91ae-556ce9096972" (UID: "e9a74158-8447-40e0-91ae-556ce9096972"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 08:39:28 crc kubenswrapper[5004]: I1201 08:39:28.014750 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e9a74158-8447-40e0-91ae-556ce9096972-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "e9a74158-8447-40e0-91ae-556ce9096972" (UID: "e9a74158-8447-40e0-91ae-556ce9096972"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:39:28 crc kubenswrapper[5004]: I1201 08:39:28.014863 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e9a74158-8447-40e0-91ae-556ce9096972-scripts" (OuterVolumeSpecName: "scripts") pod "e9a74158-8447-40e0-91ae-556ce9096972" (UID: "e9a74158-8447-40e0-91ae-556ce9096972"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:39:28 crc kubenswrapper[5004]: I1201 08:39:28.019462 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9a74158-8447-40e0-91ae-556ce9096972-kube-api-access-rlzcx" (OuterVolumeSpecName: "kube-api-access-rlzcx") pod "e9a74158-8447-40e0-91ae-556ce9096972" (UID: "e9a74158-8447-40e0-91ae-556ce9096972"). InnerVolumeSpecName "kube-api-access-rlzcx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:39:28 crc kubenswrapper[5004]: I1201 08:39:28.118008 5004 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e9a74158-8447-40e0-91ae-556ce9096972-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 08:39:28 crc kubenswrapper[5004]: I1201 08:39:28.118037 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rlzcx\" (UniqueName: \"kubernetes.io/projected/e9a74158-8447-40e0-91ae-556ce9096972-kube-api-access-rlzcx\") on node \"crc\" DevicePath \"\"" Dec 01 08:39:28 crc kubenswrapper[5004]: I1201 08:39:28.118048 5004 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e9a74158-8447-40e0-91ae-556ce9096972-var-run\") on node \"crc\" DevicePath \"\"" Dec 01 08:39:28 crc kubenswrapper[5004]: I1201 08:39:28.118057 5004 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e9a74158-8447-40e0-91ae-556ce9096972-var-log-ovn\") on node \"crc\" DevicePath \"\"" Dec 01 08:39:28 crc kubenswrapper[5004]: I1201 08:39:28.118067 5004 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/e9a74158-8447-40e0-91ae-556ce9096972-additional-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 08:39:28 crc kubenswrapper[5004]: I1201 08:39:28.118075 5004 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e9a74158-8447-40e0-91ae-556ce9096972-var-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 01 08:39:28 crc kubenswrapper[5004]: I1201 08:39:28.550133 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"110193dd-2d1c-4ae1-86b3-985b039a16f0","Type":"ContainerStarted","Data":"1ed44d3c6abede1281aa4a670ff6d7d20fd5b8f0daf2f9f0651e0beb75b0f07f"} Dec 01 08:39:28 crc kubenswrapper[5004]: I1201 
08:39:28.553420 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-r8x2b-config-t8m5h" event={"ID":"e9a74158-8447-40e0-91ae-556ce9096972","Type":"ContainerDied","Data":"ec680bb32d9915bd23ca29120151d3cfa36003f257b9d23c42450708b4ce7365"} Dec 01 08:39:28 crc kubenswrapper[5004]: I1201 08:39:28.553451 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ec680bb32d9915bd23ca29120151d3cfa36003f257b9d23c42450708b4ce7365" Dec 01 08:39:28 crc kubenswrapper[5004]: I1201 08:39:28.553529 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-r8x2b-config-t8m5h" Dec 01 08:39:28 crc kubenswrapper[5004]: I1201 08:39:28.572876 5004 generic.go:334] "Generic (PLEG): container finished" podID="c74b987d-43c4-49d9-92fc-f13f3b7b7dd2" containerID="b57d2b5c7d6ed26f2e6a9db85deba20ccb98973f439f391253510bce5949ec96" exitCode=0 Dec 01 08:39:28 crc kubenswrapper[5004]: I1201 08:39:28.572950 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"c74b987d-43c4-49d9-92fc-f13f3b7b7dd2","Type":"ContainerDied","Data":"b57d2b5c7d6ed26f2e6a9db85deba20ccb98973f439f391253510bce5949ec96"} Dec 01 08:39:28 crc kubenswrapper[5004]: I1201 08:39:28.603232 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mysqld-exporter-0" podStartSLOduration=3.294968322 podStartE2EDuration="6.603213431s" podCreationTimestamp="2025-12-01 08:39:22 +0000 UTC" firstStartedPulling="2025-12-01 08:39:24.05580691 +0000 UTC m=+1341.620798892" lastFinishedPulling="2025-12-01 08:39:27.364052019 +0000 UTC m=+1344.929044001" observedRunningTime="2025-12-01 08:39:28.59515524 +0000 UTC m=+1346.160147252" watchObservedRunningTime="2025-12-01 08:39:28.603213431 +0000 UTC m=+1346.168205413" Dec 01 08:39:28 crc kubenswrapper[5004]: I1201 08:39:28.637332 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 01 08:39:28 crc kubenswrapper[5004]: I1201 08:39:28.698178 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-r8x2b-config-t8m5h"] Dec 01 08:39:28 crc kubenswrapper[5004]: I1201 08:39:28.710871 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-r8x2b-config-t8m5h"] Dec 01 08:39:28 crc kubenswrapper[5004]: I1201 08:39:28.729682 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c74b987d-43c4-49d9-92fc-f13f3b7b7dd2-thanos-prometheus-http-client-file\") pod \"c74b987d-43c4-49d9-92fc-f13f3b7b7dd2\" (UID: \"c74b987d-43c4-49d9-92fc-f13f3b7b7dd2\") " Dec 01 08:39:28 crc kubenswrapper[5004]: I1201 08:39:28.730395 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kxgx2\" (UniqueName: \"kubernetes.io/projected/c74b987d-43c4-49d9-92fc-f13f3b7b7dd2-kube-api-access-kxgx2\") pod \"c74b987d-43c4-49d9-92fc-f13f3b7b7dd2\" (UID: \"c74b987d-43c4-49d9-92fc-f13f3b7b7dd2\") " Dec 01 08:39:28 crc kubenswrapper[5004]: I1201 08:39:28.730716 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c74b987d-43c4-49d9-92fc-f13f3b7b7dd2-web-config\") pod \"c74b987d-43c4-49d9-92fc-f13f3b7b7dd2\" (UID: \"c74b987d-43c4-49d9-92fc-f13f3b7b7dd2\") " Dec 01 08:39:28 crc kubenswrapper[5004]: I1201 08:39:28.731454 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"c74b987d-43c4-49d9-92fc-f13f3b7b7dd2\" (UID: \"c74b987d-43c4-49d9-92fc-f13f3b7b7dd2\") " Dec 01 08:39:28 crc kubenswrapper[5004]: I1201 08:39:28.731642 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c74b987d-43c4-49d9-92fc-f13f3b7b7dd2-prometheus-metric-storage-rulefiles-0\") pod \"c74b987d-43c4-49d9-92fc-f13f3b7b7dd2\" (UID: \"c74b987d-43c4-49d9-92fc-f13f3b7b7dd2\") " Dec 01 08:39:28 crc kubenswrapper[5004]: I1201 08:39:28.731770 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c74b987d-43c4-49d9-92fc-f13f3b7b7dd2-config\") pod \"c74b987d-43c4-49d9-92fc-f13f3b7b7dd2\" (UID: \"c74b987d-43c4-49d9-92fc-f13f3b7b7dd2\") " Dec 01 08:39:28 crc kubenswrapper[5004]: I1201 08:39:28.731924 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c74b987d-43c4-49d9-92fc-f13f3b7b7dd2-tls-assets\") pod \"c74b987d-43c4-49d9-92fc-f13f3b7b7dd2\" (UID: \"c74b987d-43c4-49d9-92fc-f13f3b7b7dd2\") " Dec 01 08:39:28 crc kubenswrapper[5004]: I1201 08:39:28.732046 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c74b987d-43c4-49d9-92fc-f13f3b7b7dd2-config-out\") pod \"c74b987d-43c4-49d9-92fc-f13f3b7b7dd2\" (UID: \"c74b987d-43c4-49d9-92fc-f13f3b7b7dd2\") " Dec 01 08:39:28 crc kubenswrapper[5004]: I1201 08:39:28.735550 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c74b987d-43c4-49d9-92fc-f13f3b7b7dd2-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "c74b987d-43c4-49d9-92fc-f13f3b7b7dd2" (UID: "c74b987d-43c4-49d9-92fc-f13f3b7b7dd2"). InnerVolumeSpecName "thanos-prometheus-http-client-file". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:39:28 crc kubenswrapper[5004]: I1201 08:39:28.737105 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c74b987d-43c4-49d9-92fc-f13f3b7b7dd2-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "c74b987d-43c4-49d9-92fc-f13f3b7b7dd2" (UID: "c74b987d-43c4-49d9-92fc-f13f3b7b7dd2"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:39:28 crc kubenswrapper[5004]: I1201 08:39:28.743258 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c74b987d-43c4-49d9-92fc-f13f3b7b7dd2-kube-api-access-kxgx2" (OuterVolumeSpecName: "kube-api-access-kxgx2") pod "c74b987d-43c4-49d9-92fc-f13f3b7b7dd2" (UID: "c74b987d-43c4-49d9-92fc-f13f3b7b7dd2"). InnerVolumeSpecName "kube-api-access-kxgx2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:39:28 crc kubenswrapper[5004]: I1201 08:39:28.743356 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c74b987d-43c4-49d9-92fc-f13f3b7b7dd2-config-out" (OuterVolumeSpecName: "config-out") pod "c74b987d-43c4-49d9-92fc-f13f3b7b7dd2" (UID: "c74b987d-43c4-49d9-92fc-f13f3b7b7dd2"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 08:39:28 crc kubenswrapper[5004]: I1201 08:39:28.743883 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c74b987d-43c4-49d9-92fc-f13f3b7b7dd2-config" (OuterVolumeSpecName: "config") pod "c74b987d-43c4-49d9-92fc-f13f3b7b7dd2" (UID: "c74b987d-43c4-49d9-92fc-f13f3b7b7dd2"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:39:28 crc kubenswrapper[5004]: I1201 08:39:28.758749 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "c74b987d-43c4-49d9-92fc-f13f3b7b7dd2" (UID: "c74b987d-43c4-49d9-92fc-f13f3b7b7dd2"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 01 08:39:28 crc kubenswrapper[5004]: I1201 08:39:28.760918 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c74b987d-43c4-49d9-92fc-f13f3b7b7dd2-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "c74b987d-43c4-49d9-92fc-f13f3b7b7dd2" (UID: "c74b987d-43c4-49d9-92fc-f13f3b7b7dd2"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:39:28 crc kubenswrapper[5004]: I1201 08:39:28.777200 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9a74158-8447-40e0-91ae-556ce9096972" path="/var/lib/kubelet/pods/e9a74158-8447-40e0-91ae-556ce9096972/volumes" Dec 01 08:39:28 crc kubenswrapper[5004]: I1201 08:39:28.785945 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c74b987d-43c4-49d9-92fc-f13f3b7b7dd2-web-config" (OuterVolumeSpecName: "web-config") pod "c74b987d-43c4-49d9-92fc-f13f3b7b7dd2" (UID: "c74b987d-43c4-49d9-92fc-f13f3b7b7dd2"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:39:28 crc kubenswrapper[5004]: I1201 08:39:28.834515 5004 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c74b987d-43c4-49d9-92fc-f13f3b7b7dd2-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Dec 01 08:39:28 crc kubenswrapper[5004]: I1201 08:39:28.834548 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kxgx2\" (UniqueName: \"kubernetes.io/projected/c74b987d-43c4-49d9-92fc-f13f3b7b7dd2-kube-api-access-kxgx2\") on node \"crc\" DevicePath \"\"" Dec 01 08:39:28 crc kubenswrapper[5004]: I1201 08:39:28.834574 5004 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c74b987d-43c4-49d9-92fc-f13f3b7b7dd2-web-config\") on node \"crc\" DevicePath \"\"" Dec 01 08:39:28 crc kubenswrapper[5004]: I1201 08:39:28.834609 5004 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Dec 01 08:39:28 crc kubenswrapper[5004]: I1201 08:39:28.834619 5004 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c74b987d-43c4-49d9-92fc-f13f3b7b7dd2-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Dec 01 08:39:28 crc kubenswrapper[5004]: I1201 08:39:28.834628 5004 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/c74b987d-43c4-49d9-92fc-f13f3b7b7dd2-config\") on node \"crc\" DevicePath \"\"" Dec 01 08:39:28 crc kubenswrapper[5004]: I1201 08:39:28.834637 5004 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c74b987d-43c4-49d9-92fc-f13f3b7b7dd2-tls-assets\") on node \"crc\" DevicePath \"\"" Dec 01 08:39:28 crc 
kubenswrapper[5004]: I1201 08:39:28.834645 5004 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c74b987d-43c4-49d9-92fc-f13f3b7b7dd2-config-out\") on node \"crc\" DevicePath \"\"" Dec 01 08:39:28 crc kubenswrapper[5004]: I1201 08:39:28.860996 5004 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Dec 01 08:39:28 crc kubenswrapper[5004]: I1201 08:39:28.871982 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-r8x2b-config-t66b8"] Dec 01 08:39:28 crc kubenswrapper[5004]: E1201 08:39:28.872521 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c74b987d-43c4-49d9-92fc-f13f3b7b7dd2" containerName="init-config-reloader" Dec 01 08:39:28 crc kubenswrapper[5004]: I1201 08:39:28.872541 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="c74b987d-43c4-49d9-92fc-f13f3b7b7dd2" containerName="init-config-reloader" Dec 01 08:39:28 crc kubenswrapper[5004]: E1201 08:39:28.872575 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c74b987d-43c4-49d9-92fc-f13f3b7b7dd2" containerName="thanos-sidecar" Dec 01 08:39:28 crc kubenswrapper[5004]: I1201 08:39:28.872585 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="c74b987d-43c4-49d9-92fc-f13f3b7b7dd2" containerName="thanos-sidecar" Dec 01 08:39:28 crc kubenswrapper[5004]: E1201 08:39:28.872599 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c74b987d-43c4-49d9-92fc-f13f3b7b7dd2" containerName="prometheus" Dec 01 08:39:28 crc kubenswrapper[5004]: I1201 08:39:28.872606 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="c74b987d-43c4-49d9-92fc-f13f3b7b7dd2" containerName="prometheus" Dec 01 08:39:28 crc kubenswrapper[5004]: E1201 08:39:28.872634 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9a74158-8447-40e0-91ae-556ce9096972" 
containerName="ovn-config" Dec 01 08:39:28 crc kubenswrapper[5004]: I1201 08:39:28.872643 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9a74158-8447-40e0-91ae-556ce9096972" containerName="ovn-config" Dec 01 08:39:28 crc kubenswrapper[5004]: E1201 08:39:28.872663 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c74b987d-43c4-49d9-92fc-f13f3b7b7dd2" containerName="config-reloader" Dec 01 08:39:28 crc kubenswrapper[5004]: I1201 08:39:28.872670 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="c74b987d-43c4-49d9-92fc-f13f3b7b7dd2" containerName="config-reloader" Dec 01 08:39:28 crc kubenswrapper[5004]: I1201 08:39:28.872939 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="c74b987d-43c4-49d9-92fc-f13f3b7b7dd2" containerName="thanos-sidecar" Dec 01 08:39:28 crc kubenswrapper[5004]: I1201 08:39:28.872965 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="c74b987d-43c4-49d9-92fc-f13f3b7b7dd2" containerName="prometheus" Dec 01 08:39:28 crc kubenswrapper[5004]: I1201 08:39:28.872976 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="c74b987d-43c4-49d9-92fc-f13f3b7b7dd2" containerName="config-reloader" Dec 01 08:39:28 crc kubenswrapper[5004]: I1201 08:39:28.872997 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9a74158-8447-40e0-91ae-556ce9096972" containerName="ovn-config" Dec 01 08:39:28 crc kubenswrapper[5004]: I1201 08:39:28.873932 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-r8x2b-config-t66b8" Dec 01 08:39:28 crc kubenswrapper[5004]: I1201 08:39:28.876298 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Dec 01 08:39:28 crc kubenswrapper[5004]: I1201 08:39:28.887527 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-r8x2b-config-t66b8"] Dec 01 08:39:28 crc kubenswrapper[5004]: I1201 08:39:28.935920 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/807bbaaa-1a65-4f04-8906-546e9c85e7aa-additional-scripts\") pod \"ovn-controller-r8x2b-config-t66b8\" (UID: \"807bbaaa-1a65-4f04-8906-546e9c85e7aa\") " pod="openstack/ovn-controller-r8x2b-config-t66b8" Dec 01 08:39:28 crc kubenswrapper[5004]: I1201 08:39:28.935967 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5bs4\" (UniqueName: \"kubernetes.io/projected/807bbaaa-1a65-4f04-8906-546e9c85e7aa-kube-api-access-k5bs4\") pod \"ovn-controller-r8x2b-config-t66b8\" (UID: \"807bbaaa-1a65-4f04-8906-546e9c85e7aa\") " pod="openstack/ovn-controller-r8x2b-config-t66b8" Dec 01 08:39:28 crc kubenswrapper[5004]: I1201 08:39:28.936015 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/807bbaaa-1a65-4f04-8906-546e9c85e7aa-var-run\") pod \"ovn-controller-r8x2b-config-t66b8\" (UID: \"807bbaaa-1a65-4f04-8906-546e9c85e7aa\") " pod="openstack/ovn-controller-r8x2b-config-t66b8" Dec 01 08:39:28 crc kubenswrapper[5004]: I1201 08:39:28.936151 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/807bbaaa-1a65-4f04-8906-546e9c85e7aa-var-run-ovn\") pod \"ovn-controller-r8x2b-config-t66b8\" (UID: 
\"807bbaaa-1a65-4f04-8906-546e9c85e7aa\") " pod="openstack/ovn-controller-r8x2b-config-t66b8" Dec 01 08:39:28 crc kubenswrapper[5004]: I1201 08:39:28.936180 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/807bbaaa-1a65-4f04-8906-546e9c85e7aa-var-log-ovn\") pod \"ovn-controller-r8x2b-config-t66b8\" (UID: \"807bbaaa-1a65-4f04-8906-546e9c85e7aa\") " pod="openstack/ovn-controller-r8x2b-config-t66b8" Dec 01 08:39:28 crc kubenswrapper[5004]: I1201 08:39:28.936220 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/807bbaaa-1a65-4f04-8906-546e9c85e7aa-scripts\") pod \"ovn-controller-r8x2b-config-t66b8\" (UID: \"807bbaaa-1a65-4f04-8906-546e9c85e7aa\") " pod="openstack/ovn-controller-r8x2b-config-t66b8" Dec 01 08:39:28 crc kubenswrapper[5004]: I1201 08:39:28.936797 5004 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Dec 01 08:39:29 crc kubenswrapper[5004]: I1201 08:39:29.038144 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/807bbaaa-1a65-4f04-8906-546e9c85e7aa-var-run-ovn\") pod \"ovn-controller-r8x2b-config-t66b8\" (UID: \"807bbaaa-1a65-4f04-8906-546e9c85e7aa\") " pod="openstack/ovn-controller-r8x2b-config-t66b8" Dec 01 08:39:29 crc kubenswrapper[5004]: I1201 08:39:29.038192 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/807bbaaa-1a65-4f04-8906-546e9c85e7aa-var-log-ovn\") pod \"ovn-controller-r8x2b-config-t66b8\" (UID: \"807bbaaa-1a65-4f04-8906-546e9c85e7aa\") " pod="openstack/ovn-controller-r8x2b-config-t66b8" Dec 01 08:39:29 crc kubenswrapper[5004]: I1201 
08:39:29.038237 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/807bbaaa-1a65-4f04-8906-546e9c85e7aa-scripts\") pod \"ovn-controller-r8x2b-config-t66b8\" (UID: \"807bbaaa-1a65-4f04-8906-546e9c85e7aa\") " pod="openstack/ovn-controller-r8x2b-config-t66b8" Dec 01 08:39:29 crc kubenswrapper[5004]: I1201 08:39:29.038299 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/807bbaaa-1a65-4f04-8906-546e9c85e7aa-additional-scripts\") pod \"ovn-controller-r8x2b-config-t66b8\" (UID: \"807bbaaa-1a65-4f04-8906-546e9c85e7aa\") " pod="openstack/ovn-controller-r8x2b-config-t66b8" Dec 01 08:39:29 crc kubenswrapper[5004]: I1201 08:39:29.038321 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k5bs4\" (UniqueName: \"kubernetes.io/projected/807bbaaa-1a65-4f04-8906-546e9c85e7aa-kube-api-access-k5bs4\") pod \"ovn-controller-r8x2b-config-t66b8\" (UID: \"807bbaaa-1a65-4f04-8906-546e9c85e7aa\") " pod="openstack/ovn-controller-r8x2b-config-t66b8" Dec 01 08:39:29 crc kubenswrapper[5004]: I1201 08:39:29.038354 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/807bbaaa-1a65-4f04-8906-546e9c85e7aa-var-run\") pod \"ovn-controller-r8x2b-config-t66b8\" (UID: \"807bbaaa-1a65-4f04-8906-546e9c85e7aa\") " pod="openstack/ovn-controller-r8x2b-config-t66b8" Dec 01 08:39:29 crc kubenswrapper[5004]: I1201 08:39:29.038501 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/807bbaaa-1a65-4f04-8906-546e9c85e7aa-var-run\") pod \"ovn-controller-r8x2b-config-t66b8\" (UID: \"807bbaaa-1a65-4f04-8906-546e9c85e7aa\") " pod="openstack/ovn-controller-r8x2b-config-t66b8" Dec 01 08:39:29 crc kubenswrapper[5004]: I1201 08:39:29.038510 5004 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/807bbaaa-1a65-4f04-8906-546e9c85e7aa-var-run-ovn\") pod \"ovn-controller-r8x2b-config-t66b8\" (UID: \"807bbaaa-1a65-4f04-8906-546e9c85e7aa\") " pod="openstack/ovn-controller-r8x2b-config-t66b8" Dec 01 08:39:29 crc kubenswrapper[5004]: I1201 08:39:29.038533 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/807bbaaa-1a65-4f04-8906-546e9c85e7aa-var-log-ovn\") pod \"ovn-controller-r8x2b-config-t66b8\" (UID: \"807bbaaa-1a65-4f04-8906-546e9c85e7aa\") " pod="openstack/ovn-controller-r8x2b-config-t66b8" Dec 01 08:39:29 crc kubenswrapper[5004]: I1201 08:39:29.039306 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/807bbaaa-1a65-4f04-8906-546e9c85e7aa-additional-scripts\") pod \"ovn-controller-r8x2b-config-t66b8\" (UID: \"807bbaaa-1a65-4f04-8906-546e9c85e7aa\") " pod="openstack/ovn-controller-r8x2b-config-t66b8" Dec 01 08:39:29 crc kubenswrapper[5004]: I1201 08:39:29.040239 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/807bbaaa-1a65-4f04-8906-546e9c85e7aa-scripts\") pod \"ovn-controller-r8x2b-config-t66b8\" (UID: \"807bbaaa-1a65-4f04-8906-546e9c85e7aa\") " pod="openstack/ovn-controller-r8x2b-config-t66b8" Dec 01 08:39:29 crc kubenswrapper[5004]: I1201 08:39:29.072739 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5bs4\" (UniqueName: \"kubernetes.io/projected/807bbaaa-1a65-4f04-8906-546e9c85e7aa-kube-api-access-k5bs4\") pod \"ovn-controller-r8x2b-config-t66b8\" (UID: \"807bbaaa-1a65-4f04-8906-546e9c85e7aa\") " pod="openstack/ovn-controller-r8x2b-config-t66b8" Dec 01 08:39:29 crc kubenswrapper[5004]: I1201 08:39:29.201479 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-r8x2b-config-t66b8" Dec 01 08:39:29 crc kubenswrapper[5004]: I1201 08:39:29.559116 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-r8x2b-config-t66b8"] Dec 01 08:39:29 crc kubenswrapper[5004]: I1201 08:39:29.611476 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-r8x2b-config-t66b8" event={"ID":"807bbaaa-1a65-4f04-8906-546e9c85e7aa","Type":"ContainerStarted","Data":"4258b9ac9e624f9970c01f0d6516ebb6a0a61815faf6b63cefb363ac2addd417"} Dec 01 08:39:29 crc kubenswrapper[5004]: I1201 08:39:29.623269 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"c74b987d-43c4-49d9-92fc-f13f3b7b7dd2","Type":"ContainerDied","Data":"3d2a125fdb9901a662791555122bad5681e504991ed1ee3567418b3d384274a2"} Dec 01 08:39:29 crc kubenswrapper[5004]: I1201 08:39:29.628632 5004 scope.go:117] "RemoveContainer" containerID="f5e4833da82e765e056046bb7bdddc857cd7685f25b5126b4083f490bdf5907e" Dec 01 08:39:29 crc kubenswrapper[5004]: I1201 08:39:29.628221 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 01 08:39:29 crc kubenswrapper[5004]: I1201 08:39:29.634249 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"64792f42-dd08-4537-bce9-a632e644cf5a","Type":"ContainerStarted","Data":"53fac5b7ca79221c235d0121d990fa7f7f5b93b9e027df80870f6ee6e8a39ccf"} Dec 01 08:39:29 crc kubenswrapper[5004]: I1201 08:39:29.639244 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"64792f42-dd08-4537-bce9-a632e644cf5a","Type":"ContainerStarted","Data":"9e8917313ea4c0e5cb65968b6981920bc70e552347232b454b6423d84e34d039"} Dec 01 08:39:29 crc kubenswrapper[5004]: I1201 08:39:29.655149 5004 scope.go:117] "RemoveContainer" containerID="b57d2b5c7d6ed26f2e6a9db85deba20ccb98973f439f391253510bce5949ec96" Dec 01 08:39:29 crc kubenswrapper[5004]: I1201 08:39:29.677479 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 01 08:39:29 crc kubenswrapper[5004]: I1201 08:39:29.685128 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 01 08:39:29 crc kubenswrapper[5004]: I1201 08:39:29.711693 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 01 08:39:29 crc kubenswrapper[5004]: I1201 08:39:29.714773 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 01 08:39:29 crc kubenswrapper[5004]: I1201 08:39:29.717489 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Dec 01 08:39:29 crc kubenswrapper[5004]: I1201 08:39:29.717653 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-metric-storage-prometheus-svc" Dec 01 08:39:29 crc kubenswrapper[5004]: I1201 08:39:29.717779 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Dec 01 08:39:29 crc kubenswrapper[5004]: I1201 08:39:29.717996 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-z6dvc" Dec 01 08:39:29 crc kubenswrapper[5004]: I1201 08:39:29.718791 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Dec 01 08:39:29 crc kubenswrapper[5004]: I1201 08:39:29.719291 5004 scope.go:117] "RemoveContainer" containerID="c41b3a9d677c658eecfa2ea7bc9bd4172a1e4f97b3def3ea8e66714dc66f870c" Dec 01 08:39:29 crc kubenswrapper[5004]: I1201 08:39:29.724719 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Dec 01 08:39:29 crc kubenswrapper[5004]: I1201 08:39:29.735488 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 01 08:39:29 crc kubenswrapper[5004]: I1201 08:39:29.754134 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Dec 01 08:39:29 crc kubenswrapper[5004]: I1201 08:39:29.779157 5004 scope.go:117] "RemoveContainer" containerID="a6b3fce395401c09921bbca6309f7298ab2c3a95b03a4fc68906e05780c8c3de" Dec 01 08:39:29 crc kubenswrapper[5004]: I1201 08:39:29.854672 5004 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dlbj\" (UniqueName: \"kubernetes.io/projected/b0dfa0be-7482-49a2-adc0-425cedd5c597-kube-api-access-4dlbj\") pod \"prometheus-metric-storage-0\" (UID: \"b0dfa0be-7482-49a2-adc0-425cedd5c597\") " pod="openstack/prometheus-metric-storage-0" Dec 01 08:39:29 crc kubenswrapper[5004]: I1201 08:39:29.854738 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/b0dfa0be-7482-49a2-adc0-425cedd5c597-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"b0dfa0be-7482-49a2-adc0-425cedd5c597\") " pod="openstack/prometheus-metric-storage-0" Dec 01 08:39:29 crc kubenswrapper[5004]: I1201 08:39:29.854770 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/b0dfa0be-7482-49a2-adc0-425cedd5c597-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"b0dfa0be-7482-49a2-adc0-425cedd5c597\") " pod="openstack/prometheus-metric-storage-0" Dec 01 08:39:29 crc kubenswrapper[5004]: I1201 08:39:29.854795 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/b0dfa0be-7482-49a2-adc0-425cedd5c597-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"b0dfa0be-7482-49a2-adc0-425cedd5c597\") " pod="openstack/prometheus-metric-storage-0" Dec 01 08:39:29 crc kubenswrapper[5004]: I1201 08:39:29.854834 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/b0dfa0be-7482-49a2-adc0-425cedd5c597-config\") pod \"prometheus-metric-storage-0\" (UID: \"b0dfa0be-7482-49a2-adc0-425cedd5c597\") " pod="openstack/prometheus-metric-storage-0" Dec 01 08:39:29 crc kubenswrapper[5004]: I1201 08:39:29.854856 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0dfa0be-7482-49a2-adc0-425cedd5c597-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"b0dfa0be-7482-49a2-adc0-425cedd5c597\") " pod="openstack/prometheus-metric-storage-0" Dec 01 08:39:29 crc kubenswrapper[5004]: I1201 08:39:29.854885 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/b0dfa0be-7482-49a2-adc0-425cedd5c597-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"b0dfa0be-7482-49a2-adc0-425cedd5c597\") " pod="openstack/prometheus-metric-storage-0" Dec 01 08:39:29 crc kubenswrapper[5004]: I1201 08:39:29.854933 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b0dfa0be-7482-49a2-adc0-425cedd5c597-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"b0dfa0be-7482-49a2-adc0-425cedd5c597\") " pod="openstack/prometheus-metric-storage-0" Dec 01 08:39:29 crc kubenswrapper[5004]: I1201 08:39:29.854969 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b0dfa0be-7482-49a2-adc0-425cedd5c597-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"b0dfa0be-7482-49a2-adc0-425cedd5c597\") " pod="openstack/prometheus-metric-storage-0" Dec 01 08:39:29 crc kubenswrapper[5004]: I1201 08:39:29.855001 5004 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"prometheus-metric-storage-0\" (UID: \"b0dfa0be-7482-49a2-adc0-425cedd5c597\") " pod="openstack/prometheus-metric-storage-0" Dec 01 08:39:29 crc kubenswrapper[5004]: I1201 08:39:29.855027 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b0dfa0be-7482-49a2-adc0-425cedd5c597-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"b0dfa0be-7482-49a2-adc0-425cedd5c597\") " pod="openstack/prometheus-metric-storage-0" Dec 01 08:39:29 crc kubenswrapper[5004]: I1201 08:39:29.956941 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/b0dfa0be-7482-49a2-adc0-425cedd5c597-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"b0dfa0be-7482-49a2-adc0-425cedd5c597\") " pod="openstack/prometheus-metric-storage-0" Dec 01 08:39:29 crc kubenswrapper[5004]: I1201 08:39:29.957003 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b0dfa0be-7482-49a2-adc0-425cedd5c597-config\") pod \"prometheus-metric-storage-0\" (UID: \"b0dfa0be-7482-49a2-adc0-425cedd5c597\") " pod="openstack/prometheus-metric-storage-0" Dec 01 08:39:29 crc kubenswrapper[5004]: I1201 08:39:29.957045 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0dfa0be-7482-49a2-adc0-425cedd5c597-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"b0dfa0be-7482-49a2-adc0-425cedd5c597\") " pod="openstack/prometheus-metric-storage-0" Dec 01 08:39:29 crc kubenswrapper[5004]: I1201 
08:39:29.957075 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/b0dfa0be-7482-49a2-adc0-425cedd5c597-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"b0dfa0be-7482-49a2-adc0-425cedd5c597\") " pod="openstack/prometheus-metric-storage-0" Dec 01 08:39:29 crc kubenswrapper[5004]: I1201 08:39:29.957130 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b0dfa0be-7482-49a2-adc0-425cedd5c597-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"b0dfa0be-7482-49a2-adc0-425cedd5c597\") " pod="openstack/prometheus-metric-storage-0" Dec 01 08:39:29 crc kubenswrapper[5004]: I1201 08:39:29.957165 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b0dfa0be-7482-49a2-adc0-425cedd5c597-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"b0dfa0be-7482-49a2-adc0-425cedd5c597\") " pod="openstack/prometheus-metric-storage-0" Dec 01 08:39:29 crc kubenswrapper[5004]: I1201 08:39:29.957200 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"prometheus-metric-storage-0\" (UID: \"b0dfa0be-7482-49a2-adc0-425cedd5c597\") " pod="openstack/prometheus-metric-storage-0" Dec 01 08:39:29 crc kubenswrapper[5004]: I1201 08:39:29.957231 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b0dfa0be-7482-49a2-adc0-425cedd5c597-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"b0dfa0be-7482-49a2-adc0-425cedd5c597\") " pod="openstack/prometheus-metric-storage-0" Dec 01 08:39:29 crc kubenswrapper[5004]: I1201 08:39:29.957262 5004 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-4dlbj\" (UniqueName: \"kubernetes.io/projected/b0dfa0be-7482-49a2-adc0-425cedd5c597-kube-api-access-4dlbj\") pod \"prometheus-metric-storage-0\" (UID: \"b0dfa0be-7482-49a2-adc0-425cedd5c597\") " pod="openstack/prometheus-metric-storage-0" Dec 01 08:39:29 crc kubenswrapper[5004]: I1201 08:39:29.957296 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/b0dfa0be-7482-49a2-adc0-425cedd5c597-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"b0dfa0be-7482-49a2-adc0-425cedd5c597\") " pod="openstack/prometheus-metric-storage-0" Dec 01 08:39:29 crc kubenswrapper[5004]: I1201 08:39:29.957319 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/b0dfa0be-7482-49a2-adc0-425cedd5c597-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"b0dfa0be-7482-49a2-adc0-425cedd5c597\") " pod="openstack/prometheus-metric-storage-0" Dec 01 08:39:29 crc kubenswrapper[5004]: I1201 08:39:29.961543 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/b0dfa0be-7482-49a2-adc0-425cedd5c597-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"b0dfa0be-7482-49a2-adc0-425cedd5c597\") " pod="openstack/prometheus-metric-storage-0" Dec 01 08:39:29 crc kubenswrapper[5004]: I1201 08:39:29.962013 5004 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"prometheus-metric-storage-0\" (UID: \"b0dfa0be-7482-49a2-adc0-425cedd5c597\") device mount path \"/mnt/openstack/pv07\"" 
pod="openstack/prometheus-metric-storage-0" Dec 01 08:39:29 crc kubenswrapper[5004]: I1201 08:39:29.965891 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b0dfa0be-7482-49a2-adc0-425cedd5c597-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"b0dfa0be-7482-49a2-adc0-425cedd5c597\") " pod="openstack/prometheus-metric-storage-0" Dec 01 08:39:29 crc kubenswrapper[5004]: I1201 08:39:29.967087 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/b0dfa0be-7482-49a2-adc0-425cedd5c597-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"b0dfa0be-7482-49a2-adc0-425cedd5c597\") " pod="openstack/prometheus-metric-storage-0" Dec 01 08:39:29 crc kubenswrapper[5004]: I1201 08:39:29.967898 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0dfa0be-7482-49a2-adc0-425cedd5c597-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"b0dfa0be-7482-49a2-adc0-425cedd5c597\") " pod="openstack/prometheus-metric-storage-0" Dec 01 08:39:29 crc kubenswrapper[5004]: I1201 08:39:29.968854 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b0dfa0be-7482-49a2-adc0-425cedd5c597-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"b0dfa0be-7482-49a2-adc0-425cedd5c597\") " pod="openstack/prometheus-metric-storage-0" Dec 01 08:39:29 crc kubenswrapper[5004]: I1201 08:39:29.970637 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/b0dfa0be-7482-49a2-adc0-425cedd5c597-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: 
\"b0dfa0be-7482-49a2-adc0-425cedd5c597\") " pod="openstack/prometheus-metric-storage-0" Dec 01 08:39:29 crc kubenswrapper[5004]: I1201 08:39:29.972910 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/b0dfa0be-7482-49a2-adc0-425cedd5c597-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"b0dfa0be-7482-49a2-adc0-425cedd5c597\") " pod="openstack/prometheus-metric-storage-0" Dec 01 08:39:29 crc kubenswrapper[5004]: I1201 08:39:29.981828 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b0dfa0be-7482-49a2-adc0-425cedd5c597-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"b0dfa0be-7482-49a2-adc0-425cedd5c597\") " pod="openstack/prometheus-metric-storage-0" Dec 01 08:39:29 crc kubenswrapper[5004]: I1201 08:39:29.985323 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dlbj\" (UniqueName: \"kubernetes.io/projected/b0dfa0be-7482-49a2-adc0-425cedd5c597-kube-api-access-4dlbj\") pod \"prometheus-metric-storage-0\" (UID: \"b0dfa0be-7482-49a2-adc0-425cedd5c597\") " pod="openstack/prometheus-metric-storage-0" Dec 01 08:39:29 crc kubenswrapper[5004]: I1201 08:39:29.988436 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/b0dfa0be-7482-49a2-adc0-425cedd5c597-config\") pod \"prometheus-metric-storage-0\" (UID: \"b0dfa0be-7482-49a2-adc0-425cedd5c597\") " pod="openstack/prometheus-metric-storage-0" Dec 01 08:39:30 crc kubenswrapper[5004]: I1201 08:39:30.072325 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"prometheus-metric-storage-0\" (UID: \"b0dfa0be-7482-49a2-adc0-425cedd5c597\") " 
pod="openstack/prometheus-metric-storage-0" Dec 01 08:39:30 crc kubenswrapper[5004]: I1201 08:39:30.335151 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 01 08:39:30 crc kubenswrapper[5004]: I1201 08:39:30.647961 5004 generic.go:334] "Generic (PLEG): container finished" podID="807bbaaa-1a65-4f04-8906-546e9c85e7aa" containerID="4de3c6d2d3d88f3acc76200c5a2cb8933d7a660a9287a6c92cb0c5adb7e50d17" exitCode=0 Dec 01 08:39:30 crc kubenswrapper[5004]: I1201 08:39:30.649082 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-r8x2b-config-t66b8" event={"ID":"807bbaaa-1a65-4f04-8906-546e9c85e7aa","Type":"ContainerDied","Data":"4de3c6d2d3d88f3acc76200c5a2cb8933d7a660a9287a6c92cb0c5adb7e50d17"} Dec 01 08:39:30 crc kubenswrapper[5004]: I1201 08:39:30.674604 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"64792f42-dd08-4537-bce9-a632e644cf5a","Type":"ContainerStarted","Data":"e04aead7e1c1f65cf6d42395b98007d458d30e30e3e678bdb11ed671dc0e42f1"} Dec 01 08:39:30 crc kubenswrapper[5004]: I1201 08:39:30.674647 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"64792f42-dd08-4537-bce9-a632e644cf5a","Type":"ContainerStarted","Data":"b640975a783c37cfd7f775258777ec41be20205fe02aac61ab42cbc3c834bcc7"} Dec 01 08:39:30 crc kubenswrapper[5004]: I1201 08:39:30.775024 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c74b987d-43c4-49d9-92fc-f13f3b7b7dd2" path="/var/lib/kubelet/pods/c74b987d-43c4-49d9-92fc-f13f3b7b7dd2/volumes" Dec 01 08:39:30 crc kubenswrapper[5004]: I1201 08:39:30.899589 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 01 08:39:31 crc kubenswrapper[5004]: I1201 08:39:31.518436 5004 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" 
podUID="c74b987d-43c4-49d9-92fc-f13f3b7b7dd2" containerName="prometheus" probeResult="failure" output="Get \"http://10.217.0.137:9090/-/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 01 08:39:31 crc kubenswrapper[5004]: I1201 08:39:31.693775 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"b0dfa0be-7482-49a2-adc0-425cedd5c597","Type":"ContainerStarted","Data":"2257d188e873d49c784d6f3c041374c823700e3f03adbd8f0c0dfb471460545f"} Dec 01 08:39:32 crc kubenswrapper[5004]: I1201 08:39:32.208680 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-r8x2b-config-t66b8" Dec 01 08:39:32 crc kubenswrapper[5004]: I1201 08:39:32.341439 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/807bbaaa-1a65-4f04-8906-546e9c85e7aa-scripts\") pod \"807bbaaa-1a65-4f04-8906-546e9c85e7aa\" (UID: \"807bbaaa-1a65-4f04-8906-546e9c85e7aa\") " Dec 01 08:39:32 crc kubenswrapper[5004]: I1201 08:39:32.341520 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/807bbaaa-1a65-4f04-8906-546e9c85e7aa-var-log-ovn\") pod \"807bbaaa-1a65-4f04-8906-546e9c85e7aa\" (UID: \"807bbaaa-1a65-4f04-8906-546e9c85e7aa\") " Dec 01 08:39:32 crc kubenswrapper[5004]: I1201 08:39:32.341576 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/807bbaaa-1a65-4f04-8906-546e9c85e7aa-var-run-ovn\") pod \"807bbaaa-1a65-4f04-8906-546e9c85e7aa\" (UID: \"807bbaaa-1a65-4f04-8906-546e9c85e7aa\") " Dec 01 08:39:32 crc kubenswrapper[5004]: I1201 08:39:32.341631 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k5bs4\" (UniqueName: 
\"kubernetes.io/projected/807bbaaa-1a65-4f04-8906-546e9c85e7aa-kube-api-access-k5bs4\") pod \"807bbaaa-1a65-4f04-8906-546e9c85e7aa\" (UID: \"807bbaaa-1a65-4f04-8906-546e9c85e7aa\") " Dec 01 08:39:32 crc kubenswrapper[5004]: I1201 08:39:32.341655 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/807bbaaa-1a65-4f04-8906-546e9c85e7aa-var-run\") pod \"807bbaaa-1a65-4f04-8906-546e9c85e7aa\" (UID: \"807bbaaa-1a65-4f04-8906-546e9c85e7aa\") " Dec 01 08:39:32 crc kubenswrapper[5004]: I1201 08:39:32.341644 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/807bbaaa-1a65-4f04-8906-546e9c85e7aa-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "807bbaaa-1a65-4f04-8906-546e9c85e7aa" (UID: "807bbaaa-1a65-4f04-8906-546e9c85e7aa"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 08:39:32 crc kubenswrapper[5004]: I1201 08:39:32.341690 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/807bbaaa-1a65-4f04-8906-546e9c85e7aa-additional-scripts\") pod \"807bbaaa-1a65-4f04-8906-546e9c85e7aa\" (UID: \"807bbaaa-1a65-4f04-8906-546e9c85e7aa\") " Dec 01 08:39:32 crc kubenswrapper[5004]: I1201 08:39:32.341764 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/807bbaaa-1a65-4f04-8906-546e9c85e7aa-var-run" (OuterVolumeSpecName: "var-run") pod "807bbaaa-1a65-4f04-8906-546e9c85e7aa" (UID: "807bbaaa-1a65-4f04-8906-546e9c85e7aa"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 08:39:32 crc kubenswrapper[5004]: I1201 08:39:32.341782 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/807bbaaa-1a65-4f04-8906-546e9c85e7aa-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "807bbaaa-1a65-4f04-8906-546e9c85e7aa" (UID: "807bbaaa-1a65-4f04-8906-546e9c85e7aa"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 08:39:32 crc kubenswrapper[5004]: I1201 08:39:32.342189 5004 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/807bbaaa-1a65-4f04-8906-546e9c85e7aa-var-log-ovn\") on node \"crc\" DevicePath \"\"" Dec 01 08:39:32 crc kubenswrapper[5004]: I1201 08:39:32.342206 5004 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/807bbaaa-1a65-4f04-8906-546e9c85e7aa-var-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 01 08:39:32 crc kubenswrapper[5004]: I1201 08:39:32.342214 5004 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/807bbaaa-1a65-4f04-8906-546e9c85e7aa-var-run\") on node \"crc\" DevicePath \"\"" Dec 01 08:39:32 crc kubenswrapper[5004]: I1201 08:39:32.342256 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/807bbaaa-1a65-4f04-8906-546e9c85e7aa-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "807bbaaa-1a65-4f04-8906-546e9c85e7aa" (UID: "807bbaaa-1a65-4f04-8906-546e9c85e7aa"). InnerVolumeSpecName "additional-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:39:32 crc kubenswrapper[5004]: I1201 08:39:32.342975 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/807bbaaa-1a65-4f04-8906-546e9c85e7aa-scripts" (OuterVolumeSpecName: "scripts") pod "807bbaaa-1a65-4f04-8906-546e9c85e7aa" (UID: "807bbaaa-1a65-4f04-8906-546e9c85e7aa"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:39:32 crc kubenswrapper[5004]: I1201 08:39:32.347737 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/807bbaaa-1a65-4f04-8906-546e9c85e7aa-kube-api-access-k5bs4" (OuterVolumeSpecName: "kube-api-access-k5bs4") pod "807bbaaa-1a65-4f04-8906-546e9c85e7aa" (UID: "807bbaaa-1a65-4f04-8906-546e9c85e7aa"). InnerVolumeSpecName "kube-api-access-k5bs4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:39:32 crc kubenswrapper[5004]: I1201 08:39:32.444406 5004 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/807bbaaa-1a65-4f04-8906-546e9c85e7aa-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 08:39:32 crc kubenswrapper[5004]: I1201 08:39:32.444437 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k5bs4\" (UniqueName: \"kubernetes.io/projected/807bbaaa-1a65-4f04-8906-546e9c85e7aa-kube-api-access-k5bs4\") on node \"crc\" DevicePath \"\"" Dec 01 08:39:32 crc kubenswrapper[5004]: I1201 08:39:32.444448 5004 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/807bbaaa-1a65-4f04-8906-546e9c85e7aa-additional-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 08:39:32 crc kubenswrapper[5004]: I1201 08:39:32.711437 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-r8x2b-config-t66b8" 
event={"ID":"807bbaaa-1a65-4f04-8906-546e9c85e7aa","Type":"ContainerDied","Data":"4258b9ac9e624f9970c01f0d6516ebb6a0a61815faf6b63cefb363ac2addd417"} Dec 01 08:39:32 crc kubenswrapper[5004]: I1201 08:39:32.711825 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4258b9ac9e624f9970c01f0d6516ebb6a0a61815faf6b63cefb363ac2addd417" Dec 01 08:39:32 crc kubenswrapper[5004]: I1201 08:39:32.711936 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-r8x2b-config-t66b8" Dec 01 08:39:32 crc kubenswrapper[5004]: I1201 08:39:32.716650 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"64792f42-dd08-4537-bce9-a632e644cf5a","Type":"ContainerStarted","Data":"4cbc219e80d870941d41b8ea763af7e0bf0d9ef033563d209ddf2fc636ab9c30"} Dec 01 08:39:32 crc kubenswrapper[5004]: I1201 08:39:32.716716 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"64792f42-dd08-4537-bce9-a632e644cf5a","Type":"ContainerStarted","Data":"74203973e1019fffe93b43f7adc6d06ddf73cc78b4073e633ddec68870266687"} Dec 01 08:39:32 crc kubenswrapper[5004]: I1201 08:39:32.716734 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"64792f42-dd08-4537-bce9-a632e644cf5a","Type":"ContainerStarted","Data":"46039d13681ce4c9f28c69ae5b6eea4a20414146386d57e10e439edd14ca1d0a"} Dec 01 08:39:33 crc kubenswrapper[5004]: I1201 08:39:33.308429 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-r8x2b-config-t66b8"] Dec 01 08:39:33 crc kubenswrapper[5004]: I1201 08:39:33.320118 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-r8x2b-config-t66b8"] Dec 01 08:39:33 crc kubenswrapper[5004]: I1201 08:39:33.732169 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"64792f42-dd08-4537-bce9-a632e644cf5a","Type":"ContainerStarted","Data":"5ec1c19ac0bfc0abc273ed4706f209d982c8ae324d15703e4719638ebc570e80"} Dec 01 08:39:33 crc kubenswrapper[5004]: I1201 08:39:33.732599 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"64792f42-dd08-4537-bce9-a632e644cf5a","Type":"ContainerStarted","Data":"44052e10aa16d6b5dd525fb973b835ca9ecaedec42067861e013616eb4105b62"} Dec 01 08:39:34 crc kubenswrapper[5004]: I1201 08:39:34.751182 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"b0dfa0be-7482-49a2-adc0-425cedd5c597","Type":"ContainerStarted","Data":"0309c5e82756a858a9b25dbf890fb071010bba1c03f21b9b5370f7c23d2502a4"} Dec 01 08:39:34 crc kubenswrapper[5004]: I1201 08:39:34.776967 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="807bbaaa-1a65-4f04-8906-546e9c85e7aa" path="/var/lib/kubelet/pods/807bbaaa-1a65-4f04-8906-546e9c85e7aa/volumes" Dec 01 08:39:34 crc kubenswrapper[5004]: I1201 08:39:34.778035 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"64792f42-dd08-4537-bce9-a632e644cf5a","Type":"ContainerStarted","Data":"c568f7fbc653f5daf0497a626e5d7c160a1f63acba7df35928010a7f4cc85d2c"} Dec 01 08:39:36 crc kubenswrapper[5004]: I1201 08:39:36.691996 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Dec 01 08:39:37 crc kubenswrapper[5004]: I1201 08:39:37.083739 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 01 08:39:37 crc kubenswrapper[5004]: I1201 08:39:37.100653 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-dlfw2"] Dec 01 08:39:37 crc kubenswrapper[5004]: E1201 08:39:37.101072 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="807bbaaa-1a65-4f04-8906-546e9c85e7aa" 
containerName="ovn-config" Dec 01 08:39:37 crc kubenswrapper[5004]: I1201 08:39:37.101091 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="807bbaaa-1a65-4f04-8906-546e9c85e7aa" containerName="ovn-config" Dec 01 08:39:37 crc kubenswrapper[5004]: I1201 08:39:37.101320 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="807bbaaa-1a65-4f04-8906-546e9c85e7aa" containerName="ovn-config" Dec 01 08:39:37 crc kubenswrapper[5004]: I1201 08:39:37.102003 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-dlfw2" Dec 01 08:39:37 crc kubenswrapper[5004]: I1201 08:39:37.111242 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-dlfw2"] Dec 01 08:39:37 crc kubenswrapper[5004]: I1201 08:39:37.185477 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/73a0817f-d450-4a46-863a-a7483f144851-operator-scripts\") pod \"cinder-db-create-dlfw2\" (UID: \"73a0817f-d450-4a46-863a-a7483f144851\") " pod="openstack/cinder-db-create-dlfw2" Dec 01 08:39:37 crc kubenswrapper[5004]: I1201 08:39:37.185631 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7k7px\" (UniqueName: \"kubernetes.io/projected/73a0817f-d450-4a46-863a-a7483f144851-kube-api-access-7k7px\") pod \"cinder-db-create-dlfw2\" (UID: \"73a0817f-d450-4a46-863a-a7483f144851\") " pod="openstack/cinder-db-create-dlfw2" Dec 01 08:39:37 crc kubenswrapper[5004]: I1201 08:39:37.242871 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-kbqp2"] Dec 01 08:39:37 crc kubenswrapper[5004]: I1201 08:39:37.244242 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-kbqp2" Dec 01 08:39:37 crc kubenswrapper[5004]: I1201 08:39:37.265035 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-kbqp2"] Dec 01 08:39:37 crc kubenswrapper[5004]: I1201 08:39:37.288839 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/73a0817f-d450-4a46-863a-a7483f144851-operator-scripts\") pod \"cinder-db-create-dlfw2\" (UID: \"73a0817f-d450-4a46-863a-a7483f144851\") " pod="openstack/cinder-db-create-dlfw2" Dec 01 08:39:37 crc kubenswrapper[5004]: I1201 08:39:37.289804 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/73a0817f-d450-4a46-863a-a7483f144851-operator-scripts\") pod \"cinder-db-create-dlfw2\" (UID: \"73a0817f-d450-4a46-863a-a7483f144851\") " pod="openstack/cinder-db-create-dlfw2" Dec 01 08:39:37 crc kubenswrapper[5004]: I1201 08:39:37.289935 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7k7px\" (UniqueName: \"kubernetes.io/projected/73a0817f-d450-4a46-863a-a7483f144851-kube-api-access-7k7px\") pod \"cinder-db-create-dlfw2\" (UID: \"73a0817f-d450-4a46-863a-a7483f144851\") " pod="openstack/cinder-db-create-dlfw2" Dec 01 08:39:37 crc kubenswrapper[5004]: I1201 08:39:37.319508 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-e274-account-create-update-jn8sw"] Dec 01 08:39:37 crc kubenswrapper[5004]: I1201 08:39:37.320934 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-e274-account-create-update-jn8sw" Dec 01 08:39:37 crc kubenswrapper[5004]: I1201 08:39:37.325581 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Dec 01 08:39:37 crc kubenswrapper[5004]: I1201 08:39:37.326955 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7k7px\" (UniqueName: \"kubernetes.io/projected/73a0817f-d450-4a46-863a-a7483f144851-kube-api-access-7k7px\") pod \"cinder-db-create-dlfw2\" (UID: \"73a0817f-d450-4a46-863a-a7483f144851\") " pod="openstack/cinder-db-create-dlfw2" Dec 01 08:39:37 crc kubenswrapper[5004]: I1201 08:39:37.342273 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-e274-account-create-update-jn8sw"] Dec 01 08:39:37 crc kubenswrapper[5004]: I1201 08:39:37.396745 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dd7ffdb9-9cec-4f3a-8b55-65d8f6e42286-operator-scripts\") pod \"barbican-e274-account-create-update-jn8sw\" (UID: \"dd7ffdb9-9cec-4f3a-8b55-65d8f6e42286\") " pod="openstack/barbican-e274-account-create-update-jn8sw" Dec 01 08:39:37 crc kubenswrapper[5004]: I1201 08:39:37.396808 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhvt2\" (UniqueName: \"kubernetes.io/projected/dd7ffdb9-9cec-4f3a-8b55-65d8f6e42286-kube-api-access-nhvt2\") pod \"barbican-e274-account-create-update-jn8sw\" (UID: \"dd7ffdb9-9cec-4f3a-8b55-65d8f6e42286\") " pod="openstack/barbican-e274-account-create-update-jn8sw" Dec 01 08:39:37 crc kubenswrapper[5004]: I1201 08:39:37.396941 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hk86f\" (UniqueName: \"kubernetes.io/projected/38373747-4dbe-42c7-9060-ada117a776e8-kube-api-access-hk86f\") pod 
\"barbican-db-create-kbqp2\" (UID: \"38373747-4dbe-42c7-9060-ada117a776e8\") " pod="openstack/barbican-db-create-kbqp2" Dec 01 08:39:37 crc kubenswrapper[5004]: I1201 08:39:37.396981 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/38373747-4dbe-42c7-9060-ada117a776e8-operator-scripts\") pod \"barbican-db-create-kbqp2\" (UID: \"38373747-4dbe-42c7-9060-ada117a776e8\") " pod="openstack/barbican-db-create-kbqp2" Dec 01 08:39:37 crc kubenswrapper[5004]: I1201 08:39:37.430686 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-create-z4cqn"] Dec 01 08:39:37 crc kubenswrapper[5004]: I1201 08:39:37.432122 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-z4cqn" Dec 01 08:39:37 crc kubenswrapper[5004]: I1201 08:39:37.438914 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-dlfw2" Dec 01 08:39:37 crc kubenswrapper[5004]: I1201 08:39:37.450904 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-29bc-account-create-update-sgcbq"] Dec 01 08:39:37 crc kubenswrapper[5004]: I1201 08:39:37.452332 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-29bc-account-create-update-sgcbq" Dec 01 08:39:37 crc kubenswrapper[5004]: I1201 08:39:37.454214 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Dec 01 08:39:37 crc kubenswrapper[5004]: I1201 08:39:37.458453 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-z4cqn"] Dec 01 08:39:37 crc kubenswrapper[5004]: I1201 08:39:37.466150 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-29bc-account-create-update-sgcbq"] Dec 01 08:39:37 crc kubenswrapper[5004]: I1201 08:39:37.498830 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sk9qh\" (UniqueName: \"kubernetes.io/projected/e2a8cf8a-0ff4-4fba-a0fa-e62d564230c3-kube-api-access-sk9qh\") pod \"heat-db-create-z4cqn\" (UID: \"e2a8cf8a-0ff4-4fba-a0fa-e62d564230c3\") " pod="openstack/heat-db-create-z4cqn" Dec 01 08:39:37 crc kubenswrapper[5004]: I1201 08:39:37.500872 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hk86f\" (UniqueName: \"kubernetes.io/projected/38373747-4dbe-42c7-9060-ada117a776e8-kube-api-access-hk86f\") pod \"barbican-db-create-kbqp2\" (UID: \"38373747-4dbe-42c7-9060-ada117a776e8\") " pod="openstack/barbican-db-create-kbqp2" Dec 01 08:39:37 crc kubenswrapper[5004]: I1201 08:39:37.501355 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e2a8cf8a-0ff4-4fba-a0fa-e62d564230c3-operator-scripts\") pod \"heat-db-create-z4cqn\" (UID: \"e2a8cf8a-0ff4-4fba-a0fa-e62d564230c3\") " pod="openstack/heat-db-create-z4cqn" Dec 01 08:39:37 crc kubenswrapper[5004]: I1201 08:39:37.501515 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/38373747-4dbe-42c7-9060-ada117a776e8-operator-scripts\") pod \"barbican-db-create-kbqp2\" (UID: \"38373747-4dbe-42c7-9060-ada117a776e8\") " pod="openstack/barbican-db-create-kbqp2" Dec 01 08:39:37 crc kubenswrapper[5004]: I1201 08:39:37.501617 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dd7ffdb9-9cec-4f3a-8b55-65d8f6e42286-operator-scripts\") pod \"barbican-e274-account-create-update-jn8sw\" (UID: \"dd7ffdb9-9cec-4f3a-8b55-65d8f6e42286\") " pod="openstack/barbican-e274-account-create-update-jn8sw" Dec 01 08:39:37 crc kubenswrapper[5004]: I1201 08:39:37.502276 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dd7ffdb9-9cec-4f3a-8b55-65d8f6e42286-operator-scripts\") pod \"barbican-e274-account-create-update-jn8sw\" (UID: \"dd7ffdb9-9cec-4f3a-8b55-65d8f6e42286\") " pod="openstack/barbican-e274-account-create-update-jn8sw" Dec 01 08:39:37 crc kubenswrapper[5004]: I1201 08:39:37.502583 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhvt2\" (UniqueName: \"kubernetes.io/projected/dd7ffdb9-9cec-4f3a-8b55-65d8f6e42286-kube-api-access-nhvt2\") pod \"barbican-e274-account-create-update-jn8sw\" (UID: \"dd7ffdb9-9cec-4f3a-8b55-65d8f6e42286\") " pod="openstack/barbican-e274-account-create-update-jn8sw" Dec 01 08:39:37 crc kubenswrapper[5004]: I1201 08:39:37.502656 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/38373747-4dbe-42c7-9060-ada117a776e8-operator-scripts\") pod \"barbican-db-create-kbqp2\" (UID: \"38373747-4dbe-42c7-9060-ada117a776e8\") " pod="openstack/barbican-db-create-kbqp2" Dec 01 08:39:37 crc kubenswrapper[5004]: I1201 08:39:37.518603 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-2nz97"] 
Dec 01 08:39:37 crc kubenswrapper[5004]: I1201 08:39:37.519837 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-2nz97" Dec 01 08:39:37 crc kubenswrapper[5004]: I1201 08:39:37.532152 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-2nz97"] Dec 01 08:39:37 crc kubenswrapper[5004]: I1201 08:39:37.556750 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhvt2\" (UniqueName: \"kubernetes.io/projected/dd7ffdb9-9cec-4f3a-8b55-65d8f6e42286-kube-api-access-nhvt2\") pod \"barbican-e274-account-create-update-jn8sw\" (UID: \"dd7ffdb9-9cec-4f3a-8b55-65d8f6e42286\") " pod="openstack/barbican-e274-account-create-update-jn8sw" Dec 01 08:39:37 crc kubenswrapper[5004]: I1201 08:39:37.557054 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hk86f\" (UniqueName: \"kubernetes.io/projected/38373747-4dbe-42c7-9060-ada117a776e8-kube-api-access-hk86f\") pod \"barbican-db-create-kbqp2\" (UID: \"38373747-4dbe-42c7-9060-ada117a776e8\") " pod="openstack/barbican-db-create-kbqp2" Dec 01 08:39:37 crc kubenswrapper[5004]: I1201 08:39:37.578943 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-kbqp2" Dec 01 08:39:37 crc kubenswrapper[5004]: I1201 08:39:37.586772 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfc0-account-create-update-6ccd5"] Dec 01 08:39:37 crc kubenswrapper[5004]: I1201 08:39:37.606539 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfc0-account-create-update-6ccd5" Dec 01 08:39:37 crc kubenswrapper[5004]: I1201 08:39:37.611065 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-db-secret" Dec 01 08:39:37 crc kubenswrapper[5004]: I1201 08:39:37.614272 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sk9qh\" (UniqueName: \"kubernetes.io/projected/e2a8cf8a-0ff4-4fba-a0fa-e62d564230c3-kube-api-access-sk9qh\") pod \"heat-db-create-z4cqn\" (UID: \"e2a8cf8a-0ff4-4fba-a0fa-e62d564230c3\") " pod="openstack/heat-db-create-z4cqn" Dec 01 08:39:37 crc kubenswrapper[5004]: I1201 08:39:37.614330 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ac742a71-b3d4-433d-b550-12300a92941d-operator-scripts\") pod \"neutron-db-create-2nz97\" (UID: \"ac742a71-b3d4-433d-b550-12300a92941d\") " pod="openstack/neutron-db-create-2nz97" Dec 01 08:39:37 crc kubenswrapper[5004]: I1201 08:39:37.614382 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/00524f9c-1b21-433f-8886-9b685c169469-operator-scripts\") pod \"cinder-29bc-account-create-update-sgcbq\" (UID: \"00524f9c-1b21-433f-8886-9b685c169469\") " pod="openstack/cinder-29bc-account-create-update-sgcbq" Dec 01 08:39:37 crc kubenswrapper[5004]: I1201 08:39:37.614405 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tnrn\" (UniqueName: \"kubernetes.io/projected/00524f9c-1b21-433f-8886-9b685c169469-kube-api-access-8tnrn\") pod \"cinder-29bc-account-create-update-sgcbq\" (UID: \"00524f9c-1b21-433f-8886-9b685c169469\") " pod="openstack/cinder-29bc-account-create-update-sgcbq" Dec 01 08:39:37 crc kubenswrapper[5004]: I1201 08:39:37.614439 5004 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e2a8cf8a-0ff4-4fba-a0fa-e62d564230c3-operator-scripts\") pod \"heat-db-create-z4cqn\" (UID: \"e2a8cf8a-0ff4-4fba-a0fa-e62d564230c3\") " pod="openstack/heat-db-create-z4cqn" Dec 01 08:39:37 crc kubenswrapper[5004]: I1201 08:39:37.614515 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-728lj\" (UniqueName: \"kubernetes.io/projected/ac742a71-b3d4-433d-b550-12300a92941d-kube-api-access-728lj\") pod \"neutron-db-create-2nz97\" (UID: \"ac742a71-b3d4-433d-b550-12300a92941d\") " pod="openstack/neutron-db-create-2nz97" Dec 01 08:39:37 crc kubenswrapper[5004]: I1201 08:39:37.618857 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfc0-account-create-update-6ccd5"] Dec 01 08:39:37 crc kubenswrapper[5004]: I1201 08:39:37.624660 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e2a8cf8a-0ff4-4fba-a0fa-e62d564230c3-operator-scripts\") pod \"heat-db-create-z4cqn\" (UID: \"e2a8cf8a-0ff4-4fba-a0fa-e62d564230c3\") " pod="openstack/heat-db-create-z4cqn" Dec 01 08:39:37 crc kubenswrapper[5004]: I1201 08:39:37.636275 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sk9qh\" (UniqueName: \"kubernetes.io/projected/e2a8cf8a-0ff4-4fba-a0fa-e62d564230c3-kube-api-access-sk9qh\") pod \"heat-db-create-z4cqn\" (UID: \"e2a8cf8a-0ff4-4fba-a0fa-e62d564230c3\") " pod="openstack/heat-db-create-z4cqn" Dec 01 08:39:37 crc kubenswrapper[5004]: I1201 08:39:37.662701 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-wlpnv"] Dec 01 08:39:37 crc kubenswrapper[5004]: I1201 08:39:37.672722 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-wlpnv" Dec 01 08:39:37 crc kubenswrapper[5004]: I1201 08:39:37.675505 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 01 08:39:37 crc kubenswrapper[5004]: I1201 08:39:37.675505 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 01 08:39:37 crc kubenswrapper[5004]: I1201 08:39:37.678966 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-nlkxt" Dec 01 08:39:37 crc kubenswrapper[5004]: I1201 08:39:37.681847 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-wlpnv"] Dec 01 08:39:37 crc kubenswrapper[5004]: I1201 08:39:37.682877 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 01 08:39:37 crc kubenswrapper[5004]: I1201 08:39:37.688078 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-e274-account-create-update-jn8sw" Dec 01 08:39:37 crc kubenswrapper[5004]: I1201 08:39:37.717368 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ac742a71-b3d4-433d-b550-12300a92941d-operator-scripts\") pod \"neutron-db-create-2nz97\" (UID: \"ac742a71-b3d4-433d-b550-12300a92941d\") " pod="openstack/neutron-db-create-2nz97" Dec 01 08:39:37 crc kubenswrapper[5004]: I1201 08:39:37.717428 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/00524f9c-1b21-433f-8886-9b685c169469-operator-scripts\") pod \"cinder-29bc-account-create-update-sgcbq\" (UID: \"00524f9c-1b21-433f-8886-9b685c169469\") " pod="openstack/cinder-29bc-account-create-update-sgcbq" Dec 01 08:39:37 crc kubenswrapper[5004]: I1201 08:39:37.717451 5004 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-8tnrn\" (UniqueName: \"kubernetes.io/projected/00524f9c-1b21-433f-8886-9b685c169469-kube-api-access-8tnrn\") pod \"cinder-29bc-account-create-update-sgcbq\" (UID: \"00524f9c-1b21-433f-8886-9b685c169469\") " pod="openstack/cinder-29bc-account-create-update-sgcbq" Dec 01 08:39:37 crc kubenswrapper[5004]: I1201 08:39:37.717477 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xhl5\" (UniqueName: \"kubernetes.io/projected/9f34518b-865b-4fe5-b2ff-7060cfced9eb-kube-api-access-6xhl5\") pod \"heat-cfc0-account-create-update-6ccd5\" (UID: \"9f34518b-865b-4fe5-b2ff-7060cfced9eb\") " pod="openstack/heat-cfc0-account-create-update-6ccd5" Dec 01 08:39:37 crc kubenswrapper[5004]: I1201 08:39:37.717525 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-728lj\" (UniqueName: \"kubernetes.io/projected/ac742a71-b3d4-433d-b550-12300a92941d-kube-api-access-728lj\") pod \"neutron-db-create-2nz97\" (UID: \"ac742a71-b3d4-433d-b550-12300a92941d\") " pod="openstack/neutron-db-create-2nz97" Dec 01 08:39:37 crc kubenswrapper[5004]: I1201 08:39:37.717616 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9f34518b-865b-4fe5-b2ff-7060cfced9eb-operator-scripts\") pod \"heat-cfc0-account-create-update-6ccd5\" (UID: \"9f34518b-865b-4fe5-b2ff-7060cfced9eb\") " pod="openstack/heat-cfc0-account-create-update-6ccd5" Dec 01 08:39:37 crc kubenswrapper[5004]: I1201 08:39:37.718269 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ac742a71-b3d4-433d-b550-12300a92941d-operator-scripts\") pod \"neutron-db-create-2nz97\" (UID: \"ac742a71-b3d4-433d-b550-12300a92941d\") " pod="openstack/neutron-db-create-2nz97" Dec 01 08:39:37 crc kubenswrapper[5004]: I1201 
08:39:37.719110 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/00524f9c-1b21-433f-8886-9b685c169469-operator-scripts\") pod \"cinder-29bc-account-create-update-sgcbq\" (UID: \"00524f9c-1b21-433f-8886-9b685c169469\") " pod="openstack/cinder-29bc-account-create-update-sgcbq" Dec 01 08:39:37 crc kubenswrapper[5004]: I1201 08:39:37.741871 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-cba6-account-create-update-rzw6q"] Dec 01 08:39:37 crc kubenswrapper[5004]: I1201 08:39:37.743497 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-cba6-account-create-update-rzw6q" Dec 01 08:39:37 crc kubenswrapper[5004]: I1201 08:39:37.745061 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-728lj\" (UniqueName: \"kubernetes.io/projected/ac742a71-b3d4-433d-b550-12300a92941d-kube-api-access-728lj\") pod \"neutron-db-create-2nz97\" (UID: \"ac742a71-b3d4-433d-b550-12300a92941d\") " pod="openstack/neutron-db-create-2nz97" Dec 01 08:39:37 crc kubenswrapper[5004]: I1201 08:39:37.745174 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8tnrn\" (UniqueName: \"kubernetes.io/projected/00524f9c-1b21-433f-8886-9b685c169469-kube-api-access-8tnrn\") pod \"cinder-29bc-account-create-update-sgcbq\" (UID: \"00524f9c-1b21-433f-8886-9b685c169469\") " pod="openstack/cinder-29bc-account-create-update-sgcbq" Dec 01 08:39:37 crc kubenswrapper[5004]: I1201 08:39:37.745500 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Dec 01 08:39:37 crc kubenswrapper[5004]: I1201 08:39:37.755209 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-create-z4cqn" Dec 01 08:39:37 crc kubenswrapper[5004]: I1201 08:39:37.779958 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-cba6-account-create-update-rzw6q"] Dec 01 08:39:37 crc kubenswrapper[5004]: I1201 08:39:37.781138 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-29bc-account-create-update-sgcbq" Dec 01 08:39:37 crc kubenswrapper[5004]: I1201 08:39:37.819977 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xhl5\" (UniqueName: \"kubernetes.io/projected/9f34518b-865b-4fe5-b2ff-7060cfced9eb-kube-api-access-6xhl5\") pod \"heat-cfc0-account-create-update-6ccd5\" (UID: \"9f34518b-865b-4fe5-b2ff-7060cfced9eb\") " pod="openstack/heat-cfc0-account-create-update-6ccd5" Dec 01 08:39:37 crc kubenswrapper[5004]: I1201 08:39:37.820074 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc5ce303-7a1e-40b2-86f6-861898171b29-combined-ca-bundle\") pod \"keystone-db-sync-wlpnv\" (UID: \"fc5ce303-7a1e-40b2-86f6-861898171b29\") " pod="openstack/keystone-db-sync-wlpnv" Dec 01 08:39:37 crc kubenswrapper[5004]: I1201 08:39:37.820167 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c59984a3-cd16-4aa7-841e-29de227d4f70-operator-scripts\") pod \"neutron-cba6-account-create-update-rzw6q\" (UID: \"c59984a3-cd16-4aa7-841e-29de227d4f70\") " pod="openstack/neutron-cba6-account-create-update-rzw6q" Dec 01 08:39:37 crc kubenswrapper[5004]: I1201 08:39:37.820240 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zszrd\" (UniqueName: \"kubernetes.io/projected/fc5ce303-7a1e-40b2-86f6-861898171b29-kube-api-access-zszrd\") pod 
\"keystone-db-sync-wlpnv\" (UID: \"fc5ce303-7a1e-40b2-86f6-861898171b29\") " pod="openstack/keystone-db-sync-wlpnv" Dec 01 08:39:37 crc kubenswrapper[5004]: I1201 08:39:37.820296 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2lh4\" (UniqueName: \"kubernetes.io/projected/c59984a3-cd16-4aa7-841e-29de227d4f70-kube-api-access-r2lh4\") pod \"neutron-cba6-account-create-update-rzw6q\" (UID: \"c59984a3-cd16-4aa7-841e-29de227d4f70\") " pod="openstack/neutron-cba6-account-create-update-rzw6q" Dec 01 08:39:37 crc kubenswrapper[5004]: I1201 08:39:37.820331 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc5ce303-7a1e-40b2-86f6-861898171b29-config-data\") pod \"keystone-db-sync-wlpnv\" (UID: \"fc5ce303-7a1e-40b2-86f6-861898171b29\") " pod="openstack/keystone-db-sync-wlpnv" Dec 01 08:39:37 crc kubenswrapper[5004]: I1201 08:39:37.820393 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9f34518b-865b-4fe5-b2ff-7060cfced9eb-operator-scripts\") pod \"heat-cfc0-account-create-update-6ccd5\" (UID: \"9f34518b-865b-4fe5-b2ff-7060cfced9eb\") " pod="openstack/heat-cfc0-account-create-update-6ccd5" Dec 01 08:39:37 crc kubenswrapper[5004]: I1201 08:39:37.821575 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9f34518b-865b-4fe5-b2ff-7060cfced9eb-operator-scripts\") pod \"heat-cfc0-account-create-update-6ccd5\" (UID: \"9f34518b-865b-4fe5-b2ff-7060cfced9eb\") " pod="openstack/heat-cfc0-account-create-update-6ccd5" Dec 01 08:39:37 crc kubenswrapper[5004]: I1201 08:39:37.865282 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xhl5\" (UniqueName: 
\"kubernetes.io/projected/9f34518b-865b-4fe5-b2ff-7060cfced9eb-kube-api-access-6xhl5\") pod \"heat-cfc0-account-create-update-6ccd5\" (UID: \"9f34518b-865b-4fe5-b2ff-7060cfced9eb\") " pod="openstack/heat-cfc0-account-create-update-6ccd5" Dec 01 08:39:37 crc kubenswrapper[5004]: I1201 08:39:37.921922 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c59984a3-cd16-4aa7-841e-29de227d4f70-operator-scripts\") pod \"neutron-cba6-account-create-update-rzw6q\" (UID: \"c59984a3-cd16-4aa7-841e-29de227d4f70\") " pod="openstack/neutron-cba6-account-create-update-rzw6q" Dec 01 08:39:37 crc kubenswrapper[5004]: I1201 08:39:37.921988 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zszrd\" (UniqueName: \"kubernetes.io/projected/fc5ce303-7a1e-40b2-86f6-861898171b29-kube-api-access-zszrd\") pod \"keystone-db-sync-wlpnv\" (UID: \"fc5ce303-7a1e-40b2-86f6-861898171b29\") " pod="openstack/keystone-db-sync-wlpnv" Dec 01 08:39:37 crc kubenswrapper[5004]: I1201 08:39:37.922017 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r2lh4\" (UniqueName: \"kubernetes.io/projected/c59984a3-cd16-4aa7-841e-29de227d4f70-kube-api-access-r2lh4\") pod \"neutron-cba6-account-create-update-rzw6q\" (UID: \"c59984a3-cd16-4aa7-841e-29de227d4f70\") " pod="openstack/neutron-cba6-account-create-update-rzw6q" Dec 01 08:39:37 crc kubenswrapper[5004]: I1201 08:39:37.922054 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc5ce303-7a1e-40b2-86f6-861898171b29-config-data\") pod \"keystone-db-sync-wlpnv\" (UID: \"fc5ce303-7a1e-40b2-86f6-861898171b29\") " pod="openstack/keystone-db-sync-wlpnv" Dec 01 08:39:37 crc kubenswrapper[5004]: I1201 08:39:37.922172 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc5ce303-7a1e-40b2-86f6-861898171b29-combined-ca-bundle\") pod \"keystone-db-sync-wlpnv\" (UID: \"fc5ce303-7a1e-40b2-86f6-861898171b29\") " pod="openstack/keystone-db-sync-wlpnv" Dec 01 08:39:37 crc kubenswrapper[5004]: I1201 08:39:37.923107 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c59984a3-cd16-4aa7-841e-29de227d4f70-operator-scripts\") pod \"neutron-cba6-account-create-update-rzw6q\" (UID: \"c59984a3-cd16-4aa7-841e-29de227d4f70\") " pod="openstack/neutron-cba6-account-create-update-rzw6q" Dec 01 08:39:37 crc kubenswrapper[5004]: I1201 08:39:37.925943 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc5ce303-7a1e-40b2-86f6-861898171b29-config-data\") pod \"keystone-db-sync-wlpnv\" (UID: \"fc5ce303-7a1e-40b2-86f6-861898171b29\") " pod="openstack/keystone-db-sync-wlpnv" Dec 01 08:39:37 crc kubenswrapper[5004]: I1201 08:39:37.926388 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc5ce303-7a1e-40b2-86f6-861898171b29-combined-ca-bundle\") pod \"keystone-db-sync-wlpnv\" (UID: \"fc5ce303-7a1e-40b2-86f6-861898171b29\") " pod="openstack/keystone-db-sync-wlpnv" Dec 01 08:39:37 crc kubenswrapper[5004]: I1201 08:39:37.938063 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-2nz97" Dec 01 08:39:37 crc kubenswrapper[5004]: I1201 08:39:37.940981 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2lh4\" (UniqueName: \"kubernetes.io/projected/c59984a3-cd16-4aa7-841e-29de227d4f70-kube-api-access-r2lh4\") pod \"neutron-cba6-account-create-update-rzw6q\" (UID: \"c59984a3-cd16-4aa7-841e-29de227d4f70\") " pod="openstack/neutron-cba6-account-create-update-rzw6q" Dec 01 08:39:37 crc kubenswrapper[5004]: I1201 08:39:37.962222 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zszrd\" (UniqueName: \"kubernetes.io/projected/fc5ce303-7a1e-40b2-86f6-861898171b29-kube-api-access-zszrd\") pod \"keystone-db-sync-wlpnv\" (UID: \"fc5ce303-7a1e-40b2-86f6-861898171b29\") " pod="openstack/keystone-db-sync-wlpnv" Dec 01 08:39:38 crc kubenswrapper[5004]: I1201 08:39:38.003138 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfc0-account-create-update-6ccd5" Dec 01 08:39:38 crc kubenswrapper[5004]: I1201 08:39:38.012016 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-wlpnv" Dec 01 08:39:38 crc kubenswrapper[5004]: I1201 08:39:38.098639 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-cba6-account-create-update-rzw6q" Dec 01 08:39:41 crc kubenswrapper[5004]: I1201 08:39:41.870307 5004 generic.go:334] "Generic (PLEG): container finished" podID="b0dfa0be-7482-49a2-adc0-425cedd5c597" containerID="0309c5e82756a858a9b25dbf890fb071010bba1c03f21b9b5370f7c23d2502a4" exitCode=0 Dec 01 08:39:41 crc kubenswrapper[5004]: I1201 08:39:41.870355 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"b0dfa0be-7482-49a2-adc0-425cedd5c597","Type":"ContainerDied","Data":"0309c5e82756a858a9b25dbf890fb071010bba1c03f21b9b5370f7c23d2502a4"} Dec 01 08:39:43 crc kubenswrapper[5004]: I1201 08:39:43.465566 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-29bc-account-create-update-sgcbq"] Dec 01 08:39:43 crc kubenswrapper[5004]: I1201 08:39:43.598203 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-2nz97"] Dec 01 08:39:43 crc kubenswrapper[5004]: I1201 08:39:43.612515 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-dlfw2"] Dec 01 08:39:43 crc kubenswrapper[5004]: W1201 08:39:43.614449 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podac742a71_b3d4_433d_b550_12300a92941d.slice/crio-615626db86f6bc160373c413deaa131b43df1a5abf812051ca0b105c2fe377a1 WatchSource:0}: Error finding container 615626db86f6bc160373c413deaa131b43df1a5abf812051ca0b105c2fe377a1: Status 404 returned error can't find the container with id 615626db86f6bc160373c413deaa131b43df1a5abf812051ca0b105c2fe377a1 Dec 01 08:39:43 crc kubenswrapper[5004]: W1201 08:39:43.617258 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod73a0817f_d450_4a46_863a_a7483f144851.slice/crio-ccc7c4407c072ad4426f1e20a792f414475a80513d3b14ececdbf781ad99aa81 
WatchSource:0}: Error finding container ccc7c4407c072ad4426f1e20a792f414475a80513d3b14ececdbf781ad99aa81: Status 404 returned error can't find the container with id ccc7c4407c072ad4426f1e20a792f414475a80513d3b14ececdbf781ad99aa81 Dec 01 08:39:43 crc kubenswrapper[5004]: W1201 08:39:43.621199 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfc5ce303_7a1e_40b2_86f6_861898171b29.slice/crio-30ea54a94cc388536da4eaf7b87752d2cf91c96c1e9980921a1201648acb4b35 WatchSource:0}: Error finding container 30ea54a94cc388536da4eaf7b87752d2cf91c96c1e9980921a1201648acb4b35: Status 404 returned error can't find the container with id 30ea54a94cc388536da4eaf7b87752d2cf91c96c1e9980921a1201648acb4b35 Dec 01 08:39:43 crc kubenswrapper[5004]: I1201 08:39:43.625898 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-wlpnv"] Dec 01 08:39:43 crc kubenswrapper[5004]: I1201 08:39:43.939331 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-29bc-account-create-update-sgcbq" event={"ID":"00524f9c-1b21-433f-8886-9b685c169469","Type":"ContainerStarted","Data":"eeabc7c4ec03654e4c7de9c13196eaa169ce257ae0ad80bc728660322766fb84"} Dec 01 08:39:43 crc kubenswrapper[5004]: I1201 08:39:43.939389 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-29bc-account-create-update-sgcbq" event={"ID":"00524f9c-1b21-433f-8886-9b685c169469","Type":"ContainerStarted","Data":"c4c582bb82f5451297ad793aa8d70f776a725a3ced2317432b3f813da05e47f1"} Dec 01 08:39:43 crc kubenswrapper[5004]: I1201 08:39:43.943860 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-wlpnv" event={"ID":"fc5ce303-7a1e-40b2-86f6-861898171b29","Type":"ContainerStarted","Data":"30ea54a94cc388536da4eaf7b87752d2cf91c96c1e9980921a1201648acb4b35"} Dec 01 08:39:43 crc kubenswrapper[5004]: I1201 08:39:43.957025 5004 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"b0dfa0be-7482-49a2-adc0-425cedd5c597","Type":"ContainerStarted","Data":"534c6d7212738f943030e3cd7f9d33f3c0b0dbaeb65f931541100b6ddee977a0"} Dec 01 08:39:43 crc kubenswrapper[5004]: I1201 08:39:43.959005 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-dlfw2" event={"ID":"73a0817f-d450-4a46-863a-a7483f144851","Type":"ContainerStarted","Data":"ccc7c4407c072ad4426f1e20a792f414475a80513d3b14ececdbf781ad99aa81"} Dec 01 08:39:43 crc kubenswrapper[5004]: I1201 08:39:43.965536 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-29bc-account-create-update-sgcbq" podStartSLOduration=6.965516264 podStartE2EDuration="6.965516264s" podCreationTimestamp="2025-12-01 08:39:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:39:43.965025082 +0000 UTC m=+1361.530017074" watchObservedRunningTime="2025-12-01 08:39:43.965516264 +0000 UTC m=+1361.530508246" Dec 01 08:39:43 crc kubenswrapper[5004]: I1201 08:39:43.974090 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"64792f42-dd08-4537-bce9-a632e644cf5a","Type":"ContainerStarted","Data":"89e82021f0480774655d41996cbac4728929b1d3e002f0711ff65b912c17c981"} Dec 01 08:39:43 crc kubenswrapper[5004]: I1201 08:39:43.985322 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-2nz97" event={"ID":"ac742a71-b3d4-433d-b550-12300a92941d","Type":"ContainerStarted","Data":"6e2ee528ac706228aefbe21d58b251f96222ef8ade16570465449fde858fe964"} Dec 01 08:39:43 crc kubenswrapper[5004]: I1201 08:39:43.985364 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-2nz97" 
event={"ID":"ac742a71-b3d4-433d-b550-12300a92941d","Type":"ContainerStarted","Data":"615626db86f6bc160373c413deaa131b43df1a5abf812051ca0b105c2fe377a1"} Dec 01 08:39:44 crc kubenswrapper[5004]: I1201 08:39:44.036956 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfc0-account-create-update-6ccd5"] Dec 01 08:39:44 crc kubenswrapper[5004]: I1201 08:39:44.062274 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-z4cqn"] Dec 01 08:39:44 crc kubenswrapper[5004]: I1201 08:39:44.074364 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=30.467999899 podStartE2EDuration="42.074344902s" podCreationTimestamp="2025-12-01 08:39:02 +0000 UTC" firstStartedPulling="2025-12-01 08:39:20.219708384 +0000 UTC m=+1337.784700386" lastFinishedPulling="2025-12-01 08:39:31.826053417 +0000 UTC m=+1349.391045389" observedRunningTime="2025-12-01 08:39:44.03416135 +0000 UTC m=+1361.599153342" watchObservedRunningTime="2025-12-01 08:39:44.074344902 +0000 UTC m=+1361.639336884" Dec 01 08:39:44 crc kubenswrapper[5004]: I1201 08:39:44.085325 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-create-2nz97" podStartSLOduration=7.085309169 podStartE2EDuration="7.085309169s" podCreationTimestamp="2025-12-01 08:39:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:39:44.058977697 +0000 UTC m=+1361.623969679" watchObservedRunningTime="2025-12-01 08:39:44.085309169 +0000 UTC m=+1361.650301151" Dec 01 08:39:44 crc kubenswrapper[5004]: I1201 08:39:44.086631 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-kbqp2"] Dec 01 08:39:44 crc kubenswrapper[5004]: I1201 08:39:44.105616 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-e274-account-create-update-jn8sw"] 
Dec 01 08:39:44 crc kubenswrapper[5004]: I1201 08:39:44.109525 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-cba6-account-create-update-rzw6q"] Dec 01 08:39:44 crc kubenswrapper[5004]: W1201 08:39:44.155790 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod38373747_4dbe_42c7_9060_ada117a776e8.slice/crio-ee939f7a90a2fe8d6cea11e2d319cb488672d5369b04df1d870efc89d581c510 WatchSource:0}: Error finding container ee939f7a90a2fe8d6cea11e2d319cb488672d5369b04df1d870efc89d581c510: Status 404 returned error can't find the container with id ee939f7a90a2fe8d6cea11e2d319cb488672d5369b04df1d870efc89d581c510 Dec 01 08:39:44 crc kubenswrapper[5004]: I1201 08:39:44.366013 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6d5b6d6b67-25snn"] Dec 01 08:39:44 crc kubenswrapper[5004]: I1201 08:39:44.368385 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d5b6d6b67-25snn" Dec 01 08:39:44 crc kubenswrapper[5004]: I1201 08:39:44.370972 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Dec 01 08:39:44 crc kubenswrapper[5004]: I1201 08:39:44.380322 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d5b6d6b67-25snn"] Dec 01 08:39:44 crc kubenswrapper[5004]: I1201 08:39:44.476034 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cce6a698-0c81-45a6-8025-c04547cf2768-dns-swift-storage-0\") pod \"dnsmasq-dns-6d5b6d6b67-25snn\" (UID: \"cce6a698-0c81-45a6-8025-c04547cf2768\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-25snn" Dec 01 08:39:44 crc kubenswrapper[5004]: I1201 08:39:44.476095 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/cce6a698-0c81-45a6-8025-c04547cf2768-dns-svc\") pod \"dnsmasq-dns-6d5b6d6b67-25snn\" (UID: \"cce6a698-0c81-45a6-8025-c04547cf2768\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-25snn" Dec 01 08:39:44 crc kubenswrapper[5004]: I1201 08:39:44.476222 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cce6a698-0c81-45a6-8025-c04547cf2768-config\") pod \"dnsmasq-dns-6d5b6d6b67-25snn\" (UID: \"cce6a698-0c81-45a6-8025-c04547cf2768\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-25snn" Dec 01 08:39:44 crc kubenswrapper[5004]: I1201 08:39:44.476251 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfkq4\" (UniqueName: \"kubernetes.io/projected/cce6a698-0c81-45a6-8025-c04547cf2768-kube-api-access-cfkq4\") pod \"dnsmasq-dns-6d5b6d6b67-25snn\" (UID: \"cce6a698-0c81-45a6-8025-c04547cf2768\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-25snn" Dec 01 08:39:44 crc kubenswrapper[5004]: I1201 08:39:44.476426 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cce6a698-0c81-45a6-8025-c04547cf2768-ovsdbserver-nb\") pod \"dnsmasq-dns-6d5b6d6b67-25snn\" (UID: \"cce6a698-0c81-45a6-8025-c04547cf2768\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-25snn" Dec 01 08:39:44 crc kubenswrapper[5004]: I1201 08:39:44.476488 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cce6a698-0c81-45a6-8025-c04547cf2768-ovsdbserver-sb\") pod \"dnsmasq-dns-6d5b6d6b67-25snn\" (UID: \"cce6a698-0c81-45a6-8025-c04547cf2768\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-25snn" Dec 01 08:39:44 crc kubenswrapper[5004]: I1201 08:39:44.579613 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cce6a698-0c81-45a6-8025-c04547cf2768-dns-swift-storage-0\") pod \"dnsmasq-dns-6d5b6d6b67-25snn\" (UID: \"cce6a698-0c81-45a6-8025-c04547cf2768\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-25snn" Dec 01 08:39:44 crc kubenswrapper[5004]: I1201 08:39:44.579675 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cce6a698-0c81-45a6-8025-c04547cf2768-dns-svc\") pod \"dnsmasq-dns-6d5b6d6b67-25snn\" (UID: \"cce6a698-0c81-45a6-8025-c04547cf2768\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-25snn" Dec 01 08:39:44 crc kubenswrapper[5004]: I1201 08:39:44.579785 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cce6a698-0c81-45a6-8025-c04547cf2768-config\") pod \"dnsmasq-dns-6d5b6d6b67-25snn\" (UID: \"cce6a698-0c81-45a6-8025-c04547cf2768\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-25snn" Dec 01 08:39:44 crc kubenswrapper[5004]: I1201 08:39:44.579807 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cfkq4\" (UniqueName: \"kubernetes.io/projected/cce6a698-0c81-45a6-8025-c04547cf2768-kube-api-access-cfkq4\") pod \"dnsmasq-dns-6d5b6d6b67-25snn\" (UID: \"cce6a698-0c81-45a6-8025-c04547cf2768\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-25snn" Dec 01 08:39:44 crc kubenswrapper[5004]: I1201 08:39:44.579884 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cce6a698-0c81-45a6-8025-c04547cf2768-ovsdbserver-nb\") pod \"dnsmasq-dns-6d5b6d6b67-25snn\" (UID: \"cce6a698-0c81-45a6-8025-c04547cf2768\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-25snn" Dec 01 08:39:44 crc kubenswrapper[5004]: I1201 08:39:44.579927 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/cce6a698-0c81-45a6-8025-c04547cf2768-ovsdbserver-sb\") pod \"dnsmasq-dns-6d5b6d6b67-25snn\" (UID: \"cce6a698-0c81-45a6-8025-c04547cf2768\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-25snn" Dec 01 08:39:44 crc kubenswrapper[5004]: I1201 08:39:44.580916 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cce6a698-0c81-45a6-8025-c04547cf2768-config\") pod \"dnsmasq-dns-6d5b6d6b67-25snn\" (UID: \"cce6a698-0c81-45a6-8025-c04547cf2768\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-25snn" Dec 01 08:39:44 crc kubenswrapper[5004]: I1201 08:39:44.580938 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cce6a698-0c81-45a6-8025-c04547cf2768-ovsdbserver-sb\") pod \"dnsmasq-dns-6d5b6d6b67-25snn\" (UID: \"cce6a698-0c81-45a6-8025-c04547cf2768\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-25snn" Dec 01 08:39:44 crc kubenswrapper[5004]: I1201 08:39:44.581994 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cce6a698-0c81-45a6-8025-c04547cf2768-ovsdbserver-nb\") pod \"dnsmasq-dns-6d5b6d6b67-25snn\" (UID: \"cce6a698-0c81-45a6-8025-c04547cf2768\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-25snn" Dec 01 08:39:44 crc kubenswrapper[5004]: I1201 08:39:44.583098 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cce6a698-0c81-45a6-8025-c04547cf2768-dns-svc\") pod \"dnsmasq-dns-6d5b6d6b67-25snn\" (UID: \"cce6a698-0c81-45a6-8025-c04547cf2768\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-25snn" Dec 01 08:39:44 crc kubenswrapper[5004]: I1201 08:39:44.584135 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cce6a698-0c81-45a6-8025-c04547cf2768-dns-swift-storage-0\") pod 
\"dnsmasq-dns-6d5b6d6b67-25snn\" (UID: \"cce6a698-0c81-45a6-8025-c04547cf2768\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-25snn" Dec 01 08:39:44 crc kubenswrapper[5004]: I1201 08:39:44.607702 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfkq4\" (UniqueName: \"kubernetes.io/projected/cce6a698-0c81-45a6-8025-c04547cf2768-kube-api-access-cfkq4\") pod \"dnsmasq-dns-6d5b6d6b67-25snn\" (UID: \"cce6a698-0c81-45a6-8025-c04547cf2768\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-25snn" Dec 01 08:39:44 crc kubenswrapper[5004]: I1201 08:39:44.874206 5004 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","podabf31963-3bf7-4b6e-adaa-8605634a9530"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort podabf31963-3bf7-4b6e-adaa-8605634a9530] : Timed out while waiting for systemd to remove kubepods-besteffort-podabf31963_3bf7_4b6e_adaa_8605634a9530.slice" Dec 01 08:39:44 crc kubenswrapper[5004]: E1201 08:39:44.874256 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort podabf31963-3bf7-4b6e-adaa-8605634a9530] : unable to destroy cgroup paths for cgroup [kubepods besteffort podabf31963-3bf7-4b6e-adaa-8605634a9530] : Timed out while waiting for systemd to remove kubepods-besteffort-podabf31963_3bf7_4b6e_adaa_8605634a9530.slice" pod="openstack/dnsmasq-dns-8554648995-4vb89" podUID="abf31963-3bf7-4b6e-adaa-8605634a9530" Dec 01 08:39:45 crc kubenswrapper[5004]: I1201 08:39:45.013467 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d5b6d6b67-25snn" Dec 01 08:39:45 crc kubenswrapper[5004]: I1201 08:39:45.030601 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfc0-account-create-update-6ccd5" event={"ID":"9f34518b-865b-4fe5-b2ff-7060cfced9eb","Type":"ContainerStarted","Data":"c17028943fa6f4e919a00b9cd99dbee6194954d370f0b8b0ff598ebde9129147"} Dec 01 08:39:45 crc kubenswrapper[5004]: I1201 08:39:45.033412 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-cba6-account-create-update-rzw6q" event={"ID":"c59984a3-cd16-4aa7-841e-29de227d4f70","Type":"ContainerStarted","Data":"c97c0dcdd47c103d11157a9c2ba1ba8dc1b27bbb9935560e32e9f0236f3435db"} Dec 01 08:39:45 crc kubenswrapper[5004]: I1201 08:39:45.035780 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-e274-account-create-update-jn8sw" event={"ID":"dd7ffdb9-9cec-4f3a-8b55-65d8f6e42286","Type":"ContainerStarted","Data":"895a7fff306308ff201de96aef23aa893bf75ea1e7829852b23cb00853c1e14d"} Dec 01 08:39:45 crc kubenswrapper[5004]: I1201 08:39:45.038744 5004 generic.go:334] "Generic (PLEG): container finished" podID="ac742a71-b3d4-433d-b550-12300a92941d" containerID="6e2ee528ac706228aefbe21d58b251f96222ef8ade16570465449fde858fe964" exitCode=0 Dec 01 08:39:45 crc kubenswrapper[5004]: I1201 08:39:45.038797 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-2nz97" event={"ID":"ac742a71-b3d4-433d-b550-12300a92941d","Type":"ContainerDied","Data":"6e2ee528ac706228aefbe21d58b251f96222ef8ade16570465449fde858fe964"} Dec 01 08:39:45 crc kubenswrapper[5004]: I1201 08:39:45.046440 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-kbqp2" event={"ID":"38373747-4dbe-42c7-9060-ada117a776e8","Type":"ContainerStarted","Data":"ee939f7a90a2fe8d6cea11e2d319cb488672d5369b04df1d870efc89d581c510"} Dec 01 08:39:45 crc kubenswrapper[5004]: I1201 08:39:45.047636 5004 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-z4cqn" event={"ID":"e2a8cf8a-0ff4-4fba-a0fa-e62d564230c3","Type":"ContainerStarted","Data":"1bcd2e05b27ac25006bd11058d6f2883caf349cc2fed830432ac1420ae8701e8"} Dec 01 08:39:45 crc kubenswrapper[5004]: I1201 08:39:45.052629 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-vp28t" event={"ID":"96f41ff9-6619-4262-8ecb-0a577f611f68","Type":"ContainerStarted","Data":"c661d564a31c9f258ed7a2bd32494a86bfb175494539a7c0df1df08e5114efe1"} Dec 01 08:39:45 crc kubenswrapper[5004]: I1201 08:39:45.055511 5004 generic.go:334] "Generic (PLEG): container finished" podID="00524f9c-1b21-433f-8886-9b685c169469" containerID="eeabc7c4ec03654e4c7de9c13196eaa169ce257ae0ad80bc728660322766fb84" exitCode=0 Dec 01 08:39:45 crc kubenswrapper[5004]: I1201 08:39:45.055591 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-29bc-account-create-update-sgcbq" event={"ID":"00524f9c-1b21-433f-8886-9b685c169469","Type":"ContainerDied","Data":"eeabc7c4ec03654e4c7de9c13196eaa169ce257ae0ad80bc728660322766fb84"} Dec 01 08:39:45 crc kubenswrapper[5004]: I1201 08:39:45.057630 5004 generic.go:334] "Generic (PLEG): container finished" podID="73a0817f-d450-4a46-863a-a7483f144851" containerID="98f2535ae0a25e20bcb06db1552d9613f56abf1fe5c4d14ede6957f9aacf1df2" exitCode=0 Dec 01 08:39:45 crc kubenswrapper[5004]: I1201 08:39:45.057714 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-4vb89" Dec 01 08:39:45 crc kubenswrapper[5004]: I1201 08:39:45.058104 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-dlfw2" event={"ID":"73a0817f-d450-4a46-863a-a7483f144851","Type":"ContainerDied","Data":"98f2535ae0a25e20bcb06db1552d9613f56abf1fe5c4d14ede6957f9aacf1df2"} Dec 01 08:39:45 crc kubenswrapper[5004]: I1201 08:39:45.075282 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-vp28t" podStartSLOduration=4.279705103 podStartE2EDuration="25.07526594s" podCreationTimestamp="2025-12-01 08:39:20 +0000 UTC" firstStartedPulling="2025-12-01 08:39:22.312940323 +0000 UTC m=+1339.877932305" lastFinishedPulling="2025-12-01 08:39:43.10850116 +0000 UTC m=+1360.673493142" observedRunningTime="2025-12-01 08:39:45.069222183 +0000 UTC m=+1362.634214185" watchObservedRunningTime="2025-12-01 08:39:45.07526594 +0000 UTC m=+1362.640257922" Dec 01 08:39:45 crc kubenswrapper[5004]: I1201 08:39:45.495438 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-4vb89"] Dec 01 08:39:45 crc kubenswrapper[5004]: I1201 08:39:45.508878 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8554648995-4vb89"] Dec 01 08:39:45 crc kubenswrapper[5004]: I1201 08:39:45.523508 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d5b6d6b67-25snn"] Dec 01 08:39:45 crc kubenswrapper[5004]: W1201 08:39:45.612029 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcce6a698_0c81_45a6_8025_c04547cf2768.slice/crio-ad9bcba064c64b134d03a865738a880242a41685450a5396536117217c56be21 WatchSource:0}: Error finding container ad9bcba064c64b134d03a865738a880242a41685450a5396536117217c56be21: Status 404 returned error can't find the container with id 
ad9bcba064c64b134d03a865738a880242a41685450a5396536117217c56be21 Dec 01 08:39:46 crc kubenswrapper[5004]: I1201 08:39:46.073363 5004 generic.go:334] "Generic (PLEG): container finished" podID="38373747-4dbe-42c7-9060-ada117a776e8" containerID="86b40f1f3075aac7be5469dc08cc364ca5d64826b94e214f37b900ceb881d552" exitCode=0 Dec 01 08:39:46 crc kubenswrapper[5004]: I1201 08:39:46.073422 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-kbqp2" event={"ID":"38373747-4dbe-42c7-9060-ada117a776e8","Type":"ContainerDied","Data":"86b40f1f3075aac7be5469dc08cc364ca5d64826b94e214f37b900ceb881d552"} Dec 01 08:39:46 crc kubenswrapper[5004]: I1201 08:39:46.075059 5004 generic.go:334] "Generic (PLEG): container finished" podID="e2a8cf8a-0ff4-4fba-a0fa-e62d564230c3" containerID="c8c8f04a8e2afb0cdfb75fd84a86f5dd282017b6649feadc92a9109ebe31898c" exitCode=0 Dec 01 08:39:46 crc kubenswrapper[5004]: I1201 08:39:46.075124 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-z4cqn" event={"ID":"e2a8cf8a-0ff4-4fba-a0fa-e62d564230c3","Type":"ContainerDied","Data":"c8c8f04a8e2afb0cdfb75fd84a86f5dd282017b6649feadc92a9109ebe31898c"} Dec 01 08:39:46 crc kubenswrapper[5004]: I1201 08:39:46.078986 5004 generic.go:334] "Generic (PLEG): container finished" podID="9f34518b-865b-4fe5-b2ff-7060cfced9eb" containerID="da82e03befca2bf3ab63cb88ce3c79cca63e76d05276ed4118d1de5853eb3be5" exitCode=0 Dec 01 08:39:46 crc kubenswrapper[5004]: I1201 08:39:46.079094 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfc0-account-create-update-6ccd5" event={"ID":"9f34518b-865b-4fe5-b2ff-7060cfced9eb","Type":"ContainerDied","Data":"da82e03befca2bf3ab63cb88ce3c79cca63e76d05276ed4118d1de5853eb3be5"} Dec 01 08:39:46 crc kubenswrapper[5004]: I1201 08:39:46.080649 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d5b6d6b67-25snn" 
event={"ID":"cce6a698-0c81-45a6-8025-c04547cf2768","Type":"ContainerStarted","Data":"ad9bcba064c64b134d03a865738a880242a41685450a5396536117217c56be21"} Dec 01 08:39:46 crc kubenswrapper[5004]: I1201 08:39:46.082231 5004 generic.go:334] "Generic (PLEG): container finished" podID="dd7ffdb9-9cec-4f3a-8b55-65d8f6e42286" containerID="8967286d52c4953e254ebe1d7620e40da3f3c0b514f684c4b6e4d9ac963665ee" exitCode=0 Dec 01 08:39:46 crc kubenswrapper[5004]: I1201 08:39:46.082448 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-e274-account-create-update-jn8sw" event={"ID":"dd7ffdb9-9cec-4f3a-8b55-65d8f6e42286","Type":"ContainerDied","Data":"8967286d52c4953e254ebe1d7620e40da3f3c0b514f684c4b6e4d9ac963665ee"} Dec 01 08:39:46 crc kubenswrapper[5004]: I1201 08:39:46.774044 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="abf31963-3bf7-4b6e-adaa-8605634a9530" path="/var/lib/kubelet/pods/abf31963-3bf7-4b6e-adaa-8605634a9530/volumes" Dec 01 08:39:47 crc kubenswrapper[5004]: I1201 08:39:47.101151 5004 generic.go:334] "Generic (PLEG): container finished" podID="c59984a3-cd16-4aa7-841e-29de227d4f70" containerID="8c8510fcff8031c7a9daf6c7d56792ec25e303f0f25a8d8ac95b1d4083593225" exitCode=0 Dec 01 08:39:47 crc kubenswrapper[5004]: I1201 08:39:47.101213 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-cba6-account-create-update-rzw6q" event={"ID":"c59984a3-cd16-4aa7-841e-29de227d4f70","Type":"ContainerDied","Data":"8c8510fcff8031c7a9daf6c7d56792ec25e303f0f25a8d8ac95b1d4083593225"} Dec 01 08:39:47 crc kubenswrapper[5004]: I1201 08:39:47.122199 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"b0dfa0be-7482-49a2-adc0-425cedd5c597","Type":"ContainerStarted","Data":"fdec30f0369330e524f58a0bd7551601c96794b197ea25c2d4812fd2f13874ad"} Dec 01 08:39:47 crc kubenswrapper[5004]: I1201 08:39:47.130852 5004 generic.go:334] "Generic (PLEG): container finished" 
podID="cce6a698-0c81-45a6-8025-c04547cf2768" containerID="b9fd84922bf36669f215326eb888558fc06f6cbf95bab1e4f4a38827a42a3a6d" exitCode=0 Dec 01 08:39:47 crc kubenswrapper[5004]: I1201 08:39:47.132034 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d5b6d6b67-25snn" event={"ID":"cce6a698-0c81-45a6-8025-c04547cf2768","Type":"ContainerDied","Data":"b9fd84922bf36669f215326eb888558fc06f6cbf95bab1e4f4a38827a42a3a6d"} Dec 01 08:39:50 crc kubenswrapper[5004]: I1201 08:39:50.007046 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-29bc-account-create-update-sgcbq" Dec 01 08:39:50 crc kubenswrapper[5004]: I1201 08:39:50.034767 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-z4cqn" Dec 01 08:39:50 crc kubenswrapper[5004]: I1201 08:39:50.040285 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfc0-account-create-update-6ccd5" Dec 01 08:39:50 crc kubenswrapper[5004]: I1201 08:39:50.061240 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-cba6-account-create-update-rzw6q" Dec 01 08:39:50 crc kubenswrapper[5004]: I1201 08:39:50.093300 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-2nz97" Dec 01 08:39:50 crc kubenswrapper[5004]: I1201 08:39:50.101537 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-kbqp2" Dec 01 08:39:50 crc kubenswrapper[5004]: I1201 08:39:50.107906 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-e274-account-create-update-jn8sw" Dec 01 08:39:50 crc kubenswrapper[5004]: I1201 08:39:50.122022 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6xhl5\" (UniqueName: \"kubernetes.io/projected/9f34518b-865b-4fe5-b2ff-7060cfced9eb-kube-api-access-6xhl5\") pod \"9f34518b-865b-4fe5-b2ff-7060cfced9eb\" (UID: \"9f34518b-865b-4fe5-b2ff-7060cfced9eb\") " Dec 01 08:39:50 crc kubenswrapper[5004]: I1201 08:39:50.122076 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ac742a71-b3d4-433d-b550-12300a92941d-operator-scripts\") pod \"ac742a71-b3d4-433d-b550-12300a92941d\" (UID: \"ac742a71-b3d4-433d-b550-12300a92941d\") " Dec 01 08:39:50 crc kubenswrapper[5004]: I1201 08:39:50.122112 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9f34518b-865b-4fe5-b2ff-7060cfced9eb-operator-scripts\") pod \"9f34518b-865b-4fe5-b2ff-7060cfced9eb\" (UID: \"9f34518b-865b-4fe5-b2ff-7060cfced9eb\") " Dec 01 08:39:50 crc kubenswrapper[5004]: I1201 08:39:50.122168 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tnrn\" (UniqueName: \"kubernetes.io/projected/00524f9c-1b21-433f-8886-9b685c169469-kube-api-access-8tnrn\") pod \"00524f9c-1b21-433f-8886-9b685c169469\" (UID: \"00524f9c-1b21-433f-8886-9b685c169469\") " Dec 01 08:39:50 crc kubenswrapper[5004]: I1201 08:39:50.122228 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r2lh4\" (UniqueName: \"kubernetes.io/projected/c59984a3-cd16-4aa7-841e-29de227d4f70-kube-api-access-r2lh4\") pod \"c59984a3-cd16-4aa7-841e-29de227d4f70\" (UID: \"c59984a3-cd16-4aa7-841e-29de227d4f70\") " Dec 01 08:39:50 crc kubenswrapper[5004]: I1201 08:39:50.122262 5004 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c59984a3-cd16-4aa7-841e-29de227d4f70-operator-scripts\") pod \"c59984a3-cd16-4aa7-841e-29de227d4f70\" (UID: \"c59984a3-cd16-4aa7-841e-29de227d4f70\") " Dec 01 08:39:50 crc kubenswrapper[5004]: I1201 08:39:50.122342 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/00524f9c-1b21-433f-8886-9b685c169469-operator-scripts\") pod \"00524f9c-1b21-433f-8886-9b685c169469\" (UID: \"00524f9c-1b21-433f-8886-9b685c169469\") " Dec 01 08:39:50 crc kubenswrapper[5004]: I1201 08:39:50.122409 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e2a8cf8a-0ff4-4fba-a0fa-e62d564230c3-operator-scripts\") pod \"e2a8cf8a-0ff4-4fba-a0fa-e62d564230c3\" (UID: \"e2a8cf8a-0ff4-4fba-a0fa-e62d564230c3\") " Dec 01 08:39:50 crc kubenswrapper[5004]: I1201 08:39:50.122425 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sk9qh\" (UniqueName: \"kubernetes.io/projected/e2a8cf8a-0ff4-4fba-a0fa-e62d564230c3-kube-api-access-sk9qh\") pod \"e2a8cf8a-0ff4-4fba-a0fa-e62d564230c3\" (UID: \"e2a8cf8a-0ff4-4fba-a0fa-e62d564230c3\") " Dec 01 08:39:50 crc kubenswrapper[5004]: I1201 08:39:50.122462 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-728lj\" (UniqueName: \"kubernetes.io/projected/ac742a71-b3d4-433d-b550-12300a92941d-kube-api-access-728lj\") pod \"ac742a71-b3d4-433d-b550-12300a92941d\" (UID: \"ac742a71-b3d4-433d-b550-12300a92941d\") " Dec 01 08:39:50 crc kubenswrapper[5004]: I1201 08:39:50.123318 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac742a71-b3d4-433d-b550-12300a92941d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod 
"ac742a71-b3d4-433d-b550-12300a92941d" (UID: "ac742a71-b3d4-433d-b550-12300a92941d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:39:50 crc kubenswrapper[5004]: I1201 08:39:50.123772 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00524f9c-1b21-433f-8886-9b685c169469-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "00524f9c-1b21-433f-8886-9b685c169469" (UID: "00524f9c-1b21-433f-8886-9b685c169469"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:39:50 crc kubenswrapper[5004]: I1201 08:39:50.123892 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f34518b-865b-4fe5-b2ff-7060cfced9eb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9f34518b-865b-4fe5-b2ff-7060cfced9eb" (UID: "9f34518b-865b-4fe5-b2ff-7060cfced9eb"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:39:50 crc kubenswrapper[5004]: I1201 08:39:50.124089 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-dlfw2" Dec 01 08:39:50 crc kubenswrapper[5004]: I1201 08:39:50.125114 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2a8cf8a-0ff4-4fba-a0fa-e62d564230c3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e2a8cf8a-0ff4-4fba-a0fa-e62d564230c3" (UID: "e2a8cf8a-0ff4-4fba-a0fa-e62d564230c3"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:39:50 crc kubenswrapper[5004]: I1201 08:39:50.126830 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c59984a3-cd16-4aa7-841e-29de227d4f70-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c59984a3-cd16-4aa7-841e-29de227d4f70" (UID: "c59984a3-cd16-4aa7-841e-29de227d4f70"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:39:50 crc kubenswrapper[5004]: I1201 08:39:50.151980 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f34518b-865b-4fe5-b2ff-7060cfced9eb-kube-api-access-6xhl5" (OuterVolumeSpecName: "kube-api-access-6xhl5") pod "9f34518b-865b-4fe5-b2ff-7060cfced9eb" (UID: "9f34518b-865b-4fe5-b2ff-7060cfced9eb"). InnerVolumeSpecName "kube-api-access-6xhl5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:39:50 crc kubenswrapper[5004]: I1201 08:39:50.152687 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2a8cf8a-0ff4-4fba-a0fa-e62d564230c3-kube-api-access-sk9qh" (OuterVolumeSpecName: "kube-api-access-sk9qh") pod "e2a8cf8a-0ff4-4fba-a0fa-e62d564230c3" (UID: "e2a8cf8a-0ff4-4fba-a0fa-e62d564230c3"). InnerVolumeSpecName "kube-api-access-sk9qh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:39:50 crc kubenswrapper[5004]: I1201 08:39:50.154047 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c59984a3-cd16-4aa7-841e-29de227d4f70-kube-api-access-r2lh4" (OuterVolumeSpecName: "kube-api-access-r2lh4") pod "c59984a3-cd16-4aa7-841e-29de227d4f70" (UID: "c59984a3-cd16-4aa7-841e-29de227d4f70"). InnerVolumeSpecName "kube-api-access-r2lh4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:39:50 crc kubenswrapper[5004]: I1201 08:39:50.155981 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac742a71-b3d4-433d-b550-12300a92941d-kube-api-access-728lj" (OuterVolumeSpecName: "kube-api-access-728lj") pod "ac742a71-b3d4-433d-b550-12300a92941d" (UID: "ac742a71-b3d4-433d-b550-12300a92941d"). InnerVolumeSpecName "kube-api-access-728lj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:39:50 crc kubenswrapper[5004]: I1201 08:39:50.157592 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00524f9c-1b21-433f-8886-9b685c169469-kube-api-access-8tnrn" (OuterVolumeSpecName: "kube-api-access-8tnrn") pod "00524f9c-1b21-433f-8886-9b685c169469" (UID: "00524f9c-1b21-433f-8886-9b685c169469"). InnerVolumeSpecName "kube-api-access-8tnrn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:39:50 crc kubenswrapper[5004]: I1201 08:39:50.165387 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-29bc-account-create-update-sgcbq" event={"ID":"00524f9c-1b21-433f-8886-9b685c169469","Type":"ContainerDied","Data":"c4c582bb82f5451297ad793aa8d70f776a725a3ced2317432b3f813da05e47f1"} Dec 01 08:39:50 crc kubenswrapper[5004]: I1201 08:39:50.165438 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c4c582bb82f5451297ad793aa8d70f776a725a3ced2317432b3f813da05e47f1" Dec 01 08:39:50 crc kubenswrapper[5004]: I1201 08:39:50.165511 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-29bc-account-create-update-sgcbq" Dec 01 08:39:50 crc kubenswrapper[5004]: I1201 08:39:50.174814 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-z4cqn" event={"ID":"e2a8cf8a-0ff4-4fba-a0fa-e62d564230c3","Type":"ContainerDied","Data":"1bcd2e05b27ac25006bd11058d6f2883caf349cc2fed830432ac1420ae8701e8"} Dec 01 08:39:50 crc kubenswrapper[5004]: I1201 08:39:50.174853 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1bcd2e05b27ac25006bd11058d6f2883caf349cc2fed830432ac1420ae8701e8" Dec 01 08:39:50 crc kubenswrapper[5004]: I1201 08:39:50.175140 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-z4cqn" Dec 01 08:39:50 crc kubenswrapper[5004]: I1201 08:39:50.178038 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfc0-account-create-update-6ccd5" Dec 01 08:39:50 crc kubenswrapper[5004]: I1201 08:39:50.178147 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfc0-account-create-update-6ccd5" event={"ID":"9f34518b-865b-4fe5-b2ff-7060cfced9eb","Type":"ContainerDied","Data":"c17028943fa6f4e919a00b9cd99dbee6194954d370f0b8b0ff598ebde9129147"} Dec 01 08:39:50 crc kubenswrapper[5004]: I1201 08:39:50.178980 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c17028943fa6f4e919a00b9cd99dbee6194954d370f0b8b0ff598ebde9129147" Dec 01 08:39:50 crc kubenswrapper[5004]: I1201 08:39:50.180955 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-cba6-account-create-update-rzw6q" event={"ID":"c59984a3-cd16-4aa7-841e-29de227d4f70","Type":"ContainerDied","Data":"c97c0dcdd47c103d11157a9c2ba1ba8dc1b27bbb9935560e32e9f0236f3435db"} Dec 01 08:39:50 crc kubenswrapper[5004]: I1201 08:39:50.186673 5004 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="c97c0dcdd47c103d11157a9c2ba1ba8dc1b27bbb9935560e32e9f0236f3435db" Dec 01 08:39:50 crc kubenswrapper[5004]: I1201 08:39:50.186722 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-dlfw2" event={"ID":"73a0817f-d450-4a46-863a-a7483f144851","Type":"ContainerDied","Data":"ccc7c4407c072ad4426f1e20a792f414475a80513d3b14ececdbf781ad99aa81"} Dec 01 08:39:50 crc kubenswrapper[5004]: I1201 08:39:50.186756 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ccc7c4407c072ad4426f1e20a792f414475a80513d3b14ececdbf781ad99aa81" Dec 01 08:39:50 crc kubenswrapper[5004]: I1201 08:39:50.185386 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-dlfw2" Dec 01 08:39:50 crc kubenswrapper[5004]: I1201 08:39:50.180975 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-cba6-account-create-update-rzw6q" Dec 01 08:39:50 crc kubenswrapper[5004]: I1201 08:39:50.188990 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-e274-account-create-update-jn8sw" Dec 01 08:39:50 crc kubenswrapper[5004]: I1201 08:39:50.188994 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-e274-account-create-update-jn8sw" event={"ID":"dd7ffdb9-9cec-4f3a-8b55-65d8f6e42286","Type":"ContainerDied","Data":"895a7fff306308ff201de96aef23aa893bf75ea1e7829852b23cb00853c1e14d"} Dec 01 08:39:50 crc kubenswrapper[5004]: I1201 08:39:50.189511 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="895a7fff306308ff201de96aef23aa893bf75ea1e7829852b23cb00853c1e14d" Dec 01 08:39:50 crc kubenswrapper[5004]: I1201 08:39:50.193052 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-2nz97" event={"ID":"ac742a71-b3d4-433d-b550-12300a92941d","Type":"ContainerDied","Data":"615626db86f6bc160373c413deaa131b43df1a5abf812051ca0b105c2fe377a1"} Dec 01 08:39:50 crc kubenswrapper[5004]: I1201 08:39:50.193177 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="615626db86f6bc160373c413deaa131b43df1a5abf812051ca0b105c2fe377a1" Dec 01 08:39:50 crc kubenswrapper[5004]: I1201 08:39:50.193277 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-2nz97" Dec 01 08:39:50 crc kubenswrapper[5004]: I1201 08:39:50.195462 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-kbqp2" event={"ID":"38373747-4dbe-42c7-9060-ada117a776e8","Type":"ContainerDied","Data":"ee939f7a90a2fe8d6cea11e2d319cb488672d5369b04df1d870efc89d581c510"} Dec 01 08:39:50 crc kubenswrapper[5004]: I1201 08:39:50.195484 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ee939f7a90a2fe8d6cea11e2d319cb488672d5369b04df1d870efc89d581c510" Dec 01 08:39:50 crc kubenswrapper[5004]: I1201 08:39:50.195518 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-kbqp2" Dec 01 08:39:50 crc kubenswrapper[5004]: I1201 08:39:50.225992 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/38373747-4dbe-42c7-9060-ada117a776e8-operator-scripts\") pod \"38373747-4dbe-42c7-9060-ada117a776e8\" (UID: \"38373747-4dbe-42c7-9060-ada117a776e8\") " Dec 01 08:39:50 crc kubenswrapper[5004]: I1201 08:39:50.226076 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7k7px\" (UniqueName: \"kubernetes.io/projected/73a0817f-d450-4a46-863a-a7483f144851-kube-api-access-7k7px\") pod \"73a0817f-d450-4a46-863a-a7483f144851\" (UID: \"73a0817f-d450-4a46-863a-a7483f144851\") " Dec 01 08:39:50 crc kubenswrapper[5004]: I1201 08:39:50.226114 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dd7ffdb9-9cec-4f3a-8b55-65d8f6e42286-operator-scripts\") pod \"dd7ffdb9-9cec-4f3a-8b55-65d8f6e42286\" (UID: \"dd7ffdb9-9cec-4f3a-8b55-65d8f6e42286\") " Dec 01 08:39:50 crc kubenswrapper[5004]: I1201 08:39:50.226135 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hk86f\" (UniqueName: \"kubernetes.io/projected/38373747-4dbe-42c7-9060-ada117a776e8-kube-api-access-hk86f\") pod \"38373747-4dbe-42c7-9060-ada117a776e8\" (UID: \"38373747-4dbe-42c7-9060-ada117a776e8\") " Dec 01 08:39:50 crc kubenswrapper[5004]: I1201 08:39:50.226234 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/73a0817f-d450-4a46-863a-a7483f144851-operator-scripts\") pod \"73a0817f-d450-4a46-863a-a7483f144851\" (UID: \"73a0817f-d450-4a46-863a-a7483f144851\") " Dec 01 08:39:50 crc kubenswrapper[5004]: I1201 08:39:50.226265 5004 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-nhvt2\" (UniqueName: \"kubernetes.io/projected/dd7ffdb9-9cec-4f3a-8b55-65d8f6e42286-kube-api-access-nhvt2\") pod \"dd7ffdb9-9cec-4f3a-8b55-65d8f6e42286\" (UID: \"dd7ffdb9-9cec-4f3a-8b55-65d8f6e42286\") " Dec 01 08:39:50 crc kubenswrapper[5004]: I1201 08:39:50.227100 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd7ffdb9-9cec-4f3a-8b55-65d8f6e42286-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "dd7ffdb9-9cec-4f3a-8b55-65d8f6e42286" (UID: "dd7ffdb9-9cec-4f3a-8b55-65d8f6e42286"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:39:50 crc kubenswrapper[5004]: I1201 08:39:50.227322 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38373747-4dbe-42c7-9060-ada117a776e8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "38373747-4dbe-42c7-9060-ada117a776e8" (UID: "38373747-4dbe-42c7-9060-ada117a776e8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:39:50 crc kubenswrapper[5004]: I1201 08:39:50.227800 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73a0817f-d450-4a46-863a-a7483f144851-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "73a0817f-d450-4a46-863a-a7483f144851" (UID: "73a0817f-d450-4a46-863a-a7483f144851"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:39:50 crc kubenswrapper[5004]: I1201 08:39:50.228783 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r2lh4\" (UniqueName: \"kubernetes.io/projected/c59984a3-cd16-4aa7-841e-29de227d4f70-kube-api-access-r2lh4\") on node \"crc\" DevicePath \"\"" Dec 01 08:39:50 crc kubenswrapper[5004]: I1201 08:39:50.228806 5004 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c59984a3-cd16-4aa7-841e-29de227d4f70-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 08:39:50 crc kubenswrapper[5004]: I1201 08:39:50.228820 5004 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/38373747-4dbe-42c7-9060-ada117a776e8-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 08:39:50 crc kubenswrapper[5004]: I1201 08:39:50.228830 5004 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/00524f9c-1b21-433f-8886-9b685c169469-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 08:39:50 crc kubenswrapper[5004]: I1201 08:39:50.228839 5004 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dd7ffdb9-9cec-4f3a-8b55-65d8f6e42286-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 08:39:50 crc kubenswrapper[5004]: I1201 08:39:50.228849 5004 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e2a8cf8a-0ff4-4fba-a0fa-e62d564230c3-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 08:39:50 crc kubenswrapper[5004]: I1201 08:39:50.228860 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sk9qh\" (UniqueName: \"kubernetes.io/projected/e2a8cf8a-0ff4-4fba-a0fa-e62d564230c3-kube-api-access-sk9qh\") on node \"crc\" DevicePath \"\"" Dec 01 08:39:50 crc 
kubenswrapper[5004]: I1201 08:39:50.228869 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-728lj\" (UniqueName: \"kubernetes.io/projected/ac742a71-b3d4-433d-b550-12300a92941d-kube-api-access-728lj\") on node \"crc\" DevicePath \"\"" Dec 01 08:39:50 crc kubenswrapper[5004]: I1201 08:39:50.228877 5004 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/73a0817f-d450-4a46-863a-a7483f144851-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 08:39:50 crc kubenswrapper[5004]: I1201 08:39:50.228886 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6xhl5\" (UniqueName: \"kubernetes.io/projected/9f34518b-865b-4fe5-b2ff-7060cfced9eb-kube-api-access-6xhl5\") on node \"crc\" DevicePath \"\"" Dec 01 08:39:50 crc kubenswrapper[5004]: I1201 08:39:50.228898 5004 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ac742a71-b3d4-433d-b550-12300a92941d-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 08:39:50 crc kubenswrapper[5004]: I1201 08:39:50.228907 5004 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9f34518b-865b-4fe5-b2ff-7060cfced9eb-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 08:39:50 crc kubenswrapper[5004]: I1201 08:39:50.228916 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tnrn\" (UniqueName: \"kubernetes.io/projected/00524f9c-1b21-433f-8886-9b685c169469-kube-api-access-8tnrn\") on node \"crc\" DevicePath \"\"" Dec 01 08:39:50 crc kubenswrapper[5004]: I1201 08:39:50.237657 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73a0817f-d450-4a46-863a-a7483f144851-kube-api-access-7k7px" (OuterVolumeSpecName: "kube-api-access-7k7px") pod "73a0817f-d450-4a46-863a-a7483f144851" (UID: 
"73a0817f-d450-4a46-863a-a7483f144851"). InnerVolumeSpecName "kube-api-access-7k7px". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:39:50 crc kubenswrapper[5004]: I1201 08:39:50.237756 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38373747-4dbe-42c7-9060-ada117a776e8-kube-api-access-hk86f" (OuterVolumeSpecName: "kube-api-access-hk86f") pod "38373747-4dbe-42c7-9060-ada117a776e8" (UID: "38373747-4dbe-42c7-9060-ada117a776e8"). InnerVolumeSpecName "kube-api-access-hk86f". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:39:50 crc kubenswrapper[5004]: I1201 08:39:50.239293 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd7ffdb9-9cec-4f3a-8b55-65d8f6e42286-kube-api-access-nhvt2" (OuterVolumeSpecName: "kube-api-access-nhvt2") pod "dd7ffdb9-9cec-4f3a-8b55-65d8f6e42286" (UID: "dd7ffdb9-9cec-4f3a-8b55-65d8f6e42286"). InnerVolumeSpecName "kube-api-access-nhvt2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:39:50 crc kubenswrapper[5004]: I1201 08:39:50.331076 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7k7px\" (UniqueName: \"kubernetes.io/projected/73a0817f-d450-4a46-863a-a7483f144851-kube-api-access-7k7px\") on node \"crc\" DevicePath \"\"" Dec 01 08:39:50 crc kubenswrapper[5004]: I1201 08:39:50.331111 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hk86f\" (UniqueName: \"kubernetes.io/projected/38373747-4dbe-42c7-9060-ada117a776e8-kube-api-access-hk86f\") on node \"crc\" DevicePath \"\"" Dec 01 08:39:50 crc kubenswrapper[5004]: I1201 08:39:50.331120 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nhvt2\" (UniqueName: \"kubernetes.io/projected/dd7ffdb9-9cec-4f3a-8b55-65d8f6e42286-kube-api-access-nhvt2\") on node \"crc\" DevicePath \"\"" Dec 01 08:39:51 crc kubenswrapper[5004]: I1201 08:39:51.209119 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-wlpnv" event={"ID":"fc5ce303-7a1e-40b2-86f6-861898171b29","Type":"ContainerStarted","Data":"6cdeeb25645ef9ece11e2207f448c252ccaf6e0317573978fdd1df83bec2dfa0"} Dec 01 08:39:51 crc kubenswrapper[5004]: I1201 08:39:51.214412 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"b0dfa0be-7482-49a2-adc0-425cedd5c597","Type":"ContainerStarted","Data":"e0130b8860bfefdb0440c80fc5de2da47de2dba4f4e0942e5cb2662a7cabfd3f"} Dec 01 08:39:51 crc kubenswrapper[5004]: I1201 08:39:51.217429 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d5b6d6b67-25snn" event={"ID":"cce6a698-0c81-45a6-8025-c04547cf2768","Type":"ContainerStarted","Data":"d3bb2dbdf7036f16f9ae1e924e8377910b900e9426f42d3670895a9275c98393"} Dec 01 08:39:51 crc kubenswrapper[5004]: I1201 08:39:51.217663 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/dnsmasq-dns-6d5b6d6b67-25snn" Dec 01 08:39:51 crc kubenswrapper[5004]: I1201 08:39:51.234256 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-wlpnv" podStartSLOduration=8.063987654 podStartE2EDuration="14.234234181s" podCreationTimestamp="2025-12-01 08:39:37 +0000 UTC" firstStartedPulling="2025-12-01 08:39:43.655275925 +0000 UTC m=+1361.220267907" lastFinishedPulling="2025-12-01 08:39:49.825522442 +0000 UTC m=+1367.390514434" observedRunningTime="2025-12-01 08:39:51.22965434 +0000 UTC m=+1368.794646362" watchObservedRunningTime="2025-12-01 08:39:51.234234181 +0000 UTC m=+1368.799226163" Dec 01 08:39:51 crc kubenswrapper[5004]: I1201 08:39:51.275771 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=22.275755176 podStartE2EDuration="22.275755176s" podCreationTimestamp="2025-12-01 08:39:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:39:51.273082721 +0000 UTC m=+1368.838074723" watchObservedRunningTime="2025-12-01 08:39:51.275755176 +0000 UTC m=+1368.840747158" Dec 01 08:39:51 crc kubenswrapper[5004]: I1201 08:39:51.296188 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6d5b6d6b67-25snn" podStartSLOduration=7.296169015 podStartE2EDuration="7.296169015s" podCreationTimestamp="2025-12-01 08:39:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:39:51.290461035 +0000 UTC m=+1368.855453017" watchObservedRunningTime="2025-12-01 08:39:51.296169015 +0000 UTC m=+1368.861161007" Dec 01 08:39:52 crc kubenswrapper[5004]: I1201 08:39:52.227550 5004 generic.go:334] "Generic (PLEG): container finished" podID="96f41ff9-6619-4262-8ecb-0a577f611f68" 
containerID="c661d564a31c9f258ed7a2bd32494a86bfb175494539a7c0df1df08e5114efe1" exitCode=0 Dec 01 08:39:52 crc kubenswrapper[5004]: I1201 08:39:52.227602 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-vp28t" event={"ID":"96f41ff9-6619-4262-8ecb-0a577f611f68","Type":"ContainerDied","Data":"c661d564a31c9f258ed7a2bd32494a86bfb175494539a7c0df1df08e5114efe1"} Dec 01 08:39:53 crc kubenswrapper[5004]: I1201 08:39:53.660437 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-vp28t" Dec 01 08:39:53 crc kubenswrapper[5004]: I1201 08:39:53.700585 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96f41ff9-6619-4262-8ecb-0a577f611f68-combined-ca-bundle\") pod \"96f41ff9-6619-4262-8ecb-0a577f611f68\" (UID: \"96f41ff9-6619-4262-8ecb-0a577f611f68\") " Dec 01 08:39:53 crc kubenswrapper[5004]: I1201 08:39:53.700829 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g8drr\" (UniqueName: \"kubernetes.io/projected/96f41ff9-6619-4262-8ecb-0a577f611f68-kube-api-access-g8drr\") pod \"96f41ff9-6619-4262-8ecb-0a577f611f68\" (UID: \"96f41ff9-6619-4262-8ecb-0a577f611f68\") " Dec 01 08:39:53 crc kubenswrapper[5004]: I1201 08:39:53.700926 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96f41ff9-6619-4262-8ecb-0a577f611f68-config-data\") pod \"96f41ff9-6619-4262-8ecb-0a577f611f68\" (UID: \"96f41ff9-6619-4262-8ecb-0a577f611f68\") " Dec 01 08:39:53 crc kubenswrapper[5004]: I1201 08:39:53.701083 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/96f41ff9-6619-4262-8ecb-0a577f611f68-db-sync-config-data\") pod \"96f41ff9-6619-4262-8ecb-0a577f611f68\" (UID: 
\"96f41ff9-6619-4262-8ecb-0a577f611f68\") " Dec 01 08:39:53 crc kubenswrapper[5004]: I1201 08:39:53.708943 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96f41ff9-6619-4262-8ecb-0a577f611f68-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "96f41ff9-6619-4262-8ecb-0a577f611f68" (UID: "96f41ff9-6619-4262-8ecb-0a577f611f68"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:39:53 crc kubenswrapper[5004]: I1201 08:39:53.711738 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96f41ff9-6619-4262-8ecb-0a577f611f68-kube-api-access-g8drr" (OuterVolumeSpecName: "kube-api-access-g8drr") pod "96f41ff9-6619-4262-8ecb-0a577f611f68" (UID: "96f41ff9-6619-4262-8ecb-0a577f611f68"). InnerVolumeSpecName "kube-api-access-g8drr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:39:53 crc kubenswrapper[5004]: I1201 08:39:53.754737 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96f41ff9-6619-4262-8ecb-0a577f611f68-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "96f41ff9-6619-4262-8ecb-0a577f611f68" (UID: "96f41ff9-6619-4262-8ecb-0a577f611f68"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:39:53 crc kubenswrapper[5004]: I1201 08:39:53.779775 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96f41ff9-6619-4262-8ecb-0a577f611f68-config-data" (OuterVolumeSpecName: "config-data") pod "96f41ff9-6619-4262-8ecb-0a577f611f68" (UID: "96f41ff9-6619-4262-8ecb-0a577f611f68"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:39:53 crc kubenswrapper[5004]: I1201 08:39:53.803481 5004 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96f41ff9-6619-4262-8ecb-0a577f611f68-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 08:39:53 crc kubenswrapper[5004]: I1201 08:39:53.803598 5004 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/96f41ff9-6619-4262-8ecb-0a577f611f68-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 08:39:53 crc kubenswrapper[5004]: I1201 08:39:53.803728 5004 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96f41ff9-6619-4262-8ecb-0a577f611f68-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 08:39:53 crc kubenswrapper[5004]: I1201 08:39:53.803800 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g8drr\" (UniqueName: \"kubernetes.io/projected/96f41ff9-6619-4262-8ecb-0a577f611f68-kube-api-access-g8drr\") on node \"crc\" DevicePath \"\"" Dec 01 08:39:54 crc kubenswrapper[5004]: I1201 08:39:54.243898 5004 generic.go:334] "Generic (PLEG): container finished" podID="fc5ce303-7a1e-40b2-86f6-861898171b29" containerID="6cdeeb25645ef9ece11e2207f448c252ccaf6e0317573978fdd1df83bec2dfa0" exitCode=0 Dec 01 08:39:54 crc kubenswrapper[5004]: I1201 08:39:54.243985 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-wlpnv" event={"ID":"fc5ce303-7a1e-40b2-86f6-861898171b29","Type":"ContainerDied","Data":"6cdeeb25645ef9ece11e2207f448c252ccaf6e0317573978fdd1df83bec2dfa0"} Dec 01 08:39:54 crc kubenswrapper[5004]: I1201 08:39:54.245305 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-vp28t" 
event={"ID":"96f41ff9-6619-4262-8ecb-0a577f611f68","Type":"ContainerDied","Data":"c16361b2a3792b7c2aa5d59a9818c300496c7aa7c644ca47d0869d6854577c83"} Dec 01 08:39:54 crc kubenswrapper[5004]: I1201 08:39:54.245325 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c16361b2a3792b7c2aa5d59a9818c300496c7aa7c644ca47d0869d6854577c83" Dec 01 08:39:54 crc kubenswrapper[5004]: I1201 08:39:54.245954 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-vp28t" Dec 01 08:39:54 crc kubenswrapper[5004]: I1201 08:39:54.662894 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d5b6d6b67-25snn"] Dec 01 08:39:54 crc kubenswrapper[5004]: I1201 08:39:54.663411 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6d5b6d6b67-25snn" podUID="cce6a698-0c81-45a6-8025-c04547cf2768" containerName="dnsmasq-dns" containerID="cri-o://d3bb2dbdf7036f16f9ae1e924e8377910b900e9426f42d3670895a9275c98393" gracePeriod=10 Dec 01 08:39:54 crc kubenswrapper[5004]: I1201 08:39:54.728375 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-895cf5cf-z2whz"] Dec 01 08:39:54 crc kubenswrapper[5004]: E1201 08:39:54.728897 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd7ffdb9-9cec-4f3a-8b55-65d8f6e42286" containerName="mariadb-account-create-update" Dec 01 08:39:54 crc kubenswrapper[5004]: I1201 08:39:54.728922 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd7ffdb9-9cec-4f3a-8b55-65d8f6e42286" containerName="mariadb-account-create-update" Dec 01 08:39:54 crc kubenswrapper[5004]: E1201 08:39:54.728946 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96f41ff9-6619-4262-8ecb-0a577f611f68" containerName="glance-db-sync" Dec 01 08:39:54 crc kubenswrapper[5004]: I1201 08:39:54.728956 5004 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="96f41ff9-6619-4262-8ecb-0a577f611f68" containerName="glance-db-sync" Dec 01 08:39:54 crc kubenswrapper[5004]: E1201 08:39:54.728967 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2a8cf8a-0ff4-4fba-a0fa-e62d564230c3" containerName="mariadb-database-create" Dec 01 08:39:54 crc kubenswrapper[5004]: I1201 08:39:54.728976 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2a8cf8a-0ff4-4fba-a0fa-e62d564230c3" containerName="mariadb-database-create" Dec 01 08:39:54 crc kubenswrapper[5004]: E1201 08:39:54.728986 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f34518b-865b-4fe5-b2ff-7060cfced9eb" containerName="mariadb-account-create-update" Dec 01 08:39:54 crc kubenswrapper[5004]: I1201 08:39:54.728997 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f34518b-865b-4fe5-b2ff-7060cfced9eb" containerName="mariadb-account-create-update" Dec 01 08:39:54 crc kubenswrapper[5004]: E1201 08:39:54.729008 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38373747-4dbe-42c7-9060-ada117a776e8" containerName="mariadb-database-create" Dec 01 08:39:54 crc kubenswrapper[5004]: I1201 08:39:54.729016 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="38373747-4dbe-42c7-9060-ada117a776e8" containerName="mariadb-database-create" Dec 01 08:39:54 crc kubenswrapper[5004]: E1201 08:39:54.729034 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00524f9c-1b21-433f-8886-9b685c169469" containerName="mariadb-account-create-update" Dec 01 08:39:54 crc kubenswrapper[5004]: I1201 08:39:54.729041 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="00524f9c-1b21-433f-8886-9b685c169469" containerName="mariadb-account-create-update" Dec 01 08:39:54 crc kubenswrapper[5004]: E1201 08:39:54.729058 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac742a71-b3d4-433d-b550-12300a92941d" containerName="mariadb-database-create" Dec 01 08:39:54 crc 
kubenswrapper[5004]: I1201 08:39:54.729067 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac742a71-b3d4-433d-b550-12300a92941d" containerName="mariadb-database-create" Dec 01 08:39:54 crc kubenswrapper[5004]: E1201 08:39:54.729084 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73a0817f-d450-4a46-863a-a7483f144851" containerName="mariadb-database-create" Dec 01 08:39:54 crc kubenswrapper[5004]: I1201 08:39:54.729092 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="73a0817f-d450-4a46-863a-a7483f144851" containerName="mariadb-database-create" Dec 01 08:39:54 crc kubenswrapper[5004]: E1201 08:39:54.729109 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c59984a3-cd16-4aa7-841e-29de227d4f70" containerName="mariadb-account-create-update" Dec 01 08:39:54 crc kubenswrapper[5004]: I1201 08:39:54.729118 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="c59984a3-cd16-4aa7-841e-29de227d4f70" containerName="mariadb-account-create-update" Dec 01 08:39:54 crc kubenswrapper[5004]: I1201 08:39:54.729377 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="00524f9c-1b21-433f-8886-9b685c169469" containerName="mariadb-account-create-update" Dec 01 08:39:54 crc kubenswrapper[5004]: I1201 08:39:54.729433 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="38373747-4dbe-42c7-9060-ada117a776e8" containerName="mariadb-database-create" Dec 01 08:39:54 crc kubenswrapper[5004]: I1201 08:39:54.729456 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="73a0817f-d450-4a46-863a-a7483f144851" containerName="mariadb-database-create" Dec 01 08:39:54 crc kubenswrapper[5004]: I1201 08:39:54.729481 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd7ffdb9-9cec-4f3a-8b55-65d8f6e42286" containerName="mariadb-account-create-update" Dec 01 08:39:54 crc kubenswrapper[5004]: I1201 08:39:54.729496 5004 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="c59984a3-cd16-4aa7-841e-29de227d4f70" containerName="mariadb-account-create-update" Dec 01 08:39:54 crc kubenswrapper[5004]: I1201 08:39:54.729510 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac742a71-b3d4-433d-b550-12300a92941d" containerName="mariadb-database-create" Dec 01 08:39:54 crc kubenswrapper[5004]: I1201 08:39:54.729530 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="96f41ff9-6619-4262-8ecb-0a577f611f68" containerName="glance-db-sync" Dec 01 08:39:54 crc kubenswrapper[5004]: I1201 08:39:54.729541 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2a8cf8a-0ff4-4fba-a0fa-e62d564230c3" containerName="mariadb-database-create" Dec 01 08:39:54 crc kubenswrapper[5004]: I1201 08:39:54.729552 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f34518b-865b-4fe5-b2ff-7060cfced9eb" containerName="mariadb-account-create-update" Dec 01 08:39:54 crc kubenswrapper[5004]: I1201 08:39:54.741913 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-895cf5cf-z2whz" Dec 01 08:39:54 crc kubenswrapper[5004]: I1201 08:39:54.780541 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-895cf5cf-z2whz"] Dec 01 08:39:54 crc kubenswrapper[5004]: I1201 08:39:54.942154 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/92bb6b5b-96fd-416b-98a4-199390fd61b1-ovsdbserver-sb\") pod \"dnsmasq-dns-895cf5cf-z2whz\" (UID: \"92bb6b5b-96fd-416b-98a4-199390fd61b1\") " pod="openstack/dnsmasq-dns-895cf5cf-z2whz" Dec 01 08:39:54 crc kubenswrapper[5004]: I1201 08:39:54.942261 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92bb6b5b-96fd-416b-98a4-199390fd61b1-config\") pod \"dnsmasq-dns-895cf5cf-z2whz\" (UID: \"92bb6b5b-96fd-416b-98a4-199390fd61b1\") " pod="openstack/dnsmasq-dns-895cf5cf-z2whz" Dec 01 08:39:54 crc kubenswrapper[5004]: I1201 08:39:54.942322 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/92bb6b5b-96fd-416b-98a4-199390fd61b1-ovsdbserver-nb\") pod \"dnsmasq-dns-895cf5cf-z2whz\" (UID: \"92bb6b5b-96fd-416b-98a4-199390fd61b1\") " pod="openstack/dnsmasq-dns-895cf5cf-z2whz" Dec 01 08:39:54 crc kubenswrapper[5004]: I1201 08:39:54.942357 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/92bb6b5b-96fd-416b-98a4-199390fd61b1-dns-swift-storage-0\") pod \"dnsmasq-dns-895cf5cf-z2whz\" (UID: \"92bb6b5b-96fd-416b-98a4-199390fd61b1\") " pod="openstack/dnsmasq-dns-895cf5cf-z2whz" Dec 01 08:39:54 crc kubenswrapper[5004]: I1201 08:39:54.942415 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-dd5cd\" (UniqueName: \"kubernetes.io/projected/92bb6b5b-96fd-416b-98a4-199390fd61b1-kube-api-access-dd5cd\") pod \"dnsmasq-dns-895cf5cf-z2whz\" (UID: \"92bb6b5b-96fd-416b-98a4-199390fd61b1\") " pod="openstack/dnsmasq-dns-895cf5cf-z2whz" Dec 01 08:39:54 crc kubenswrapper[5004]: I1201 08:39:54.944282 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/92bb6b5b-96fd-416b-98a4-199390fd61b1-dns-svc\") pod \"dnsmasq-dns-895cf5cf-z2whz\" (UID: \"92bb6b5b-96fd-416b-98a4-199390fd61b1\") " pod="openstack/dnsmasq-dns-895cf5cf-z2whz" Dec 01 08:39:55 crc kubenswrapper[5004]: I1201 08:39:55.045621 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/92bb6b5b-96fd-416b-98a4-199390fd61b1-ovsdbserver-sb\") pod \"dnsmasq-dns-895cf5cf-z2whz\" (UID: \"92bb6b5b-96fd-416b-98a4-199390fd61b1\") " pod="openstack/dnsmasq-dns-895cf5cf-z2whz" Dec 01 08:39:55 crc kubenswrapper[5004]: I1201 08:39:55.045708 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92bb6b5b-96fd-416b-98a4-199390fd61b1-config\") pod \"dnsmasq-dns-895cf5cf-z2whz\" (UID: \"92bb6b5b-96fd-416b-98a4-199390fd61b1\") " pod="openstack/dnsmasq-dns-895cf5cf-z2whz" Dec 01 08:39:55 crc kubenswrapper[5004]: I1201 08:39:55.045769 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/92bb6b5b-96fd-416b-98a4-199390fd61b1-ovsdbserver-nb\") pod \"dnsmasq-dns-895cf5cf-z2whz\" (UID: \"92bb6b5b-96fd-416b-98a4-199390fd61b1\") " pod="openstack/dnsmasq-dns-895cf5cf-z2whz" Dec 01 08:39:55 crc kubenswrapper[5004]: I1201 08:39:55.045797 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/92bb6b5b-96fd-416b-98a4-199390fd61b1-dns-swift-storage-0\") pod \"dnsmasq-dns-895cf5cf-z2whz\" (UID: \"92bb6b5b-96fd-416b-98a4-199390fd61b1\") " pod="openstack/dnsmasq-dns-895cf5cf-z2whz" Dec 01 08:39:55 crc kubenswrapper[5004]: I1201 08:39:55.045834 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dd5cd\" (UniqueName: \"kubernetes.io/projected/92bb6b5b-96fd-416b-98a4-199390fd61b1-kube-api-access-dd5cd\") pod \"dnsmasq-dns-895cf5cf-z2whz\" (UID: \"92bb6b5b-96fd-416b-98a4-199390fd61b1\") " pod="openstack/dnsmasq-dns-895cf5cf-z2whz" Dec 01 08:39:55 crc kubenswrapper[5004]: I1201 08:39:55.045889 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/92bb6b5b-96fd-416b-98a4-199390fd61b1-dns-svc\") pod \"dnsmasq-dns-895cf5cf-z2whz\" (UID: \"92bb6b5b-96fd-416b-98a4-199390fd61b1\") " pod="openstack/dnsmasq-dns-895cf5cf-z2whz" Dec 01 08:39:55 crc kubenswrapper[5004]: I1201 08:39:55.046772 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92bb6b5b-96fd-416b-98a4-199390fd61b1-config\") pod \"dnsmasq-dns-895cf5cf-z2whz\" (UID: \"92bb6b5b-96fd-416b-98a4-199390fd61b1\") " pod="openstack/dnsmasq-dns-895cf5cf-z2whz" Dec 01 08:39:55 crc kubenswrapper[5004]: I1201 08:39:55.046842 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/92bb6b5b-96fd-416b-98a4-199390fd61b1-ovsdbserver-sb\") pod \"dnsmasq-dns-895cf5cf-z2whz\" (UID: \"92bb6b5b-96fd-416b-98a4-199390fd61b1\") " pod="openstack/dnsmasq-dns-895cf5cf-z2whz" Dec 01 08:39:55 crc kubenswrapper[5004]: I1201 08:39:55.047156 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/92bb6b5b-96fd-416b-98a4-199390fd61b1-dns-svc\") pod \"dnsmasq-dns-895cf5cf-z2whz\" (UID: 
\"92bb6b5b-96fd-416b-98a4-199390fd61b1\") " pod="openstack/dnsmasq-dns-895cf5cf-z2whz" Dec 01 08:39:55 crc kubenswrapper[5004]: I1201 08:39:55.047166 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/92bb6b5b-96fd-416b-98a4-199390fd61b1-dns-swift-storage-0\") pod \"dnsmasq-dns-895cf5cf-z2whz\" (UID: \"92bb6b5b-96fd-416b-98a4-199390fd61b1\") " pod="openstack/dnsmasq-dns-895cf5cf-z2whz" Dec 01 08:39:55 crc kubenswrapper[5004]: I1201 08:39:55.047170 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/92bb6b5b-96fd-416b-98a4-199390fd61b1-ovsdbserver-nb\") pod \"dnsmasq-dns-895cf5cf-z2whz\" (UID: \"92bb6b5b-96fd-416b-98a4-199390fd61b1\") " pod="openstack/dnsmasq-dns-895cf5cf-z2whz" Dec 01 08:39:55 crc kubenswrapper[5004]: I1201 08:39:55.077719 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dd5cd\" (UniqueName: \"kubernetes.io/projected/92bb6b5b-96fd-416b-98a4-199390fd61b1-kube-api-access-dd5cd\") pod \"dnsmasq-dns-895cf5cf-z2whz\" (UID: \"92bb6b5b-96fd-416b-98a4-199390fd61b1\") " pod="openstack/dnsmasq-dns-895cf5cf-z2whz" Dec 01 08:39:55 crc kubenswrapper[5004]: I1201 08:39:55.155955 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-895cf5cf-z2whz" Dec 01 08:39:55 crc kubenswrapper[5004]: I1201 08:39:55.270088 5004 generic.go:334] "Generic (PLEG): container finished" podID="cce6a698-0c81-45a6-8025-c04547cf2768" containerID="d3bb2dbdf7036f16f9ae1e924e8377910b900e9426f42d3670895a9275c98393" exitCode=0 Dec 01 08:39:55 crc kubenswrapper[5004]: I1201 08:39:55.270296 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d5b6d6b67-25snn" event={"ID":"cce6a698-0c81-45a6-8025-c04547cf2768","Type":"ContainerDied","Data":"d3bb2dbdf7036f16f9ae1e924e8377910b900e9426f42d3670895a9275c98393"} Dec 01 08:39:55 crc kubenswrapper[5004]: I1201 08:39:55.270321 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d5b6d6b67-25snn" event={"ID":"cce6a698-0c81-45a6-8025-c04547cf2768","Type":"ContainerDied","Data":"ad9bcba064c64b134d03a865738a880242a41685450a5396536117217c56be21"} Dec 01 08:39:55 crc kubenswrapper[5004]: I1201 08:39:55.270333 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ad9bcba064c64b134d03a865738a880242a41685450a5396536117217c56be21" Dec 01 08:39:55 crc kubenswrapper[5004]: I1201 08:39:55.339861 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Dec 01 08:39:55 crc kubenswrapper[5004]: I1201 08:39:55.351319 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d5b6d6b67-25snn" Dec 01 08:39:55 crc kubenswrapper[5004]: I1201 08:39:55.455695 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cce6a698-0c81-45a6-8025-c04547cf2768-ovsdbserver-sb\") pod \"cce6a698-0c81-45a6-8025-c04547cf2768\" (UID: \"cce6a698-0c81-45a6-8025-c04547cf2768\") " Dec 01 08:39:55 crc kubenswrapper[5004]: I1201 08:39:55.455765 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cce6a698-0c81-45a6-8025-c04547cf2768-dns-swift-storage-0\") pod \"cce6a698-0c81-45a6-8025-c04547cf2768\" (UID: \"cce6a698-0c81-45a6-8025-c04547cf2768\") " Dec 01 08:39:55 crc kubenswrapper[5004]: I1201 08:39:55.455792 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfkq4\" (UniqueName: \"kubernetes.io/projected/cce6a698-0c81-45a6-8025-c04547cf2768-kube-api-access-cfkq4\") pod \"cce6a698-0c81-45a6-8025-c04547cf2768\" (UID: \"cce6a698-0c81-45a6-8025-c04547cf2768\") " Dec 01 08:39:55 crc kubenswrapper[5004]: I1201 08:39:55.456033 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cce6a698-0c81-45a6-8025-c04547cf2768-config\") pod \"cce6a698-0c81-45a6-8025-c04547cf2768\" (UID: \"cce6a698-0c81-45a6-8025-c04547cf2768\") " Dec 01 08:39:55 crc kubenswrapper[5004]: I1201 08:39:55.456141 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cce6a698-0c81-45a6-8025-c04547cf2768-dns-svc\") pod \"cce6a698-0c81-45a6-8025-c04547cf2768\" (UID: \"cce6a698-0c81-45a6-8025-c04547cf2768\") " Dec 01 08:39:55 crc kubenswrapper[5004]: I1201 08:39:55.456171 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/cce6a698-0c81-45a6-8025-c04547cf2768-ovsdbserver-nb\") pod \"cce6a698-0c81-45a6-8025-c04547cf2768\" (UID: \"cce6a698-0c81-45a6-8025-c04547cf2768\") " Dec 01 08:39:55 crc kubenswrapper[5004]: I1201 08:39:55.469622 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cce6a698-0c81-45a6-8025-c04547cf2768-kube-api-access-cfkq4" (OuterVolumeSpecName: "kube-api-access-cfkq4") pod "cce6a698-0c81-45a6-8025-c04547cf2768" (UID: "cce6a698-0c81-45a6-8025-c04547cf2768"). InnerVolumeSpecName "kube-api-access-cfkq4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:39:55 crc kubenswrapper[5004]: I1201 08:39:55.558872 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfkq4\" (UniqueName: \"kubernetes.io/projected/cce6a698-0c81-45a6-8025-c04547cf2768-kube-api-access-cfkq4\") on node \"crc\" DevicePath \"\"" Dec 01 08:39:55 crc kubenswrapper[5004]: I1201 08:39:55.578074 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cce6a698-0c81-45a6-8025-c04547cf2768-config" (OuterVolumeSpecName: "config") pod "cce6a698-0c81-45a6-8025-c04547cf2768" (UID: "cce6a698-0c81-45a6-8025-c04547cf2768"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:39:55 crc kubenswrapper[5004]: I1201 08:39:55.578776 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cce6a698-0c81-45a6-8025-c04547cf2768-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "cce6a698-0c81-45a6-8025-c04547cf2768" (UID: "cce6a698-0c81-45a6-8025-c04547cf2768"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:39:55 crc kubenswrapper[5004]: I1201 08:39:55.585954 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cce6a698-0c81-45a6-8025-c04547cf2768-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "cce6a698-0c81-45a6-8025-c04547cf2768" (UID: "cce6a698-0c81-45a6-8025-c04547cf2768"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:39:55 crc kubenswrapper[5004]: I1201 08:39:55.603306 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cce6a698-0c81-45a6-8025-c04547cf2768-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "cce6a698-0c81-45a6-8025-c04547cf2768" (UID: "cce6a698-0c81-45a6-8025-c04547cf2768"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:39:55 crc kubenswrapper[5004]: I1201 08:39:55.630146 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cce6a698-0c81-45a6-8025-c04547cf2768-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "cce6a698-0c81-45a6-8025-c04547cf2768" (UID: "cce6a698-0c81-45a6-8025-c04547cf2768"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:39:55 crc kubenswrapper[5004]: I1201 08:39:55.660519 5004 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cce6a698-0c81-45a6-8025-c04547cf2768-config\") on node \"crc\" DevicePath \"\"" Dec 01 08:39:55 crc kubenswrapper[5004]: I1201 08:39:55.660557 5004 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cce6a698-0c81-45a6-8025-c04547cf2768-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 08:39:55 crc kubenswrapper[5004]: I1201 08:39:55.660584 5004 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cce6a698-0c81-45a6-8025-c04547cf2768-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 01 08:39:55 crc kubenswrapper[5004]: I1201 08:39:55.660596 5004 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cce6a698-0c81-45a6-8025-c04547cf2768-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 01 08:39:55 crc kubenswrapper[5004]: I1201 08:39:55.660607 5004 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cce6a698-0c81-45a6-8025-c04547cf2768-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 01 08:39:55 crc kubenswrapper[5004]: W1201 08:39:55.895393 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod92bb6b5b_96fd_416b_98a4_199390fd61b1.slice/crio-ff0c0ea55ffc502e34feb8258eefe0a63c5ef3a1b2c4e74903061acd50712ed1 WatchSource:0}: Error finding container ff0c0ea55ffc502e34feb8258eefe0a63c5ef3a1b2c4e74903061acd50712ed1: Status 404 returned error can't find the container with id ff0c0ea55ffc502e34feb8258eefe0a63c5ef3a1b2c4e74903061acd50712ed1 Dec 01 08:39:55 crc kubenswrapper[5004]: I1201 08:39:55.908583 5004 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-895cf5cf-z2whz"] Dec 01 08:39:56 crc kubenswrapper[5004]: I1201 08:39:56.280365 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-895cf5cf-z2whz" event={"ID":"92bb6b5b-96fd-416b-98a4-199390fd61b1","Type":"ContainerStarted","Data":"ff0c0ea55ffc502e34feb8258eefe0a63c5ef3a1b2c4e74903061acd50712ed1"} Dec 01 08:39:56 crc kubenswrapper[5004]: I1201 08:39:56.280395 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d5b6d6b67-25snn" Dec 01 08:39:56 crc kubenswrapper[5004]: I1201 08:39:56.314025 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d5b6d6b67-25snn"] Dec 01 08:39:56 crc kubenswrapper[5004]: I1201 08:39:56.326834 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6d5b6d6b67-25snn"] Dec 01 08:39:56 crc kubenswrapper[5004]: I1201 08:39:56.796782 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cce6a698-0c81-45a6-8025-c04547cf2768" path="/var/lib/kubelet/pods/cce6a698-0c81-45a6-8025-c04547cf2768/volumes" Dec 01 08:39:57 crc kubenswrapper[5004]: I1201 08:39:57.640841 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-wlpnv" Dec 01 08:39:57 crc kubenswrapper[5004]: I1201 08:39:57.806771 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zszrd\" (UniqueName: \"kubernetes.io/projected/fc5ce303-7a1e-40b2-86f6-861898171b29-kube-api-access-zszrd\") pod \"fc5ce303-7a1e-40b2-86f6-861898171b29\" (UID: \"fc5ce303-7a1e-40b2-86f6-861898171b29\") " Dec 01 08:39:57 crc kubenswrapper[5004]: I1201 08:39:57.806818 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc5ce303-7a1e-40b2-86f6-861898171b29-config-data\") pod \"fc5ce303-7a1e-40b2-86f6-861898171b29\" (UID: \"fc5ce303-7a1e-40b2-86f6-861898171b29\") " Dec 01 08:39:57 crc kubenswrapper[5004]: I1201 08:39:57.806955 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc5ce303-7a1e-40b2-86f6-861898171b29-combined-ca-bundle\") pod \"fc5ce303-7a1e-40b2-86f6-861898171b29\" (UID: \"fc5ce303-7a1e-40b2-86f6-861898171b29\") " Dec 01 08:39:57 crc kubenswrapper[5004]: I1201 08:39:57.819833 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc5ce303-7a1e-40b2-86f6-861898171b29-kube-api-access-zszrd" (OuterVolumeSpecName: "kube-api-access-zszrd") pod "fc5ce303-7a1e-40b2-86f6-861898171b29" (UID: "fc5ce303-7a1e-40b2-86f6-861898171b29"). InnerVolumeSpecName "kube-api-access-zszrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:39:57 crc kubenswrapper[5004]: I1201 08:39:57.842692 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc5ce303-7a1e-40b2-86f6-861898171b29-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fc5ce303-7a1e-40b2-86f6-861898171b29" (UID: "fc5ce303-7a1e-40b2-86f6-861898171b29"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:39:57 crc kubenswrapper[5004]: I1201 08:39:57.862128 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc5ce303-7a1e-40b2-86f6-861898171b29-config-data" (OuterVolumeSpecName: "config-data") pod "fc5ce303-7a1e-40b2-86f6-861898171b29" (UID: "fc5ce303-7a1e-40b2-86f6-861898171b29"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:39:57 crc kubenswrapper[5004]: I1201 08:39:57.908712 5004 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc5ce303-7a1e-40b2-86f6-861898171b29-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 08:39:57 crc kubenswrapper[5004]: I1201 08:39:57.908742 5004 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc5ce303-7a1e-40b2-86f6-861898171b29-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 08:39:57 crc kubenswrapper[5004]: I1201 08:39:57.908754 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zszrd\" (UniqueName: \"kubernetes.io/projected/fc5ce303-7a1e-40b2-86f6-861898171b29-kube-api-access-zszrd\") on node \"crc\" DevicePath \"\"" Dec 01 08:39:58 crc kubenswrapper[5004]: I1201 08:39:58.301884 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-wlpnv" event={"ID":"fc5ce303-7a1e-40b2-86f6-861898171b29","Type":"ContainerDied","Data":"30ea54a94cc388536da4eaf7b87752d2cf91c96c1e9980921a1201648acb4b35"} Dec 01 08:39:58 crc kubenswrapper[5004]: I1201 08:39:58.301936 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="30ea54a94cc388536da4eaf7b87752d2cf91c96c1e9980921a1201648acb4b35" Dec 01 08:39:58 crc kubenswrapper[5004]: I1201 08:39:58.302001 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-wlpnv" Dec 01 08:39:58 crc kubenswrapper[5004]: I1201 08:39:58.968231 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-895cf5cf-z2whz"] Dec 01 08:39:59 crc kubenswrapper[5004]: I1201 08:39:59.007343 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6c9c9f998c-zwfnx"] Dec 01 08:39:59 crc kubenswrapper[5004]: E1201 08:39:59.007796 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cce6a698-0c81-45a6-8025-c04547cf2768" containerName="init" Dec 01 08:39:59 crc kubenswrapper[5004]: I1201 08:39:59.007814 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="cce6a698-0c81-45a6-8025-c04547cf2768" containerName="init" Dec 01 08:39:59 crc kubenswrapper[5004]: E1201 08:39:59.007858 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cce6a698-0c81-45a6-8025-c04547cf2768" containerName="dnsmasq-dns" Dec 01 08:39:59 crc kubenswrapper[5004]: I1201 08:39:59.007864 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="cce6a698-0c81-45a6-8025-c04547cf2768" containerName="dnsmasq-dns" Dec 01 08:39:59 crc kubenswrapper[5004]: E1201 08:39:59.007876 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc5ce303-7a1e-40b2-86f6-861898171b29" containerName="keystone-db-sync" Dec 01 08:39:59 crc kubenswrapper[5004]: I1201 08:39:59.007882 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc5ce303-7a1e-40b2-86f6-861898171b29" containerName="keystone-db-sync" Dec 01 08:39:59 crc kubenswrapper[5004]: I1201 08:39:59.008353 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc5ce303-7a1e-40b2-86f6-861898171b29" containerName="keystone-db-sync" Dec 01 08:39:59 crc kubenswrapper[5004]: I1201 08:39:59.008369 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="cce6a698-0c81-45a6-8025-c04547cf2768" containerName="dnsmasq-dns" Dec 01 08:39:59 crc kubenswrapper[5004]: I1201 
08:39:59.009593 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c9c9f998c-zwfnx" Dec 01 08:39:59 crc kubenswrapper[5004]: I1201 08:39:59.036919 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c9c9f998c-zwfnx"] Dec 01 08:39:59 crc kubenswrapper[5004]: I1201 08:39:59.056469 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-bkslc"] Dec 01 08:39:59 crc kubenswrapper[5004]: I1201 08:39:59.057845 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-bkslc" Dec 01 08:39:59 crc kubenswrapper[5004]: I1201 08:39:59.060323 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 01 08:39:59 crc kubenswrapper[5004]: I1201 08:39:59.060693 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 01 08:39:59 crc kubenswrapper[5004]: I1201 08:39:59.061258 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-nlkxt" Dec 01 08:39:59 crc kubenswrapper[5004]: I1201 08:39:59.061739 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 01 08:39:59 crc kubenswrapper[5004]: I1201 08:39:59.061901 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 01 08:39:59 crc kubenswrapper[5004]: I1201 08:39:59.079086 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-bkslc"] Dec 01 08:39:59 crc kubenswrapper[5004]: I1201 08:39:59.133804 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jzd5c\" (UniqueName: \"kubernetes.io/projected/be28d2bd-7ba1-4322-ac54-24a8e63c807a-kube-api-access-jzd5c\") pod \"dnsmasq-dns-6c9c9f998c-zwfnx\" (UID: \"be28d2bd-7ba1-4322-ac54-24a8e63c807a\") " 
pod="openstack/dnsmasq-dns-6c9c9f998c-zwfnx" Dec 01 08:39:59 crc kubenswrapper[5004]: I1201 08:39:59.133932 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/be28d2bd-7ba1-4322-ac54-24a8e63c807a-dns-swift-storage-0\") pod \"dnsmasq-dns-6c9c9f998c-zwfnx\" (UID: \"be28d2bd-7ba1-4322-ac54-24a8e63c807a\") " pod="openstack/dnsmasq-dns-6c9c9f998c-zwfnx" Dec 01 08:39:59 crc kubenswrapper[5004]: I1201 08:39:59.133974 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/be28d2bd-7ba1-4322-ac54-24a8e63c807a-ovsdbserver-sb\") pod \"dnsmasq-dns-6c9c9f998c-zwfnx\" (UID: \"be28d2bd-7ba1-4322-ac54-24a8e63c807a\") " pod="openstack/dnsmasq-dns-6c9c9f998c-zwfnx" Dec 01 08:39:59 crc kubenswrapper[5004]: I1201 08:39:59.134007 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/be28d2bd-7ba1-4322-ac54-24a8e63c807a-ovsdbserver-nb\") pod \"dnsmasq-dns-6c9c9f998c-zwfnx\" (UID: \"be28d2bd-7ba1-4322-ac54-24a8e63c807a\") " pod="openstack/dnsmasq-dns-6c9c9f998c-zwfnx" Dec 01 08:39:59 crc kubenswrapper[5004]: I1201 08:39:59.134198 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be28d2bd-7ba1-4322-ac54-24a8e63c807a-config\") pod \"dnsmasq-dns-6c9c9f998c-zwfnx\" (UID: \"be28d2bd-7ba1-4322-ac54-24a8e63c807a\") " pod="openstack/dnsmasq-dns-6c9c9f998c-zwfnx" Dec 01 08:39:59 crc kubenswrapper[5004]: I1201 08:39:59.134263 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/be28d2bd-7ba1-4322-ac54-24a8e63c807a-dns-svc\") pod \"dnsmasq-dns-6c9c9f998c-zwfnx\" (UID: 
\"be28d2bd-7ba1-4322-ac54-24a8e63c807a\") " pod="openstack/dnsmasq-dns-6c9c9f998c-zwfnx" Dec 01 08:39:59 crc kubenswrapper[5004]: I1201 08:39:59.232756 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-sync-cwgb2"] Dec 01 08:39:59 crc kubenswrapper[5004]: I1201 08:39:59.235011 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-cwgb2" Dec 01 08:39:59 crc kubenswrapper[5004]: I1201 08:39:59.235681 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47efa747-2a3f-4e7f-b1c2-222dd039c1fe-combined-ca-bundle\") pod \"keystone-bootstrap-bkslc\" (UID: \"47efa747-2a3f-4e7f-b1c2-222dd039c1fe\") " pod="openstack/keystone-bootstrap-bkslc" Dec 01 08:39:59 crc kubenswrapper[5004]: I1201 08:39:59.235729 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jzd5c\" (UniqueName: \"kubernetes.io/projected/be28d2bd-7ba1-4322-ac54-24a8e63c807a-kube-api-access-jzd5c\") pod \"dnsmasq-dns-6c9c9f998c-zwfnx\" (UID: \"be28d2bd-7ba1-4322-ac54-24a8e63c807a\") " pod="openstack/dnsmasq-dns-6c9c9f998c-zwfnx" Dec 01 08:39:59 crc kubenswrapper[5004]: I1201 08:39:59.235757 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47efa747-2a3f-4e7f-b1c2-222dd039c1fe-config-data\") pod \"keystone-bootstrap-bkslc\" (UID: \"47efa747-2a3f-4e7f-b1c2-222dd039c1fe\") " pod="openstack/keystone-bootstrap-bkslc" Dec 01 08:39:59 crc kubenswrapper[5004]: I1201 08:39:59.236209 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/be28d2bd-7ba1-4322-ac54-24a8e63c807a-dns-swift-storage-0\") pod \"dnsmasq-dns-6c9c9f998c-zwfnx\" (UID: \"be28d2bd-7ba1-4322-ac54-24a8e63c807a\") " 
pod="openstack/dnsmasq-dns-6c9c9f998c-zwfnx" Dec 01 08:39:59 crc kubenswrapper[5004]: I1201 08:39:59.236245 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/be28d2bd-7ba1-4322-ac54-24a8e63c807a-ovsdbserver-sb\") pod \"dnsmasq-dns-6c9c9f998c-zwfnx\" (UID: \"be28d2bd-7ba1-4322-ac54-24a8e63c807a\") " pod="openstack/dnsmasq-dns-6c9c9f998c-zwfnx" Dec 01 08:39:59 crc kubenswrapper[5004]: I1201 08:39:59.236267 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/be28d2bd-7ba1-4322-ac54-24a8e63c807a-ovsdbserver-nb\") pod \"dnsmasq-dns-6c9c9f998c-zwfnx\" (UID: \"be28d2bd-7ba1-4322-ac54-24a8e63c807a\") " pod="openstack/dnsmasq-dns-6c9c9f998c-zwfnx" Dec 01 08:39:59 crc kubenswrapper[5004]: I1201 08:39:59.236373 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be28d2bd-7ba1-4322-ac54-24a8e63c807a-config\") pod \"dnsmasq-dns-6c9c9f998c-zwfnx\" (UID: \"be28d2bd-7ba1-4322-ac54-24a8e63c807a\") " pod="openstack/dnsmasq-dns-6c9c9f998c-zwfnx" Dec 01 08:39:59 crc kubenswrapper[5004]: I1201 08:39:59.237372 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/be28d2bd-7ba1-4322-ac54-24a8e63c807a-ovsdbserver-nb\") pod \"dnsmasq-dns-6c9c9f998c-zwfnx\" (UID: \"be28d2bd-7ba1-4322-ac54-24a8e63c807a\") " pod="openstack/dnsmasq-dns-6c9c9f998c-zwfnx" Dec 01 08:39:59 crc kubenswrapper[5004]: I1201 08:39:59.237978 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be28d2bd-7ba1-4322-ac54-24a8e63c807a-config\") pod \"dnsmasq-dns-6c9c9f998c-zwfnx\" (UID: \"be28d2bd-7ba1-4322-ac54-24a8e63c807a\") " pod="openstack/dnsmasq-dns-6c9c9f998c-zwfnx" Dec 01 08:39:59 crc kubenswrapper[5004]: I1201 
08:39:59.238521 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/be28d2bd-7ba1-4322-ac54-24a8e63c807a-dns-swift-storage-0\") pod \"dnsmasq-dns-6c9c9f998c-zwfnx\" (UID: \"be28d2bd-7ba1-4322-ac54-24a8e63c807a\") " pod="openstack/dnsmasq-dns-6c9c9f998c-zwfnx" Dec 01 08:39:59 crc kubenswrapper[5004]: I1201 08:39:59.239105 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/be28d2bd-7ba1-4322-ac54-24a8e63c807a-ovsdbserver-sb\") pod \"dnsmasq-dns-6c9c9f998c-zwfnx\" (UID: \"be28d2bd-7ba1-4322-ac54-24a8e63c807a\") " pod="openstack/dnsmasq-dns-6c9c9f998c-zwfnx" Dec 01 08:39:59 crc kubenswrapper[5004]: I1201 08:39:59.239177 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Dec 01 08:39:59 crc kubenswrapper[5004]: I1201 08:39:59.239361 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/be28d2bd-7ba1-4322-ac54-24a8e63c807a-dns-svc\") pod \"dnsmasq-dns-6c9c9f998c-zwfnx\" (UID: \"be28d2bd-7ba1-4322-ac54-24a8e63c807a\") " pod="openstack/dnsmasq-dns-6c9c9f998c-zwfnx" Dec 01 08:39:59 crc kubenswrapper[5004]: I1201 08:39:59.239456 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/47efa747-2a3f-4e7f-b1c2-222dd039c1fe-fernet-keys\") pod \"keystone-bootstrap-bkslc\" (UID: \"47efa747-2a3f-4e7f-b1c2-222dd039c1fe\") " pod="openstack/keystone-bootstrap-bkslc" Dec 01 08:39:59 crc kubenswrapper[5004]: I1201 08:39:59.239540 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5th24\" (UniqueName: \"kubernetes.io/projected/47efa747-2a3f-4e7f-b1c2-222dd039c1fe-kube-api-access-5th24\") pod \"keystone-bootstrap-bkslc\" (UID: 
\"47efa747-2a3f-4e7f-b1c2-222dd039c1fe\") " pod="openstack/keystone-bootstrap-bkslc" Dec 01 08:39:59 crc kubenswrapper[5004]: I1201 08:39:59.239693 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-d6ghb" Dec 01 08:39:59 crc kubenswrapper[5004]: I1201 08:39:59.239701 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/47efa747-2a3f-4e7f-b1c2-222dd039c1fe-credential-keys\") pod \"keystone-bootstrap-bkslc\" (UID: \"47efa747-2a3f-4e7f-b1c2-222dd039c1fe\") " pod="openstack/keystone-bootstrap-bkslc" Dec 01 08:39:59 crc kubenswrapper[5004]: I1201 08:39:59.239959 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47efa747-2a3f-4e7f-b1c2-222dd039c1fe-scripts\") pod \"keystone-bootstrap-bkslc\" (UID: \"47efa747-2a3f-4e7f-b1c2-222dd039c1fe\") " pod="openstack/keystone-bootstrap-bkslc" Dec 01 08:39:59 crc kubenswrapper[5004]: I1201 08:39:59.240710 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/be28d2bd-7ba1-4322-ac54-24a8e63c807a-dns-svc\") pod \"dnsmasq-dns-6c9c9f998c-zwfnx\" (UID: \"be28d2bd-7ba1-4322-ac54-24a8e63c807a\") " pod="openstack/dnsmasq-dns-6c9c9f998c-zwfnx" Dec 01 08:39:59 crc kubenswrapper[5004]: I1201 08:39:59.259872 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-cwgb2"] Dec 01 08:39:59 crc kubenswrapper[5004]: I1201 08:39:59.281528 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jzd5c\" (UniqueName: \"kubernetes.io/projected/be28d2bd-7ba1-4322-ac54-24a8e63c807a-kube-api-access-jzd5c\") pod \"dnsmasq-dns-6c9c9f998c-zwfnx\" (UID: \"be28d2bd-7ba1-4322-ac54-24a8e63c807a\") " pod="openstack/dnsmasq-dns-6c9c9f998c-zwfnx" Dec 01 08:39:59 crc kubenswrapper[5004]: I1201 
08:39:59.314807 5004 generic.go:334] "Generic (PLEG): container finished" podID="92bb6b5b-96fd-416b-98a4-199390fd61b1" containerID="af4e575a8ca8e74879136f8b21f258959582db3fe1df8c6bf51d969809ba44e4" exitCode=0 Dec 01 08:39:59 crc kubenswrapper[5004]: I1201 08:39:59.314852 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-895cf5cf-z2whz" event={"ID":"92bb6b5b-96fd-416b-98a4-199390fd61b1","Type":"ContainerDied","Data":"af4e575a8ca8e74879136f8b21f258959582db3fe1df8c6bf51d969809ba44e4"} Dec 01 08:39:59 crc kubenswrapper[5004]: I1201 08:39:59.342935 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5th24\" (UniqueName: \"kubernetes.io/projected/47efa747-2a3f-4e7f-b1c2-222dd039c1fe-kube-api-access-5th24\") pod \"keystone-bootstrap-bkslc\" (UID: \"47efa747-2a3f-4e7f-b1c2-222dd039c1fe\") " pod="openstack/keystone-bootstrap-bkslc" Dec 01 08:39:59 crc kubenswrapper[5004]: I1201 08:39:59.343204 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/47efa747-2a3f-4e7f-b1c2-222dd039c1fe-credential-keys\") pod \"keystone-bootstrap-bkslc\" (UID: \"47efa747-2a3f-4e7f-b1c2-222dd039c1fe\") " pod="openstack/keystone-bootstrap-bkslc" Dec 01 08:39:59 crc kubenswrapper[5004]: I1201 08:39:59.343235 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47efa747-2a3f-4e7f-b1c2-222dd039c1fe-scripts\") pod \"keystone-bootstrap-bkslc\" (UID: \"47efa747-2a3f-4e7f-b1c2-222dd039c1fe\") " pod="openstack/keystone-bootstrap-bkslc" Dec 01 08:39:59 crc kubenswrapper[5004]: I1201 08:39:59.343261 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gs5vj\" (UniqueName: \"kubernetes.io/projected/fb372dfc-6007-42ba-bc16-96f7d99d8b98-kube-api-access-gs5vj\") pod \"heat-db-sync-cwgb2\" (UID: 
\"fb372dfc-6007-42ba-bc16-96f7d99d8b98\") " pod="openstack/heat-db-sync-cwgb2" Dec 01 08:39:59 crc kubenswrapper[5004]: I1201 08:39:59.343292 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47efa747-2a3f-4e7f-b1c2-222dd039c1fe-combined-ca-bundle\") pod \"keystone-bootstrap-bkslc\" (UID: \"47efa747-2a3f-4e7f-b1c2-222dd039c1fe\") " pod="openstack/keystone-bootstrap-bkslc" Dec 01 08:39:59 crc kubenswrapper[5004]: I1201 08:39:59.343314 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47efa747-2a3f-4e7f-b1c2-222dd039c1fe-config-data\") pod \"keystone-bootstrap-bkslc\" (UID: \"47efa747-2a3f-4e7f-b1c2-222dd039c1fe\") " pod="openstack/keystone-bootstrap-bkslc" Dec 01 08:39:59 crc kubenswrapper[5004]: I1201 08:39:59.343419 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb372dfc-6007-42ba-bc16-96f7d99d8b98-combined-ca-bundle\") pod \"heat-db-sync-cwgb2\" (UID: \"fb372dfc-6007-42ba-bc16-96f7d99d8b98\") " pod="openstack/heat-db-sync-cwgb2" Dec 01 08:39:59 crc kubenswrapper[5004]: I1201 08:39:59.343448 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb372dfc-6007-42ba-bc16-96f7d99d8b98-config-data\") pod \"heat-db-sync-cwgb2\" (UID: \"fb372dfc-6007-42ba-bc16-96f7d99d8b98\") " pod="openstack/heat-db-sync-cwgb2" Dec 01 08:39:59 crc kubenswrapper[5004]: I1201 08:39:59.343481 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/47efa747-2a3f-4e7f-b1c2-222dd039c1fe-fernet-keys\") pod \"keystone-bootstrap-bkslc\" (UID: \"47efa747-2a3f-4e7f-b1c2-222dd039c1fe\") " pod="openstack/keystone-bootstrap-bkslc" Dec 01 08:39:59 
crc kubenswrapper[5004]: I1201 08:39:59.342996 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c9c9f998c-zwfnx" Dec 01 08:39:59 crc kubenswrapper[5004]: I1201 08:39:59.352296 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/47efa747-2a3f-4e7f-b1c2-222dd039c1fe-fernet-keys\") pod \"keystone-bootstrap-bkslc\" (UID: \"47efa747-2a3f-4e7f-b1c2-222dd039c1fe\") " pod="openstack/keystone-bootstrap-bkslc" Dec 01 08:39:59 crc kubenswrapper[5004]: I1201 08:39:59.360094 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47efa747-2a3f-4e7f-b1c2-222dd039c1fe-scripts\") pod \"keystone-bootstrap-bkslc\" (UID: \"47efa747-2a3f-4e7f-b1c2-222dd039c1fe\") " pod="openstack/keystone-bootstrap-bkslc" Dec 01 08:39:59 crc kubenswrapper[5004]: I1201 08:39:59.377341 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47efa747-2a3f-4e7f-b1c2-222dd039c1fe-config-data\") pod \"keystone-bootstrap-bkslc\" (UID: \"47efa747-2a3f-4e7f-b1c2-222dd039c1fe\") " pod="openstack/keystone-bootstrap-bkslc" Dec 01 08:39:59 crc kubenswrapper[5004]: I1201 08:39:59.382528 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47efa747-2a3f-4e7f-b1c2-222dd039c1fe-combined-ca-bundle\") pod \"keystone-bootstrap-bkslc\" (UID: \"47efa747-2a3f-4e7f-b1c2-222dd039c1fe\") " pod="openstack/keystone-bootstrap-bkslc" Dec 01 08:39:59 crc kubenswrapper[5004]: I1201 08:39:59.393532 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5th24\" (UniqueName: \"kubernetes.io/projected/47efa747-2a3f-4e7f-b1c2-222dd039c1fe-kube-api-access-5th24\") pod \"keystone-bootstrap-bkslc\" (UID: \"47efa747-2a3f-4e7f-b1c2-222dd039c1fe\") " 
pod="openstack/keystone-bootstrap-bkslc" Dec 01 08:39:59 crc kubenswrapper[5004]: I1201 08:39:59.425761 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/47efa747-2a3f-4e7f-b1c2-222dd039c1fe-credential-keys\") pod \"keystone-bootstrap-bkslc\" (UID: \"47efa747-2a3f-4e7f-b1c2-222dd039c1fe\") " pod="openstack/keystone-bootstrap-bkslc" Dec 01 08:39:59 crc kubenswrapper[5004]: I1201 08:39:59.444800 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb372dfc-6007-42ba-bc16-96f7d99d8b98-combined-ca-bundle\") pod \"heat-db-sync-cwgb2\" (UID: \"fb372dfc-6007-42ba-bc16-96f7d99d8b98\") " pod="openstack/heat-db-sync-cwgb2" Dec 01 08:39:59 crc kubenswrapper[5004]: I1201 08:39:59.444856 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb372dfc-6007-42ba-bc16-96f7d99d8b98-config-data\") pod \"heat-db-sync-cwgb2\" (UID: \"fb372dfc-6007-42ba-bc16-96f7d99d8b98\") " pod="openstack/heat-db-sync-cwgb2" Dec 01 08:39:59 crc kubenswrapper[5004]: I1201 08:39:59.444944 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gs5vj\" (UniqueName: \"kubernetes.io/projected/fb372dfc-6007-42ba-bc16-96f7d99d8b98-kube-api-access-gs5vj\") pod \"heat-db-sync-cwgb2\" (UID: \"fb372dfc-6007-42ba-bc16-96f7d99d8b98\") " pod="openstack/heat-db-sync-cwgb2" Dec 01 08:39:59 crc kubenswrapper[5004]: I1201 08:39:59.458893 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-68h6f"] Dec 01 08:39:59 crc kubenswrapper[5004]: I1201 08:39:59.460193 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb372dfc-6007-42ba-bc16-96f7d99d8b98-combined-ca-bundle\") pod \"heat-db-sync-cwgb2\" (UID: 
\"fb372dfc-6007-42ba-bc16-96f7d99d8b98\") " pod="openstack/heat-db-sync-cwgb2" Dec 01 08:39:59 crc kubenswrapper[5004]: I1201 08:39:59.492912 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-68h6f" Dec 01 08:39:59 crc kubenswrapper[5004]: I1201 08:39:59.496369 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb372dfc-6007-42ba-bc16-96f7d99d8b98-config-data\") pod \"heat-db-sync-cwgb2\" (UID: \"fb372dfc-6007-42ba-bc16-96f7d99d8b98\") " pod="openstack/heat-db-sync-cwgb2" Dec 01 08:39:59 crc kubenswrapper[5004]: I1201 08:39:59.507648 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-68h6f"] Dec 01 08:39:59 crc kubenswrapper[5004]: I1201 08:39:59.507999 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-wfrsk" Dec 01 08:39:59 crc kubenswrapper[5004]: I1201 08:39:59.541122 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 01 08:39:59 crc kubenswrapper[5004]: I1201 08:39:59.542766 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-ln7f8"] Dec 01 08:39:59 crc kubenswrapper[5004]: I1201 08:39:59.544021 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-ln7f8" Dec 01 08:39:59 crc kubenswrapper[5004]: I1201 08:39:59.557696 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 01 08:39:59 crc kubenswrapper[5004]: I1201 08:39:59.560136 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-srsdl" Dec 01 08:39:59 crc kubenswrapper[5004]: I1201 08:39:59.560389 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 01 08:39:59 crc kubenswrapper[5004]: I1201 08:39:59.560604 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 01 08:39:59 crc kubenswrapper[5004]: I1201 08:39:59.561191 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gs5vj\" (UniqueName: \"kubernetes.io/projected/fb372dfc-6007-42ba-bc16-96f7d99d8b98-kube-api-access-gs5vj\") pod \"heat-db-sync-cwgb2\" (UID: \"fb372dfc-6007-42ba-bc16-96f7d99d8b98\") " pod="openstack/heat-db-sync-cwgb2" Dec 01 08:39:59 crc kubenswrapper[5004]: I1201 08:39:59.573391 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-vgmk8"] Dec 01 08:39:59 crc kubenswrapper[5004]: I1201 08:39:59.575379 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-ln7f8"] Dec 01 08:39:59 crc kubenswrapper[5004]: I1201 08:39:59.575498 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-vgmk8" Dec 01 08:39:59 crc kubenswrapper[5004]: I1201 08:39:59.580760 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 01 08:39:59 crc kubenswrapper[5004]: I1201 08:39:59.581595 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-lsbpg" Dec 01 08:39:59 crc kubenswrapper[5004]: I1201 08:39:59.674979 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/165d617f-a220-49b1-af2b-65d4c509962c-scripts\") pod \"cinder-db-sync-ln7f8\" (UID: \"165d617f-a220-49b1-af2b-65d4c509962c\") " pod="openstack/cinder-db-sync-ln7f8" Dec 01 08:39:59 crc kubenswrapper[5004]: I1201 08:39:59.679974 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/87079b8d-839c-42d2-95d1-33dee4ca61e1-config\") pod \"neutron-db-sync-68h6f\" (UID: \"87079b8d-839c-42d2-95d1-33dee4ca61e1\") " pod="openstack/neutron-db-sync-68h6f" Dec 01 08:39:59 crc kubenswrapper[5004]: I1201 08:39:59.680019 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2hpl\" (UniqueName: \"kubernetes.io/projected/165d617f-a220-49b1-af2b-65d4c509962c-kube-api-access-r2hpl\") pod \"cinder-db-sync-ln7f8\" (UID: \"165d617f-a220-49b1-af2b-65d4c509962c\") " pod="openstack/cinder-db-sync-ln7f8" Dec 01 08:39:59 crc kubenswrapper[5004]: I1201 08:39:59.680062 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwgfm\" (UniqueName: \"kubernetes.io/projected/87079b8d-839c-42d2-95d1-33dee4ca61e1-kube-api-access-fwgfm\") pod \"neutron-db-sync-68h6f\" (UID: \"87079b8d-839c-42d2-95d1-33dee4ca61e1\") " pod="openstack/neutron-db-sync-68h6f" Dec 01 08:39:59 crc 
kubenswrapper[5004]: I1201 08:39:59.680140 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/165d617f-a220-49b1-af2b-65d4c509962c-db-sync-config-data\") pod \"cinder-db-sync-ln7f8\" (UID: \"165d617f-a220-49b1-af2b-65d4c509962c\") " pod="openstack/cinder-db-sync-ln7f8" Dec 01 08:39:59 crc kubenswrapper[5004]: I1201 08:39:59.680218 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87079b8d-839c-42d2-95d1-33dee4ca61e1-combined-ca-bundle\") pod \"neutron-db-sync-68h6f\" (UID: \"87079b8d-839c-42d2-95d1-33dee4ca61e1\") " pod="openstack/neutron-db-sync-68h6f" Dec 01 08:39:59 crc kubenswrapper[5004]: I1201 08:39:59.680261 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/165d617f-a220-49b1-af2b-65d4c509962c-etc-machine-id\") pod \"cinder-db-sync-ln7f8\" (UID: \"165d617f-a220-49b1-af2b-65d4c509962c\") " pod="openstack/cinder-db-sync-ln7f8" Dec 01 08:39:59 crc kubenswrapper[5004]: I1201 08:39:59.680357 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/165d617f-a220-49b1-af2b-65d4c509962c-config-data\") pod \"cinder-db-sync-ln7f8\" (UID: \"165d617f-a220-49b1-af2b-65d4c509962c\") " pod="openstack/cinder-db-sync-ln7f8" Dec 01 08:39:59 crc kubenswrapper[5004]: I1201 08:39:59.680538 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/165d617f-a220-49b1-af2b-65d4c509962c-combined-ca-bundle\") pod \"cinder-db-sync-ln7f8\" (UID: \"165d617f-a220-49b1-af2b-65d4c509962c\") " pod="openstack/cinder-db-sync-ln7f8" Dec 01 08:39:59 crc kubenswrapper[5004]: I1201 
08:39:59.703733 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-bkslc" Dec 01 08:39:59 crc kubenswrapper[5004]: I1201 08:39:59.715249 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-vgmk8"] Dec 01 08:39:59 crc kubenswrapper[5004]: I1201 08:39:59.817169 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/165d617f-a220-49b1-af2b-65d4c509962c-scripts\") pod \"cinder-db-sync-ln7f8\" (UID: \"165d617f-a220-49b1-af2b-65d4c509962c\") " pod="openstack/cinder-db-sync-ln7f8" Dec 01 08:39:59 crc kubenswrapper[5004]: I1201 08:39:59.817337 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/87079b8d-839c-42d2-95d1-33dee4ca61e1-config\") pod \"neutron-db-sync-68h6f\" (UID: \"87079b8d-839c-42d2-95d1-33dee4ca61e1\") " pod="openstack/neutron-db-sync-68h6f" Dec 01 08:39:59 crc kubenswrapper[5004]: I1201 08:39:59.817354 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r2hpl\" (UniqueName: \"kubernetes.io/projected/165d617f-a220-49b1-af2b-65d4c509962c-kube-api-access-r2hpl\") pod \"cinder-db-sync-ln7f8\" (UID: \"165d617f-a220-49b1-af2b-65d4c509962c\") " pod="openstack/cinder-db-sync-ln7f8" Dec 01 08:39:59 crc kubenswrapper[5004]: I1201 08:39:59.817383 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwgfm\" (UniqueName: \"kubernetes.io/projected/87079b8d-839c-42d2-95d1-33dee4ca61e1-kube-api-access-fwgfm\") pod \"neutron-db-sync-68h6f\" (UID: \"87079b8d-839c-42d2-95d1-33dee4ca61e1\") " pod="openstack/neutron-db-sync-68h6f" Dec 01 08:39:59 crc kubenswrapper[5004]: I1201 08:39:59.817442 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/165d617f-a220-49b1-af2b-65d4c509962c-db-sync-config-data\") pod \"cinder-db-sync-ln7f8\" (UID: \"165d617f-a220-49b1-af2b-65d4c509962c\") " pod="openstack/cinder-db-sync-ln7f8" Dec 01 08:39:59 crc kubenswrapper[5004]: I1201 08:39:59.817492 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87079b8d-839c-42d2-95d1-33dee4ca61e1-combined-ca-bundle\") pod \"neutron-db-sync-68h6f\" (UID: \"87079b8d-839c-42d2-95d1-33dee4ca61e1\") " pod="openstack/neutron-db-sync-68h6f" Dec 01 08:39:59 crc kubenswrapper[5004]: I1201 08:39:59.817521 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/165d617f-a220-49b1-af2b-65d4c509962c-etc-machine-id\") pod \"cinder-db-sync-ln7f8\" (UID: \"165d617f-a220-49b1-af2b-65d4c509962c\") " pod="openstack/cinder-db-sync-ln7f8" Dec 01 08:39:59 crc kubenswrapper[5004]: I1201 08:39:59.817643 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/165d617f-a220-49b1-af2b-65d4c509962c-config-data\") pod \"cinder-db-sync-ln7f8\" (UID: \"165d617f-a220-49b1-af2b-65d4c509962c\") " pod="openstack/cinder-db-sync-ln7f8" Dec 01 08:39:59 crc kubenswrapper[5004]: I1201 08:39:59.817689 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/165d617f-a220-49b1-af2b-65d4c509962c-combined-ca-bundle\") pod \"cinder-db-sync-ln7f8\" (UID: \"165d617f-a220-49b1-af2b-65d4c509962c\") " pod="openstack/cinder-db-sync-ln7f8" Dec 01 08:39:59 crc kubenswrapper[5004]: I1201 08:39:59.818509 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/165d617f-a220-49b1-af2b-65d4c509962c-etc-machine-id\") pod \"cinder-db-sync-ln7f8\" (UID: 
\"165d617f-a220-49b1-af2b-65d4c509962c\") " pod="openstack/cinder-db-sync-ln7f8" Dec 01 08:39:59 crc kubenswrapper[5004]: I1201 08:39:59.831224 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/165d617f-a220-49b1-af2b-65d4c509962c-combined-ca-bundle\") pod \"cinder-db-sync-ln7f8\" (UID: \"165d617f-a220-49b1-af2b-65d4c509962c\") " pod="openstack/cinder-db-sync-ln7f8" Dec 01 08:39:59 crc kubenswrapper[5004]: I1201 08:39:59.831518 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/165d617f-a220-49b1-af2b-65d4c509962c-config-data\") pod \"cinder-db-sync-ln7f8\" (UID: \"165d617f-a220-49b1-af2b-65d4c509962c\") " pod="openstack/cinder-db-sync-ln7f8" Dec 01 08:39:59 crc kubenswrapper[5004]: I1201 08:39:59.842991 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/87079b8d-839c-42d2-95d1-33dee4ca61e1-config\") pod \"neutron-db-sync-68h6f\" (UID: \"87079b8d-839c-42d2-95d1-33dee4ca61e1\") " pod="openstack/neutron-db-sync-68h6f" Dec 01 08:39:59 crc kubenswrapper[5004]: I1201 08:39:59.844081 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/165d617f-a220-49b1-af2b-65d4c509962c-db-sync-config-data\") pod \"cinder-db-sync-ln7f8\" (UID: \"165d617f-a220-49b1-af2b-65d4c509962c\") " pod="openstack/cinder-db-sync-ln7f8" Dec 01 08:39:59 crc kubenswrapper[5004]: I1201 08:39:59.852645 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87079b8d-839c-42d2-95d1-33dee4ca61e1-combined-ca-bundle\") pod \"neutron-db-sync-68h6f\" (UID: \"87079b8d-839c-42d2-95d1-33dee4ca61e1\") " pod="openstack/neutron-db-sync-68h6f" Dec 01 08:39:59 crc kubenswrapper[5004]: I1201 08:39:59.865408 5004 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openstack/heat-db-sync-cwgb2" Dec 01 08:39:59 crc kubenswrapper[5004]: I1201 08:39:59.865772 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2hpl\" (UniqueName: \"kubernetes.io/projected/165d617f-a220-49b1-af2b-65d4c509962c-kube-api-access-r2hpl\") pod \"cinder-db-sync-ln7f8\" (UID: \"165d617f-a220-49b1-af2b-65d4c509962c\") " pod="openstack/cinder-db-sync-ln7f8" Dec 01 08:39:59 crc kubenswrapper[5004]: I1201 08:39:59.875383 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/165d617f-a220-49b1-af2b-65d4c509962c-scripts\") pod \"cinder-db-sync-ln7f8\" (UID: \"165d617f-a220-49b1-af2b-65d4c509962c\") " pod="openstack/cinder-db-sync-ln7f8" Dec 01 08:39:59 crc kubenswrapper[5004]: I1201 08:39:59.875879 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwgfm\" (UniqueName: \"kubernetes.io/projected/87079b8d-839c-42d2-95d1-33dee4ca61e1-kube-api-access-fwgfm\") pod \"neutron-db-sync-68h6f\" (UID: \"87079b8d-839c-42d2-95d1-33dee4ca61e1\") " pod="openstack/neutron-db-sync-68h6f" Dec 01 08:39:59 crc kubenswrapper[5004]: I1201 08:39:59.895411 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c9c9f998c-zwfnx"] Dec 01 08:39:59 crc kubenswrapper[5004]: I1201 08:39:59.914552 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-7jdwx"] Dec 01 08:39:59 crc kubenswrapper[5004]: I1201 08:39:59.916965 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-7jdwx" Dec 01 08:39:59 crc kubenswrapper[5004]: I1201 08:39:59.919432 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0eccea77-d6ee-4592-ad47-1f29ca2a943b-combined-ca-bundle\") pod \"barbican-db-sync-vgmk8\" (UID: \"0eccea77-d6ee-4592-ad47-1f29ca2a943b\") " pod="openstack/barbican-db-sync-vgmk8" Dec 01 08:39:59 crc kubenswrapper[5004]: I1201 08:39:59.919494 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lv29d\" (UniqueName: \"kubernetes.io/projected/0eccea77-d6ee-4592-ad47-1f29ca2a943b-kube-api-access-lv29d\") pod \"barbican-db-sync-vgmk8\" (UID: \"0eccea77-d6ee-4592-ad47-1f29ca2a943b\") " pod="openstack/barbican-db-sync-vgmk8" Dec 01 08:39:59 crc kubenswrapper[5004]: I1201 08:39:59.919581 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0eccea77-d6ee-4592-ad47-1f29ca2a943b-db-sync-config-data\") pod \"barbican-db-sync-vgmk8\" (UID: \"0eccea77-d6ee-4592-ad47-1f29ca2a943b\") " pod="openstack/barbican-db-sync-vgmk8" Dec 01 08:39:59 crc kubenswrapper[5004]: I1201 08:39:59.922117 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-wj4p4" Dec 01 08:39:59 crc kubenswrapper[5004]: I1201 08:39:59.922358 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 01 08:39:59 crc kubenswrapper[5004]: I1201 08:39:59.922509 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 01 08:39:59 crc kubenswrapper[5004]: E1201 08:39:59.935293 5004 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Dec 01 08:39:59 crc kubenswrapper[5004]: rpc error: code = Unknown 
desc = container create failed: mount `/var/lib/kubelet/pods/92bb6b5b-96fd-416b-98a4-199390fd61b1/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Dec 01 08:39:59 crc kubenswrapper[5004]: > podSandboxID="ff0c0ea55ffc502e34feb8258eefe0a63c5ef3a1b2c4e74903061acd50712ed1" Dec 01 08:39:59 crc kubenswrapper[5004]: E1201 08:39:59.935438 5004 kuberuntime_manager.go:1274] "Unhandled Error" err=< Dec 01 08:39:59 crc kubenswrapper[5004]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68fhb4hc8h699h59fh7bh548hbh65fh564hfbh678h5f6h57dh58bhf9hb4h59ch64fh558h54fh659h5fbh58h599h558h5b5h5cchf6h598h59dh97q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-swift-storage-0,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-swift-storage-0,SubPath:dns-swift-storage-0,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-nb,SubPath:ovs
dbserver-nb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-sb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-sb,SubPath:ovsdbserver-sb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dd5cd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-895cf5cf-z2whz_openstack(92bb6b5b-96fd-416b-98a4-199390fd61b1): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/92bb6b5b-96fd-416b-98a4-199390fd61b1/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Dec 01 08:39:59 crc kubenswrapper[5004]: > logger="UnhandledError" Dec 01 08:39:59 crc kubenswrapper[5004]: 
E1201 08:39:59.938753 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/92bb6b5b-96fd-416b-98a4-199390fd61b1/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-895cf5cf-z2whz" podUID="92bb6b5b-96fd-416b-98a4-199390fd61b1" Dec 01 08:39:59 crc kubenswrapper[5004]: I1201 08:39:59.948704 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-7jdwx"] Dec 01 08:39:59 crc kubenswrapper[5004]: I1201 08:39:59.958754 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-68h6f" Dec 01 08:39:59 crc kubenswrapper[5004]: I1201 08:39:59.988981 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-9x9h5"] Dec 01 08:39:59 crc kubenswrapper[5004]: I1201 08:39:59.991013 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57c957c4ff-9x9h5" Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.001055 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-9x9h5"] Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.015885 5004 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6d5b6d6b67-25snn" podUID="cce6a698-0c81-45a6-8025-c04547cf2768" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.173:5353: i/o timeout" Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.026838 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1b2170ab-37fa-4381-9001-5487eb2a302c-dns-svc\") pod \"dnsmasq-dns-57c957c4ff-9x9h5\" (UID: \"1b2170ab-37fa-4381-9001-5487eb2a302c\") " pod="openstack/dnsmasq-dns-57c957c4ff-9x9h5" Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.026890 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5303a09-48ed-4287-83ba-0fb70fe199d0-scripts\") pod \"placement-db-sync-7jdwx\" (UID: \"e5303a09-48ed-4287-83ba-0fb70fe199d0\") " pod="openstack/placement-db-sync-7jdwx" Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.026915 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1b2170ab-37fa-4381-9001-5487eb2a302c-dns-swift-storage-0\") pod \"dnsmasq-dns-57c957c4ff-9x9h5\" (UID: \"1b2170ab-37fa-4381-9001-5487eb2a302c\") " pod="openstack/dnsmasq-dns-57c957c4ff-9x9h5" Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.026935 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/1b2170ab-37fa-4381-9001-5487eb2a302c-ovsdbserver-sb\") pod \"dnsmasq-dns-57c957c4ff-9x9h5\" (UID: \"1b2170ab-37fa-4381-9001-5487eb2a302c\") " pod="openstack/dnsmasq-dns-57c957c4ff-9x9h5" Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.026965 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1b2170ab-37fa-4381-9001-5487eb2a302c-ovsdbserver-nb\") pod \"dnsmasq-dns-57c957c4ff-9x9h5\" (UID: \"1b2170ab-37fa-4381-9001-5487eb2a302c\") " pod="openstack/dnsmasq-dns-57c957c4ff-9x9h5" Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.027021 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5303a09-48ed-4287-83ba-0fb70fe199d0-config-data\") pod \"placement-db-sync-7jdwx\" (UID: \"e5303a09-48ed-4287-83ba-0fb70fe199d0\") " pod="openstack/placement-db-sync-7jdwx" Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.027048 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lp58h\" (UniqueName: \"kubernetes.io/projected/1b2170ab-37fa-4381-9001-5487eb2a302c-kube-api-access-lp58h\") pod \"dnsmasq-dns-57c957c4ff-9x9h5\" (UID: \"1b2170ab-37fa-4381-9001-5487eb2a302c\") " pod="openstack/dnsmasq-dns-57c957c4ff-9x9h5" Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.027072 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8lpgq\" (UniqueName: \"kubernetes.io/projected/e5303a09-48ed-4287-83ba-0fb70fe199d0-kube-api-access-8lpgq\") pod \"placement-db-sync-7jdwx\" (UID: \"e5303a09-48ed-4287-83ba-0fb70fe199d0\") " pod="openstack/placement-db-sync-7jdwx" Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.027276 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0eccea77-d6ee-4592-ad47-1f29ca2a943b-combined-ca-bundle\") pod \"barbican-db-sync-vgmk8\" (UID: \"0eccea77-d6ee-4592-ad47-1f29ca2a943b\") " pod="openstack/barbican-db-sync-vgmk8" Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.027362 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5303a09-48ed-4287-83ba-0fb70fe199d0-combined-ca-bundle\") pod \"placement-db-sync-7jdwx\" (UID: \"e5303a09-48ed-4287-83ba-0fb70fe199d0\") " pod="openstack/placement-db-sync-7jdwx" Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.027622 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lv29d\" (UniqueName: \"kubernetes.io/projected/0eccea77-d6ee-4592-ad47-1f29ca2a943b-kube-api-access-lv29d\") pod \"barbican-db-sync-vgmk8\" (UID: \"0eccea77-d6ee-4592-ad47-1f29ca2a943b\") " pod="openstack/barbican-db-sync-vgmk8" Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.027748 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5303a09-48ed-4287-83ba-0fb70fe199d0-logs\") pod \"placement-db-sync-7jdwx\" (UID: \"e5303a09-48ed-4287-83ba-0fb70fe199d0\") " pod="openstack/placement-db-sync-7jdwx" Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.027892 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0eccea77-d6ee-4592-ad47-1f29ca2a943b-db-sync-config-data\") pod \"barbican-db-sync-vgmk8\" (UID: \"0eccea77-d6ee-4592-ad47-1f29ca2a943b\") " pod="openstack/barbican-db-sync-vgmk8" Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.027953 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/1b2170ab-37fa-4381-9001-5487eb2a302c-config\") pod \"dnsmasq-dns-57c957c4ff-9x9h5\" (UID: \"1b2170ab-37fa-4381-9001-5487eb2a302c\") " pod="openstack/dnsmasq-dns-57c957c4ff-9x9h5" Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.037115 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0eccea77-d6ee-4592-ad47-1f29ca2a943b-db-sync-config-data\") pod \"barbican-db-sync-vgmk8\" (UID: \"0eccea77-d6ee-4592-ad47-1f29ca2a943b\") " pod="openstack/barbican-db-sync-vgmk8" Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.038113 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0eccea77-d6ee-4592-ad47-1f29ca2a943b-combined-ca-bundle\") pod \"barbican-db-sync-vgmk8\" (UID: \"0eccea77-d6ee-4592-ad47-1f29ca2a943b\") " pod="openstack/barbican-db-sync-vgmk8" Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.040810 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.055776 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lv29d\" (UniqueName: \"kubernetes.io/projected/0eccea77-d6ee-4592-ad47-1f29ca2a943b-kube-api-access-lv29d\") pod \"barbican-db-sync-vgmk8\" (UID: \"0eccea77-d6ee-4592-ad47-1f29ca2a943b\") " pod="openstack/barbican-db-sync-vgmk8" Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.056188 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.056333 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.059989 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.060164 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.104659 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-ln7f8" Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.129594 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1fd8d826-4dab-4d07-bc04-be5dfebdaf2b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1fd8d826-4dab-4d07-bc04-be5dfebdaf2b\") " pod="openstack/ceilometer-0" Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.129656 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1fd8d826-4dab-4d07-bc04-be5dfebdaf2b-scripts\") pod \"ceilometer-0\" (UID: \"1fd8d826-4dab-4d07-bc04-be5dfebdaf2b\") " pod="openstack/ceilometer-0" Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.129678 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fd8d826-4dab-4d07-bc04-be5dfebdaf2b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1fd8d826-4dab-4d07-bc04-be5dfebdaf2b\") " pod="openstack/ceilometer-0" Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.129710 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5303a09-48ed-4287-83ba-0fb70fe199d0-logs\") pod \"placement-db-sync-7jdwx\" (UID: \"e5303a09-48ed-4287-83ba-0fb70fe199d0\") " 
pod="openstack/placement-db-sync-7jdwx" Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.129769 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1fd8d826-4dab-4d07-bc04-be5dfebdaf2b-run-httpd\") pod \"ceilometer-0\" (UID: \"1fd8d826-4dab-4d07-bc04-be5dfebdaf2b\") " pod="openstack/ceilometer-0" Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.130710 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5303a09-48ed-4287-83ba-0fb70fe199d0-logs\") pod \"placement-db-sync-7jdwx\" (UID: \"e5303a09-48ed-4287-83ba-0fb70fe199d0\") " pod="openstack/placement-db-sync-7jdwx" Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.130828 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mfkl\" (UniqueName: \"kubernetes.io/projected/1fd8d826-4dab-4d07-bc04-be5dfebdaf2b-kube-api-access-2mfkl\") pod \"ceilometer-0\" (UID: \"1fd8d826-4dab-4d07-bc04-be5dfebdaf2b\") " pod="openstack/ceilometer-0" Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.130929 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b2170ab-37fa-4381-9001-5487eb2a302c-config\") pod \"dnsmasq-dns-57c957c4ff-9x9h5\" (UID: \"1b2170ab-37fa-4381-9001-5487eb2a302c\") " pod="openstack/dnsmasq-dns-57c957c4ff-9x9h5" Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.130971 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1fd8d826-4dab-4d07-bc04-be5dfebdaf2b-log-httpd\") pod \"ceilometer-0\" (UID: \"1fd8d826-4dab-4d07-bc04-be5dfebdaf2b\") " pod="openstack/ceilometer-0" Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.131033 5004 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1b2170ab-37fa-4381-9001-5487eb2a302c-dns-svc\") pod \"dnsmasq-dns-57c957c4ff-9x9h5\" (UID: \"1b2170ab-37fa-4381-9001-5487eb2a302c\") " pod="openstack/dnsmasq-dns-57c957c4ff-9x9h5" Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.131065 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fd8d826-4dab-4d07-bc04-be5dfebdaf2b-config-data\") pod \"ceilometer-0\" (UID: \"1fd8d826-4dab-4d07-bc04-be5dfebdaf2b\") " pod="openstack/ceilometer-0" Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.131129 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5303a09-48ed-4287-83ba-0fb70fe199d0-scripts\") pod \"placement-db-sync-7jdwx\" (UID: \"e5303a09-48ed-4287-83ba-0fb70fe199d0\") " pod="openstack/placement-db-sync-7jdwx" Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.131169 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1b2170ab-37fa-4381-9001-5487eb2a302c-dns-swift-storage-0\") pod \"dnsmasq-dns-57c957c4ff-9x9h5\" (UID: \"1b2170ab-37fa-4381-9001-5487eb2a302c\") " pod="openstack/dnsmasq-dns-57c957c4ff-9x9h5" Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.131193 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1b2170ab-37fa-4381-9001-5487eb2a302c-ovsdbserver-sb\") pod \"dnsmasq-dns-57c957c4ff-9x9h5\" (UID: \"1b2170ab-37fa-4381-9001-5487eb2a302c\") " pod="openstack/dnsmasq-dns-57c957c4ff-9x9h5" Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.131244 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/1b2170ab-37fa-4381-9001-5487eb2a302c-ovsdbserver-nb\") pod \"dnsmasq-dns-57c957c4ff-9x9h5\" (UID: \"1b2170ab-37fa-4381-9001-5487eb2a302c\") " pod="openstack/dnsmasq-dns-57c957c4ff-9x9h5" Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.131289 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5303a09-48ed-4287-83ba-0fb70fe199d0-config-data\") pod \"placement-db-sync-7jdwx\" (UID: \"e5303a09-48ed-4287-83ba-0fb70fe199d0\") " pod="openstack/placement-db-sync-7jdwx" Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.131324 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lp58h\" (UniqueName: \"kubernetes.io/projected/1b2170ab-37fa-4381-9001-5487eb2a302c-kube-api-access-lp58h\") pod \"dnsmasq-dns-57c957c4ff-9x9h5\" (UID: \"1b2170ab-37fa-4381-9001-5487eb2a302c\") " pod="openstack/dnsmasq-dns-57c957c4ff-9x9h5" Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.131378 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8lpgq\" (UniqueName: \"kubernetes.io/projected/e5303a09-48ed-4287-83ba-0fb70fe199d0-kube-api-access-8lpgq\") pod \"placement-db-sync-7jdwx\" (UID: \"e5303a09-48ed-4287-83ba-0fb70fe199d0\") " pod="openstack/placement-db-sync-7jdwx" Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.131493 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5303a09-48ed-4287-83ba-0fb70fe199d0-combined-ca-bundle\") pod \"placement-db-sync-7jdwx\" (UID: \"e5303a09-48ed-4287-83ba-0fb70fe199d0\") " pod="openstack/placement-db-sync-7jdwx" Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.133277 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/1b2170ab-37fa-4381-9001-5487eb2a302c-ovsdbserver-nb\") pod \"dnsmasq-dns-57c957c4ff-9x9h5\" (UID: \"1b2170ab-37fa-4381-9001-5487eb2a302c\") " pod="openstack/dnsmasq-dns-57c957c4ff-9x9h5" Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.133831 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1b2170ab-37fa-4381-9001-5487eb2a302c-dns-swift-storage-0\") pod \"dnsmasq-dns-57c957c4ff-9x9h5\" (UID: \"1b2170ab-37fa-4381-9001-5487eb2a302c\") " pod="openstack/dnsmasq-dns-57c957c4ff-9x9h5" Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.134387 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1b2170ab-37fa-4381-9001-5487eb2a302c-ovsdbserver-sb\") pod \"dnsmasq-dns-57c957c4ff-9x9h5\" (UID: \"1b2170ab-37fa-4381-9001-5487eb2a302c\") " pod="openstack/dnsmasq-dns-57c957c4ff-9x9h5" Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.135125 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b2170ab-37fa-4381-9001-5487eb2a302c-config\") pod \"dnsmasq-dns-57c957c4ff-9x9h5\" (UID: \"1b2170ab-37fa-4381-9001-5487eb2a302c\") " pod="openstack/dnsmasq-dns-57c957c4ff-9x9h5" Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.135678 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5303a09-48ed-4287-83ba-0fb70fe199d0-scripts\") pod \"placement-db-sync-7jdwx\" (UID: \"e5303a09-48ed-4287-83ba-0fb70fe199d0\") " pod="openstack/placement-db-sync-7jdwx" Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.136431 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1b2170ab-37fa-4381-9001-5487eb2a302c-dns-svc\") pod \"dnsmasq-dns-57c957c4ff-9x9h5\" (UID: 
\"1b2170ab-37fa-4381-9001-5487eb2a302c\") " pod="openstack/dnsmasq-dns-57c957c4ff-9x9h5" Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.139307 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5303a09-48ed-4287-83ba-0fb70fe199d0-config-data\") pod \"placement-db-sync-7jdwx\" (UID: \"e5303a09-48ed-4287-83ba-0fb70fe199d0\") " pod="openstack/placement-db-sync-7jdwx" Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.144173 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-vgmk8" Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.145401 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5303a09-48ed-4287-83ba-0fb70fe199d0-combined-ca-bundle\") pod \"placement-db-sync-7jdwx\" (UID: \"e5303a09-48ed-4287-83ba-0fb70fe199d0\") " pod="openstack/placement-db-sync-7jdwx" Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.154687 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lp58h\" (UniqueName: \"kubernetes.io/projected/1b2170ab-37fa-4381-9001-5487eb2a302c-kube-api-access-lp58h\") pod \"dnsmasq-dns-57c957c4ff-9x9h5\" (UID: \"1b2170ab-37fa-4381-9001-5487eb2a302c\") " pod="openstack/dnsmasq-dns-57c957c4ff-9x9h5" Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.163429 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8lpgq\" (UniqueName: \"kubernetes.io/projected/e5303a09-48ed-4287-83ba-0fb70fe199d0-kube-api-access-8lpgq\") pod \"placement-db-sync-7jdwx\" (UID: \"e5303a09-48ed-4287-83ba-0fb70fe199d0\") " pod="openstack/placement-db-sync-7jdwx" Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.197352 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c9c9f998c-zwfnx"] Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 
08:40:00.229117 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.234940 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1fd8d826-4dab-4d07-bc04-be5dfebdaf2b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1fd8d826-4dab-4d07-bc04-be5dfebdaf2b\") " pod="openstack/ceilometer-0" Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.234996 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1fd8d826-4dab-4d07-bc04-be5dfebdaf2b-scripts\") pod \"ceilometer-0\" (UID: \"1fd8d826-4dab-4d07-bc04-be5dfebdaf2b\") " pod="openstack/ceilometer-0" Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.235012 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fd8d826-4dab-4d07-bc04-be5dfebdaf2b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1fd8d826-4dab-4d07-bc04-be5dfebdaf2b\") " pod="openstack/ceilometer-0" Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.235146 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1fd8d826-4dab-4d07-bc04-be5dfebdaf2b-run-httpd\") pod \"ceilometer-0\" (UID: \"1fd8d826-4dab-4d07-bc04-be5dfebdaf2b\") " pod="openstack/ceilometer-0" Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.235179 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2mfkl\" (UniqueName: \"kubernetes.io/projected/1fd8d826-4dab-4d07-bc04-be5dfebdaf2b-kube-api-access-2mfkl\") pod \"ceilometer-0\" (UID: \"1fd8d826-4dab-4d07-bc04-be5dfebdaf2b\") " pod="openstack/ceilometer-0" Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.235213 5004 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1fd8d826-4dab-4d07-bc04-be5dfebdaf2b-log-httpd\") pod \"ceilometer-0\" (UID: \"1fd8d826-4dab-4d07-bc04-be5dfebdaf2b\") " pod="openstack/ceilometer-0" Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.235239 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fd8d826-4dab-4d07-bc04-be5dfebdaf2b-config-data\") pod \"ceilometer-0\" (UID: \"1fd8d826-4dab-4d07-bc04-be5dfebdaf2b\") " pod="openstack/ceilometer-0" Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.240147 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1fd8d826-4dab-4d07-bc04-be5dfebdaf2b-log-httpd\") pod \"ceilometer-0\" (UID: \"1fd8d826-4dab-4d07-bc04-be5dfebdaf2b\") " pod="openstack/ceilometer-0" Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.240186 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.240236 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1fd8d826-4dab-4d07-bc04-be5dfebdaf2b-run-httpd\") pod \"ceilometer-0\" (UID: \"1fd8d826-4dab-4d07-bc04-be5dfebdaf2b\") " pod="openstack/ceilometer-0" Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.245039 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1fd8d826-4dab-4d07-bc04-be5dfebdaf2b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1fd8d826-4dab-4d07-bc04-be5dfebdaf2b\") " pod="openstack/ceilometer-0" Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.247754 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fd8d826-4dab-4d07-bc04-be5dfebdaf2b-config-data\") pod \"ceilometer-0\" (UID: \"1fd8d826-4dab-4d07-bc04-be5dfebdaf2b\") " pod="openstack/ceilometer-0" Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.248723 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-7jdwx" Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.249338 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1fd8d826-4dab-4d07-bc04-be5dfebdaf2b-scripts\") pod \"ceilometer-0\" (UID: \"1fd8d826-4dab-4d07-bc04-be5dfebdaf2b\") " pod="openstack/ceilometer-0" Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.261208 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.261343 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.261450 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-fldwv" Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.261530 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.267803 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fd8d826-4dab-4d07-bc04-be5dfebdaf2b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1fd8d826-4dab-4d07-bc04-be5dfebdaf2b\") " pod="openstack/ceilometer-0" Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.274487 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mfkl\" (UniqueName: \"kubernetes.io/projected/1fd8d826-4dab-4d07-bc04-be5dfebdaf2b-kube-api-access-2mfkl\") pod \"ceilometer-0\" (UID: \"1fd8d826-4dab-4d07-bc04-be5dfebdaf2b\") " pod="openstack/ceilometer-0" Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.289279 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 01 08:40:00 crc 
kubenswrapper[5004]: I1201 08:40:00.339385 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.341761 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57c957c4ff-9x9h5" Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.349657 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c9c9f998c-zwfnx" event={"ID":"be28d2bd-7ba1-4322-ac54-24a8e63c807a","Type":"ContainerStarted","Data":"e57f6b9c355c1519f0c1fd9eefbe0d99bce81fcd89c259d14e1a5744879b0b78"} Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.350516 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.373985 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.375659 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.381693 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.381993 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.397003 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.400097 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.436958 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-bkslc"] Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.442393 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3c36bd79-e85d-4f3a-acf8-28439a520611-scripts\") pod \"glance-default-external-api-0\" (UID: \"3c36bd79-e85d-4f3a-acf8-28439a520611\") " pod="openstack/glance-default-external-api-0" Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.442460 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c36bd79-e85d-4f3a-acf8-28439a520611-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"3c36bd79-e85d-4f3a-acf8-28439a520611\") " pod="openstack/glance-default-external-api-0" Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.442542 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c36bd79-e85d-4f3a-acf8-28439a520611-config-data\") pod \"glance-default-external-api-0\" (UID: \"3c36bd79-e85d-4f3a-acf8-28439a520611\") " pod="openstack/glance-default-external-api-0" Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.442607 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c36bd79-e85d-4f3a-acf8-28439a520611-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"3c36bd79-e85d-4f3a-acf8-28439a520611\") " pod="openstack/glance-default-external-api-0" 
Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.442628 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3c36bd79-e85d-4f3a-acf8-28439a520611-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"3c36bd79-e85d-4f3a-acf8-28439a520611\") " pod="openstack/glance-default-external-api-0" Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.442670 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c36bd79-e85d-4f3a-acf8-28439a520611-logs\") pod \"glance-default-external-api-0\" (UID: \"3c36bd79-e85d-4f3a-acf8-28439a520611\") " pod="openstack/glance-default-external-api-0" Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.442729 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lphf6\" (UniqueName: \"kubernetes.io/projected/3c36bd79-e85d-4f3a-acf8-28439a520611-kube-api-access-lphf6\") pod \"glance-default-external-api-0\" (UID: \"3c36bd79-e85d-4f3a-acf8-28439a520611\") " pod="openstack/glance-default-external-api-0" Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.442794 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"3c36bd79-e85d-4f3a-acf8-28439a520611\") " pod="openstack/glance-default-external-api-0" Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.548934 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c36bd79-e85d-4f3a-acf8-28439a520611-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"3c36bd79-e85d-4f3a-acf8-28439a520611\") " pod="openstack/glance-default-external-api-0" Dec 
01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.548992 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3c36bd79-e85d-4f3a-acf8-28439a520611-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"3c36bd79-e85d-4f3a-acf8-28439a520611\") " pod="openstack/glance-default-external-api-0" Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.549025 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c36bd79-e85d-4f3a-acf8-28439a520611-logs\") pod \"glance-default-external-api-0\" (UID: \"3c36bd79-e85d-4f3a-acf8-28439a520611\") " pod="openstack/glance-default-external-api-0" Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.549069 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2n6d\" (UniqueName: \"kubernetes.io/projected/df95e914-4ee4-4ead-9306-4f2aa5b2c431-kube-api-access-g2n6d\") pod \"glance-default-internal-api-0\" (UID: \"df95e914-4ee4-4ead-9306-4f2aa5b2c431\") " pod="openstack/glance-default-internal-api-0" Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.549103 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/df95e914-4ee4-4ead-9306-4f2aa5b2c431-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"df95e914-4ee4-4ead-9306-4f2aa5b2c431\") " pod="openstack/glance-default-internal-api-0" Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.549123 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lphf6\" (UniqueName: \"kubernetes.io/projected/3c36bd79-e85d-4f3a-acf8-28439a520611-kube-api-access-lphf6\") pod \"glance-default-external-api-0\" (UID: \"3c36bd79-e85d-4f3a-acf8-28439a520611\") " pod="openstack/glance-default-external-api-0" Dec 01 08:40:00 crc 
kubenswrapper[5004]: I1201 08:40:00.549139 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"df95e914-4ee4-4ead-9306-4f2aa5b2c431\") " pod="openstack/glance-default-internal-api-0" Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.549178 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/df95e914-4ee4-4ead-9306-4f2aa5b2c431-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"df95e914-4ee4-4ead-9306-4f2aa5b2c431\") " pod="openstack/glance-default-internal-api-0" Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.549206 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"3c36bd79-e85d-4f3a-acf8-28439a520611\") " pod="openstack/glance-default-external-api-0" Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.549232 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df95e914-4ee4-4ead-9306-4f2aa5b2c431-config-data\") pod \"glance-default-internal-api-0\" (UID: \"df95e914-4ee4-4ead-9306-4f2aa5b2c431\") " pod="openstack/glance-default-internal-api-0" Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.549252 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df95e914-4ee4-4ead-9306-4f2aa5b2c431-logs\") pod \"glance-default-internal-api-0\" (UID: \"df95e914-4ee4-4ead-9306-4f2aa5b2c431\") " pod="openstack/glance-default-internal-api-0" Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 
08:40:00.549275 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3c36bd79-e85d-4f3a-acf8-28439a520611-scripts\") pod \"glance-default-external-api-0\" (UID: \"3c36bd79-e85d-4f3a-acf8-28439a520611\") " pod="openstack/glance-default-external-api-0" Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.549304 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c36bd79-e85d-4f3a-acf8-28439a520611-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"3c36bd79-e85d-4f3a-acf8-28439a520611\") " pod="openstack/glance-default-external-api-0" Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.549324 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df95e914-4ee4-4ead-9306-4f2aa5b2c431-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"df95e914-4ee4-4ead-9306-4f2aa5b2c431\") " pod="openstack/glance-default-internal-api-0" Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.549360 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c36bd79-e85d-4f3a-acf8-28439a520611-config-data\") pod \"glance-default-external-api-0\" (UID: \"3c36bd79-e85d-4f3a-acf8-28439a520611\") " pod="openstack/glance-default-external-api-0" Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.549376 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df95e914-4ee4-4ead-9306-4f2aa5b2c431-scripts\") pod \"glance-default-internal-api-0\" (UID: \"df95e914-4ee4-4ead-9306-4f2aa5b2c431\") " pod="openstack/glance-default-internal-api-0" Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.550075 5004 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3c36bd79-e85d-4f3a-acf8-28439a520611-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"3c36bd79-e85d-4f3a-acf8-28439a520611\") " pod="openstack/glance-default-external-api-0" Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.550308 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c36bd79-e85d-4f3a-acf8-28439a520611-logs\") pod \"glance-default-external-api-0\" (UID: \"3c36bd79-e85d-4f3a-acf8-28439a520611\") " pod="openstack/glance-default-external-api-0" Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.550813 5004 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"3c36bd79-e85d-4f3a-acf8-28439a520611\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/glance-default-external-api-0" Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.563298 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c36bd79-e85d-4f3a-acf8-28439a520611-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"3c36bd79-e85d-4f3a-acf8-28439a520611\") " pod="openstack/glance-default-external-api-0" Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.568876 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3c36bd79-e85d-4f3a-acf8-28439a520611-scripts\") pod \"glance-default-external-api-0\" (UID: \"3c36bd79-e85d-4f3a-acf8-28439a520611\") " pod="openstack/glance-default-external-api-0" Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.581201 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/3c36bd79-e85d-4f3a-acf8-28439a520611-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"3c36bd79-e85d-4f3a-acf8-28439a520611\") " pod="openstack/glance-default-external-api-0" Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.581308 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c36bd79-e85d-4f3a-acf8-28439a520611-config-data\") pod \"glance-default-external-api-0\" (UID: \"3c36bd79-e85d-4f3a-acf8-28439a520611\") " pod="openstack/glance-default-external-api-0" Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.582333 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lphf6\" (UniqueName: \"kubernetes.io/projected/3c36bd79-e85d-4f3a-acf8-28439a520611-kube-api-access-lphf6\") pod \"glance-default-external-api-0\" (UID: \"3c36bd79-e85d-4f3a-acf8-28439a520611\") " pod="openstack/glance-default-external-api-0" Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.650867 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2n6d\" (UniqueName: \"kubernetes.io/projected/df95e914-4ee4-4ead-9306-4f2aa5b2c431-kube-api-access-g2n6d\") pod \"glance-default-internal-api-0\" (UID: \"df95e914-4ee4-4ead-9306-4f2aa5b2c431\") " pod="openstack/glance-default-internal-api-0" Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.651121 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/df95e914-4ee4-4ead-9306-4f2aa5b2c431-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"df95e914-4ee4-4ead-9306-4f2aa5b2c431\") " pod="openstack/glance-default-internal-api-0" Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.651146 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"df95e914-4ee4-4ead-9306-4f2aa5b2c431\") " pod="openstack/glance-default-internal-api-0" Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.651187 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/df95e914-4ee4-4ead-9306-4f2aa5b2c431-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"df95e914-4ee4-4ead-9306-4f2aa5b2c431\") " pod="openstack/glance-default-internal-api-0" Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.651235 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df95e914-4ee4-4ead-9306-4f2aa5b2c431-config-data\") pod \"glance-default-internal-api-0\" (UID: \"df95e914-4ee4-4ead-9306-4f2aa5b2c431\") " pod="openstack/glance-default-internal-api-0" Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.651252 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df95e914-4ee4-4ead-9306-4f2aa5b2c431-logs\") pod \"glance-default-internal-api-0\" (UID: \"df95e914-4ee4-4ead-9306-4f2aa5b2c431\") " pod="openstack/glance-default-internal-api-0" Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.651295 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df95e914-4ee4-4ead-9306-4f2aa5b2c431-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"df95e914-4ee4-4ead-9306-4f2aa5b2c431\") " pod="openstack/glance-default-internal-api-0" Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.651334 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df95e914-4ee4-4ead-9306-4f2aa5b2c431-scripts\") pod 
\"glance-default-internal-api-0\" (UID: \"df95e914-4ee4-4ead-9306-4f2aa5b2c431\") " pod="openstack/glance-default-internal-api-0" Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.654165 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df95e914-4ee4-4ead-9306-4f2aa5b2c431-logs\") pod \"glance-default-internal-api-0\" (UID: \"df95e914-4ee4-4ead-9306-4f2aa5b2c431\") " pod="openstack/glance-default-internal-api-0" Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.654507 5004 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"df95e914-4ee4-4ead-9306-4f2aa5b2c431\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/glance-default-internal-api-0" Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.654716 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/df95e914-4ee4-4ead-9306-4f2aa5b2c431-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"df95e914-4ee4-4ead-9306-4f2aa5b2c431\") " pod="openstack/glance-default-internal-api-0" Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.665152 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/df95e914-4ee4-4ead-9306-4f2aa5b2c431-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"df95e914-4ee4-4ead-9306-4f2aa5b2c431\") " pod="openstack/glance-default-internal-api-0" Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.667370 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df95e914-4ee4-4ead-9306-4f2aa5b2c431-scripts\") pod \"glance-default-internal-api-0\" (UID: \"df95e914-4ee4-4ead-9306-4f2aa5b2c431\") " 
pod="openstack/glance-default-internal-api-0" Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.674377 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df95e914-4ee4-4ead-9306-4f2aa5b2c431-config-data\") pod \"glance-default-internal-api-0\" (UID: \"df95e914-4ee4-4ead-9306-4f2aa5b2c431\") " pod="openstack/glance-default-internal-api-0" Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.681251 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2n6d\" (UniqueName: \"kubernetes.io/projected/df95e914-4ee4-4ead-9306-4f2aa5b2c431-kube-api-access-g2n6d\") pod \"glance-default-internal-api-0\" (UID: \"df95e914-4ee4-4ead-9306-4f2aa5b2c431\") " pod="openstack/glance-default-internal-api-0" Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.692402 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df95e914-4ee4-4ead-9306-4f2aa5b2c431-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"df95e914-4ee4-4ead-9306-4f2aa5b2c431\") " pod="openstack/glance-default-internal-api-0" Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.711665 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"df95e914-4ee4-4ead-9306-4f2aa5b2c431\") " pod="openstack/glance-default-internal-api-0" Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.721822 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"3c36bd79-e85d-4f3a-acf8-28439a520611\") " pod="openstack/glance-default-external-api-0" Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.837660 5004 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 01 08:40:00 crc kubenswrapper[5004]: I1201 08:40:00.857023 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 01 08:40:01 crc kubenswrapper[5004]: I1201 08:40:01.011948 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-cwgb2"] Dec 01 08:40:01 crc kubenswrapper[5004]: I1201 08:40:01.186021 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-68h6f"] Dec 01 08:40:01 crc kubenswrapper[5004]: I1201 08:40:01.196856 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-ln7f8"] Dec 01 08:40:01 crc kubenswrapper[5004]: I1201 08:40:01.363739 5004 generic.go:334] "Generic (PLEG): container finished" podID="be28d2bd-7ba1-4322-ac54-24a8e63c807a" containerID="798665d4d9a16f9b8f2bd47e00af67f0d11ba92b49046191ef097e51bc1f7c70" exitCode=0 Dec 01 08:40:01 crc kubenswrapper[5004]: I1201 08:40:01.363819 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c9c9f998c-zwfnx" event={"ID":"be28d2bd-7ba1-4322-ac54-24a8e63c807a","Type":"ContainerDied","Data":"798665d4d9a16f9b8f2bd47e00af67f0d11ba92b49046191ef097e51bc1f7c70"} Dec 01 08:40:01 crc kubenswrapper[5004]: I1201 08:40:01.365959 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-bkslc" event={"ID":"47efa747-2a3f-4e7f-b1c2-222dd039c1fe","Type":"ContainerStarted","Data":"aa8a0ddeefe479aac16227176ae767c4796462f8d57e843478acde9f393a951f"} Dec 01 08:40:01 crc kubenswrapper[5004]: I1201 08:40:01.365995 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-bkslc" event={"ID":"47efa747-2a3f-4e7f-b1c2-222dd039c1fe","Type":"ContainerStarted","Data":"c0c08ee3ae527b0a970d28f9069cde22587ba80161f470e8def06e995988a6ef"} Dec 01 08:40:01 crc kubenswrapper[5004]: 
I1201 08:40:01.369046 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-68h6f" event={"ID":"87079b8d-839c-42d2-95d1-33dee4ca61e1","Type":"ContainerStarted","Data":"d8316539ebf2a931dc44bea7c691f840d8c5da98f31934274c0676be3f66dd95"} Dec 01 08:40:01 crc kubenswrapper[5004]: I1201 08:40:01.370379 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-ln7f8" event={"ID":"165d617f-a220-49b1-af2b-65d4c509962c","Type":"ContainerStarted","Data":"c8eeda69a5e9d51194fda5a9b50a5deb686408debce833088062e32b39c2f37c"} Dec 01 08:40:01 crc kubenswrapper[5004]: I1201 08:40:01.373062 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-895cf5cf-z2whz" event={"ID":"92bb6b5b-96fd-416b-98a4-199390fd61b1","Type":"ContainerDied","Data":"ff0c0ea55ffc502e34feb8258eefe0a63c5ef3a1b2c4e74903061acd50712ed1"} Dec 01 08:40:01 crc kubenswrapper[5004]: I1201 08:40:01.373135 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ff0c0ea55ffc502e34feb8258eefe0a63c5ef3a1b2c4e74903061acd50712ed1" Dec 01 08:40:01 crc kubenswrapper[5004]: I1201 08:40:01.374652 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-cwgb2" event={"ID":"fb372dfc-6007-42ba-bc16-96f7d99d8b98","Type":"ContainerStarted","Data":"fe1e31d629a397126b9d8027ebe24cb48f5c84b792ad9b84966980d51c4ca396"} Dec 01 08:40:01 crc kubenswrapper[5004]: I1201 08:40:01.379631 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Dec 01 08:40:01 crc kubenswrapper[5004]: I1201 08:40:01.441926 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-bkslc" podStartSLOduration=2.441909489 podStartE2EDuration="2.441909489s" podCreationTimestamp="2025-12-01 08:39:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-12-01 08:40:01.441242712 +0000 UTC m=+1379.006234694" watchObservedRunningTime="2025-12-01 08:40:01.441909489 +0000 UTC m=+1379.006901471" Dec 01 08:40:01 crc kubenswrapper[5004]: I1201 08:40:01.444986 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-895cf5cf-z2whz" Dec 01 08:40:01 crc kubenswrapper[5004]: I1201 08:40:01.474733 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dd5cd\" (UniqueName: \"kubernetes.io/projected/92bb6b5b-96fd-416b-98a4-199390fd61b1-kube-api-access-dd5cd\") pod \"92bb6b5b-96fd-416b-98a4-199390fd61b1\" (UID: \"92bb6b5b-96fd-416b-98a4-199390fd61b1\") " Dec 01 08:40:01 crc kubenswrapper[5004]: I1201 08:40:01.474805 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92bb6b5b-96fd-416b-98a4-199390fd61b1-config\") pod \"92bb6b5b-96fd-416b-98a4-199390fd61b1\" (UID: \"92bb6b5b-96fd-416b-98a4-199390fd61b1\") " Dec 01 08:40:01 crc kubenswrapper[5004]: I1201 08:40:01.474905 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/92bb6b5b-96fd-416b-98a4-199390fd61b1-ovsdbserver-sb\") pod \"92bb6b5b-96fd-416b-98a4-199390fd61b1\" (UID: \"92bb6b5b-96fd-416b-98a4-199390fd61b1\") " Dec 01 08:40:01 crc kubenswrapper[5004]: I1201 08:40:01.474932 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/92bb6b5b-96fd-416b-98a4-199390fd61b1-ovsdbserver-nb\") pod \"92bb6b5b-96fd-416b-98a4-199390fd61b1\" (UID: \"92bb6b5b-96fd-416b-98a4-199390fd61b1\") " Dec 01 08:40:01 crc kubenswrapper[5004]: I1201 08:40:01.474974 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/92bb6b5b-96fd-416b-98a4-199390fd61b1-dns-svc\") pod \"92bb6b5b-96fd-416b-98a4-199390fd61b1\" (UID: \"92bb6b5b-96fd-416b-98a4-199390fd61b1\") " Dec 01 08:40:01 crc kubenswrapper[5004]: I1201 08:40:01.474995 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/92bb6b5b-96fd-416b-98a4-199390fd61b1-dns-swift-storage-0\") pod \"92bb6b5b-96fd-416b-98a4-199390fd61b1\" (UID: \"92bb6b5b-96fd-416b-98a4-199390fd61b1\") " Dec 01 08:40:01 crc kubenswrapper[5004]: I1201 08:40:01.498537 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92bb6b5b-96fd-416b-98a4-199390fd61b1-kube-api-access-dd5cd" (OuterVolumeSpecName: "kube-api-access-dd5cd") pod "92bb6b5b-96fd-416b-98a4-199390fd61b1" (UID: "92bb6b5b-96fd-416b-98a4-199390fd61b1"). InnerVolumeSpecName "kube-api-access-dd5cd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:40:01 crc kubenswrapper[5004]: I1201 08:40:01.577842 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dd5cd\" (UniqueName: \"kubernetes.io/projected/92bb6b5b-96fd-416b-98a4-199390fd61b1-kube-api-access-dd5cd\") on node \"crc\" DevicePath \"\"" Dec 01 08:40:01 crc kubenswrapper[5004]: I1201 08:40:01.597551 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92bb6b5b-96fd-416b-98a4-199390fd61b1-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "92bb6b5b-96fd-416b-98a4-199390fd61b1" (UID: "92bb6b5b-96fd-416b-98a4-199390fd61b1"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:40:01 crc kubenswrapper[5004]: I1201 08:40:01.702385 5004 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/92bb6b5b-96fd-416b-98a4-199390fd61b1-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 01 08:40:01 crc kubenswrapper[5004]: I1201 08:40:01.802284 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 01 08:40:01 crc kubenswrapper[5004]: I1201 08:40:01.970386 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-vgmk8"] Dec 01 08:40:02 crc kubenswrapper[5004]: I1201 08:40:02.009346 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-7jdwx"] Dec 01 08:40:02 crc kubenswrapper[5004]: I1201 08:40:02.030990 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-9x9h5"] Dec 01 08:40:02 crc kubenswrapper[5004]: I1201 08:40:02.067618 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 08:40:02 crc kubenswrapper[5004]: I1201 08:40:02.120020 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 01 08:40:02 crc kubenswrapper[5004]: I1201 08:40:02.254685 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92bb6b5b-96fd-416b-98a4-199390fd61b1-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "92bb6b5b-96fd-416b-98a4-199390fd61b1" (UID: "92bb6b5b-96fd-416b-98a4-199390fd61b1"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:40:02 crc kubenswrapper[5004]: I1201 08:40:02.290276 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 08:40:02 crc kubenswrapper[5004]: I1201 08:40:02.324650 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 01 08:40:02 crc kubenswrapper[5004]: I1201 08:40:02.340090 5004 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/92bb6b5b-96fd-416b-98a4-199390fd61b1-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 01 08:40:02 crc kubenswrapper[5004]: I1201 08:40:02.345027 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92bb6b5b-96fd-416b-98a4-199390fd61b1-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "92bb6b5b-96fd-416b-98a4-199390fd61b1" (UID: "92bb6b5b-96fd-416b-98a4-199390fd61b1"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:40:02 crc kubenswrapper[5004]: W1201 08:40:02.358980 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode5303a09_48ed_4287_83ba_0fb70fe199d0.slice/crio-8a99f3615c9b5ecb5c3b604389126063c56e3de2d5a2cec04b9ab1f7458955bc WatchSource:0}: Error finding container 8a99f3615c9b5ecb5c3b604389126063c56e3de2d5a2cec04b9ab1f7458955bc: Status 404 returned error can't find the container with id 8a99f3615c9b5ecb5c3b604389126063c56e3de2d5a2cec04b9ab1f7458955bc Dec 01 08:40:02 crc kubenswrapper[5004]: W1201 08:40:02.368335 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3c36bd79_e85d_4f3a_acf8_28439a520611.slice/crio-2510933437e6aa7d9ced98bc64a70ad9ee3c6d678050351516a57041a7af83e4 WatchSource:0}: Error finding container 2510933437e6aa7d9ced98bc64a70ad9ee3c6d678050351516a57041a7af83e4: Status 404 returned error can't find the container with id 2510933437e6aa7d9ced98bc64a70ad9ee3c6d678050351516a57041a7af83e4 Dec 01 08:40:02 crc kubenswrapper[5004]: I1201 08:40:02.408414 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92bb6b5b-96fd-416b-98a4-199390fd61b1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "92bb6b5b-96fd-416b-98a4-199390fd61b1" (UID: "92bb6b5b-96fd-416b-98a4-199390fd61b1"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:40:02 crc kubenswrapper[5004]: I1201 08:40:02.410433 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3c36bd79-e85d-4f3a-acf8-28439a520611","Type":"ContainerStarted","Data":"2510933437e6aa7d9ced98bc64a70ad9ee3c6d678050351516a57041a7af83e4"} Dec 01 08:40:02 crc kubenswrapper[5004]: I1201 08:40:02.429089 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-7jdwx" event={"ID":"e5303a09-48ed-4287-83ba-0fb70fe199d0","Type":"ContainerStarted","Data":"8a99f3615c9b5ecb5c3b604389126063c56e3de2d5a2cec04b9ab1f7458955bc"} Dec 01 08:40:02 crc kubenswrapper[5004]: I1201 08:40:02.433516 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-vgmk8" event={"ID":"0eccea77-d6ee-4592-ad47-1f29ca2a943b","Type":"ContainerStarted","Data":"54b4d25d0fb7243c4ea6c7d5b0e752b0b37710e32def63324a9c28d342d47d0a"} Dec 01 08:40:02 crc kubenswrapper[5004]: I1201 08:40:02.433579 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-895cf5cf-z2whz" Dec 01 08:40:02 crc kubenswrapper[5004]: I1201 08:40:02.439611 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92bb6b5b-96fd-416b-98a4-199390fd61b1-config" (OuterVolumeSpecName: "config") pod "92bb6b5b-96fd-416b-98a4-199390fd61b1" (UID: "92bb6b5b-96fd-416b-98a4-199390fd61b1"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:40:02 crc kubenswrapper[5004]: I1201 08:40:02.441680 5004 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92bb6b5b-96fd-416b-98a4-199390fd61b1-config\") on node \"crc\" DevicePath \"\"" Dec 01 08:40:02 crc kubenswrapper[5004]: I1201 08:40:02.441700 5004 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/92bb6b5b-96fd-416b-98a4-199390fd61b1-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 01 08:40:02 crc kubenswrapper[5004]: I1201 08:40:02.441716 5004 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/92bb6b5b-96fd-416b-98a4-199390fd61b1-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 08:40:02 crc kubenswrapper[5004]: I1201 08:40:02.452190 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 01 08:40:02 crc kubenswrapper[5004]: I1201 08:40:02.612202 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6c9c9f998c-zwfnx" Dec 01 08:40:02 crc kubenswrapper[5004]: I1201 08:40:02.649138 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/be28d2bd-7ba1-4322-ac54-24a8e63c807a-dns-svc\") pod \"be28d2bd-7ba1-4322-ac54-24a8e63c807a\" (UID: \"be28d2bd-7ba1-4322-ac54-24a8e63c807a\") " Dec 01 08:40:02 crc kubenswrapper[5004]: I1201 08:40:02.649230 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/be28d2bd-7ba1-4322-ac54-24a8e63c807a-ovsdbserver-nb\") pod \"be28d2bd-7ba1-4322-ac54-24a8e63c807a\" (UID: \"be28d2bd-7ba1-4322-ac54-24a8e63c807a\") " Dec 01 08:40:02 crc kubenswrapper[5004]: I1201 08:40:02.649293 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be28d2bd-7ba1-4322-ac54-24a8e63c807a-config\") pod \"be28d2bd-7ba1-4322-ac54-24a8e63c807a\" (UID: \"be28d2bd-7ba1-4322-ac54-24a8e63c807a\") " Dec 01 08:40:02 crc kubenswrapper[5004]: I1201 08:40:02.649533 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/be28d2bd-7ba1-4322-ac54-24a8e63c807a-ovsdbserver-sb\") pod \"be28d2bd-7ba1-4322-ac54-24a8e63c807a\" (UID: \"be28d2bd-7ba1-4322-ac54-24a8e63c807a\") " Dec 01 08:40:02 crc kubenswrapper[5004]: I1201 08:40:02.649638 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jzd5c\" (UniqueName: \"kubernetes.io/projected/be28d2bd-7ba1-4322-ac54-24a8e63c807a-kube-api-access-jzd5c\") pod \"be28d2bd-7ba1-4322-ac54-24a8e63c807a\" (UID: \"be28d2bd-7ba1-4322-ac54-24a8e63c807a\") " Dec 01 08:40:02 crc kubenswrapper[5004]: I1201 08:40:02.649700 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/be28d2bd-7ba1-4322-ac54-24a8e63c807a-dns-swift-storage-0\") pod \"be28d2bd-7ba1-4322-ac54-24a8e63c807a\" (UID: \"be28d2bd-7ba1-4322-ac54-24a8e63c807a\") " Dec 01 08:40:02 crc kubenswrapper[5004]: I1201 08:40:02.654576 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be28d2bd-7ba1-4322-ac54-24a8e63c807a-kube-api-access-jzd5c" (OuterVolumeSpecName: "kube-api-access-jzd5c") pod "be28d2bd-7ba1-4322-ac54-24a8e63c807a" (UID: "be28d2bd-7ba1-4322-ac54-24a8e63c807a"). InnerVolumeSpecName "kube-api-access-jzd5c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:40:02 crc kubenswrapper[5004]: I1201 08:40:02.731623 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be28d2bd-7ba1-4322-ac54-24a8e63c807a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "be28d2bd-7ba1-4322-ac54-24a8e63c807a" (UID: "be28d2bd-7ba1-4322-ac54-24a8e63c807a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:40:02 crc kubenswrapper[5004]: I1201 08:40:02.752535 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be28d2bd-7ba1-4322-ac54-24a8e63c807a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "be28d2bd-7ba1-4322-ac54-24a8e63c807a" (UID: "be28d2bd-7ba1-4322-ac54-24a8e63c807a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:40:02 crc kubenswrapper[5004]: I1201 08:40:02.758195 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be28d2bd-7ba1-4322-ac54-24a8e63c807a-config" (OuterVolumeSpecName: "config") pod "be28d2bd-7ba1-4322-ac54-24a8e63c807a" (UID: "be28d2bd-7ba1-4322-ac54-24a8e63c807a"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:40:02 crc kubenswrapper[5004]: I1201 08:40:02.773280 5004 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/be28d2bd-7ba1-4322-ac54-24a8e63c807a-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 08:40:02 crc kubenswrapper[5004]: I1201 08:40:02.774324 5004 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be28d2bd-7ba1-4322-ac54-24a8e63c807a-config\") on node \"crc\" DevicePath \"\"" Dec 01 08:40:02 crc kubenswrapper[5004]: I1201 08:40:02.774342 5004 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/be28d2bd-7ba1-4322-ac54-24a8e63c807a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 01 08:40:02 crc kubenswrapper[5004]: I1201 08:40:02.774354 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jzd5c\" (UniqueName: \"kubernetes.io/projected/be28d2bd-7ba1-4322-ac54-24a8e63c807a-kube-api-access-jzd5c\") on node \"crc\" DevicePath \"\"" Dec 01 08:40:02 crc kubenswrapper[5004]: I1201 08:40:02.818784 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be28d2bd-7ba1-4322-ac54-24a8e63c807a-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "be28d2bd-7ba1-4322-ac54-24a8e63c807a" (UID: "be28d2bd-7ba1-4322-ac54-24a8e63c807a"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:40:02 crc kubenswrapper[5004]: I1201 08:40:02.830138 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be28d2bd-7ba1-4322-ac54-24a8e63c807a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "be28d2bd-7ba1-4322-ac54-24a8e63c807a" (UID: "be28d2bd-7ba1-4322-ac54-24a8e63c807a"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:40:02 crc kubenswrapper[5004]: I1201 08:40:02.877984 5004 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/be28d2bd-7ba1-4322-ac54-24a8e63c807a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 01 08:40:02 crc kubenswrapper[5004]: I1201 08:40:02.878017 5004 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/be28d2bd-7ba1-4322-ac54-24a8e63c807a-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 01 08:40:03 crc kubenswrapper[5004]: I1201 08:40:03.073932 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-895cf5cf-z2whz"] Dec 01 08:40:03 crc kubenswrapper[5004]: I1201 08:40:03.096628 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-895cf5cf-z2whz"] Dec 01 08:40:03 crc kubenswrapper[5004]: I1201 08:40:03.464015 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1fd8d826-4dab-4d07-bc04-be5dfebdaf2b","Type":"ContainerStarted","Data":"6f08f218b7a030d6e0d1680710ecf70e182562331ffdb4941d7d886be6ca2db6"} Dec 01 08:40:03 crc kubenswrapper[5004]: I1201 08:40:03.494682 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-68h6f" event={"ID":"87079b8d-839c-42d2-95d1-33dee4ca61e1","Type":"ContainerStarted","Data":"2e5a177f0147bb0a08e716773ead0c5f7639009e7d855103ecf90be8667d222d"} Dec 01 08:40:03 crc kubenswrapper[5004]: I1201 08:40:03.506263 5004 generic.go:334] "Generic (PLEG): container finished" podID="1b2170ab-37fa-4381-9001-5487eb2a302c" containerID="d84618ce559ed8363e13c4dadbdea5e393986932f5f51e7e54a580e72d7e978a" exitCode=0 Dec 01 08:40:03 crc kubenswrapper[5004]: I1201 08:40:03.506382 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57c957c4ff-9x9h5" 
event={"ID":"1b2170ab-37fa-4381-9001-5487eb2a302c","Type":"ContainerDied","Data":"d84618ce559ed8363e13c4dadbdea5e393986932f5f51e7e54a580e72d7e978a"} Dec 01 08:40:03 crc kubenswrapper[5004]: I1201 08:40:03.506409 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57c957c4ff-9x9h5" event={"ID":"1b2170ab-37fa-4381-9001-5487eb2a302c","Type":"ContainerStarted","Data":"83b6adac67cf58fa3c8096ca7abcbc3d41053a5bd9f2f171c4a7e60af501e3d8"} Dec 01 08:40:03 crc kubenswrapper[5004]: I1201 08:40:03.516308 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c9c9f998c-zwfnx" event={"ID":"be28d2bd-7ba1-4322-ac54-24a8e63c807a","Type":"ContainerDied","Data":"e57f6b9c355c1519f0c1fd9eefbe0d99bce81fcd89c259d14e1a5744879b0b78"} Dec 01 08:40:03 crc kubenswrapper[5004]: I1201 08:40:03.516360 5004 scope.go:117] "RemoveContainer" containerID="798665d4d9a16f9b8f2bd47e00af67f0d11ba92b49046191ef097e51bc1f7c70" Dec 01 08:40:03 crc kubenswrapper[5004]: I1201 08:40:03.516521 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6c9c9f998c-zwfnx" Dec 01 08:40:03 crc kubenswrapper[5004]: I1201 08:40:03.539784 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-68h6f" podStartSLOduration=4.539764252 podStartE2EDuration="4.539764252s" podCreationTimestamp="2025-12-01 08:39:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:40:03.516298368 +0000 UTC m=+1381.081290350" watchObservedRunningTime="2025-12-01 08:40:03.539764252 +0000 UTC m=+1381.104756234" Dec 01 08:40:03 crc kubenswrapper[5004]: I1201 08:40:03.539893 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"df95e914-4ee4-4ead-9306-4f2aa5b2c431","Type":"ContainerStarted","Data":"a4b544cd0fcc4a8e0b8f51a7f849f13930b8648f58482c0413bd8b9b3ed03877"} Dec 01 08:40:03 crc kubenswrapper[5004]: I1201 08:40:03.646981 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c9c9f998c-zwfnx"] Dec 01 08:40:03 crc kubenswrapper[5004]: I1201 08:40:03.705185 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6c9c9f998c-zwfnx"] Dec 01 08:40:04 crc kubenswrapper[5004]: I1201 08:40:04.565974 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3c36bd79-e85d-4f3a-acf8-28439a520611","Type":"ContainerStarted","Data":"260be28cadb69a876e5e2fb0ea3a8703952eccc9ec875bac39213f374c659229"} Dec 01 08:40:04 crc kubenswrapper[5004]: I1201 08:40:04.570198 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"df95e914-4ee4-4ead-9306-4f2aa5b2c431","Type":"ContainerStarted","Data":"c495e8ff21df11426dd22273b521a77287ef27f2a5c16d0fbb132263af3c271f"} Dec 01 08:40:04 crc kubenswrapper[5004]: I1201 08:40:04.777103 5004 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92bb6b5b-96fd-416b-98a4-199390fd61b1" path="/var/lib/kubelet/pods/92bb6b5b-96fd-416b-98a4-199390fd61b1/volumes" Dec 01 08:40:04 crc kubenswrapper[5004]: I1201 08:40:04.777782 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be28d2bd-7ba1-4322-ac54-24a8e63c807a" path="/var/lib/kubelet/pods/be28d2bd-7ba1-4322-ac54-24a8e63c807a/volumes" Dec 01 08:40:05 crc kubenswrapper[5004]: I1201 08:40:05.581599 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57c957c4ff-9x9h5" event={"ID":"1b2170ab-37fa-4381-9001-5487eb2a302c","Type":"ContainerStarted","Data":"ba403a449325a9a9d719d703d7e195585ec5fd7c266e71d64e8d24dde0df14a4"} Dec 01 08:40:05 crc kubenswrapper[5004]: I1201 08:40:05.582234 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57c957c4ff-9x9h5" Dec 01 08:40:05 crc kubenswrapper[5004]: I1201 08:40:05.584818 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"df95e914-4ee4-4ead-9306-4f2aa5b2c431","Type":"ContainerStarted","Data":"096db5eefe76b833206c0552285a37c222f7f9fdcd618eee28153246aca441e9"} Dec 01 08:40:05 crc kubenswrapper[5004]: I1201 08:40:05.584935 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="df95e914-4ee4-4ead-9306-4f2aa5b2c431" containerName="glance-log" containerID="cri-o://c495e8ff21df11426dd22273b521a77287ef27f2a5c16d0fbb132263af3c271f" gracePeriod=30 Dec 01 08:40:05 crc kubenswrapper[5004]: I1201 08:40:05.585176 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="df95e914-4ee4-4ead-9306-4f2aa5b2c431" containerName="glance-httpd" containerID="cri-o://096db5eefe76b833206c0552285a37c222f7f9fdcd618eee28153246aca441e9" gracePeriod=30 Dec 01 08:40:05 crc 
kubenswrapper[5004]: I1201 08:40:05.588289 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3c36bd79-e85d-4f3a-acf8-28439a520611","Type":"ContainerStarted","Data":"83f8142cbf9e8042b7a8a11aa8f3366fe547181b81e1fd35f49d8e7d05f908d2"} Dec 01 08:40:05 crc kubenswrapper[5004]: I1201 08:40:05.588430 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="3c36bd79-e85d-4f3a-acf8-28439a520611" containerName="glance-log" containerID="cri-o://260be28cadb69a876e5e2fb0ea3a8703952eccc9ec875bac39213f374c659229" gracePeriod=30 Dec 01 08:40:05 crc kubenswrapper[5004]: I1201 08:40:05.588536 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="3c36bd79-e85d-4f3a-acf8-28439a520611" containerName="glance-httpd" containerID="cri-o://83f8142cbf9e8042b7a8a11aa8f3366fe547181b81e1fd35f49d8e7d05f908d2" gracePeriod=30 Dec 01 08:40:05 crc kubenswrapper[5004]: I1201 08:40:05.605821 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57c957c4ff-9x9h5" podStartSLOduration=6.605804588 podStartE2EDuration="6.605804588s" podCreationTimestamp="2025-12-01 08:39:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:40:05.596137452 +0000 UTC m=+1383.161129434" watchObservedRunningTime="2025-12-01 08:40:05.605804588 +0000 UTC m=+1383.170796570" Dec 01 08:40:05 crc kubenswrapper[5004]: I1201 08:40:05.630522 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=6.630504751 podStartE2EDuration="6.630504751s" podCreationTimestamp="2025-12-01 08:39:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-12-01 08:40:05.627441396 +0000 UTC m=+1383.192433368" watchObservedRunningTime="2025-12-01 08:40:05.630504751 +0000 UTC m=+1383.195496723" Dec 01 08:40:05 crc kubenswrapper[5004]: I1201 08:40:05.648668 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=6.648652804 podStartE2EDuration="6.648652804s" podCreationTimestamp="2025-12-01 08:39:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:40:05.647411234 +0000 UTC m=+1383.212403206" watchObservedRunningTime="2025-12-01 08:40:05.648652804 +0000 UTC m=+1383.213644786" Dec 01 08:40:06 crc kubenswrapper[5004]: I1201 08:40:06.603076 5004 generic.go:334] "Generic (PLEG): container finished" podID="df95e914-4ee4-4ead-9306-4f2aa5b2c431" containerID="096db5eefe76b833206c0552285a37c222f7f9fdcd618eee28153246aca441e9" exitCode=0 Dec 01 08:40:06 crc kubenswrapper[5004]: I1201 08:40:06.603498 5004 generic.go:334] "Generic (PLEG): container finished" podID="df95e914-4ee4-4ead-9306-4f2aa5b2c431" containerID="c495e8ff21df11426dd22273b521a77287ef27f2a5c16d0fbb132263af3c271f" exitCode=143 Dec 01 08:40:06 crc kubenswrapper[5004]: I1201 08:40:06.603168 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"df95e914-4ee4-4ead-9306-4f2aa5b2c431","Type":"ContainerDied","Data":"096db5eefe76b833206c0552285a37c222f7f9fdcd618eee28153246aca441e9"} Dec 01 08:40:06 crc kubenswrapper[5004]: I1201 08:40:06.603615 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"df95e914-4ee4-4ead-9306-4f2aa5b2c431","Type":"ContainerDied","Data":"c495e8ff21df11426dd22273b521a77287ef27f2a5c16d0fbb132263af3c271f"} Dec 01 08:40:06 crc kubenswrapper[5004]: I1201 08:40:06.605146 5004 generic.go:334] "Generic (PLEG): container finished" 
podID="47efa747-2a3f-4e7f-b1c2-222dd039c1fe" containerID="aa8a0ddeefe479aac16227176ae767c4796462f8d57e843478acde9f393a951f" exitCode=0 Dec 01 08:40:06 crc kubenswrapper[5004]: I1201 08:40:06.605204 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-bkslc" event={"ID":"47efa747-2a3f-4e7f-b1c2-222dd039c1fe","Type":"ContainerDied","Data":"aa8a0ddeefe479aac16227176ae767c4796462f8d57e843478acde9f393a951f"} Dec 01 08:40:06 crc kubenswrapper[5004]: I1201 08:40:06.607040 5004 generic.go:334] "Generic (PLEG): container finished" podID="3c36bd79-e85d-4f3a-acf8-28439a520611" containerID="83f8142cbf9e8042b7a8a11aa8f3366fe547181b81e1fd35f49d8e7d05f908d2" exitCode=0 Dec 01 08:40:06 crc kubenswrapper[5004]: I1201 08:40:06.607060 5004 generic.go:334] "Generic (PLEG): container finished" podID="3c36bd79-e85d-4f3a-acf8-28439a520611" containerID="260be28cadb69a876e5e2fb0ea3a8703952eccc9ec875bac39213f374c659229" exitCode=143 Dec 01 08:40:06 crc kubenswrapper[5004]: I1201 08:40:06.608095 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3c36bd79-e85d-4f3a-acf8-28439a520611","Type":"ContainerDied","Data":"83f8142cbf9e8042b7a8a11aa8f3366fe547181b81e1fd35f49d8e7d05f908d2"} Dec 01 08:40:06 crc kubenswrapper[5004]: I1201 08:40:06.608126 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3c36bd79-e85d-4f3a-acf8-28439a520611","Type":"ContainerDied","Data":"260be28cadb69a876e5e2fb0ea3a8703952eccc9ec875bac39213f374c659229"} Dec 01 08:40:08 crc kubenswrapper[5004]: I1201 08:40:08.535842 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 01 08:40:08 crc kubenswrapper[5004]: I1201 08:40:08.640676 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3c36bd79-e85d-4f3a-acf8-28439a520611","Type":"ContainerDied","Data":"2510933437e6aa7d9ced98bc64a70ad9ee3c6d678050351516a57041a7af83e4"} Dec 01 08:40:08 crc kubenswrapper[5004]: I1201 08:40:08.640737 5004 scope.go:117] "RemoveContainer" containerID="83f8142cbf9e8042b7a8a11aa8f3366fe547181b81e1fd35f49d8e7d05f908d2" Dec 01 08:40:08 crc kubenswrapper[5004]: I1201 08:40:08.640931 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 01 08:40:08 crc kubenswrapper[5004]: I1201 08:40:08.658581 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3c36bd79-e85d-4f3a-acf8-28439a520611-scripts\") pod \"3c36bd79-e85d-4f3a-acf8-28439a520611\" (UID: \"3c36bd79-e85d-4f3a-acf8-28439a520611\") " Dec 01 08:40:08 crc kubenswrapper[5004]: I1201 08:40:08.658780 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c36bd79-e85d-4f3a-acf8-28439a520611-logs\") pod \"3c36bd79-e85d-4f3a-acf8-28439a520611\" (UID: \"3c36bd79-e85d-4f3a-acf8-28439a520611\") " Dec 01 08:40:08 crc kubenswrapper[5004]: I1201 08:40:08.658817 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c36bd79-e85d-4f3a-acf8-28439a520611-public-tls-certs\") pod \"3c36bd79-e85d-4f3a-acf8-28439a520611\" (UID: \"3c36bd79-e85d-4f3a-acf8-28439a520611\") " Dec 01 08:40:08 crc kubenswrapper[5004]: I1201 08:40:08.658854 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/3c36bd79-e85d-4f3a-acf8-28439a520611-combined-ca-bundle\") pod \"3c36bd79-e85d-4f3a-acf8-28439a520611\" (UID: \"3c36bd79-e85d-4f3a-acf8-28439a520611\") " Dec 01 08:40:08 crc kubenswrapper[5004]: I1201 08:40:08.658903 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"3c36bd79-e85d-4f3a-acf8-28439a520611\" (UID: \"3c36bd79-e85d-4f3a-acf8-28439a520611\") " Dec 01 08:40:08 crc kubenswrapper[5004]: I1201 08:40:08.658935 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lphf6\" (UniqueName: \"kubernetes.io/projected/3c36bd79-e85d-4f3a-acf8-28439a520611-kube-api-access-lphf6\") pod \"3c36bd79-e85d-4f3a-acf8-28439a520611\" (UID: \"3c36bd79-e85d-4f3a-acf8-28439a520611\") " Dec 01 08:40:08 crc kubenswrapper[5004]: I1201 08:40:08.659011 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c36bd79-e85d-4f3a-acf8-28439a520611-config-data\") pod \"3c36bd79-e85d-4f3a-acf8-28439a520611\" (UID: \"3c36bd79-e85d-4f3a-acf8-28439a520611\") " Dec 01 08:40:08 crc kubenswrapper[5004]: I1201 08:40:08.659127 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3c36bd79-e85d-4f3a-acf8-28439a520611-httpd-run\") pod \"3c36bd79-e85d-4f3a-acf8-28439a520611\" (UID: \"3c36bd79-e85d-4f3a-acf8-28439a520611\") " Dec 01 08:40:08 crc kubenswrapper[5004]: I1201 08:40:08.659374 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c36bd79-e85d-4f3a-acf8-28439a520611-logs" (OuterVolumeSpecName: "logs") pod "3c36bd79-e85d-4f3a-acf8-28439a520611" (UID: "3c36bd79-e85d-4f3a-acf8-28439a520611"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 08:40:08 crc kubenswrapper[5004]: I1201 08:40:08.659688 5004 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c36bd79-e85d-4f3a-acf8-28439a520611-logs\") on node \"crc\" DevicePath \"\"" Dec 01 08:40:08 crc kubenswrapper[5004]: I1201 08:40:08.659874 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c36bd79-e85d-4f3a-acf8-28439a520611-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "3c36bd79-e85d-4f3a-acf8-28439a520611" (UID: "3c36bd79-e85d-4f3a-acf8-28439a520611"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 08:40:08 crc kubenswrapper[5004]: I1201 08:40:08.664751 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c36bd79-e85d-4f3a-acf8-28439a520611-scripts" (OuterVolumeSpecName: "scripts") pod "3c36bd79-e85d-4f3a-acf8-28439a520611" (UID: "3c36bd79-e85d-4f3a-acf8-28439a520611"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:40:08 crc kubenswrapper[5004]: I1201 08:40:08.664897 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "glance") pod "3c36bd79-e85d-4f3a-acf8-28439a520611" (UID: "3c36bd79-e85d-4f3a-acf8-28439a520611"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 01 08:40:08 crc kubenswrapper[5004]: I1201 08:40:08.689882 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c36bd79-e85d-4f3a-acf8-28439a520611-kube-api-access-lphf6" (OuterVolumeSpecName: "kube-api-access-lphf6") pod "3c36bd79-e85d-4f3a-acf8-28439a520611" (UID: "3c36bd79-e85d-4f3a-acf8-28439a520611"). InnerVolumeSpecName "kube-api-access-lphf6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:40:08 crc kubenswrapper[5004]: I1201 08:40:08.699910 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c36bd79-e85d-4f3a-acf8-28439a520611-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3c36bd79-e85d-4f3a-acf8-28439a520611" (UID: "3c36bd79-e85d-4f3a-acf8-28439a520611"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:40:08 crc kubenswrapper[5004]: I1201 08:40:08.729536 5004 patch_prober.go:28] interesting pod/machine-config-daemon-fvdgt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 08:40:08 crc kubenswrapper[5004]: I1201 08:40:08.729620 5004 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 08:40:08 crc kubenswrapper[5004]: I1201 08:40:08.729804 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c36bd79-e85d-4f3a-acf8-28439a520611-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "3c36bd79-e85d-4f3a-acf8-28439a520611" (UID: "3c36bd79-e85d-4f3a-acf8-28439a520611"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:40:08 crc kubenswrapper[5004]: I1201 08:40:08.746813 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c36bd79-e85d-4f3a-acf8-28439a520611-config-data" (OuterVolumeSpecName: "config-data") pod "3c36bd79-e85d-4f3a-acf8-28439a520611" (UID: "3c36bd79-e85d-4f3a-acf8-28439a520611"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:40:08 crc kubenswrapper[5004]: I1201 08:40:08.762270 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lphf6\" (UniqueName: \"kubernetes.io/projected/3c36bd79-e85d-4f3a-acf8-28439a520611-kube-api-access-lphf6\") on node \"crc\" DevicePath \"\"" Dec 01 08:40:08 crc kubenswrapper[5004]: I1201 08:40:08.762302 5004 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c36bd79-e85d-4f3a-acf8-28439a520611-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 08:40:08 crc kubenswrapper[5004]: I1201 08:40:08.762313 5004 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3c36bd79-e85d-4f3a-acf8-28439a520611-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 01 08:40:08 crc kubenswrapper[5004]: I1201 08:40:08.762322 5004 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3c36bd79-e85d-4f3a-acf8-28439a520611-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 08:40:08 crc kubenswrapper[5004]: I1201 08:40:08.762330 5004 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c36bd79-e85d-4f3a-acf8-28439a520611-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 01 08:40:08 crc kubenswrapper[5004]: I1201 08:40:08.762339 5004 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/3c36bd79-e85d-4f3a-acf8-28439a520611-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 08:40:08 crc kubenswrapper[5004]: I1201 08:40:08.762371 5004 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Dec 01 08:40:08 crc kubenswrapper[5004]: I1201 08:40:08.785264 5004 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Dec 01 08:40:08 crc kubenswrapper[5004]: I1201 08:40:08.864186 5004 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Dec 01 08:40:08 crc kubenswrapper[5004]: I1201 08:40:08.982311 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 01 08:40:09 crc kubenswrapper[5004]: I1201 08:40:09.004733 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 01 08:40:09 crc kubenswrapper[5004]: I1201 08:40:09.015271 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 01 08:40:09 crc kubenswrapper[5004]: E1201 08:40:09.015902 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c36bd79-e85d-4f3a-acf8-28439a520611" containerName="glance-httpd" Dec 01 08:40:09 crc kubenswrapper[5004]: I1201 08:40:09.020141 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c36bd79-e85d-4f3a-acf8-28439a520611" containerName="glance-httpd" Dec 01 08:40:09 crc kubenswrapper[5004]: E1201 08:40:09.020244 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92bb6b5b-96fd-416b-98a4-199390fd61b1" containerName="init" Dec 01 08:40:09 crc kubenswrapper[5004]: I1201 08:40:09.020254 5004 
state_mem.go:107] "Deleted CPUSet assignment" podUID="92bb6b5b-96fd-416b-98a4-199390fd61b1" containerName="init" Dec 01 08:40:09 crc kubenswrapper[5004]: E1201 08:40:09.020295 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c36bd79-e85d-4f3a-acf8-28439a520611" containerName="glance-log" Dec 01 08:40:09 crc kubenswrapper[5004]: I1201 08:40:09.020304 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c36bd79-e85d-4f3a-acf8-28439a520611" containerName="glance-log" Dec 01 08:40:09 crc kubenswrapper[5004]: E1201 08:40:09.020337 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be28d2bd-7ba1-4322-ac54-24a8e63c807a" containerName="init" Dec 01 08:40:09 crc kubenswrapper[5004]: I1201 08:40:09.020346 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="be28d2bd-7ba1-4322-ac54-24a8e63c807a" containerName="init" Dec 01 08:40:09 crc kubenswrapper[5004]: I1201 08:40:09.020797 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="be28d2bd-7ba1-4322-ac54-24a8e63c807a" containerName="init" Dec 01 08:40:09 crc kubenswrapper[5004]: I1201 08:40:09.020836 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c36bd79-e85d-4f3a-acf8-28439a520611" containerName="glance-httpd" Dec 01 08:40:09 crc kubenswrapper[5004]: I1201 08:40:09.020849 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c36bd79-e85d-4f3a-acf8-28439a520611" containerName="glance-log" Dec 01 08:40:09 crc kubenswrapper[5004]: I1201 08:40:09.020860 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="92bb6b5b-96fd-416b-98a4-199390fd61b1" containerName="init" Dec 01 08:40:09 crc kubenswrapper[5004]: I1201 08:40:09.022412 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 01 08:40:09 crc kubenswrapper[5004]: I1201 08:40:09.024714 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 01 08:40:09 crc kubenswrapper[5004]: I1201 08:40:09.026229 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 01 08:40:09 crc kubenswrapper[5004]: I1201 08:40:09.036420 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 01 08:40:09 crc kubenswrapper[5004]: I1201 08:40:09.067901 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fa6628b9-be2f-4594-8767-5442e1d2f5b9-logs\") pod \"glance-default-external-api-0\" (UID: \"fa6628b9-be2f-4594-8767-5442e1d2f5b9\") " pod="openstack/glance-default-external-api-0" Dec 01 08:40:09 crc kubenswrapper[5004]: I1201 08:40:09.067964 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa6628b9-be2f-4594-8767-5442e1d2f5b9-config-data\") pod \"glance-default-external-api-0\" (UID: \"fa6628b9-be2f-4594-8767-5442e1d2f5b9\") " pod="openstack/glance-default-external-api-0" Dec 01 08:40:09 crc kubenswrapper[5004]: I1201 08:40:09.068036 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nv98t\" (UniqueName: \"kubernetes.io/projected/fa6628b9-be2f-4594-8767-5442e1d2f5b9-kube-api-access-nv98t\") pod \"glance-default-external-api-0\" (UID: \"fa6628b9-be2f-4594-8767-5442e1d2f5b9\") " pod="openstack/glance-default-external-api-0" Dec 01 08:40:09 crc kubenswrapper[5004]: I1201 08:40:09.068092 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/fa6628b9-be2f-4594-8767-5442e1d2f5b9-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"fa6628b9-be2f-4594-8767-5442e1d2f5b9\") " pod="openstack/glance-default-external-api-0" Dec 01 08:40:09 crc kubenswrapper[5004]: I1201 08:40:09.068405 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"fa6628b9-be2f-4594-8767-5442e1d2f5b9\") " pod="openstack/glance-default-external-api-0" Dec 01 08:40:09 crc kubenswrapper[5004]: I1201 08:40:09.068631 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa6628b9-be2f-4594-8767-5442e1d2f5b9-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"fa6628b9-be2f-4594-8767-5442e1d2f5b9\") " pod="openstack/glance-default-external-api-0" Dec 01 08:40:09 crc kubenswrapper[5004]: I1201 08:40:09.068745 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fa6628b9-be2f-4594-8767-5442e1d2f5b9-scripts\") pod \"glance-default-external-api-0\" (UID: \"fa6628b9-be2f-4594-8767-5442e1d2f5b9\") " pod="openstack/glance-default-external-api-0" Dec 01 08:40:09 crc kubenswrapper[5004]: I1201 08:40:09.068824 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fa6628b9-be2f-4594-8767-5442e1d2f5b9-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"fa6628b9-be2f-4594-8767-5442e1d2f5b9\") " pod="openstack/glance-default-external-api-0" Dec 01 08:40:09 crc kubenswrapper[5004]: I1201 08:40:09.170354 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/fa6628b9-be2f-4594-8767-5442e1d2f5b9-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"fa6628b9-be2f-4594-8767-5442e1d2f5b9\") " pod="openstack/glance-default-external-api-0" Dec 01 08:40:09 crc kubenswrapper[5004]: I1201 08:40:09.170455 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fa6628b9-be2f-4594-8767-5442e1d2f5b9-scripts\") pod \"glance-default-external-api-0\" (UID: \"fa6628b9-be2f-4594-8767-5442e1d2f5b9\") " pod="openstack/glance-default-external-api-0" Dec 01 08:40:09 crc kubenswrapper[5004]: I1201 08:40:09.171624 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fa6628b9-be2f-4594-8767-5442e1d2f5b9-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"fa6628b9-be2f-4594-8767-5442e1d2f5b9\") " pod="openstack/glance-default-external-api-0" Dec 01 08:40:09 crc kubenswrapper[5004]: I1201 08:40:09.171690 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fa6628b9-be2f-4594-8767-5442e1d2f5b9-logs\") pod \"glance-default-external-api-0\" (UID: \"fa6628b9-be2f-4594-8767-5442e1d2f5b9\") " pod="openstack/glance-default-external-api-0" Dec 01 08:40:09 crc kubenswrapper[5004]: I1201 08:40:09.171737 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa6628b9-be2f-4594-8767-5442e1d2f5b9-config-data\") pod \"glance-default-external-api-0\" (UID: \"fa6628b9-be2f-4594-8767-5442e1d2f5b9\") " pod="openstack/glance-default-external-api-0" Dec 01 08:40:09 crc kubenswrapper[5004]: I1201 08:40:09.171812 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nv98t\" (UniqueName: \"kubernetes.io/projected/fa6628b9-be2f-4594-8767-5442e1d2f5b9-kube-api-access-nv98t\") pod 
\"glance-default-external-api-0\" (UID: \"fa6628b9-be2f-4594-8767-5442e1d2f5b9\") " pod="openstack/glance-default-external-api-0" Dec 01 08:40:09 crc kubenswrapper[5004]: I1201 08:40:09.172466 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fa6628b9-be2f-4594-8767-5442e1d2f5b9-logs\") pod \"glance-default-external-api-0\" (UID: \"fa6628b9-be2f-4594-8767-5442e1d2f5b9\") " pod="openstack/glance-default-external-api-0" Dec 01 08:40:09 crc kubenswrapper[5004]: I1201 08:40:09.172543 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa6628b9-be2f-4594-8767-5442e1d2f5b9-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"fa6628b9-be2f-4594-8767-5442e1d2f5b9\") " pod="openstack/glance-default-external-api-0" Dec 01 08:40:09 crc kubenswrapper[5004]: I1201 08:40:09.172657 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"fa6628b9-be2f-4594-8767-5442e1d2f5b9\") " pod="openstack/glance-default-external-api-0" Dec 01 08:40:09 crc kubenswrapper[5004]: I1201 08:40:09.173310 5004 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"fa6628b9-be2f-4594-8767-5442e1d2f5b9\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/glance-default-external-api-0" Dec 01 08:40:09 crc kubenswrapper[5004]: I1201 08:40:09.173428 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fa6628b9-be2f-4594-8767-5442e1d2f5b9-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"fa6628b9-be2f-4594-8767-5442e1d2f5b9\") " 
pod="openstack/glance-default-external-api-0" Dec 01 08:40:09 crc kubenswrapper[5004]: I1201 08:40:09.177613 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fa6628b9-be2f-4594-8767-5442e1d2f5b9-scripts\") pod \"glance-default-external-api-0\" (UID: \"fa6628b9-be2f-4594-8767-5442e1d2f5b9\") " pod="openstack/glance-default-external-api-0" Dec 01 08:40:09 crc kubenswrapper[5004]: I1201 08:40:09.179547 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa6628b9-be2f-4594-8767-5442e1d2f5b9-config-data\") pod \"glance-default-external-api-0\" (UID: \"fa6628b9-be2f-4594-8767-5442e1d2f5b9\") " pod="openstack/glance-default-external-api-0" Dec 01 08:40:09 crc kubenswrapper[5004]: I1201 08:40:09.179551 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa6628b9-be2f-4594-8767-5442e1d2f5b9-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"fa6628b9-be2f-4594-8767-5442e1d2f5b9\") " pod="openstack/glance-default-external-api-0" Dec 01 08:40:09 crc kubenswrapper[5004]: I1201 08:40:09.188518 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa6628b9-be2f-4594-8767-5442e1d2f5b9-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"fa6628b9-be2f-4594-8767-5442e1d2f5b9\") " pod="openstack/glance-default-external-api-0" Dec 01 08:40:09 crc kubenswrapper[5004]: I1201 08:40:09.192348 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nv98t\" (UniqueName: \"kubernetes.io/projected/fa6628b9-be2f-4594-8767-5442e1d2f5b9-kube-api-access-nv98t\") pod \"glance-default-external-api-0\" (UID: \"fa6628b9-be2f-4594-8767-5442e1d2f5b9\") " pod="openstack/glance-default-external-api-0" Dec 01 08:40:09 crc kubenswrapper[5004]: 
I1201 08:40:09.213354 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"fa6628b9-be2f-4594-8767-5442e1d2f5b9\") " pod="openstack/glance-default-external-api-0" Dec 01 08:40:09 crc kubenswrapper[5004]: I1201 08:40:09.363764 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 01 08:40:10 crc kubenswrapper[5004]: I1201 08:40:10.344086 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57c957c4ff-9x9h5" Dec 01 08:40:10 crc kubenswrapper[5004]: I1201 08:40:10.407422 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-f2sxs"] Dec 01 08:40:10 crc kubenswrapper[5004]: I1201 08:40:10.407665 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-b8fbc5445-f2sxs" podUID="4e59b4e1-4729-4161-ad22-e11718c0c6fe" containerName="dnsmasq-dns" containerID="cri-o://e03e643932ac005453b74c50822abc37bafc358c6daaabb3f00e8cae6e89c37f" gracePeriod=10 Dec 01 08:40:10 crc kubenswrapper[5004]: I1201 08:40:10.663341 5004 generic.go:334] "Generic (PLEG): container finished" podID="4e59b4e1-4729-4161-ad22-e11718c0c6fe" containerID="e03e643932ac005453b74c50822abc37bafc358c6daaabb3f00e8cae6e89c37f" exitCode=0 Dec 01 08:40:10 crc kubenswrapper[5004]: I1201 08:40:10.663428 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-f2sxs" event={"ID":"4e59b4e1-4729-4161-ad22-e11718c0c6fe","Type":"ContainerDied","Data":"e03e643932ac005453b74c50822abc37bafc358c6daaabb3f00e8cae6e89c37f"} Dec 01 08:40:10 crc kubenswrapper[5004]: I1201 08:40:10.781905 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c36bd79-e85d-4f3a-acf8-28439a520611" 
path="/var/lib/kubelet/pods/3c36bd79-e85d-4f3a-acf8-28439a520611/volumes" Dec 01 08:40:12 crc kubenswrapper[5004]: I1201 08:40:12.564479 5004 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-f2sxs" podUID="4e59b4e1-4729-4161-ad22-e11718c0c6fe" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.150:5353: connect: connection refused" Dec 01 08:40:16 crc kubenswrapper[5004]: I1201 08:40:16.044711 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 01 08:40:16 crc kubenswrapper[5004]: I1201 08:40:16.168782 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g2n6d\" (UniqueName: \"kubernetes.io/projected/df95e914-4ee4-4ead-9306-4f2aa5b2c431-kube-api-access-g2n6d\") pod \"df95e914-4ee4-4ead-9306-4f2aa5b2c431\" (UID: \"df95e914-4ee4-4ead-9306-4f2aa5b2c431\") " Dec 01 08:40:16 crc kubenswrapper[5004]: I1201 08:40:16.168998 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"df95e914-4ee4-4ead-9306-4f2aa5b2c431\" (UID: \"df95e914-4ee4-4ead-9306-4f2aa5b2c431\") " Dec 01 08:40:16 crc kubenswrapper[5004]: I1201 08:40:16.169051 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/df95e914-4ee4-4ead-9306-4f2aa5b2c431-httpd-run\") pod \"df95e914-4ee4-4ead-9306-4f2aa5b2c431\" (UID: \"df95e914-4ee4-4ead-9306-4f2aa5b2c431\") " Dec 01 08:40:16 crc kubenswrapper[5004]: I1201 08:40:16.169130 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df95e914-4ee4-4ead-9306-4f2aa5b2c431-combined-ca-bundle\") pod \"df95e914-4ee4-4ead-9306-4f2aa5b2c431\" (UID: \"df95e914-4ee4-4ead-9306-4f2aa5b2c431\") " Dec 01 08:40:16 crc 
kubenswrapper[5004]: I1201 08:40:16.169164 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df95e914-4ee4-4ead-9306-4f2aa5b2c431-config-data\") pod \"df95e914-4ee4-4ead-9306-4f2aa5b2c431\" (UID: \"df95e914-4ee4-4ead-9306-4f2aa5b2c431\") " Dec 01 08:40:16 crc kubenswrapper[5004]: I1201 08:40:16.169221 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df95e914-4ee4-4ead-9306-4f2aa5b2c431-scripts\") pod \"df95e914-4ee4-4ead-9306-4f2aa5b2c431\" (UID: \"df95e914-4ee4-4ead-9306-4f2aa5b2c431\") " Dec 01 08:40:16 crc kubenswrapper[5004]: I1201 08:40:16.169239 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df95e914-4ee4-4ead-9306-4f2aa5b2c431-logs\") pod \"df95e914-4ee4-4ead-9306-4f2aa5b2c431\" (UID: \"df95e914-4ee4-4ead-9306-4f2aa5b2c431\") " Dec 01 08:40:16 crc kubenswrapper[5004]: I1201 08:40:16.169314 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/df95e914-4ee4-4ead-9306-4f2aa5b2c431-internal-tls-certs\") pod \"df95e914-4ee4-4ead-9306-4f2aa5b2c431\" (UID: \"df95e914-4ee4-4ead-9306-4f2aa5b2c431\") " Dec 01 08:40:16 crc kubenswrapper[5004]: I1201 08:40:16.169859 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df95e914-4ee4-4ead-9306-4f2aa5b2c431-logs" (OuterVolumeSpecName: "logs") pod "df95e914-4ee4-4ead-9306-4f2aa5b2c431" (UID: "df95e914-4ee4-4ead-9306-4f2aa5b2c431"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 08:40:16 crc kubenswrapper[5004]: I1201 08:40:16.170274 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df95e914-4ee4-4ead-9306-4f2aa5b2c431-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "df95e914-4ee4-4ead-9306-4f2aa5b2c431" (UID: "df95e914-4ee4-4ead-9306-4f2aa5b2c431"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 08:40:16 crc kubenswrapper[5004]: I1201 08:40:16.177842 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df95e914-4ee4-4ead-9306-4f2aa5b2c431-scripts" (OuterVolumeSpecName: "scripts") pod "df95e914-4ee4-4ead-9306-4f2aa5b2c431" (UID: "df95e914-4ee4-4ead-9306-4f2aa5b2c431"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:40:16 crc kubenswrapper[5004]: I1201 08:40:16.178722 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df95e914-4ee4-4ead-9306-4f2aa5b2c431-kube-api-access-g2n6d" (OuterVolumeSpecName: "kube-api-access-g2n6d") pod "df95e914-4ee4-4ead-9306-4f2aa5b2c431" (UID: "df95e914-4ee4-4ead-9306-4f2aa5b2c431"). InnerVolumeSpecName "kube-api-access-g2n6d". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:40:16 crc kubenswrapper[5004]: I1201 08:40:16.178956 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "glance") pod "df95e914-4ee4-4ead-9306-4f2aa5b2c431" (UID: "df95e914-4ee4-4ead-9306-4f2aa5b2c431"). InnerVolumeSpecName "local-storage04-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 01 08:40:16 crc kubenswrapper[5004]: I1201 08:40:16.208572 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df95e914-4ee4-4ead-9306-4f2aa5b2c431-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "df95e914-4ee4-4ead-9306-4f2aa5b2c431" (UID: "df95e914-4ee4-4ead-9306-4f2aa5b2c431"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:40:16 crc kubenswrapper[5004]: I1201 08:40:16.235252 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df95e914-4ee4-4ead-9306-4f2aa5b2c431-config-data" (OuterVolumeSpecName: "config-data") pod "df95e914-4ee4-4ead-9306-4f2aa5b2c431" (UID: "df95e914-4ee4-4ead-9306-4f2aa5b2c431"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:40:16 crc kubenswrapper[5004]: I1201 08:40:16.257826 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df95e914-4ee4-4ead-9306-4f2aa5b2c431-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "df95e914-4ee4-4ead-9306-4f2aa5b2c431" (UID: "df95e914-4ee4-4ead-9306-4f2aa5b2c431"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:40:16 crc kubenswrapper[5004]: I1201 08:40:16.272120 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g2n6d\" (UniqueName: \"kubernetes.io/projected/df95e914-4ee4-4ead-9306-4f2aa5b2c431-kube-api-access-g2n6d\") on node \"crc\" DevicePath \"\"" Dec 01 08:40:16 crc kubenswrapper[5004]: I1201 08:40:16.272166 5004 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Dec 01 08:40:16 crc kubenswrapper[5004]: I1201 08:40:16.272178 5004 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/df95e914-4ee4-4ead-9306-4f2aa5b2c431-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 01 08:40:16 crc kubenswrapper[5004]: I1201 08:40:16.272187 5004 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df95e914-4ee4-4ead-9306-4f2aa5b2c431-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 08:40:16 crc kubenswrapper[5004]: I1201 08:40:16.272195 5004 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df95e914-4ee4-4ead-9306-4f2aa5b2c431-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 08:40:16 crc kubenswrapper[5004]: I1201 08:40:16.272202 5004 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df95e914-4ee4-4ead-9306-4f2aa5b2c431-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 08:40:16 crc kubenswrapper[5004]: I1201 08:40:16.272212 5004 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df95e914-4ee4-4ead-9306-4f2aa5b2c431-logs\") on node \"crc\" DevicePath \"\"" Dec 01 08:40:16 crc kubenswrapper[5004]: I1201 08:40:16.272220 5004 reconciler_common.go:293] "Volume detached for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/df95e914-4ee4-4ead-9306-4f2aa5b2c431-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 01 08:40:16 crc kubenswrapper[5004]: I1201 08:40:16.300065 5004 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Dec 01 08:40:16 crc kubenswrapper[5004]: I1201 08:40:16.374416 5004 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Dec 01 08:40:16 crc kubenswrapper[5004]: I1201 08:40:16.743506 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"df95e914-4ee4-4ead-9306-4f2aa5b2c431","Type":"ContainerDied","Data":"a4b544cd0fcc4a8e0b8f51a7f849f13930b8648f58482c0413bd8b9b3ed03877"} Dec 01 08:40:16 crc kubenswrapper[5004]: I1201 08:40:16.743596 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 01 08:40:16 crc kubenswrapper[5004]: I1201 08:40:16.795863 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 01 08:40:16 crc kubenswrapper[5004]: I1201 08:40:16.820741 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 01 08:40:16 crc kubenswrapper[5004]: I1201 08:40:16.872913 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 01 08:40:16 crc kubenswrapper[5004]: E1201 08:40:16.873455 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df95e914-4ee4-4ead-9306-4f2aa5b2c431" containerName="glance-httpd" Dec 01 08:40:16 crc kubenswrapper[5004]: I1201 08:40:16.873475 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="df95e914-4ee4-4ead-9306-4f2aa5b2c431" containerName="glance-httpd" Dec 01 08:40:16 crc kubenswrapper[5004]: E1201 08:40:16.873494 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df95e914-4ee4-4ead-9306-4f2aa5b2c431" containerName="glance-log" Dec 01 08:40:16 crc kubenswrapper[5004]: I1201 08:40:16.873503 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="df95e914-4ee4-4ead-9306-4f2aa5b2c431" containerName="glance-log" Dec 01 08:40:16 crc kubenswrapper[5004]: I1201 08:40:16.873800 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="df95e914-4ee4-4ead-9306-4f2aa5b2c431" containerName="glance-httpd" Dec 01 08:40:16 crc kubenswrapper[5004]: I1201 08:40:16.873840 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="df95e914-4ee4-4ead-9306-4f2aa5b2c431" containerName="glance-log" Dec 01 08:40:16 crc kubenswrapper[5004]: I1201 08:40:16.875238 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 01 08:40:16 crc kubenswrapper[5004]: I1201 08:40:16.878614 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 01 08:40:16 crc kubenswrapper[5004]: I1201 08:40:16.878649 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 01 08:40:16 crc kubenswrapper[5004]: I1201 08:40:16.891955 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 01 08:40:16 crc kubenswrapper[5004]: I1201 08:40:16.993704 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/64235e39-5760-4a2e-a164-7cb27ca906a3-scripts\") pod \"glance-default-internal-api-0\" (UID: \"64235e39-5760-4a2e-a164-7cb27ca906a3\") " pod="openstack/glance-default-internal-api-0" Dec 01 08:40:16 crc kubenswrapper[5004]: I1201 08:40:16.993760 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64235e39-5760-4a2e-a164-7cb27ca906a3-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"64235e39-5760-4a2e-a164-7cb27ca906a3\") " pod="openstack/glance-default-internal-api-0" Dec 01 08:40:16 crc kubenswrapper[5004]: I1201 08:40:16.993910 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/64235e39-5760-4a2e-a164-7cb27ca906a3-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"64235e39-5760-4a2e-a164-7cb27ca906a3\") " pod="openstack/glance-default-internal-api-0" Dec 01 08:40:16 crc kubenswrapper[5004]: I1201 08:40:16.993961 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"64235e39-5760-4a2e-a164-7cb27ca906a3\") " pod="openstack/glance-default-internal-api-0" Dec 01 08:40:16 crc kubenswrapper[5004]: I1201 08:40:16.993982 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/64235e39-5760-4a2e-a164-7cb27ca906a3-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"64235e39-5760-4a2e-a164-7cb27ca906a3\") " pod="openstack/glance-default-internal-api-0" Dec 01 08:40:16 crc kubenswrapper[5004]: I1201 08:40:16.993997 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/64235e39-5760-4a2e-a164-7cb27ca906a3-logs\") pod \"glance-default-internal-api-0\" (UID: \"64235e39-5760-4a2e-a164-7cb27ca906a3\") " pod="openstack/glance-default-internal-api-0" Dec 01 08:40:16 crc kubenswrapper[5004]: I1201 08:40:16.994018 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twpdz\" (UniqueName: \"kubernetes.io/projected/64235e39-5760-4a2e-a164-7cb27ca906a3-kube-api-access-twpdz\") pod \"glance-default-internal-api-0\" (UID: \"64235e39-5760-4a2e-a164-7cb27ca906a3\") " pod="openstack/glance-default-internal-api-0" Dec 01 08:40:16 crc kubenswrapper[5004]: I1201 08:40:16.994087 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64235e39-5760-4a2e-a164-7cb27ca906a3-config-data\") pod \"glance-default-internal-api-0\" (UID: \"64235e39-5760-4a2e-a164-7cb27ca906a3\") " pod="openstack/glance-default-internal-api-0" Dec 01 08:40:17 crc kubenswrapper[5004]: I1201 08:40:17.095477 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/64235e39-5760-4a2e-a164-7cb27ca906a3-scripts\") pod \"glance-default-internal-api-0\" (UID: \"64235e39-5760-4a2e-a164-7cb27ca906a3\") " pod="openstack/glance-default-internal-api-0" Dec 01 08:40:17 crc kubenswrapper[5004]: I1201 08:40:17.095536 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64235e39-5760-4a2e-a164-7cb27ca906a3-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"64235e39-5760-4a2e-a164-7cb27ca906a3\") " pod="openstack/glance-default-internal-api-0" Dec 01 08:40:17 crc kubenswrapper[5004]: I1201 08:40:17.095631 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/64235e39-5760-4a2e-a164-7cb27ca906a3-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"64235e39-5760-4a2e-a164-7cb27ca906a3\") " pod="openstack/glance-default-internal-api-0" Dec 01 08:40:17 crc kubenswrapper[5004]: I1201 08:40:17.095678 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"64235e39-5760-4a2e-a164-7cb27ca906a3\") " pod="openstack/glance-default-internal-api-0" Dec 01 08:40:17 crc kubenswrapper[5004]: I1201 08:40:17.095706 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/64235e39-5760-4a2e-a164-7cb27ca906a3-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"64235e39-5760-4a2e-a164-7cb27ca906a3\") " pod="openstack/glance-default-internal-api-0" Dec 01 08:40:17 crc kubenswrapper[5004]: I1201 08:40:17.095731 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/64235e39-5760-4a2e-a164-7cb27ca906a3-logs\") pod 
\"glance-default-internal-api-0\" (UID: \"64235e39-5760-4a2e-a164-7cb27ca906a3\") " pod="openstack/glance-default-internal-api-0" Dec 01 08:40:17 crc kubenswrapper[5004]: I1201 08:40:17.095759 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-twpdz\" (UniqueName: \"kubernetes.io/projected/64235e39-5760-4a2e-a164-7cb27ca906a3-kube-api-access-twpdz\") pod \"glance-default-internal-api-0\" (UID: \"64235e39-5760-4a2e-a164-7cb27ca906a3\") " pod="openstack/glance-default-internal-api-0" Dec 01 08:40:17 crc kubenswrapper[5004]: I1201 08:40:17.095836 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64235e39-5760-4a2e-a164-7cb27ca906a3-config-data\") pod \"glance-default-internal-api-0\" (UID: \"64235e39-5760-4a2e-a164-7cb27ca906a3\") " pod="openstack/glance-default-internal-api-0" Dec 01 08:40:17 crc kubenswrapper[5004]: I1201 08:40:17.096663 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/64235e39-5760-4a2e-a164-7cb27ca906a3-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"64235e39-5760-4a2e-a164-7cb27ca906a3\") " pod="openstack/glance-default-internal-api-0" Dec 01 08:40:17 crc kubenswrapper[5004]: I1201 08:40:17.096700 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/64235e39-5760-4a2e-a164-7cb27ca906a3-logs\") pod \"glance-default-internal-api-0\" (UID: \"64235e39-5760-4a2e-a164-7cb27ca906a3\") " pod="openstack/glance-default-internal-api-0" Dec 01 08:40:17 crc kubenswrapper[5004]: I1201 08:40:17.096947 5004 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"64235e39-5760-4a2e-a164-7cb27ca906a3\") device mount path 
\"/mnt/openstack/pv04\"" pod="openstack/glance-default-internal-api-0" Dec 01 08:40:17 crc kubenswrapper[5004]: I1201 08:40:17.099706 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/64235e39-5760-4a2e-a164-7cb27ca906a3-scripts\") pod \"glance-default-internal-api-0\" (UID: \"64235e39-5760-4a2e-a164-7cb27ca906a3\") " pod="openstack/glance-default-internal-api-0" Dec 01 08:40:17 crc kubenswrapper[5004]: I1201 08:40:17.104234 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64235e39-5760-4a2e-a164-7cb27ca906a3-config-data\") pod \"glance-default-internal-api-0\" (UID: \"64235e39-5760-4a2e-a164-7cb27ca906a3\") " pod="openstack/glance-default-internal-api-0" Dec 01 08:40:17 crc kubenswrapper[5004]: I1201 08:40:17.105320 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64235e39-5760-4a2e-a164-7cb27ca906a3-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"64235e39-5760-4a2e-a164-7cb27ca906a3\") " pod="openstack/glance-default-internal-api-0" Dec 01 08:40:17 crc kubenswrapper[5004]: I1201 08:40:17.107124 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/64235e39-5760-4a2e-a164-7cb27ca906a3-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"64235e39-5760-4a2e-a164-7cb27ca906a3\") " pod="openstack/glance-default-internal-api-0" Dec 01 08:40:17 crc kubenswrapper[5004]: I1201 08:40:17.124448 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-twpdz\" (UniqueName: \"kubernetes.io/projected/64235e39-5760-4a2e-a164-7cb27ca906a3-kube-api-access-twpdz\") pod \"glance-default-internal-api-0\" (UID: \"64235e39-5760-4a2e-a164-7cb27ca906a3\") " pod="openstack/glance-default-internal-api-0" Dec 01 08:40:17 
crc kubenswrapper[5004]: I1201 08:40:17.135860 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"64235e39-5760-4a2e-a164-7cb27ca906a3\") " pod="openstack/glance-default-internal-api-0" Dec 01 08:40:17 crc kubenswrapper[5004]: I1201 08:40:17.221506 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 01 08:40:17 crc kubenswrapper[5004]: I1201 08:40:17.563236 5004 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-f2sxs" podUID="4e59b4e1-4729-4161-ad22-e11718c0c6fe" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.150:5353: connect: connection refused" Dec 01 08:40:17 crc kubenswrapper[5004]: I1201 08:40:17.993656 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-bkslc" Dec 01 08:40:18 crc kubenswrapper[5004]: I1201 08:40:18.147287 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47efa747-2a3f-4e7f-b1c2-222dd039c1fe-combined-ca-bundle\") pod \"47efa747-2a3f-4e7f-b1c2-222dd039c1fe\" (UID: \"47efa747-2a3f-4e7f-b1c2-222dd039c1fe\") " Dec 01 08:40:18 crc kubenswrapper[5004]: I1201 08:40:18.147363 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47efa747-2a3f-4e7f-b1c2-222dd039c1fe-scripts\") pod \"47efa747-2a3f-4e7f-b1c2-222dd039c1fe\" (UID: \"47efa747-2a3f-4e7f-b1c2-222dd039c1fe\") " Dec 01 08:40:18 crc kubenswrapper[5004]: I1201 08:40:18.147429 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5th24\" (UniqueName: 
\"kubernetes.io/projected/47efa747-2a3f-4e7f-b1c2-222dd039c1fe-kube-api-access-5th24\") pod \"47efa747-2a3f-4e7f-b1c2-222dd039c1fe\" (UID: \"47efa747-2a3f-4e7f-b1c2-222dd039c1fe\") " Dec 01 08:40:18 crc kubenswrapper[5004]: I1201 08:40:18.147640 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/47efa747-2a3f-4e7f-b1c2-222dd039c1fe-fernet-keys\") pod \"47efa747-2a3f-4e7f-b1c2-222dd039c1fe\" (UID: \"47efa747-2a3f-4e7f-b1c2-222dd039c1fe\") " Dec 01 08:40:18 crc kubenswrapper[5004]: I1201 08:40:18.147785 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47efa747-2a3f-4e7f-b1c2-222dd039c1fe-config-data\") pod \"47efa747-2a3f-4e7f-b1c2-222dd039c1fe\" (UID: \"47efa747-2a3f-4e7f-b1c2-222dd039c1fe\") " Dec 01 08:40:18 crc kubenswrapper[5004]: I1201 08:40:18.147850 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/47efa747-2a3f-4e7f-b1c2-222dd039c1fe-credential-keys\") pod \"47efa747-2a3f-4e7f-b1c2-222dd039c1fe\" (UID: \"47efa747-2a3f-4e7f-b1c2-222dd039c1fe\") " Dec 01 08:40:18 crc kubenswrapper[5004]: I1201 08:40:18.155828 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47efa747-2a3f-4e7f-b1c2-222dd039c1fe-scripts" (OuterVolumeSpecName: "scripts") pod "47efa747-2a3f-4e7f-b1c2-222dd039c1fe" (UID: "47efa747-2a3f-4e7f-b1c2-222dd039c1fe"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:40:18 crc kubenswrapper[5004]: I1201 08:40:18.155910 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47efa747-2a3f-4e7f-b1c2-222dd039c1fe-kube-api-access-5th24" (OuterVolumeSpecName: "kube-api-access-5th24") pod "47efa747-2a3f-4e7f-b1c2-222dd039c1fe" (UID: "47efa747-2a3f-4e7f-b1c2-222dd039c1fe"). InnerVolumeSpecName "kube-api-access-5th24". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:40:18 crc kubenswrapper[5004]: I1201 08:40:18.156536 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47efa747-2a3f-4e7f-b1c2-222dd039c1fe-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "47efa747-2a3f-4e7f-b1c2-222dd039c1fe" (UID: "47efa747-2a3f-4e7f-b1c2-222dd039c1fe"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:40:18 crc kubenswrapper[5004]: I1201 08:40:18.166042 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47efa747-2a3f-4e7f-b1c2-222dd039c1fe-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "47efa747-2a3f-4e7f-b1c2-222dd039c1fe" (UID: "47efa747-2a3f-4e7f-b1c2-222dd039c1fe"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:40:18 crc kubenswrapper[5004]: I1201 08:40:18.195745 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47efa747-2a3f-4e7f-b1c2-222dd039c1fe-config-data" (OuterVolumeSpecName: "config-data") pod "47efa747-2a3f-4e7f-b1c2-222dd039c1fe" (UID: "47efa747-2a3f-4e7f-b1c2-222dd039c1fe"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:40:18 crc kubenswrapper[5004]: I1201 08:40:18.222585 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47efa747-2a3f-4e7f-b1c2-222dd039c1fe-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "47efa747-2a3f-4e7f-b1c2-222dd039c1fe" (UID: "47efa747-2a3f-4e7f-b1c2-222dd039c1fe"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:40:18 crc kubenswrapper[5004]: I1201 08:40:18.250586 5004 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47efa747-2a3f-4e7f-b1c2-222dd039c1fe-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 08:40:18 crc kubenswrapper[5004]: I1201 08:40:18.250866 5004 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/47efa747-2a3f-4e7f-b1c2-222dd039c1fe-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 01 08:40:18 crc kubenswrapper[5004]: I1201 08:40:18.250878 5004 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47efa747-2a3f-4e7f-b1c2-222dd039c1fe-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 08:40:18 crc kubenswrapper[5004]: I1201 08:40:18.250887 5004 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47efa747-2a3f-4e7f-b1c2-222dd039c1fe-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 08:40:18 crc kubenswrapper[5004]: I1201 08:40:18.250896 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5th24\" (UniqueName: \"kubernetes.io/projected/47efa747-2a3f-4e7f-b1c2-222dd039c1fe-kube-api-access-5th24\") on node \"crc\" DevicePath \"\"" Dec 01 08:40:18 crc kubenswrapper[5004]: I1201 08:40:18.250904 5004 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/47efa747-2a3f-4e7f-b1c2-222dd039c1fe-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 01 08:40:18 crc kubenswrapper[5004]: I1201 08:40:18.766409 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-bkslc" Dec 01 08:40:18 crc kubenswrapper[5004]: I1201 08:40:18.775170 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df95e914-4ee4-4ead-9306-4f2aa5b2c431" path="/var/lib/kubelet/pods/df95e914-4ee4-4ead-9306-4f2aa5b2c431/volumes" Dec 01 08:40:18 crc kubenswrapper[5004]: I1201 08:40:18.777045 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-bkslc" event={"ID":"47efa747-2a3f-4e7f-b1c2-222dd039c1fe","Type":"ContainerDied","Data":"c0c08ee3ae527b0a970d28f9069cde22587ba80161f470e8def06e995988a6ef"} Dec 01 08:40:18 crc kubenswrapper[5004]: I1201 08:40:18.777098 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c0c08ee3ae527b0a970d28f9069cde22587ba80161f470e8def06e995988a6ef" Dec 01 08:40:19 crc kubenswrapper[5004]: I1201 08:40:19.094529 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-bkslc"] Dec 01 08:40:19 crc kubenswrapper[5004]: I1201 08:40:19.106844 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-bkslc"] Dec 01 08:40:19 crc kubenswrapper[5004]: I1201 08:40:19.191035 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-hlrzt"] Dec 01 08:40:19 crc kubenswrapper[5004]: E1201 08:40:19.191638 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47efa747-2a3f-4e7f-b1c2-222dd039c1fe" containerName="keystone-bootstrap" Dec 01 08:40:19 crc kubenswrapper[5004]: I1201 08:40:19.191658 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="47efa747-2a3f-4e7f-b1c2-222dd039c1fe" containerName="keystone-bootstrap" Dec 01 08:40:19 crc kubenswrapper[5004]: I1201 
08:40:19.192028 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="47efa747-2a3f-4e7f-b1c2-222dd039c1fe" containerName="keystone-bootstrap" Dec 01 08:40:19 crc kubenswrapper[5004]: I1201 08:40:19.193125 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-hlrzt" Dec 01 08:40:19 crc kubenswrapper[5004]: I1201 08:40:19.195279 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-nlkxt" Dec 01 08:40:19 crc kubenswrapper[5004]: I1201 08:40:19.195325 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 01 08:40:19 crc kubenswrapper[5004]: I1201 08:40:19.195367 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 01 08:40:19 crc kubenswrapper[5004]: I1201 08:40:19.195806 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 01 08:40:19 crc kubenswrapper[5004]: I1201 08:40:19.196556 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 01 08:40:19 crc kubenswrapper[5004]: I1201 08:40:19.207529 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-hlrzt"] Dec 01 08:40:19 crc kubenswrapper[5004]: I1201 08:40:19.270587 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/713ee9f1-6421-4ffa-aed2-5f762d8cba63-combined-ca-bundle\") pod \"keystone-bootstrap-hlrzt\" (UID: \"713ee9f1-6421-4ffa-aed2-5f762d8cba63\") " pod="openstack/keystone-bootstrap-hlrzt" Dec 01 08:40:19 crc kubenswrapper[5004]: I1201 08:40:19.270654 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/713ee9f1-6421-4ffa-aed2-5f762d8cba63-scripts\") pod 
\"keystone-bootstrap-hlrzt\" (UID: \"713ee9f1-6421-4ffa-aed2-5f762d8cba63\") " pod="openstack/keystone-bootstrap-hlrzt" Dec 01 08:40:19 crc kubenswrapper[5004]: I1201 08:40:19.270779 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/713ee9f1-6421-4ffa-aed2-5f762d8cba63-credential-keys\") pod \"keystone-bootstrap-hlrzt\" (UID: \"713ee9f1-6421-4ffa-aed2-5f762d8cba63\") " pod="openstack/keystone-bootstrap-hlrzt" Dec 01 08:40:19 crc kubenswrapper[5004]: I1201 08:40:19.270848 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/713ee9f1-6421-4ffa-aed2-5f762d8cba63-fernet-keys\") pod \"keystone-bootstrap-hlrzt\" (UID: \"713ee9f1-6421-4ffa-aed2-5f762d8cba63\") " pod="openstack/keystone-bootstrap-hlrzt" Dec 01 08:40:19 crc kubenswrapper[5004]: I1201 08:40:19.271183 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/713ee9f1-6421-4ffa-aed2-5f762d8cba63-config-data\") pod \"keystone-bootstrap-hlrzt\" (UID: \"713ee9f1-6421-4ffa-aed2-5f762d8cba63\") " pod="openstack/keystone-bootstrap-hlrzt" Dec 01 08:40:19 crc kubenswrapper[5004]: I1201 08:40:19.271359 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plz57\" (UniqueName: \"kubernetes.io/projected/713ee9f1-6421-4ffa-aed2-5f762d8cba63-kube-api-access-plz57\") pod \"keystone-bootstrap-hlrzt\" (UID: \"713ee9f1-6421-4ffa-aed2-5f762d8cba63\") " pod="openstack/keystone-bootstrap-hlrzt" Dec 01 08:40:19 crc kubenswrapper[5004]: I1201 08:40:19.373350 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/713ee9f1-6421-4ffa-aed2-5f762d8cba63-config-data\") pod \"keystone-bootstrap-hlrzt\" 
(UID: \"713ee9f1-6421-4ffa-aed2-5f762d8cba63\") " pod="openstack/keystone-bootstrap-hlrzt" Dec 01 08:40:19 crc kubenswrapper[5004]: I1201 08:40:19.373814 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-plz57\" (UniqueName: \"kubernetes.io/projected/713ee9f1-6421-4ffa-aed2-5f762d8cba63-kube-api-access-plz57\") pod \"keystone-bootstrap-hlrzt\" (UID: \"713ee9f1-6421-4ffa-aed2-5f762d8cba63\") " pod="openstack/keystone-bootstrap-hlrzt" Dec 01 08:40:19 crc kubenswrapper[5004]: I1201 08:40:19.374035 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/713ee9f1-6421-4ffa-aed2-5f762d8cba63-combined-ca-bundle\") pod \"keystone-bootstrap-hlrzt\" (UID: \"713ee9f1-6421-4ffa-aed2-5f762d8cba63\") " pod="openstack/keystone-bootstrap-hlrzt" Dec 01 08:40:19 crc kubenswrapper[5004]: I1201 08:40:19.374203 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/713ee9f1-6421-4ffa-aed2-5f762d8cba63-scripts\") pod \"keystone-bootstrap-hlrzt\" (UID: \"713ee9f1-6421-4ffa-aed2-5f762d8cba63\") " pod="openstack/keystone-bootstrap-hlrzt" Dec 01 08:40:19 crc kubenswrapper[5004]: I1201 08:40:19.374403 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/713ee9f1-6421-4ffa-aed2-5f762d8cba63-credential-keys\") pod \"keystone-bootstrap-hlrzt\" (UID: \"713ee9f1-6421-4ffa-aed2-5f762d8cba63\") " pod="openstack/keystone-bootstrap-hlrzt" Dec 01 08:40:19 crc kubenswrapper[5004]: I1201 08:40:19.374587 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/713ee9f1-6421-4ffa-aed2-5f762d8cba63-fernet-keys\") pod \"keystone-bootstrap-hlrzt\" (UID: \"713ee9f1-6421-4ffa-aed2-5f762d8cba63\") " pod="openstack/keystone-bootstrap-hlrzt" Dec 01 
08:40:19 crc kubenswrapper[5004]: I1201 08:40:19.378693 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/713ee9f1-6421-4ffa-aed2-5f762d8cba63-scripts\") pod \"keystone-bootstrap-hlrzt\" (UID: \"713ee9f1-6421-4ffa-aed2-5f762d8cba63\") " pod="openstack/keystone-bootstrap-hlrzt" Dec 01 08:40:19 crc kubenswrapper[5004]: I1201 08:40:19.378774 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/713ee9f1-6421-4ffa-aed2-5f762d8cba63-config-data\") pod \"keystone-bootstrap-hlrzt\" (UID: \"713ee9f1-6421-4ffa-aed2-5f762d8cba63\") " pod="openstack/keystone-bootstrap-hlrzt" Dec 01 08:40:19 crc kubenswrapper[5004]: I1201 08:40:19.383989 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/713ee9f1-6421-4ffa-aed2-5f762d8cba63-credential-keys\") pod \"keystone-bootstrap-hlrzt\" (UID: \"713ee9f1-6421-4ffa-aed2-5f762d8cba63\") " pod="openstack/keystone-bootstrap-hlrzt" Dec 01 08:40:19 crc kubenswrapper[5004]: I1201 08:40:19.384020 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/713ee9f1-6421-4ffa-aed2-5f762d8cba63-fernet-keys\") pod \"keystone-bootstrap-hlrzt\" (UID: \"713ee9f1-6421-4ffa-aed2-5f762d8cba63\") " pod="openstack/keystone-bootstrap-hlrzt" Dec 01 08:40:19 crc kubenswrapper[5004]: I1201 08:40:19.387039 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/713ee9f1-6421-4ffa-aed2-5f762d8cba63-combined-ca-bundle\") pod \"keystone-bootstrap-hlrzt\" (UID: \"713ee9f1-6421-4ffa-aed2-5f762d8cba63\") " pod="openstack/keystone-bootstrap-hlrzt" Dec 01 08:40:19 crc kubenswrapper[5004]: I1201 08:40:19.395239 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-plz57\" 
(UniqueName: \"kubernetes.io/projected/713ee9f1-6421-4ffa-aed2-5f762d8cba63-kube-api-access-plz57\") pod \"keystone-bootstrap-hlrzt\" (UID: \"713ee9f1-6421-4ffa-aed2-5f762d8cba63\") " pod="openstack/keystone-bootstrap-hlrzt" Dec 01 08:40:19 crc kubenswrapper[5004]: I1201 08:40:19.514444 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-hlrzt" Dec 01 08:40:20 crc kubenswrapper[5004]: E1201 08:40:20.455265 5004 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Dec 01 08:40:20 crc kubenswrapper[5004]: E1201 08:40:20.455621 5004 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lv29d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:
[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-vgmk8_openstack(0eccea77-d6ee-4592-ad47-1f29ca2a943b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 08:40:20 crc kubenswrapper[5004]: E1201 08:40:20.457289 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-vgmk8" podUID="0eccea77-d6ee-4592-ad47-1f29ca2a943b" Dec 01 08:40:20 crc kubenswrapper[5004]: I1201 08:40:20.775951 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47efa747-2a3f-4e7f-b1c2-222dd039c1fe" path="/var/lib/kubelet/pods/47efa747-2a3f-4e7f-b1c2-222dd039c1fe/volumes" Dec 01 08:40:20 crc kubenswrapper[5004]: E1201 08:40:20.792726 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-vgmk8" podUID="0eccea77-d6ee-4592-ad47-1f29ca2a943b" Dec 01 08:40:21 crc kubenswrapper[5004]: I1201 08:40:21.803742 5004 generic.go:334] "Generic (PLEG): container finished" podID="87079b8d-839c-42d2-95d1-33dee4ca61e1" containerID="2e5a177f0147bb0a08e716773ead0c5f7639009e7d855103ecf90be8667d222d" exitCode=0 Dec 01 08:40:21 crc kubenswrapper[5004]: I1201 08:40:21.803793 5004 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/neutron-db-sync-68h6f" event={"ID":"87079b8d-839c-42d2-95d1-33dee4ca61e1","Type":"ContainerDied","Data":"2e5a177f0147bb0a08e716773ead0c5f7639009e7d855103ecf90be8667d222d"} Dec 01 08:40:27 crc kubenswrapper[5004]: I1201 08:40:27.564727 5004 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-f2sxs" podUID="4e59b4e1-4729-4161-ad22-e11718c0c6fe" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.150:5353: i/o timeout" Dec 01 08:40:27 crc kubenswrapper[5004]: I1201 08:40:27.565790 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b8fbc5445-f2sxs" Dec 01 08:40:27 crc kubenswrapper[5004]: E1201 08:40:27.925486 5004 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified" Dec 01 08:40:27 crc kubenswrapper[5004]: E1201 08:40:27.925669 5004 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:heat-db-sync,Image:quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified,Command:[/bin/bash],Args:[-c /usr/bin/heat-manage --config-dir /etc/heat/heat.conf.d 
db_sync],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/heat/heat.conf.d/00-default.conf,SubPath:00-default.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/heat/heat.conf.d/01-custom.conf,SubPath:01-custom.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gs5vj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42418,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*42418,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-db-sync-cwgb2_openstack(fb372dfc-6007-42ba-bc16-96f7d99d8b98): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 
01 08:40:27 crc kubenswrapper[5004]: E1201 08:40:27.926872 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/heat-db-sync-cwgb2" podUID="fb372dfc-6007-42ba-bc16-96f7d99d8b98" Dec 01 08:40:28 crc kubenswrapper[5004]: E1201 08:40:28.253226 5004 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified" Dec 01 08:40:28 crc kubenswrapper[5004]: E1201 08:40:28.253454 5004 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n656h65bhf4h68dh657h567h649hdch5d6h686h59dh5b9h7bh64h66fh689h559h5bfh5f9h85hc4hbchcbhd5h55ch6hf6h547h67h5f9h664h77q,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,Recursive
ReadOnly:nil,},VolumeMount{Name:kube-api-access-2mfkl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 /var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(1fd8d826-4dab-4d07-bc04-be5dfebdaf2b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 08:40:28 crc kubenswrapper[5004]: I1201 08:40:28.269373 5004 scope.go:117] "RemoveContainer" containerID="260be28cadb69a876e5e2fb0ea3a8703952eccc9ec875bac39213f374c659229" Dec 01 08:40:28 crc kubenswrapper[5004]: I1201 08:40:28.383011 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-68h6f" Dec 01 08:40:28 crc kubenswrapper[5004]: I1201 08:40:28.396313 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-f2sxs" Dec 01 08:40:28 crc kubenswrapper[5004]: I1201 08:40:28.403656 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/87079b8d-839c-42d2-95d1-33dee4ca61e1-config\") pod \"87079b8d-839c-42d2-95d1-33dee4ca61e1\" (UID: \"87079b8d-839c-42d2-95d1-33dee4ca61e1\") " Dec 01 08:40:28 crc kubenswrapper[5004]: I1201 08:40:28.403827 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87079b8d-839c-42d2-95d1-33dee4ca61e1-combined-ca-bundle\") pod \"87079b8d-839c-42d2-95d1-33dee4ca61e1\" (UID: \"87079b8d-839c-42d2-95d1-33dee4ca61e1\") " Dec 01 08:40:28 crc kubenswrapper[5004]: I1201 08:40:28.403959 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fwgfm\" (UniqueName: \"kubernetes.io/projected/87079b8d-839c-42d2-95d1-33dee4ca61e1-kube-api-access-fwgfm\") pod \"87079b8d-839c-42d2-95d1-33dee4ca61e1\" (UID: \"87079b8d-839c-42d2-95d1-33dee4ca61e1\") " Dec 01 08:40:28 crc kubenswrapper[5004]: I1201 08:40:28.404020 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4e59b4e1-4729-4161-ad22-e11718c0c6fe-ovsdbserver-sb\") pod \"4e59b4e1-4729-4161-ad22-e11718c0c6fe\" (UID: \"4e59b4e1-4729-4161-ad22-e11718c0c6fe\") " Dec 01 08:40:28 crc kubenswrapper[5004]: I1201 08:40:28.408609 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87079b8d-839c-42d2-95d1-33dee4ca61e1-kube-api-access-fwgfm" (OuterVolumeSpecName: "kube-api-access-fwgfm") pod "87079b8d-839c-42d2-95d1-33dee4ca61e1" (UID: "87079b8d-839c-42d2-95d1-33dee4ca61e1"). InnerVolumeSpecName "kube-api-access-fwgfm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:40:28 crc kubenswrapper[5004]: I1201 08:40:28.474673 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87079b8d-839c-42d2-95d1-33dee4ca61e1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "87079b8d-839c-42d2-95d1-33dee4ca61e1" (UID: "87079b8d-839c-42d2-95d1-33dee4ca61e1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:40:28 crc kubenswrapper[5004]: I1201 08:40:28.487828 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e59b4e1-4729-4161-ad22-e11718c0c6fe-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4e59b4e1-4729-4161-ad22-e11718c0c6fe" (UID: "4e59b4e1-4729-4161-ad22-e11718c0c6fe"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:40:28 crc kubenswrapper[5004]: I1201 08:40:28.488726 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87079b8d-839c-42d2-95d1-33dee4ca61e1-config" (OuterVolumeSpecName: "config") pod "87079b8d-839c-42d2-95d1-33dee4ca61e1" (UID: "87079b8d-839c-42d2-95d1-33dee4ca61e1"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:40:28 crc kubenswrapper[5004]: I1201 08:40:28.505943 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e59b4e1-4729-4161-ad22-e11718c0c6fe-config\") pod \"4e59b4e1-4729-4161-ad22-e11718c0c6fe\" (UID: \"4e59b4e1-4729-4161-ad22-e11718c0c6fe\") " Dec 01 08:40:28 crc kubenswrapper[5004]: I1201 08:40:28.506020 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4e59b4e1-4729-4161-ad22-e11718c0c6fe-dns-svc\") pod \"4e59b4e1-4729-4161-ad22-e11718c0c6fe\" (UID: \"4e59b4e1-4729-4161-ad22-e11718c0c6fe\") " Dec 01 08:40:28 crc kubenswrapper[5004]: I1201 08:40:28.506053 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4e59b4e1-4729-4161-ad22-e11718c0c6fe-ovsdbserver-nb\") pod \"4e59b4e1-4729-4161-ad22-e11718c0c6fe\" (UID: \"4e59b4e1-4729-4161-ad22-e11718c0c6fe\") " Dec 01 08:40:28 crc kubenswrapper[5004]: I1201 08:40:28.506114 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m9w2h\" (UniqueName: \"kubernetes.io/projected/4e59b4e1-4729-4161-ad22-e11718c0c6fe-kube-api-access-m9w2h\") pod \"4e59b4e1-4729-4161-ad22-e11718c0c6fe\" (UID: \"4e59b4e1-4729-4161-ad22-e11718c0c6fe\") " Dec 01 08:40:28 crc kubenswrapper[5004]: I1201 08:40:28.507284 5004 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/87079b8d-839c-42d2-95d1-33dee4ca61e1-config\") on node \"crc\" DevicePath \"\"" Dec 01 08:40:28 crc kubenswrapper[5004]: I1201 08:40:28.507320 5004 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87079b8d-839c-42d2-95d1-33dee4ca61e1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 08:40:28 crc 
kubenswrapper[5004]: I1201 08:40:28.507341 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fwgfm\" (UniqueName: \"kubernetes.io/projected/87079b8d-839c-42d2-95d1-33dee4ca61e1-kube-api-access-fwgfm\") on node \"crc\" DevicePath \"\"" Dec 01 08:40:28 crc kubenswrapper[5004]: I1201 08:40:28.507358 5004 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4e59b4e1-4729-4161-ad22-e11718c0c6fe-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 01 08:40:28 crc kubenswrapper[5004]: I1201 08:40:28.510768 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e59b4e1-4729-4161-ad22-e11718c0c6fe-kube-api-access-m9w2h" (OuterVolumeSpecName: "kube-api-access-m9w2h") pod "4e59b4e1-4729-4161-ad22-e11718c0c6fe" (UID: "4e59b4e1-4729-4161-ad22-e11718c0c6fe"). InnerVolumeSpecName "kube-api-access-m9w2h". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:40:28 crc kubenswrapper[5004]: I1201 08:40:28.552918 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e59b4e1-4729-4161-ad22-e11718c0c6fe-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4e59b4e1-4729-4161-ad22-e11718c0c6fe" (UID: "4e59b4e1-4729-4161-ad22-e11718c0c6fe"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:40:28 crc kubenswrapper[5004]: I1201 08:40:28.556949 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e59b4e1-4729-4161-ad22-e11718c0c6fe-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4e59b4e1-4729-4161-ad22-e11718c0c6fe" (UID: "4e59b4e1-4729-4161-ad22-e11718c0c6fe"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:40:28 crc kubenswrapper[5004]: I1201 08:40:28.569372 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e59b4e1-4729-4161-ad22-e11718c0c6fe-config" (OuterVolumeSpecName: "config") pod "4e59b4e1-4729-4161-ad22-e11718c0c6fe" (UID: "4e59b4e1-4729-4161-ad22-e11718c0c6fe"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:40:28 crc kubenswrapper[5004]: I1201 08:40:28.609215 5004 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e59b4e1-4729-4161-ad22-e11718c0c6fe-config\") on node \"crc\" DevicePath \"\"" Dec 01 08:40:28 crc kubenswrapper[5004]: I1201 08:40:28.609256 5004 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4e59b4e1-4729-4161-ad22-e11718c0c6fe-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 08:40:28 crc kubenswrapper[5004]: I1201 08:40:28.609269 5004 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4e59b4e1-4729-4161-ad22-e11718c0c6fe-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 01 08:40:28 crc kubenswrapper[5004]: I1201 08:40:28.609284 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m9w2h\" (UniqueName: \"kubernetes.io/projected/4e59b4e1-4729-4161-ad22-e11718c0c6fe-kube-api-access-m9w2h\") on node \"crc\" DevicePath \"\"" Dec 01 08:40:28 crc kubenswrapper[5004]: I1201 08:40:28.883932 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-f2sxs" event={"ID":"4e59b4e1-4729-4161-ad22-e11718c0c6fe","Type":"ContainerDied","Data":"c139241a209375b7c7ce0fa66ad09eb4f88af8e6252175a970edc88d41e70319"} Dec 01 08:40:28 crc kubenswrapper[5004]: I1201 08:40:28.884012 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-f2sxs" Dec 01 08:40:28 crc kubenswrapper[5004]: I1201 08:40:28.885523 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-68h6f" Dec 01 08:40:28 crc kubenswrapper[5004]: I1201 08:40:28.885527 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-68h6f" event={"ID":"87079b8d-839c-42d2-95d1-33dee4ca61e1","Type":"ContainerDied","Data":"d8316539ebf2a931dc44bea7c691f840d8c5da98f31934274c0676be3f66dd95"} Dec 01 08:40:28 crc kubenswrapper[5004]: I1201 08:40:28.885591 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d8316539ebf2a931dc44bea7c691f840d8c5da98f31934274c0676be3f66dd95" Dec 01 08:40:28 crc kubenswrapper[5004]: E1201 08:40:28.888886 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified\\\"\"" pod="openstack/heat-db-sync-cwgb2" podUID="fb372dfc-6007-42ba-bc16-96f7d99d8b98" Dec 01 08:40:28 crc kubenswrapper[5004]: I1201 08:40:28.947831 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-f2sxs"] Dec 01 08:40:28 crc kubenswrapper[5004]: I1201 08:40:28.967201 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-f2sxs"] Dec 01 08:40:29 crc kubenswrapper[5004]: I1201 08:40:29.641670 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5ccc5c4795-xj8tw"] Dec 01 08:40:29 crc kubenswrapper[5004]: E1201 08:40:29.642499 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e59b4e1-4729-4161-ad22-e11718c0c6fe" containerName="dnsmasq-dns" Dec 01 08:40:29 crc kubenswrapper[5004]: I1201 08:40:29.642517 5004 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="4e59b4e1-4729-4161-ad22-e11718c0c6fe" containerName="dnsmasq-dns" Dec 01 08:40:29 crc kubenswrapper[5004]: E1201 08:40:29.642543 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e59b4e1-4729-4161-ad22-e11718c0c6fe" containerName="init" Dec 01 08:40:29 crc kubenswrapper[5004]: I1201 08:40:29.642553 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e59b4e1-4729-4161-ad22-e11718c0c6fe" containerName="init" Dec 01 08:40:29 crc kubenswrapper[5004]: E1201 08:40:29.642594 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87079b8d-839c-42d2-95d1-33dee4ca61e1" containerName="neutron-db-sync" Dec 01 08:40:29 crc kubenswrapper[5004]: I1201 08:40:29.642605 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="87079b8d-839c-42d2-95d1-33dee4ca61e1" containerName="neutron-db-sync" Dec 01 08:40:29 crc kubenswrapper[5004]: I1201 08:40:29.643044 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="87079b8d-839c-42d2-95d1-33dee4ca61e1" containerName="neutron-db-sync" Dec 01 08:40:29 crc kubenswrapper[5004]: I1201 08:40:29.643085 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e59b4e1-4729-4161-ad22-e11718c0c6fe" containerName="dnsmasq-dns" Dec 01 08:40:29 crc kubenswrapper[5004]: I1201 08:40:29.644839 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc5c4795-xj8tw" Dec 01 08:40:29 crc kubenswrapper[5004]: I1201 08:40:29.657793 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc5c4795-xj8tw"] Dec 01 08:40:29 crc kubenswrapper[5004]: I1201 08:40:29.733303 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fbc3a584-72e3-4331-8bd5-c5accc1f0395-config\") pod \"dnsmasq-dns-5ccc5c4795-xj8tw\" (UID: \"fbc3a584-72e3-4331-8bd5-c5accc1f0395\") " pod="openstack/dnsmasq-dns-5ccc5c4795-xj8tw" Dec 01 08:40:29 crc kubenswrapper[5004]: I1201 08:40:29.733393 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fbc3a584-72e3-4331-8bd5-c5accc1f0395-dns-swift-storage-0\") pod \"dnsmasq-dns-5ccc5c4795-xj8tw\" (UID: \"fbc3a584-72e3-4331-8bd5-c5accc1f0395\") " pod="openstack/dnsmasq-dns-5ccc5c4795-xj8tw" Dec 01 08:40:29 crc kubenswrapper[5004]: I1201 08:40:29.733429 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rs9h4\" (UniqueName: \"kubernetes.io/projected/fbc3a584-72e3-4331-8bd5-c5accc1f0395-kube-api-access-rs9h4\") pod \"dnsmasq-dns-5ccc5c4795-xj8tw\" (UID: \"fbc3a584-72e3-4331-8bd5-c5accc1f0395\") " pod="openstack/dnsmasq-dns-5ccc5c4795-xj8tw" Dec 01 08:40:29 crc kubenswrapper[5004]: I1201 08:40:29.733483 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fbc3a584-72e3-4331-8bd5-c5accc1f0395-dns-svc\") pod \"dnsmasq-dns-5ccc5c4795-xj8tw\" (UID: \"fbc3a584-72e3-4331-8bd5-c5accc1f0395\") " pod="openstack/dnsmasq-dns-5ccc5c4795-xj8tw" Dec 01 08:40:29 crc kubenswrapper[5004]: I1201 08:40:29.733587 5004 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fbc3a584-72e3-4331-8bd5-c5accc1f0395-ovsdbserver-nb\") pod \"dnsmasq-dns-5ccc5c4795-xj8tw\" (UID: \"fbc3a584-72e3-4331-8bd5-c5accc1f0395\") " pod="openstack/dnsmasq-dns-5ccc5c4795-xj8tw" Dec 01 08:40:29 crc kubenswrapper[5004]: I1201 08:40:29.733613 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fbc3a584-72e3-4331-8bd5-c5accc1f0395-ovsdbserver-sb\") pod \"dnsmasq-dns-5ccc5c4795-xj8tw\" (UID: \"fbc3a584-72e3-4331-8bd5-c5accc1f0395\") " pod="openstack/dnsmasq-dns-5ccc5c4795-xj8tw" Dec 01 08:40:29 crc kubenswrapper[5004]: I1201 08:40:29.751198 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-745fdd7bc8-x8cmf"] Dec 01 08:40:29 crc kubenswrapper[5004]: I1201 08:40:29.754070 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-745fdd7bc8-x8cmf" Dec 01 08:40:29 crc kubenswrapper[5004]: I1201 08:40:29.757338 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 01 08:40:29 crc kubenswrapper[5004]: I1201 08:40:29.757579 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-wfrsk" Dec 01 08:40:29 crc kubenswrapper[5004]: I1201 08:40:29.757653 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Dec 01 08:40:29 crc kubenswrapper[5004]: I1201 08:40:29.757704 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 01 08:40:29 crc kubenswrapper[5004]: I1201 08:40:29.766388 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-745fdd7bc8-x8cmf"] Dec 01 08:40:29 crc kubenswrapper[5004]: I1201 08:40:29.836461 5004 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fbc3a584-72e3-4331-8bd5-c5accc1f0395-ovsdbserver-nb\") pod \"dnsmasq-dns-5ccc5c4795-xj8tw\" (UID: \"fbc3a584-72e3-4331-8bd5-c5accc1f0395\") " pod="openstack/dnsmasq-dns-5ccc5c4795-xj8tw" Dec 01 08:40:29 crc kubenswrapper[5004]: I1201 08:40:29.836511 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fbc3a584-72e3-4331-8bd5-c5accc1f0395-ovsdbserver-sb\") pod \"dnsmasq-dns-5ccc5c4795-xj8tw\" (UID: \"fbc3a584-72e3-4331-8bd5-c5accc1f0395\") " pod="openstack/dnsmasq-dns-5ccc5c4795-xj8tw" Dec 01 08:40:29 crc kubenswrapper[5004]: I1201 08:40:29.836605 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fbc3a584-72e3-4331-8bd5-c5accc1f0395-config\") pod \"dnsmasq-dns-5ccc5c4795-xj8tw\" (UID: \"fbc3a584-72e3-4331-8bd5-c5accc1f0395\") " pod="openstack/dnsmasq-dns-5ccc5c4795-xj8tw" Dec 01 08:40:29 crc kubenswrapper[5004]: I1201 08:40:29.836674 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fbc3a584-72e3-4331-8bd5-c5accc1f0395-dns-swift-storage-0\") pod \"dnsmasq-dns-5ccc5c4795-xj8tw\" (UID: \"fbc3a584-72e3-4331-8bd5-c5accc1f0395\") " pod="openstack/dnsmasq-dns-5ccc5c4795-xj8tw" Dec 01 08:40:29 crc kubenswrapper[5004]: I1201 08:40:29.836702 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/bc52bfd5-1560-4155-ba74-5fc2d92dfe73-httpd-config\") pod \"neutron-745fdd7bc8-x8cmf\" (UID: \"bc52bfd5-1560-4155-ba74-5fc2d92dfe73\") " pod="openstack/neutron-745fdd7bc8-x8cmf" Dec 01 08:40:29 crc kubenswrapper[5004]: I1201 08:40:29.836741 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-rs9h4\" (UniqueName: \"kubernetes.io/projected/fbc3a584-72e3-4331-8bd5-c5accc1f0395-kube-api-access-rs9h4\") pod \"dnsmasq-dns-5ccc5c4795-xj8tw\" (UID: \"fbc3a584-72e3-4331-8bd5-c5accc1f0395\") " pod="openstack/dnsmasq-dns-5ccc5c4795-xj8tw" Dec 01 08:40:29 crc kubenswrapper[5004]: I1201 08:40:29.836764 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc52bfd5-1560-4155-ba74-5fc2d92dfe73-combined-ca-bundle\") pod \"neutron-745fdd7bc8-x8cmf\" (UID: \"bc52bfd5-1560-4155-ba74-5fc2d92dfe73\") " pod="openstack/neutron-745fdd7bc8-x8cmf" Dec 01 08:40:29 crc kubenswrapper[5004]: I1201 08:40:29.836833 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fbc3a584-72e3-4331-8bd5-c5accc1f0395-dns-svc\") pod \"dnsmasq-dns-5ccc5c4795-xj8tw\" (UID: \"fbc3a584-72e3-4331-8bd5-c5accc1f0395\") " pod="openstack/dnsmasq-dns-5ccc5c4795-xj8tw" Dec 01 08:40:29 crc kubenswrapper[5004]: I1201 08:40:29.836874 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/bc52bfd5-1560-4155-ba74-5fc2d92dfe73-config\") pod \"neutron-745fdd7bc8-x8cmf\" (UID: \"bc52bfd5-1560-4155-ba74-5fc2d92dfe73\") " pod="openstack/neutron-745fdd7bc8-x8cmf" Dec 01 08:40:29 crc kubenswrapper[5004]: I1201 08:40:29.836909 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bc52bfd5-1560-4155-ba74-5fc2d92dfe73-ovndb-tls-certs\") pod \"neutron-745fdd7bc8-x8cmf\" (UID: \"bc52bfd5-1560-4155-ba74-5fc2d92dfe73\") " pod="openstack/neutron-745fdd7bc8-x8cmf" Dec 01 08:40:29 crc kubenswrapper[5004]: I1201 08:40:29.836956 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-rv277\" (UniqueName: \"kubernetes.io/projected/bc52bfd5-1560-4155-ba74-5fc2d92dfe73-kube-api-access-rv277\") pod \"neutron-745fdd7bc8-x8cmf\" (UID: \"bc52bfd5-1560-4155-ba74-5fc2d92dfe73\") " pod="openstack/neutron-745fdd7bc8-x8cmf" Dec 01 08:40:29 crc kubenswrapper[5004]: I1201 08:40:29.838035 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fbc3a584-72e3-4331-8bd5-c5accc1f0395-ovsdbserver-nb\") pod \"dnsmasq-dns-5ccc5c4795-xj8tw\" (UID: \"fbc3a584-72e3-4331-8bd5-c5accc1f0395\") " pod="openstack/dnsmasq-dns-5ccc5c4795-xj8tw" Dec 01 08:40:29 crc kubenswrapper[5004]: I1201 08:40:29.838427 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fbc3a584-72e3-4331-8bd5-c5accc1f0395-dns-svc\") pod \"dnsmasq-dns-5ccc5c4795-xj8tw\" (UID: \"fbc3a584-72e3-4331-8bd5-c5accc1f0395\") " pod="openstack/dnsmasq-dns-5ccc5c4795-xj8tw" Dec 01 08:40:29 crc kubenswrapper[5004]: I1201 08:40:29.840904 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fbc3a584-72e3-4331-8bd5-c5accc1f0395-config\") pod \"dnsmasq-dns-5ccc5c4795-xj8tw\" (UID: \"fbc3a584-72e3-4331-8bd5-c5accc1f0395\") " pod="openstack/dnsmasq-dns-5ccc5c4795-xj8tw" Dec 01 08:40:29 crc kubenswrapper[5004]: I1201 08:40:29.842209 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fbc3a584-72e3-4331-8bd5-c5accc1f0395-dns-swift-storage-0\") pod \"dnsmasq-dns-5ccc5c4795-xj8tw\" (UID: \"fbc3a584-72e3-4331-8bd5-c5accc1f0395\") " pod="openstack/dnsmasq-dns-5ccc5c4795-xj8tw" Dec 01 08:40:29 crc kubenswrapper[5004]: I1201 08:40:29.843983 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/fbc3a584-72e3-4331-8bd5-c5accc1f0395-ovsdbserver-sb\") pod \"dnsmasq-dns-5ccc5c4795-xj8tw\" (UID: \"fbc3a584-72e3-4331-8bd5-c5accc1f0395\") " pod="openstack/dnsmasq-dns-5ccc5c4795-xj8tw" Dec 01 08:40:29 crc kubenswrapper[5004]: I1201 08:40:29.862226 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rs9h4\" (UniqueName: \"kubernetes.io/projected/fbc3a584-72e3-4331-8bd5-c5accc1f0395-kube-api-access-rs9h4\") pod \"dnsmasq-dns-5ccc5c4795-xj8tw\" (UID: \"fbc3a584-72e3-4331-8bd5-c5accc1f0395\") " pod="openstack/dnsmasq-dns-5ccc5c4795-xj8tw" Dec 01 08:40:29 crc kubenswrapper[5004]: I1201 08:40:29.939233 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/bc52bfd5-1560-4155-ba74-5fc2d92dfe73-config\") pod \"neutron-745fdd7bc8-x8cmf\" (UID: \"bc52bfd5-1560-4155-ba74-5fc2d92dfe73\") " pod="openstack/neutron-745fdd7bc8-x8cmf" Dec 01 08:40:29 crc kubenswrapper[5004]: I1201 08:40:29.939749 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bc52bfd5-1560-4155-ba74-5fc2d92dfe73-ovndb-tls-certs\") pod \"neutron-745fdd7bc8-x8cmf\" (UID: \"bc52bfd5-1560-4155-ba74-5fc2d92dfe73\") " pod="openstack/neutron-745fdd7bc8-x8cmf" Dec 01 08:40:29 crc kubenswrapper[5004]: I1201 08:40:29.939794 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rv277\" (UniqueName: \"kubernetes.io/projected/bc52bfd5-1560-4155-ba74-5fc2d92dfe73-kube-api-access-rv277\") pod \"neutron-745fdd7bc8-x8cmf\" (UID: \"bc52bfd5-1560-4155-ba74-5fc2d92dfe73\") " pod="openstack/neutron-745fdd7bc8-x8cmf" Dec 01 08:40:29 crc kubenswrapper[5004]: I1201 08:40:29.940187 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/bc52bfd5-1560-4155-ba74-5fc2d92dfe73-httpd-config\") pod 
\"neutron-745fdd7bc8-x8cmf\" (UID: \"bc52bfd5-1560-4155-ba74-5fc2d92dfe73\") " pod="openstack/neutron-745fdd7bc8-x8cmf" Dec 01 08:40:29 crc kubenswrapper[5004]: I1201 08:40:29.940237 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc52bfd5-1560-4155-ba74-5fc2d92dfe73-combined-ca-bundle\") pod \"neutron-745fdd7bc8-x8cmf\" (UID: \"bc52bfd5-1560-4155-ba74-5fc2d92dfe73\") " pod="openstack/neutron-745fdd7bc8-x8cmf" Dec 01 08:40:29 crc kubenswrapper[5004]: I1201 08:40:29.944946 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/bc52bfd5-1560-4155-ba74-5fc2d92dfe73-config\") pod \"neutron-745fdd7bc8-x8cmf\" (UID: \"bc52bfd5-1560-4155-ba74-5fc2d92dfe73\") " pod="openstack/neutron-745fdd7bc8-x8cmf" Dec 01 08:40:29 crc kubenswrapper[5004]: I1201 08:40:29.949399 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc52bfd5-1560-4155-ba74-5fc2d92dfe73-combined-ca-bundle\") pod \"neutron-745fdd7bc8-x8cmf\" (UID: \"bc52bfd5-1560-4155-ba74-5fc2d92dfe73\") " pod="openstack/neutron-745fdd7bc8-x8cmf" Dec 01 08:40:29 crc kubenswrapper[5004]: I1201 08:40:29.954451 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bc52bfd5-1560-4155-ba74-5fc2d92dfe73-ovndb-tls-certs\") pod \"neutron-745fdd7bc8-x8cmf\" (UID: \"bc52bfd5-1560-4155-ba74-5fc2d92dfe73\") " pod="openstack/neutron-745fdd7bc8-x8cmf" Dec 01 08:40:29 crc kubenswrapper[5004]: I1201 08:40:29.960605 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/bc52bfd5-1560-4155-ba74-5fc2d92dfe73-httpd-config\") pod \"neutron-745fdd7bc8-x8cmf\" (UID: \"bc52bfd5-1560-4155-ba74-5fc2d92dfe73\") " pod="openstack/neutron-745fdd7bc8-x8cmf" Dec 01 08:40:29 crc 
kubenswrapper[5004]: I1201 08:40:29.968028 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rv277\" (UniqueName: \"kubernetes.io/projected/bc52bfd5-1560-4155-ba74-5fc2d92dfe73-kube-api-access-rv277\") pod \"neutron-745fdd7bc8-x8cmf\" (UID: \"bc52bfd5-1560-4155-ba74-5fc2d92dfe73\") " pod="openstack/neutron-745fdd7bc8-x8cmf" Dec 01 08:40:29 crc kubenswrapper[5004]: I1201 08:40:29.980652 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc5c4795-xj8tw" Dec 01 08:40:30 crc kubenswrapper[5004]: E1201 08:40:30.021196 5004 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Dec 01 08:40:30 crc kubenswrapper[5004]: E1201 08:40:30.021350 5004 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-r2hpl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin
:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-ln7f8_openstack(165d617f-a220-49b1-af2b-65d4c509962c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 08:40:30 crc kubenswrapper[5004]: E1201 08:40:30.022664 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-ln7f8" podUID="165d617f-a220-49b1-af2b-65d4c509962c" Dec 01 08:40:30 crc kubenswrapper[5004]: I1201 08:40:30.065676 5004 scope.go:117] "RemoveContainer" containerID="096db5eefe76b833206c0552285a37c222f7f9fdcd618eee28153246aca441e9" Dec 01 08:40:30 crc kubenswrapper[5004]: I1201 08:40:30.093421 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-745fdd7bc8-x8cmf" Dec 01 08:40:30 crc kubenswrapper[5004]: I1201 08:40:30.235739 5004 scope.go:117] "RemoveContainer" containerID="c495e8ff21df11426dd22273b521a77287ef27f2a5c16d0fbb132263af3c271f" Dec 01 08:40:30 crc kubenswrapper[5004]: I1201 08:40:30.284624 5004 scope.go:117] "RemoveContainer" containerID="e03e643932ac005453b74c50822abc37bafc358c6daaabb3f00e8cae6e89c37f" Dec 01 08:40:30 crc kubenswrapper[5004]: I1201 08:40:30.583708 5004 scope.go:117] "RemoveContainer" containerID="21701a04ad6f4449f87cea6a10e39fc1f1d0e7b777aef1e2de0843de3e97d9b1" Dec 01 08:40:30 crc kubenswrapper[5004]: I1201 08:40:30.773991 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e59b4e1-4729-4161-ad22-e11718c0c6fe" path="/var/lib/kubelet/pods/4e59b4e1-4729-4161-ad22-e11718c0c6fe/volumes" Dec 01 08:40:30 crc kubenswrapper[5004]: I1201 08:40:30.923149 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-7jdwx" event={"ID":"e5303a09-48ed-4287-83ba-0fb70fe199d0","Type":"ContainerStarted","Data":"5d0a501941db692fc8607f877460be6c252bfb759ba05cd87257d71106f20640"} Dec 01 08:40:30 crc kubenswrapper[5004]: E1201 08:40:30.933725 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-ln7f8" podUID="165d617f-a220-49b1-af2b-65d4c509962c" Dec 01 08:40:30 crc kubenswrapper[5004]: I1201 08:40:30.934782 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 01 08:40:30 crc kubenswrapper[5004]: I1201 08:40:30.944441 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-7jdwx" podStartSLOduration=6.049316245 podStartE2EDuration="31.944420449s" podCreationTimestamp="2025-12-01 
08:39:59 +0000 UTC" firstStartedPulling="2025-12-01 08:40:02.361834199 +0000 UTC m=+1379.926826181" lastFinishedPulling="2025-12-01 08:40:28.256938403 +0000 UTC m=+1405.821930385" observedRunningTime="2025-12-01 08:40:30.938577786 +0000 UTC m=+1408.503569768" watchObservedRunningTime="2025-12-01 08:40:30.944420449 +0000 UTC m=+1408.509412421" Dec 01 08:40:31 crc kubenswrapper[5004]: I1201 08:40:31.034159 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-hlrzt"] Dec 01 08:40:31 crc kubenswrapper[5004]: I1201 08:40:31.093440 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 01 08:40:31 crc kubenswrapper[5004]: I1201 08:40:31.224777 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc5c4795-xj8tw"] Dec 01 08:40:31 crc kubenswrapper[5004]: W1201 08:40:31.259664 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod64235e39_5760_4a2e_a164_7cb27ca906a3.slice/crio-a8a4f0712474d3b24c57adbe9519f964cb29fc5e0b4e8cc281ffbe1f19bbe5da WatchSource:0}: Error finding container a8a4f0712474d3b24c57adbe9519f964cb29fc5e0b4e8cc281ffbe1f19bbe5da: Status 404 returned error can't find the container with id a8a4f0712474d3b24c57adbe9519f964cb29fc5e0b4e8cc281ffbe1f19bbe5da Dec 01 08:40:31 crc kubenswrapper[5004]: I1201 08:40:31.325870 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-745fdd7bc8-x8cmf"] Dec 01 08:40:31 crc kubenswrapper[5004]: W1201 08:40:31.340683 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbc52bfd5_1560_4155_ba74_5fc2d92dfe73.slice/crio-2747be2f570392f48aba853116f00805df9a9a07324214e31ecf682b2f121654 WatchSource:0}: Error finding container 2747be2f570392f48aba853116f00805df9a9a07324214e31ecf682b2f121654: Status 404 returned error can't find the 
container with id 2747be2f570392f48aba853116f00805df9a9a07324214e31ecf682b2f121654 Dec 01 08:40:31 crc kubenswrapper[5004]: I1201 08:40:31.951827 5004 generic.go:334] "Generic (PLEG): container finished" podID="fbc3a584-72e3-4331-8bd5-c5accc1f0395" containerID="a67d7af54527f2c3a579a265d83d5032fa2aefe3147972385e8f4684d9e7644b" exitCode=0 Dec 01 08:40:31 crc kubenswrapper[5004]: I1201 08:40:31.952089 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc5c4795-xj8tw" event={"ID":"fbc3a584-72e3-4331-8bd5-c5accc1f0395","Type":"ContainerDied","Data":"a67d7af54527f2c3a579a265d83d5032fa2aefe3147972385e8f4684d9e7644b"} Dec 01 08:40:31 crc kubenswrapper[5004]: I1201 08:40:31.952394 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc5c4795-xj8tw" event={"ID":"fbc3a584-72e3-4331-8bd5-c5accc1f0395","Type":"ContainerStarted","Data":"2e841945cd2f50c324175c66f2c981ed0db52f38268528475f4db176acc14031"} Dec 01 08:40:31 crc kubenswrapper[5004]: I1201 08:40:31.960105 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"64235e39-5760-4a2e-a164-7cb27ca906a3","Type":"ContainerStarted","Data":"a8a4f0712474d3b24c57adbe9519f964cb29fc5e0b4e8cc281ffbe1f19bbe5da"} Dec 01 08:40:31 crc kubenswrapper[5004]: I1201 08:40:31.968225 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-hlrzt" event={"ID":"713ee9f1-6421-4ffa-aed2-5f762d8cba63","Type":"ContainerStarted","Data":"a5e210e1ece1b7cb88d92cd73ce1261bd023a742628e4cd59a7610d4e53cd7c5"} Dec 01 08:40:31 crc kubenswrapper[5004]: I1201 08:40:31.968264 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-hlrzt" event={"ID":"713ee9f1-6421-4ffa-aed2-5f762d8cba63","Type":"ContainerStarted","Data":"64dd8f90d116777fbcbeec0db5b8cc52c3ce3c6508ef78a32e5de3e468f90c91"} Dec 01 08:40:31 crc kubenswrapper[5004]: I1201 08:40:31.972176 5004 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/neutron-745fdd7bc8-x8cmf" event={"ID":"bc52bfd5-1560-4155-ba74-5fc2d92dfe73","Type":"ContainerStarted","Data":"8240e03a472d498dbdfc57c97c051baed4a1338229d1ae7da3d4173a1e98430f"} Dec 01 08:40:31 crc kubenswrapper[5004]: I1201 08:40:31.972218 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-745fdd7bc8-x8cmf" event={"ID":"bc52bfd5-1560-4155-ba74-5fc2d92dfe73","Type":"ContainerStarted","Data":"2747be2f570392f48aba853116f00805df9a9a07324214e31ecf682b2f121654"} Dec 01 08:40:31 crc kubenswrapper[5004]: I1201 08:40:31.984153 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1fd8d826-4dab-4d07-bc04-be5dfebdaf2b","Type":"ContainerStarted","Data":"e955dd78612c15b9cc804f7c4d3a8fa1a731e8687153b47cd4fdd2a0719e29eb"} Dec 01 08:40:31 crc kubenswrapper[5004]: I1201 08:40:31.992291 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fa6628b9-be2f-4594-8767-5442e1d2f5b9","Type":"ContainerStarted","Data":"d5514074610abc7259416e8dda14f559675a9845afe697c489e3eb4e44819adc"} Dec 01 08:40:32 crc kubenswrapper[5004]: I1201 08:40:32.011736 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-hlrzt" podStartSLOduration=13.011716139 podStartE2EDuration="13.011716139s" podCreationTimestamp="2025-12-01 08:40:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:40:31.986862812 +0000 UTC m=+1409.551854794" watchObservedRunningTime="2025-12-01 08:40:32.011716139 +0000 UTC m=+1409.576708111" Dec 01 08:40:32 crc kubenswrapper[5004]: I1201 08:40:32.428977 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-cb845f59f-m4f5q"] Dec 01 08:40:32 crc kubenswrapper[5004]: I1201 08:40:32.432702 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-cb845f59f-m4f5q" Dec 01 08:40:32 crc kubenswrapper[5004]: I1201 08:40:32.438095 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Dec 01 08:40:32 crc kubenswrapper[5004]: I1201 08:40:32.438290 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Dec 01 08:40:32 crc kubenswrapper[5004]: I1201 08:40:32.491670 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-cb845f59f-m4f5q"] Dec 01 08:40:32 crc kubenswrapper[5004]: I1201 08:40:32.520420 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8d456c05-b8d3-43ca-ae93-9b0a5b111296-public-tls-certs\") pod \"neutron-cb845f59f-m4f5q\" (UID: \"8d456c05-b8d3-43ca-ae93-9b0a5b111296\") " pod="openstack/neutron-cb845f59f-m4f5q" Dec 01 08:40:32 crc kubenswrapper[5004]: I1201 08:40:32.520738 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbqnn\" (UniqueName: \"kubernetes.io/projected/8d456c05-b8d3-43ca-ae93-9b0a5b111296-kube-api-access-wbqnn\") pod \"neutron-cb845f59f-m4f5q\" (UID: \"8d456c05-b8d3-43ca-ae93-9b0a5b111296\") " pod="openstack/neutron-cb845f59f-m4f5q" Dec 01 08:40:32 crc kubenswrapper[5004]: I1201 08:40:32.520782 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8d456c05-b8d3-43ca-ae93-9b0a5b111296-config\") pod \"neutron-cb845f59f-m4f5q\" (UID: \"8d456c05-b8d3-43ca-ae93-9b0a5b111296\") " pod="openstack/neutron-cb845f59f-m4f5q" Dec 01 08:40:32 crc kubenswrapper[5004]: I1201 08:40:32.520844 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: 
\"kubernetes.io/secret/8d456c05-b8d3-43ca-ae93-9b0a5b111296-httpd-config\") pod \"neutron-cb845f59f-m4f5q\" (UID: \"8d456c05-b8d3-43ca-ae93-9b0a5b111296\") " pod="openstack/neutron-cb845f59f-m4f5q" Dec 01 08:40:32 crc kubenswrapper[5004]: I1201 08:40:32.520864 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8d456c05-b8d3-43ca-ae93-9b0a5b111296-ovndb-tls-certs\") pod \"neutron-cb845f59f-m4f5q\" (UID: \"8d456c05-b8d3-43ca-ae93-9b0a5b111296\") " pod="openstack/neutron-cb845f59f-m4f5q" Dec 01 08:40:32 crc kubenswrapper[5004]: I1201 08:40:32.520900 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d456c05-b8d3-43ca-ae93-9b0a5b111296-combined-ca-bundle\") pod \"neutron-cb845f59f-m4f5q\" (UID: \"8d456c05-b8d3-43ca-ae93-9b0a5b111296\") " pod="openstack/neutron-cb845f59f-m4f5q" Dec 01 08:40:32 crc kubenswrapper[5004]: I1201 08:40:32.520917 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8d456c05-b8d3-43ca-ae93-9b0a5b111296-internal-tls-certs\") pod \"neutron-cb845f59f-m4f5q\" (UID: \"8d456c05-b8d3-43ca-ae93-9b0a5b111296\") " pod="openstack/neutron-cb845f59f-m4f5q" Dec 01 08:40:32 crc kubenswrapper[5004]: I1201 08:40:32.570391 5004 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-f2sxs" podUID="4e59b4e1-4729-4161-ad22-e11718c0c6fe" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.150:5353: i/o timeout" Dec 01 08:40:32 crc kubenswrapper[5004]: I1201 08:40:32.622631 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8d456c05-b8d3-43ca-ae93-9b0a5b111296-public-tls-certs\") pod \"neutron-cb845f59f-m4f5q\" 
(UID: \"8d456c05-b8d3-43ca-ae93-9b0a5b111296\") " pod="openstack/neutron-cb845f59f-m4f5q" Dec 01 08:40:32 crc kubenswrapper[5004]: I1201 08:40:32.622710 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wbqnn\" (UniqueName: \"kubernetes.io/projected/8d456c05-b8d3-43ca-ae93-9b0a5b111296-kube-api-access-wbqnn\") pod \"neutron-cb845f59f-m4f5q\" (UID: \"8d456c05-b8d3-43ca-ae93-9b0a5b111296\") " pod="openstack/neutron-cb845f59f-m4f5q" Dec 01 08:40:32 crc kubenswrapper[5004]: I1201 08:40:32.622739 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8d456c05-b8d3-43ca-ae93-9b0a5b111296-config\") pod \"neutron-cb845f59f-m4f5q\" (UID: \"8d456c05-b8d3-43ca-ae93-9b0a5b111296\") " pod="openstack/neutron-cb845f59f-m4f5q" Dec 01 08:40:32 crc kubenswrapper[5004]: I1201 08:40:32.622788 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/8d456c05-b8d3-43ca-ae93-9b0a5b111296-httpd-config\") pod \"neutron-cb845f59f-m4f5q\" (UID: \"8d456c05-b8d3-43ca-ae93-9b0a5b111296\") " pod="openstack/neutron-cb845f59f-m4f5q" Dec 01 08:40:32 crc kubenswrapper[5004]: I1201 08:40:32.622818 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8d456c05-b8d3-43ca-ae93-9b0a5b111296-ovndb-tls-certs\") pod \"neutron-cb845f59f-m4f5q\" (UID: \"8d456c05-b8d3-43ca-ae93-9b0a5b111296\") " pod="openstack/neutron-cb845f59f-m4f5q" Dec 01 08:40:32 crc kubenswrapper[5004]: I1201 08:40:32.622848 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d456c05-b8d3-43ca-ae93-9b0a5b111296-combined-ca-bundle\") pod \"neutron-cb845f59f-m4f5q\" (UID: \"8d456c05-b8d3-43ca-ae93-9b0a5b111296\") " pod="openstack/neutron-cb845f59f-m4f5q" Dec 01 08:40:32 crc 
kubenswrapper[5004]: I1201 08:40:32.622865 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8d456c05-b8d3-43ca-ae93-9b0a5b111296-internal-tls-certs\") pod \"neutron-cb845f59f-m4f5q\" (UID: \"8d456c05-b8d3-43ca-ae93-9b0a5b111296\") " pod="openstack/neutron-cb845f59f-m4f5q" Dec 01 08:40:32 crc kubenswrapper[5004]: I1201 08:40:32.627141 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8d456c05-b8d3-43ca-ae93-9b0a5b111296-public-tls-certs\") pod \"neutron-cb845f59f-m4f5q\" (UID: \"8d456c05-b8d3-43ca-ae93-9b0a5b111296\") " pod="openstack/neutron-cb845f59f-m4f5q" Dec 01 08:40:32 crc kubenswrapper[5004]: I1201 08:40:32.627726 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8d456c05-b8d3-43ca-ae93-9b0a5b111296-internal-tls-certs\") pod \"neutron-cb845f59f-m4f5q\" (UID: \"8d456c05-b8d3-43ca-ae93-9b0a5b111296\") " pod="openstack/neutron-cb845f59f-m4f5q" Dec 01 08:40:32 crc kubenswrapper[5004]: I1201 08:40:32.640547 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8d456c05-b8d3-43ca-ae93-9b0a5b111296-ovndb-tls-certs\") pod \"neutron-cb845f59f-m4f5q\" (UID: \"8d456c05-b8d3-43ca-ae93-9b0a5b111296\") " pod="openstack/neutron-cb845f59f-m4f5q" Dec 01 08:40:32 crc kubenswrapper[5004]: I1201 08:40:32.643759 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/8d456c05-b8d3-43ca-ae93-9b0a5b111296-httpd-config\") pod \"neutron-cb845f59f-m4f5q\" (UID: \"8d456c05-b8d3-43ca-ae93-9b0a5b111296\") " pod="openstack/neutron-cb845f59f-m4f5q" Dec 01 08:40:32 crc kubenswrapper[5004]: I1201 08:40:32.650276 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-wbqnn\" (UniqueName: \"kubernetes.io/projected/8d456c05-b8d3-43ca-ae93-9b0a5b111296-kube-api-access-wbqnn\") pod \"neutron-cb845f59f-m4f5q\" (UID: \"8d456c05-b8d3-43ca-ae93-9b0a5b111296\") " pod="openstack/neutron-cb845f59f-m4f5q" Dec 01 08:40:32 crc kubenswrapper[5004]: I1201 08:40:32.650650 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d456c05-b8d3-43ca-ae93-9b0a5b111296-combined-ca-bundle\") pod \"neutron-cb845f59f-m4f5q\" (UID: \"8d456c05-b8d3-43ca-ae93-9b0a5b111296\") " pod="openstack/neutron-cb845f59f-m4f5q" Dec 01 08:40:32 crc kubenswrapper[5004]: I1201 08:40:32.651280 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/8d456c05-b8d3-43ca-ae93-9b0a5b111296-config\") pod \"neutron-cb845f59f-m4f5q\" (UID: \"8d456c05-b8d3-43ca-ae93-9b0a5b111296\") " pod="openstack/neutron-cb845f59f-m4f5q" Dec 01 08:40:32 crc kubenswrapper[5004]: I1201 08:40:32.922202 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-cb845f59f-m4f5q" Dec 01 08:40:33 crc kubenswrapper[5004]: I1201 08:40:33.007334 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc5c4795-xj8tw" event={"ID":"fbc3a584-72e3-4331-8bd5-c5accc1f0395","Type":"ContainerStarted","Data":"c39746a684943747e313bf508e64ee70cb531fc44f2e9c99324915e5ec69249d"} Dec 01 08:40:33 crc kubenswrapper[5004]: I1201 08:40:33.007448 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5ccc5c4795-xj8tw" Dec 01 08:40:33 crc kubenswrapper[5004]: I1201 08:40:33.011099 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"64235e39-5760-4a2e-a164-7cb27ca906a3","Type":"ContainerStarted","Data":"a4de37a2a551c10a01fbe2aa77cf24579a8bbc8756ad332d58f177ae88a8f0c4"} Dec 01 08:40:33 crc kubenswrapper[5004]: I1201 08:40:33.017351 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-745fdd7bc8-x8cmf" event={"ID":"bc52bfd5-1560-4155-ba74-5fc2d92dfe73","Type":"ContainerStarted","Data":"69e3c2040e4f88fafa2d0e41f24069bd46aca6f4a6f8940dc1c49c5bd1273691"} Dec 01 08:40:33 crc kubenswrapper[5004]: I1201 08:40:33.018219 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-745fdd7bc8-x8cmf" Dec 01 08:40:33 crc kubenswrapper[5004]: I1201 08:40:33.021675 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fa6628b9-be2f-4594-8767-5442e1d2f5b9","Type":"ContainerStarted","Data":"8e0d24717f6b77a9ff93f781693f1a7b385497e8d77549962f407e2ec0a74009"} Dec 01 08:40:33 crc kubenswrapper[5004]: I1201 08:40:33.042144 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5ccc5c4795-xj8tw" podStartSLOduration=4.042129018 podStartE2EDuration="4.042129018s" podCreationTimestamp="2025-12-01 08:40:29 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:40:33.038692555 +0000 UTC m=+1410.603684537" watchObservedRunningTime="2025-12-01 08:40:33.042129018 +0000 UTC m=+1410.607120990" Dec 01 08:40:33 crc kubenswrapper[5004]: I1201 08:40:33.082286 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-745fdd7bc8-x8cmf" podStartSLOduration=4.082268699 podStartE2EDuration="4.082268699s" podCreationTimestamp="2025-12-01 08:40:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:40:33.065052948 +0000 UTC m=+1410.630044930" watchObservedRunningTime="2025-12-01 08:40:33.082268699 +0000 UTC m=+1410.647260681" Dec 01 08:40:33 crc kubenswrapper[5004]: I1201 08:40:33.573886 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-cb845f59f-m4f5q"] Dec 01 08:40:34 crc kubenswrapper[5004]: I1201 08:40:34.049555 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fa6628b9-be2f-4594-8767-5442e1d2f5b9","Type":"ContainerStarted","Data":"873b1fd641aba5c655ee027bc8a60117804b809c92280ec25c80829f55b1ab65"} Dec 01 08:40:34 crc kubenswrapper[5004]: I1201 08:40:34.052080 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-cb845f59f-m4f5q" event={"ID":"8d456c05-b8d3-43ca-ae93-9b0a5b111296","Type":"ContainerStarted","Data":"91484950dc20e3c8ae67d73e05e8587f2519e477efe14548b9fa6dba027d0576"} Dec 01 08:40:34 crc kubenswrapper[5004]: I1201 08:40:34.055892 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"64235e39-5760-4a2e-a164-7cb27ca906a3","Type":"ContainerStarted","Data":"78ca9413ec0076671f4c8abfe49abffd1640b7fdb572e7e322984516b6e3783c"} Dec 01 08:40:34 crc kubenswrapper[5004]: I1201 08:40:34.059136 
5004 generic.go:334] "Generic (PLEG): container finished" podID="e5303a09-48ed-4287-83ba-0fb70fe199d0" containerID="5d0a501941db692fc8607f877460be6c252bfb759ba05cd87257d71106f20640" exitCode=0 Dec 01 08:40:34 crc kubenswrapper[5004]: I1201 08:40:34.059155 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-7jdwx" event={"ID":"e5303a09-48ed-4287-83ba-0fb70fe199d0","Type":"ContainerDied","Data":"5d0a501941db692fc8607f877460be6c252bfb759ba05cd87257d71106f20640"} Dec 01 08:40:34 crc kubenswrapper[5004]: I1201 08:40:34.086846 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=26.086824166 podStartE2EDuration="26.086824166s" podCreationTimestamp="2025-12-01 08:40:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:40:34.071022651 +0000 UTC m=+1411.636014633" watchObservedRunningTime="2025-12-01 08:40:34.086824166 +0000 UTC m=+1411.651816148" Dec 01 08:40:34 crc kubenswrapper[5004]: I1201 08:40:34.103645 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=18.103605066 podStartE2EDuration="18.103605066s" podCreationTimestamp="2025-12-01 08:40:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:40:34.094928825 +0000 UTC m=+1411.659920837" watchObservedRunningTime="2025-12-01 08:40:34.103605066 +0000 UTC m=+1411.668597048" Dec 01 08:40:35 crc kubenswrapper[5004]: I1201 08:40:35.583688 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-7jdwx" Dec 01 08:40:35 crc kubenswrapper[5004]: I1201 08:40:35.742839 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5303a09-48ed-4287-83ba-0fb70fe199d0-scripts\") pod \"e5303a09-48ed-4287-83ba-0fb70fe199d0\" (UID: \"e5303a09-48ed-4287-83ba-0fb70fe199d0\") " Dec 01 08:40:35 crc kubenswrapper[5004]: I1201 08:40:35.742944 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8lpgq\" (UniqueName: \"kubernetes.io/projected/e5303a09-48ed-4287-83ba-0fb70fe199d0-kube-api-access-8lpgq\") pod \"e5303a09-48ed-4287-83ba-0fb70fe199d0\" (UID: \"e5303a09-48ed-4287-83ba-0fb70fe199d0\") " Dec 01 08:40:35 crc kubenswrapper[5004]: I1201 08:40:35.743029 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5303a09-48ed-4287-83ba-0fb70fe199d0-config-data\") pod \"e5303a09-48ed-4287-83ba-0fb70fe199d0\" (UID: \"e5303a09-48ed-4287-83ba-0fb70fe199d0\") " Dec 01 08:40:35 crc kubenswrapper[5004]: I1201 08:40:35.743143 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5303a09-48ed-4287-83ba-0fb70fe199d0-combined-ca-bundle\") pod \"e5303a09-48ed-4287-83ba-0fb70fe199d0\" (UID: \"e5303a09-48ed-4287-83ba-0fb70fe199d0\") " Dec 01 08:40:35 crc kubenswrapper[5004]: I1201 08:40:35.743170 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5303a09-48ed-4287-83ba-0fb70fe199d0-logs\") pod \"e5303a09-48ed-4287-83ba-0fb70fe199d0\" (UID: \"e5303a09-48ed-4287-83ba-0fb70fe199d0\") " Dec 01 08:40:35 crc kubenswrapper[5004]: I1201 08:40:35.743996 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/e5303a09-48ed-4287-83ba-0fb70fe199d0-logs" (OuterVolumeSpecName: "logs") pod "e5303a09-48ed-4287-83ba-0fb70fe199d0" (UID: "e5303a09-48ed-4287-83ba-0fb70fe199d0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 08:40:35 crc kubenswrapper[5004]: I1201 08:40:35.749376 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5303a09-48ed-4287-83ba-0fb70fe199d0-kube-api-access-8lpgq" (OuterVolumeSpecName: "kube-api-access-8lpgq") pod "e5303a09-48ed-4287-83ba-0fb70fe199d0" (UID: "e5303a09-48ed-4287-83ba-0fb70fe199d0"). InnerVolumeSpecName "kube-api-access-8lpgq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:40:35 crc kubenswrapper[5004]: I1201 08:40:35.762604 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5303a09-48ed-4287-83ba-0fb70fe199d0-scripts" (OuterVolumeSpecName: "scripts") pod "e5303a09-48ed-4287-83ba-0fb70fe199d0" (UID: "e5303a09-48ed-4287-83ba-0fb70fe199d0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:40:35 crc kubenswrapper[5004]: I1201 08:40:35.786832 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5303a09-48ed-4287-83ba-0fb70fe199d0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e5303a09-48ed-4287-83ba-0fb70fe199d0" (UID: "e5303a09-48ed-4287-83ba-0fb70fe199d0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:40:35 crc kubenswrapper[5004]: I1201 08:40:35.793130 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5303a09-48ed-4287-83ba-0fb70fe199d0-config-data" (OuterVolumeSpecName: "config-data") pod "e5303a09-48ed-4287-83ba-0fb70fe199d0" (UID: "e5303a09-48ed-4287-83ba-0fb70fe199d0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:40:35 crc kubenswrapper[5004]: I1201 08:40:35.845945 5004 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5303a09-48ed-4287-83ba-0fb70fe199d0-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 08:40:35 crc kubenswrapper[5004]: I1201 08:40:35.846361 5004 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5303a09-48ed-4287-83ba-0fb70fe199d0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 08:40:35 crc kubenswrapper[5004]: I1201 08:40:35.846375 5004 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5303a09-48ed-4287-83ba-0fb70fe199d0-logs\") on node \"crc\" DevicePath \"\"" Dec 01 08:40:35 crc kubenswrapper[5004]: I1201 08:40:35.846422 5004 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5303a09-48ed-4287-83ba-0fb70fe199d0-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 08:40:35 crc kubenswrapper[5004]: I1201 08:40:35.846437 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8lpgq\" (UniqueName: \"kubernetes.io/projected/e5303a09-48ed-4287-83ba-0fb70fe199d0-kube-api-access-8lpgq\") on node \"crc\" DevicePath \"\"" Dec 01 08:40:36 crc kubenswrapper[5004]: I1201 08:40:36.091157 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-7jdwx" event={"ID":"e5303a09-48ed-4287-83ba-0fb70fe199d0","Type":"ContainerDied","Data":"8a99f3615c9b5ecb5c3b604389126063c56e3de2d5a2cec04b9ab1f7458955bc"} Dec 01 08:40:36 crc kubenswrapper[5004]: I1201 08:40:36.091197 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8a99f3615c9b5ecb5c3b604389126063c56e3de2d5a2cec04b9ab1f7458955bc" Dec 01 08:40:36 crc kubenswrapper[5004]: I1201 08:40:36.091196 5004 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openstack/placement-db-sync-7jdwx" Dec 01 08:40:36 crc kubenswrapper[5004]: I1201 08:40:36.092778 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-cb845f59f-m4f5q" event={"ID":"8d456c05-b8d3-43ca-ae93-9b0a5b111296","Type":"ContainerStarted","Data":"9f0ba108bd0ca565138bcf66b21e945fa90cbe89f70ef43dff800db7b101a06c"} Dec 01 08:40:36 crc kubenswrapper[5004]: I1201 08:40:36.258369 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-695797f65d-5b4rw"] Dec 01 08:40:36 crc kubenswrapper[5004]: E1201 08:40:36.258952 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5303a09-48ed-4287-83ba-0fb70fe199d0" containerName="placement-db-sync" Dec 01 08:40:36 crc kubenswrapper[5004]: I1201 08:40:36.258970 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5303a09-48ed-4287-83ba-0fb70fe199d0" containerName="placement-db-sync" Dec 01 08:40:36 crc kubenswrapper[5004]: I1201 08:40:36.259165 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5303a09-48ed-4287-83ba-0fb70fe199d0" containerName="placement-db-sync" Dec 01 08:40:36 crc kubenswrapper[5004]: I1201 08:40:36.263681 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-695797f65d-5b4rw" Dec 01 08:40:36 crc kubenswrapper[5004]: I1201 08:40:36.275199 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 01 08:40:36 crc kubenswrapper[5004]: I1201 08:40:36.275269 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Dec 01 08:40:36 crc kubenswrapper[5004]: I1201 08:40:36.275898 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Dec 01 08:40:36 crc kubenswrapper[5004]: I1201 08:40:36.276036 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 01 08:40:36 crc kubenswrapper[5004]: I1201 08:40:36.276112 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-wj4p4" Dec 01 08:40:36 crc kubenswrapper[5004]: I1201 08:40:36.280965 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-695797f65d-5b4rw"] Dec 01 08:40:36 crc kubenswrapper[5004]: I1201 08:40:36.360818 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3715c3c-b54f-4ff0-808a-51dd69614417-internal-tls-certs\") pod \"placement-695797f65d-5b4rw\" (UID: \"e3715c3c-b54f-4ff0-808a-51dd69614417\") " pod="openstack/placement-695797f65d-5b4rw" Dec 01 08:40:36 crc kubenswrapper[5004]: I1201 08:40:36.361353 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3715c3c-b54f-4ff0-808a-51dd69614417-scripts\") pod \"placement-695797f65d-5b4rw\" (UID: \"e3715c3c-b54f-4ff0-808a-51dd69614417\") " pod="openstack/placement-695797f65d-5b4rw" Dec 01 08:40:36 crc kubenswrapper[5004]: I1201 08:40:36.361678 5004 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3715c3c-b54f-4ff0-808a-51dd69614417-public-tls-certs\") pod \"placement-695797f65d-5b4rw\" (UID: \"e3715c3c-b54f-4ff0-808a-51dd69614417\") " pod="openstack/placement-695797f65d-5b4rw" Dec 01 08:40:36 crc kubenswrapper[5004]: I1201 08:40:36.361934 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3715c3c-b54f-4ff0-808a-51dd69614417-logs\") pod \"placement-695797f65d-5b4rw\" (UID: \"e3715c3c-b54f-4ff0-808a-51dd69614417\") " pod="openstack/placement-695797f65d-5b4rw" Dec 01 08:40:36 crc kubenswrapper[5004]: I1201 08:40:36.362434 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhm5m\" (UniqueName: \"kubernetes.io/projected/e3715c3c-b54f-4ff0-808a-51dd69614417-kube-api-access-rhm5m\") pod \"placement-695797f65d-5b4rw\" (UID: \"e3715c3c-b54f-4ff0-808a-51dd69614417\") " pod="openstack/placement-695797f65d-5b4rw" Dec 01 08:40:36 crc kubenswrapper[5004]: I1201 08:40:36.362600 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3715c3c-b54f-4ff0-808a-51dd69614417-config-data\") pod \"placement-695797f65d-5b4rw\" (UID: \"e3715c3c-b54f-4ff0-808a-51dd69614417\") " pod="openstack/placement-695797f65d-5b4rw" Dec 01 08:40:36 crc kubenswrapper[5004]: I1201 08:40:36.363871 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3715c3c-b54f-4ff0-808a-51dd69614417-combined-ca-bundle\") pod \"placement-695797f65d-5b4rw\" (UID: \"e3715c3c-b54f-4ff0-808a-51dd69614417\") " pod="openstack/placement-695797f65d-5b4rw" Dec 01 08:40:36 crc kubenswrapper[5004]: I1201 08:40:36.466353 5004 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3715c3c-b54f-4ff0-808a-51dd69614417-internal-tls-certs\") pod \"placement-695797f65d-5b4rw\" (UID: \"e3715c3c-b54f-4ff0-808a-51dd69614417\") " pod="openstack/placement-695797f65d-5b4rw" Dec 01 08:40:36 crc kubenswrapper[5004]: I1201 08:40:36.466418 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3715c3c-b54f-4ff0-808a-51dd69614417-scripts\") pod \"placement-695797f65d-5b4rw\" (UID: \"e3715c3c-b54f-4ff0-808a-51dd69614417\") " pod="openstack/placement-695797f65d-5b4rw" Dec 01 08:40:36 crc kubenswrapper[5004]: I1201 08:40:36.466475 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3715c3c-b54f-4ff0-808a-51dd69614417-public-tls-certs\") pod \"placement-695797f65d-5b4rw\" (UID: \"e3715c3c-b54f-4ff0-808a-51dd69614417\") " pod="openstack/placement-695797f65d-5b4rw" Dec 01 08:40:36 crc kubenswrapper[5004]: I1201 08:40:36.466512 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3715c3c-b54f-4ff0-808a-51dd69614417-logs\") pod \"placement-695797f65d-5b4rw\" (UID: \"e3715c3c-b54f-4ff0-808a-51dd69614417\") " pod="openstack/placement-695797f65d-5b4rw" Dec 01 08:40:36 crc kubenswrapper[5004]: I1201 08:40:36.466579 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rhm5m\" (UniqueName: \"kubernetes.io/projected/e3715c3c-b54f-4ff0-808a-51dd69614417-kube-api-access-rhm5m\") pod \"placement-695797f65d-5b4rw\" (UID: \"e3715c3c-b54f-4ff0-808a-51dd69614417\") " pod="openstack/placement-695797f65d-5b4rw" Dec 01 08:40:36 crc kubenswrapper[5004]: I1201 08:40:36.466603 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/e3715c3c-b54f-4ff0-808a-51dd69614417-config-data\") pod \"placement-695797f65d-5b4rw\" (UID: \"e3715c3c-b54f-4ff0-808a-51dd69614417\") " pod="openstack/placement-695797f65d-5b4rw" Dec 01 08:40:36 crc kubenswrapper[5004]: I1201 08:40:36.466632 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3715c3c-b54f-4ff0-808a-51dd69614417-combined-ca-bundle\") pod \"placement-695797f65d-5b4rw\" (UID: \"e3715c3c-b54f-4ff0-808a-51dd69614417\") " pod="openstack/placement-695797f65d-5b4rw" Dec 01 08:40:36 crc kubenswrapper[5004]: I1201 08:40:36.467169 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3715c3c-b54f-4ff0-808a-51dd69614417-logs\") pod \"placement-695797f65d-5b4rw\" (UID: \"e3715c3c-b54f-4ff0-808a-51dd69614417\") " pod="openstack/placement-695797f65d-5b4rw" Dec 01 08:40:36 crc kubenswrapper[5004]: I1201 08:40:36.472574 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3715c3c-b54f-4ff0-808a-51dd69614417-public-tls-certs\") pod \"placement-695797f65d-5b4rw\" (UID: \"e3715c3c-b54f-4ff0-808a-51dd69614417\") " pod="openstack/placement-695797f65d-5b4rw" Dec 01 08:40:36 crc kubenswrapper[5004]: I1201 08:40:36.472929 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3715c3c-b54f-4ff0-808a-51dd69614417-internal-tls-certs\") pod \"placement-695797f65d-5b4rw\" (UID: \"e3715c3c-b54f-4ff0-808a-51dd69614417\") " pod="openstack/placement-695797f65d-5b4rw" Dec 01 08:40:36 crc kubenswrapper[5004]: I1201 08:40:36.473269 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3715c3c-b54f-4ff0-808a-51dd69614417-combined-ca-bundle\") pod 
\"placement-695797f65d-5b4rw\" (UID: \"e3715c3c-b54f-4ff0-808a-51dd69614417\") " pod="openstack/placement-695797f65d-5b4rw" Dec 01 08:40:36 crc kubenswrapper[5004]: I1201 08:40:36.474738 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3715c3c-b54f-4ff0-808a-51dd69614417-config-data\") pod \"placement-695797f65d-5b4rw\" (UID: \"e3715c3c-b54f-4ff0-808a-51dd69614417\") " pod="openstack/placement-695797f65d-5b4rw" Dec 01 08:40:36 crc kubenswrapper[5004]: I1201 08:40:36.477114 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3715c3c-b54f-4ff0-808a-51dd69614417-scripts\") pod \"placement-695797f65d-5b4rw\" (UID: \"e3715c3c-b54f-4ff0-808a-51dd69614417\") " pod="openstack/placement-695797f65d-5b4rw" Dec 01 08:40:36 crc kubenswrapper[5004]: I1201 08:40:36.483045 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhm5m\" (UniqueName: \"kubernetes.io/projected/e3715c3c-b54f-4ff0-808a-51dd69614417-kube-api-access-rhm5m\") pod \"placement-695797f65d-5b4rw\" (UID: \"e3715c3c-b54f-4ff0-808a-51dd69614417\") " pod="openstack/placement-695797f65d-5b4rw" Dec 01 08:40:36 crc kubenswrapper[5004]: I1201 08:40:36.596030 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-695797f65d-5b4rw" Dec 01 08:40:37 crc kubenswrapper[5004]: I1201 08:40:37.109252 5004 generic.go:334] "Generic (PLEG): container finished" podID="713ee9f1-6421-4ffa-aed2-5f762d8cba63" containerID="a5e210e1ece1b7cb88d92cd73ce1261bd023a742628e4cd59a7610d4e53cd7c5" exitCode=0 Dec 01 08:40:37 crc kubenswrapper[5004]: I1201 08:40:37.109297 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-hlrzt" event={"ID":"713ee9f1-6421-4ffa-aed2-5f762d8cba63","Type":"ContainerDied","Data":"a5e210e1ece1b7cb88d92cd73ce1261bd023a742628e4cd59a7610d4e53cd7c5"} Dec 01 08:40:37 crc kubenswrapper[5004]: I1201 08:40:37.223959 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 01 08:40:37 crc kubenswrapper[5004]: I1201 08:40:37.224000 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 01 08:40:37 crc kubenswrapper[5004]: I1201 08:40:37.254398 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 01 08:40:37 crc kubenswrapper[5004]: I1201 08:40:37.280330 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 01 08:40:38 crc kubenswrapper[5004]: I1201 08:40:38.053509 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zgjgq"] Dec 01 08:40:38 crc kubenswrapper[5004]: I1201 08:40:38.056493 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zgjgq" Dec 01 08:40:38 crc kubenswrapper[5004]: I1201 08:40:38.078474 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zgjgq"] Dec 01 08:40:38 crc kubenswrapper[5004]: I1201 08:40:38.125624 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 01 08:40:38 crc kubenswrapper[5004]: I1201 08:40:38.125657 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 01 08:40:38 crc kubenswrapper[5004]: I1201 08:40:38.214103 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4c8bbd5-63c7-441d-9262-b8d44344c7fa-utilities\") pod \"redhat-operators-zgjgq\" (UID: \"f4c8bbd5-63c7-441d-9262-b8d44344c7fa\") " pod="openshift-marketplace/redhat-operators-zgjgq" Dec 01 08:40:38 crc kubenswrapper[5004]: I1201 08:40:38.214364 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgqsj\" (UniqueName: \"kubernetes.io/projected/f4c8bbd5-63c7-441d-9262-b8d44344c7fa-kube-api-access-qgqsj\") pod \"redhat-operators-zgjgq\" (UID: \"f4c8bbd5-63c7-441d-9262-b8d44344c7fa\") " pod="openshift-marketplace/redhat-operators-zgjgq" Dec 01 08:40:38 crc kubenswrapper[5004]: I1201 08:40:38.214460 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4c8bbd5-63c7-441d-9262-b8d44344c7fa-catalog-content\") pod \"redhat-operators-zgjgq\" (UID: \"f4c8bbd5-63c7-441d-9262-b8d44344c7fa\") " pod="openshift-marketplace/redhat-operators-zgjgq" Dec 01 08:40:38 crc kubenswrapper[5004]: I1201 08:40:38.316443 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/f4c8bbd5-63c7-441d-9262-b8d44344c7fa-utilities\") pod \"redhat-operators-zgjgq\" (UID: \"f4c8bbd5-63c7-441d-9262-b8d44344c7fa\") " pod="openshift-marketplace/redhat-operators-zgjgq" Dec 01 08:40:38 crc kubenswrapper[5004]: I1201 08:40:38.316513 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qgqsj\" (UniqueName: \"kubernetes.io/projected/f4c8bbd5-63c7-441d-9262-b8d44344c7fa-kube-api-access-qgqsj\") pod \"redhat-operators-zgjgq\" (UID: \"f4c8bbd5-63c7-441d-9262-b8d44344c7fa\") " pod="openshift-marketplace/redhat-operators-zgjgq" Dec 01 08:40:38 crc kubenswrapper[5004]: I1201 08:40:38.316588 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4c8bbd5-63c7-441d-9262-b8d44344c7fa-catalog-content\") pod \"redhat-operators-zgjgq\" (UID: \"f4c8bbd5-63c7-441d-9262-b8d44344c7fa\") " pod="openshift-marketplace/redhat-operators-zgjgq" Dec 01 08:40:38 crc kubenswrapper[5004]: I1201 08:40:38.317032 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4c8bbd5-63c7-441d-9262-b8d44344c7fa-catalog-content\") pod \"redhat-operators-zgjgq\" (UID: \"f4c8bbd5-63c7-441d-9262-b8d44344c7fa\") " pod="openshift-marketplace/redhat-operators-zgjgq" Dec 01 08:40:38 crc kubenswrapper[5004]: I1201 08:40:38.317290 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4c8bbd5-63c7-441d-9262-b8d44344c7fa-utilities\") pod \"redhat-operators-zgjgq\" (UID: \"f4c8bbd5-63c7-441d-9262-b8d44344c7fa\") " pod="openshift-marketplace/redhat-operators-zgjgq" Dec 01 08:40:38 crc kubenswrapper[5004]: I1201 08:40:38.341114 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgqsj\" (UniqueName: 
\"kubernetes.io/projected/f4c8bbd5-63c7-441d-9262-b8d44344c7fa-kube-api-access-qgqsj\") pod \"redhat-operators-zgjgq\" (UID: \"f4c8bbd5-63c7-441d-9262-b8d44344c7fa\") " pod="openshift-marketplace/redhat-operators-zgjgq" Dec 01 08:40:38 crc kubenswrapper[5004]: I1201 08:40:38.387600 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zgjgq" Dec 01 08:40:38 crc kubenswrapper[5004]: I1201 08:40:38.729767 5004 patch_prober.go:28] interesting pod/machine-config-daemon-fvdgt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 08:40:38 crc kubenswrapper[5004]: I1201 08:40:38.729833 5004 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 08:40:39 crc kubenswrapper[5004]: I1201 08:40:39.365328 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 01 08:40:39 crc kubenswrapper[5004]: I1201 08:40:39.365638 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 01 08:40:39 crc kubenswrapper[5004]: I1201 08:40:39.365649 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 01 08:40:39 crc kubenswrapper[5004]: I1201 08:40:39.365660 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 01 08:40:39 crc kubenswrapper[5004]: I1201 08:40:39.407983 5004 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 01 08:40:39 crc kubenswrapper[5004]: I1201 08:40:39.427796 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 01 08:40:39 crc kubenswrapper[5004]: I1201 08:40:39.983817 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5ccc5c4795-xj8tw" Dec 01 08:40:40 crc kubenswrapper[5004]: I1201 08:40:40.051042 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-9x9h5"] Dec 01 08:40:40 crc kubenswrapper[5004]: I1201 08:40:40.051271 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57c957c4ff-9x9h5" podUID="1b2170ab-37fa-4381-9001-5487eb2a302c" containerName="dnsmasq-dns" containerID="cri-o://ba403a449325a9a9d719d703d7e195585ec5fd7c266e71d64e8d24dde0df14a4" gracePeriod=10 Dec 01 08:40:40 crc kubenswrapper[5004]: I1201 08:40:40.141348 5004 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 01 08:40:40 crc kubenswrapper[5004]: I1201 08:40:40.141815 5004 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 01 08:40:40 crc kubenswrapper[5004]: I1201 08:40:40.343719 5004 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-57c957c4ff-9x9h5" podUID="1b2170ab-37fa-4381-9001-5487eb2a302c" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.182:5353: connect: connection refused" Dec 01 08:40:43 crc kubenswrapper[5004]: I1201 08:40:43.176836 5004 generic.go:334] "Generic (PLEG): container finished" podID="1b2170ab-37fa-4381-9001-5487eb2a302c" containerID="ba403a449325a9a9d719d703d7e195585ec5fd7c266e71d64e8d24dde0df14a4" exitCode=0 Dec 01 08:40:43 crc kubenswrapper[5004]: I1201 08:40:43.176984 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57c957c4ff-9x9h5" 
event={"ID":"1b2170ab-37fa-4381-9001-5487eb2a302c","Type":"ContainerDied","Data":"ba403a449325a9a9d719d703d7e195585ec5fd7c266e71d64e8d24dde0df14a4"} Dec 01 08:40:43 crc kubenswrapper[5004]: I1201 08:40:43.466477 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-hlrzt" Dec 01 08:40:43 crc kubenswrapper[5004]: I1201 08:40:43.643270 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/713ee9f1-6421-4ffa-aed2-5f762d8cba63-credential-keys\") pod \"713ee9f1-6421-4ffa-aed2-5f762d8cba63\" (UID: \"713ee9f1-6421-4ffa-aed2-5f762d8cba63\") " Dec 01 08:40:43 crc kubenswrapper[5004]: I1201 08:40:43.643321 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/713ee9f1-6421-4ffa-aed2-5f762d8cba63-fernet-keys\") pod \"713ee9f1-6421-4ffa-aed2-5f762d8cba63\" (UID: \"713ee9f1-6421-4ffa-aed2-5f762d8cba63\") " Dec 01 08:40:43 crc kubenswrapper[5004]: I1201 08:40:43.643369 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-plz57\" (UniqueName: \"kubernetes.io/projected/713ee9f1-6421-4ffa-aed2-5f762d8cba63-kube-api-access-plz57\") pod \"713ee9f1-6421-4ffa-aed2-5f762d8cba63\" (UID: \"713ee9f1-6421-4ffa-aed2-5f762d8cba63\") " Dec 01 08:40:43 crc kubenswrapper[5004]: I1201 08:40:43.643387 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/713ee9f1-6421-4ffa-aed2-5f762d8cba63-config-data\") pod \"713ee9f1-6421-4ffa-aed2-5f762d8cba63\" (UID: \"713ee9f1-6421-4ffa-aed2-5f762d8cba63\") " Dec 01 08:40:43 crc kubenswrapper[5004]: I1201 08:40:43.643437 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/713ee9f1-6421-4ffa-aed2-5f762d8cba63-combined-ca-bundle\") pod \"713ee9f1-6421-4ffa-aed2-5f762d8cba63\" (UID: \"713ee9f1-6421-4ffa-aed2-5f762d8cba63\") " Dec 01 08:40:43 crc kubenswrapper[5004]: I1201 08:40:43.643478 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/713ee9f1-6421-4ffa-aed2-5f762d8cba63-scripts\") pod \"713ee9f1-6421-4ffa-aed2-5f762d8cba63\" (UID: \"713ee9f1-6421-4ffa-aed2-5f762d8cba63\") " Dec 01 08:40:43 crc kubenswrapper[5004]: I1201 08:40:43.650810 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/713ee9f1-6421-4ffa-aed2-5f762d8cba63-scripts" (OuterVolumeSpecName: "scripts") pod "713ee9f1-6421-4ffa-aed2-5f762d8cba63" (UID: "713ee9f1-6421-4ffa-aed2-5f762d8cba63"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:40:43 crc kubenswrapper[5004]: I1201 08:40:43.665652 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/713ee9f1-6421-4ffa-aed2-5f762d8cba63-kube-api-access-plz57" (OuterVolumeSpecName: "kube-api-access-plz57") pod "713ee9f1-6421-4ffa-aed2-5f762d8cba63" (UID: "713ee9f1-6421-4ffa-aed2-5f762d8cba63"). InnerVolumeSpecName "kube-api-access-plz57". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:40:43 crc kubenswrapper[5004]: I1201 08:40:43.678877 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/713ee9f1-6421-4ffa-aed2-5f762d8cba63-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "713ee9f1-6421-4ffa-aed2-5f762d8cba63" (UID: "713ee9f1-6421-4ffa-aed2-5f762d8cba63"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:40:43 crc kubenswrapper[5004]: I1201 08:40:43.678968 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/713ee9f1-6421-4ffa-aed2-5f762d8cba63-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "713ee9f1-6421-4ffa-aed2-5f762d8cba63" (UID: "713ee9f1-6421-4ffa-aed2-5f762d8cba63"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:40:43 crc kubenswrapper[5004]: I1201 08:40:43.685950 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/713ee9f1-6421-4ffa-aed2-5f762d8cba63-config-data" (OuterVolumeSpecName: "config-data") pod "713ee9f1-6421-4ffa-aed2-5f762d8cba63" (UID: "713ee9f1-6421-4ffa-aed2-5f762d8cba63"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:40:43 crc kubenswrapper[5004]: I1201 08:40:43.704461 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/713ee9f1-6421-4ffa-aed2-5f762d8cba63-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "713ee9f1-6421-4ffa-aed2-5f762d8cba63" (UID: "713ee9f1-6421-4ffa-aed2-5f762d8cba63"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:40:43 crc kubenswrapper[5004]: I1201 08:40:43.746455 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-plz57\" (UniqueName: \"kubernetes.io/projected/713ee9f1-6421-4ffa-aed2-5f762d8cba63-kube-api-access-plz57\") on node \"crc\" DevicePath \"\"" Dec 01 08:40:43 crc kubenswrapper[5004]: I1201 08:40:43.746516 5004 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/713ee9f1-6421-4ffa-aed2-5f762d8cba63-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 08:40:43 crc kubenswrapper[5004]: I1201 08:40:43.746531 5004 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/713ee9f1-6421-4ffa-aed2-5f762d8cba63-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 08:40:43 crc kubenswrapper[5004]: I1201 08:40:43.746550 5004 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/713ee9f1-6421-4ffa-aed2-5f762d8cba63-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 08:40:43 crc kubenswrapper[5004]: I1201 08:40:43.746573 5004 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/713ee9f1-6421-4ffa-aed2-5f762d8cba63-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 01 08:40:43 crc kubenswrapper[5004]: I1201 08:40:43.746583 5004 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/713ee9f1-6421-4ffa-aed2-5f762d8cba63-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 01 08:40:43 crc kubenswrapper[5004]: I1201 08:40:43.897040 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57c957c4ff-9x9h5" Dec 01 08:40:43 crc kubenswrapper[5004]: I1201 08:40:43.956549 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lp58h\" (UniqueName: \"kubernetes.io/projected/1b2170ab-37fa-4381-9001-5487eb2a302c-kube-api-access-lp58h\") pod \"1b2170ab-37fa-4381-9001-5487eb2a302c\" (UID: \"1b2170ab-37fa-4381-9001-5487eb2a302c\") " Dec 01 08:40:43 crc kubenswrapper[5004]: I1201 08:40:43.956876 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1b2170ab-37fa-4381-9001-5487eb2a302c-ovsdbserver-sb\") pod \"1b2170ab-37fa-4381-9001-5487eb2a302c\" (UID: \"1b2170ab-37fa-4381-9001-5487eb2a302c\") " Dec 01 08:40:43 crc kubenswrapper[5004]: I1201 08:40:43.957473 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1b2170ab-37fa-4381-9001-5487eb2a302c-dns-swift-storage-0\") pod \"1b2170ab-37fa-4381-9001-5487eb2a302c\" (UID: \"1b2170ab-37fa-4381-9001-5487eb2a302c\") " Dec 01 08:40:43 crc kubenswrapper[5004]: I1201 08:40:43.957654 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b2170ab-37fa-4381-9001-5487eb2a302c-config\") pod \"1b2170ab-37fa-4381-9001-5487eb2a302c\" (UID: \"1b2170ab-37fa-4381-9001-5487eb2a302c\") " Dec 01 08:40:43 crc kubenswrapper[5004]: I1201 08:40:43.957697 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1b2170ab-37fa-4381-9001-5487eb2a302c-dns-svc\") pod \"1b2170ab-37fa-4381-9001-5487eb2a302c\" (UID: \"1b2170ab-37fa-4381-9001-5487eb2a302c\") " Dec 01 08:40:43 crc kubenswrapper[5004]: I1201 08:40:43.957804 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/1b2170ab-37fa-4381-9001-5487eb2a302c-ovsdbserver-nb\") pod \"1b2170ab-37fa-4381-9001-5487eb2a302c\" (UID: \"1b2170ab-37fa-4381-9001-5487eb2a302c\") " Dec 01 08:40:43 crc kubenswrapper[5004]: I1201 08:40:43.982126 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b2170ab-37fa-4381-9001-5487eb2a302c-kube-api-access-lp58h" (OuterVolumeSpecName: "kube-api-access-lp58h") pod "1b2170ab-37fa-4381-9001-5487eb2a302c" (UID: "1b2170ab-37fa-4381-9001-5487eb2a302c"). InnerVolumeSpecName "kube-api-access-lp58h". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:40:43 crc kubenswrapper[5004]: I1201 08:40:43.989782 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-695797f65d-5b4rw"] Dec 01 08:40:44 crc kubenswrapper[5004]: I1201 08:40:44.060964 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lp58h\" (UniqueName: \"kubernetes.io/projected/1b2170ab-37fa-4381-9001-5487eb2a302c-kube-api-access-lp58h\") on node \"crc\" DevicePath \"\"" Dec 01 08:40:44 crc kubenswrapper[5004]: I1201 08:40:44.061096 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b2170ab-37fa-4381-9001-5487eb2a302c-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "1b2170ab-37fa-4381-9001-5487eb2a302c" (UID: "1b2170ab-37fa-4381-9001-5487eb2a302c"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:40:44 crc kubenswrapper[5004]: I1201 08:40:44.074054 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b2170ab-37fa-4381-9001-5487eb2a302c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1b2170ab-37fa-4381-9001-5487eb2a302c" (UID: "1b2170ab-37fa-4381-9001-5487eb2a302c"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:40:44 crc kubenswrapper[5004]: I1201 08:40:44.103151 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b2170ab-37fa-4381-9001-5487eb2a302c-config" (OuterVolumeSpecName: "config") pod "1b2170ab-37fa-4381-9001-5487eb2a302c" (UID: "1b2170ab-37fa-4381-9001-5487eb2a302c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:40:44 crc kubenswrapper[5004]: I1201 08:40:44.130464 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b2170ab-37fa-4381-9001-5487eb2a302c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1b2170ab-37fa-4381-9001-5487eb2a302c" (UID: "1b2170ab-37fa-4381-9001-5487eb2a302c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:40:44 crc kubenswrapper[5004]: I1201 08:40:44.159961 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b2170ab-37fa-4381-9001-5487eb2a302c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1b2170ab-37fa-4381-9001-5487eb2a302c" (UID: "1b2170ab-37fa-4381-9001-5487eb2a302c"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:40:44 crc kubenswrapper[5004]: I1201 08:40:44.162943 5004 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b2170ab-37fa-4381-9001-5487eb2a302c-config\") on node \"crc\" DevicePath \"\"" Dec 01 08:40:44 crc kubenswrapper[5004]: I1201 08:40:44.162966 5004 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1b2170ab-37fa-4381-9001-5487eb2a302c-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 08:40:44 crc kubenswrapper[5004]: I1201 08:40:44.162977 5004 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1b2170ab-37fa-4381-9001-5487eb2a302c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 01 08:40:44 crc kubenswrapper[5004]: I1201 08:40:44.162988 5004 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1b2170ab-37fa-4381-9001-5487eb2a302c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 01 08:40:44 crc kubenswrapper[5004]: I1201 08:40:44.162996 5004 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1b2170ab-37fa-4381-9001-5487eb2a302c-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 01 08:40:44 crc kubenswrapper[5004]: I1201 08:40:44.187434 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1fd8d826-4dab-4d07-bc04-be5dfebdaf2b","Type":"ContainerStarted","Data":"b30bf82a68f48468230213f790b700cf0f102fb193557f1d0f19a033f9588f8f"} Dec 01 08:40:44 crc kubenswrapper[5004]: I1201 08:40:44.189081 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-cb845f59f-m4f5q" event={"ID":"8d456c05-b8d3-43ca-ae93-9b0a5b111296","Type":"ContainerStarted","Data":"5626580415d8d426f1fe659b8330aac470959b5bbe281a0c390de292a3854442"} Dec 01 
08:40:44 crc kubenswrapper[5004]: I1201 08:40:44.190125 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-cb845f59f-m4f5q" Dec 01 08:40:44 crc kubenswrapper[5004]: I1201 08:40:44.190937 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-695797f65d-5b4rw" event={"ID":"e3715c3c-b54f-4ff0-808a-51dd69614417","Type":"ContainerStarted","Data":"1398a29e2b86ea91073a58be404879207a3ad42cf4f580c002e3e5d4672280ab"} Dec 01 08:40:44 crc kubenswrapper[5004]: I1201 08:40:44.192604 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57c957c4ff-9x9h5" Dec 01 08:40:44 crc kubenswrapper[5004]: I1201 08:40:44.192647 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57c957c4ff-9x9h5" event={"ID":"1b2170ab-37fa-4381-9001-5487eb2a302c","Type":"ContainerDied","Data":"83b6adac67cf58fa3c8096ca7abcbc3d41053a5bd9f2f171c4a7e60af501e3d8"} Dec 01 08:40:44 crc kubenswrapper[5004]: I1201 08:40:44.192692 5004 scope.go:117] "RemoveContainer" containerID="ba403a449325a9a9d719d703d7e195585ec5fd7c266e71d64e8d24dde0df14a4" Dec 01 08:40:44 crc kubenswrapper[5004]: I1201 08:40:44.194863 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-hlrzt" event={"ID":"713ee9f1-6421-4ffa-aed2-5f762d8cba63","Type":"ContainerDied","Data":"64dd8f90d116777fbcbeec0db5b8cc52c3ce3c6508ef78a32e5de3e468f90c91"} Dec 01 08:40:44 crc kubenswrapper[5004]: I1201 08:40:44.194885 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-hlrzt" Dec 01 08:40:44 crc kubenswrapper[5004]: I1201 08:40:44.194901 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="64dd8f90d116777fbcbeec0db5b8cc52c3ce3c6508ef78a32e5de3e468f90c91" Dec 01 08:40:44 crc kubenswrapper[5004]: I1201 08:40:44.196177 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 01 08:40:44 crc kubenswrapper[5004]: I1201 08:40:44.196259 5004 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 01 08:40:44 crc kubenswrapper[5004]: I1201 08:40:44.196793 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-vgmk8" event={"ID":"0eccea77-d6ee-4592-ad47-1f29ca2a943b","Type":"ContainerStarted","Data":"5dda4d99565c4be4d42864bdcdc7309d6c705f0d76d76b289b4046a8e6ef092f"} Dec 01 08:40:44 crc kubenswrapper[5004]: I1201 08:40:44.199014 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 01 08:40:44 crc kubenswrapper[5004]: I1201 08:40:44.215464 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-cb845f59f-m4f5q" podStartSLOduration=12.215441002 podStartE2EDuration="12.215441002s" podCreationTimestamp="2025-12-01 08:40:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:40:44.211326562 +0000 UTC m=+1421.776318544" watchObservedRunningTime="2025-12-01 08:40:44.215441002 +0000 UTC m=+1421.780432984" Dec 01 08:40:44 crc kubenswrapper[5004]: I1201 08:40:44.236718 5004 scope.go:117] "RemoveContainer" containerID="d84618ce559ed8363e13c4dadbdea5e393986932f5f51e7e54a580e72d7e978a" Dec 01 08:40:44 crc kubenswrapper[5004]: I1201 08:40:44.237936 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/barbican-db-sync-vgmk8" podStartSLOduration=3.433028629 podStartE2EDuration="45.237918721s" podCreationTimestamp="2025-12-01 08:39:59 +0000 UTC" firstStartedPulling="2025-12-01 08:40:01.99461107 +0000 UTC m=+1379.559603052" lastFinishedPulling="2025-12-01 08:40:43.799501162 +0000 UTC m=+1421.364493144" observedRunningTime="2025-12-01 08:40:44.237130572 +0000 UTC m=+1421.802122554" watchObservedRunningTime="2025-12-01 08:40:44.237918721 +0000 UTC m=+1421.802910713" Dec 01 08:40:44 crc kubenswrapper[5004]: I1201 08:40:44.260426 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 01 08:40:44 crc kubenswrapper[5004]: I1201 08:40:44.260533 5004 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 01 08:40:44 crc kubenswrapper[5004]: I1201 08:40:44.275196 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 01 08:40:44 crc kubenswrapper[5004]: I1201 08:40:44.284893 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zgjgq"] Dec 01 08:40:44 crc kubenswrapper[5004]: I1201 08:40:44.339348 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-9x9h5"] Dec 01 08:40:44 crc kubenswrapper[5004]: I1201 08:40:44.349720 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-9x9h5"] Dec 01 08:40:44 crc kubenswrapper[5004]: I1201 08:40:44.658399 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-6d7ffc7f79-7hlcz"] Dec 01 08:40:44 crc kubenswrapper[5004]: E1201 08:40:44.659147 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b2170ab-37fa-4381-9001-5487eb2a302c" containerName="dnsmasq-dns" Dec 01 08:40:44 crc kubenswrapper[5004]: I1201 08:40:44.659159 5004 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="1b2170ab-37fa-4381-9001-5487eb2a302c" containerName="dnsmasq-dns" Dec 01 08:40:44 crc kubenswrapper[5004]: E1201 08:40:44.659172 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b2170ab-37fa-4381-9001-5487eb2a302c" containerName="init" Dec 01 08:40:44 crc kubenswrapper[5004]: I1201 08:40:44.659178 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b2170ab-37fa-4381-9001-5487eb2a302c" containerName="init" Dec 01 08:40:44 crc kubenswrapper[5004]: E1201 08:40:44.659204 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="713ee9f1-6421-4ffa-aed2-5f762d8cba63" containerName="keystone-bootstrap" Dec 01 08:40:44 crc kubenswrapper[5004]: I1201 08:40:44.659211 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="713ee9f1-6421-4ffa-aed2-5f762d8cba63" containerName="keystone-bootstrap" Dec 01 08:40:44 crc kubenswrapper[5004]: I1201 08:40:44.659392 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="713ee9f1-6421-4ffa-aed2-5f762d8cba63" containerName="keystone-bootstrap" Dec 01 08:40:44 crc kubenswrapper[5004]: I1201 08:40:44.659415 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b2170ab-37fa-4381-9001-5487eb2a302c" containerName="dnsmasq-dns" Dec 01 08:40:44 crc kubenswrapper[5004]: I1201 08:40:44.660196 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-6d7ffc7f79-7hlcz" Dec 01 08:40:44 crc kubenswrapper[5004]: I1201 08:40:44.671022 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 01 08:40:44 crc kubenswrapper[5004]: I1201 08:40:44.671060 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Dec 01 08:40:44 crc kubenswrapper[5004]: I1201 08:40:44.671299 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Dec 01 08:40:44 crc kubenswrapper[5004]: I1201 08:40:44.671629 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 01 08:40:44 crc kubenswrapper[5004]: I1201 08:40:44.672193 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 01 08:40:44 crc kubenswrapper[5004]: I1201 08:40:44.672368 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-nlkxt" Dec 01 08:40:44 crc kubenswrapper[5004]: I1201 08:40:44.720058 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6d7ffc7f79-7hlcz"] Dec 01 08:40:44 crc kubenswrapper[5004]: I1201 08:40:44.780599 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/78648b2d-8d56-42a1-8152-80e6e1a1b201-internal-tls-certs\") pod \"keystone-6d7ffc7f79-7hlcz\" (UID: \"78648b2d-8d56-42a1-8152-80e6e1a1b201\") " pod="openstack/keystone-6d7ffc7f79-7hlcz" Dec 01 08:40:44 crc kubenswrapper[5004]: I1201 08:40:44.780758 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78648b2d-8d56-42a1-8152-80e6e1a1b201-combined-ca-bundle\") pod \"keystone-6d7ffc7f79-7hlcz\" (UID: \"78648b2d-8d56-42a1-8152-80e6e1a1b201\") " 
pod="openstack/keystone-6d7ffc7f79-7hlcz" Dec 01 08:40:44 crc kubenswrapper[5004]: I1201 08:40:44.780858 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/78648b2d-8d56-42a1-8152-80e6e1a1b201-public-tls-certs\") pod \"keystone-6d7ffc7f79-7hlcz\" (UID: \"78648b2d-8d56-42a1-8152-80e6e1a1b201\") " pod="openstack/keystone-6d7ffc7f79-7hlcz" Dec 01 08:40:44 crc kubenswrapper[5004]: I1201 08:40:44.780959 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/78648b2d-8d56-42a1-8152-80e6e1a1b201-credential-keys\") pod \"keystone-6d7ffc7f79-7hlcz\" (UID: \"78648b2d-8d56-42a1-8152-80e6e1a1b201\") " pod="openstack/keystone-6d7ffc7f79-7hlcz" Dec 01 08:40:44 crc kubenswrapper[5004]: I1201 08:40:44.781055 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78648b2d-8d56-42a1-8152-80e6e1a1b201-scripts\") pod \"keystone-6d7ffc7f79-7hlcz\" (UID: \"78648b2d-8d56-42a1-8152-80e6e1a1b201\") " pod="openstack/keystone-6d7ffc7f79-7hlcz" Dec 01 08:40:44 crc kubenswrapper[5004]: I1201 08:40:44.781224 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78648b2d-8d56-42a1-8152-80e6e1a1b201-config-data\") pod \"keystone-6d7ffc7f79-7hlcz\" (UID: \"78648b2d-8d56-42a1-8152-80e6e1a1b201\") " pod="openstack/keystone-6d7ffc7f79-7hlcz" Dec 01 08:40:44 crc kubenswrapper[5004]: I1201 08:40:44.781329 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjpzr\" (UniqueName: \"kubernetes.io/projected/78648b2d-8d56-42a1-8152-80e6e1a1b201-kube-api-access-gjpzr\") pod \"keystone-6d7ffc7f79-7hlcz\" (UID: \"78648b2d-8d56-42a1-8152-80e6e1a1b201\") " 
pod="openstack/keystone-6d7ffc7f79-7hlcz" Dec 01 08:40:44 crc kubenswrapper[5004]: I1201 08:40:44.781503 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/78648b2d-8d56-42a1-8152-80e6e1a1b201-fernet-keys\") pod \"keystone-6d7ffc7f79-7hlcz\" (UID: \"78648b2d-8d56-42a1-8152-80e6e1a1b201\") " pod="openstack/keystone-6d7ffc7f79-7hlcz" Dec 01 08:40:44 crc kubenswrapper[5004]: I1201 08:40:44.781670 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b2170ab-37fa-4381-9001-5487eb2a302c" path="/var/lib/kubelet/pods/1b2170ab-37fa-4381-9001-5487eb2a302c/volumes" Dec 01 08:40:44 crc kubenswrapper[5004]: I1201 08:40:44.884210 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78648b2d-8d56-42a1-8152-80e6e1a1b201-scripts\") pod \"keystone-6d7ffc7f79-7hlcz\" (UID: \"78648b2d-8d56-42a1-8152-80e6e1a1b201\") " pod="openstack/keystone-6d7ffc7f79-7hlcz" Dec 01 08:40:44 crc kubenswrapper[5004]: I1201 08:40:44.884334 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78648b2d-8d56-42a1-8152-80e6e1a1b201-config-data\") pod \"keystone-6d7ffc7f79-7hlcz\" (UID: \"78648b2d-8d56-42a1-8152-80e6e1a1b201\") " pod="openstack/keystone-6d7ffc7f79-7hlcz" Dec 01 08:40:44 crc kubenswrapper[5004]: I1201 08:40:44.884369 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gjpzr\" (UniqueName: \"kubernetes.io/projected/78648b2d-8d56-42a1-8152-80e6e1a1b201-kube-api-access-gjpzr\") pod \"keystone-6d7ffc7f79-7hlcz\" (UID: \"78648b2d-8d56-42a1-8152-80e6e1a1b201\") " pod="openstack/keystone-6d7ffc7f79-7hlcz" Dec 01 08:40:44 crc kubenswrapper[5004]: I1201 08:40:44.884463 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/78648b2d-8d56-42a1-8152-80e6e1a1b201-fernet-keys\") pod \"keystone-6d7ffc7f79-7hlcz\" (UID: \"78648b2d-8d56-42a1-8152-80e6e1a1b201\") " pod="openstack/keystone-6d7ffc7f79-7hlcz" Dec 01 08:40:44 crc kubenswrapper[5004]: I1201 08:40:44.884515 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/78648b2d-8d56-42a1-8152-80e6e1a1b201-internal-tls-certs\") pod \"keystone-6d7ffc7f79-7hlcz\" (UID: \"78648b2d-8d56-42a1-8152-80e6e1a1b201\") " pod="openstack/keystone-6d7ffc7f79-7hlcz" Dec 01 08:40:44 crc kubenswrapper[5004]: I1201 08:40:44.884534 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78648b2d-8d56-42a1-8152-80e6e1a1b201-combined-ca-bundle\") pod \"keystone-6d7ffc7f79-7hlcz\" (UID: \"78648b2d-8d56-42a1-8152-80e6e1a1b201\") " pod="openstack/keystone-6d7ffc7f79-7hlcz" Dec 01 08:40:44 crc kubenswrapper[5004]: I1201 08:40:44.884556 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/78648b2d-8d56-42a1-8152-80e6e1a1b201-public-tls-certs\") pod \"keystone-6d7ffc7f79-7hlcz\" (UID: \"78648b2d-8d56-42a1-8152-80e6e1a1b201\") " pod="openstack/keystone-6d7ffc7f79-7hlcz" Dec 01 08:40:44 crc kubenswrapper[5004]: I1201 08:40:44.884595 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/78648b2d-8d56-42a1-8152-80e6e1a1b201-credential-keys\") pod \"keystone-6d7ffc7f79-7hlcz\" (UID: \"78648b2d-8d56-42a1-8152-80e6e1a1b201\") " pod="openstack/keystone-6d7ffc7f79-7hlcz" Dec 01 08:40:44 crc kubenswrapper[5004]: I1201 08:40:44.890347 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/78648b2d-8d56-42a1-8152-80e6e1a1b201-fernet-keys\") pod 
\"keystone-6d7ffc7f79-7hlcz\" (UID: \"78648b2d-8d56-42a1-8152-80e6e1a1b201\") " pod="openstack/keystone-6d7ffc7f79-7hlcz" Dec 01 08:40:44 crc kubenswrapper[5004]: I1201 08:40:44.890619 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78648b2d-8d56-42a1-8152-80e6e1a1b201-config-data\") pod \"keystone-6d7ffc7f79-7hlcz\" (UID: \"78648b2d-8d56-42a1-8152-80e6e1a1b201\") " pod="openstack/keystone-6d7ffc7f79-7hlcz" Dec 01 08:40:44 crc kubenswrapper[5004]: I1201 08:40:44.891789 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/78648b2d-8d56-42a1-8152-80e6e1a1b201-credential-keys\") pod \"keystone-6d7ffc7f79-7hlcz\" (UID: \"78648b2d-8d56-42a1-8152-80e6e1a1b201\") " pod="openstack/keystone-6d7ffc7f79-7hlcz" Dec 01 08:40:44 crc kubenswrapper[5004]: I1201 08:40:44.893282 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78648b2d-8d56-42a1-8152-80e6e1a1b201-scripts\") pod \"keystone-6d7ffc7f79-7hlcz\" (UID: \"78648b2d-8d56-42a1-8152-80e6e1a1b201\") " pod="openstack/keystone-6d7ffc7f79-7hlcz" Dec 01 08:40:44 crc kubenswrapper[5004]: I1201 08:40:44.894456 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/78648b2d-8d56-42a1-8152-80e6e1a1b201-public-tls-certs\") pod \"keystone-6d7ffc7f79-7hlcz\" (UID: \"78648b2d-8d56-42a1-8152-80e6e1a1b201\") " pod="openstack/keystone-6d7ffc7f79-7hlcz" Dec 01 08:40:44 crc kubenswrapper[5004]: I1201 08:40:44.894937 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78648b2d-8d56-42a1-8152-80e6e1a1b201-combined-ca-bundle\") pod \"keystone-6d7ffc7f79-7hlcz\" (UID: \"78648b2d-8d56-42a1-8152-80e6e1a1b201\") " pod="openstack/keystone-6d7ffc7f79-7hlcz" Dec 01 08:40:44 crc 
kubenswrapper[5004]: I1201 08:40:44.895139 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/78648b2d-8d56-42a1-8152-80e6e1a1b201-internal-tls-certs\") pod \"keystone-6d7ffc7f79-7hlcz\" (UID: \"78648b2d-8d56-42a1-8152-80e6e1a1b201\") " pod="openstack/keystone-6d7ffc7f79-7hlcz" Dec 01 08:40:44 crc kubenswrapper[5004]: I1201 08:40:44.899060 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjpzr\" (UniqueName: \"kubernetes.io/projected/78648b2d-8d56-42a1-8152-80e6e1a1b201-kube-api-access-gjpzr\") pod \"keystone-6d7ffc7f79-7hlcz\" (UID: \"78648b2d-8d56-42a1-8152-80e6e1a1b201\") " pod="openstack/keystone-6d7ffc7f79-7hlcz" Dec 01 08:40:45 crc kubenswrapper[5004]: I1201 08:40:45.029719 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-6d7ffc7f79-7hlcz" Dec 01 08:40:45 crc kubenswrapper[5004]: I1201 08:40:45.214492 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-cwgb2" event={"ID":"fb372dfc-6007-42ba-bc16-96f7d99d8b98","Type":"ContainerStarted","Data":"84bc7259ce202f981795b79e6494280f52d4b5a4b6161309b9ecbae984c2aca9"} Dec 01 08:40:45 crc kubenswrapper[5004]: I1201 08:40:45.227816 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-695797f65d-5b4rw" event={"ID":"e3715c3c-b54f-4ff0-808a-51dd69614417","Type":"ContainerStarted","Data":"19abccc07de5cc525dcd230d5562c960b1505cbab4f7ed862ee1379a6a53c1db"} Dec 01 08:40:45 crc kubenswrapper[5004]: I1201 08:40:45.227871 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-695797f65d-5b4rw" event={"ID":"e3715c3c-b54f-4ff0-808a-51dd69614417","Type":"ContainerStarted","Data":"5d8ae62d6f9eb08092662f0a3f807bc2e4418156426b3ee6373d8b58fef4f576"} Dec 01 08:40:45 crc kubenswrapper[5004]: I1201 08:40:45.228182 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/placement-695797f65d-5b4rw" Dec 01 08:40:45 crc kubenswrapper[5004]: I1201 08:40:45.229208 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-695797f65d-5b4rw" Dec 01 08:40:45 crc kubenswrapper[5004]: I1201 08:40:45.231478 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-sync-cwgb2" podStartSLOduration=2.978099446 podStartE2EDuration="46.23146192s" podCreationTimestamp="2025-12-01 08:39:59 +0000 UTC" firstStartedPulling="2025-12-01 08:40:01.111372175 +0000 UTC m=+1378.676364157" lastFinishedPulling="2025-12-01 08:40:44.364734649 +0000 UTC m=+1421.929726631" observedRunningTime="2025-12-01 08:40:45.227684678 +0000 UTC m=+1422.792676670" watchObservedRunningTime="2025-12-01 08:40:45.23146192 +0000 UTC m=+1422.796453902" Dec 01 08:40:45 crc kubenswrapper[5004]: I1201 08:40:45.232212 5004 generic.go:334] "Generic (PLEG): container finished" podID="f4c8bbd5-63c7-441d-9262-b8d44344c7fa" containerID="45265fc64361a745488e972896546d0437e068e42072dd4208f3398b553b8c96" exitCode=0 Dec 01 08:40:45 crc kubenswrapper[5004]: I1201 08:40:45.232271 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zgjgq" event={"ID":"f4c8bbd5-63c7-441d-9262-b8d44344c7fa","Type":"ContainerDied","Data":"45265fc64361a745488e972896546d0437e068e42072dd4208f3398b553b8c96"} Dec 01 08:40:45 crc kubenswrapper[5004]: I1201 08:40:45.232296 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zgjgq" event={"ID":"f4c8bbd5-63c7-441d-9262-b8d44344c7fa","Type":"ContainerStarted","Data":"d5d8f75474011431ddcc351c020de455cc92b0948d1d49ab2a28001dc19aaba4"} Dec 01 08:40:45 crc kubenswrapper[5004]: I1201 08:40:45.299803 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-695797f65d-5b4rw" podStartSLOduration=9.299782409 podStartE2EDuration="9.299782409s" 
podCreationTimestamp="2025-12-01 08:40:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:40:45.249770217 +0000 UTC m=+1422.814762219" watchObservedRunningTime="2025-12-01 08:40:45.299782409 +0000 UTC m=+1422.864774401" Dec 01 08:40:45 crc kubenswrapper[5004]: I1201 08:40:45.611599 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6d7ffc7f79-7hlcz"] Dec 01 08:40:46 crc kubenswrapper[5004]: I1201 08:40:46.256733 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-ln7f8" event={"ID":"165d617f-a220-49b1-af2b-65d4c509962c","Type":"ContainerStarted","Data":"e56b5fd8e27ba254e70b7ec601bc6d3d8f18ae99c0d1825e9b01b4807757acbf"} Dec 01 08:40:46 crc kubenswrapper[5004]: I1201 08:40:46.260594 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6d7ffc7f79-7hlcz" event={"ID":"78648b2d-8d56-42a1-8152-80e6e1a1b201","Type":"ContainerStarted","Data":"08048385e133c135efde013a8878d39bfb6ed6e93ce76258b44bdfc406a9a1d3"} Dec 01 08:40:46 crc kubenswrapper[5004]: I1201 08:40:46.260653 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6d7ffc7f79-7hlcz" event={"ID":"78648b2d-8d56-42a1-8152-80e6e1a1b201","Type":"ContainerStarted","Data":"bdba1184df0004ab3e4a12a52dae39f2159239a55feabf30d0930474613b729a"} Dec 01 08:40:46 crc kubenswrapper[5004]: I1201 08:40:46.278115 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-ln7f8" podStartSLOduration=3.149625194 podStartE2EDuration="47.278094885s" podCreationTimestamp="2025-12-01 08:39:59 +0000 UTC" firstStartedPulling="2025-12-01 08:40:01.183837894 +0000 UTC m=+1378.748829876" lastFinishedPulling="2025-12-01 08:40:45.312307585 +0000 UTC m=+1422.877299567" observedRunningTime="2025-12-01 08:40:46.275378789 +0000 UTC m=+1423.840370771" watchObservedRunningTime="2025-12-01 
08:40:46.278094885 +0000 UTC m=+1423.843086867" Dec 01 08:40:46 crc kubenswrapper[5004]: I1201 08:40:46.296660 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-6d7ffc7f79-7hlcz" podStartSLOduration=2.296635649 podStartE2EDuration="2.296635649s" podCreationTimestamp="2025-12-01 08:40:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:40:46.294632309 +0000 UTC m=+1423.859624301" watchObservedRunningTime="2025-12-01 08:40:46.296635649 +0000 UTC m=+1423.861627631" Dec 01 08:40:47 crc kubenswrapper[5004]: I1201 08:40:47.272099 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zgjgq" event={"ID":"f4c8bbd5-63c7-441d-9262-b8d44344c7fa","Type":"ContainerStarted","Data":"bdeeb795990c8564109564e8080740677ec73d83420e3c4b12b77bda869a9f7f"} Dec 01 08:40:47 crc kubenswrapper[5004]: I1201 08:40:47.273072 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-6d7ffc7f79-7hlcz" Dec 01 08:40:48 crc kubenswrapper[5004]: I1201 08:40:48.286995 5004 generic.go:334] "Generic (PLEG): container finished" podID="f4c8bbd5-63c7-441d-9262-b8d44344c7fa" containerID="bdeeb795990c8564109564e8080740677ec73d83420e3c4b12b77bda869a9f7f" exitCode=0 Dec 01 08:40:48 crc kubenswrapper[5004]: I1201 08:40:48.287213 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zgjgq" event={"ID":"f4c8bbd5-63c7-441d-9262-b8d44344c7fa","Type":"ContainerDied","Data":"bdeeb795990c8564109564e8080740677ec73d83420e3c4b12b77bda869a9f7f"} Dec 01 08:40:49 crc kubenswrapper[5004]: I1201 08:40:49.304057 5004 generic.go:334] "Generic (PLEG): container finished" podID="0eccea77-d6ee-4592-ad47-1f29ca2a943b" containerID="5dda4d99565c4be4d42864bdcdc7309d6c705f0d76d76b289b4046a8e6ef092f" exitCode=0 Dec 01 08:40:49 crc kubenswrapper[5004]: I1201 
08:40:49.304124 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-vgmk8" event={"ID":"0eccea77-d6ee-4592-ad47-1f29ca2a943b","Type":"ContainerDied","Data":"5dda4d99565c4be4d42864bdcdc7309d6c705f0d76d76b289b4046a8e6ef092f"} Dec 01 08:40:50 crc kubenswrapper[5004]: I1201 08:40:50.318101 5004 generic.go:334] "Generic (PLEG): container finished" podID="fb372dfc-6007-42ba-bc16-96f7d99d8b98" containerID="84bc7259ce202f981795b79e6494280f52d4b5a4b6161309b9ecbae984c2aca9" exitCode=0 Dec 01 08:40:50 crc kubenswrapper[5004]: I1201 08:40:50.318143 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-cwgb2" event={"ID":"fb372dfc-6007-42ba-bc16-96f7d99d8b98","Type":"ContainerDied","Data":"84bc7259ce202f981795b79e6494280f52d4b5a4b6161309b9ecbae984c2aca9"} Dec 01 08:40:51 crc kubenswrapper[5004]: I1201 08:40:51.336597 5004 generic.go:334] "Generic (PLEG): container finished" podID="165d617f-a220-49b1-af2b-65d4c509962c" containerID="e56b5fd8e27ba254e70b7ec601bc6d3d8f18ae99c0d1825e9b01b4807757acbf" exitCode=0 Dec 01 08:40:51 crc kubenswrapper[5004]: I1201 08:40:51.336705 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-ln7f8" event={"ID":"165d617f-a220-49b1-af2b-65d4c509962c","Type":"ContainerDied","Data":"e56b5fd8e27ba254e70b7ec601bc6d3d8f18ae99c0d1825e9b01b4807757acbf"} Dec 01 08:40:52 crc kubenswrapper[5004]: I1201 08:40:52.628830 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-vgmk8" Dec 01 08:40:52 crc kubenswrapper[5004]: I1201 08:40:52.657015 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-cwgb2" Dec 01 08:40:52 crc kubenswrapper[5004]: I1201 08:40:52.738634 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-ln7f8" Dec 01 08:40:52 crc kubenswrapper[5004]: I1201 08:40:52.781065 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb372dfc-6007-42ba-bc16-96f7d99d8b98-combined-ca-bundle\") pod \"fb372dfc-6007-42ba-bc16-96f7d99d8b98\" (UID: \"fb372dfc-6007-42ba-bc16-96f7d99d8b98\") " Dec 01 08:40:52 crc kubenswrapper[5004]: I1201 08:40:52.781155 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0eccea77-d6ee-4592-ad47-1f29ca2a943b-db-sync-config-data\") pod \"0eccea77-d6ee-4592-ad47-1f29ca2a943b\" (UID: \"0eccea77-d6ee-4592-ad47-1f29ca2a943b\") " Dec 01 08:40:52 crc kubenswrapper[5004]: I1201 08:40:52.781201 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lv29d\" (UniqueName: \"kubernetes.io/projected/0eccea77-d6ee-4592-ad47-1f29ca2a943b-kube-api-access-lv29d\") pod \"0eccea77-d6ee-4592-ad47-1f29ca2a943b\" (UID: \"0eccea77-d6ee-4592-ad47-1f29ca2a943b\") " Dec 01 08:40:52 crc kubenswrapper[5004]: I1201 08:40:52.781262 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb372dfc-6007-42ba-bc16-96f7d99d8b98-config-data\") pod \"fb372dfc-6007-42ba-bc16-96f7d99d8b98\" (UID: \"fb372dfc-6007-42ba-bc16-96f7d99d8b98\") " Dec 01 08:40:52 crc kubenswrapper[5004]: I1201 08:40:52.781290 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0eccea77-d6ee-4592-ad47-1f29ca2a943b-combined-ca-bundle\") pod \"0eccea77-d6ee-4592-ad47-1f29ca2a943b\" (UID: \"0eccea77-d6ee-4592-ad47-1f29ca2a943b\") " Dec 01 08:40:52 crc kubenswrapper[5004]: I1201 08:40:52.781425 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-gs5vj\" (UniqueName: \"kubernetes.io/projected/fb372dfc-6007-42ba-bc16-96f7d99d8b98-kube-api-access-gs5vj\") pod \"fb372dfc-6007-42ba-bc16-96f7d99d8b98\" (UID: \"fb372dfc-6007-42ba-bc16-96f7d99d8b98\") " Dec 01 08:40:52 crc kubenswrapper[5004]: E1201 08:40:52.782168 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="1fd8d826-4dab-4d07-bc04-be5dfebdaf2b" Dec 01 08:40:52 crc kubenswrapper[5004]: I1201 08:40:52.787061 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0eccea77-d6ee-4592-ad47-1f29ca2a943b-kube-api-access-lv29d" (OuterVolumeSpecName: "kube-api-access-lv29d") pod "0eccea77-d6ee-4592-ad47-1f29ca2a943b" (UID: "0eccea77-d6ee-4592-ad47-1f29ca2a943b"). InnerVolumeSpecName "kube-api-access-lv29d". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:40:52 crc kubenswrapper[5004]: I1201 08:40:52.787175 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0eccea77-d6ee-4592-ad47-1f29ca2a943b-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "0eccea77-d6ee-4592-ad47-1f29ca2a943b" (UID: "0eccea77-d6ee-4592-ad47-1f29ca2a943b"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:40:52 crc kubenswrapper[5004]: I1201 08:40:52.788637 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb372dfc-6007-42ba-bc16-96f7d99d8b98-kube-api-access-gs5vj" (OuterVolumeSpecName: "kube-api-access-gs5vj") pod "fb372dfc-6007-42ba-bc16-96f7d99d8b98" (UID: "fb372dfc-6007-42ba-bc16-96f7d99d8b98"). InnerVolumeSpecName "kube-api-access-gs5vj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:40:52 crc kubenswrapper[5004]: I1201 08:40:52.819027 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb372dfc-6007-42ba-bc16-96f7d99d8b98-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fb372dfc-6007-42ba-bc16-96f7d99d8b98" (UID: "fb372dfc-6007-42ba-bc16-96f7d99d8b98"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:40:52 crc kubenswrapper[5004]: I1201 08:40:52.822529 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0eccea77-d6ee-4592-ad47-1f29ca2a943b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0eccea77-d6ee-4592-ad47-1f29ca2a943b" (UID: "0eccea77-d6ee-4592-ad47-1f29ca2a943b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:40:52 crc kubenswrapper[5004]: I1201 08:40:52.862710 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb372dfc-6007-42ba-bc16-96f7d99d8b98-config-data" (OuterVolumeSpecName: "config-data") pod "fb372dfc-6007-42ba-bc16-96f7d99d8b98" (UID: "fb372dfc-6007-42ba-bc16-96f7d99d8b98"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:40:52 crc kubenswrapper[5004]: I1201 08:40:52.883634 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r2hpl\" (UniqueName: \"kubernetes.io/projected/165d617f-a220-49b1-af2b-65d4c509962c-kube-api-access-r2hpl\") pod \"165d617f-a220-49b1-af2b-65d4c509962c\" (UID: \"165d617f-a220-49b1-af2b-65d4c509962c\") " Dec 01 08:40:52 crc kubenswrapper[5004]: I1201 08:40:52.883704 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/165d617f-a220-49b1-af2b-65d4c509962c-config-data\") pod \"165d617f-a220-49b1-af2b-65d4c509962c\" (UID: \"165d617f-a220-49b1-af2b-65d4c509962c\") " Dec 01 08:40:52 crc kubenswrapper[5004]: I1201 08:40:52.883746 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/165d617f-a220-49b1-af2b-65d4c509962c-etc-machine-id\") pod \"165d617f-a220-49b1-af2b-65d4c509962c\" (UID: \"165d617f-a220-49b1-af2b-65d4c509962c\") " Dec 01 08:40:52 crc kubenswrapper[5004]: I1201 08:40:52.883772 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/165d617f-a220-49b1-af2b-65d4c509962c-db-sync-config-data\") pod \"165d617f-a220-49b1-af2b-65d4c509962c\" (UID: \"165d617f-a220-49b1-af2b-65d4c509962c\") " Dec 01 08:40:52 crc kubenswrapper[5004]: I1201 08:40:52.883801 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/165d617f-a220-49b1-af2b-65d4c509962c-scripts\") pod \"165d617f-a220-49b1-af2b-65d4c509962c\" (UID: \"165d617f-a220-49b1-af2b-65d4c509962c\") " Dec 01 08:40:52 crc kubenswrapper[5004]: I1201 08:40:52.883861 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/165d617f-a220-49b1-af2b-65d4c509962c-combined-ca-bundle\") pod \"165d617f-a220-49b1-af2b-65d4c509962c\" (UID: \"165d617f-a220-49b1-af2b-65d4c509962c\") " Dec 01 08:40:52 crc kubenswrapper[5004]: I1201 08:40:52.884404 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gs5vj\" (UniqueName: \"kubernetes.io/projected/fb372dfc-6007-42ba-bc16-96f7d99d8b98-kube-api-access-gs5vj\") on node \"crc\" DevicePath \"\"" Dec 01 08:40:52 crc kubenswrapper[5004]: I1201 08:40:52.884424 5004 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb372dfc-6007-42ba-bc16-96f7d99d8b98-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 08:40:52 crc kubenswrapper[5004]: I1201 08:40:52.884434 5004 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0eccea77-d6ee-4592-ad47-1f29ca2a943b-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 08:40:52 crc kubenswrapper[5004]: I1201 08:40:52.884444 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lv29d\" (UniqueName: \"kubernetes.io/projected/0eccea77-d6ee-4592-ad47-1f29ca2a943b-kube-api-access-lv29d\") on node \"crc\" DevicePath \"\"" Dec 01 08:40:52 crc kubenswrapper[5004]: I1201 08:40:52.884458 5004 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb372dfc-6007-42ba-bc16-96f7d99d8b98-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 08:40:52 crc kubenswrapper[5004]: I1201 08:40:52.884467 5004 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0eccea77-d6ee-4592-ad47-1f29ca2a943b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 08:40:52 crc kubenswrapper[5004]: I1201 08:40:52.883847 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/165d617f-a220-49b1-af2b-65d4c509962c-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "165d617f-a220-49b1-af2b-65d4c509962c" (UID: "165d617f-a220-49b1-af2b-65d4c509962c"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 08:40:52 crc kubenswrapper[5004]: I1201 08:40:52.887089 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/165d617f-a220-49b1-af2b-65d4c509962c-scripts" (OuterVolumeSpecName: "scripts") pod "165d617f-a220-49b1-af2b-65d4c509962c" (UID: "165d617f-a220-49b1-af2b-65d4c509962c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:40:52 crc kubenswrapper[5004]: I1201 08:40:52.888789 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/165d617f-a220-49b1-af2b-65d4c509962c-kube-api-access-r2hpl" (OuterVolumeSpecName: "kube-api-access-r2hpl") pod "165d617f-a220-49b1-af2b-65d4c509962c" (UID: "165d617f-a220-49b1-af2b-65d4c509962c"). InnerVolumeSpecName "kube-api-access-r2hpl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:40:52 crc kubenswrapper[5004]: I1201 08:40:52.888898 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/165d617f-a220-49b1-af2b-65d4c509962c-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "165d617f-a220-49b1-af2b-65d4c509962c" (UID: "165d617f-a220-49b1-af2b-65d4c509962c"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:40:52 crc kubenswrapper[5004]: I1201 08:40:52.912254 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/165d617f-a220-49b1-af2b-65d4c509962c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "165d617f-a220-49b1-af2b-65d4c509962c" (UID: "165d617f-a220-49b1-af2b-65d4c509962c"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:40:52 crc kubenswrapper[5004]: I1201 08:40:52.935875 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/165d617f-a220-49b1-af2b-65d4c509962c-config-data" (OuterVolumeSpecName: "config-data") pod "165d617f-a220-49b1-af2b-65d4c509962c" (UID: "165d617f-a220-49b1-af2b-65d4c509962c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:40:52 crc kubenswrapper[5004]: I1201 08:40:52.986645 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r2hpl\" (UniqueName: \"kubernetes.io/projected/165d617f-a220-49b1-af2b-65d4c509962c-kube-api-access-r2hpl\") on node \"crc\" DevicePath \"\"" Dec 01 08:40:52 crc kubenswrapper[5004]: I1201 08:40:52.986696 5004 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/165d617f-a220-49b1-af2b-65d4c509962c-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 08:40:52 crc kubenswrapper[5004]: I1201 08:40:52.986715 5004 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/165d617f-a220-49b1-af2b-65d4c509962c-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 01 08:40:52 crc kubenswrapper[5004]: I1201 08:40:52.986729 5004 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/165d617f-a220-49b1-af2b-65d4c509962c-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 08:40:52 crc kubenswrapper[5004]: I1201 08:40:52.986743 5004 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/165d617f-a220-49b1-af2b-65d4c509962c-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 08:40:52 crc kubenswrapper[5004]: I1201 08:40:52.986758 5004 reconciler_common.go:293] "Volume detached for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/165d617f-a220-49b1-af2b-65d4c509962c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 08:40:53 crc kubenswrapper[5004]: I1201 08:40:53.364826 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-vgmk8" event={"ID":"0eccea77-d6ee-4592-ad47-1f29ca2a943b","Type":"ContainerDied","Data":"54b4d25d0fb7243c4ea6c7d5b0e752b0b37710e32def63324a9c28d342d47d0a"} Dec 01 08:40:53 crc kubenswrapper[5004]: I1201 08:40:53.364864 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-vgmk8" Dec 01 08:40:53 crc kubenswrapper[5004]: I1201 08:40:53.364886 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="54b4d25d0fb7243c4ea6c7d5b0e752b0b37710e32def63324a9c28d342d47d0a" Dec 01 08:40:53 crc kubenswrapper[5004]: I1201 08:40:53.368497 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1fd8d826-4dab-4d07-bc04-be5dfebdaf2b","Type":"ContainerStarted","Data":"65429ecf681ad84a20e124bf7210bd69eac351831158bdd2b9f27340b34a19da"} Dec 01 08:40:53 crc kubenswrapper[5004]: I1201 08:40:53.368927 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1fd8d826-4dab-4d07-bc04-be5dfebdaf2b" containerName="proxy-httpd" containerID="cri-o://65429ecf681ad84a20e124bf7210bd69eac351831158bdd2b9f27340b34a19da" gracePeriod=30 Dec 01 08:40:53 crc kubenswrapper[5004]: I1201 08:40:53.368919 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1fd8d826-4dab-4d07-bc04-be5dfebdaf2b" containerName="ceilometer-notification-agent" containerID="cri-o://e955dd78612c15b9cc804f7c4d3a8fa1a731e8687153b47cd4fdd2a0719e29eb" gracePeriod=30 Dec 01 08:40:53 crc kubenswrapper[5004]: I1201 08:40:53.368954 5004 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/ceilometer-0" podUID="1fd8d826-4dab-4d07-bc04-be5dfebdaf2b" containerName="sg-core" containerID="cri-o://b30bf82a68f48468230213f790b700cf0f102fb193557f1d0f19a033f9588f8f" gracePeriod=30 Dec 01 08:40:53 crc kubenswrapper[5004]: I1201 08:40:53.369221 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 01 08:40:53 crc kubenswrapper[5004]: I1201 08:40:53.374124 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-ln7f8" Dec 01 08:40:53 crc kubenswrapper[5004]: I1201 08:40:53.376235 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-ln7f8" event={"ID":"165d617f-a220-49b1-af2b-65d4c509962c","Type":"ContainerDied","Data":"c8eeda69a5e9d51194fda5a9b50a5deb686408debce833088062e32b39c2f37c"} Dec 01 08:40:53 crc kubenswrapper[5004]: I1201 08:40:53.376274 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c8eeda69a5e9d51194fda5a9b50a5deb686408debce833088062e32b39c2f37c" Dec 01 08:40:53 crc kubenswrapper[5004]: I1201 08:40:53.398529 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zgjgq" event={"ID":"f4c8bbd5-63c7-441d-9262-b8d44344c7fa","Type":"ContainerStarted","Data":"1ce112820d5acbe5229e48d53907309311ce33034eb42872b1f0cde08253dc00"} Dec 01 08:40:53 crc kubenswrapper[5004]: I1201 08:40:53.408520 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-cwgb2" event={"ID":"fb372dfc-6007-42ba-bc16-96f7d99d8b98","Type":"ContainerDied","Data":"fe1e31d629a397126b9d8027ebe24cb48f5c84b792ad9b84966980d51c4ca396"} Dec 01 08:40:53 crc kubenswrapper[5004]: I1201 08:40:53.408587 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fe1e31d629a397126b9d8027ebe24cb48f5c84b792ad9b84966980d51c4ca396" Dec 01 08:40:53 crc kubenswrapper[5004]: I1201 08:40:53.408676 5004 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-cwgb2" Dec 01 08:40:53 crc kubenswrapper[5004]: I1201 08:40:53.450167 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zgjgq" podStartSLOduration=8.232531193 podStartE2EDuration="15.450143053s" podCreationTimestamp="2025-12-01 08:40:38 +0000 UTC" firstStartedPulling="2025-12-01 08:40:45.234442873 +0000 UTC m=+1422.799434855" lastFinishedPulling="2025-12-01 08:40:52.452054733 +0000 UTC m=+1430.017046715" observedRunningTime="2025-12-01 08:40:53.440955528 +0000 UTC m=+1431.005947510" watchObservedRunningTime="2025-12-01 08:40:53.450143053 +0000 UTC m=+1431.015135035" Dec 01 08:40:53 crc kubenswrapper[5004]: I1201 08:40:53.584309 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 01 08:40:53 crc kubenswrapper[5004]: E1201 08:40:53.584755 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0eccea77-d6ee-4592-ad47-1f29ca2a943b" containerName="barbican-db-sync" Dec 01 08:40:53 crc kubenswrapper[5004]: I1201 08:40:53.584771 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="0eccea77-d6ee-4592-ad47-1f29ca2a943b" containerName="barbican-db-sync" Dec 01 08:40:53 crc kubenswrapper[5004]: E1201 08:40:53.584806 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="165d617f-a220-49b1-af2b-65d4c509962c" containerName="cinder-db-sync" Dec 01 08:40:53 crc kubenswrapper[5004]: I1201 08:40:53.584813 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="165d617f-a220-49b1-af2b-65d4c509962c" containerName="cinder-db-sync" Dec 01 08:40:53 crc kubenswrapper[5004]: E1201 08:40:53.584836 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb372dfc-6007-42ba-bc16-96f7d99d8b98" containerName="heat-db-sync" Dec 01 08:40:53 crc kubenswrapper[5004]: I1201 08:40:53.584842 5004 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="fb372dfc-6007-42ba-bc16-96f7d99d8b98" containerName="heat-db-sync" Dec 01 08:40:53 crc kubenswrapper[5004]: I1201 08:40:53.585040 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="165d617f-a220-49b1-af2b-65d4c509962c" containerName="cinder-db-sync" Dec 01 08:40:53 crc kubenswrapper[5004]: I1201 08:40:53.585058 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb372dfc-6007-42ba-bc16-96f7d99d8b98" containerName="heat-db-sync" Dec 01 08:40:53 crc kubenswrapper[5004]: I1201 08:40:53.585073 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="0eccea77-d6ee-4592-ad47-1f29ca2a943b" containerName="barbican-db-sync" Dec 01 08:40:53 crc kubenswrapper[5004]: I1201 08:40:53.586133 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 01 08:40:53 crc kubenswrapper[5004]: I1201 08:40:53.589705 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 01 08:40:53 crc kubenswrapper[5004]: I1201 08:40:53.589934 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 01 08:40:53 crc kubenswrapper[5004]: I1201 08:40:53.590251 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 01 08:40:53 crc kubenswrapper[5004]: I1201 08:40:53.592392 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-srsdl" Dec 01 08:40:53 crc kubenswrapper[5004]: I1201 08:40:53.604787 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 01 08:40:53 crc kubenswrapper[5004]: I1201 08:40:53.704280 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1c910a71-8445-4c64-a555-433bb2c60bbd-config-data-custom\") pod \"cinder-scheduler-0\" (UID: 
\"1c910a71-8445-4c64-a555-433bb2c60bbd\") " pod="openstack/cinder-scheduler-0" Dec 01 08:40:53 crc kubenswrapper[5004]: I1201 08:40:53.704330 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1c910a71-8445-4c64-a555-433bb2c60bbd-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"1c910a71-8445-4c64-a555-433bb2c60bbd\") " pod="openstack/cinder-scheduler-0" Dec 01 08:40:53 crc kubenswrapper[5004]: I1201 08:40:53.704362 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c910a71-8445-4c64-a555-433bb2c60bbd-scripts\") pod \"cinder-scheduler-0\" (UID: \"1c910a71-8445-4c64-a555-433bb2c60bbd\") " pod="openstack/cinder-scheduler-0" Dec 01 08:40:53 crc kubenswrapper[5004]: I1201 08:40:53.704396 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c910a71-8445-4c64-a555-433bb2c60bbd-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"1c910a71-8445-4c64-a555-433bb2c60bbd\") " pod="openstack/cinder-scheduler-0" Dec 01 08:40:53 crc kubenswrapper[5004]: I1201 08:40:53.704440 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcvdg\" (UniqueName: \"kubernetes.io/projected/1c910a71-8445-4c64-a555-433bb2c60bbd-kube-api-access-fcvdg\") pod \"cinder-scheduler-0\" (UID: \"1c910a71-8445-4c64-a555-433bb2c60bbd\") " pod="openstack/cinder-scheduler-0" Dec 01 08:40:53 crc kubenswrapper[5004]: I1201 08:40:53.704475 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c910a71-8445-4c64-a555-433bb2c60bbd-config-data\") pod \"cinder-scheduler-0\" (UID: \"1c910a71-8445-4c64-a555-433bb2c60bbd\") " 
pod="openstack/cinder-scheduler-0" Dec 01 08:40:53 crc kubenswrapper[5004]: I1201 08:40:53.707869 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8775748c9-7mdhw"] Dec 01 08:40:53 crc kubenswrapper[5004]: I1201 08:40:53.709802 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8775748c9-7mdhw" Dec 01 08:40:53 crc kubenswrapper[5004]: I1201 08:40:53.717655 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8775748c9-7mdhw"] Dec 01 08:40:53 crc kubenswrapper[5004]: I1201 08:40:53.807212 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1c910a71-8445-4c64-a555-433bb2c60bbd-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"1c910a71-8445-4c64-a555-433bb2c60bbd\") " pod="openstack/cinder-scheduler-0" Dec 01 08:40:53 crc kubenswrapper[5004]: I1201 08:40:53.807264 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1c910a71-8445-4c64-a555-433bb2c60bbd-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"1c910a71-8445-4c64-a555-433bb2c60bbd\") " pod="openstack/cinder-scheduler-0" Dec 01 08:40:53 crc kubenswrapper[5004]: I1201 08:40:53.807304 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c910a71-8445-4c64-a555-433bb2c60bbd-scripts\") pod \"cinder-scheduler-0\" (UID: \"1c910a71-8445-4c64-a555-433bb2c60bbd\") " pod="openstack/cinder-scheduler-0" Dec 01 08:40:53 crc kubenswrapper[5004]: I1201 08:40:53.807326 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dcdda8d0-d614-4c3e-b9a0-37f3e676c11c-dns-swift-storage-0\") pod \"dnsmasq-dns-8775748c9-7mdhw\" (UID: 
\"dcdda8d0-d614-4c3e-b9a0-37f3e676c11c\") " pod="openstack/dnsmasq-dns-8775748c9-7mdhw" Dec 01 08:40:53 crc kubenswrapper[5004]: I1201 08:40:53.807348 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dcdda8d0-d614-4c3e-b9a0-37f3e676c11c-config\") pod \"dnsmasq-dns-8775748c9-7mdhw\" (UID: \"dcdda8d0-d614-4c3e-b9a0-37f3e676c11c\") " pod="openstack/dnsmasq-dns-8775748c9-7mdhw" Dec 01 08:40:53 crc kubenswrapper[5004]: I1201 08:40:53.807371 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dcdda8d0-d614-4c3e-b9a0-37f3e676c11c-ovsdbserver-sb\") pod \"dnsmasq-dns-8775748c9-7mdhw\" (UID: \"dcdda8d0-d614-4c3e-b9a0-37f3e676c11c\") " pod="openstack/dnsmasq-dns-8775748c9-7mdhw" Dec 01 08:40:53 crc kubenswrapper[5004]: I1201 08:40:53.807404 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c910a71-8445-4c64-a555-433bb2c60bbd-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"1c910a71-8445-4c64-a555-433bb2c60bbd\") " pod="openstack/cinder-scheduler-0" Dec 01 08:40:53 crc kubenswrapper[5004]: I1201 08:40:53.807419 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1c910a71-8445-4c64-a555-433bb2c60bbd-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"1c910a71-8445-4c64-a555-433bb2c60bbd\") " pod="openstack/cinder-scheduler-0" Dec 01 08:40:53 crc kubenswrapper[5004]: I1201 08:40:53.807916 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fcvdg\" (UniqueName: \"kubernetes.io/projected/1c910a71-8445-4c64-a555-433bb2c60bbd-kube-api-access-fcvdg\") pod \"cinder-scheduler-0\" (UID: \"1c910a71-8445-4c64-a555-433bb2c60bbd\") " 
pod="openstack/cinder-scheduler-0" Dec 01 08:40:53 crc kubenswrapper[5004]: I1201 08:40:53.808013 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c910a71-8445-4c64-a555-433bb2c60bbd-config-data\") pod \"cinder-scheduler-0\" (UID: \"1c910a71-8445-4c64-a555-433bb2c60bbd\") " pod="openstack/cinder-scheduler-0" Dec 01 08:40:53 crc kubenswrapper[5004]: I1201 08:40:53.808037 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dcdda8d0-d614-4c3e-b9a0-37f3e676c11c-dns-svc\") pod \"dnsmasq-dns-8775748c9-7mdhw\" (UID: \"dcdda8d0-d614-4c3e-b9a0-37f3e676c11c\") " pod="openstack/dnsmasq-dns-8775748c9-7mdhw" Dec 01 08:40:53 crc kubenswrapper[5004]: I1201 08:40:53.808164 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qt6kq\" (UniqueName: \"kubernetes.io/projected/dcdda8d0-d614-4c3e-b9a0-37f3e676c11c-kube-api-access-qt6kq\") pod \"dnsmasq-dns-8775748c9-7mdhw\" (UID: \"dcdda8d0-d614-4c3e-b9a0-37f3e676c11c\") " pod="openstack/dnsmasq-dns-8775748c9-7mdhw" Dec 01 08:40:53 crc kubenswrapper[5004]: I1201 08:40:53.808277 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dcdda8d0-d614-4c3e-b9a0-37f3e676c11c-ovsdbserver-nb\") pod \"dnsmasq-dns-8775748c9-7mdhw\" (UID: \"dcdda8d0-d614-4c3e-b9a0-37f3e676c11c\") " pod="openstack/dnsmasq-dns-8775748c9-7mdhw" Dec 01 08:40:53 crc kubenswrapper[5004]: I1201 08:40:53.813149 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1c910a71-8445-4c64-a555-433bb2c60bbd-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"1c910a71-8445-4c64-a555-433bb2c60bbd\") " pod="openstack/cinder-scheduler-0" Dec 01 08:40:53 crc 
kubenswrapper[5004]: I1201 08:40:53.816418 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c910a71-8445-4c64-a555-433bb2c60bbd-config-data\") pod \"cinder-scheduler-0\" (UID: \"1c910a71-8445-4c64-a555-433bb2c60bbd\") " pod="openstack/cinder-scheduler-0" Dec 01 08:40:53 crc kubenswrapper[5004]: I1201 08:40:53.830088 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c910a71-8445-4c64-a555-433bb2c60bbd-scripts\") pod \"cinder-scheduler-0\" (UID: \"1c910a71-8445-4c64-a555-433bb2c60bbd\") " pod="openstack/cinder-scheduler-0" Dec 01 08:40:53 crc kubenswrapper[5004]: I1201 08:40:53.837047 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c910a71-8445-4c64-a555-433bb2c60bbd-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"1c910a71-8445-4c64-a555-433bb2c60bbd\") " pod="openstack/cinder-scheduler-0" Dec 01 08:40:53 crc kubenswrapper[5004]: I1201 08:40:53.850131 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fcvdg\" (UniqueName: \"kubernetes.io/projected/1c910a71-8445-4c64-a555-433bb2c60bbd-kube-api-access-fcvdg\") pod \"cinder-scheduler-0\" (UID: \"1c910a71-8445-4c64-a555-433bb2c60bbd\") " pod="openstack/cinder-scheduler-0" Dec 01 08:40:53 crc kubenswrapper[5004]: I1201 08:40:53.886041 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-58d7b4cb4c-z6xck"] Dec 01 08:40:53 crc kubenswrapper[5004]: I1201 08:40:53.887985 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-58d7b4cb4c-z6xck" Dec 01 08:40:53 crc kubenswrapper[5004]: I1201 08:40:53.893039 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Dec 01 08:40:53 crc kubenswrapper[5004]: I1201 08:40:53.893697 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 01 08:40:53 crc kubenswrapper[5004]: I1201 08:40:53.893796 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-lsbpg" Dec 01 08:40:53 crc kubenswrapper[5004]: I1201 08:40:53.906347 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 01 08:40:53 crc kubenswrapper[5004]: I1201 08:40:53.912135 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dcdda8d0-d614-4c3e-b9a0-37f3e676c11c-ovsdbserver-nb\") pod \"dnsmasq-dns-8775748c9-7mdhw\" (UID: \"dcdda8d0-d614-4c3e-b9a0-37f3e676c11c\") " pod="openstack/dnsmasq-dns-8775748c9-7mdhw" Dec 01 08:40:53 crc kubenswrapper[5004]: I1201 08:40:53.912283 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dcdda8d0-d614-4c3e-b9a0-37f3e676c11c-dns-swift-storage-0\") pod \"dnsmasq-dns-8775748c9-7mdhw\" (UID: \"dcdda8d0-d614-4c3e-b9a0-37f3e676c11c\") " pod="openstack/dnsmasq-dns-8775748c9-7mdhw" Dec 01 08:40:53 crc kubenswrapper[5004]: I1201 08:40:53.912319 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dcdda8d0-d614-4c3e-b9a0-37f3e676c11c-config\") pod \"dnsmasq-dns-8775748c9-7mdhw\" (UID: \"dcdda8d0-d614-4c3e-b9a0-37f3e676c11c\") " pod="openstack/dnsmasq-dns-8775748c9-7mdhw" Dec 01 08:40:53 crc kubenswrapper[5004]: I1201 08:40:53.912343 5004 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dcdda8d0-d614-4c3e-b9a0-37f3e676c11c-ovsdbserver-sb\") pod \"dnsmasq-dns-8775748c9-7mdhw\" (UID: \"dcdda8d0-d614-4c3e-b9a0-37f3e676c11c\") " pod="openstack/dnsmasq-dns-8775748c9-7mdhw" Dec 01 08:40:53 crc kubenswrapper[5004]: I1201 08:40:53.912434 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dcdda8d0-d614-4c3e-b9a0-37f3e676c11c-dns-svc\") pod \"dnsmasq-dns-8775748c9-7mdhw\" (UID: \"dcdda8d0-d614-4c3e-b9a0-37f3e676c11c\") " pod="openstack/dnsmasq-dns-8775748c9-7mdhw" Dec 01 08:40:53 crc kubenswrapper[5004]: I1201 08:40:53.912494 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qt6kq\" (UniqueName: \"kubernetes.io/projected/dcdda8d0-d614-4c3e-b9a0-37f3e676c11c-kube-api-access-qt6kq\") pod \"dnsmasq-dns-8775748c9-7mdhw\" (UID: \"dcdda8d0-d614-4c3e-b9a0-37f3e676c11c\") " pod="openstack/dnsmasq-dns-8775748c9-7mdhw" Dec 01 08:40:53 crc kubenswrapper[5004]: I1201 08:40:53.913953 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dcdda8d0-d614-4c3e-b9a0-37f3e676c11c-ovsdbserver-nb\") pod \"dnsmasq-dns-8775748c9-7mdhw\" (UID: \"dcdda8d0-d614-4c3e-b9a0-37f3e676c11c\") " pod="openstack/dnsmasq-dns-8775748c9-7mdhw" Dec 01 08:40:53 crc kubenswrapper[5004]: I1201 08:40:53.915227 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dcdda8d0-d614-4c3e-b9a0-37f3e676c11c-dns-swift-storage-0\") pod \"dnsmasq-dns-8775748c9-7mdhw\" (UID: \"dcdda8d0-d614-4c3e-b9a0-37f3e676c11c\") " pod="openstack/dnsmasq-dns-8775748c9-7mdhw" Dec 01 08:40:53 crc kubenswrapper[5004]: I1201 08:40:53.915796 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/dcdda8d0-d614-4c3e-b9a0-37f3e676c11c-config\") pod \"dnsmasq-dns-8775748c9-7mdhw\" (UID: \"dcdda8d0-d614-4c3e-b9a0-37f3e676c11c\") " pod="openstack/dnsmasq-dns-8775748c9-7mdhw" Dec 01 08:40:53 crc kubenswrapper[5004]: I1201 08:40:53.916309 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dcdda8d0-d614-4c3e-b9a0-37f3e676c11c-ovsdbserver-sb\") pod \"dnsmasq-dns-8775748c9-7mdhw\" (UID: \"dcdda8d0-d614-4c3e-b9a0-37f3e676c11c\") " pod="openstack/dnsmasq-dns-8775748c9-7mdhw" Dec 01 08:40:53 crc kubenswrapper[5004]: I1201 08:40:53.917086 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dcdda8d0-d614-4c3e-b9a0-37f3e676c11c-dns-svc\") pod \"dnsmasq-dns-8775748c9-7mdhw\" (UID: \"dcdda8d0-d614-4c3e-b9a0-37f3e676c11c\") " pod="openstack/dnsmasq-dns-8775748c9-7mdhw" Dec 01 08:40:53 crc kubenswrapper[5004]: I1201 08:40:53.937467 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-58bccc5494-trhd5"] Dec 01 08:40:53 crc kubenswrapper[5004]: I1201 08:40:53.972934 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qt6kq\" (UniqueName: \"kubernetes.io/projected/dcdda8d0-d614-4c3e-b9a0-37f3e676c11c-kube-api-access-qt6kq\") pod \"dnsmasq-dns-8775748c9-7mdhw\" (UID: \"dcdda8d0-d614-4c3e-b9a0-37f3e676c11c\") " pod="openstack/dnsmasq-dns-8775748c9-7mdhw" Dec 01 08:40:53 crc kubenswrapper[5004]: I1201 08:40:53.973253 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-58d7b4cb4c-z6xck"] Dec 01 08:40:53 crc kubenswrapper[5004]: I1201 08:40:53.973616 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-58bccc5494-trhd5" Dec 01 08:40:53 crc kubenswrapper[5004]: I1201 08:40:53.978299 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Dec 01 08:40:54 crc kubenswrapper[5004]: I1201 08:40:54.021738 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b463310a-8c0e-462e-a746-94a664a21ebe-logs\") pod \"barbican-worker-58d7b4cb4c-z6xck\" (UID: \"b463310a-8c0e-462e-a746-94a664a21ebe\") " pod="openstack/barbican-worker-58d7b4cb4c-z6xck" Dec 01 08:40:54 crc kubenswrapper[5004]: I1201 08:40:54.021872 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b463310a-8c0e-462e-a746-94a664a21ebe-config-data-custom\") pod \"barbican-worker-58d7b4cb4c-z6xck\" (UID: \"b463310a-8c0e-462e-a746-94a664a21ebe\") " pod="openstack/barbican-worker-58d7b4cb4c-z6xck" Dec 01 08:40:54 crc kubenswrapper[5004]: I1201 08:40:54.022118 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b463310a-8c0e-462e-a746-94a664a21ebe-combined-ca-bundle\") pod \"barbican-worker-58d7b4cb4c-z6xck\" (UID: \"b463310a-8c0e-462e-a746-94a664a21ebe\") " pod="openstack/barbican-worker-58d7b4cb4c-z6xck" Dec 01 08:40:54 crc kubenswrapper[5004]: I1201 08:40:54.022221 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b463310a-8c0e-462e-a746-94a664a21ebe-config-data\") pod \"barbican-worker-58d7b4cb4c-z6xck\" (UID: \"b463310a-8c0e-462e-a746-94a664a21ebe\") " pod="openstack/barbican-worker-58d7b4cb4c-z6xck" Dec 01 08:40:54 crc kubenswrapper[5004]: I1201 08:40:54.022260 5004 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9pfk\" (UniqueName: \"kubernetes.io/projected/b463310a-8c0e-462e-a746-94a664a21ebe-kube-api-access-x9pfk\") pod \"barbican-worker-58d7b4cb4c-z6xck\" (UID: \"b463310a-8c0e-462e-a746-94a664a21ebe\") " pod="openstack/barbican-worker-58d7b4cb4c-z6xck" Dec 01 08:40:54 crc kubenswrapper[5004]: I1201 08:40:54.037581 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8775748c9-7mdhw" Dec 01 08:40:54 crc kubenswrapper[5004]: I1201 08:40:54.102540 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-58bccc5494-trhd5"] Dec 01 08:40:54 crc kubenswrapper[5004]: I1201 08:40:54.137761 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c77ef8f3-6312-48ca-9e64-49e0db910168-config-data-custom\") pod \"barbican-keystone-listener-58bccc5494-trhd5\" (UID: \"c77ef8f3-6312-48ca-9e64-49e0db910168\") " pod="openstack/barbican-keystone-listener-58bccc5494-trhd5" Dec 01 08:40:54 crc kubenswrapper[5004]: I1201 08:40:54.137807 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c77ef8f3-6312-48ca-9e64-49e0db910168-config-data\") pod \"barbican-keystone-listener-58bccc5494-trhd5\" (UID: \"c77ef8f3-6312-48ca-9e64-49e0db910168\") " pod="openstack/barbican-keystone-listener-58bccc5494-trhd5" Dec 01 08:40:54 crc kubenswrapper[5004]: I1201 08:40:54.137852 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b463310a-8c0e-462e-a746-94a664a21ebe-logs\") pod \"barbican-worker-58d7b4cb4c-z6xck\" (UID: \"b463310a-8c0e-462e-a746-94a664a21ebe\") " pod="openstack/barbican-worker-58d7b4cb4c-z6xck" Dec 01 08:40:54 crc 
kubenswrapper[5004]: I1201 08:40:54.137916 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c77ef8f3-6312-48ca-9e64-49e0db910168-logs\") pod \"barbican-keystone-listener-58bccc5494-trhd5\" (UID: \"c77ef8f3-6312-48ca-9e64-49e0db910168\") " pod="openstack/barbican-keystone-listener-58bccc5494-trhd5" Dec 01 08:40:54 crc kubenswrapper[5004]: I1201 08:40:54.137932 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b463310a-8c0e-462e-a746-94a664a21ebe-config-data-custom\") pod \"barbican-worker-58d7b4cb4c-z6xck\" (UID: \"b463310a-8c0e-462e-a746-94a664a21ebe\") " pod="openstack/barbican-worker-58d7b4cb4c-z6xck" Dec 01 08:40:54 crc kubenswrapper[5004]: I1201 08:40:54.137987 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b463310a-8c0e-462e-a746-94a664a21ebe-combined-ca-bundle\") pod \"barbican-worker-58d7b4cb4c-z6xck\" (UID: \"b463310a-8c0e-462e-a746-94a664a21ebe\") " pod="openstack/barbican-worker-58d7b4cb4c-z6xck" Dec 01 08:40:54 crc kubenswrapper[5004]: I1201 08:40:54.138046 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b463310a-8c0e-462e-a746-94a664a21ebe-config-data\") pod \"barbican-worker-58d7b4cb4c-z6xck\" (UID: \"b463310a-8c0e-462e-a746-94a664a21ebe\") " pod="openstack/barbican-worker-58d7b4cb4c-z6xck" Dec 01 08:40:54 crc kubenswrapper[5004]: I1201 08:40:54.138060 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9pfk\" (UniqueName: \"kubernetes.io/projected/b463310a-8c0e-462e-a746-94a664a21ebe-kube-api-access-x9pfk\") pod \"barbican-worker-58d7b4cb4c-z6xck\" (UID: \"b463310a-8c0e-462e-a746-94a664a21ebe\") " 
pod="openstack/barbican-worker-58d7b4cb4c-z6xck" Dec 01 08:40:54 crc kubenswrapper[5004]: I1201 08:40:54.138085 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djvc9\" (UniqueName: \"kubernetes.io/projected/c77ef8f3-6312-48ca-9e64-49e0db910168-kube-api-access-djvc9\") pod \"barbican-keystone-listener-58bccc5494-trhd5\" (UID: \"c77ef8f3-6312-48ca-9e64-49e0db910168\") " pod="openstack/barbican-keystone-listener-58bccc5494-trhd5" Dec 01 08:40:54 crc kubenswrapper[5004]: I1201 08:40:54.138117 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c77ef8f3-6312-48ca-9e64-49e0db910168-combined-ca-bundle\") pod \"barbican-keystone-listener-58bccc5494-trhd5\" (UID: \"c77ef8f3-6312-48ca-9e64-49e0db910168\") " pod="openstack/barbican-keystone-listener-58bccc5494-trhd5" Dec 01 08:40:54 crc kubenswrapper[5004]: I1201 08:40:54.138486 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b463310a-8c0e-462e-a746-94a664a21ebe-logs\") pod \"barbican-worker-58d7b4cb4c-z6xck\" (UID: \"b463310a-8c0e-462e-a746-94a664a21ebe\") " pod="openstack/barbican-worker-58d7b4cb4c-z6xck" Dec 01 08:40:54 crc kubenswrapper[5004]: I1201 08:40:54.150693 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b463310a-8c0e-462e-a746-94a664a21ebe-config-data\") pod \"barbican-worker-58d7b4cb4c-z6xck\" (UID: \"b463310a-8c0e-462e-a746-94a664a21ebe\") " pod="openstack/barbican-worker-58d7b4cb4c-z6xck" Dec 01 08:40:54 crc kubenswrapper[5004]: I1201 08:40:54.153469 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b463310a-8c0e-462e-a746-94a664a21ebe-combined-ca-bundle\") pod \"barbican-worker-58d7b4cb4c-z6xck\" (UID: 
\"b463310a-8c0e-462e-a746-94a664a21ebe\") " pod="openstack/barbican-worker-58d7b4cb4c-z6xck" Dec 01 08:40:54 crc kubenswrapper[5004]: I1201 08:40:54.160011 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b463310a-8c0e-462e-a746-94a664a21ebe-config-data-custom\") pod \"barbican-worker-58d7b4cb4c-z6xck\" (UID: \"b463310a-8c0e-462e-a746-94a664a21ebe\") " pod="openstack/barbican-worker-58d7b4cb4c-z6xck" Dec 01 08:40:54 crc kubenswrapper[5004]: I1201 08:40:54.175148 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9pfk\" (UniqueName: \"kubernetes.io/projected/b463310a-8c0e-462e-a746-94a664a21ebe-kube-api-access-x9pfk\") pod \"barbican-worker-58d7b4cb4c-z6xck\" (UID: \"b463310a-8c0e-462e-a746-94a664a21ebe\") " pod="openstack/barbican-worker-58d7b4cb4c-z6xck" Dec 01 08:40:54 crc kubenswrapper[5004]: I1201 08:40:54.188520 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 01 08:40:54 crc kubenswrapper[5004]: I1201 08:40:54.190484 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 01 08:40:54 crc kubenswrapper[5004]: I1201 08:40:54.193309 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 01 08:40:54 crc kubenswrapper[5004]: I1201 08:40:54.241221 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 01 08:40:54 crc kubenswrapper[5004]: I1201 08:40:54.242752 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16fb71da-95be-43fa-a52b-fb3315938e09-scripts\") pod \"cinder-api-0\" (UID: \"16fb71da-95be-43fa-a52b-fb3315938e09\") " pod="openstack/cinder-api-0" Dec 01 08:40:54 crc kubenswrapper[5004]: I1201 08:40:54.242793 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vvn6\" (UniqueName: \"kubernetes.io/projected/16fb71da-95be-43fa-a52b-fb3315938e09-kube-api-access-6vvn6\") pod \"cinder-api-0\" (UID: \"16fb71da-95be-43fa-a52b-fb3315938e09\") " pod="openstack/cinder-api-0" Dec 01 08:40:54 crc kubenswrapper[5004]: I1201 08:40:54.242824 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c77ef8f3-6312-48ca-9e64-49e0db910168-logs\") pod \"barbican-keystone-listener-58bccc5494-trhd5\" (UID: \"c77ef8f3-6312-48ca-9e64-49e0db910168\") " pod="openstack/barbican-keystone-listener-58bccc5494-trhd5" Dec 01 08:40:54 crc kubenswrapper[5004]: I1201 08:40:54.242851 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/16fb71da-95be-43fa-a52b-fb3315938e09-etc-machine-id\") pod \"cinder-api-0\" (UID: \"16fb71da-95be-43fa-a52b-fb3315938e09\") " pod="openstack/cinder-api-0" Dec 01 08:40:54 crc kubenswrapper[5004]: I1201 08:40:54.242920 5004 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16fb71da-95be-43fa-a52b-fb3315938e09-config-data\") pod \"cinder-api-0\" (UID: \"16fb71da-95be-43fa-a52b-fb3315938e09\") " pod="openstack/cinder-api-0" Dec 01 08:40:54 crc kubenswrapper[5004]: I1201 08:40:54.242960 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djvc9\" (UniqueName: \"kubernetes.io/projected/c77ef8f3-6312-48ca-9e64-49e0db910168-kube-api-access-djvc9\") pod \"barbican-keystone-listener-58bccc5494-trhd5\" (UID: \"c77ef8f3-6312-48ca-9e64-49e0db910168\") " pod="openstack/barbican-keystone-listener-58bccc5494-trhd5" Dec 01 08:40:54 crc kubenswrapper[5004]: I1201 08:40:54.242992 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c77ef8f3-6312-48ca-9e64-49e0db910168-combined-ca-bundle\") pod \"barbican-keystone-listener-58bccc5494-trhd5\" (UID: \"c77ef8f3-6312-48ca-9e64-49e0db910168\") " pod="openstack/barbican-keystone-listener-58bccc5494-trhd5" Dec 01 08:40:54 crc kubenswrapper[5004]: I1201 08:40:54.243010 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16fb71da-95be-43fa-a52b-fb3315938e09-logs\") pod \"cinder-api-0\" (UID: \"16fb71da-95be-43fa-a52b-fb3315938e09\") " pod="openstack/cinder-api-0" Dec 01 08:40:54 crc kubenswrapper[5004]: I1201 08:40:54.243028 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16fb71da-95be-43fa-a52b-fb3315938e09-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"16fb71da-95be-43fa-a52b-fb3315938e09\") " pod="openstack/cinder-api-0" Dec 01 08:40:54 crc kubenswrapper[5004]: I1201 08:40:54.243051 5004 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c77ef8f3-6312-48ca-9e64-49e0db910168-config-data-custom\") pod \"barbican-keystone-listener-58bccc5494-trhd5\" (UID: \"c77ef8f3-6312-48ca-9e64-49e0db910168\") " pod="openstack/barbican-keystone-listener-58bccc5494-trhd5" Dec 01 08:40:54 crc kubenswrapper[5004]: I1201 08:40:54.243085 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c77ef8f3-6312-48ca-9e64-49e0db910168-config-data\") pod \"barbican-keystone-listener-58bccc5494-trhd5\" (UID: \"c77ef8f3-6312-48ca-9e64-49e0db910168\") " pod="openstack/barbican-keystone-listener-58bccc5494-trhd5" Dec 01 08:40:54 crc kubenswrapper[5004]: I1201 08:40:54.243164 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/16fb71da-95be-43fa-a52b-fb3315938e09-config-data-custom\") pod \"cinder-api-0\" (UID: \"16fb71da-95be-43fa-a52b-fb3315938e09\") " pod="openstack/cinder-api-0" Dec 01 08:40:54 crc kubenswrapper[5004]: I1201 08:40:54.243739 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c77ef8f3-6312-48ca-9e64-49e0db910168-logs\") pod \"barbican-keystone-listener-58bccc5494-trhd5\" (UID: \"c77ef8f3-6312-48ca-9e64-49e0db910168\") " pod="openstack/barbican-keystone-listener-58bccc5494-trhd5" Dec 01 08:40:54 crc kubenswrapper[5004]: I1201 08:40:54.252252 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8775748c9-7mdhw"] Dec 01 08:40:54 crc kubenswrapper[5004]: I1201 08:40:54.253149 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c77ef8f3-6312-48ca-9e64-49e0db910168-combined-ca-bundle\") pod \"barbican-keystone-listener-58bccc5494-trhd5\" (UID: \"c77ef8f3-6312-48ca-9e64-49e0db910168\") " 
pod="openstack/barbican-keystone-listener-58bccc5494-trhd5" Dec 01 08:40:54 crc kubenswrapper[5004]: I1201 08:40:54.254077 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c77ef8f3-6312-48ca-9e64-49e0db910168-config-data-custom\") pod \"barbican-keystone-listener-58bccc5494-trhd5\" (UID: \"c77ef8f3-6312-48ca-9e64-49e0db910168\") " pod="openstack/barbican-keystone-listener-58bccc5494-trhd5" Dec 01 08:40:54 crc kubenswrapper[5004]: I1201 08:40:54.254601 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c77ef8f3-6312-48ca-9e64-49e0db910168-config-data\") pod \"barbican-keystone-listener-58bccc5494-trhd5\" (UID: \"c77ef8f3-6312-48ca-9e64-49e0db910168\") " pod="openstack/barbican-keystone-listener-58bccc5494-trhd5" Dec 01 08:40:54 crc kubenswrapper[5004]: I1201 08:40:54.262532 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djvc9\" (UniqueName: \"kubernetes.io/projected/c77ef8f3-6312-48ca-9e64-49e0db910168-kube-api-access-djvc9\") pod \"barbican-keystone-listener-58bccc5494-trhd5\" (UID: \"c77ef8f3-6312-48ca-9e64-49e0db910168\") " pod="openstack/barbican-keystone-listener-58bccc5494-trhd5" Dec 01 08:40:54 crc kubenswrapper[5004]: I1201 08:40:54.277032 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-zqhkd"] Dec 01 08:40:54 crc kubenswrapper[5004]: I1201 08:40:54.279093 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bb4fc677f-zqhkd" Dec 01 08:40:54 crc kubenswrapper[5004]: I1201 08:40:54.291943 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-zqhkd"] Dec 01 08:40:54 crc kubenswrapper[5004]: I1201 08:40:54.304135 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-58d7b4cb4c-z6xck" Dec 01 08:40:54 crc kubenswrapper[5004]: I1201 08:40:54.330747 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-65dd66d59d-csdqc"] Dec 01 08:40:54 crc kubenswrapper[5004]: I1201 08:40:54.332802 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-65dd66d59d-csdqc" Dec 01 08:40:54 crc kubenswrapper[5004]: I1201 08:40:54.335872 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Dec 01 08:40:54 crc kubenswrapper[5004]: I1201 08:40:54.347188 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vvn6\" (UniqueName: \"kubernetes.io/projected/16fb71da-95be-43fa-a52b-fb3315938e09-kube-api-access-6vvn6\") pod \"cinder-api-0\" (UID: \"16fb71da-95be-43fa-a52b-fb3315938e09\") " pod="openstack/cinder-api-0" Dec 01 08:40:54 crc kubenswrapper[5004]: I1201 08:40:54.347255 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/16fb71da-95be-43fa-a52b-fb3315938e09-etc-machine-id\") pod \"cinder-api-0\" (UID: \"16fb71da-95be-43fa-a52b-fb3315938e09\") " pod="openstack/cinder-api-0" Dec 01 08:40:54 crc kubenswrapper[5004]: I1201 08:40:54.347288 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e56ced1f-e623-4c1a-8da6-944c91827cac-ovsdbserver-nb\") pod \"dnsmasq-dns-6bb4fc677f-zqhkd\" (UID: \"e56ced1f-e623-4c1a-8da6-944c91827cac\") " pod="openstack/dnsmasq-dns-6bb4fc677f-zqhkd" Dec 01 08:40:54 crc kubenswrapper[5004]: I1201 08:40:54.347439 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e56ced1f-e623-4c1a-8da6-944c91827cac-config\") pod 
\"dnsmasq-dns-6bb4fc677f-zqhkd\" (UID: \"e56ced1f-e623-4c1a-8da6-944c91827cac\") " pod="openstack/dnsmasq-dns-6bb4fc677f-zqhkd" Dec 01 08:40:54 crc kubenswrapper[5004]: I1201 08:40:54.347469 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e56ced1f-e623-4c1a-8da6-944c91827cac-dns-svc\") pod \"dnsmasq-dns-6bb4fc677f-zqhkd\" (UID: \"e56ced1f-e623-4c1a-8da6-944c91827cac\") " pod="openstack/dnsmasq-dns-6bb4fc677f-zqhkd" Dec 01 08:40:54 crc kubenswrapper[5004]: I1201 08:40:54.347537 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16fb71da-95be-43fa-a52b-fb3315938e09-config-data\") pod \"cinder-api-0\" (UID: \"16fb71da-95be-43fa-a52b-fb3315938e09\") " pod="openstack/cinder-api-0" Dec 01 08:40:54 crc kubenswrapper[5004]: I1201 08:40:54.347662 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e56ced1f-e623-4c1a-8da6-944c91827cac-ovsdbserver-sb\") pod \"dnsmasq-dns-6bb4fc677f-zqhkd\" (UID: \"e56ced1f-e623-4c1a-8da6-944c91827cac\") " pod="openstack/dnsmasq-dns-6bb4fc677f-zqhkd" Dec 01 08:40:54 crc kubenswrapper[5004]: I1201 08:40:54.347694 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16fb71da-95be-43fa-a52b-fb3315938e09-logs\") pod \"cinder-api-0\" (UID: \"16fb71da-95be-43fa-a52b-fb3315938e09\") " pod="openstack/cinder-api-0" Dec 01 08:40:54 crc kubenswrapper[5004]: I1201 08:40:54.347722 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16fb71da-95be-43fa-a52b-fb3315938e09-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"16fb71da-95be-43fa-a52b-fb3315938e09\") " pod="openstack/cinder-api-0" Dec 01 08:40:54 crc 
kubenswrapper[5004]: I1201 08:40:54.347743 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e56ced1f-e623-4c1a-8da6-944c91827cac-dns-swift-storage-0\") pod \"dnsmasq-dns-6bb4fc677f-zqhkd\" (UID: \"e56ced1f-e623-4c1a-8da6-944c91827cac\") " pod="openstack/dnsmasq-dns-6bb4fc677f-zqhkd" Dec 01 08:40:54 crc kubenswrapper[5004]: I1201 08:40:54.347780 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fl5sp\" (UniqueName: \"kubernetes.io/projected/e56ced1f-e623-4c1a-8da6-944c91827cac-kube-api-access-fl5sp\") pod \"dnsmasq-dns-6bb4fc677f-zqhkd\" (UID: \"e56ced1f-e623-4c1a-8da6-944c91827cac\") " pod="openstack/dnsmasq-dns-6bb4fc677f-zqhkd" Dec 01 08:40:54 crc kubenswrapper[5004]: I1201 08:40:54.347806 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/16fb71da-95be-43fa-a52b-fb3315938e09-config-data-custom\") pod \"cinder-api-0\" (UID: \"16fb71da-95be-43fa-a52b-fb3315938e09\") " pod="openstack/cinder-api-0" Dec 01 08:40:54 crc kubenswrapper[5004]: I1201 08:40:54.347858 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16fb71da-95be-43fa-a52b-fb3315938e09-scripts\") pod \"cinder-api-0\" (UID: \"16fb71da-95be-43fa-a52b-fb3315938e09\") " pod="openstack/cinder-api-0" Dec 01 08:40:54 crc kubenswrapper[5004]: I1201 08:40:54.350451 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-65dd66d59d-csdqc"] Dec 01 08:40:54 crc kubenswrapper[5004]: I1201 08:40:54.350801 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16fb71da-95be-43fa-a52b-fb3315938e09-logs\") pod \"cinder-api-0\" (UID: \"16fb71da-95be-43fa-a52b-fb3315938e09\") " 
pod="openstack/cinder-api-0" Dec 01 08:40:54 crc kubenswrapper[5004]: I1201 08:40:54.351059 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/16fb71da-95be-43fa-a52b-fb3315938e09-etc-machine-id\") pod \"cinder-api-0\" (UID: \"16fb71da-95be-43fa-a52b-fb3315938e09\") " pod="openstack/cinder-api-0" Dec 01 08:40:54 crc kubenswrapper[5004]: I1201 08:40:54.355269 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16fb71da-95be-43fa-a52b-fb3315938e09-scripts\") pod \"cinder-api-0\" (UID: \"16fb71da-95be-43fa-a52b-fb3315938e09\") " pod="openstack/cinder-api-0" Dec 01 08:40:54 crc kubenswrapper[5004]: I1201 08:40:54.360380 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16fb71da-95be-43fa-a52b-fb3315938e09-config-data\") pod \"cinder-api-0\" (UID: \"16fb71da-95be-43fa-a52b-fb3315938e09\") " pod="openstack/cinder-api-0" Dec 01 08:40:54 crc kubenswrapper[5004]: I1201 08:40:54.363802 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/16fb71da-95be-43fa-a52b-fb3315938e09-config-data-custom\") pod \"cinder-api-0\" (UID: \"16fb71da-95be-43fa-a52b-fb3315938e09\") " pod="openstack/cinder-api-0" Dec 01 08:40:54 crc kubenswrapper[5004]: I1201 08:40:54.373208 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16fb71da-95be-43fa-a52b-fb3315938e09-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"16fb71da-95be-43fa-a52b-fb3315938e09\") " pod="openstack/cinder-api-0" Dec 01 08:40:54 crc kubenswrapper[5004]: I1201 08:40:54.373738 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-58bccc5494-trhd5" Dec 01 08:40:54 crc kubenswrapper[5004]: I1201 08:40:54.385258 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vvn6\" (UniqueName: \"kubernetes.io/projected/16fb71da-95be-43fa-a52b-fb3315938e09-kube-api-access-6vvn6\") pod \"cinder-api-0\" (UID: \"16fb71da-95be-43fa-a52b-fb3315938e09\") " pod="openstack/cinder-api-0" Dec 01 08:40:54 crc kubenswrapper[5004]: I1201 08:40:54.438234 5004 generic.go:334] "Generic (PLEG): container finished" podID="1fd8d826-4dab-4d07-bc04-be5dfebdaf2b" containerID="65429ecf681ad84a20e124bf7210bd69eac351831158bdd2b9f27340b34a19da" exitCode=0 Dec 01 08:40:54 crc kubenswrapper[5004]: I1201 08:40:54.438268 5004 generic.go:334] "Generic (PLEG): container finished" podID="1fd8d826-4dab-4d07-bc04-be5dfebdaf2b" containerID="b30bf82a68f48468230213f790b700cf0f102fb193557f1d0f19a033f9588f8f" exitCode=2 Dec 01 08:40:54 crc kubenswrapper[5004]: I1201 08:40:54.438440 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1fd8d826-4dab-4d07-bc04-be5dfebdaf2b","Type":"ContainerDied","Data":"65429ecf681ad84a20e124bf7210bd69eac351831158bdd2b9f27340b34a19da"} Dec 01 08:40:54 crc kubenswrapper[5004]: I1201 08:40:54.438488 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1fd8d826-4dab-4d07-bc04-be5dfebdaf2b","Type":"ContainerDied","Data":"b30bf82a68f48468230213f790b700cf0f102fb193557f1d0f19a033f9588f8f"} Dec 01 08:40:54 crc kubenswrapper[5004]: I1201 08:40:54.453019 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e56ced1f-e623-4c1a-8da6-944c91827cac-ovsdbserver-sb\") pod \"dnsmasq-dns-6bb4fc677f-zqhkd\" (UID: \"e56ced1f-e623-4c1a-8da6-944c91827cac\") " pod="openstack/dnsmasq-dns-6bb4fc677f-zqhkd" Dec 01 08:40:54 crc kubenswrapper[5004]: I1201 
08:40:54.453128 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e56ced1f-e623-4c1a-8da6-944c91827cac-dns-swift-storage-0\") pod \"dnsmasq-dns-6bb4fc677f-zqhkd\" (UID: \"e56ced1f-e623-4c1a-8da6-944c91827cac\") " pod="openstack/dnsmasq-dns-6bb4fc677f-zqhkd" Dec 01 08:40:54 crc kubenswrapper[5004]: I1201 08:40:54.453197 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fl5sp\" (UniqueName: \"kubernetes.io/projected/e56ced1f-e623-4c1a-8da6-944c91827cac-kube-api-access-fl5sp\") pod \"dnsmasq-dns-6bb4fc677f-zqhkd\" (UID: \"e56ced1f-e623-4c1a-8da6-944c91827cac\") " pod="openstack/dnsmasq-dns-6bb4fc677f-zqhkd" Dec 01 08:40:54 crc kubenswrapper[5004]: I1201 08:40:54.453318 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2d7ab903-f57f-450e-8af4-95ee4c219310-logs\") pod \"barbican-api-65dd66d59d-csdqc\" (UID: \"2d7ab903-f57f-450e-8af4-95ee4c219310\") " pod="openstack/barbican-api-65dd66d59d-csdqc" Dec 01 08:40:54 crc kubenswrapper[5004]: I1201 08:40:54.453432 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xvw4\" (UniqueName: \"kubernetes.io/projected/2d7ab903-f57f-450e-8af4-95ee4c219310-kube-api-access-7xvw4\") pod \"barbican-api-65dd66d59d-csdqc\" (UID: \"2d7ab903-f57f-450e-8af4-95ee4c219310\") " pod="openstack/barbican-api-65dd66d59d-csdqc" Dec 01 08:40:54 crc kubenswrapper[5004]: I1201 08:40:54.453482 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e56ced1f-e623-4c1a-8da6-944c91827cac-ovsdbserver-nb\") pod \"dnsmasq-dns-6bb4fc677f-zqhkd\" (UID: \"e56ced1f-e623-4c1a-8da6-944c91827cac\") " pod="openstack/dnsmasq-dns-6bb4fc677f-zqhkd" Dec 01 08:40:54 crc kubenswrapper[5004]: 
I1201 08:40:54.453545 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e56ced1f-e623-4c1a-8da6-944c91827cac-config\") pod \"dnsmasq-dns-6bb4fc677f-zqhkd\" (UID: \"e56ced1f-e623-4c1a-8da6-944c91827cac\") " pod="openstack/dnsmasq-dns-6bb4fc677f-zqhkd" Dec 01 08:40:54 crc kubenswrapper[5004]: I1201 08:40:54.453617 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e56ced1f-e623-4c1a-8da6-944c91827cac-dns-svc\") pod \"dnsmasq-dns-6bb4fc677f-zqhkd\" (UID: \"e56ced1f-e623-4c1a-8da6-944c91827cac\") " pod="openstack/dnsmasq-dns-6bb4fc677f-zqhkd" Dec 01 08:40:54 crc kubenswrapper[5004]: I1201 08:40:54.453733 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2d7ab903-f57f-450e-8af4-95ee4c219310-config-data-custom\") pod \"barbican-api-65dd66d59d-csdqc\" (UID: \"2d7ab903-f57f-450e-8af4-95ee4c219310\") " pod="openstack/barbican-api-65dd66d59d-csdqc" Dec 01 08:40:54 crc kubenswrapper[5004]: I1201 08:40:54.454597 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d7ab903-f57f-450e-8af4-95ee4c219310-combined-ca-bundle\") pod \"barbican-api-65dd66d59d-csdqc\" (UID: \"2d7ab903-f57f-450e-8af4-95ee4c219310\") " pod="openstack/barbican-api-65dd66d59d-csdqc" Dec 01 08:40:54 crc kubenswrapper[5004]: I1201 08:40:54.454626 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d7ab903-f57f-450e-8af4-95ee4c219310-config-data\") pod \"barbican-api-65dd66d59d-csdqc\" (UID: \"2d7ab903-f57f-450e-8af4-95ee4c219310\") " pod="openstack/barbican-api-65dd66d59d-csdqc" Dec 01 08:40:54 crc kubenswrapper[5004]: I1201 
08:40:54.454710 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e56ced1f-e623-4c1a-8da6-944c91827cac-ovsdbserver-sb\") pod \"dnsmasq-dns-6bb4fc677f-zqhkd\" (UID: \"e56ced1f-e623-4c1a-8da6-944c91827cac\") " pod="openstack/dnsmasq-dns-6bb4fc677f-zqhkd" Dec 01 08:40:54 crc kubenswrapper[5004]: I1201 08:40:54.455698 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e56ced1f-e623-4c1a-8da6-944c91827cac-dns-swift-storage-0\") pod \"dnsmasq-dns-6bb4fc677f-zqhkd\" (UID: \"e56ced1f-e623-4c1a-8da6-944c91827cac\") " pod="openstack/dnsmasq-dns-6bb4fc677f-zqhkd" Dec 01 08:40:54 crc kubenswrapper[5004]: I1201 08:40:54.456097 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e56ced1f-e623-4c1a-8da6-944c91827cac-ovsdbserver-nb\") pod \"dnsmasq-dns-6bb4fc677f-zqhkd\" (UID: \"e56ced1f-e623-4c1a-8da6-944c91827cac\") " pod="openstack/dnsmasq-dns-6bb4fc677f-zqhkd" Dec 01 08:40:54 crc kubenswrapper[5004]: I1201 08:40:54.456462 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e56ced1f-e623-4c1a-8da6-944c91827cac-dns-svc\") pod \"dnsmasq-dns-6bb4fc677f-zqhkd\" (UID: \"e56ced1f-e623-4c1a-8da6-944c91827cac\") " pod="openstack/dnsmasq-dns-6bb4fc677f-zqhkd" Dec 01 08:40:54 crc kubenswrapper[5004]: I1201 08:40:54.456913 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e56ced1f-e623-4c1a-8da6-944c91827cac-config\") pod \"dnsmasq-dns-6bb4fc677f-zqhkd\" (UID: \"e56ced1f-e623-4c1a-8da6-944c91827cac\") " pod="openstack/dnsmasq-dns-6bb4fc677f-zqhkd" Dec 01 08:40:54 crc kubenswrapper[5004]: I1201 08:40:54.483990 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fl5sp\" 
(UniqueName: \"kubernetes.io/projected/e56ced1f-e623-4c1a-8da6-944c91827cac-kube-api-access-fl5sp\") pod \"dnsmasq-dns-6bb4fc677f-zqhkd\" (UID: \"e56ced1f-e623-4c1a-8da6-944c91827cac\") " pod="openstack/dnsmasq-dns-6bb4fc677f-zqhkd" Dec 01 08:40:54 crc kubenswrapper[5004]: I1201 08:40:54.519171 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 01 08:40:54 crc kubenswrapper[5004]: I1201 08:40:54.557051 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2d7ab903-f57f-450e-8af4-95ee4c219310-config-data-custom\") pod \"barbican-api-65dd66d59d-csdqc\" (UID: \"2d7ab903-f57f-450e-8af4-95ee4c219310\") " pod="openstack/barbican-api-65dd66d59d-csdqc" Dec 01 08:40:54 crc kubenswrapper[5004]: I1201 08:40:54.557348 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d7ab903-f57f-450e-8af4-95ee4c219310-combined-ca-bundle\") pod \"barbican-api-65dd66d59d-csdqc\" (UID: \"2d7ab903-f57f-450e-8af4-95ee4c219310\") " pod="openstack/barbican-api-65dd66d59d-csdqc" Dec 01 08:40:54 crc kubenswrapper[5004]: I1201 08:40:54.557379 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d7ab903-f57f-450e-8af4-95ee4c219310-config-data\") pod \"barbican-api-65dd66d59d-csdqc\" (UID: \"2d7ab903-f57f-450e-8af4-95ee4c219310\") " pod="openstack/barbican-api-65dd66d59d-csdqc" Dec 01 08:40:54 crc kubenswrapper[5004]: I1201 08:40:54.557475 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2d7ab903-f57f-450e-8af4-95ee4c219310-logs\") pod \"barbican-api-65dd66d59d-csdqc\" (UID: \"2d7ab903-f57f-450e-8af4-95ee4c219310\") " pod="openstack/barbican-api-65dd66d59d-csdqc" Dec 01 08:40:54 crc kubenswrapper[5004]: I1201 
08:40:54.557524 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xvw4\" (UniqueName: \"kubernetes.io/projected/2d7ab903-f57f-450e-8af4-95ee4c219310-kube-api-access-7xvw4\") pod \"barbican-api-65dd66d59d-csdqc\" (UID: \"2d7ab903-f57f-450e-8af4-95ee4c219310\") " pod="openstack/barbican-api-65dd66d59d-csdqc" Dec 01 08:40:54 crc kubenswrapper[5004]: I1201 08:40:54.560206 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2d7ab903-f57f-450e-8af4-95ee4c219310-logs\") pod \"barbican-api-65dd66d59d-csdqc\" (UID: \"2d7ab903-f57f-450e-8af4-95ee4c219310\") " pod="openstack/barbican-api-65dd66d59d-csdqc" Dec 01 08:40:54 crc kubenswrapper[5004]: I1201 08:40:54.563619 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2d7ab903-f57f-450e-8af4-95ee4c219310-config-data-custom\") pod \"barbican-api-65dd66d59d-csdqc\" (UID: \"2d7ab903-f57f-450e-8af4-95ee4c219310\") " pod="openstack/barbican-api-65dd66d59d-csdqc" Dec 01 08:40:54 crc kubenswrapper[5004]: I1201 08:40:54.572281 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d7ab903-f57f-450e-8af4-95ee4c219310-config-data\") pod \"barbican-api-65dd66d59d-csdqc\" (UID: \"2d7ab903-f57f-450e-8af4-95ee4c219310\") " pod="openstack/barbican-api-65dd66d59d-csdqc" Dec 01 08:40:54 crc kubenswrapper[5004]: I1201 08:40:54.577485 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d7ab903-f57f-450e-8af4-95ee4c219310-combined-ca-bundle\") pod \"barbican-api-65dd66d59d-csdqc\" (UID: \"2d7ab903-f57f-450e-8af4-95ee4c219310\") " pod="openstack/barbican-api-65dd66d59d-csdqc" Dec 01 08:40:54 crc kubenswrapper[5004]: I1201 08:40:54.585602 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-7xvw4\" (UniqueName: \"kubernetes.io/projected/2d7ab903-f57f-450e-8af4-95ee4c219310-kube-api-access-7xvw4\") pod \"barbican-api-65dd66d59d-csdqc\" (UID: \"2d7ab903-f57f-450e-8af4-95ee4c219310\") " pod="openstack/barbican-api-65dd66d59d-csdqc" Dec 01 08:40:54 crc kubenswrapper[5004]: I1201 08:40:54.622668 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bb4fc677f-zqhkd" Dec 01 08:40:54 crc kubenswrapper[5004]: I1201 08:40:54.650117 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 01 08:40:54 crc kubenswrapper[5004]: I1201 08:40:54.651803 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-65dd66d59d-csdqc" Dec 01 08:40:54 crc kubenswrapper[5004]: I1201 08:40:54.849359 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8775748c9-7mdhw"] Dec 01 08:40:54 crc kubenswrapper[5004]: W1201 08:40:54.849885 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddcdda8d0_d614_4c3e_b9a0_37f3e676c11c.slice/crio-7c7526d294abd41367a3ca14bd5469a718c451b7bee17d7d7adf8c51ba75ab81 WatchSource:0}: Error finding container 7c7526d294abd41367a3ca14bd5469a718c451b7bee17d7d7adf8c51ba75ab81: Status 404 returned error can't find the container with id 7c7526d294abd41367a3ca14bd5469a718c451b7bee17d7d7adf8c51ba75ab81 Dec 01 08:40:54 crc kubenswrapper[5004]: I1201 08:40:54.952026 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-58bccc5494-trhd5"] Dec 01 08:40:54 crc kubenswrapper[5004]: W1201 08:40:54.954167 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc77ef8f3_6312_48ca_9e64_49e0db910168.slice/crio-fc97d789f4ccda641b039fee937e2b2fe9cd539de6bffcf92779cf9d3e46b96c 
WatchSource:0}: Error finding container fc97d789f4ccda641b039fee937e2b2fe9cd539de6bffcf92779cf9d3e46b96c: Status 404 returned error can't find the container with id fc97d789f4ccda641b039fee937e2b2fe9cd539de6bffcf92779cf9d3e46b96c Dec 01 08:40:55 crc kubenswrapper[5004]: I1201 08:40:55.113825 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-58d7b4cb4c-z6xck"] Dec 01 08:40:55 crc kubenswrapper[5004]: I1201 08:40:55.199374 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 01 08:40:55 crc kubenswrapper[5004]: W1201 08:40:55.199823 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod16fb71da_95be_43fa_a52b_fb3315938e09.slice/crio-be81985f4081e6bfcc71dc041b52b7f59783ab29b1cebe607d92b0d69e1e679b WatchSource:0}: Error finding container be81985f4081e6bfcc71dc041b52b7f59783ab29b1cebe607d92b0d69e1e679b: Status 404 returned error can't find the container with id be81985f4081e6bfcc71dc041b52b7f59783ab29b1cebe607d92b0d69e1e679b Dec 01 08:40:55 crc kubenswrapper[5004]: I1201 08:40:55.344400 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-zqhkd"] Dec 01 08:40:55 crc kubenswrapper[5004]: W1201 08:40:55.351228 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode56ced1f_e623_4c1a_8da6_944c91827cac.slice/crio-837d474be604f73d574bf8d5be6a68205ec3c8f7bea373704b39d6d2cfa37edc WatchSource:0}: Error finding container 837d474be604f73d574bf8d5be6a68205ec3c8f7bea373704b39d6d2cfa37edc: Status 404 returned error can't find the container with id 837d474be604f73d574bf8d5be6a68205ec3c8f7bea373704b39d6d2cfa37edc Dec 01 08:40:55 crc kubenswrapper[5004]: I1201 08:40:55.427265 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-65dd66d59d-csdqc"] Dec 01 08:40:55 crc kubenswrapper[5004]: 
W1201 08:40:55.453294 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2d7ab903_f57f_450e_8af4_95ee4c219310.slice/crio-48e25e58135e65f251545800e04b5e2dbe90dba3a77b31e1afeb08e7c7399815 WatchSource:0}: Error finding container 48e25e58135e65f251545800e04b5e2dbe90dba3a77b31e1afeb08e7c7399815: Status 404 returned error can't find the container with id 48e25e58135e65f251545800e04b5e2dbe90dba3a77b31e1afeb08e7c7399815 Dec 01 08:40:55 crc kubenswrapper[5004]: I1201 08:40:55.454763 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-58d7b4cb4c-z6xck" event={"ID":"b463310a-8c0e-462e-a746-94a664a21ebe","Type":"ContainerStarted","Data":"b6346da6776190aa3029427420fcb748c2d3cf09a37bea06935b90147cbc4278"} Dec 01 08:40:55 crc kubenswrapper[5004]: I1201 08:40:55.458389 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1c910a71-8445-4c64-a555-433bb2c60bbd","Type":"ContainerStarted","Data":"5149b02f46f4e6cd86e56e13857f01668080f47cbf728b5072c7c7db0674d33e"} Dec 01 08:40:55 crc kubenswrapper[5004]: I1201 08:40:55.461342 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8775748c9-7mdhw" event={"ID":"dcdda8d0-d614-4c3e-b9a0-37f3e676c11c","Type":"ContainerStarted","Data":"7c7526d294abd41367a3ca14bd5469a718c451b7bee17d7d7adf8c51ba75ab81"} Dec 01 08:40:55 crc kubenswrapper[5004]: I1201 08:40:55.469346 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"16fb71da-95be-43fa-a52b-fb3315938e09","Type":"ContainerStarted","Data":"be81985f4081e6bfcc71dc041b52b7f59783ab29b1cebe607d92b0d69e1e679b"} Dec 01 08:40:55 crc kubenswrapper[5004]: I1201 08:40:55.479911 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-zqhkd" 
event={"ID":"e56ced1f-e623-4c1a-8da6-944c91827cac","Type":"ContainerStarted","Data":"837d474be604f73d574bf8d5be6a68205ec3c8f7bea373704b39d6d2cfa37edc"} Dec 01 08:40:55 crc kubenswrapper[5004]: I1201 08:40:55.482394 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-58bccc5494-trhd5" event={"ID":"c77ef8f3-6312-48ca-9e64-49e0db910168","Type":"ContainerStarted","Data":"fc97d789f4ccda641b039fee937e2b2fe9cd539de6bffcf92779cf9d3e46b96c"} Dec 01 08:40:55 crc kubenswrapper[5004]: I1201 08:40:55.912893 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 01 08:40:56 crc kubenswrapper[5004]: I1201 08:40:56.502238 5004 generic.go:334] "Generic (PLEG): container finished" podID="dcdda8d0-d614-4c3e-b9a0-37f3e676c11c" containerID="ddb666458d01527ee38973363c57ebab868f2acc5bac54a002894cf811624356" exitCode=0 Dec 01 08:40:56 crc kubenswrapper[5004]: I1201 08:40:56.502489 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8775748c9-7mdhw" event={"ID":"dcdda8d0-d614-4c3e-b9a0-37f3e676c11c","Type":"ContainerDied","Data":"ddb666458d01527ee38973363c57ebab868f2acc5bac54a002894cf811624356"} Dec 01 08:40:56 crc kubenswrapper[5004]: I1201 08:40:56.506983 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"16fb71da-95be-43fa-a52b-fb3315938e09","Type":"ContainerStarted","Data":"a88b56ebeae335b6b5f78e4d59b23d3201cc5bcb8179f86ff401aa2015390c91"} Dec 01 08:40:56 crc kubenswrapper[5004]: I1201 08:40:56.517465 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-65dd66d59d-csdqc" event={"ID":"2d7ab903-f57f-450e-8af4-95ee4c219310","Type":"ContainerStarted","Data":"ec3001c85cfb4b70331b0e5e19e5d9f9213368a70f563402d2b9a2a523a17e79"} Dec 01 08:40:56 crc kubenswrapper[5004]: I1201 08:40:56.517512 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-65dd66d59d-csdqc" 
event={"ID":"2d7ab903-f57f-450e-8af4-95ee4c219310","Type":"ContainerStarted","Data":"cfb456369a0911948f9c47135bb4dcc6ee04f0414ec9b430b49e3523a57c321a"} Dec 01 08:40:56 crc kubenswrapper[5004]: I1201 08:40:56.517542 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-65dd66d59d-csdqc" event={"ID":"2d7ab903-f57f-450e-8af4-95ee4c219310","Type":"ContainerStarted","Data":"48e25e58135e65f251545800e04b5e2dbe90dba3a77b31e1afeb08e7c7399815"} Dec 01 08:40:56 crc kubenswrapper[5004]: I1201 08:40:56.517604 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-65dd66d59d-csdqc" Dec 01 08:40:56 crc kubenswrapper[5004]: I1201 08:40:56.517721 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-65dd66d59d-csdqc" Dec 01 08:40:56 crc kubenswrapper[5004]: I1201 08:40:56.532999 5004 generic.go:334] "Generic (PLEG): container finished" podID="e56ced1f-e623-4c1a-8da6-944c91827cac" containerID="002293e044aefc19d7b90b6c52c2446535cf25dc6a789457717e49c13553c0ad" exitCode=0 Dec 01 08:40:56 crc kubenswrapper[5004]: I1201 08:40:56.533048 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-zqhkd" event={"ID":"e56ced1f-e623-4c1a-8da6-944c91827cac","Type":"ContainerDied","Data":"002293e044aefc19d7b90b6c52c2446535cf25dc6a789457717e49c13553c0ad"} Dec 01 08:40:56 crc kubenswrapper[5004]: I1201 08:40:56.617896 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-65dd66d59d-csdqc" podStartSLOduration=2.617878389 podStartE2EDuration="2.617878389s" podCreationTimestamp="2025-12-01 08:40:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:40:56.564109126 +0000 UTC m=+1434.129101108" watchObservedRunningTime="2025-12-01 08:40:56.617878389 +0000 UTC m=+1434.182870371" Dec 01 08:40:57 crc 
kubenswrapper[5004]: I1201 08:40:57.167988 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8775748c9-7mdhw" Dec 01 08:40:57 crc kubenswrapper[5004]: I1201 08:40:57.310031 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dcdda8d0-d614-4c3e-b9a0-37f3e676c11c-ovsdbserver-sb\") pod \"dcdda8d0-d614-4c3e-b9a0-37f3e676c11c\" (UID: \"dcdda8d0-d614-4c3e-b9a0-37f3e676c11c\") " Dec 01 08:40:57 crc kubenswrapper[5004]: I1201 08:40:57.310463 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dcdda8d0-d614-4c3e-b9a0-37f3e676c11c-dns-swift-storage-0\") pod \"dcdda8d0-d614-4c3e-b9a0-37f3e676c11c\" (UID: \"dcdda8d0-d614-4c3e-b9a0-37f3e676c11c\") " Dec 01 08:40:57 crc kubenswrapper[5004]: I1201 08:40:57.310620 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dcdda8d0-d614-4c3e-b9a0-37f3e676c11c-dns-svc\") pod \"dcdda8d0-d614-4c3e-b9a0-37f3e676c11c\" (UID: \"dcdda8d0-d614-4c3e-b9a0-37f3e676c11c\") " Dec 01 08:40:57 crc kubenswrapper[5004]: I1201 08:40:57.310724 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dcdda8d0-d614-4c3e-b9a0-37f3e676c11c-ovsdbserver-nb\") pod \"dcdda8d0-d614-4c3e-b9a0-37f3e676c11c\" (UID: \"dcdda8d0-d614-4c3e-b9a0-37f3e676c11c\") " Dec 01 08:40:57 crc kubenswrapper[5004]: I1201 08:40:57.310838 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dcdda8d0-d614-4c3e-b9a0-37f3e676c11c-config\") pod \"dcdda8d0-d614-4c3e-b9a0-37f3e676c11c\" (UID: \"dcdda8d0-d614-4c3e-b9a0-37f3e676c11c\") " Dec 01 08:40:57 crc kubenswrapper[5004]: I1201 08:40:57.310977 5004 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qt6kq\" (UniqueName: \"kubernetes.io/projected/dcdda8d0-d614-4c3e-b9a0-37f3e676c11c-kube-api-access-qt6kq\") pod \"dcdda8d0-d614-4c3e-b9a0-37f3e676c11c\" (UID: \"dcdda8d0-d614-4c3e-b9a0-37f3e676c11c\") " Dec 01 08:40:57 crc kubenswrapper[5004]: I1201 08:40:57.320332 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dcdda8d0-d614-4c3e-b9a0-37f3e676c11c-kube-api-access-qt6kq" (OuterVolumeSpecName: "kube-api-access-qt6kq") pod "dcdda8d0-d614-4c3e-b9a0-37f3e676c11c" (UID: "dcdda8d0-d614-4c3e-b9a0-37f3e676c11c"). InnerVolumeSpecName "kube-api-access-qt6kq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:40:57 crc kubenswrapper[5004]: I1201 08:40:57.344339 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dcdda8d0-d614-4c3e-b9a0-37f3e676c11c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "dcdda8d0-d614-4c3e-b9a0-37f3e676c11c" (UID: "dcdda8d0-d614-4c3e-b9a0-37f3e676c11c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:40:57 crc kubenswrapper[5004]: I1201 08:40:57.352142 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dcdda8d0-d614-4c3e-b9a0-37f3e676c11c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "dcdda8d0-d614-4c3e-b9a0-37f3e676c11c" (UID: "dcdda8d0-d614-4c3e-b9a0-37f3e676c11c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:40:57 crc kubenswrapper[5004]: I1201 08:40:57.367158 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dcdda8d0-d614-4c3e-b9a0-37f3e676c11c-config" (OuterVolumeSpecName: "config") pod "dcdda8d0-d614-4c3e-b9a0-37f3e676c11c" (UID: "dcdda8d0-d614-4c3e-b9a0-37f3e676c11c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:40:57 crc kubenswrapper[5004]: I1201 08:40:57.387912 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dcdda8d0-d614-4c3e-b9a0-37f3e676c11c-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "dcdda8d0-d614-4c3e-b9a0-37f3e676c11c" (UID: "dcdda8d0-d614-4c3e-b9a0-37f3e676c11c"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:40:57 crc kubenswrapper[5004]: I1201 08:40:57.405256 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dcdda8d0-d614-4c3e-b9a0-37f3e676c11c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "dcdda8d0-d614-4c3e-b9a0-37f3e676c11c" (UID: "dcdda8d0-d614-4c3e-b9a0-37f3e676c11c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:40:57 crc kubenswrapper[5004]: I1201 08:40:57.413247 5004 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dcdda8d0-d614-4c3e-b9a0-37f3e676c11c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 01 08:40:57 crc kubenswrapper[5004]: I1201 08:40:57.413396 5004 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dcdda8d0-d614-4c3e-b9a0-37f3e676c11c-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 01 08:40:57 crc kubenswrapper[5004]: I1201 08:40:57.413460 5004 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dcdda8d0-d614-4c3e-b9a0-37f3e676c11c-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 08:40:57 crc kubenswrapper[5004]: I1201 08:40:57.413514 5004 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dcdda8d0-d614-4c3e-b9a0-37f3e676c11c-ovsdbserver-nb\") on node 
\"crc\" DevicePath \"\"" Dec 01 08:40:57 crc kubenswrapper[5004]: I1201 08:40:57.413578 5004 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dcdda8d0-d614-4c3e-b9a0-37f3e676c11c-config\") on node \"crc\" DevicePath \"\"" Dec 01 08:40:57 crc kubenswrapper[5004]: I1201 08:40:57.413642 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qt6kq\" (UniqueName: \"kubernetes.io/projected/dcdda8d0-d614-4c3e-b9a0-37f3e676c11c-kube-api-access-qt6kq\") on node \"crc\" DevicePath \"\"" Dec 01 08:40:57 crc kubenswrapper[5004]: I1201 08:40:57.561381 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1c910a71-8445-4c64-a555-433bb2c60bbd","Type":"ContainerStarted","Data":"d5b78ca5142cc60878a4ad77fa71ddfca3d3c89f7ed8d105d0bc39e9b44fba76"} Dec 01 08:40:57 crc kubenswrapper[5004]: I1201 08:40:57.564130 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8775748c9-7mdhw" Dec 01 08:40:57 crc kubenswrapper[5004]: I1201 08:40:57.564125 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8775748c9-7mdhw" event={"ID":"dcdda8d0-d614-4c3e-b9a0-37f3e676c11c","Type":"ContainerDied","Data":"7c7526d294abd41367a3ca14bd5469a718c451b7bee17d7d7adf8c51ba75ab81"} Dec 01 08:40:57 crc kubenswrapper[5004]: I1201 08:40:57.564450 5004 scope.go:117] "RemoveContainer" containerID="ddb666458d01527ee38973363c57ebab868f2acc5bac54a002894cf811624356" Dec 01 08:40:57 crc kubenswrapper[5004]: I1201 08:40:57.572077 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"16fb71da-95be-43fa-a52b-fb3315938e09","Type":"ContainerStarted","Data":"2bf02db439a0f6bf3e43b98b9fc9fd95ebabfb103c9374b5210216103eed3ec7"} Dec 01 08:40:57 crc kubenswrapper[5004]: I1201 08:40:57.572232 5004 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/cinder-api-0" podUID="16fb71da-95be-43fa-a52b-fb3315938e09" containerName="cinder-api-log" containerID="cri-o://a88b56ebeae335b6b5f78e4d59b23d3201cc5bcb8179f86ff401aa2015390c91" gracePeriod=30 Dec 01 08:40:57 crc kubenswrapper[5004]: I1201 08:40:57.572370 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 01 08:40:57 crc kubenswrapper[5004]: I1201 08:40:57.572380 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="16fb71da-95be-43fa-a52b-fb3315938e09" containerName="cinder-api" containerID="cri-o://2bf02db439a0f6bf3e43b98b9fc9fd95ebabfb103c9374b5210216103eed3ec7" gracePeriod=30 Dec 01 08:40:57 crc kubenswrapper[5004]: I1201 08:40:57.579436 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-zqhkd" event={"ID":"e56ced1f-e623-4c1a-8da6-944c91827cac","Type":"ContainerStarted","Data":"821287477bfaf6931ed0af5b0546c9fd2cfc10b9a29881cd227e2471910721bb"} Dec 01 08:40:57 crc kubenswrapper[5004]: I1201 08:40:57.579663 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6bb4fc677f-zqhkd" Dec 01 08:40:57 crc kubenswrapper[5004]: I1201 08:40:57.599572 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.599542668 podStartE2EDuration="4.599542668s" podCreationTimestamp="2025-12-01 08:40:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:40:57.588437326 +0000 UTC m=+1435.153429308" watchObservedRunningTime="2025-12-01 08:40:57.599542668 +0000 UTC m=+1435.164534650" Dec 01 08:40:57 crc kubenswrapper[5004]: I1201 08:40:57.631831 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8775748c9-7mdhw"] Dec 01 08:40:57 crc kubenswrapper[5004]: I1201 08:40:57.643351 5004 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6bb4fc677f-zqhkd" podStartSLOduration=3.6433346479999997 podStartE2EDuration="3.643334648s" podCreationTimestamp="2025-12-01 08:40:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:40:57.641956714 +0000 UTC m=+1435.206948716" watchObservedRunningTime="2025-12-01 08:40:57.643334648 +0000 UTC m=+1435.208326630" Dec 01 08:40:57 crc kubenswrapper[5004]: I1201 08:40:57.643412 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8775748c9-7mdhw"] Dec 01 08:40:58 crc kubenswrapper[5004]: I1201 08:40:58.389669 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zgjgq" Dec 01 08:40:58 crc kubenswrapper[5004]: I1201 08:40:58.390178 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zgjgq" Dec 01 08:40:58 crc kubenswrapper[5004]: I1201 08:40:58.621847 5004 generic.go:334] "Generic (PLEG): container finished" podID="16fb71da-95be-43fa-a52b-fb3315938e09" containerID="2bf02db439a0f6bf3e43b98b9fc9fd95ebabfb103c9374b5210216103eed3ec7" exitCode=0 Dec 01 08:40:58 crc kubenswrapper[5004]: I1201 08:40:58.622114 5004 generic.go:334] "Generic (PLEG): container finished" podID="16fb71da-95be-43fa-a52b-fb3315938e09" containerID="a88b56ebeae335b6b5f78e4d59b23d3201cc5bcb8179f86ff401aa2015390c91" exitCode=143 Dec 01 08:40:58 crc kubenswrapper[5004]: I1201 08:40:58.622483 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"16fb71da-95be-43fa-a52b-fb3315938e09","Type":"ContainerDied","Data":"2bf02db439a0f6bf3e43b98b9fc9fd95ebabfb103c9374b5210216103eed3ec7"} Dec 01 08:40:58 crc kubenswrapper[5004]: I1201 08:40:58.622535 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cinder-api-0" event={"ID":"16fb71da-95be-43fa-a52b-fb3315938e09","Type":"ContainerDied","Data":"a88b56ebeae335b6b5f78e4d59b23d3201cc5bcb8179f86ff401aa2015390c91"} Dec 01 08:40:58 crc kubenswrapper[5004]: I1201 08:40:58.622546 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"16fb71da-95be-43fa-a52b-fb3315938e09","Type":"ContainerDied","Data":"be81985f4081e6bfcc71dc041b52b7f59783ab29b1cebe607d92b0d69e1e679b"} Dec 01 08:40:58 crc kubenswrapper[5004]: I1201 08:40:58.622557 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="be81985f4081e6bfcc71dc041b52b7f59783ab29b1cebe607d92b0d69e1e679b" Dec 01 08:40:58 crc kubenswrapper[5004]: I1201 08:40:58.684955 5004 generic.go:334] "Generic (PLEG): container finished" podID="1fd8d826-4dab-4d07-bc04-be5dfebdaf2b" containerID="e955dd78612c15b9cc804f7c4d3a8fa1a731e8687153b47cd4fdd2a0719e29eb" exitCode=0 Dec 01 08:40:58 crc kubenswrapper[5004]: I1201 08:40:58.685054 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1fd8d826-4dab-4d07-bc04-be5dfebdaf2b","Type":"ContainerDied","Data":"e955dd78612c15b9cc804f7c4d3a8fa1a731e8687153b47cd4fdd2a0719e29eb"} Dec 01 08:40:58 crc kubenswrapper[5004]: I1201 08:40:58.686157 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 01 08:40:58 crc kubenswrapper[5004]: I1201 08:40:58.693075 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-58bccc5494-trhd5" event={"ID":"c77ef8f3-6312-48ca-9e64-49e0db910168","Type":"ContainerStarted","Data":"83cd7943cab2e60c246d3b393967e17c98d041a912fa0dd23ea39f4c2f80262d"} Dec 01 08:40:58 crc kubenswrapper[5004]: I1201 08:40:58.703329 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-58d7b4cb4c-z6xck" event={"ID":"b463310a-8c0e-462e-a746-94a664a21ebe","Type":"ContainerStarted","Data":"167d83db337111b83e6b2f842610caf79f63b25cca93f26584611dd48e7808fd"} Dec 01 08:40:58 crc kubenswrapper[5004]: I1201 08:40:58.742236 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/16fb71da-95be-43fa-a52b-fb3315938e09-config-data-custom\") pod \"16fb71da-95be-43fa-a52b-fb3315938e09\" (UID: \"16fb71da-95be-43fa-a52b-fb3315938e09\") " Dec 01 08:40:58 crc kubenswrapper[5004]: I1201 08:40:58.742380 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16fb71da-95be-43fa-a52b-fb3315938e09-logs\") pod \"16fb71da-95be-43fa-a52b-fb3315938e09\" (UID: \"16fb71da-95be-43fa-a52b-fb3315938e09\") " Dec 01 08:40:58 crc kubenswrapper[5004]: I1201 08:40:58.742479 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16fb71da-95be-43fa-a52b-fb3315938e09-config-data\") pod \"16fb71da-95be-43fa-a52b-fb3315938e09\" (UID: \"16fb71da-95be-43fa-a52b-fb3315938e09\") " Dec 01 08:40:58 crc kubenswrapper[5004]: I1201 08:40:58.742508 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/16fb71da-95be-43fa-a52b-fb3315938e09-combined-ca-bundle\") pod \"16fb71da-95be-43fa-a52b-fb3315938e09\" (UID: \"16fb71da-95be-43fa-a52b-fb3315938e09\") " Dec 01 08:40:58 crc kubenswrapper[5004]: I1201 08:40:58.742534 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/16fb71da-95be-43fa-a52b-fb3315938e09-etc-machine-id\") pod \"16fb71da-95be-43fa-a52b-fb3315938e09\" (UID: \"16fb71da-95be-43fa-a52b-fb3315938e09\") " Dec 01 08:40:58 crc kubenswrapper[5004]: I1201 08:40:58.742577 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6vvn6\" (UniqueName: \"kubernetes.io/projected/16fb71da-95be-43fa-a52b-fb3315938e09-kube-api-access-6vvn6\") pod \"16fb71da-95be-43fa-a52b-fb3315938e09\" (UID: \"16fb71da-95be-43fa-a52b-fb3315938e09\") " Dec 01 08:40:58 crc kubenswrapper[5004]: I1201 08:40:58.742973 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16fb71da-95be-43fa-a52b-fb3315938e09-scripts\") pod \"16fb71da-95be-43fa-a52b-fb3315938e09\" (UID: \"16fb71da-95be-43fa-a52b-fb3315938e09\") " Dec 01 08:40:58 crc kubenswrapper[5004]: I1201 08:40:58.744062 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/16fb71da-95be-43fa-a52b-fb3315938e09-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "16fb71da-95be-43fa-a52b-fb3315938e09" (UID: "16fb71da-95be-43fa-a52b-fb3315938e09"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 08:40:58 crc kubenswrapper[5004]: I1201 08:40:58.744994 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16fb71da-95be-43fa-a52b-fb3315938e09-logs" (OuterVolumeSpecName: "logs") pod "16fb71da-95be-43fa-a52b-fb3315938e09" (UID: "16fb71da-95be-43fa-a52b-fb3315938e09"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 08:40:58 crc kubenswrapper[5004]: I1201 08:40:58.747143 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16fb71da-95be-43fa-a52b-fb3315938e09-scripts" (OuterVolumeSpecName: "scripts") pod "16fb71da-95be-43fa-a52b-fb3315938e09" (UID: "16fb71da-95be-43fa-a52b-fb3315938e09"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:40:58 crc kubenswrapper[5004]: I1201 08:40:58.760124 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16fb71da-95be-43fa-a52b-fb3315938e09-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "16fb71da-95be-43fa-a52b-fb3315938e09" (UID: "16fb71da-95be-43fa-a52b-fb3315938e09"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:40:58 crc kubenswrapper[5004]: I1201 08:40:58.762109 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16fb71da-95be-43fa-a52b-fb3315938e09-kube-api-access-6vvn6" (OuterVolumeSpecName: "kube-api-access-6vvn6") pod "16fb71da-95be-43fa-a52b-fb3315938e09" (UID: "16fb71da-95be-43fa-a52b-fb3315938e09"). InnerVolumeSpecName "kube-api-access-6vvn6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:40:58 crc kubenswrapper[5004]: I1201 08:40:58.804841 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dcdda8d0-d614-4c3e-b9a0-37f3e676c11c" path="/var/lib/kubelet/pods/dcdda8d0-d614-4c3e-b9a0-37f3e676c11c/volumes" Dec 01 08:40:58 crc kubenswrapper[5004]: I1201 08:40:58.831135 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16fb71da-95be-43fa-a52b-fb3315938e09-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "16fb71da-95be-43fa-a52b-fb3315938e09" (UID: "16fb71da-95be-43fa-a52b-fb3315938e09"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:40:58 crc kubenswrapper[5004]: I1201 08:40:58.850053 5004 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16fb71da-95be-43fa-a52b-fb3315938e09-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 08:40:58 crc kubenswrapper[5004]: I1201 08:40:58.850464 5004 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/16fb71da-95be-43fa-a52b-fb3315938e09-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 01 08:40:58 crc kubenswrapper[5004]: I1201 08:40:58.850524 5004 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16fb71da-95be-43fa-a52b-fb3315938e09-logs\") on node \"crc\" DevicePath \"\"" Dec 01 08:40:58 crc kubenswrapper[5004]: I1201 08:40:58.850619 5004 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16fb71da-95be-43fa-a52b-fb3315938e09-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 08:40:58 crc kubenswrapper[5004]: I1201 08:40:58.850685 5004 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/16fb71da-95be-43fa-a52b-fb3315938e09-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 01 08:40:58 crc kubenswrapper[5004]: I1201 08:40:58.850738 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6vvn6\" (UniqueName: \"kubernetes.io/projected/16fb71da-95be-43fa-a52b-fb3315938e09-kube-api-access-6vvn6\") on node \"crc\" DevicePath \"\"" Dec 01 08:40:58 crc kubenswrapper[5004]: I1201 08:40:58.872332 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 08:40:58 crc kubenswrapper[5004]: I1201 08:40:58.878497 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16fb71da-95be-43fa-a52b-fb3315938e09-config-data" (OuterVolumeSpecName: "config-data") pod "16fb71da-95be-43fa-a52b-fb3315938e09" (UID: "16fb71da-95be-43fa-a52b-fb3315938e09"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:40:58 crc kubenswrapper[5004]: I1201 08:40:58.952317 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1fd8d826-4dab-4d07-bc04-be5dfebdaf2b-scripts\") pod \"1fd8d826-4dab-4d07-bc04-be5dfebdaf2b\" (UID: \"1fd8d826-4dab-4d07-bc04-be5dfebdaf2b\") " Dec 01 08:40:58 crc kubenswrapper[5004]: I1201 08:40:58.952384 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1fd8d826-4dab-4d07-bc04-be5dfebdaf2b-log-httpd\") pod \"1fd8d826-4dab-4d07-bc04-be5dfebdaf2b\" (UID: \"1fd8d826-4dab-4d07-bc04-be5dfebdaf2b\") " Dec 01 08:40:58 crc kubenswrapper[5004]: I1201 08:40:58.952412 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fd8d826-4dab-4d07-bc04-be5dfebdaf2b-combined-ca-bundle\") pod \"1fd8d826-4dab-4d07-bc04-be5dfebdaf2b\" (UID: 
\"1fd8d826-4dab-4d07-bc04-be5dfebdaf2b\") " Dec 01 08:40:58 crc kubenswrapper[5004]: I1201 08:40:58.952464 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fd8d826-4dab-4d07-bc04-be5dfebdaf2b-config-data\") pod \"1fd8d826-4dab-4d07-bc04-be5dfebdaf2b\" (UID: \"1fd8d826-4dab-4d07-bc04-be5dfebdaf2b\") " Dec 01 08:40:58 crc kubenswrapper[5004]: I1201 08:40:58.952548 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1fd8d826-4dab-4d07-bc04-be5dfebdaf2b-sg-core-conf-yaml\") pod \"1fd8d826-4dab-4d07-bc04-be5dfebdaf2b\" (UID: \"1fd8d826-4dab-4d07-bc04-be5dfebdaf2b\") " Dec 01 08:40:58 crc kubenswrapper[5004]: I1201 08:40:58.952617 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1fd8d826-4dab-4d07-bc04-be5dfebdaf2b-run-httpd\") pod \"1fd8d826-4dab-4d07-bc04-be5dfebdaf2b\" (UID: \"1fd8d826-4dab-4d07-bc04-be5dfebdaf2b\") " Dec 01 08:40:58 crc kubenswrapper[5004]: I1201 08:40:58.952747 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2mfkl\" (UniqueName: \"kubernetes.io/projected/1fd8d826-4dab-4d07-bc04-be5dfebdaf2b-kube-api-access-2mfkl\") pod \"1fd8d826-4dab-4d07-bc04-be5dfebdaf2b\" (UID: \"1fd8d826-4dab-4d07-bc04-be5dfebdaf2b\") " Dec 01 08:40:58 crc kubenswrapper[5004]: I1201 08:40:58.953185 5004 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16fb71da-95be-43fa-a52b-fb3315938e09-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 08:40:58 crc kubenswrapper[5004]: I1201 08:40:58.953746 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1fd8d826-4dab-4d07-bc04-be5dfebdaf2b-run-httpd" (OuterVolumeSpecName: "run-httpd") pod 
"1fd8d826-4dab-4d07-bc04-be5dfebdaf2b" (UID: "1fd8d826-4dab-4d07-bc04-be5dfebdaf2b"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 08:40:58 crc kubenswrapper[5004]: I1201 08:40:58.953866 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1fd8d826-4dab-4d07-bc04-be5dfebdaf2b-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "1fd8d826-4dab-4d07-bc04-be5dfebdaf2b" (UID: "1fd8d826-4dab-4d07-bc04-be5dfebdaf2b"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 08:40:58 crc kubenswrapper[5004]: I1201 08:40:58.959509 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fd8d826-4dab-4d07-bc04-be5dfebdaf2b-scripts" (OuterVolumeSpecName: "scripts") pod "1fd8d826-4dab-4d07-bc04-be5dfebdaf2b" (UID: "1fd8d826-4dab-4d07-bc04-be5dfebdaf2b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:40:58 crc kubenswrapper[5004]: I1201 08:40:58.959526 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1fd8d826-4dab-4d07-bc04-be5dfebdaf2b-kube-api-access-2mfkl" (OuterVolumeSpecName: "kube-api-access-2mfkl") pod "1fd8d826-4dab-4d07-bc04-be5dfebdaf2b" (UID: "1fd8d826-4dab-4d07-bc04-be5dfebdaf2b"). InnerVolumeSpecName "kube-api-access-2mfkl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:40:58 crc kubenswrapper[5004]: I1201 08:40:58.988657 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fd8d826-4dab-4d07-bc04-be5dfebdaf2b-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "1fd8d826-4dab-4d07-bc04-be5dfebdaf2b" (UID: "1fd8d826-4dab-4d07-bc04-be5dfebdaf2b"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:40:59 crc kubenswrapper[5004]: I1201 08:40:59.021149 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fd8d826-4dab-4d07-bc04-be5dfebdaf2b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1fd8d826-4dab-4d07-bc04-be5dfebdaf2b" (UID: "1fd8d826-4dab-4d07-bc04-be5dfebdaf2b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:40:59 crc kubenswrapper[5004]: I1201 08:40:59.037704 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fd8d826-4dab-4d07-bc04-be5dfebdaf2b-config-data" (OuterVolumeSpecName: "config-data") pod "1fd8d826-4dab-4d07-bc04-be5dfebdaf2b" (UID: "1fd8d826-4dab-4d07-bc04-be5dfebdaf2b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:40:59 crc kubenswrapper[5004]: I1201 08:40:59.054528 5004 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1fd8d826-4dab-4d07-bc04-be5dfebdaf2b-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 01 08:40:59 crc kubenswrapper[5004]: I1201 08:40:59.054584 5004 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1fd8d826-4dab-4d07-bc04-be5dfebdaf2b-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 01 08:40:59 crc kubenswrapper[5004]: I1201 08:40:59.054597 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2mfkl\" (UniqueName: \"kubernetes.io/projected/1fd8d826-4dab-4d07-bc04-be5dfebdaf2b-kube-api-access-2mfkl\") on node \"crc\" DevicePath \"\"" Dec 01 08:40:59 crc kubenswrapper[5004]: I1201 08:40:59.054607 5004 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1fd8d826-4dab-4d07-bc04-be5dfebdaf2b-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 
08:40:59 crc kubenswrapper[5004]: I1201 08:40:59.054616 5004 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1fd8d826-4dab-4d07-bc04-be5dfebdaf2b-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 01 08:40:59 crc kubenswrapper[5004]: I1201 08:40:59.054623 5004 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fd8d826-4dab-4d07-bc04-be5dfebdaf2b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 08:40:59 crc kubenswrapper[5004]: I1201 08:40:59.054633 5004 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fd8d826-4dab-4d07-bc04-be5dfebdaf2b-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 08:40:59 crc kubenswrapper[5004]: I1201 08:40:59.493334 5004 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-zgjgq" podUID="f4c8bbd5-63c7-441d-9262-b8d44344c7fa" containerName="registry-server" probeResult="failure" output=< Dec 01 08:40:59 crc kubenswrapper[5004]: timeout: failed to connect service ":50051" within 1s Dec 01 08:40:59 crc kubenswrapper[5004]: > Dec 01 08:40:59 crc kubenswrapper[5004]: I1201 08:40:59.728928 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1fd8d826-4dab-4d07-bc04-be5dfebdaf2b","Type":"ContainerDied","Data":"6f08f218b7a030d6e0d1680710ecf70e182562331ffdb4941d7d886be6ca2db6"} Dec 01 08:40:59 crc kubenswrapper[5004]: I1201 08:40:59.729422 5004 scope.go:117] "RemoveContainer" containerID="65429ecf681ad84a20e124bf7210bd69eac351831158bdd2b9f27340b34a19da" Dec 01 08:40:59 crc kubenswrapper[5004]: I1201 08:40:59.729751 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 01 08:40:59 crc kubenswrapper[5004]: I1201 08:40:59.734306 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-58bccc5494-trhd5" event={"ID":"c77ef8f3-6312-48ca-9e64-49e0db910168","Type":"ContainerStarted","Data":"ba2939220ff363c3b58953e2aaeec699a08b4af57c7b60a413dafc8508446f4d"} Dec 01 08:40:59 crc kubenswrapper[5004]: I1201 08:40:59.742199 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-58d7b4cb4c-z6xck" event={"ID":"b463310a-8c0e-462e-a746-94a664a21ebe","Type":"ContainerStarted","Data":"0b8cee27f59574071e65fa7b0e44d0a2d39ad74332c4db1ee8e6056ecda43b5c"} Dec 01 08:40:59 crc kubenswrapper[5004]: I1201 08:40:59.752268 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 01 08:40:59 crc kubenswrapper[5004]: I1201 08:40:59.752296 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1c910a71-8445-4c64-a555-433bb2c60bbd","Type":"ContainerStarted","Data":"4c59f8c69a9f6c8e17c3d7dd2514a80d7b6af36483e212edd7176fddc028f54e"} Dec 01 08:40:59 crc kubenswrapper[5004]: I1201 08:40:59.768939 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-58bccc5494-trhd5" podStartSLOduration=3.626964482 podStartE2EDuration="6.768918838s" podCreationTimestamp="2025-12-01 08:40:53 +0000 UTC" firstStartedPulling="2025-12-01 08:40:54.959674396 +0000 UTC m=+1432.524666378" lastFinishedPulling="2025-12-01 08:40:58.101628752 +0000 UTC m=+1435.666620734" observedRunningTime="2025-12-01 08:40:59.755187133 +0000 UTC m=+1437.320179125" watchObservedRunningTime="2025-12-01 08:40:59.768918838 +0000 UTC m=+1437.333910830" Dec 01 08:40:59 crc kubenswrapper[5004]: I1201 08:40:59.802591 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 08:40:59 crc 
kubenswrapper[5004]: I1201 08:40:59.816144 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 01 08:40:59 crc kubenswrapper[5004]: I1201 08:40:59.831265 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 01 08:40:59 crc kubenswrapper[5004]: E1201 08:40:59.831927 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16fb71da-95be-43fa-a52b-fb3315938e09" containerName="cinder-api-log" Dec 01 08:40:59 crc kubenswrapper[5004]: I1201 08:40:59.831965 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="16fb71da-95be-43fa-a52b-fb3315938e09" containerName="cinder-api-log" Dec 01 08:40:59 crc kubenswrapper[5004]: E1201 08:40:59.832004 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fd8d826-4dab-4d07-bc04-be5dfebdaf2b" containerName="proxy-httpd" Dec 01 08:40:59 crc kubenswrapper[5004]: I1201 08:40:59.832013 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fd8d826-4dab-4d07-bc04-be5dfebdaf2b" containerName="proxy-httpd" Dec 01 08:40:59 crc kubenswrapper[5004]: E1201 08:40:59.832034 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcdda8d0-d614-4c3e-b9a0-37f3e676c11c" containerName="init" Dec 01 08:40:59 crc kubenswrapper[5004]: I1201 08:40:59.832044 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcdda8d0-d614-4c3e-b9a0-37f3e676c11c" containerName="init" Dec 01 08:40:59 crc kubenswrapper[5004]: E1201 08:40:59.832054 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fd8d826-4dab-4d07-bc04-be5dfebdaf2b" containerName="ceilometer-notification-agent" Dec 01 08:40:59 crc kubenswrapper[5004]: I1201 08:40:59.832062 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fd8d826-4dab-4d07-bc04-be5dfebdaf2b" containerName="ceilometer-notification-agent" Dec 01 08:40:59 crc kubenswrapper[5004]: E1201 08:40:59.832089 5004 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="1fd8d826-4dab-4d07-bc04-be5dfebdaf2b" containerName="sg-core" Dec 01 08:40:59 crc kubenswrapper[5004]: I1201 08:40:59.832097 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fd8d826-4dab-4d07-bc04-be5dfebdaf2b" containerName="sg-core" Dec 01 08:40:59 crc kubenswrapper[5004]: E1201 08:40:59.832114 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16fb71da-95be-43fa-a52b-fb3315938e09" containerName="cinder-api" Dec 01 08:40:59 crc kubenswrapper[5004]: I1201 08:40:59.832133 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="16fb71da-95be-43fa-a52b-fb3315938e09" containerName="cinder-api" Dec 01 08:40:59 crc kubenswrapper[5004]: I1201 08:40:59.832437 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fd8d826-4dab-4d07-bc04-be5dfebdaf2b" containerName="proxy-httpd" Dec 01 08:40:59 crc kubenswrapper[5004]: I1201 08:40:59.832459 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="16fb71da-95be-43fa-a52b-fb3315938e09" containerName="cinder-api" Dec 01 08:40:59 crc kubenswrapper[5004]: I1201 08:40:59.832477 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="16fb71da-95be-43fa-a52b-fb3315938e09" containerName="cinder-api-log" Dec 01 08:40:59 crc kubenswrapper[5004]: I1201 08:40:59.832491 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fd8d826-4dab-4d07-bc04-be5dfebdaf2b" containerName="ceilometer-notification-agent" Dec 01 08:40:59 crc kubenswrapper[5004]: I1201 08:40:59.832506 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fd8d826-4dab-4d07-bc04-be5dfebdaf2b" containerName="sg-core" Dec 01 08:40:59 crc kubenswrapper[5004]: I1201 08:40:59.832527 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="dcdda8d0-d614-4c3e-b9a0-37f3e676c11c" containerName="init" Dec 01 08:40:59 crc kubenswrapper[5004]: I1201 08:40:59.836658 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/cinder-scheduler-0" podStartSLOduration=5.421158557 podStartE2EDuration="6.836638462s" podCreationTimestamp="2025-12-01 08:40:53 +0000 UTC" firstStartedPulling="2025-12-01 08:40:54.713066412 +0000 UTC m=+1432.278058394" lastFinishedPulling="2025-12-01 08:40:56.128546317 +0000 UTC m=+1433.693538299" observedRunningTime="2025-12-01 08:40:59.814865311 +0000 UTC m=+1437.379857293" watchObservedRunningTime="2025-12-01 08:40:59.836638462 +0000 UTC m=+1437.401630444" Dec 01 08:40:59 crc kubenswrapper[5004]: I1201 08:40:59.846998 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 08:40:59 crc kubenswrapper[5004]: I1201 08:40:59.849710 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 01 08:40:59 crc kubenswrapper[5004]: I1201 08:40:59.849941 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 01 08:40:59 crc kubenswrapper[5004]: I1201 08:40:59.855346 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 08:40:59 crc kubenswrapper[5004]: I1201 08:40:59.857078 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-58d7b4cb4c-z6xck" podStartSLOduration=3.822643521 podStartE2EDuration="6.857067431s" podCreationTimestamp="2025-12-01 08:40:53 +0000 UTC" firstStartedPulling="2025-12-01 08:40:55.132826025 +0000 UTC m=+1432.697818007" lastFinishedPulling="2025-12-01 08:40:58.167249915 +0000 UTC m=+1435.732241917" observedRunningTime="2025-12-01 08:40:59.845083038 +0000 UTC m=+1437.410075020" watchObservedRunningTime="2025-12-01 08:40:59.857067431 +0000 UTC m=+1437.422059413" Dec 01 08:40:59 crc kubenswrapper[5004]: I1201 08:40:59.907461 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 01 08:40:59 crc kubenswrapper[5004]: I1201 08:40:59.930599 5004 kubelet.go:2431] "SyncLoop 
REMOVE" source="api" pods=["openstack/cinder-api-0"] Dec 01 08:40:59 crc kubenswrapper[5004]: I1201 08:40:59.948183 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 01 08:40:59 crc kubenswrapper[5004]: I1201 08:40:59.949845 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 01 08:40:59 crc kubenswrapper[5004]: I1201 08:40:59.958983 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 01 08:40:59 crc kubenswrapper[5004]: I1201 08:40:59.959969 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 01 08:40:59 crc kubenswrapper[5004]: I1201 08:40:59.960109 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Dec 01 08:40:59 crc kubenswrapper[5004]: I1201 08:40:59.960180 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Dec 01 08:40:59 crc kubenswrapper[5004]: I1201 08:40:59.975033 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8d0df30-51e3-436d-a6a6-f65dcce6db43-config-data\") pod \"ceilometer-0\" (UID: \"e8d0df30-51e3-436d-a6a6-f65dcce6db43\") " pod="openstack/ceilometer-0" Dec 01 08:40:59 crc kubenswrapper[5004]: I1201 08:40:59.975077 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8d0df30-51e3-436d-a6a6-f65dcce6db43-scripts\") pod \"ceilometer-0\" (UID: \"e8d0df30-51e3-436d-a6a6-f65dcce6db43\") " pod="openstack/ceilometer-0" Dec 01 08:40:59 crc kubenswrapper[5004]: I1201 08:40:59.975100 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e8d0df30-51e3-436d-a6a6-f65dcce6db43-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e8d0df30-51e3-436d-a6a6-f65dcce6db43\") " pod="openstack/ceilometer-0" Dec 01 08:40:59 crc kubenswrapper[5004]: I1201 08:40:59.975147 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e8d0df30-51e3-436d-a6a6-f65dcce6db43-log-httpd\") pod \"ceilometer-0\" (UID: \"e8d0df30-51e3-436d-a6a6-f65dcce6db43\") " pod="openstack/ceilometer-0" Dec 01 08:40:59 crc kubenswrapper[5004]: I1201 08:40:59.975173 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e8d0df30-51e3-436d-a6a6-f65dcce6db43-run-httpd\") pod \"ceilometer-0\" (UID: \"e8d0df30-51e3-436d-a6a6-f65dcce6db43\") " pod="openstack/ceilometer-0" Dec 01 08:40:59 crc kubenswrapper[5004]: I1201 08:40:59.975232 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vsjpx\" (UniqueName: \"kubernetes.io/projected/e8d0df30-51e3-436d-a6a6-f65dcce6db43-kube-api-access-vsjpx\") pod \"ceilometer-0\" (UID: \"e8d0df30-51e3-436d-a6a6-f65dcce6db43\") " pod="openstack/ceilometer-0" Dec 01 08:40:59 crc kubenswrapper[5004]: I1201 08:40:59.975270 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e8d0df30-51e3-436d-a6a6-f65dcce6db43-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e8d0df30-51e3-436d-a6a6-f65dcce6db43\") " pod="openstack/ceilometer-0" Dec 01 08:41:00 crc kubenswrapper[5004]: I1201 08:41:00.077476 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e8d0df30-51e3-436d-a6a6-f65dcce6db43-log-httpd\") pod \"ceilometer-0\" (UID: \"e8d0df30-51e3-436d-a6a6-f65dcce6db43\") 
" pod="openstack/ceilometer-0" Dec 01 08:41:00 crc kubenswrapper[5004]: I1201 08:41:00.077537 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e8d0df30-51e3-436d-a6a6-f65dcce6db43-run-httpd\") pod \"ceilometer-0\" (UID: \"e8d0df30-51e3-436d-a6a6-f65dcce6db43\") " pod="openstack/ceilometer-0" Dec 01 08:41:00 crc kubenswrapper[5004]: I1201 08:41:00.077581 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c809558d-b402-4195-b03a-dbc3a1b96707-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"c809558d-b402-4195-b03a-dbc3a1b96707\") " pod="openstack/cinder-api-0" Dec 01 08:41:00 crc kubenswrapper[5004]: I1201 08:41:00.077631 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c809558d-b402-4195-b03a-dbc3a1b96707-config-data-custom\") pod \"cinder-api-0\" (UID: \"c809558d-b402-4195-b03a-dbc3a1b96707\") " pod="openstack/cinder-api-0" Dec 01 08:41:00 crc kubenswrapper[5004]: I1201 08:41:00.077654 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c809558d-b402-4195-b03a-dbc3a1b96707-etc-machine-id\") pod \"cinder-api-0\" (UID: \"c809558d-b402-4195-b03a-dbc3a1b96707\") " pod="openstack/cinder-api-0" Dec 01 08:41:00 crc kubenswrapper[5004]: I1201 08:41:00.077699 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vsjpx\" (UniqueName: \"kubernetes.io/projected/e8d0df30-51e3-436d-a6a6-f65dcce6db43-kube-api-access-vsjpx\") pod \"ceilometer-0\" (UID: \"e8d0df30-51e3-436d-a6a6-f65dcce6db43\") " pod="openstack/ceilometer-0" Dec 01 08:41:00 crc kubenswrapper[5004]: I1201 08:41:00.077722 5004 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c809558d-b402-4195-b03a-dbc3a1b96707-scripts\") pod \"cinder-api-0\" (UID: \"c809558d-b402-4195-b03a-dbc3a1b96707\") " pod="openstack/cinder-api-0" Dec 01 08:41:00 crc kubenswrapper[5004]: I1201 08:41:00.077757 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e8d0df30-51e3-436d-a6a6-f65dcce6db43-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e8d0df30-51e3-436d-a6a6-f65dcce6db43\") " pod="openstack/ceilometer-0" Dec 01 08:41:00 crc kubenswrapper[5004]: I1201 08:41:00.077772 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c809558d-b402-4195-b03a-dbc3a1b96707-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"c809558d-b402-4195-b03a-dbc3a1b96707\") " pod="openstack/cinder-api-0" Dec 01 08:41:00 crc kubenswrapper[5004]: I1201 08:41:00.077801 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c809558d-b402-4195-b03a-dbc3a1b96707-config-data\") pod \"cinder-api-0\" (UID: \"c809558d-b402-4195-b03a-dbc3a1b96707\") " pod="openstack/cinder-api-0" Dec 01 08:41:00 crc kubenswrapper[5004]: I1201 08:41:00.077814 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c809558d-b402-4195-b03a-dbc3a1b96707-logs\") pod \"cinder-api-0\" (UID: \"c809558d-b402-4195-b03a-dbc3a1b96707\") " pod="openstack/cinder-api-0" Dec 01 08:41:00 crc kubenswrapper[5004]: I1201 08:41:00.077858 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8d0df30-51e3-436d-a6a6-f65dcce6db43-config-data\") pod \"ceilometer-0\" 
(UID: \"e8d0df30-51e3-436d-a6a6-f65dcce6db43\") " pod="openstack/ceilometer-0" Dec 01 08:41:00 crc kubenswrapper[5004]: I1201 08:41:00.077883 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8d0df30-51e3-436d-a6a6-f65dcce6db43-scripts\") pod \"ceilometer-0\" (UID: \"e8d0df30-51e3-436d-a6a6-f65dcce6db43\") " pod="openstack/ceilometer-0" Dec 01 08:41:00 crc kubenswrapper[5004]: I1201 08:41:00.077900 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c809558d-b402-4195-b03a-dbc3a1b96707-public-tls-certs\") pod \"cinder-api-0\" (UID: \"c809558d-b402-4195-b03a-dbc3a1b96707\") " pod="openstack/cinder-api-0" Dec 01 08:41:00 crc kubenswrapper[5004]: I1201 08:41:00.077914 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjtnb\" (UniqueName: \"kubernetes.io/projected/c809558d-b402-4195-b03a-dbc3a1b96707-kube-api-access-kjtnb\") pod \"cinder-api-0\" (UID: \"c809558d-b402-4195-b03a-dbc3a1b96707\") " pod="openstack/cinder-api-0" Dec 01 08:41:00 crc kubenswrapper[5004]: I1201 08:41:00.077929 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8d0df30-51e3-436d-a6a6-f65dcce6db43-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e8d0df30-51e3-436d-a6a6-f65dcce6db43\") " pod="openstack/ceilometer-0" Dec 01 08:41:00 crc kubenswrapper[5004]: I1201 08:41:00.078870 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e8d0df30-51e3-436d-a6a6-f65dcce6db43-log-httpd\") pod \"ceilometer-0\" (UID: \"e8d0df30-51e3-436d-a6a6-f65dcce6db43\") " pod="openstack/ceilometer-0" Dec 01 08:41:00 crc kubenswrapper[5004]: I1201 08:41:00.078956 5004 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e8d0df30-51e3-436d-a6a6-f65dcce6db43-run-httpd\") pod \"ceilometer-0\" (UID: \"e8d0df30-51e3-436d-a6a6-f65dcce6db43\") " pod="openstack/ceilometer-0" Dec 01 08:41:00 crc kubenswrapper[5004]: I1201 08:41:00.129220 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-745fdd7bc8-x8cmf" Dec 01 08:41:00 crc kubenswrapper[5004]: I1201 08:41:00.153958 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e8d0df30-51e3-436d-a6a6-f65dcce6db43-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e8d0df30-51e3-436d-a6a6-f65dcce6db43\") " pod="openstack/ceilometer-0" Dec 01 08:41:00 crc kubenswrapper[5004]: I1201 08:41:00.154013 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8d0df30-51e3-436d-a6a6-f65dcce6db43-scripts\") pod \"ceilometer-0\" (UID: \"e8d0df30-51e3-436d-a6a6-f65dcce6db43\") " pod="openstack/ceilometer-0" Dec 01 08:41:00 crc kubenswrapper[5004]: I1201 08:41:00.154117 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vsjpx\" (UniqueName: \"kubernetes.io/projected/e8d0df30-51e3-436d-a6a6-f65dcce6db43-kube-api-access-vsjpx\") pod \"ceilometer-0\" (UID: \"e8d0df30-51e3-436d-a6a6-f65dcce6db43\") " pod="openstack/ceilometer-0" Dec 01 08:41:00 crc kubenswrapper[5004]: I1201 08:41:00.157371 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8d0df30-51e3-436d-a6a6-f65dcce6db43-config-data\") pod \"ceilometer-0\" (UID: \"e8d0df30-51e3-436d-a6a6-f65dcce6db43\") " pod="openstack/ceilometer-0" Dec 01 08:41:00 crc kubenswrapper[5004]: I1201 08:41:00.161051 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e8d0df30-51e3-436d-a6a6-f65dcce6db43-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e8d0df30-51e3-436d-a6a6-f65dcce6db43\") " pod="openstack/ceilometer-0" Dec 01 08:41:00 crc kubenswrapper[5004]: I1201 08:41:00.172771 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 08:41:00 crc kubenswrapper[5004]: I1201 08:41:00.179173 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c809558d-b402-4195-b03a-dbc3a1b96707-public-tls-certs\") pod \"cinder-api-0\" (UID: \"c809558d-b402-4195-b03a-dbc3a1b96707\") " pod="openstack/cinder-api-0" Dec 01 08:41:00 crc kubenswrapper[5004]: I1201 08:41:00.179217 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kjtnb\" (UniqueName: \"kubernetes.io/projected/c809558d-b402-4195-b03a-dbc3a1b96707-kube-api-access-kjtnb\") pod \"cinder-api-0\" (UID: \"c809558d-b402-4195-b03a-dbc3a1b96707\") " pod="openstack/cinder-api-0" Dec 01 08:41:00 crc kubenswrapper[5004]: I1201 08:41:00.179287 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c809558d-b402-4195-b03a-dbc3a1b96707-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"c809558d-b402-4195-b03a-dbc3a1b96707\") " pod="openstack/cinder-api-0" Dec 01 08:41:00 crc kubenswrapper[5004]: I1201 08:41:00.179334 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c809558d-b402-4195-b03a-dbc3a1b96707-config-data-custom\") pod \"cinder-api-0\" (UID: \"c809558d-b402-4195-b03a-dbc3a1b96707\") " pod="openstack/cinder-api-0" Dec 01 08:41:00 crc kubenswrapper[5004]: I1201 08:41:00.179359 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/c809558d-b402-4195-b03a-dbc3a1b96707-etc-machine-id\") pod \"cinder-api-0\" (UID: \"c809558d-b402-4195-b03a-dbc3a1b96707\") " pod="openstack/cinder-api-0" Dec 01 08:41:00 crc kubenswrapper[5004]: I1201 08:41:00.179394 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c809558d-b402-4195-b03a-dbc3a1b96707-scripts\") pod \"cinder-api-0\" (UID: \"c809558d-b402-4195-b03a-dbc3a1b96707\") " pod="openstack/cinder-api-0" Dec 01 08:41:00 crc kubenswrapper[5004]: I1201 08:41:00.179480 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c809558d-b402-4195-b03a-dbc3a1b96707-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"c809558d-b402-4195-b03a-dbc3a1b96707\") " pod="openstack/cinder-api-0" Dec 01 08:41:00 crc kubenswrapper[5004]: I1201 08:41:00.179532 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c809558d-b402-4195-b03a-dbc3a1b96707-config-data\") pod \"cinder-api-0\" (UID: \"c809558d-b402-4195-b03a-dbc3a1b96707\") " pod="openstack/cinder-api-0" Dec 01 08:41:00 crc kubenswrapper[5004]: I1201 08:41:00.179550 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c809558d-b402-4195-b03a-dbc3a1b96707-logs\") pod \"cinder-api-0\" (UID: \"c809558d-b402-4195-b03a-dbc3a1b96707\") " pod="openstack/cinder-api-0" Dec 01 08:41:00 crc kubenswrapper[5004]: I1201 08:41:00.179850 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c809558d-b402-4195-b03a-dbc3a1b96707-etc-machine-id\") pod \"cinder-api-0\" (UID: \"c809558d-b402-4195-b03a-dbc3a1b96707\") " pod="openstack/cinder-api-0" Dec 01 08:41:00 crc kubenswrapper[5004]: I1201 08:41:00.180284 5004 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c809558d-b402-4195-b03a-dbc3a1b96707-logs\") pod \"cinder-api-0\" (UID: \"c809558d-b402-4195-b03a-dbc3a1b96707\") " pod="openstack/cinder-api-0" Dec 01 08:41:00 crc kubenswrapper[5004]: I1201 08:41:00.186778 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c809558d-b402-4195-b03a-dbc3a1b96707-scripts\") pod \"cinder-api-0\" (UID: \"c809558d-b402-4195-b03a-dbc3a1b96707\") " pod="openstack/cinder-api-0" Dec 01 08:41:00 crc kubenswrapper[5004]: I1201 08:41:00.187994 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c809558d-b402-4195-b03a-dbc3a1b96707-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"c809558d-b402-4195-b03a-dbc3a1b96707\") " pod="openstack/cinder-api-0" Dec 01 08:41:00 crc kubenswrapper[5004]: I1201 08:41:00.192156 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c809558d-b402-4195-b03a-dbc3a1b96707-config-data\") pod \"cinder-api-0\" (UID: \"c809558d-b402-4195-b03a-dbc3a1b96707\") " pod="openstack/cinder-api-0" Dec 01 08:41:00 crc kubenswrapper[5004]: I1201 08:41:00.196236 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c809558d-b402-4195-b03a-dbc3a1b96707-config-data-custom\") pod \"cinder-api-0\" (UID: \"c809558d-b402-4195-b03a-dbc3a1b96707\") " pod="openstack/cinder-api-0" Dec 01 08:41:00 crc kubenswrapper[5004]: I1201 08:41:00.199752 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c809558d-b402-4195-b03a-dbc3a1b96707-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"c809558d-b402-4195-b03a-dbc3a1b96707\") " pod="openstack/cinder-api-0" Dec 
01 08:41:00 crc kubenswrapper[5004]: I1201 08:41:00.206432 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c809558d-b402-4195-b03a-dbc3a1b96707-public-tls-certs\") pod \"cinder-api-0\" (UID: \"c809558d-b402-4195-b03a-dbc3a1b96707\") " pod="openstack/cinder-api-0" Dec 01 08:41:00 crc kubenswrapper[5004]: I1201 08:41:00.211439 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjtnb\" (UniqueName: \"kubernetes.io/projected/c809558d-b402-4195-b03a-dbc3a1b96707-kube-api-access-kjtnb\") pod \"cinder-api-0\" (UID: \"c809558d-b402-4195-b03a-dbc3a1b96707\") " pod="openstack/cinder-api-0" Dec 01 08:41:00 crc kubenswrapper[5004]: I1201 08:41:00.274985 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 01 08:41:00 crc kubenswrapper[5004]: I1201 08:41:00.596426 5004 scope.go:117] "RemoveContainer" containerID="b30bf82a68f48468230213f790b700cf0f102fb193557f1d0f19a033f9588f8f" Dec 01 08:41:00 crc kubenswrapper[5004]: I1201 08:41:00.627033 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-585d58d86-6cl4z"] Dec 01 08:41:00 crc kubenswrapper[5004]: I1201 08:41:00.628860 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-585d58d86-6cl4z" Dec 01 08:41:00 crc kubenswrapper[5004]: I1201 08:41:00.634452 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Dec 01 08:41:00 crc kubenswrapper[5004]: I1201 08:41:00.634640 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Dec 01 08:41:00 crc kubenswrapper[5004]: I1201 08:41:00.640893 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-585d58d86-6cl4z"] Dec 01 08:41:00 crc kubenswrapper[5004]: I1201 08:41:00.697640 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce432d6a-76cf-408f-88e5-de133c8a555b-combined-ca-bundle\") pod \"barbican-api-585d58d86-6cl4z\" (UID: \"ce432d6a-76cf-408f-88e5-de133c8a555b\") " pod="openstack/barbican-api-585d58d86-6cl4z" Dec 01 08:41:00 crc kubenswrapper[5004]: I1201 08:41:00.698024 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce432d6a-76cf-408f-88e5-de133c8a555b-public-tls-certs\") pod \"barbican-api-585d58d86-6cl4z\" (UID: \"ce432d6a-76cf-408f-88e5-de133c8a555b\") " pod="openstack/barbican-api-585d58d86-6cl4z" Dec 01 08:41:00 crc kubenswrapper[5004]: I1201 08:41:00.698110 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ce432d6a-76cf-408f-88e5-de133c8a555b-config-data-custom\") pod \"barbican-api-585d58d86-6cl4z\" (UID: \"ce432d6a-76cf-408f-88e5-de133c8a555b\") " pod="openstack/barbican-api-585d58d86-6cl4z" Dec 01 08:41:00 crc kubenswrapper[5004]: I1201 08:41:00.698182 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wcd2j\" 
(UniqueName: \"kubernetes.io/projected/ce432d6a-76cf-408f-88e5-de133c8a555b-kube-api-access-wcd2j\") pod \"barbican-api-585d58d86-6cl4z\" (UID: \"ce432d6a-76cf-408f-88e5-de133c8a555b\") " pod="openstack/barbican-api-585d58d86-6cl4z" Dec 01 08:41:00 crc kubenswrapper[5004]: I1201 08:41:00.698241 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce432d6a-76cf-408f-88e5-de133c8a555b-internal-tls-certs\") pod \"barbican-api-585d58d86-6cl4z\" (UID: \"ce432d6a-76cf-408f-88e5-de133c8a555b\") " pod="openstack/barbican-api-585d58d86-6cl4z" Dec 01 08:41:00 crc kubenswrapper[5004]: I1201 08:41:00.698354 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce432d6a-76cf-408f-88e5-de133c8a555b-config-data\") pod \"barbican-api-585d58d86-6cl4z\" (UID: \"ce432d6a-76cf-408f-88e5-de133c8a555b\") " pod="openstack/barbican-api-585d58d86-6cl4z" Dec 01 08:41:00 crc kubenswrapper[5004]: I1201 08:41:00.698440 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ce432d6a-76cf-408f-88e5-de133c8a555b-logs\") pod \"barbican-api-585d58d86-6cl4z\" (UID: \"ce432d6a-76cf-408f-88e5-de133c8a555b\") " pod="openstack/barbican-api-585d58d86-6cl4z" Dec 01 08:41:00 crc kubenswrapper[5004]: I1201 08:41:00.774963 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16fb71da-95be-43fa-a52b-fb3315938e09" path="/var/lib/kubelet/pods/16fb71da-95be-43fa-a52b-fb3315938e09/volumes" Dec 01 08:41:00 crc kubenswrapper[5004]: I1201 08:41:00.776556 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1fd8d826-4dab-4d07-bc04-be5dfebdaf2b" path="/var/lib/kubelet/pods/1fd8d826-4dab-4d07-bc04-be5dfebdaf2b/volumes" Dec 01 08:41:00 crc kubenswrapper[5004]: I1201 08:41:00.801064 
5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wcd2j\" (UniqueName: \"kubernetes.io/projected/ce432d6a-76cf-408f-88e5-de133c8a555b-kube-api-access-wcd2j\") pod \"barbican-api-585d58d86-6cl4z\" (UID: \"ce432d6a-76cf-408f-88e5-de133c8a555b\") " pod="openstack/barbican-api-585d58d86-6cl4z" Dec 01 08:41:00 crc kubenswrapper[5004]: I1201 08:41:00.801119 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce432d6a-76cf-408f-88e5-de133c8a555b-internal-tls-certs\") pod \"barbican-api-585d58d86-6cl4z\" (UID: \"ce432d6a-76cf-408f-88e5-de133c8a555b\") " pod="openstack/barbican-api-585d58d86-6cl4z" Dec 01 08:41:00 crc kubenswrapper[5004]: I1201 08:41:00.801218 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce432d6a-76cf-408f-88e5-de133c8a555b-config-data\") pod \"barbican-api-585d58d86-6cl4z\" (UID: \"ce432d6a-76cf-408f-88e5-de133c8a555b\") " pod="openstack/barbican-api-585d58d86-6cl4z" Dec 01 08:41:00 crc kubenswrapper[5004]: I1201 08:41:00.801276 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ce432d6a-76cf-408f-88e5-de133c8a555b-logs\") pod \"barbican-api-585d58d86-6cl4z\" (UID: \"ce432d6a-76cf-408f-88e5-de133c8a555b\") " pod="openstack/barbican-api-585d58d86-6cl4z" Dec 01 08:41:00 crc kubenswrapper[5004]: I1201 08:41:00.801396 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce432d6a-76cf-408f-88e5-de133c8a555b-combined-ca-bundle\") pod \"barbican-api-585d58d86-6cl4z\" (UID: \"ce432d6a-76cf-408f-88e5-de133c8a555b\") " pod="openstack/barbican-api-585d58d86-6cl4z" Dec 01 08:41:00 crc kubenswrapper[5004]: I1201 08:41:00.801452 5004 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce432d6a-76cf-408f-88e5-de133c8a555b-public-tls-certs\") pod \"barbican-api-585d58d86-6cl4z\" (UID: \"ce432d6a-76cf-408f-88e5-de133c8a555b\") " pod="openstack/barbican-api-585d58d86-6cl4z" Dec 01 08:41:00 crc kubenswrapper[5004]: I1201 08:41:00.801540 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ce432d6a-76cf-408f-88e5-de133c8a555b-config-data-custom\") pod \"barbican-api-585d58d86-6cl4z\" (UID: \"ce432d6a-76cf-408f-88e5-de133c8a555b\") " pod="openstack/barbican-api-585d58d86-6cl4z" Dec 01 08:41:00 crc kubenswrapper[5004]: I1201 08:41:00.802282 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ce432d6a-76cf-408f-88e5-de133c8a555b-logs\") pod \"barbican-api-585d58d86-6cl4z\" (UID: \"ce432d6a-76cf-408f-88e5-de133c8a555b\") " pod="openstack/barbican-api-585d58d86-6cl4z" Dec 01 08:41:00 crc kubenswrapper[5004]: I1201 08:41:00.808161 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ce432d6a-76cf-408f-88e5-de133c8a555b-config-data-custom\") pod \"barbican-api-585d58d86-6cl4z\" (UID: \"ce432d6a-76cf-408f-88e5-de133c8a555b\") " pod="openstack/barbican-api-585d58d86-6cl4z" Dec 01 08:41:00 crc kubenswrapper[5004]: I1201 08:41:00.808905 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce432d6a-76cf-408f-88e5-de133c8a555b-internal-tls-certs\") pod \"barbican-api-585d58d86-6cl4z\" (UID: \"ce432d6a-76cf-408f-88e5-de133c8a555b\") " pod="openstack/barbican-api-585d58d86-6cl4z" Dec 01 08:41:00 crc kubenswrapper[5004]: I1201 08:41:00.808908 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/ce432d6a-76cf-408f-88e5-de133c8a555b-config-data\") pod \"barbican-api-585d58d86-6cl4z\" (UID: \"ce432d6a-76cf-408f-88e5-de133c8a555b\") " pod="openstack/barbican-api-585d58d86-6cl4z" Dec 01 08:41:00 crc kubenswrapper[5004]: I1201 08:41:00.810478 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce432d6a-76cf-408f-88e5-de133c8a555b-public-tls-certs\") pod \"barbican-api-585d58d86-6cl4z\" (UID: \"ce432d6a-76cf-408f-88e5-de133c8a555b\") " pod="openstack/barbican-api-585d58d86-6cl4z" Dec 01 08:41:00 crc kubenswrapper[5004]: I1201 08:41:00.814296 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce432d6a-76cf-408f-88e5-de133c8a555b-combined-ca-bundle\") pod \"barbican-api-585d58d86-6cl4z\" (UID: \"ce432d6a-76cf-408f-88e5-de133c8a555b\") " pod="openstack/barbican-api-585d58d86-6cl4z" Dec 01 08:41:00 crc kubenswrapper[5004]: I1201 08:41:00.820668 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wcd2j\" (UniqueName: \"kubernetes.io/projected/ce432d6a-76cf-408f-88e5-de133c8a555b-kube-api-access-wcd2j\") pod \"barbican-api-585d58d86-6cl4z\" (UID: \"ce432d6a-76cf-408f-88e5-de133c8a555b\") " pod="openstack/barbican-api-585d58d86-6cl4z" Dec 01 08:41:00 crc kubenswrapper[5004]: I1201 08:41:00.873124 5004 scope.go:117] "RemoveContainer" containerID="e955dd78612c15b9cc804f7c4d3a8fa1a731e8687153b47cd4fdd2a0719e29eb" Dec 01 08:41:00 crc kubenswrapper[5004]: I1201 08:41:00.992774 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-585d58d86-6cl4z" Dec 01 08:41:01 crc kubenswrapper[5004]: I1201 08:41:01.359580 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 08:41:01 crc kubenswrapper[5004]: W1201 08:41:01.462544 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc809558d_b402_4195_b03a_dbc3a1b96707.slice/crio-d0dd15b23b9c1b3abef31bfab523dfbf3c43bdb522daaf240799f8cb2c4801be WatchSource:0}: Error finding container d0dd15b23b9c1b3abef31bfab523dfbf3c43bdb522daaf240799f8cb2c4801be: Status 404 returned error can't find the container with id d0dd15b23b9c1b3abef31bfab523dfbf3c43bdb522daaf240799f8cb2c4801be Dec 01 08:41:01 crc kubenswrapper[5004]: I1201 08:41:01.463242 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 01 08:41:01 crc kubenswrapper[5004]: I1201 08:41:01.560408 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-585d58d86-6cl4z"] Dec 01 08:41:01 crc kubenswrapper[5004]: W1201 08:41:01.567771 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podce432d6a_76cf_408f_88e5_de133c8a555b.slice/crio-1526a0f3b17e36d83f6b8d2059de891e1463eca952bd2a4b2e3b666ebb316a9a WatchSource:0}: Error finding container 1526a0f3b17e36d83f6b8d2059de891e1463eca952bd2a4b2e3b666ebb316a9a: Status 404 returned error can't find the container with id 1526a0f3b17e36d83f6b8d2059de891e1463eca952bd2a4b2e3b666ebb316a9a Dec 01 08:41:01 crc kubenswrapper[5004]: I1201 08:41:01.796106 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"c809558d-b402-4195-b03a-dbc3a1b96707","Type":"ContainerStarted","Data":"d0dd15b23b9c1b3abef31bfab523dfbf3c43bdb522daaf240799f8cb2c4801be"} Dec 01 08:41:01 crc kubenswrapper[5004]: I1201 08:41:01.800438 5004 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e8d0df30-51e3-436d-a6a6-f65dcce6db43","Type":"ContainerStarted","Data":"8695336a6b0b5bfaa2d1058a1cfcaece8b30cb43b5ab4b855214dfc396ec240e"} Dec 01 08:41:01 crc kubenswrapper[5004]: I1201 08:41:01.802070 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-585d58d86-6cl4z" event={"ID":"ce432d6a-76cf-408f-88e5-de133c8a555b","Type":"ContainerStarted","Data":"1526a0f3b17e36d83f6b8d2059de891e1463eca952bd2a4b2e3b666ebb316a9a"} Dec 01 08:41:02 crc kubenswrapper[5004]: I1201 08:41:02.817785 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"c809558d-b402-4195-b03a-dbc3a1b96707","Type":"ContainerStarted","Data":"acd8a48b402f0c76db5178f6ed30317acd465e78373150112a11259cd6955045"} Dec 01 08:41:02 crc kubenswrapper[5004]: I1201 08:41:02.818914 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e8d0df30-51e3-436d-a6a6-f65dcce6db43","Type":"ContainerStarted","Data":"50d0650801191286e1e076cbe935cf885747cedc8d93d1ada5eefd63f626c140"} Dec 01 08:41:02 crc kubenswrapper[5004]: I1201 08:41:02.820423 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-585d58d86-6cl4z" event={"ID":"ce432d6a-76cf-408f-88e5-de133c8a555b","Type":"ContainerStarted","Data":"d1845cd7530f244d4086ef7314bb0a02e2fd3bc15b58bb7479102392610cc3f3"} Dec 01 08:41:02 crc kubenswrapper[5004]: I1201 08:41:02.820457 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-585d58d86-6cl4z" event={"ID":"ce432d6a-76cf-408f-88e5-de133c8a555b","Type":"ContainerStarted","Data":"50c56d67fdd32527740314f33dcc162467324c1ffccddab41889b9ad46a09753"} Dec 01 08:41:02 crc kubenswrapper[5004]: I1201 08:41:02.820880 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-585d58d86-6cl4z" Dec 01 08:41:02 crc kubenswrapper[5004]: I1201 08:41:02.820903 
5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-585d58d86-6cl4z" Dec 01 08:41:02 crc kubenswrapper[5004]: I1201 08:41:02.878947 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-585d58d86-6cl4z" podStartSLOduration=2.878927954 podStartE2EDuration="2.878927954s" podCreationTimestamp="2025-12-01 08:41:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:41:02.848518261 +0000 UTC m=+1440.413510263" watchObservedRunningTime="2025-12-01 08:41:02.878927954 +0000 UTC m=+1440.443919936" Dec 01 08:41:02 crc kubenswrapper[5004]: I1201 08:41:02.937896 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-cb845f59f-m4f5q" Dec 01 08:41:03 crc kubenswrapper[5004]: I1201 08:41:03.025212 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-745fdd7bc8-x8cmf"] Dec 01 08:41:03 crc kubenswrapper[5004]: I1201 08:41:03.025494 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-745fdd7bc8-x8cmf" podUID="bc52bfd5-1560-4155-ba74-5fc2d92dfe73" containerName="neutron-api" containerID="cri-o://8240e03a472d498dbdfc57c97c051baed4a1338229d1ae7da3d4173a1e98430f" gracePeriod=30 Dec 01 08:41:03 crc kubenswrapper[5004]: I1201 08:41:03.025688 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-745fdd7bc8-x8cmf" podUID="bc52bfd5-1560-4155-ba74-5fc2d92dfe73" containerName="neutron-httpd" containerID="cri-o://69e3c2040e4f88fafa2d0e41f24069bd46aca6f4a6f8940dc1c49c5bd1273691" gracePeriod=30 Dec 01 08:41:03 crc kubenswrapper[5004]: I1201 08:41:03.833507 5004 generic.go:334] "Generic (PLEG): container finished" podID="bc52bfd5-1560-4155-ba74-5fc2d92dfe73" containerID="69e3c2040e4f88fafa2d0e41f24069bd46aca6f4a6f8940dc1c49c5bd1273691" exitCode=0 Dec 01 
08:41:03 crc kubenswrapper[5004]: I1201 08:41:03.833605 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-745fdd7bc8-x8cmf" event={"ID":"bc52bfd5-1560-4155-ba74-5fc2d92dfe73","Type":"ContainerDied","Data":"69e3c2040e4f88fafa2d0e41f24069bd46aca6f4a6f8940dc1c49c5bd1273691"} Dec 01 08:41:03 crc kubenswrapper[5004]: I1201 08:41:03.837029 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e8d0df30-51e3-436d-a6a6-f65dcce6db43","Type":"ContainerStarted","Data":"8233abab7381b4b33e414d6dd7c421f095c3af601e8a4f09e9d238fd41d311b6"} Dec 01 08:41:03 crc kubenswrapper[5004]: I1201 08:41:03.838725 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"c809558d-b402-4195-b03a-dbc3a1b96707","Type":"ContainerStarted","Data":"cebd848a6baa6834b6fedd54f9fe8152e86b83af78e97645bc7222af1db739e5"} Dec 01 08:41:03 crc kubenswrapper[5004]: I1201 08:41:03.877491 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.877465735 podStartE2EDuration="4.877465735s" podCreationTimestamp="2025-12-01 08:40:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:41:03.863014702 +0000 UTC m=+1441.428006714" watchObservedRunningTime="2025-12-01 08:41:03.877465735 +0000 UTC m=+1441.442457727" Dec 01 08:41:03 crc kubenswrapper[5004]: I1201 08:41:03.908748 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 01 08:41:04 crc kubenswrapper[5004]: I1201 08:41:04.289654 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 01 08:41:04 crc kubenswrapper[5004]: I1201 08:41:04.624768 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6bb4fc677f-zqhkd" Dec 01 08:41:04 
crc kubenswrapper[5004]: I1201 08:41:04.711237 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc5c4795-xj8tw"] Dec 01 08:41:04 crc kubenswrapper[5004]: I1201 08:41:04.711491 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5ccc5c4795-xj8tw" podUID="fbc3a584-72e3-4331-8bd5-c5accc1f0395" containerName="dnsmasq-dns" containerID="cri-o://c39746a684943747e313bf508e64ee70cb531fc44f2e9c99324915e5ec69249d" gracePeriod=10 Dec 01 08:41:04 crc kubenswrapper[5004]: I1201 08:41:04.862712 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e8d0df30-51e3-436d-a6a6-f65dcce6db43","Type":"ContainerStarted","Data":"db3c9c4681c539e98b6da1a0b5264aa96c8db05e0e318d8711e90f6dd314a2e2"} Dec 01 08:41:04 crc kubenswrapper[5004]: I1201 08:41:04.869056 5004 generic.go:334] "Generic (PLEG): container finished" podID="fbc3a584-72e3-4331-8bd5-c5accc1f0395" containerID="c39746a684943747e313bf508e64ee70cb531fc44f2e9c99324915e5ec69249d" exitCode=0 Dec 01 08:41:04 crc kubenswrapper[5004]: I1201 08:41:04.869139 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc5c4795-xj8tw" event={"ID":"fbc3a584-72e3-4331-8bd5-c5accc1f0395","Type":"ContainerDied","Data":"c39746a684943747e313bf508e64ee70cb531fc44f2e9c99324915e5ec69249d"} Dec 01 08:41:04 crc kubenswrapper[5004]: I1201 08:41:04.870720 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 01 08:41:04 crc kubenswrapper[5004]: I1201 08:41:04.919073 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 01 08:41:04 crc kubenswrapper[5004]: I1201 08:41:04.981472 5004 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5ccc5c4795-xj8tw" podUID="fbc3a584-72e3-4331-8bd5-c5accc1f0395" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.189:5353: connect: 
connection refused" Dec 01 08:41:05 crc kubenswrapper[5004]: I1201 08:41:05.318615 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc5c4795-xj8tw" Dec 01 08:41:05 crc kubenswrapper[5004]: I1201 08:41:05.442504 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rs9h4\" (UniqueName: \"kubernetes.io/projected/fbc3a584-72e3-4331-8bd5-c5accc1f0395-kube-api-access-rs9h4\") pod \"fbc3a584-72e3-4331-8bd5-c5accc1f0395\" (UID: \"fbc3a584-72e3-4331-8bd5-c5accc1f0395\") " Dec 01 08:41:05 crc kubenswrapper[5004]: I1201 08:41:05.442577 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fbc3a584-72e3-4331-8bd5-c5accc1f0395-config\") pod \"fbc3a584-72e3-4331-8bd5-c5accc1f0395\" (UID: \"fbc3a584-72e3-4331-8bd5-c5accc1f0395\") " Dec 01 08:41:05 crc kubenswrapper[5004]: I1201 08:41:05.442646 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fbc3a584-72e3-4331-8bd5-c5accc1f0395-ovsdbserver-nb\") pod \"fbc3a584-72e3-4331-8bd5-c5accc1f0395\" (UID: \"fbc3a584-72e3-4331-8bd5-c5accc1f0395\") " Dec 01 08:41:05 crc kubenswrapper[5004]: I1201 08:41:05.442676 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fbc3a584-72e3-4331-8bd5-c5accc1f0395-dns-swift-storage-0\") pod \"fbc3a584-72e3-4331-8bd5-c5accc1f0395\" (UID: \"fbc3a584-72e3-4331-8bd5-c5accc1f0395\") " Dec 01 08:41:05 crc kubenswrapper[5004]: I1201 08:41:05.442779 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fbc3a584-72e3-4331-8bd5-c5accc1f0395-ovsdbserver-sb\") pod \"fbc3a584-72e3-4331-8bd5-c5accc1f0395\" (UID: \"fbc3a584-72e3-4331-8bd5-c5accc1f0395\") " Dec 01 
08:41:05 crc kubenswrapper[5004]: I1201 08:41:05.442842 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fbc3a584-72e3-4331-8bd5-c5accc1f0395-dns-svc\") pod \"fbc3a584-72e3-4331-8bd5-c5accc1f0395\" (UID: \"fbc3a584-72e3-4331-8bd5-c5accc1f0395\") " Dec 01 08:41:05 crc kubenswrapper[5004]: I1201 08:41:05.476637 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fbc3a584-72e3-4331-8bd5-c5accc1f0395-kube-api-access-rs9h4" (OuterVolumeSpecName: "kube-api-access-rs9h4") pod "fbc3a584-72e3-4331-8bd5-c5accc1f0395" (UID: "fbc3a584-72e3-4331-8bd5-c5accc1f0395"). InnerVolumeSpecName "kube-api-access-rs9h4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:41:05 crc kubenswrapper[5004]: I1201 08:41:05.537111 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fbc3a584-72e3-4331-8bd5-c5accc1f0395-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "fbc3a584-72e3-4331-8bd5-c5accc1f0395" (UID: "fbc3a584-72e3-4331-8bd5-c5accc1f0395"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:41:05 crc kubenswrapper[5004]: I1201 08:41:05.537980 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fbc3a584-72e3-4331-8bd5-c5accc1f0395-config" (OuterVolumeSpecName: "config") pod "fbc3a584-72e3-4331-8bd5-c5accc1f0395" (UID: "fbc3a584-72e3-4331-8bd5-c5accc1f0395"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:41:05 crc kubenswrapper[5004]: I1201 08:41:05.546183 5004 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fbc3a584-72e3-4331-8bd5-c5accc1f0395-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 08:41:05 crc kubenswrapper[5004]: I1201 08:41:05.546227 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rs9h4\" (UniqueName: \"kubernetes.io/projected/fbc3a584-72e3-4331-8bd5-c5accc1f0395-kube-api-access-rs9h4\") on node \"crc\" DevicePath \"\"" Dec 01 08:41:05 crc kubenswrapper[5004]: I1201 08:41:05.546243 5004 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fbc3a584-72e3-4331-8bd5-c5accc1f0395-config\") on node \"crc\" DevicePath \"\"" Dec 01 08:41:05 crc kubenswrapper[5004]: I1201 08:41:05.556954 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fbc3a584-72e3-4331-8bd5-c5accc1f0395-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "fbc3a584-72e3-4331-8bd5-c5accc1f0395" (UID: "fbc3a584-72e3-4331-8bd5-c5accc1f0395"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:41:05 crc kubenswrapper[5004]: I1201 08:41:05.557997 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fbc3a584-72e3-4331-8bd5-c5accc1f0395-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "fbc3a584-72e3-4331-8bd5-c5accc1f0395" (UID: "fbc3a584-72e3-4331-8bd5-c5accc1f0395"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:41:05 crc kubenswrapper[5004]: I1201 08:41:05.571369 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fbc3a584-72e3-4331-8bd5-c5accc1f0395-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "fbc3a584-72e3-4331-8bd5-c5accc1f0395" (UID: "fbc3a584-72e3-4331-8bd5-c5accc1f0395"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:41:05 crc kubenswrapper[5004]: I1201 08:41:05.648540 5004 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fbc3a584-72e3-4331-8bd5-c5accc1f0395-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 01 08:41:05 crc kubenswrapper[5004]: I1201 08:41:05.648583 5004 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fbc3a584-72e3-4331-8bd5-c5accc1f0395-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 01 08:41:05 crc kubenswrapper[5004]: I1201 08:41:05.648593 5004 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fbc3a584-72e3-4331-8bd5-c5accc1f0395-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 01 08:41:05 crc kubenswrapper[5004]: I1201 08:41:05.887092 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e8d0df30-51e3-436d-a6a6-f65dcce6db43","Type":"ContainerStarted","Data":"96cd9079bacc653ba543eec29b93edca4bb82d19e66b908eec9ef5edfcdbf843"} Dec 01 08:41:05 crc kubenswrapper[5004]: I1201 08:41:05.887535 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 01 08:41:05 crc kubenswrapper[5004]: I1201 08:41:05.891538 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="1c910a71-8445-4c64-a555-433bb2c60bbd" 
containerName="cinder-scheduler" containerID="cri-o://d5b78ca5142cc60878a4ad77fa71ddfca3d3c89f7ed8d105d0bc39e9b44fba76" gracePeriod=30 Dec 01 08:41:05 crc kubenswrapper[5004]: I1201 08:41:05.891855 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc5c4795-xj8tw" Dec 01 08:41:05 crc kubenswrapper[5004]: I1201 08:41:05.891911 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="1c910a71-8445-4c64-a555-433bb2c60bbd" containerName="probe" containerID="cri-o://4c59f8c69a9f6c8e17c3d7dd2514a80d7b6af36483e212edd7176fddc028f54e" gracePeriod=30 Dec 01 08:41:05 crc kubenswrapper[5004]: I1201 08:41:05.892002 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc5c4795-xj8tw" event={"ID":"fbc3a584-72e3-4331-8bd5-c5accc1f0395","Type":"ContainerDied","Data":"2e841945cd2f50c324175c66f2c981ed0db52f38268528475f4db176acc14031"} Dec 01 08:41:05 crc kubenswrapper[5004]: I1201 08:41:05.892153 5004 scope.go:117] "RemoveContainer" containerID="c39746a684943747e313bf508e64ee70cb531fc44f2e9c99324915e5ec69249d" Dec 01 08:41:05 crc kubenswrapper[5004]: I1201 08:41:05.913528 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.700983681 podStartE2EDuration="6.913504298s" podCreationTimestamp="2025-12-01 08:40:59 +0000 UTC" firstStartedPulling="2025-12-01 08:41:01.365497117 +0000 UTC m=+1438.930489089" lastFinishedPulling="2025-12-01 08:41:05.578017734 +0000 UTC m=+1443.143009706" observedRunningTime="2025-12-01 08:41:05.91194924 +0000 UTC m=+1443.476941242" watchObservedRunningTime="2025-12-01 08:41:05.913504298 +0000 UTC m=+1443.478496280" Dec 01 08:41:05 crc kubenswrapper[5004]: I1201 08:41:05.939778 5004 scope.go:117] "RemoveContainer" containerID="a67d7af54527f2c3a579a265d83d5032fa2aefe3147972385e8f4684d9e7644b" Dec 01 08:41:05 crc kubenswrapper[5004]: I1201 
08:41:05.968328 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc5c4795-xj8tw"]
Dec 01 08:41:05 crc kubenswrapper[5004]: I1201 08:41:05.975365 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5ccc5c4795-xj8tw"]
Dec 01 08:41:06 crc kubenswrapper[5004]: I1201 08:41:06.486667 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-65dd66d59d-csdqc"
Dec 01 08:41:06 crc kubenswrapper[5004]: I1201 08:41:06.561203 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-65dd66d59d-csdqc"
Dec 01 08:41:06 crc kubenswrapper[5004]: I1201 08:41:06.771877 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fbc3a584-72e3-4331-8bd5-c5accc1f0395" path="/var/lib/kubelet/pods/fbc3a584-72e3-4331-8bd5-c5accc1f0395/volumes"
Dec 01 08:41:07 crc kubenswrapper[5004]: I1201 08:41:07.563973 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-745fdd7bc8-x8cmf"
Dec 01 08:41:07 crc kubenswrapper[5004]: I1201 08:41:07.695516 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/bc52bfd5-1560-4155-ba74-5fc2d92dfe73-httpd-config\") pod \"bc52bfd5-1560-4155-ba74-5fc2d92dfe73\" (UID: \"bc52bfd5-1560-4155-ba74-5fc2d92dfe73\") "
Dec 01 08:41:07 crc kubenswrapper[5004]: I1201 08:41:07.695641 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bc52bfd5-1560-4155-ba74-5fc2d92dfe73-ovndb-tls-certs\") pod \"bc52bfd5-1560-4155-ba74-5fc2d92dfe73\" (UID: \"bc52bfd5-1560-4155-ba74-5fc2d92dfe73\") "
Dec 01 08:41:07 crc kubenswrapper[5004]: I1201 08:41:07.695831 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/bc52bfd5-1560-4155-ba74-5fc2d92dfe73-config\") pod \"bc52bfd5-1560-4155-ba74-5fc2d92dfe73\" (UID: \"bc52bfd5-1560-4155-ba74-5fc2d92dfe73\") "
Dec 01 08:41:07 crc kubenswrapper[5004]: I1201 08:41:07.695875 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc52bfd5-1560-4155-ba74-5fc2d92dfe73-combined-ca-bundle\") pod \"bc52bfd5-1560-4155-ba74-5fc2d92dfe73\" (UID: \"bc52bfd5-1560-4155-ba74-5fc2d92dfe73\") "
Dec 01 08:41:07 crc kubenswrapper[5004]: I1201 08:41:07.695924 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rv277\" (UniqueName: \"kubernetes.io/projected/bc52bfd5-1560-4155-ba74-5fc2d92dfe73-kube-api-access-rv277\") pod \"bc52bfd5-1560-4155-ba74-5fc2d92dfe73\" (UID: \"bc52bfd5-1560-4155-ba74-5fc2d92dfe73\") "
Dec 01 08:41:07 crc kubenswrapper[5004]: I1201 08:41:07.708840 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc52bfd5-1560-4155-ba74-5fc2d92dfe73-kube-api-access-rv277" (OuterVolumeSpecName: "kube-api-access-rv277") pod "bc52bfd5-1560-4155-ba74-5fc2d92dfe73" (UID: "bc52bfd5-1560-4155-ba74-5fc2d92dfe73"). InnerVolumeSpecName "kube-api-access-rv277". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 08:41:07 crc kubenswrapper[5004]: I1201 08:41:07.717763 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc52bfd5-1560-4155-ba74-5fc2d92dfe73-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "bc52bfd5-1560-4155-ba74-5fc2d92dfe73" (UID: "bc52bfd5-1560-4155-ba74-5fc2d92dfe73"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 08:41:07 crc kubenswrapper[5004]: I1201 08:41:07.755376 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc52bfd5-1560-4155-ba74-5fc2d92dfe73-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bc52bfd5-1560-4155-ba74-5fc2d92dfe73" (UID: "bc52bfd5-1560-4155-ba74-5fc2d92dfe73"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 08:41:07 crc kubenswrapper[5004]: I1201 08:41:07.801215 5004 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc52bfd5-1560-4155-ba74-5fc2d92dfe73-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 01 08:41:07 crc kubenswrapper[5004]: I1201 08:41:07.801247 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rv277\" (UniqueName: \"kubernetes.io/projected/bc52bfd5-1560-4155-ba74-5fc2d92dfe73-kube-api-access-rv277\") on node \"crc\" DevicePath \"\""
Dec 01 08:41:07 crc kubenswrapper[5004]: I1201 08:41:07.801260 5004 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/bc52bfd5-1560-4155-ba74-5fc2d92dfe73-httpd-config\") on node \"crc\" DevicePath \"\""
Dec 01 08:41:07 crc kubenswrapper[5004]: I1201 08:41:07.811720 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc52bfd5-1560-4155-ba74-5fc2d92dfe73-config" (OuterVolumeSpecName: "config") pod "bc52bfd5-1560-4155-ba74-5fc2d92dfe73" (UID: "bc52bfd5-1560-4155-ba74-5fc2d92dfe73"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 08:41:07 crc kubenswrapper[5004]: I1201 08:41:07.841139 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc52bfd5-1560-4155-ba74-5fc2d92dfe73-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "bc52bfd5-1560-4155-ba74-5fc2d92dfe73" (UID: "bc52bfd5-1560-4155-ba74-5fc2d92dfe73"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 08:41:07 crc kubenswrapper[5004]: I1201 08:41:07.903047 5004 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bc52bfd5-1560-4155-ba74-5fc2d92dfe73-ovndb-tls-certs\") on node \"crc\" DevicePath \"\""
Dec 01 08:41:07 crc kubenswrapper[5004]: I1201 08:41:07.903086 5004 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/bc52bfd5-1560-4155-ba74-5fc2d92dfe73-config\") on node \"crc\" DevicePath \"\""
Dec 01 08:41:07 crc kubenswrapper[5004]: I1201 08:41:07.922075 5004 generic.go:334] "Generic (PLEG): container finished" podID="1c910a71-8445-4c64-a555-433bb2c60bbd" containerID="4c59f8c69a9f6c8e17c3d7dd2514a80d7b6af36483e212edd7176fddc028f54e" exitCode=0
Dec 01 08:41:07 crc kubenswrapper[5004]: I1201 08:41:07.922136 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1c910a71-8445-4c64-a555-433bb2c60bbd","Type":"ContainerDied","Data":"4c59f8c69a9f6c8e17c3d7dd2514a80d7b6af36483e212edd7176fddc028f54e"}
Dec 01 08:41:07 crc kubenswrapper[5004]: I1201 08:41:07.924350 5004 generic.go:334] "Generic (PLEG): container finished" podID="bc52bfd5-1560-4155-ba74-5fc2d92dfe73" containerID="8240e03a472d498dbdfc57c97c051baed4a1338229d1ae7da3d4173a1e98430f" exitCode=0
Dec 01 08:41:07 crc kubenswrapper[5004]: I1201 08:41:07.924385 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-745fdd7bc8-x8cmf" event={"ID":"bc52bfd5-1560-4155-ba74-5fc2d92dfe73","Type":"ContainerDied","Data":"8240e03a472d498dbdfc57c97c051baed4a1338229d1ae7da3d4173a1e98430f"}
Dec 01 08:41:07 crc kubenswrapper[5004]: I1201 08:41:07.924417 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-745fdd7bc8-x8cmf" event={"ID":"bc52bfd5-1560-4155-ba74-5fc2d92dfe73","Type":"ContainerDied","Data":"2747be2f570392f48aba853116f00805df9a9a07324214e31ecf682b2f121654"}
Dec 01 08:41:07 crc kubenswrapper[5004]: I1201 08:41:07.924437 5004 scope.go:117] "RemoveContainer" containerID="69e3c2040e4f88fafa2d0e41f24069bd46aca6f4a6f8940dc1c49c5bd1273691"
Dec 01 08:41:07 crc kubenswrapper[5004]: I1201 08:41:07.924432 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-745fdd7bc8-x8cmf"
Dec 01 08:41:07 crc kubenswrapper[5004]: I1201 08:41:07.973396 5004 scope.go:117] "RemoveContainer" containerID="8240e03a472d498dbdfc57c97c051baed4a1338229d1ae7da3d4173a1e98430f"
Dec 01 08:41:07 crc kubenswrapper[5004]: I1201 08:41:07.977222 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-745fdd7bc8-x8cmf"]
Dec 01 08:41:07 crc kubenswrapper[5004]: I1201 08:41:07.989622 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-745fdd7bc8-x8cmf"]
Dec 01 08:41:07 crc kubenswrapper[5004]: I1201 08:41:07.995679 5004 scope.go:117] "RemoveContainer" containerID="69e3c2040e4f88fafa2d0e41f24069bd46aca6f4a6f8940dc1c49c5bd1273691"
Dec 01 08:41:07 crc kubenswrapper[5004]: E1201 08:41:07.996167 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69e3c2040e4f88fafa2d0e41f24069bd46aca6f4a6f8940dc1c49c5bd1273691\": container with ID starting with 69e3c2040e4f88fafa2d0e41f24069bd46aca6f4a6f8940dc1c49c5bd1273691 not found: ID does not exist" containerID="69e3c2040e4f88fafa2d0e41f24069bd46aca6f4a6f8940dc1c49c5bd1273691"
Dec 01 08:41:07 crc kubenswrapper[5004]: I1201 08:41:07.996220 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69e3c2040e4f88fafa2d0e41f24069bd46aca6f4a6f8940dc1c49c5bd1273691"} err="failed to get container status \"69e3c2040e4f88fafa2d0e41f24069bd46aca6f4a6f8940dc1c49c5bd1273691\": rpc error: code = NotFound desc = could not find container \"69e3c2040e4f88fafa2d0e41f24069bd46aca6f4a6f8940dc1c49c5bd1273691\": container with ID starting with 69e3c2040e4f88fafa2d0e41f24069bd46aca6f4a6f8940dc1c49c5bd1273691 not found: ID does not exist"
Dec 01 08:41:07 crc kubenswrapper[5004]: I1201 08:41:07.996291 5004 scope.go:117] "RemoveContainer" containerID="8240e03a472d498dbdfc57c97c051baed4a1338229d1ae7da3d4173a1e98430f"
Dec 01 08:41:07 crc kubenswrapper[5004]: E1201 08:41:07.999715 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8240e03a472d498dbdfc57c97c051baed4a1338229d1ae7da3d4173a1e98430f\": container with ID starting with 8240e03a472d498dbdfc57c97c051baed4a1338229d1ae7da3d4173a1e98430f not found: ID does not exist" containerID="8240e03a472d498dbdfc57c97c051baed4a1338229d1ae7da3d4173a1e98430f"
Dec 01 08:41:07 crc kubenswrapper[5004]: I1201 08:41:07.999771 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8240e03a472d498dbdfc57c97c051baed4a1338229d1ae7da3d4173a1e98430f"} err="failed to get container status \"8240e03a472d498dbdfc57c97c051baed4a1338229d1ae7da3d4173a1e98430f\": rpc error: code = NotFound desc = could not find container \"8240e03a472d498dbdfc57c97c051baed4a1338229d1ae7da3d4173a1e98430f\": container with ID starting with 8240e03a472d498dbdfc57c97c051baed4a1338229d1ae7da3d4173a1e98430f not found: ID does not exist"
Dec 01 08:41:08 crc kubenswrapper[5004]: I1201 08:41:08.183257 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-695797f65d-5b4rw"
Dec 01 08:41:08 crc kubenswrapper[5004]: I1201 08:41:08.204284 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-695797f65d-5b4rw"
Dec 01 08:41:08 crc kubenswrapper[5004]: I1201 08:41:08.442619 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zgjgq"
Dec 01 08:41:08 crc kubenswrapper[5004]: I1201 08:41:08.507128 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zgjgq"
Dec 01 08:41:08 crc kubenswrapper[5004]: I1201 08:41:08.730018 5004 patch_prober.go:28] interesting pod/machine-config-daemon-fvdgt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 01 08:41:08 crc kubenswrapper[5004]: I1201 08:41:08.730088 5004 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 01 08:41:08 crc kubenswrapper[5004]: I1201 08:41:08.730141 5004 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt"
Dec 01 08:41:08 crc kubenswrapper[5004]: I1201 08:41:08.731294 5004 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"af0bd8ad09d4d665e418c6d76caa0150a18c17d3528d47d38f4681f4edce895d"} pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 01 08:41:08 crc kubenswrapper[5004]: I1201 08:41:08.731355 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" containerName="machine-config-daemon" containerID="cri-o://af0bd8ad09d4d665e418c6d76caa0150a18c17d3528d47d38f4681f4edce895d" gracePeriod=600
Dec 01 08:41:08 crc kubenswrapper[5004]: I1201 08:41:08.773926 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc52bfd5-1560-4155-ba74-5fc2d92dfe73" path="/var/lib/kubelet/pods/bc52bfd5-1560-4155-ba74-5fc2d92dfe73/volumes"
Dec 01 08:41:08 crc kubenswrapper[5004]: I1201 08:41:08.939335 5004 generic.go:334] "Generic (PLEG): container finished" podID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" containerID="af0bd8ad09d4d665e418c6d76caa0150a18c17d3528d47d38f4681f4edce895d" exitCode=0
Dec 01 08:41:08 crc kubenswrapper[5004]: I1201 08:41:08.939662 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" event={"ID":"f9977ebb-82de-4e96-8763-0b5a84f8d4ce","Type":"ContainerDied","Data":"af0bd8ad09d4d665e418c6d76caa0150a18c17d3528d47d38f4681f4edce895d"}
Dec 01 08:41:08 crc kubenswrapper[5004]: I1201 08:41:08.939695 5004 scope.go:117] "RemoveContainer" containerID="da4b1d9e1788dd947ac4216eff1a285666eccd0fc7594a8fc8667307c82c4fdb"
Dec 01 08:41:09 crc kubenswrapper[5004]: I1201 08:41:09.258038 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zgjgq"]
Dec 01 08:41:09 crc kubenswrapper[5004]: I1201 08:41:09.962266 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" event={"ID":"f9977ebb-82de-4e96-8763-0b5a84f8d4ce","Type":"ContainerStarted","Data":"70f45b3d2a6bb4bf0d87f18bc08b282543e238e1d391a58745914af24ad7956c"}
Dec 01 08:41:09 crc kubenswrapper[5004]: I1201 08:41:09.966601 5004 generic.go:334] "Generic (PLEG): container finished" podID="1c910a71-8445-4c64-a555-433bb2c60bbd" containerID="d5b78ca5142cc60878a4ad77fa71ddfca3d3c89f7ed8d105d0bc39e9b44fba76" exitCode=0
Dec 01 08:41:09 crc kubenswrapper[5004]: I1201 08:41:09.966826 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-zgjgq" podUID="f4c8bbd5-63c7-441d-9262-b8d44344c7fa" containerName="registry-server" containerID="cri-o://1ce112820d5acbe5229e48d53907309311ce33034eb42872b1f0cde08253dc00" gracePeriod=2
Dec 01 08:41:09 crc kubenswrapper[5004]: I1201 08:41:09.967145 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1c910a71-8445-4c64-a555-433bb2c60bbd","Type":"ContainerDied","Data":"d5b78ca5142cc60878a4ad77fa71ddfca3d3c89f7ed8d105d0bc39e9b44fba76"}
Dec 01 08:41:10 crc kubenswrapper[5004]: I1201 08:41:10.844987 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zgjgq"
Dec 01 08:41:10 crc kubenswrapper[5004]: I1201 08:41:10.853750 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Dec 01 08:41:10 crc kubenswrapper[5004]: I1201 08:41:10.883463 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c910a71-8445-4c64-a555-433bb2c60bbd-config-data\") pod \"1c910a71-8445-4c64-a555-433bb2c60bbd\" (UID: \"1c910a71-8445-4c64-a555-433bb2c60bbd\") "
Dec 01 08:41:10 crc kubenswrapper[5004]: I1201 08:41:10.883533 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcvdg\" (UniqueName: \"kubernetes.io/projected/1c910a71-8445-4c64-a555-433bb2c60bbd-kube-api-access-fcvdg\") pod \"1c910a71-8445-4c64-a555-433bb2c60bbd\" (UID: \"1c910a71-8445-4c64-a555-433bb2c60bbd\") "
Dec 01 08:41:10 crc kubenswrapper[5004]: I1201 08:41:10.883671 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1c910a71-8445-4c64-a555-433bb2c60bbd-etc-machine-id\") pod \"1c910a71-8445-4c64-a555-433bb2c60bbd\" (UID: \"1c910a71-8445-4c64-a555-433bb2c60bbd\") "
Dec 01 08:41:10 crc kubenswrapper[5004]: I1201 08:41:10.883699 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c910a71-8445-4c64-a555-433bb2c60bbd-scripts\") pod \"1c910a71-8445-4c64-a555-433bb2c60bbd\" (UID: \"1c910a71-8445-4c64-a555-433bb2c60bbd\") "
Dec 01 08:41:10 crc kubenswrapper[5004]: I1201 08:41:10.883938 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1c910a71-8445-4c64-a555-433bb2c60bbd-config-data-custom\") pod \"1c910a71-8445-4c64-a555-433bb2c60bbd\" (UID: \"1c910a71-8445-4c64-a555-433bb2c60bbd\") "
Dec 01 08:41:10 crc kubenswrapper[5004]: I1201 08:41:10.883991 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4c8bbd5-63c7-441d-9262-b8d44344c7fa-catalog-content\") pod \"f4c8bbd5-63c7-441d-9262-b8d44344c7fa\" (UID: \"f4c8bbd5-63c7-441d-9262-b8d44344c7fa\") "
Dec 01 08:41:10 crc kubenswrapper[5004]: I1201 08:41:10.884188 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qgqsj\" (UniqueName: \"kubernetes.io/projected/f4c8bbd5-63c7-441d-9262-b8d44344c7fa-kube-api-access-qgqsj\") pod \"f4c8bbd5-63c7-441d-9262-b8d44344c7fa\" (UID: \"f4c8bbd5-63c7-441d-9262-b8d44344c7fa\") "
Dec 01 08:41:10 crc kubenswrapper[5004]: I1201 08:41:10.884318 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4c8bbd5-63c7-441d-9262-b8d44344c7fa-utilities\") pod \"f4c8bbd5-63c7-441d-9262-b8d44344c7fa\" (UID: \"f4c8bbd5-63c7-441d-9262-b8d44344c7fa\") "
Dec 01 08:41:10 crc kubenswrapper[5004]: I1201 08:41:10.884380 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c910a71-8445-4c64-a555-433bb2c60bbd-combined-ca-bundle\") pod \"1c910a71-8445-4c64-a555-433bb2c60bbd\" (UID: \"1c910a71-8445-4c64-a555-433bb2c60bbd\") "
Dec 01 08:41:10 crc kubenswrapper[5004]: I1201 08:41:10.898778 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c910a71-8445-4c64-a555-433bb2c60bbd-scripts" (OuterVolumeSpecName: "scripts") pod "1c910a71-8445-4c64-a555-433bb2c60bbd" (UID: "1c910a71-8445-4c64-a555-433bb2c60bbd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 08:41:10 crc kubenswrapper[5004]: I1201 08:41:10.898849 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1c910a71-8445-4c64-a555-433bb2c60bbd-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "1c910a71-8445-4c64-a555-433bb2c60bbd" (UID: "1c910a71-8445-4c64-a555-433bb2c60bbd"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 01 08:41:10 crc kubenswrapper[5004]: I1201 08:41:10.903156 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f4c8bbd5-63c7-441d-9262-b8d44344c7fa-utilities" (OuterVolumeSpecName: "utilities") pod "f4c8bbd5-63c7-441d-9262-b8d44344c7fa" (UID: "f4c8bbd5-63c7-441d-9262-b8d44344c7fa"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 08:41:10 crc kubenswrapper[5004]: I1201 08:41:10.906005 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4c8bbd5-63c7-441d-9262-b8d44344c7fa-kube-api-access-qgqsj" (OuterVolumeSpecName: "kube-api-access-qgqsj") pod "f4c8bbd5-63c7-441d-9262-b8d44344c7fa" (UID: "f4c8bbd5-63c7-441d-9262-b8d44344c7fa"). InnerVolumeSpecName "kube-api-access-qgqsj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 08:41:10 crc kubenswrapper[5004]: I1201 08:41:10.906784 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c910a71-8445-4c64-a555-433bb2c60bbd-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "1c910a71-8445-4c64-a555-433bb2c60bbd" (UID: "1c910a71-8445-4c64-a555-433bb2c60bbd"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 08:41:10 crc kubenswrapper[5004]: I1201 08:41:10.922927 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c910a71-8445-4c64-a555-433bb2c60bbd-kube-api-access-fcvdg" (OuterVolumeSpecName: "kube-api-access-fcvdg") pod "1c910a71-8445-4c64-a555-433bb2c60bbd" (UID: "1c910a71-8445-4c64-a555-433bb2c60bbd"). InnerVolumeSpecName "kube-api-access-fcvdg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 08:41:10 crc kubenswrapper[5004]: I1201 08:41:10.978643 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c910a71-8445-4c64-a555-433bb2c60bbd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1c910a71-8445-4c64-a555-433bb2c60bbd" (UID: "1c910a71-8445-4c64-a555-433bb2c60bbd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 08:41:10 crc kubenswrapper[5004]: I1201 08:41:10.985809 5004 generic.go:334] "Generic (PLEG): container finished" podID="f4c8bbd5-63c7-441d-9262-b8d44344c7fa" containerID="1ce112820d5acbe5229e48d53907309311ce33034eb42872b1f0cde08253dc00" exitCode=0
Dec 01 08:41:10 crc kubenswrapper[5004]: I1201 08:41:10.985897 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zgjgq" event={"ID":"f4c8bbd5-63c7-441d-9262-b8d44344c7fa","Type":"ContainerDied","Data":"1ce112820d5acbe5229e48d53907309311ce33034eb42872b1f0cde08253dc00"}
Dec 01 08:41:10 crc kubenswrapper[5004]: I1201 08:41:10.985943 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zgjgq" event={"ID":"f4c8bbd5-63c7-441d-9262-b8d44344c7fa","Type":"ContainerDied","Data":"d5d8f75474011431ddcc351c020de455cc92b0948d1d49ab2a28001dc19aaba4"}
Dec 01 08:41:10 crc kubenswrapper[5004]: I1201 08:41:10.985963 5004 scope.go:117] "RemoveContainer" containerID="1ce112820d5acbe5229e48d53907309311ce33034eb42872b1f0cde08253dc00"
Dec 01 08:41:10 crc kubenswrapper[5004]: I1201 08:41:10.986072 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zgjgq"
Dec 01 08:41:10 crc kubenswrapper[5004]: I1201 08:41:10.990860 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qgqsj\" (UniqueName: \"kubernetes.io/projected/f4c8bbd5-63c7-441d-9262-b8d44344c7fa-kube-api-access-qgqsj\") on node \"crc\" DevicePath \"\""
Dec 01 08:41:10 crc kubenswrapper[5004]: I1201 08:41:10.990881 5004 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4c8bbd5-63c7-441d-9262-b8d44344c7fa-utilities\") on node \"crc\" DevicePath \"\""
Dec 01 08:41:10 crc kubenswrapper[5004]: I1201 08:41:10.990891 5004 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c910a71-8445-4c64-a555-433bb2c60bbd-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 01 08:41:10 crc kubenswrapper[5004]: I1201 08:41:10.990901 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcvdg\" (UniqueName: \"kubernetes.io/projected/1c910a71-8445-4c64-a555-433bb2c60bbd-kube-api-access-fcvdg\") on node \"crc\" DevicePath \"\""
Dec 01 08:41:10 crc kubenswrapper[5004]: I1201 08:41:10.990909 5004 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1c910a71-8445-4c64-a555-433bb2c60bbd-etc-machine-id\") on node \"crc\" DevicePath \"\""
Dec 01 08:41:10 crc kubenswrapper[5004]: I1201 08:41:10.990917 5004 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c910a71-8445-4c64-a555-433bb2c60bbd-scripts\") on node \"crc\" DevicePath \"\""
Dec 01 08:41:10 crc kubenswrapper[5004]: I1201 08:41:10.990925 5004 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1c910a71-8445-4c64-a555-433bb2c60bbd-config-data-custom\") on node \"crc\" DevicePath \"\""
Dec 01 08:41:11 crc kubenswrapper[5004]: I1201 08:41:11.004979 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1c910a71-8445-4c64-a555-433bb2c60bbd","Type":"ContainerDied","Data":"5149b02f46f4e6cd86e56e13857f01668080f47cbf728b5072c7c7db0674d33e"}
Dec 01 08:41:11 crc kubenswrapper[5004]: I1201 08:41:11.005190 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Dec 01 08:41:11 crc kubenswrapper[5004]: I1201 08:41:11.016732 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f4c8bbd5-63c7-441d-9262-b8d44344c7fa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f4c8bbd5-63c7-441d-9262-b8d44344c7fa" (UID: "f4c8bbd5-63c7-441d-9262-b8d44344c7fa"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 08:41:11 crc kubenswrapper[5004]: I1201 08:41:11.044587 5004 scope.go:117] "RemoveContainer" containerID="bdeeb795990c8564109564e8080740677ec73d83420e3c4b12b77bda869a9f7f"
Dec 01 08:41:11 crc kubenswrapper[5004]: I1201 08:41:11.054774 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c910a71-8445-4c64-a555-433bb2c60bbd-config-data" (OuterVolumeSpecName: "config-data") pod "1c910a71-8445-4c64-a555-433bb2c60bbd" (UID: "1c910a71-8445-4c64-a555-433bb2c60bbd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 08:41:11 crc kubenswrapper[5004]: I1201 08:41:11.083476 5004 scope.go:117] "RemoveContainer" containerID="45265fc64361a745488e972896546d0437e068e42072dd4208f3398b553b8c96"
Dec 01 08:41:11 crc kubenswrapper[5004]: I1201 08:41:11.092978 5004 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c910a71-8445-4c64-a555-433bb2c60bbd-config-data\") on node \"crc\" DevicePath \"\""
Dec 01 08:41:11 crc kubenswrapper[5004]: I1201 08:41:11.093008 5004 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4c8bbd5-63c7-441d-9262-b8d44344c7fa-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 01 08:41:11 crc kubenswrapper[5004]: I1201 08:41:11.167290 5004 scope.go:117] "RemoveContainer" containerID="1ce112820d5acbe5229e48d53907309311ce33034eb42872b1f0cde08253dc00"
Dec 01 08:41:11 crc kubenswrapper[5004]: E1201 08:41:11.168035 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ce112820d5acbe5229e48d53907309311ce33034eb42872b1f0cde08253dc00\": container with ID starting with 1ce112820d5acbe5229e48d53907309311ce33034eb42872b1f0cde08253dc00 not found: ID does not exist" containerID="1ce112820d5acbe5229e48d53907309311ce33034eb42872b1f0cde08253dc00"
Dec 01 08:41:11 crc kubenswrapper[5004]: I1201 08:41:11.168078 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ce112820d5acbe5229e48d53907309311ce33034eb42872b1f0cde08253dc00"} err="failed to get container status \"1ce112820d5acbe5229e48d53907309311ce33034eb42872b1f0cde08253dc00\": rpc error: code = NotFound desc = could not find container \"1ce112820d5acbe5229e48d53907309311ce33034eb42872b1f0cde08253dc00\": container with ID starting with 1ce112820d5acbe5229e48d53907309311ce33034eb42872b1f0cde08253dc00 not found: ID does not exist"
Dec 01 08:41:11 crc kubenswrapper[5004]: I1201 08:41:11.168107 5004 scope.go:117] "RemoveContainer" containerID="bdeeb795990c8564109564e8080740677ec73d83420e3c4b12b77bda869a9f7f"
Dec 01 08:41:11 crc kubenswrapper[5004]: E1201 08:41:11.168570 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bdeeb795990c8564109564e8080740677ec73d83420e3c4b12b77bda869a9f7f\": container with ID starting with bdeeb795990c8564109564e8080740677ec73d83420e3c4b12b77bda869a9f7f not found: ID does not exist" containerID="bdeeb795990c8564109564e8080740677ec73d83420e3c4b12b77bda869a9f7f"
Dec 01 08:41:11 crc kubenswrapper[5004]: I1201 08:41:11.168600 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bdeeb795990c8564109564e8080740677ec73d83420e3c4b12b77bda869a9f7f"} err="failed to get container status \"bdeeb795990c8564109564e8080740677ec73d83420e3c4b12b77bda869a9f7f\": rpc error: code = NotFound desc = could not find container \"bdeeb795990c8564109564e8080740677ec73d83420e3c4b12b77bda869a9f7f\": container with ID starting with bdeeb795990c8564109564e8080740677ec73d83420e3c4b12b77bda869a9f7f not found: ID does not exist"
Dec 01 08:41:11 crc kubenswrapper[5004]: I1201 08:41:11.168624 5004 scope.go:117] "RemoveContainer" containerID="45265fc64361a745488e972896546d0437e068e42072dd4208f3398b553b8c96"
Dec 01 08:41:11 crc kubenswrapper[5004]: E1201 08:41:11.169335 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45265fc64361a745488e972896546d0437e068e42072dd4208f3398b553b8c96\": container with ID starting with 45265fc64361a745488e972896546d0437e068e42072dd4208f3398b553b8c96 not found: ID does not exist" containerID="45265fc64361a745488e972896546d0437e068e42072dd4208f3398b553b8c96"
Dec 01 08:41:11 crc kubenswrapper[5004]: I1201 08:41:11.169357 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45265fc64361a745488e972896546d0437e068e42072dd4208f3398b553b8c96"} err="failed to get container status \"45265fc64361a745488e972896546d0437e068e42072dd4208f3398b553b8c96\": rpc error: code = NotFound desc = could not find container \"45265fc64361a745488e972896546d0437e068e42072dd4208f3398b553b8c96\": container with ID starting with 45265fc64361a745488e972896546d0437e068e42072dd4208f3398b553b8c96 not found: ID does not exist"
Dec 01 08:41:11 crc kubenswrapper[5004]: I1201 08:41:11.169370 5004 scope.go:117] "RemoveContainer" containerID="4c59f8c69a9f6c8e17c3d7dd2514a80d7b6af36483e212edd7176fddc028f54e"
Dec 01 08:41:11 crc kubenswrapper[5004]: I1201 08:41:11.206178 5004 scope.go:117] "RemoveContainer" containerID="d5b78ca5142cc60878a4ad77fa71ddfca3d3c89f7ed8d105d0bc39e9b44fba76"
Dec 01 08:41:11 crc kubenswrapper[5004]: I1201 08:41:11.335765 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zgjgq"]
Dec 01 08:41:11 crc kubenswrapper[5004]: I1201 08:41:11.359266 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-zgjgq"]
Dec 01 08:41:11 crc kubenswrapper[5004]: I1201 08:41:11.373985 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"]
Dec 01 08:41:11 crc kubenswrapper[5004]: I1201 08:41:11.391289 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"]
Dec 01 08:41:11 crc kubenswrapper[5004]: I1201 08:41:11.408504 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"]
Dec 01 08:41:11 crc kubenswrapper[5004]: E1201 08:41:11.408999 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c910a71-8445-4c64-a555-433bb2c60bbd" containerName="cinder-scheduler"
Dec 01 08:41:11 crc kubenswrapper[5004]: I1201 08:41:11.409011 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c910a71-8445-4c64-a555-433bb2c60bbd" containerName="cinder-scheduler"
Dec 01 08:41:11 crc kubenswrapper[5004]: E1201 08:41:11.409024 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4c8bbd5-63c7-441d-9262-b8d44344c7fa" containerName="extract-content"
Dec 01 08:41:11 crc kubenswrapper[5004]: I1201 08:41:11.409030 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4c8bbd5-63c7-441d-9262-b8d44344c7fa" containerName="extract-content"
Dec 01 08:41:11 crc kubenswrapper[5004]: E1201 08:41:11.409040 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbc3a584-72e3-4331-8bd5-c5accc1f0395" containerName="init"
Dec 01 08:41:11 crc kubenswrapper[5004]: I1201 08:41:11.409046 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbc3a584-72e3-4331-8bd5-c5accc1f0395" containerName="init"
Dec 01 08:41:11 crc kubenswrapper[5004]: E1201 08:41:11.409068 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbc3a584-72e3-4331-8bd5-c5accc1f0395" containerName="dnsmasq-dns"
Dec 01 08:41:11 crc kubenswrapper[5004]: I1201 08:41:11.409073 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbc3a584-72e3-4331-8bd5-c5accc1f0395" containerName="dnsmasq-dns"
Dec 01 08:41:11 crc kubenswrapper[5004]: E1201 08:41:11.409085 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc52bfd5-1560-4155-ba74-5fc2d92dfe73" containerName="neutron-httpd"
Dec 01 08:41:11 crc kubenswrapper[5004]: I1201 08:41:11.409092 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc52bfd5-1560-4155-ba74-5fc2d92dfe73" containerName="neutron-httpd"
Dec 01 08:41:11 crc kubenswrapper[5004]: E1201 08:41:11.409103 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c910a71-8445-4c64-a555-433bb2c60bbd" containerName="probe"
Dec 01 08:41:11 crc kubenswrapper[5004]: I1201 08:41:11.409108 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c910a71-8445-4c64-a555-433bb2c60bbd" containerName="probe"
Dec 01 08:41:11 crc kubenswrapper[5004]: E1201 08:41:11.409117 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4c8bbd5-63c7-441d-9262-b8d44344c7fa" containerName="registry-server"
Dec 01 08:41:11 crc kubenswrapper[5004]: I1201 08:41:11.409122 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4c8bbd5-63c7-441d-9262-b8d44344c7fa" containerName="registry-server"
Dec 01 08:41:11 crc kubenswrapper[5004]: E1201 08:41:11.409137 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc52bfd5-1560-4155-ba74-5fc2d92dfe73" containerName="neutron-api"
Dec 01 08:41:11 crc kubenswrapper[5004]: I1201 08:41:11.409143 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc52bfd5-1560-4155-ba74-5fc2d92dfe73" containerName="neutron-api"
Dec 01 08:41:11 crc kubenswrapper[5004]: E1201 08:41:11.409166 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4c8bbd5-63c7-441d-9262-b8d44344c7fa" containerName="extract-utilities"
Dec 01 08:41:11 crc kubenswrapper[5004]: I1201 08:41:11.409173 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4c8bbd5-63c7-441d-9262-b8d44344c7fa" containerName="extract-utilities"
Dec 01 08:41:11 crc kubenswrapper[5004]: I1201 08:41:11.410867 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c910a71-8445-4c64-a555-433bb2c60bbd" containerName="cinder-scheduler"
Dec 01 08:41:11 crc kubenswrapper[5004]: I1201 08:41:11.410885 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c910a71-8445-4c64-a555-433bb2c60bbd" containerName="probe"
Dec 01 08:41:11 crc kubenswrapper[5004]: I1201 08:41:11.410894 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc52bfd5-1560-4155-ba74-5fc2d92dfe73" containerName="neutron-httpd"
Dec 01 08:41:11 crc kubenswrapper[5004]: I1201 08:41:11.410910 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc52bfd5-1560-4155-ba74-5fc2d92dfe73" containerName="neutron-api"
Dec 01 08:41:11 crc kubenswrapper[5004]: I1201 08:41:11.410919 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="fbc3a584-72e3-4331-8bd5-c5accc1f0395" containerName="dnsmasq-dns"
Dec 01 08:41:11 crc kubenswrapper[5004]: I1201 08:41:11.410930 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4c8bbd5-63c7-441d-9262-b8d44344c7fa" containerName="registry-server"
Dec 01 08:41:11 crc kubenswrapper[5004]: I1201 08:41:11.412103 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Dec 01 08:41:11 crc kubenswrapper[5004]: I1201 08:41:11.415686 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data"
Dec 01 08:41:11 crc kubenswrapper[5004]: I1201 08:41:11.416164 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Dec 01 08:41:11 crc kubenswrapper[5004]: I1201 08:41:11.501827 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ca33de3-6acb-4792-acd0-47790a8d0ee6-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"2ca33de3-6acb-4792-acd0-47790a8d0ee6\") " pod="openstack/cinder-scheduler-0"
Dec 01 08:41:11 crc kubenswrapper[5004]: I1201 08:41:11.502125 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ca33de3-6acb-4792-acd0-47790a8d0ee6-config-data\") pod \"cinder-scheduler-0\" (UID: \"2ca33de3-6acb-4792-acd0-47790a8d0ee6\") " pod="openstack/cinder-scheduler-0"
Dec 01 08:41:11 crc kubenswrapper[5004]: I1201 08:41:11.502217 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2ca33de3-6acb-4792-acd0-47790a8d0ee6-etc-machine-id\") pod
\"cinder-scheduler-0\" (UID: \"2ca33de3-6acb-4792-acd0-47790a8d0ee6\") " pod="openstack/cinder-scheduler-0" Dec 01 08:41:11 crc kubenswrapper[5004]: I1201 08:41:11.502317 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pp7rj\" (UniqueName: \"kubernetes.io/projected/2ca33de3-6acb-4792-acd0-47790a8d0ee6-kube-api-access-pp7rj\") pod \"cinder-scheduler-0\" (UID: \"2ca33de3-6acb-4792-acd0-47790a8d0ee6\") " pod="openstack/cinder-scheduler-0" Dec 01 08:41:11 crc kubenswrapper[5004]: I1201 08:41:11.502396 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ca33de3-6acb-4792-acd0-47790a8d0ee6-scripts\") pod \"cinder-scheduler-0\" (UID: \"2ca33de3-6acb-4792-acd0-47790a8d0ee6\") " pod="openstack/cinder-scheduler-0" Dec 01 08:41:11 crc kubenswrapper[5004]: I1201 08:41:11.502576 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2ca33de3-6acb-4792-acd0-47790a8d0ee6-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"2ca33de3-6acb-4792-acd0-47790a8d0ee6\") " pod="openstack/cinder-scheduler-0" Dec 01 08:41:11 crc kubenswrapper[5004]: I1201 08:41:11.604035 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2ca33de3-6acb-4792-acd0-47790a8d0ee6-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"2ca33de3-6acb-4792-acd0-47790a8d0ee6\") " pod="openstack/cinder-scheduler-0" Dec 01 08:41:11 crc kubenswrapper[5004]: I1201 08:41:11.604133 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ca33de3-6acb-4792-acd0-47790a8d0ee6-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"2ca33de3-6acb-4792-acd0-47790a8d0ee6\") " 
pod="openstack/cinder-scheduler-0" Dec 01 08:41:11 crc kubenswrapper[5004]: I1201 08:41:11.604163 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ca33de3-6acb-4792-acd0-47790a8d0ee6-config-data\") pod \"cinder-scheduler-0\" (UID: \"2ca33de3-6acb-4792-acd0-47790a8d0ee6\") " pod="openstack/cinder-scheduler-0" Dec 01 08:41:11 crc kubenswrapper[5004]: I1201 08:41:11.604186 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2ca33de3-6acb-4792-acd0-47790a8d0ee6-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"2ca33de3-6acb-4792-acd0-47790a8d0ee6\") " pod="openstack/cinder-scheduler-0" Dec 01 08:41:11 crc kubenswrapper[5004]: I1201 08:41:11.604226 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pp7rj\" (UniqueName: \"kubernetes.io/projected/2ca33de3-6acb-4792-acd0-47790a8d0ee6-kube-api-access-pp7rj\") pod \"cinder-scheduler-0\" (UID: \"2ca33de3-6acb-4792-acd0-47790a8d0ee6\") " pod="openstack/cinder-scheduler-0" Dec 01 08:41:11 crc kubenswrapper[5004]: I1201 08:41:11.604247 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ca33de3-6acb-4792-acd0-47790a8d0ee6-scripts\") pod \"cinder-scheduler-0\" (UID: \"2ca33de3-6acb-4792-acd0-47790a8d0ee6\") " pod="openstack/cinder-scheduler-0" Dec 01 08:41:11 crc kubenswrapper[5004]: I1201 08:41:11.605065 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2ca33de3-6acb-4792-acd0-47790a8d0ee6-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"2ca33de3-6acb-4792-acd0-47790a8d0ee6\") " pod="openstack/cinder-scheduler-0" Dec 01 08:41:11 crc kubenswrapper[5004]: I1201 08:41:11.608501 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/2ca33de3-6acb-4792-acd0-47790a8d0ee6-scripts\") pod \"cinder-scheduler-0\" (UID: \"2ca33de3-6acb-4792-acd0-47790a8d0ee6\") " pod="openstack/cinder-scheduler-0" Dec 01 08:41:11 crc kubenswrapper[5004]: I1201 08:41:11.609238 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ca33de3-6acb-4792-acd0-47790a8d0ee6-config-data\") pod \"cinder-scheduler-0\" (UID: \"2ca33de3-6acb-4792-acd0-47790a8d0ee6\") " pod="openstack/cinder-scheduler-0" Dec 01 08:41:11 crc kubenswrapper[5004]: I1201 08:41:11.611519 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2ca33de3-6acb-4792-acd0-47790a8d0ee6-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"2ca33de3-6acb-4792-acd0-47790a8d0ee6\") " pod="openstack/cinder-scheduler-0" Dec 01 08:41:11 crc kubenswrapper[5004]: I1201 08:41:11.612149 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ca33de3-6acb-4792-acd0-47790a8d0ee6-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"2ca33de3-6acb-4792-acd0-47790a8d0ee6\") " pod="openstack/cinder-scheduler-0" Dec 01 08:41:11 crc kubenswrapper[5004]: I1201 08:41:11.623701 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pp7rj\" (UniqueName: \"kubernetes.io/projected/2ca33de3-6acb-4792-acd0-47790a8d0ee6-kube-api-access-pp7rj\") pod \"cinder-scheduler-0\" (UID: \"2ca33de3-6acb-4792-acd0-47790a8d0ee6\") " pod="openstack/cinder-scheduler-0" Dec 01 08:41:11 crc kubenswrapper[5004]: I1201 08:41:11.755386 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 01 08:41:12 crc kubenswrapper[5004]: I1201 08:41:12.224395 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 01 08:41:12 crc kubenswrapper[5004]: W1201 08:41:12.226259 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2ca33de3_6acb_4792_acd0_47790a8d0ee6.slice/crio-88a43b2a0aabb664dc3dfb1970616e839f585bc854a711a04fda081a4fbf2778 WatchSource:0}: Error finding container 88a43b2a0aabb664dc3dfb1970616e839f585bc854a711a04fda081a4fbf2778: Status 404 returned error can't find the container with id 88a43b2a0aabb664dc3dfb1970616e839f585bc854a711a04fda081a4fbf2778 Dec 01 08:41:12 crc kubenswrapper[5004]: I1201 08:41:12.496188 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Dec 01 08:41:12 crc kubenswrapper[5004]: I1201 08:41:12.676650 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-585d58d86-6cl4z" Dec 01 08:41:12 crc kubenswrapper[5004]: I1201 08:41:12.718972 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-585d58d86-6cl4z" Dec 01 08:41:12 crc kubenswrapper[5004]: I1201 08:41:12.786075 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c910a71-8445-4c64-a555-433bb2c60bbd" path="/var/lib/kubelet/pods/1c910a71-8445-4c64-a555-433bb2c60bbd/volumes" Dec 01 08:41:12 crc kubenswrapper[5004]: I1201 08:41:12.786907 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4c8bbd5-63c7-441d-9262-b8d44344c7fa" path="/var/lib/kubelet/pods/f4c8bbd5-63c7-441d-9262-b8d44344c7fa/volumes" Dec 01 08:41:12 crc kubenswrapper[5004]: I1201 08:41:12.788299 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-65dd66d59d-csdqc"] Dec 01 08:41:12 crc kubenswrapper[5004]: I1201 
08:41:12.788511 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-65dd66d59d-csdqc" podUID="2d7ab903-f57f-450e-8af4-95ee4c219310" containerName="barbican-api-log" containerID="cri-o://cfb456369a0911948f9c47135bb4dcc6ee04f0414ec9b430b49e3523a57c321a" gracePeriod=30 Dec 01 08:41:12 crc kubenswrapper[5004]: I1201 08:41:12.789092 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-65dd66d59d-csdqc" podUID="2d7ab903-f57f-450e-8af4-95ee4c219310" containerName="barbican-api" containerID="cri-o://ec3001c85cfb4b70331b0e5e19e5d9f9213368a70f563402d2b9a2a523a17e79" gracePeriod=30 Dec 01 08:41:13 crc kubenswrapper[5004]: I1201 08:41:13.041582 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"2ca33de3-6acb-4792-acd0-47790a8d0ee6","Type":"ContainerStarted","Data":"edf1cc6fbe398362a5a5ff4c3d2e70fb76de8618945673e9a9fc15e00a3a40a2"} Dec 01 08:41:13 crc kubenswrapper[5004]: I1201 08:41:13.041923 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"2ca33de3-6acb-4792-acd0-47790a8d0ee6","Type":"ContainerStarted","Data":"88a43b2a0aabb664dc3dfb1970616e839f585bc854a711a04fda081a4fbf2778"} Dec 01 08:41:13 crc kubenswrapper[5004]: I1201 08:41:13.045480 5004 generic.go:334] "Generic (PLEG): container finished" podID="2d7ab903-f57f-450e-8af4-95ee4c219310" containerID="cfb456369a0911948f9c47135bb4dcc6ee04f0414ec9b430b49e3523a57c321a" exitCode=143 Dec 01 08:41:13 crc kubenswrapper[5004]: I1201 08:41:13.046429 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-65dd66d59d-csdqc" event={"ID":"2d7ab903-f57f-450e-8af4-95ee4c219310","Type":"ContainerDied","Data":"cfb456369a0911948f9c47135bb4dcc6ee04f0414ec9b430b49e3523a57c321a"} Dec 01 08:41:14 crc kubenswrapper[5004]: I1201 08:41:14.058240 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cinder-scheduler-0" event={"ID":"2ca33de3-6acb-4792-acd0-47790a8d0ee6","Type":"ContainerStarted","Data":"4cc81c75ec09d2e692e7fc6dd8a0b3d5d12e9a21f5a96e920b20cd05236351ec"} Dec 01 08:41:14 crc kubenswrapper[5004]: I1201 08:41:14.080099 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.080080278 podStartE2EDuration="3.080080278s" podCreationTimestamp="2025-12-01 08:41:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:41:14.076731027 +0000 UTC m=+1451.641723049" watchObservedRunningTime="2025-12-01 08:41:14.080080278 +0000 UTC m=+1451.645072270" Dec 01 08:41:15 crc kubenswrapper[5004]: I1201 08:41:15.980078 5004 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-65dd66d59d-csdqc" podUID="2d7ab903-f57f-450e-8af4-95ee4c219310" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.201:9311/healthcheck\": read tcp 10.217.0.2:41446->10.217.0.201:9311: read: connection reset by peer" Dec 01 08:41:15 crc kubenswrapper[5004]: I1201 08:41:15.981012 5004 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-65dd66d59d-csdqc" podUID="2d7ab903-f57f-450e-8af4-95ee4c219310" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.201:9311/healthcheck\": read tcp 10.217.0.2:41448->10.217.0.201:9311: read: connection reset by peer" Dec 01 08:41:16 crc kubenswrapper[5004]: I1201 08:41:16.092252 5004 generic.go:334] "Generic (PLEG): container finished" podID="2d7ab903-f57f-450e-8af4-95ee4c219310" containerID="ec3001c85cfb4b70331b0e5e19e5d9f9213368a70f563402d2b9a2a523a17e79" exitCode=0 Dec 01 08:41:16 crc kubenswrapper[5004]: I1201 08:41:16.092321 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-65dd66d59d-csdqc" 
event={"ID":"2d7ab903-f57f-450e-8af4-95ee4c219310","Type":"ContainerDied","Data":"ec3001c85cfb4b70331b0e5e19e5d9f9213368a70f563402d2b9a2a523a17e79"} Dec 01 08:41:16 crc kubenswrapper[5004]: I1201 08:41:16.522053 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-6d7ffc7f79-7hlcz" Dec 01 08:41:16 crc kubenswrapper[5004]: I1201 08:41:16.619363 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-65dd66d59d-csdqc" Dec 01 08:41:16 crc kubenswrapper[5004]: I1201 08:41:16.734245 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2d7ab903-f57f-450e-8af4-95ee4c219310-config-data-custom\") pod \"2d7ab903-f57f-450e-8af4-95ee4c219310\" (UID: \"2d7ab903-f57f-450e-8af4-95ee4c219310\") " Dec 01 08:41:16 crc kubenswrapper[5004]: I1201 08:41:16.734398 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2d7ab903-f57f-450e-8af4-95ee4c219310-logs\") pod \"2d7ab903-f57f-450e-8af4-95ee4c219310\" (UID: \"2d7ab903-f57f-450e-8af4-95ee4c219310\") " Dec 01 08:41:16 crc kubenswrapper[5004]: I1201 08:41:16.734450 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d7ab903-f57f-450e-8af4-95ee4c219310-config-data\") pod \"2d7ab903-f57f-450e-8af4-95ee4c219310\" (UID: \"2d7ab903-f57f-450e-8af4-95ee4c219310\") " Dec 01 08:41:16 crc kubenswrapper[5004]: I1201 08:41:16.734465 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7xvw4\" (UniqueName: \"kubernetes.io/projected/2d7ab903-f57f-450e-8af4-95ee4c219310-kube-api-access-7xvw4\") pod \"2d7ab903-f57f-450e-8af4-95ee4c219310\" (UID: \"2d7ab903-f57f-450e-8af4-95ee4c219310\") " Dec 01 08:41:16 crc kubenswrapper[5004]: I1201 08:41:16.734494 5004 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d7ab903-f57f-450e-8af4-95ee4c219310-combined-ca-bundle\") pod \"2d7ab903-f57f-450e-8af4-95ee4c219310\" (UID: \"2d7ab903-f57f-450e-8af4-95ee4c219310\") " Dec 01 08:41:16 crc kubenswrapper[5004]: I1201 08:41:16.735872 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d7ab903-f57f-450e-8af4-95ee4c219310-logs" (OuterVolumeSpecName: "logs") pod "2d7ab903-f57f-450e-8af4-95ee4c219310" (UID: "2d7ab903-f57f-450e-8af4-95ee4c219310"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 08:41:16 crc kubenswrapper[5004]: I1201 08:41:16.740876 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d7ab903-f57f-450e-8af4-95ee4c219310-kube-api-access-7xvw4" (OuterVolumeSpecName: "kube-api-access-7xvw4") pod "2d7ab903-f57f-450e-8af4-95ee4c219310" (UID: "2d7ab903-f57f-450e-8af4-95ee4c219310"). InnerVolumeSpecName "kube-api-access-7xvw4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:41:16 crc kubenswrapper[5004]: I1201 08:41:16.755179 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d7ab903-f57f-450e-8af4-95ee4c219310-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "2d7ab903-f57f-450e-8af4-95ee4c219310" (UID: "2d7ab903-f57f-450e-8af4-95ee4c219310"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:41:16 crc kubenswrapper[5004]: I1201 08:41:16.756218 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 01 08:41:16 crc kubenswrapper[5004]: I1201 08:41:16.770046 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d7ab903-f57f-450e-8af4-95ee4c219310-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2d7ab903-f57f-450e-8af4-95ee4c219310" (UID: "2d7ab903-f57f-450e-8af4-95ee4c219310"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:41:16 crc kubenswrapper[5004]: I1201 08:41:16.787738 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d7ab903-f57f-450e-8af4-95ee4c219310-config-data" (OuterVolumeSpecName: "config-data") pod "2d7ab903-f57f-450e-8af4-95ee4c219310" (UID: "2d7ab903-f57f-450e-8af4-95ee4c219310"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:41:16 crc kubenswrapper[5004]: I1201 08:41:16.836416 5004 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2d7ab903-f57f-450e-8af4-95ee4c219310-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 01 08:41:16 crc kubenswrapper[5004]: I1201 08:41:16.836942 5004 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2d7ab903-f57f-450e-8af4-95ee4c219310-logs\") on node \"crc\" DevicePath \"\"" Dec 01 08:41:16 crc kubenswrapper[5004]: I1201 08:41:16.837008 5004 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d7ab903-f57f-450e-8af4-95ee4c219310-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 08:41:16 crc kubenswrapper[5004]: I1201 08:41:16.837060 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7xvw4\" (UniqueName: \"kubernetes.io/projected/2d7ab903-f57f-450e-8af4-95ee4c219310-kube-api-access-7xvw4\") on node \"crc\" DevicePath \"\"" Dec 01 08:41:16 crc kubenswrapper[5004]: I1201 08:41:16.837121 5004 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d7ab903-f57f-450e-8af4-95ee4c219310-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 08:41:17 crc kubenswrapper[5004]: I1201 08:41:17.111721 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-65dd66d59d-csdqc" event={"ID":"2d7ab903-f57f-450e-8af4-95ee4c219310","Type":"ContainerDied","Data":"48e25e58135e65f251545800e04b5e2dbe90dba3a77b31e1afeb08e7c7399815"} Dec 01 08:41:17 crc kubenswrapper[5004]: I1201 08:41:17.111793 5004 scope.go:117] "RemoveContainer" containerID="ec3001c85cfb4b70331b0e5e19e5d9f9213368a70f563402d2b9a2a523a17e79" Dec 01 08:41:17 crc kubenswrapper[5004]: I1201 08:41:17.111979 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-65dd66d59d-csdqc" Dec 01 08:41:17 crc kubenswrapper[5004]: I1201 08:41:17.158104 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-65dd66d59d-csdqc"] Dec 01 08:41:17 crc kubenswrapper[5004]: I1201 08:41:17.163642 5004 scope.go:117] "RemoveContainer" containerID="cfb456369a0911948f9c47135bb4dcc6ee04f0414ec9b430b49e3523a57c321a" Dec 01 08:41:17 crc kubenswrapper[5004]: I1201 08:41:17.190020 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-65dd66d59d-csdqc"] Dec 01 08:41:18 crc kubenswrapper[5004]: I1201 08:41:18.783284 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d7ab903-f57f-450e-8af4-95ee4c219310" path="/var/lib/kubelet/pods/2d7ab903-f57f-450e-8af4-95ee4c219310/volumes" Dec 01 08:41:20 crc kubenswrapper[5004]: I1201 08:41:20.466721 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-7977b5fddc-g9fhv"] Dec 01 08:41:20 crc kubenswrapper[5004]: E1201 08:41:20.467765 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d7ab903-f57f-450e-8af4-95ee4c219310" containerName="barbican-api-log" Dec 01 08:41:20 crc kubenswrapper[5004]: I1201 08:41:20.467781 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d7ab903-f57f-450e-8af4-95ee4c219310" containerName="barbican-api-log" Dec 01 08:41:20 crc kubenswrapper[5004]: E1201 08:41:20.467807 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d7ab903-f57f-450e-8af4-95ee4c219310" containerName="barbican-api" Dec 01 08:41:20 crc kubenswrapper[5004]: I1201 08:41:20.467813 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d7ab903-f57f-450e-8af4-95ee4c219310" containerName="barbican-api" Dec 01 08:41:20 crc kubenswrapper[5004]: I1201 08:41:20.468029 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d7ab903-f57f-450e-8af4-95ee4c219310" containerName="barbican-api-log" Dec 01 
08:41:20 crc kubenswrapper[5004]: I1201 08:41:20.468047 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d7ab903-f57f-450e-8af4-95ee4c219310" containerName="barbican-api" Dec 01 08:41:20 crc kubenswrapper[5004]: I1201 08:41:20.470996 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-7977b5fddc-g9fhv" Dec 01 08:41:20 crc kubenswrapper[5004]: I1201 08:41:20.473247 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Dec 01 08:41:20 crc kubenswrapper[5004]: I1201 08:41:20.473924 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Dec 01 08:41:20 crc kubenswrapper[5004]: I1201 08:41:20.474143 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Dec 01 08:41:20 crc kubenswrapper[5004]: I1201 08:41:20.493638 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-7977b5fddc-g9fhv"] Dec 01 08:41:20 crc kubenswrapper[5004]: I1201 08:41:20.661213 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2842f079-806d-4bdd-8218-db4b2b000259-log-httpd\") pod \"swift-proxy-7977b5fddc-g9fhv\" (UID: \"2842f079-806d-4bdd-8218-db4b2b000259\") " pod="openstack/swift-proxy-7977b5fddc-g9fhv" Dec 01 08:41:20 crc kubenswrapper[5004]: I1201 08:41:20.661265 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2842f079-806d-4bdd-8218-db4b2b000259-public-tls-certs\") pod \"swift-proxy-7977b5fddc-g9fhv\" (UID: \"2842f079-806d-4bdd-8218-db4b2b000259\") " pod="openstack/swift-proxy-7977b5fddc-g9fhv" Dec 01 08:41:20 crc kubenswrapper[5004]: I1201 08:41:20.661288 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2842f079-806d-4bdd-8218-db4b2b000259-combined-ca-bundle\") pod \"swift-proxy-7977b5fddc-g9fhv\" (UID: \"2842f079-806d-4bdd-8218-db4b2b000259\") " pod="openstack/swift-proxy-7977b5fddc-g9fhv" Dec 01 08:41:20 crc kubenswrapper[5004]: I1201 08:41:20.661349 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2842f079-806d-4bdd-8218-db4b2b000259-internal-tls-certs\") pod \"swift-proxy-7977b5fddc-g9fhv\" (UID: \"2842f079-806d-4bdd-8218-db4b2b000259\") " pod="openstack/swift-proxy-7977b5fddc-g9fhv" Dec 01 08:41:20 crc kubenswrapper[5004]: I1201 08:41:20.661378 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2842f079-806d-4bdd-8218-db4b2b000259-run-httpd\") pod \"swift-proxy-7977b5fddc-g9fhv\" (UID: \"2842f079-806d-4bdd-8218-db4b2b000259\") " pod="openstack/swift-proxy-7977b5fddc-g9fhv" Dec 01 08:41:20 crc kubenswrapper[5004]: I1201 08:41:20.662327 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhg9x\" (UniqueName: \"kubernetes.io/projected/2842f079-806d-4bdd-8218-db4b2b000259-kube-api-access-lhg9x\") pod \"swift-proxy-7977b5fddc-g9fhv\" (UID: \"2842f079-806d-4bdd-8218-db4b2b000259\") " pod="openstack/swift-proxy-7977b5fddc-g9fhv" Dec 01 08:41:20 crc kubenswrapper[5004]: I1201 08:41:20.662404 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2842f079-806d-4bdd-8218-db4b2b000259-etc-swift\") pod \"swift-proxy-7977b5fddc-g9fhv\" (UID: \"2842f079-806d-4bdd-8218-db4b2b000259\") " pod="openstack/swift-proxy-7977b5fddc-g9fhv" Dec 01 08:41:20 crc kubenswrapper[5004]: I1201 08:41:20.662603 5004 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2842f079-806d-4bdd-8218-db4b2b000259-config-data\") pod \"swift-proxy-7977b5fddc-g9fhv\" (UID: \"2842f079-806d-4bdd-8218-db4b2b000259\") " pod="openstack/swift-proxy-7977b5fddc-g9fhv" Dec 01 08:41:20 crc kubenswrapper[5004]: I1201 08:41:20.765029 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2842f079-806d-4bdd-8218-db4b2b000259-log-httpd\") pod \"swift-proxy-7977b5fddc-g9fhv\" (UID: \"2842f079-806d-4bdd-8218-db4b2b000259\") " pod="openstack/swift-proxy-7977b5fddc-g9fhv" Dec 01 08:41:20 crc kubenswrapper[5004]: I1201 08:41:20.765471 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2842f079-806d-4bdd-8218-db4b2b000259-public-tls-certs\") pod \"swift-proxy-7977b5fddc-g9fhv\" (UID: \"2842f079-806d-4bdd-8218-db4b2b000259\") " pod="openstack/swift-proxy-7977b5fddc-g9fhv" Dec 01 08:41:20 crc kubenswrapper[5004]: I1201 08:41:20.765590 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2842f079-806d-4bdd-8218-db4b2b000259-combined-ca-bundle\") pod \"swift-proxy-7977b5fddc-g9fhv\" (UID: \"2842f079-806d-4bdd-8218-db4b2b000259\") " pod="openstack/swift-proxy-7977b5fddc-g9fhv" Dec 01 08:41:20 crc kubenswrapper[5004]: I1201 08:41:20.765757 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2842f079-806d-4bdd-8218-db4b2b000259-internal-tls-certs\") pod \"swift-proxy-7977b5fddc-g9fhv\" (UID: \"2842f079-806d-4bdd-8218-db4b2b000259\") " pod="openstack/swift-proxy-7977b5fddc-g9fhv" Dec 01 08:41:20 crc kubenswrapper[5004]: I1201 08:41:20.765886 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2842f079-806d-4bdd-8218-db4b2b000259-run-httpd\") pod \"swift-proxy-7977b5fddc-g9fhv\" (UID: \"2842f079-806d-4bdd-8218-db4b2b000259\") " pod="openstack/swift-proxy-7977b5fddc-g9fhv" Dec 01 08:41:20 crc kubenswrapper[5004]: I1201 08:41:20.766019 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhg9x\" (UniqueName: \"kubernetes.io/projected/2842f079-806d-4bdd-8218-db4b2b000259-kube-api-access-lhg9x\") pod \"swift-proxy-7977b5fddc-g9fhv\" (UID: \"2842f079-806d-4bdd-8218-db4b2b000259\") " pod="openstack/swift-proxy-7977b5fddc-g9fhv" Dec 01 08:41:20 crc kubenswrapper[5004]: I1201 08:41:20.766181 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2842f079-806d-4bdd-8218-db4b2b000259-etc-swift\") pod \"swift-proxy-7977b5fddc-g9fhv\" (UID: \"2842f079-806d-4bdd-8218-db4b2b000259\") " pod="openstack/swift-proxy-7977b5fddc-g9fhv" Dec 01 08:41:20 crc kubenswrapper[5004]: I1201 08:41:20.766322 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2842f079-806d-4bdd-8218-db4b2b000259-config-data\") pod \"swift-proxy-7977b5fddc-g9fhv\" (UID: \"2842f079-806d-4bdd-8218-db4b2b000259\") " pod="openstack/swift-proxy-7977b5fddc-g9fhv" Dec 01 08:41:20 crc kubenswrapper[5004]: I1201 08:41:20.767520 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2842f079-806d-4bdd-8218-db4b2b000259-run-httpd\") pod \"swift-proxy-7977b5fddc-g9fhv\" (UID: \"2842f079-806d-4bdd-8218-db4b2b000259\") " pod="openstack/swift-proxy-7977b5fddc-g9fhv" Dec 01 08:41:20 crc kubenswrapper[5004]: I1201 08:41:20.768440 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/2842f079-806d-4bdd-8218-db4b2b000259-log-httpd\") pod \"swift-proxy-7977b5fddc-g9fhv\" (UID: \"2842f079-806d-4bdd-8218-db4b2b000259\") " pod="openstack/swift-proxy-7977b5fddc-g9fhv" Dec 01 08:41:20 crc kubenswrapper[5004]: I1201 08:41:20.772548 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2842f079-806d-4bdd-8218-db4b2b000259-internal-tls-certs\") pod \"swift-proxy-7977b5fddc-g9fhv\" (UID: \"2842f079-806d-4bdd-8218-db4b2b000259\") " pod="openstack/swift-proxy-7977b5fddc-g9fhv" Dec 01 08:41:20 crc kubenswrapper[5004]: I1201 08:41:20.772978 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2842f079-806d-4bdd-8218-db4b2b000259-public-tls-certs\") pod \"swift-proxy-7977b5fddc-g9fhv\" (UID: \"2842f079-806d-4bdd-8218-db4b2b000259\") " pod="openstack/swift-proxy-7977b5fddc-g9fhv" Dec 01 08:41:20 crc kubenswrapper[5004]: I1201 08:41:20.776008 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2842f079-806d-4bdd-8218-db4b2b000259-etc-swift\") pod \"swift-proxy-7977b5fddc-g9fhv\" (UID: \"2842f079-806d-4bdd-8218-db4b2b000259\") " pod="openstack/swift-proxy-7977b5fddc-g9fhv" Dec 01 08:41:20 crc kubenswrapper[5004]: I1201 08:41:20.779290 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2842f079-806d-4bdd-8218-db4b2b000259-config-data\") pod \"swift-proxy-7977b5fddc-g9fhv\" (UID: \"2842f079-806d-4bdd-8218-db4b2b000259\") " pod="openstack/swift-proxy-7977b5fddc-g9fhv" Dec 01 08:41:20 crc kubenswrapper[5004]: I1201 08:41:20.782502 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2842f079-806d-4bdd-8218-db4b2b000259-combined-ca-bundle\") pod 
\"swift-proxy-7977b5fddc-g9fhv\" (UID: \"2842f079-806d-4bdd-8218-db4b2b000259\") " pod="openstack/swift-proxy-7977b5fddc-g9fhv" Dec 01 08:41:20 crc kubenswrapper[5004]: I1201 08:41:20.796885 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhg9x\" (UniqueName: \"kubernetes.io/projected/2842f079-806d-4bdd-8218-db4b2b000259-kube-api-access-lhg9x\") pod \"swift-proxy-7977b5fddc-g9fhv\" (UID: \"2842f079-806d-4bdd-8218-db4b2b000259\") " pod="openstack/swift-proxy-7977b5fddc-g9fhv" Dec 01 08:41:20 crc kubenswrapper[5004]: I1201 08:41:20.813071 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-7977b5fddc-g9fhv" Dec 01 08:41:21 crc kubenswrapper[5004]: I1201 08:41:21.445223 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Dec 01 08:41:21 crc kubenswrapper[5004]: I1201 08:41:21.447476 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 01 08:41:21 crc kubenswrapper[5004]: I1201 08:41:21.451663 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Dec 01 08:41:21 crc kubenswrapper[5004]: I1201 08:41:21.451894 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-j2scb" Dec 01 08:41:21 crc kubenswrapper[5004]: I1201 08:41:21.452063 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Dec 01 08:41:21 crc kubenswrapper[5004]: I1201 08:41:21.461683 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 01 08:41:21 crc kubenswrapper[5004]: I1201 08:41:21.492037 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a4252410-0df2-4381-97da-772a062f3f8b-openstack-config\") pod 
\"openstackclient\" (UID: \"a4252410-0df2-4381-97da-772a062f3f8b\") " pod="openstack/openstackclient" Dec 01 08:41:21 crc kubenswrapper[5004]: I1201 08:41:21.492151 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56wd9\" (UniqueName: \"kubernetes.io/projected/a4252410-0df2-4381-97da-772a062f3f8b-kube-api-access-56wd9\") pod \"openstackclient\" (UID: \"a4252410-0df2-4381-97da-772a062f3f8b\") " pod="openstack/openstackclient" Dec 01 08:41:21 crc kubenswrapper[5004]: I1201 08:41:21.492245 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a4252410-0df2-4381-97da-772a062f3f8b-openstack-config-secret\") pod \"openstackclient\" (UID: \"a4252410-0df2-4381-97da-772a062f3f8b\") " pod="openstack/openstackclient" Dec 01 08:41:21 crc kubenswrapper[5004]: I1201 08:41:21.492267 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4252410-0df2-4381-97da-772a062f3f8b-combined-ca-bundle\") pod \"openstackclient\" (UID: \"a4252410-0df2-4381-97da-772a062f3f8b\") " pod="openstack/openstackclient" Dec 01 08:41:21 crc kubenswrapper[5004]: I1201 08:41:21.569488 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-7977b5fddc-g9fhv"] Dec 01 08:41:21 crc kubenswrapper[5004]: I1201 08:41:21.594700 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56wd9\" (UniqueName: \"kubernetes.io/projected/a4252410-0df2-4381-97da-772a062f3f8b-kube-api-access-56wd9\") pod \"openstackclient\" (UID: \"a4252410-0df2-4381-97da-772a062f3f8b\") " pod="openstack/openstackclient" Dec 01 08:41:21 crc kubenswrapper[5004]: I1201 08:41:21.595047 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" 
(UniqueName: \"kubernetes.io/secret/a4252410-0df2-4381-97da-772a062f3f8b-openstack-config-secret\") pod \"openstackclient\" (UID: \"a4252410-0df2-4381-97da-772a062f3f8b\") " pod="openstack/openstackclient" Dec 01 08:41:21 crc kubenswrapper[5004]: I1201 08:41:21.595145 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4252410-0df2-4381-97da-772a062f3f8b-combined-ca-bundle\") pod \"openstackclient\" (UID: \"a4252410-0df2-4381-97da-772a062f3f8b\") " pod="openstack/openstackclient" Dec 01 08:41:21 crc kubenswrapper[5004]: I1201 08:41:21.595961 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a4252410-0df2-4381-97da-772a062f3f8b-openstack-config\") pod \"openstackclient\" (UID: \"a4252410-0df2-4381-97da-772a062f3f8b\") " pod="openstack/openstackclient" Dec 01 08:41:21 crc kubenswrapper[5004]: I1201 08:41:21.596906 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a4252410-0df2-4381-97da-772a062f3f8b-openstack-config\") pod \"openstackclient\" (UID: \"a4252410-0df2-4381-97da-772a062f3f8b\") " pod="openstack/openstackclient" Dec 01 08:41:21 crc kubenswrapper[5004]: I1201 08:41:21.601184 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4252410-0df2-4381-97da-772a062f3f8b-combined-ca-bundle\") pod \"openstackclient\" (UID: \"a4252410-0df2-4381-97da-772a062f3f8b\") " pod="openstack/openstackclient" Dec 01 08:41:21 crc kubenswrapper[5004]: I1201 08:41:21.605166 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a4252410-0df2-4381-97da-772a062f3f8b-openstack-config-secret\") pod \"openstackclient\" (UID: \"a4252410-0df2-4381-97da-772a062f3f8b\") " 
pod="openstack/openstackclient" Dec 01 08:41:21 crc kubenswrapper[5004]: I1201 08:41:21.613442 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56wd9\" (UniqueName: \"kubernetes.io/projected/a4252410-0df2-4381-97da-772a062f3f8b-kube-api-access-56wd9\") pod \"openstackclient\" (UID: \"a4252410-0df2-4381-97da-772a062f3f8b\") " pod="openstack/openstackclient" Dec 01 08:41:21 crc kubenswrapper[5004]: I1201 08:41:21.777196 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 01 08:41:22 crc kubenswrapper[5004]: I1201 08:41:22.040587 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 01 08:41:22 crc kubenswrapper[5004]: I1201 08:41:22.186409 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7977b5fddc-g9fhv" event={"ID":"2842f079-806d-4bdd-8218-db4b2b000259","Type":"ContainerStarted","Data":"825eb0c76eca1fffc941b377e492a60a3faffd56ffc6841eb82c5753b35a8b7e"} Dec 01 08:41:22 crc kubenswrapper[5004]: I1201 08:41:22.186447 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7977b5fddc-g9fhv" event={"ID":"2842f079-806d-4bdd-8218-db4b2b000259","Type":"ContainerStarted","Data":"c9bcd1877fd636a2fc5494974571789ca3698f169812e98faf6ce27ebde0f53c"} Dec 01 08:41:22 crc kubenswrapper[5004]: I1201 08:41:22.186458 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7977b5fddc-g9fhv" event={"ID":"2842f079-806d-4bdd-8218-db4b2b000259","Type":"ContainerStarted","Data":"0a4553e308f54a6573c47bd5de7eab23ea743cb4230274e666d7007fc99da86a"} Dec 01 08:41:22 crc kubenswrapper[5004]: I1201 08:41:22.186878 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-7977b5fddc-g9fhv" Dec 01 08:41:22 crc kubenswrapper[5004]: I1201 08:41:22.217170 5004 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/swift-proxy-7977b5fddc-g9fhv" podStartSLOduration=2.217150567 podStartE2EDuration="2.217150567s" podCreationTimestamp="2025-12-01 08:41:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:41:22.210610877 +0000 UTC m=+1459.775602879" watchObservedRunningTime="2025-12-01 08:41:22.217150567 +0000 UTC m=+1459.782142549" Dec 01 08:41:22 crc kubenswrapper[5004]: W1201 08:41:22.275037 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda4252410_0df2_4381_97da_772a062f3f8b.slice/crio-2eee5ea85c848e1b8e262d8b91f734598cc211ffc2aef81a790c620f020c0b94 WatchSource:0}: Error finding container 2eee5ea85c848e1b8e262d8b91f734598cc211ffc2aef81a790c620f020c0b94: Status 404 returned error can't find the container with id 2eee5ea85c848e1b8e262d8b91f734598cc211ffc2aef81a790c620f020c0b94 Dec 01 08:41:22 crc kubenswrapper[5004]: I1201 08:41:22.281724 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 01 08:41:22 crc kubenswrapper[5004]: I1201 08:41:22.466780 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 08:41:22 crc kubenswrapper[5004]: I1201 08:41:22.467283 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e8d0df30-51e3-436d-a6a6-f65dcce6db43" containerName="ceilometer-central-agent" containerID="cri-o://50d0650801191286e1e076cbe935cf885747cedc8d93d1ada5eefd63f626c140" gracePeriod=30 Dec 01 08:41:22 crc kubenswrapper[5004]: I1201 08:41:22.467395 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e8d0df30-51e3-436d-a6a6-f65dcce6db43" containerName="ceilometer-notification-agent" containerID="cri-o://8233abab7381b4b33e414d6dd7c421f095c3af601e8a4f09e9d238fd41d311b6" 
gracePeriod=30 Dec 01 08:41:22 crc kubenswrapper[5004]: I1201 08:41:22.467449 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e8d0df30-51e3-436d-a6a6-f65dcce6db43" containerName="proxy-httpd" containerID="cri-o://96cd9079bacc653ba543eec29b93edca4bb82d19e66b908eec9ef5edfcdbf843" gracePeriod=30 Dec 01 08:41:22 crc kubenswrapper[5004]: I1201 08:41:22.467534 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e8d0df30-51e3-436d-a6a6-f65dcce6db43" containerName="sg-core" containerID="cri-o://db3c9c4681c539e98b6da1a0b5264aa96c8db05e0e318d8711e90f6dd314a2e2" gracePeriod=30 Dec 01 08:41:22 crc kubenswrapper[5004]: I1201 08:41:22.475280 5004 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="e8d0df30-51e3-436d-a6a6-f65dcce6db43" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.202:3000/\": EOF" Dec 01 08:41:23 crc kubenswrapper[5004]: I1201 08:41:23.244441 5004 generic.go:334] "Generic (PLEG): container finished" podID="e8d0df30-51e3-436d-a6a6-f65dcce6db43" containerID="96cd9079bacc653ba543eec29b93edca4bb82d19e66b908eec9ef5edfcdbf843" exitCode=0 Dec 01 08:41:23 crc kubenswrapper[5004]: I1201 08:41:23.244714 5004 generic.go:334] "Generic (PLEG): container finished" podID="e8d0df30-51e3-436d-a6a6-f65dcce6db43" containerID="db3c9c4681c539e98b6da1a0b5264aa96c8db05e0e318d8711e90f6dd314a2e2" exitCode=2 Dec 01 08:41:23 crc kubenswrapper[5004]: I1201 08:41:23.244722 5004 generic.go:334] "Generic (PLEG): container finished" podID="e8d0df30-51e3-436d-a6a6-f65dcce6db43" containerID="50d0650801191286e1e076cbe935cf885747cedc8d93d1ada5eefd63f626c140" exitCode=0 Dec 01 08:41:23 crc kubenswrapper[5004]: I1201 08:41:23.244774 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"e8d0df30-51e3-436d-a6a6-f65dcce6db43","Type":"ContainerDied","Data":"96cd9079bacc653ba543eec29b93edca4bb82d19e66b908eec9ef5edfcdbf843"} Dec 01 08:41:23 crc kubenswrapper[5004]: I1201 08:41:23.244800 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e8d0df30-51e3-436d-a6a6-f65dcce6db43","Type":"ContainerDied","Data":"db3c9c4681c539e98b6da1a0b5264aa96c8db05e0e318d8711e90f6dd314a2e2"} Dec 01 08:41:23 crc kubenswrapper[5004]: I1201 08:41:23.244810 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e8d0df30-51e3-436d-a6a6-f65dcce6db43","Type":"ContainerDied","Data":"50d0650801191286e1e076cbe935cf885747cedc8d93d1ada5eefd63f626c140"} Dec 01 08:41:23 crc kubenswrapper[5004]: I1201 08:41:23.257773 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"a4252410-0df2-4381-97da-772a062f3f8b","Type":"ContainerStarted","Data":"2eee5ea85c848e1b8e262d8b91f734598cc211ffc2aef81a790c620f020c0b94"} Dec 01 08:41:23 crc kubenswrapper[5004]: I1201 08:41:23.257817 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-7977b5fddc-g9fhv" Dec 01 08:41:25 crc kubenswrapper[5004]: I1201 08:41:25.304757 5004 generic.go:334] "Generic (PLEG): container finished" podID="e8d0df30-51e3-436d-a6a6-f65dcce6db43" containerID="8233abab7381b4b33e414d6dd7c421f095c3af601e8a4f09e9d238fd41d311b6" exitCode=0 Dec 01 08:41:25 crc kubenswrapper[5004]: I1201 08:41:25.304842 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e8d0df30-51e3-436d-a6a6-f65dcce6db43","Type":"ContainerDied","Data":"8233abab7381b4b33e414d6dd7c421f095c3af601e8a4f09e9d238fd41d311b6"} Dec 01 08:41:25 crc kubenswrapper[5004]: I1201 08:41:25.925875 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 01 08:41:26 crc kubenswrapper[5004]: I1201 08:41:26.001443 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8d0df30-51e3-436d-a6a6-f65dcce6db43-config-data\") pod \"e8d0df30-51e3-436d-a6a6-f65dcce6db43\" (UID: \"e8d0df30-51e3-436d-a6a6-f65dcce6db43\") " Dec 01 08:41:26 crc kubenswrapper[5004]: I1201 08:41:26.001511 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vsjpx\" (UniqueName: \"kubernetes.io/projected/e8d0df30-51e3-436d-a6a6-f65dcce6db43-kube-api-access-vsjpx\") pod \"e8d0df30-51e3-436d-a6a6-f65dcce6db43\" (UID: \"e8d0df30-51e3-436d-a6a6-f65dcce6db43\") " Dec 01 08:41:26 crc kubenswrapper[5004]: I1201 08:41:26.001602 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e8d0df30-51e3-436d-a6a6-f65dcce6db43-run-httpd\") pod \"e8d0df30-51e3-436d-a6a6-f65dcce6db43\" (UID: \"e8d0df30-51e3-436d-a6a6-f65dcce6db43\") " Dec 01 08:41:26 crc kubenswrapper[5004]: I1201 08:41:26.001643 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8d0df30-51e3-436d-a6a6-f65dcce6db43-combined-ca-bundle\") pod \"e8d0df30-51e3-436d-a6a6-f65dcce6db43\" (UID: \"e8d0df30-51e3-436d-a6a6-f65dcce6db43\") " Dec 01 08:41:26 crc kubenswrapper[5004]: I1201 08:41:26.001774 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8d0df30-51e3-436d-a6a6-f65dcce6db43-scripts\") pod \"e8d0df30-51e3-436d-a6a6-f65dcce6db43\" (UID: \"e8d0df30-51e3-436d-a6a6-f65dcce6db43\") " Dec 01 08:41:26 crc kubenswrapper[5004]: I1201 08:41:26.001874 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/e8d0df30-51e3-436d-a6a6-f65dcce6db43-sg-core-conf-yaml\") pod \"e8d0df30-51e3-436d-a6a6-f65dcce6db43\" (UID: \"e8d0df30-51e3-436d-a6a6-f65dcce6db43\") " Dec 01 08:41:26 crc kubenswrapper[5004]: I1201 08:41:26.001914 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e8d0df30-51e3-436d-a6a6-f65dcce6db43-log-httpd\") pod \"e8d0df30-51e3-436d-a6a6-f65dcce6db43\" (UID: \"e8d0df30-51e3-436d-a6a6-f65dcce6db43\") " Dec 01 08:41:26 crc kubenswrapper[5004]: I1201 08:41:26.002669 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8d0df30-51e3-436d-a6a6-f65dcce6db43-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "e8d0df30-51e3-436d-a6a6-f65dcce6db43" (UID: "e8d0df30-51e3-436d-a6a6-f65dcce6db43"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 08:41:26 crc kubenswrapper[5004]: I1201 08:41:26.002892 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8d0df30-51e3-436d-a6a6-f65dcce6db43-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e8d0df30-51e3-436d-a6a6-f65dcce6db43" (UID: "e8d0df30-51e3-436d-a6a6-f65dcce6db43"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 08:41:26 crc kubenswrapper[5004]: I1201 08:41:26.007031 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8d0df30-51e3-436d-a6a6-f65dcce6db43-scripts" (OuterVolumeSpecName: "scripts") pod "e8d0df30-51e3-436d-a6a6-f65dcce6db43" (UID: "e8d0df30-51e3-436d-a6a6-f65dcce6db43"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:41:26 crc kubenswrapper[5004]: I1201 08:41:26.027319 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8d0df30-51e3-436d-a6a6-f65dcce6db43-kube-api-access-vsjpx" (OuterVolumeSpecName: "kube-api-access-vsjpx") pod "e8d0df30-51e3-436d-a6a6-f65dcce6db43" (UID: "e8d0df30-51e3-436d-a6a6-f65dcce6db43"). InnerVolumeSpecName "kube-api-access-vsjpx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:41:26 crc kubenswrapper[5004]: I1201 08:41:26.062205 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8d0df30-51e3-436d-a6a6-f65dcce6db43-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "e8d0df30-51e3-436d-a6a6-f65dcce6db43" (UID: "e8d0df30-51e3-436d-a6a6-f65dcce6db43"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:41:26 crc kubenswrapper[5004]: I1201 08:41:26.104800 5004 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8d0df30-51e3-436d-a6a6-f65dcce6db43-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 08:41:26 crc kubenswrapper[5004]: I1201 08:41:26.104831 5004 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e8d0df30-51e3-436d-a6a6-f65dcce6db43-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 01 08:41:26 crc kubenswrapper[5004]: I1201 08:41:26.104843 5004 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e8d0df30-51e3-436d-a6a6-f65dcce6db43-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 01 08:41:26 crc kubenswrapper[5004]: I1201 08:41:26.104852 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vsjpx\" (UniqueName: \"kubernetes.io/projected/e8d0df30-51e3-436d-a6a6-f65dcce6db43-kube-api-access-vsjpx\") on node 
\"crc\" DevicePath \"\"" Dec 01 08:41:26 crc kubenswrapper[5004]: I1201 08:41:26.104861 5004 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e8d0df30-51e3-436d-a6a6-f65dcce6db43-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 01 08:41:26 crc kubenswrapper[5004]: I1201 08:41:26.107707 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8d0df30-51e3-436d-a6a6-f65dcce6db43-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e8d0df30-51e3-436d-a6a6-f65dcce6db43" (UID: "e8d0df30-51e3-436d-a6a6-f65dcce6db43"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:41:26 crc kubenswrapper[5004]: I1201 08:41:26.152980 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8d0df30-51e3-436d-a6a6-f65dcce6db43-config-data" (OuterVolumeSpecName: "config-data") pod "e8d0df30-51e3-436d-a6a6-f65dcce6db43" (UID: "e8d0df30-51e3-436d-a6a6-f65dcce6db43"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:41:26 crc kubenswrapper[5004]: I1201 08:41:26.207295 5004 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8d0df30-51e3-436d-a6a6-f65dcce6db43-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 08:41:26 crc kubenswrapper[5004]: I1201 08:41:26.207331 5004 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8d0df30-51e3-436d-a6a6-f65dcce6db43-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 08:41:26 crc kubenswrapper[5004]: I1201 08:41:26.320799 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e8d0df30-51e3-436d-a6a6-f65dcce6db43","Type":"ContainerDied","Data":"8695336a6b0b5bfaa2d1058a1cfcaece8b30cb43b5ab4b855214dfc396ec240e"} Dec 01 08:41:26 crc kubenswrapper[5004]: I1201 08:41:26.320852 5004 scope.go:117] "RemoveContainer" containerID="96cd9079bacc653ba543eec29b93edca4bb82d19e66b908eec9ef5edfcdbf843" Dec 01 08:41:26 crc kubenswrapper[5004]: I1201 08:41:26.320885 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 01 08:41:26 crc kubenswrapper[5004]: I1201 08:41:26.362201 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 08:41:26 crc kubenswrapper[5004]: I1201 08:41:26.363197 5004 scope.go:117] "RemoveContainer" containerID="db3c9c4681c539e98b6da1a0b5264aa96c8db05e0e318d8711e90f6dd314a2e2" Dec 01 08:41:26 crc kubenswrapper[5004]: I1201 08:41:26.373212 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 01 08:41:26 crc kubenswrapper[5004]: I1201 08:41:26.389507 5004 scope.go:117] "RemoveContainer" containerID="8233abab7381b4b33e414d6dd7c421f095c3af601e8a4f09e9d238fd41d311b6" Dec 01 08:41:26 crc kubenswrapper[5004]: I1201 08:41:26.389658 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 01 08:41:26 crc kubenswrapper[5004]: E1201 08:41:26.390648 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8d0df30-51e3-436d-a6a6-f65dcce6db43" containerName="proxy-httpd" Dec 01 08:41:26 crc kubenswrapper[5004]: I1201 08:41:26.390676 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8d0df30-51e3-436d-a6a6-f65dcce6db43" containerName="proxy-httpd" Dec 01 08:41:26 crc kubenswrapper[5004]: E1201 08:41:26.390717 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8d0df30-51e3-436d-a6a6-f65dcce6db43" containerName="sg-core" Dec 01 08:41:26 crc kubenswrapper[5004]: I1201 08:41:26.390726 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8d0df30-51e3-436d-a6a6-f65dcce6db43" containerName="sg-core" Dec 01 08:41:26 crc kubenswrapper[5004]: E1201 08:41:26.390740 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8d0df30-51e3-436d-a6a6-f65dcce6db43" containerName="ceilometer-notification-agent" Dec 01 08:41:26 crc kubenswrapper[5004]: I1201 08:41:26.390748 5004 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="e8d0df30-51e3-436d-a6a6-f65dcce6db43" containerName="ceilometer-notification-agent" Dec 01 08:41:26 crc kubenswrapper[5004]: E1201 08:41:26.390783 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8d0df30-51e3-436d-a6a6-f65dcce6db43" containerName="ceilometer-central-agent" Dec 01 08:41:26 crc kubenswrapper[5004]: I1201 08:41:26.390793 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8d0df30-51e3-436d-a6a6-f65dcce6db43" containerName="ceilometer-central-agent" Dec 01 08:41:26 crc kubenswrapper[5004]: I1201 08:41:26.391099 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8d0df30-51e3-436d-a6a6-f65dcce6db43" containerName="ceilometer-notification-agent" Dec 01 08:41:26 crc kubenswrapper[5004]: I1201 08:41:26.391123 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8d0df30-51e3-436d-a6a6-f65dcce6db43" containerName="sg-core" Dec 01 08:41:26 crc kubenswrapper[5004]: I1201 08:41:26.391141 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8d0df30-51e3-436d-a6a6-f65dcce6db43" containerName="proxy-httpd" Dec 01 08:41:26 crc kubenswrapper[5004]: I1201 08:41:26.391155 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8d0df30-51e3-436d-a6a6-f65dcce6db43" containerName="ceilometer-central-agent" Dec 01 08:41:26 crc kubenswrapper[5004]: I1201 08:41:26.393888 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 01 08:41:26 crc kubenswrapper[5004]: I1201 08:41:26.398895 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 01 08:41:26 crc kubenswrapper[5004]: I1201 08:41:26.399728 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 01 08:41:26 crc kubenswrapper[5004]: I1201 08:41:26.424291 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 08:41:26 crc kubenswrapper[5004]: I1201 08:41:26.434470 5004 scope.go:117] "RemoveContainer" containerID="50d0650801191286e1e076cbe935cf885747cedc8d93d1ada5eefd63f626c140" Dec 01 08:41:26 crc kubenswrapper[5004]: I1201 08:41:26.512791 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/39fa01cf-5ba6-4a0a-8240-2b2eba0decf6-run-httpd\") pod \"ceilometer-0\" (UID: \"39fa01cf-5ba6-4a0a-8240-2b2eba0decf6\") " pod="openstack/ceilometer-0" Dec 01 08:41:26 crc kubenswrapper[5004]: I1201 08:41:26.512863 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/39fa01cf-5ba6-4a0a-8240-2b2eba0decf6-log-httpd\") pod \"ceilometer-0\" (UID: \"39fa01cf-5ba6-4a0a-8240-2b2eba0decf6\") " pod="openstack/ceilometer-0" Dec 01 08:41:26 crc kubenswrapper[5004]: I1201 08:41:26.512887 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2cbpw\" (UniqueName: \"kubernetes.io/projected/39fa01cf-5ba6-4a0a-8240-2b2eba0decf6-kube-api-access-2cbpw\") pod \"ceilometer-0\" (UID: \"39fa01cf-5ba6-4a0a-8240-2b2eba0decf6\") " pod="openstack/ceilometer-0" Dec 01 08:41:26 crc kubenswrapper[5004]: I1201 08:41:26.512947 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39fa01cf-5ba6-4a0a-8240-2b2eba0decf6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"39fa01cf-5ba6-4a0a-8240-2b2eba0decf6\") " pod="openstack/ceilometer-0" Dec 01 08:41:26 crc kubenswrapper[5004]: I1201 08:41:26.512977 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/39fa01cf-5ba6-4a0a-8240-2b2eba0decf6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"39fa01cf-5ba6-4a0a-8240-2b2eba0decf6\") " pod="openstack/ceilometer-0" Dec 01 08:41:26 crc kubenswrapper[5004]: I1201 08:41:26.513016 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39fa01cf-5ba6-4a0a-8240-2b2eba0decf6-config-data\") pod \"ceilometer-0\" (UID: \"39fa01cf-5ba6-4a0a-8240-2b2eba0decf6\") " pod="openstack/ceilometer-0" Dec 01 08:41:26 crc kubenswrapper[5004]: I1201 08:41:26.513117 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/39fa01cf-5ba6-4a0a-8240-2b2eba0decf6-scripts\") pod \"ceilometer-0\" (UID: \"39fa01cf-5ba6-4a0a-8240-2b2eba0decf6\") " pod="openstack/ceilometer-0" Dec 01 08:41:26 crc kubenswrapper[5004]: I1201 08:41:26.616224 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/39fa01cf-5ba6-4a0a-8240-2b2eba0decf6-scripts\") pod \"ceilometer-0\" (UID: \"39fa01cf-5ba6-4a0a-8240-2b2eba0decf6\") " pod="openstack/ceilometer-0" Dec 01 08:41:26 crc kubenswrapper[5004]: I1201 08:41:26.616271 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/39fa01cf-5ba6-4a0a-8240-2b2eba0decf6-run-httpd\") pod \"ceilometer-0\" (UID: \"39fa01cf-5ba6-4a0a-8240-2b2eba0decf6\") " 
pod="openstack/ceilometer-0" Dec 01 08:41:26 crc kubenswrapper[5004]: I1201 08:41:26.616319 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/39fa01cf-5ba6-4a0a-8240-2b2eba0decf6-log-httpd\") pod \"ceilometer-0\" (UID: \"39fa01cf-5ba6-4a0a-8240-2b2eba0decf6\") " pod="openstack/ceilometer-0" Dec 01 08:41:26 crc kubenswrapper[5004]: I1201 08:41:26.616346 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2cbpw\" (UniqueName: \"kubernetes.io/projected/39fa01cf-5ba6-4a0a-8240-2b2eba0decf6-kube-api-access-2cbpw\") pod \"ceilometer-0\" (UID: \"39fa01cf-5ba6-4a0a-8240-2b2eba0decf6\") " pod="openstack/ceilometer-0" Dec 01 08:41:26 crc kubenswrapper[5004]: I1201 08:41:26.616419 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39fa01cf-5ba6-4a0a-8240-2b2eba0decf6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"39fa01cf-5ba6-4a0a-8240-2b2eba0decf6\") " pod="openstack/ceilometer-0" Dec 01 08:41:26 crc kubenswrapper[5004]: I1201 08:41:26.616461 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/39fa01cf-5ba6-4a0a-8240-2b2eba0decf6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"39fa01cf-5ba6-4a0a-8240-2b2eba0decf6\") " pod="openstack/ceilometer-0" Dec 01 08:41:26 crc kubenswrapper[5004]: I1201 08:41:26.616501 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39fa01cf-5ba6-4a0a-8240-2b2eba0decf6-config-data\") pod \"ceilometer-0\" (UID: \"39fa01cf-5ba6-4a0a-8240-2b2eba0decf6\") " pod="openstack/ceilometer-0" Dec 01 08:41:26 crc kubenswrapper[5004]: I1201 08:41:26.617640 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/39fa01cf-5ba6-4a0a-8240-2b2eba0decf6-log-httpd\") pod \"ceilometer-0\" (UID: \"39fa01cf-5ba6-4a0a-8240-2b2eba0decf6\") " pod="openstack/ceilometer-0" Dec 01 08:41:26 crc kubenswrapper[5004]: I1201 08:41:26.618102 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/39fa01cf-5ba6-4a0a-8240-2b2eba0decf6-run-httpd\") pod \"ceilometer-0\" (UID: \"39fa01cf-5ba6-4a0a-8240-2b2eba0decf6\") " pod="openstack/ceilometer-0" Dec 01 08:41:26 crc kubenswrapper[5004]: I1201 08:41:26.620831 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39fa01cf-5ba6-4a0a-8240-2b2eba0decf6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"39fa01cf-5ba6-4a0a-8240-2b2eba0decf6\") " pod="openstack/ceilometer-0" Dec 01 08:41:26 crc kubenswrapper[5004]: I1201 08:41:26.621105 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/39fa01cf-5ba6-4a0a-8240-2b2eba0decf6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"39fa01cf-5ba6-4a0a-8240-2b2eba0decf6\") " pod="openstack/ceilometer-0" Dec 01 08:41:26 crc kubenswrapper[5004]: I1201 08:41:26.621594 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/39fa01cf-5ba6-4a0a-8240-2b2eba0decf6-scripts\") pod \"ceilometer-0\" (UID: \"39fa01cf-5ba6-4a0a-8240-2b2eba0decf6\") " pod="openstack/ceilometer-0" Dec 01 08:41:26 crc kubenswrapper[5004]: I1201 08:41:26.621642 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39fa01cf-5ba6-4a0a-8240-2b2eba0decf6-config-data\") pod \"ceilometer-0\" (UID: \"39fa01cf-5ba6-4a0a-8240-2b2eba0decf6\") " pod="openstack/ceilometer-0" Dec 01 08:41:26 crc kubenswrapper[5004]: I1201 08:41:26.635601 5004 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-2cbpw\" (UniqueName: \"kubernetes.io/projected/39fa01cf-5ba6-4a0a-8240-2b2eba0decf6-kube-api-access-2cbpw\") pod \"ceilometer-0\" (UID: \"39fa01cf-5ba6-4a0a-8240-2b2eba0decf6\") " pod="openstack/ceilometer-0" Dec 01 08:41:26 crc kubenswrapper[5004]: I1201 08:41:26.721940 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 08:41:26 crc kubenswrapper[5004]: I1201 08:41:26.782360 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8d0df30-51e3-436d-a6a6-f65dcce6db43" path="/var/lib/kubelet/pods/e8d0df30-51e3-436d-a6a6-f65dcce6db43/volumes" Dec 01 08:41:29 crc kubenswrapper[5004]: I1201 08:41:29.572678 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 08:41:30 crc kubenswrapper[5004]: I1201 08:41:30.820014 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-7977b5fddc-g9fhv" Dec 01 08:41:30 crc kubenswrapper[5004]: I1201 08:41:30.825459 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-7977b5fddc-g9fhv" Dec 01 08:41:34 crc kubenswrapper[5004]: I1201 08:41:34.851405 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 08:41:35 crc kubenswrapper[5004]: I1201 08:41:35.503712 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"a4252410-0df2-4381-97da-772a062f3f8b","Type":"ContainerStarted","Data":"5eb5fe8f6c0af2543114d3f49f113082836bebbfa4f5e568fecbb6cea106b3e0"} Dec 01 08:41:35 crc kubenswrapper[5004]: I1201 08:41:35.505301 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"39fa01cf-5ba6-4a0a-8240-2b2eba0decf6","Type":"ContainerStarted","Data":"fea9b49602e197a763761422d6a8c36ad1af8c0dea50d4f727f197ea88646a2e"} Dec 01 08:41:35 crc kubenswrapper[5004]: I1201 
08:41:35.523421 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.248611231 podStartE2EDuration="14.523401431s" podCreationTimestamp="2025-12-01 08:41:21 +0000 UTC" firstStartedPulling="2025-12-01 08:41:22.278474004 +0000 UTC m=+1459.843465986" lastFinishedPulling="2025-12-01 08:41:34.553264204 +0000 UTC m=+1472.118256186" observedRunningTime="2025-12-01 08:41:35.520296285 +0000 UTC m=+1473.085288267" watchObservedRunningTime="2025-12-01 08:41:35.523401431 +0000 UTC m=+1473.088393413" Dec 01 08:41:36 crc kubenswrapper[5004]: I1201 08:41:36.517831 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"39fa01cf-5ba6-4a0a-8240-2b2eba0decf6","Type":"ContainerStarted","Data":"f85fe523c36185516b7a492fd599b5d479b15257da6a82e8a48292efd8203240"} Dec 01 08:41:37 crc kubenswrapper[5004]: I1201 08:41:37.469020 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-5d95bb6c5-cljg2"] Dec 01 08:41:37 crc kubenswrapper[5004]: I1201 08:41:37.470545 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-5d95bb6c5-cljg2" Dec 01 08:41:37 crc kubenswrapper[5004]: I1201 08:41:37.475475 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-d6ghb" Dec 01 08:41:37 crc kubenswrapper[5004]: I1201 08:41:37.475953 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-engine-config-data" Dec 01 08:41:37 crc kubenswrapper[5004]: I1201 08:41:37.476099 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Dec 01 08:41:37 crc kubenswrapper[5004]: I1201 08:41:37.498134 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-5d95bb6c5-cljg2"] Dec 01 08:41:37 crc kubenswrapper[5004]: I1201 08:41:37.604035 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4jhq\" (UniqueName: \"kubernetes.io/projected/3ecc657f-21cf-40d3-8f77-29a9f9c4f5f2-kube-api-access-m4jhq\") pod \"heat-engine-5d95bb6c5-cljg2\" (UID: \"3ecc657f-21cf-40d3-8f77-29a9f9c4f5f2\") " pod="openstack/heat-engine-5d95bb6c5-cljg2" Dec 01 08:41:37 crc kubenswrapper[5004]: I1201 08:41:37.604343 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ecc657f-21cf-40d3-8f77-29a9f9c4f5f2-combined-ca-bundle\") pod \"heat-engine-5d95bb6c5-cljg2\" (UID: \"3ecc657f-21cf-40d3-8f77-29a9f9c4f5f2\") " pod="openstack/heat-engine-5d95bb6c5-cljg2" Dec 01 08:41:37 crc kubenswrapper[5004]: I1201 08:41:37.605617 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3ecc657f-21cf-40d3-8f77-29a9f9c4f5f2-config-data-custom\") pod \"heat-engine-5d95bb6c5-cljg2\" (UID: \"3ecc657f-21cf-40d3-8f77-29a9f9c4f5f2\") " pod="openstack/heat-engine-5d95bb6c5-cljg2" Dec 01 08:41:37 crc 
kubenswrapper[5004]: I1201 08:41:37.605675 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ecc657f-21cf-40d3-8f77-29a9f9c4f5f2-config-data\") pod \"heat-engine-5d95bb6c5-cljg2\" (UID: \"3ecc657f-21cf-40d3-8f77-29a9f9c4f5f2\") " pod="openstack/heat-engine-5d95bb6c5-cljg2" Dec 01 08:41:37 crc kubenswrapper[5004]: I1201 08:41:37.616462 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"39fa01cf-5ba6-4a0a-8240-2b2eba0decf6","Type":"ContainerStarted","Data":"0d3638419e618bdc1512483089151d9c94e2ca2076f244ebbfb1eef43d956740"} Dec 01 08:41:37 crc kubenswrapper[5004]: I1201 08:41:37.672397 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7d978555f9-fpvfz"] Dec 01 08:41:37 crc kubenswrapper[5004]: I1201 08:41:37.678609 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d978555f9-fpvfz" Dec 01 08:41:37 crc kubenswrapper[5004]: I1201 08:41:37.692810 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-557748dddd-qnftd"] Dec 01 08:41:37 crc kubenswrapper[5004]: I1201 08:41:37.694927 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-557748dddd-qnftd" Dec 01 08:41:37 crc kubenswrapper[5004]: I1201 08:41:37.707969 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-cfnapi-config-data" Dec 01 08:41:37 crc kubenswrapper[5004]: I1201 08:41:37.709587 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4jhq\" (UniqueName: \"kubernetes.io/projected/3ecc657f-21cf-40d3-8f77-29a9f9c4f5f2-kube-api-access-m4jhq\") pod \"heat-engine-5d95bb6c5-cljg2\" (UID: \"3ecc657f-21cf-40d3-8f77-29a9f9c4f5f2\") " pod="openstack/heat-engine-5d95bb6c5-cljg2" Dec 01 08:41:37 crc kubenswrapper[5004]: I1201 08:41:37.709624 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ecc657f-21cf-40d3-8f77-29a9f9c4f5f2-combined-ca-bundle\") pod \"heat-engine-5d95bb6c5-cljg2\" (UID: \"3ecc657f-21cf-40d3-8f77-29a9f9c4f5f2\") " pod="openstack/heat-engine-5d95bb6c5-cljg2" Dec 01 08:41:37 crc kubenswrapper[5004]: I1201 08:41:37.709714 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3ecc657f-21cf-40d3-8f77-29a9f9c4f5f2-config-data-custom\") pod \"heat-engine-5d95bb6c5-cljg2\" (UID: \"3ecc657f-21cf-40d3-8f77-29a9f9c4f5f2\") " pod="openstack/heat-engine-5d95bb6c5-cljg2" Dec 01 08:41:37 crc kubenswrapper[5004]: I1201 08:41:37.709734 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ecc657f-21cf-40d3-8f77-29a9f9c4f5f2-config-data\") pod \"heat-engine-5d95bb6c5-cljg2\" (UID: \"3ecc657f-21cf-40d3-8f77-29a9f9c4f5f2\") " pod="openstack/heat-engine-5d95bb6c5-cljg2" Dec 01 08:41:37 crc kubenswrapper[5004]: I1201 08:41:37.715317 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/3ecc657f-21cf-40d3-8f77-29a9f9c4f5f2-config-data-custom\") pod \"heat-engine-5d95bb6c5-cljg2\" (UID: \"3ecc657f-21cf-40d3-8f77-29a9f9c4f5f2\") " pod="openstack/heat-engine-5d95bb6c5-cljg2" Dec 01 08:41:37 crc kubenswrapper[5004]: I1201 08:41:37.719922 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ecc657f-21cf-40d3-8f77-29a9f9c4f5f2-combined-ca-bundle\") pod \"heat-engine-5d95bb6c5-cljg2\" (UID: \"3ecc657f-21cf-40d3-8f77-29a9f9c4f5f2\") " pod="openstack/heat-engine-5d95bb6c5-cljg2" Dec 01 08:41:37 crc kubenswrapper[5004]: I1201 08:41:37.722671 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7d978555f9-fpvfz"] Dec 01 08:41:37 crc kubenswrapper[5004]: I1201 08:41:37.727978 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ecc657f-21cf-40d3-8f77-29a9f9c4f5f2-config-data\") pod \"heat-engine-5d95bb6c5-cljg2\" (UID: \"3ecc657f-21cf-40d3-8f77-29a9f9c4f5f2\") " pod="openstack/heat-engine-5d95bb6c5-cljg2" Dec 01 08:41:37 crc kubenswrapper[5004]: I1201 08:41:37.740082 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4jhq\" (UniqueName: \"kubernetes.io/projected/3ecc657f-21cf-40d3-8f77-29a9f9c4f5f2-kube-api-access-m4jhq\") pod \"heat-engine-5d95bb6c5-cljg2\" (UID: \"3ecc657f-21cf-40d3-8f77-29a9f9c4f5f2\") " pod="openstack/heat-engine-5d95bb6c5-cljg2" Dec 01 08:41:37 crc kubenswrapper[5004]: I1201 08:41:37.746625 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-557748dddd-qnftd"] Dec 01 08:41:37 crc kubenswrapper[5004]: I1201 08:41:37.772629 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-566b457cd5-4qc77"] Dec 01 08:41:37 crc kubenswrapper[5004]: I1201 08:41:37.779279 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-566b457cd5-4qc77" Dec 01 08:41:37 crc kubenswrapper[5004]: I1201 08:41:37.785915 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-api-config-data" Dec 01 08:41:37 crc kubenswrapper[5004]: I1201 08:41:37.788657 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-566b457cd5-4qc77"] Dec 01 08:41:37 crc kubenswrapper[5004]: I1201 08:41:37.811463 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05a450a4-acc5-4c46-ae41-7ff4de029a72-combined-ca-bundle\") pod \"heat-cfnapi-557748dddd-qnftd\" (UID: \"05a450a4-acc5-4c46-ae41-7ff4de029a72\") " pod="openstack/heat-cfnapi-557748dddd-qnftd" Dec 01 08:41:37 crc kubenswrapper[5004]: I1201 08:41:37.811530 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6d31297b-7a09-4806-b488-1c6e7ea17ab0-ovsdbserver-nb\") pod \"dnsmasq-dns-7d978555f9-fpvfz\" (UID: \"6d31297b-7a09-4806-b488-1c6e7ea17ab0\") " pod="openstack/dnsmasq-dns-7d978555f9-fpvfz" Dec 01 08:41:37 crc kubenswrapper[5004]: I1201 08:41:37.811784 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6d31297b-7a09-4806-b488-1c6e7ea17ab0-dns-svc\") pod \"dnsmasq-dns-7d978555f9-fpvfz\" (UID: \"6d31297b-7a09-4806-b488-1c6e7ea17ab0\") " pod="openstack/dnsmasq-dns-7d978555f9-fpvfz" Dec 01 08:41:37 crc kubenswrapper[5004]: I1201 08:41:37.811826 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mj89\" (UniqueName: \"kubernetes.io/projected/6d31297b-7a09-4806-b488-1c6e7ea17ab0-kube-api-access-6mj89\") pod \"dnsmasq-dns-7d978555f9-fpvfz\" (UID: \"6d31297b-7a09-4806-b488-1c6e7ea17ab0\") " 
pod="openstack/dnsmasq-dns-7d978555f9-fpvfz" Dec 01 08:41:37 crc kubenswrapper[5004]: I1201 08:41:37.811889 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d31297b-7a09-4806-b488-1c6e7ea17ab0-config\") pod \"dnsmasq-dns-7d978555f9-fpvfz\" (UID: \"6d31297b-7a09-4806-b488-1c6e7ea17ab0\") " pod="openstack/dnsmasq-dns-7d978555f9-fpvfz" Dec 01 08:41:37 crc kubenswrapper[5004]: I1201 08:41:37.811984 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05a450a4-acc5-4c46-ae41-7ff4de029a72-config-data\") pod \"heat-cfnapi-557748dddd-qnftd\" (UID: \"05a450a4-acc5-4c46-ae41-7ff4de029a72\") " pod="openstack/heat-cfnapi-557748dddd-qnftd" Dec 01 08:41:37 crc kubenswrapper[5004]: I1201 08:41:37.812077 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/05a450a4-acc5-4c46-ae41-7ff4de029a72-config-data-custom\") pod \"heat-cfnapi-557748dddd-qnftd\" (UID: \"05a450a4-acc5-4c46-ae41-7ff4de029a72\") " pod="openstack/heat-cfnapi-557748dddd-qnftd" Dec 01 08:41:37 crc kubenswrapper[5004]: I1201 08:41:37.812153 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6d31297b-7a09-4806-b488-1c6e7ea17ab0-ovsdbserver-sb\") pod \"dnsmasq-dns-7d978555f9-fpvfz\" (UID: \"6d31297b-7a09-4806-b488-1c6e7ea17ab0\") " pod="openstack/dnsmasq-dns-7d978555f9-fpvfz" Dec 01 08:41:37 crc kubenswrapper[5004]: I1201 08:41:37.812279 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6d31297b-7a09-4806-b488-1c6e7ea17ab0-dns-swift-storage-0\") pod \"dnsmasq-dns-7d978555f9-fpvfz\" (UID: 
\"6d31297b-7a09-4806-b488-1c6e7ea17ab0\") " pod="openstack/dnsmasq-dns-7d978555f9-fpvfz" Dec 01 08:41:37 crc kubenswrapper[5004]: I1201 08:41:37.812311 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9zgw\" (UniqueName: \"kubernetes.io/projected/05a450a4-acc5-4c46-ae41-7ff4de029a72-kube-api-access-j9zgw\") pod \"heat-cfnapi-557748dddd-qnftd\" (UID: \"05a450a4-acc5-4c46-ae41-7ff4de029a72\") " pod="openstack/heat-cfnapi-557748dddd-qnftd" Dec 01 08:41:37 crc kubenswrapper[5004]: I1201 08:41:37.873437 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-5d95bb6c5-cljg2" Dec 01 08:41:37 crc kubenswrapper[5004]: I1201 08:41:37.914011 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/972d0223-e732-4bd0-99e3-dccb174a2511-config-data-custom\") pod \"heat-api-566b457cd5-4qc77\" (UID: \"972d0223-e732-4bd0-99e3-dccb174a2511\") " pod="openstack/heat-api-566b457cd5-4qc77" Dec 01 08:41:37 crc kubenswrapper[5004]: I1201 08:41:37.914285 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05a450a4-acc5-4c46-ae41-7ff4de029a72-config-data\") pod \"heat-cfnapi-557748dddd-qnftd\" (UID: \"05a450a4-acc5-4c46-ae41-7ff4de029a72\") " pod="openstack/heat-cfnapi-557748dddd-qnftd" Dec 01 08:41:37 crc kubenswrapper[5004]: I1201 08:41:37.914333 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/05a450a4-acc5-4c46-ae41-7ff4de029a72-config-data-custom\") pod \"heat-cfnapi-557748dddd-qnftd\" (UID: \"05a450a4-acc5-4c46-ae41-7ff4de029a72\") " pod="openstack/heat-cfnapi-557748dddd-qnftd" Dec 01 08:41:37 crc kubenswrapper[5004]: I1201 08:41:37.914349 5004 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/972d0223-e732-4bd0-99e3-dccb174a2511-combined-ca-bundle\") pod \"heat-api-566b457cd5-4qc77\" (UID: \"972d0223-e732-4bd0-99e3-dccb174a2511\") " pod="openstack/heat-api-566b457cd5-4qc77" Dec 01 08:41:37 crc kubenswrapper[5004]: I1201 08:41:37.914369 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bms4s\" (UniqueName: \"kubernetes.io/projected/972d0223-e732-4bd0-99e3-dccb174a2511-kube-api-access-bms4s\") pod \"heat-api-566b457cd5-4qc77\" (UID: \"972d0223-e732-4bd0-99e3-dccb174a2511\") " pod="openstack/heat-api-566b457cd5-4qc77" Dec 01 08:41:37 crc kubenswrapper[5004]: I1201 08:41:37.914411 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6d31297b-7a09-4806-b488-1c6e7ea17ab0-ovsdbserver-sb\") pod \"dnsmasq-dns-7d978555f9-fpvfz\" (UID: \"6d31297b-7a09-4806-b488-1c6e7ea17ab0\") " pod="openstack/dnsmasq-dns-7d978555f9-fpvfz" Dec 01 08:41:37 crc kubenswrapper[5004]: I1201 08:41:37.914465 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6d31297b-7a09-4806-b488-1c6e7ea17ab0-dns-swift-storage-0\") pod \"dnsmasq-dns-7d978555f9-fpvfz\" (UID: \"6d31297b-7a09-4806-b488-1c6e7ea17ab0\") " pod="openstack/dnsmasq-dns-7d978555f9-fpvfz" Dec 01 08:41:37 crc kubenswrapper[5004]: I1201 08:41:37.914486 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j9zgw\" (UniqueName: \"kubernetes.io/projected/05a450a4-acc5-4c46-ae41-7ff4de029a72-kube-api-access-j9zgw\") pod \"heat-cfnapi-557748dddd-qnftd\" (UID: \"05a450a4-acc5-4c46-ae41-7ff4de029a72\") " pod="openstack/heat-cfnapi-557748dddd-qnftd" Dec 01 08:41:37 crc kubenswrapper[5004]: I1201 08:41:37.914526 5004 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05a450a4-acc5-4c46-ae41-7ff4de029a72-combined-ca-bundle\") pod \"heat-cfnapi-557748dddd-qnftd\" (UID: \"05a450a4-acc5-4c46-ae41-7ff4de029a72\") " pod="openstack/heat-cfnapi-557748dddd-qnftd" Dec 01 08:41:37 crc kubenswrapper[5004]: I1201 08:41:37.914575 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6d31297b-7a09-4806-b488-1c6e7ea17ab0-ovsdbserver-nb\") pod \"dnsmasq-dns-7d978555f9-fpvfz\" (UID: \"6d31297b-7a09-4806-b488-1c6e7ea17ab0\") " pod="openstack/dnsmasq-dns-7d978555f9-fpvfz" Dec 01 08:41:37 crc kubenswrapper[5004]: I1201 08:41:37.914592 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/972d0223-e732-4bd0-99e3-dccb174a2511-config-data\") pod \"heat-api-566b457cd5-4qc77\" (UID: \"972d0223-e732-4bd0-99e3-dccb174a2511\") " pod="openstack/heat-api-566b457cd5-4qc77" Dec 01 08:41:37 crc kubenswrapper[5004]: I1201 08:41:37.914651 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6d31297b-7a09-4806-b488-1c6e7ea17ab0-dns-svc\") pod \"dnsmasq-dns-7d978555f9-fpvfz\" (UID: \"6d31297b-7a09-4806-b488-1c6e7ea17ab0\") " pod="openstack/dnsmasq-dns-7d978555f9-fpvfz" Dec 01 08:41:37 crc kubenswrapper[5004]: I1201 08:41:37.914672 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mj89\" (UniqueName: \"kubernetes.io/projected/6d31297b-7a09-4806-b488-1c6e7ea17ab0-kube-api-access-6mj89\") pod \"dnsmasq-dns-7d978555f9-fpvfz\" (UID: \"6d31297b-7a09-4806-b488-1c6e7ea17ab0\") " pod="openstack/dnsmasq-dns-7d978555f9-fpvfz" Dec 01 08:41:37 crc kubenswrapper[5004]: I1201 08:41:37.914703 5004 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d31297b-7a09-4806-b488-1c6e7ea17ab0-config\") pod \"dnsmasq-dns-7d978555f9-fpvfz\" (UID: \"6d31297b-7a09-4806-b488-1c6e7ea17ab0\") " pod="openstack/dnsmasq-dns-7d978555f9-fpvfz" Dec 01 08:41:37 crc kubenswrapper[5004]: I1201 08:41:37.916460 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6d31297b-7a09-4806-b488-1c6e7ea17ab0-dns-svc\") pod \"dnsmasq-dns-7d978555f9-fpvfz\" (UID: \"6d31297b-7a09-4806-b488-1c6e7ea17ab0\") " pod="openstack/dnsmasq-dns-7d978555f9-fpvfz" Dec 01 08:41:37 crc kubenswrapper[5004]: I1201 08:41:37.917111 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6d31297b-7a09-4806-b488-1c6e7ea17ab0-ovsdbserver-sb\") pod \"dnsmasq-dns-7d978555f9-fpvfz\" (UID: \"6d31297b-7a09-4806-b488-1c6e7ea17ab0\") " pod="openstack/dnsmasq-dns-7d978555f9-fpvfz" Dec 01 08:41:37 crc kubenswrapper[5004]: I1201 08:41:37.917885 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6d31297b-7a09-4806-b488-1c6e7ea17ab0-dns-swift-storage-0\") pod \"dnsmasq-dns-7d978555f9-fpvfz\" (UID: \"6d31297b-7a09-4806-b488-1c6e7ea17ab0\") " pod="openstack/dnsmasq-dns-7d978555f9-fpvfz" Dec 01 08:41:37 crc kubenswrapper[5004]: I1201 08:41:37.918934 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d31297b-7a09-4806-b488-1c6e7ea17ab0-config\") pod \"dnsmasq-dns-7d978555f9-fpvfz\" (UID: \"6d31297b-7a09-4806-b488-1c6e7ea17ab0\") " pod="openstack/dnsmasq-dns-7d978555f9-fpvfz" Dec 01 08:41:37 crc kubenswrapper[5004]: I1201 08:41:37.919541 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/05a450a4-acc5-4c46-ae41-7ff4de029a72-config-data\") pod \"heat-cfnapi-557748dddd-qnftd\" (UID: \"05a450a4-acc5-4c46-ae41-7ff4de029a72\") " pod="openstack/heat-cfnapi-557748dddd-qnftd" Dec 01 08:41:37 crc kubenswrapper[5004]: I1201 08:41:37.919811 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05a450a4-acc5-4c46-ae41-7ff4de029a72-combined-ca-bundle\") pod \"heat-cfnapi-557748dddd-qnftd\" (UID: \"05a450a4-acc5-4c46-ae41-7ff4de029a72\") " pod="openstack/heat-cfnapi-557748dddd-qnftd" Dec 01 08:41:37 crc kubenswrapper[5004]: I1201 08:41:37.920621 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/05a450a4-acc5-4c46-ae41-7ff4de029a72-config-data-custom\") pod \"heat-cfnapi-557748dddd-qnftd\" (UID: \"05a450a4-acc5-4c46-ae41-7ff4de029a72\") " pod="openstack/heat-cfnapi-557748dddd-qnftd" Dec 01 08:41:37 crc kubenswrapper[5004]: I1201 08:41:37.923948 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6d31297b-7a09-4806-b488-1c6e7ea17ab0-ovsdbserver-nb\") pod \"dnsmasq-dns-7d978555f9-fpvfz\" (UID: \"6d31297b-7a09-4806-b488-1c6e7ea17ab0\") " pod="openstack/dnsmasq-dns-7d978555f9-fpvfz" Dec 01 08:41:37 crc kubenswrapper[5004]: I1201 08:41:37.932376 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mj89\" (UniqueName: \"kubernetes.io/projected/6d31297b-7a09-4806-b488-1c6e7ea17ab0-kube-api-access-6mj89\") pod \"dnsmasq-dns-7d978555f9-fpvfz\" (UID: \"6d31297b-7a09-4806-b488-1c6e7ea17ab0\") " pod="openstack/dnsmasq-dns-7d978555f9-fpvfz" Dec 01 08:41:37 crc kubenswrapper[5004]: I1201 08:41:37.933330 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9zgw\" (UniqueName: 
\"kubernetes.io/projected/05a450a4-acc5-4c46-ae41-7ff4de029a72-kube-api-access-j9zgw\") pod \"heat-cfnapi-557748dddd-qnftd\" (UID: \"05a450a4-acc5-4c46-ae41-7ff4de029a72\") " pod="openstack/heat-cfnapi-557748dddd-qnftd" Dec 01 08:41:38 crc kubenswrapper[5004]: I1201 08:41:38.024235 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/972d0223-e732-4bd0-99e3-dccb174a2511-config-data-custom\") pod \"heat-api-566b457cd5-4qc77\" (UID: \"972d0223-e732-4bd0-99e3-dccb174a2511\") " pod="openstack/heat-api-566b457cd5-4qc77" Dec 01 08:41:38 crc kubenswrapper[5004]: I1201 08:41:38.024305 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/972d0223-e732-4bd0-99e3-dccb174a2511-combined-ca-bundle\") pod \"heat-api-566b457cd5-4qc77\" (UID: \"972d0223-e732-4bd0-99e3-dccb174a2511\") " pod="openstack/heat-api-566b457cd5-4qc77" Dec 01 08:41:38 crc kubenswrapper[5004]: I1201 08:41:38.024323 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bms4s\" (UniqueName: \"kubernetes.io/projected/972d0223-e732-4bd0-99e3-dccb174a2511-kube-api-access-bms4s\") pod \"heat-api-566b457cd5-4qc77\" (UID: \"972d0223-e732-4bd0-99e3-dccb174a2511\") " pod="openstack/heat-api-566b457cd5-4qc77" Dec 01 08:41:38 crc kubenswrapper[5004]: I1201 08:41:38.024456 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/972d0223-e732-4bd0-99e3-dccb174a2511-config-data\") pod \"heat-api-566b457cd5-4qc77\" (UID: \"972d0223-e732-4bd0-99e3-dccb174a2511\") " pod="openstack/heat-api-566b457cd5-4qc77" Dec 01 08:41:38 crc kubenswrapper[5004]: I1201 08:41:38.031987 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/972d0223-e732-4bd0-99e3-dccb174a2511-config-data\") pod \"heat-api-566b457cd5-4qc77\" (UID: \"972d0223-e732-4bd0-99e3-dccb174a2511\") " pod="openstack/heat-api-566b457cd5-4qc77" Dec 01 08:41:38 crc kubenswrapper[5004]: I1201 08:41:38.032335 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/972d0223-e732-4bd0-99e3-dccb174a2511-combined-ca-bundle\") pod \"heat-api-566b457cd5-4qc77\" (UID: \"972d0223-e732-4bd0-99e3-dccb174a2511\") " pod="openstack/heat-api-566b457cd5-4qc77" Dec 01 08:41:38 crc kubenswrapper[5004]: I1201 08:41:38.038185 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/972d0223-e732-4bd0-99e3-dccb174a2511-config-data-custom\") pod \"heat-api-566b457cd5-4qc77\" (UID: \"972d0223-e732-4bd0-99e3-dccb174a2511\") " pod="openstack/heat-api-566b457cd5-4qc77" Dec 01 08:41:38 crc kubenswrapper[5004]: I1201 08:41:38.055532 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bms4s\" (UniqueName: \"kubernetes.io/projected/972d0223-e732-4bd0-99e3-dccb174a2511-kube-api-access-bms4s\") pod \"heat-api-566b457cd5-4qc77\" (UID: \"972d0223-e732-4bd0-99e3-dccb174a2511\") " pod="openstack/heat-api-566b457cd5-4qc77" Dec 01 08:41:38 crc kubenswrapper[5004]: I1201 08:41:38.142994 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d978555f9-fpvfz" Dec 01 08:41:38 crc kubenswrapper[5004]: I1201 08:41:38.159949 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-557748dddd-qnftd" Dec 01 08:41:38 crc kubenswrapper[5004]: I1201 08:41:38.173455 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-566b457cd5-4qc77" Dec 01 08:41:38 crc kubenswrapper[5004]: I1201 08:41:38.415313 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-5d95bb6c5-cljg2"] Dec 01 08:41:38 crc kubenswrapper[5004]: I1201 08:41:38.628158 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-5d95bb6c5-cljg2" event={"ID":"3ecc657f-21cf-40d3-8f77-29a9f9c4f5f2","Type":"ContainerStarted","Data":"e7c1e8a89251856a63c98529f409523d5cc3fe18ece5baa926844fd4b60da735"} Dec 01 08:41:38 crc kubenswrapper[5004]: I1201 08:41:38.632472 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"39fa01cf-5ba6-4a0a-8240-2b2eba0decf6","Type":"ContainerStarted","Data":"87be489943345253cd96375601b91a7ebcde30a3d80175c025c2d064b4578e0a"} Dec 01 08:41:38 crc kubenswrapper[5004]: I1201 08:41:38.844903 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-557748dddd-qnftd"] Dec 01 08:41:38 crc kubenswrapper[5004]: W1201 08:41:38.848814 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod05a450a4_acc5_4c46_ae41_7ff4de029a72.slice/crio-ad6122e54d7043f7e6a93267e3de71ebc76b42290e050a1104dbe43ba2f56b3d WatchSource:0}: Error finding container ad6122e54d7043f7e6a93267e3de71ebc76b42290e050a1104dbe43ba2f56b3d: Status 404 returned error can't find the container with id ad6122e54d7043f7e6a93267e3de71ebc76b42290e050a1104dbe43ba2f56b3d Dec 01 08:41:38 crc kubenswrapper[5004]: I1201 08:41:38.857702 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7d978555f9-fpvfz"] Dec 01 08:41:39 crc kubenswrapper[5004]: I1201 08:41:39.032978 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-566b457cd5-4qc77"] Dec 01 08:41:39 crc kubenswrapper[5004]: W1201 08:41:39.039267 5004 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod972d0223_e732_4bd0_99e3_dccb174a2511.slice/crio-3bad11a123a22b0f885f6627327db42ee0267efc915fc48564cee3612260787e WatchSource:0}: Error finding container 3bad11a123a22b0f885f6627327db42ee0267efc915fc48564cee3612260787e: Status 404 returned error can't find the container with id 3bad11a123a22b0f885f6627327db42ee0267efc915fc48564cee3612260787e Dec 01 08:41:39 crc kubenswrapper[5004]: I1201 08:41:39.658690 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-557748dddd-qnftd" event={"ID":"05a450a4-acc5-4c46-ae41-7ff4de029a72","Type":"ContainerStarted","Data":"ad6122e54d7043f7e6a93267e3de71ebc76b42290e050a1104dbe43ba2f56b3d"} Dec 01 08:41:39 crc kubenswrapper[5004]: I1201 08:41:39.673957 5004 generic.go:334] "Generic (PLEG): container finished" podID="6d31297b-7a09-4806-b488-1c6e7ea17ab0" containerID="4e026f98251c490003dc4784baabd4babacba50787c6e7159e604e1bea8ba9a2" exitCode=0 Dec 01 08:41:39 crc kubenswrapper[5004]: I1201 08:41:39.674041 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d978555f9-fpvfz" event={"ID":"6d31297b-7a09-4806-b488-1c6e7ea17ab0","Type":"ContainerDied","Data":"4e026f98251c490003dc4784baabd4babacba50787c6e7159e604e1bea8ba9a2"} Dec 01 08:41:39 crc kubenswrapper[5004]: I1201 08:41:39.674069 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d978555f9-fpvfz" event={"ID":"6d31297b-7a09-4806-b488-1c6e7ea17ab0","Type":"ContainerStarted","Data":"fb3a125a6d540161eefbcb6bffdfbe144352dd3650ec23d544e6eecf4d45b36e"} Dec 01 08:41:39 crc kubenswrapper[5004]: I1201 08:41:39.685057 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-566b457cd5-4qc77" event={"ID":"972d0223-e732-4bd0-99e3-dccb174a2511","Type":"ContainerStarted","Data":"3bad11a123a22b0f885f6627327db42ee0267efc915fc48564cee3612260787e"} Dec 01 08:41:39 crc kubenswrapper[5004]: I1201 08:41:39.688398 
5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"39fa01cf-5ba6-4a0a-8240-2b2eba0decf6","Type":"ContainerStarted","Data":"837002c0fb6c00f56e148db9ac7b49b19a3039824aa05dbd99d9de35d418e360"} Dec 01 08:41:39 crc kubenswrapper[5004]: I1201 08:41:39.688602 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="39fa01cf-5ba6-4a0a-8240-2b2eba0decf6" containerName="ceilometer-central-agent" containerID="cri-o://f85fe523c36185516b7a492fd599b5d479b15257da6a82e8a48292efd8203240" gracePeriod=30 Dec 01 08:41:39 crc kubenswrapper[5004]: I1201 08:41:39.688684 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="39fa01cf-5ba6-4a0a-8240-2b2eba0decf6" containerName="proxy-httpd" containerID="cri-o://837002c0fb6c00f56e148db9ac7b49b19a3039824aa05dbd99d9de35d418e360" gracePeriod=30 Dec 01 08:41:39 crc kubenswrapper[5004]: I1201 08:41:39.688712 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 01 08:41:39 crc kubenswrapper[5004]: I1201 08:41:39.688736 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="39fa01cf-5ba6-4a0a-8240-2b2eba0decf6" containerName="sg-core" containerID="cri-o://87be489943345253cd96375601b91a7ebcde30a3d80175c025c2d064b4578e0a" gracePeriod=30 Dec 01 08:41:39 crc kubenswrapper[5004]: I1201 08:41:39.688779 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="39fa01cf-5ba6-4a0a-8240-2b2eba0decf6" containerName="ceilometer-notification-agent" containerID="cri-o://0d3638419e618bdc1512483089151d9c94e2ca2076f244ebbfb1eef43d956740" gracePeriod=30 Dec 01 08:41:39 crc kubenswrapper[5004]: I1201 08:41:39.721743 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-5d95bb6c5-cljg2" 
event={"ID":"3ecc657f-21cf-40d3-8f77-29a9f9c4f5f2","Type":"ContainerStarted","Data":"347b335225f2c0e10e2c63c058f0135ca01ba3b404f9d2161dd6a7d8bc0f56e9"} Dec 01 08:41:39 crc kubenswrapper[5004]: I1201 08:41:39.722671 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-5d95bb6c5-cljg2" Dec 01 08:41:39 crc kubenswrapper[5004]: I1201 08:41:39.793370 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=9.572832938 podStartE2EDuration="13.79334814s" podCreationTimestamp="2025-12-01 08:41:26 +0000 UTC" firstStartedPulling="2025-12-01 08:41:34.880346793 +0000 UTC m=+1472.445338775" lastFinishedPulling="2025-12-01 08:41:39.100861995 +0000 UTC m=+1476.665853977" observedRunningTime="2025-12-01 08:41:39.754384318 +0000 UTC m=+1477.319376300" watchObservedRunningTime="2025-12-01 08:41:39.79334814 +0000 UTC m=+1477.358340122" Dec 01 08:41:39 crc kubenswrapper[5004]: I1201 08:41:39.910515 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-5d95bb6c5-cljg2" podStartSLOduration=2.910494461 podStartE2EDuration="2.910494461s" podCreationTimestamp="2025-12-01 08:41:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:41:39.827682439 +0000 UTC m=+1477.392674421" watchObservedRunningTime="2025-12-01 08:41:39.910494461 +0000 UTC m=+1477.475486443" Dec 01 08:41:40 crc kubenswrapper[5004]: I1201 08:41:40.738654 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d978555f9-fpvfz" event={"ID":"6d31297b-7a09-4806-b488-1c6e7ea17ab0","Type":"ContainerStarted","Data":"0b81a733c8f514f09eb04912b4c63f52c6900f05c44b4ef3f29bffb86206f57b"} Dec 01 08:41:40 crc kubenswrapper[5004]: I1201 08:41:40.739249 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7d978555f9-fpvfz" 
Dec 01 08:41:40 crc kubenswrapper[5004]: I1201 08:41:40.743001 5004 generic.go:334] "Generic (PLEG): container finished" podID="39fa01cf-5ba6-4a0a-8240-2b2eba0decf6" containerID="837002c0fb6c00f56e148db9ac7b49b19a3039824aa05dbd99d9de35d418e360" exitCode=0 Dec 01 08:41:40 crc kubenswrapper[5004]: I1201 08:41:40.743029 5004 generic.go:334] "Generic (PLEG): container finished" podID="39fa01cf-5ba6-4a0a-8240-2b2eba0decf6" containerID="87be489943345253cd96375601b91a7ebcde30a3d80175c025c2d064b4578e0a" exitCode=2 Dec 01 08:41:40 crc kubenswrapper[5004]: I1201 08:41:40.743036 5004 generic.go:334] "Generic (PLEG): container finished" podID="39fa01cf-5ba6-4a0a-8240-2b2eba0decf6" containerID="0d3638419e618bdc1512483089151d9c94e2ca2076f244ebbfb1eef43d956740" exitCode=0 Dec 01 08:41:40 crc kubenswrapper[5004]: I1201 08:41:40.743816 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"39fa01cf-5ba6-4a0a-8240-2b2eba0decf6","Type":"ContainerDied","Data":"837002c0fb6c00f56e148db9ac7b49b19a3039824aa05dbd99d9de35d418e360"} Dec 01 08:41:40 crc kubenswrapper[5004]: I1201 08:41:40.743856 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"39fa01cf-5ba6-4a0a-8240-2b2eba0decf6","Type":"ContainerDied","Data":"87be489943345253cd96375601b91a7ebcde30a3d80175c025c2d064b4578e0a"} Dec 01 08:41:40 crc kubenswrapper[5004]: I1201 08:41:40.743874 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"39fa01cf-5ba6-4a0a-8240-2b2eba0decf6","Type":"ContainerDied","Data":"0d3638419e618bdc1512483089151d9c94e2ca2076f244ebbfb1eef43d956740"} Dec 01 08:41:40 crc kubenswrapper[5004]: I1201 08:41:40.766728 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7d978555f9-fpvfz" podStartSLOduration=3.766702166 podStartE2EDuration="3.766702166s" podCreationTimestamp="2025-12-01 08:41:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:41:40.75581838 +0000 UTC m=+1478.320810362" watchObservedRunningTime="2025-12-01 08:41:40.766702166 +0000 UTC m=+1478.331694168" Dec 01 08:41:42 crc kubenswrapper[5004]: I1201 08:41:42.777402 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-557748dddd-qnftd" event={"ID":"05a450a4-acc5-4c46-ae41-7ff4de029a72","Type":"ContainerStarted","Data":"abb2a140cbc457456e229bbda184c16b30744ce9421ebda600e9f106c0c18db0"} Dec 01 08:41:42 crc kubenswrapper[5004]: I1201 08:41:42.778155 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-557748dddd-qnftd" Dec 01 08:41:42 crc kubenswrapper[5004]: I1201 08:41:42.783104 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-566b457cd5-4qc77" event={"ID":"972d0223-e732-4bd0-99e3-dccb174a2511","Type":"ContainerStarted","Data":"87467b86401908b1737b6fcf86a15faf6b206ff876c1b0009a4062b3860d1475"} Dec 01 08:41:42 crc kubenswrapper[5004]: I1201 08:41:42.783250 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-566b457cd5-4qc77" Dec 01 08:41:42 crc kubenswrapper[5004]: I1201 08:41:42.825501 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-557748dddd-qnftd" podStartSLOduration=2.7044198980000003 podStartE2EDuration="5.825477204s" podCreationTimestamp="2025-12-01 08:41:37 +0000 UTC" firstStartedPulling="2025-12-01 08:41:38.855799819 +0000 UTC m=+1476.420791841" lastFinishedPulling="2025-12-01 08:41:41.976857135 +0000 UTC m=+1479.541849147" observedRunningTime="2025-12-01 08:41:42.817355606 +0000 UTC m=+1480.382347588" watchObservedRunningTime="2025-12-01 08:41:42.825477204 +0000 UTC m=+1480.390469206" Dec 01 08:41:42 crc kubenswrapper[5004]: I1201 08:41:42.841721 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/heat-api-566b457cd5-4qc77" podStartSLOduration=2.907140609 podStartE2EDuration="5.84169863s" podCreationTimestamp="2025-12-01 08:41:37 +0000 UTC" firstStartedPulling="2025-12-01 08:41:39.044514009 +0000 UTC m=+1476.609505981" lastFinishedPulling="2025-12-01 08:41:41.97907202 +0000 UTC m=+1479.544064002" observedRunningTime="2025-12-01 08:41:42.838990844 +0000 UTC m=+1480.403982826" watchObservedRunningTime="2025-12-01 08:41:42.84169863 +0000 UTC m=+1480.406690632" Dec 01 08:41:44 crc kubenswrapper[5004]: I1201 08:41:44.890119 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-7cfb9cfbf9-fqxms"] Dec 01 08:41:44 crc kubenswrapper[5004]: I1201 08:41:44.891872 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-7cfb9cfbf9-fqxms" Dec 01 08:41:44 crc kubenswrapper[5004]: I1201 08:41:44.901929 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-7cfb9cfbf9-fqxms"] Dec 01 08:41:44 crc kubenswrapper[5004]: I1201 08:41:44.915797 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-7f549dcc6f-fhw4x"] Dec 01 08:41:44 crc kubenswrapper[5004]: I1201 08:41:44.917212 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-7f549dcc6f-fhw4x" Dec 01 08:41:44 crc kubenswrapper[5004]: I1201 08:41:44.935347 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-67bc95bb56-lkml4"] Dec 01 08:41:44 crc kubenswrapper[5004]: I1201 08:41:44.936882 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-67bc95bb56-lkml4" Dec 01 08:41:44 crc kubenswrapper[5004]: I1201 08:41:44.952021 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-67bc95bb56-lkml4"] Dec 01 08:41:44 crc kubenswrapper[5004]: I1201 08:41:44.967620 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-7f549dcc6f-fhw4x"] Dec 01 08:41:45 crc kubenswrapper[5004]: I1201 08:41:45.024647 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66885fce-0b69-4fc5-b4cd-f7b33bee9046-config-data\") pod \"heat-engine-7cfb9cfbf9-fqxms\" (UID: \"66885fce-0b69-4fc5-b4cd-f7b33bee9046\") " pod="openstack/heat-engine-7cfb9cfbf9-fqxms" Dec 01 08:41:45 crc kubenswrapper[5004]: I1201 08:41:45.024700 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e866b18-ec1e-4d54-acb5-89b374c7a9d5-combined-ca-bundle\") pod \"heat-api-7f549dcc6f-fhw4x\" (UID: \"0e866b18-ec1e-4d54-acb5-89b374c7a9d5\") " pod="openstack/heat-api-7f549dcc6f-fhw4x" Dec 01 08:41:45 crc kubenswrapper[5004]: I1201 08:41:45.024739 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/244abe13-3114-4206-9505-f3b0fdda447e-config-data\") pod \"heat-cfnapi-67bc95bb56-lkml4\" (UID: \"244abe13-3114-4206-9505-f3b0fdda447e\") " pod="openstack/heat-cfnapi-67bc95bb56-lkml4" Dec 01 08:41:45 crc kubenswrapper[5004]: I1201 08:41:45.024767 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/244abe13-3114-4206-9505-f3b0fdda447e-combined-ca-bundle\") pod \"heat-cfnapi-67bc95bb56-lkml4\" (UID: \"244abe13-3114-4206-9505-f3b0fdda447e\") " 
pod="openstack/heat-cfnapi-67bc95bb56-lkml4" Dec 01 08:41:45 crc kubenswrapper[5004]: I1201 08:41:45.024835 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/244abe13-3114-4206-9505-f3b0fdda447e-config-data-custom\") pod \"heat-cfnapi-67bc95bb56-lkml4\" (UID: \"244abe13-3114-4206-9505-f3b0fdda447e\") " pod="openstack/heat-cfnapi-67bc95bb56-lkml4" Dec 01 08:41:45 crc kubenswrapper[5004]: I1201 08:41:45.024889 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/66885fce-0b69-4fc5-b4cd-f7b33bee9046-config-data-custom\") pod \"heat-engine-7cfb9cfbf9-fqxms\" (UID: \"66885fce-0b69-4fc5-b4cd-f7b33bee9046\") " pod="openstack/heat-engine-7cfb9cfbf9-fqxms" Dec 01 08:41:45 crc kubenswrapper[5004]: I1201 08:41:45.024961 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xdsx\" (UniqueName: \"kubernetes.io/projected/0e866b18-ec1e-4d54-acb5-89b374c7a9d5-kube-api-access-5xdsx\") pod \"heat-api-7f549dcc6f-fhw4x\" (UID: \"0e866b18-ec1e-4d54-acb5-89b374c7a9d5\") " pod="openstack/heat-api-7f549dcc6f-fhw4x" Dec 01 08:41:45 crc kubenswrapper[5004]: I1201 08:41:45.024991 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xr6vh\" (UniqueName: \"kubernetes.io/projected/66885fce-0b69-4fc5-b4cd-f7b33bee9046-kube-api-access-xr6vh\") pod \"heat-engine-7cfb9cfbf9-fqxms\" (UID: \"66885fce-0b69-4fc5-b4cd-f7b33bee9046\") " pod="openstack/heat-engine-7cfb9cfbf9-fqxms" Dec 01 08:41:45 crc kubenswrapper[5004]: I1201 08:41:45.025027 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e866b18-ec1e-4d54-acb5-89b374c7a9d5-config-data\") pod 
\"heat-api-7f549dcc6f-fhw4x\" (UID: \"0e866b18-ec1e-4d54-acb5-89b374c7a9d5\") " pod="openstack/heat-api-7f549dcc6f-fhw4x" Dec 01 08:41:45 crc kubenswrapper[5004]: I1201 08:41:45.025054 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwt5z\" (UniqueName: \"kubernetes.io/projected/244abe13-3114-4206-9505-f3b0fdda447e-kube-api-access-fwt5z\") pod \"heat-cfnapi-67bc95bb56-lkml4\" (UID: \"244abe13-3114-4206-9505-f3b0fdda447e\") " pod="openstack/heat-cfnapi-67bc95bb56-lkml4" Dec 01 08:41:45 crc kubenswrapper[5004]: I1201 08:41:45.025091 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66885fce-0b69-4fc5-b4cd-f7b33bee9046-combined-ca-bundle\") pod \"heat-engine-7cfb9cfbf9-fqxms\" (UID: \"66885fce-0b69-4fc5-b4cd-f7b33bee9046\") " pod="openstack/heat-engine-7cfb9cfbf9-fqxms" Dec 01 08:41:45 crc kubenswrapper[5004]: I1201 08:41:45.025111 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0e866b18-ec1e-4d54-acb5-89b374c7a9d5-config-data-custom\") pod \"heat-api-7f549dcc6f-fhw4x\" (UID: \"0e866b18-ec1e-4d54-acb5-89b374c7a9d5\") " pod="openstack/heat-api-7f549dcc6f-fhw4x" Dec 01 08:41:45 crc kubenswrapper[5004]: I1201 08:41:45.126501 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/244abe13-3114-4206-9505-f3b0fdda447e-config-data-custom\") pod \"heat-cfnapi-67bc95bb56-lkml4\" (UID: \"244abe13-3114-4206-9505-f3b0fdda447e\") " pod="openstack/heat-cfnapi-67bc95bb56-lkml4" Dec 01 08:41:45 crc kubenswrapper[5004]: I1201 08:41:45.126553 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/66885fce-0b69-4fc5-b4cd-f7b33bee9046-config-data-custom\") pod \"heat-engine-7cfb9cfbf9-fqxms\" (UID: \"66885fce-0b69-4fc5-b4cd-f7b33bee9046\") " pod="openstack/heat-engine-7cfb9cfbf9-fqxms" Dec 01 08:41:45 crc kubenswrapper[5004]: I1201 08:41:45.126627 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xdsx\" (UniqueName: \"kubernetes.io/projected/0e866b18-ec1e-4d54-acb5-89b374c7a9d5-kube-api-access-5xdsx\") pod \"heat-api-7f549dcc6f-fhw4x\" (UID: \"0e866b18-ec1e-4d54-acb5-89b374c7a9d5\") " pod="openstack/heat-api-7f549dcc6f-fhw4x" Dec 01 08:41:45 crc kubenswrapper[5004]: I1201 08:41:45.126654 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xr6vh\" (UniqueName: \"kubernetes.io/projected/66885fce-0b69-4fc5-b4cd-f7b33bee9046-kube-api-access-xr6vh\") pod \"heat-engine-7cfb9cfbf9-fqxms\" (UID: \"66885fce-0b69-4fc5-b4cd-f7b33bee9046\") " pod="openstack/heat-engine-7cfb9cfbf9-fqxms" Dec 01 08:41:45 crc kubenswrapper[5004]: I1201 08:41:45.126674 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e866b18-ec1e-4d54-acb5-89b374c7a9d5-config-data\") pod \"heat-api-7f549dcc6f-fhw4x\" (UID: \"0e866b18-ec1e-4d54-acb5-89b374c7a9d5\") " pod="openstack/heat-api-7f549dcc6f-fhw4x" Dec 01 08:41:45 crc kubenswrapper[5004]: I1201 08:41:45.126693 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwt5z\" (UniqueName: \"kubernetes.io/projected/244abe13-3114-4206-9505-f3b0fdda447e-kube-api-access-fwt5z\") pod \"heat-cfnapi-67bc95bb56-lkml4\" (UID: \"244abe13-3114-4206-9505-f3b0fdda447e\") " pod="openstack/heat-cfnapi-67bc95bb56-lkml4" Dec 01 08:41:45 crc kubenswrapper[5004]: I1201 08:41:45.126721 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/66885fce-0b69-4fc5-b4cd-f7b33bee9046-combined-ca-bundle\") pod \"heat-engine-7cfb9cfbf9-fqxms\" (UID: \"66885fce-0b69-4fc5-b4cd-f7b33bee9046\") " pod="openstack/heat-engine-7cfb9cfbf9-fqxms" Dec 01 08:41:45 crc kubenswrapper[5004]: I1201 08:41:45.126737 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0e866b18-ec1e-4d54-acb5-89b374c7a9d5-config-data-custom\") pod \"heat-api-7f549dcc6f-fhw4x\" (UID: \"0e866b18-ec1e-4d54-acb5-89b374c7a9d5\") " pod="openstack/heat-api-7f549dcc6f-fhw4x" Dec 01 08:41:45 crc kubenswrapper[5004]: I1201 08:41:45.126808 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66885fce-0b69-4fc5-b4cd-f7b33bee9046-config-data\") pod \"heat-engine-7cfb9cfbf9-fqxms\" (UID: \"66885fce-0b69-4fc5-b4cd-f7b33bee9046\") " pod="openstack/heat-engine-7cfb9cfbf9-fqxms" Dec 01 08:41:45 crc kubenswrapper[5004]: I1201 08:41:45.126829 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e866b18-ec1e-4d54-acb5-89b374c7a9d5-combined-ca-bundle\") pod \"heat-api-7f549dcc6f-fhw4x\" (UID: \"0e866b18-ec1e-4d54-acb5-89b374c7a9d5\") " pod="openstack/heat-api-7f549dcc6f-fhw4x" Dec 01 08:41:45 crc kubenswrapper[5004]: I1201 08:41:45.126847 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/244abe13-3114-4206-9505-f3b0fdda447e-config-data\") pod \"heat-cfnapi-67bc95bb56-lkml4\" (UID: \"244abe13-3114-4206-9505-f3b0fdda447e\") " pod="openstack/heat-cfnapi-67bc95bb56-lkml4" Dec 01 08:41:45 crc kubenswrapper[5004]: I1201 08:41:45.126866 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/244abe13-3114-4206-9505-f3b0fdda447e-combined-ca-bundle\") pod \"heat-cfnapi-67bc95bb56-lkml4\" (UID: \"244abe13-3114-4206-9505-f3b0fdda447e\") " pod="openstack/heat-cfnapi-67bc95bb56-lkml4" Dec 01 08:41:45 crc kubenswrapper[5004]: I1201 08:41:45.132637 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/244abe13-3114-4206-9505-f3b0fdda447e-config-data\") pod \"heat-cfnapi-67bc95bb56-lkml4\" (UID: \"244abe13-3114-4206-9505-f3b0fdda447e\") " pod="openstack/heat-cfnapi-67bc95bb56-lkml4" Dec 01 08:41:45 crc kubenswrapper[5004]: I1201 08:41:45.134810 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66885fce-0b69-4fc5-b4cd-f7b33bee9046-combined-ca-bundle\") pod \"heat-engine-7cfb9cfbf9-fqxms\" (UID: \"66885fce-0b69-4fc5-b4cd-f7b33bee9046\") " pod="openstack/heat-engine-7cfb9cfbf9-fqxms" Dec 01 08:41:45 crc kubenswrapper[5004]: I1201 08:41:45.136160 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e866b18-ec1e-4d54-acb5-89b374c7a9d5-combined-ca-bundle\") pod \"heat-api-7f549dcc6f-fhw4x\" (UID: \"0e866b18-ec1e-4d54-acb5-89b374c7a9d5\") " pod="openstack/heat-api-7f549dcc6f-fhw4x" Dec 01 08:41:45 crc kubenswrapper[5004]: I1201 08:41:45.136614 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/66885fce-0b69-4fc5-b4cd-f7b33bee9046-config-data-custom\") pod \"heat-engine-7cfb9cfbf9-fqxms\" (UID: \"66885fce-0b69-4fc5-b4cd-f7b33bee9046\") " pod="openstack/heat-engine-7cfb9cfbf9-fqxms" Dec 01 08:41:45 crc kubenswrapper[5004]: I1201 08:41:45.137302 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66885fce-0b69-4fc5-b4cd-f7b33bee9046-config-data\") pod 
\"heat-engine-7cfb9cfbf9-fqxms\" (UID: \"66885fce-0b69-4fc5-b4cd-f7b33bee9046\") " pod="openstack/heat-engine-7cfb9cfbf9-fqxms" Dec 01 08:41:45 crc kubenswrapper[5004]: I1201 08:41:45.138767 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/244abe13-3114-4206-9505-f3b0fdda447e-config-data-custom\") pod \"heat-cfnapi-67bc95bb56-lkml4\" (UID: \"244abe13-3114-4206-9505-f3b0fdda447e\") " pod="openstack/heat-cfnapi-67bc95bb56-lkml4" Dec 01 08:41:45 crc kubenswrapper[5004]: I1201 08:41:45.141227 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/244abe13-3114-4206-9505-f3b0fdda447e-combined-ca-bundle\") pod \"heat-cfnapi-67bc95bb56-lkml4\" (UID: \"244abe13-3114-4206-9505-f3b0fdda447e\") " pod="openstack/heat-cfnapi-67bc95bb56-lkml4" Dec 01 08:41:45 crc kubenswrapper[5004]: I1201 08:41:45.146392 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xr6vh\" (UniqueName: \"kubernetes.io/projected/66885fce-0b69-4fc5-b4cd-f7b33bee9046-kube-api-access-xr6vh\") pod \"heat-engine-7cfb9cfbf9-fqxms\" (UID: \"66885fce-0b69-4fc5-b4cd-f7b33bee9046\") " pod="openstack/heat-engine-7cfb9cfbf9-fqxms" Dec 01 08:41:45 crc kubenswrapper[5004]: I1201 08:41:45.148101 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwt5z\" (UniqueName: \"kubernetes.io/projected/244abe13-3114-4206-9505-f3b0fdda447e-kube-api-access-fwt5z\") pod \"heat-cfnapi-67bc95bb56-lkml4\" (UID: \"244abe13-3114-4206-9505-f3b0fdda447e\") " pod="openstack/heat-cfnapi-67bc95bb56-lkml4" Dec 01 08:41:45 crc kubenswrapper[5004]: I1201 08:41:45.161269 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e866b18-ec1e-4d54-acb5-89b374c7a9d5-config-data\") pod \"heat-api-7f549dcc6f-fhw4x\" (UID: 
\"0e866b18-ec1e-4d54-acb5-89b374c7a9d5\") " pod="openstack/heat-api-7f549dcc6f-fhw4x" Dec 01 08:41:45 crc kubenswrapper[5004]: I1201 08:41:45.163029 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0e866b18-ec1e-4d54-acb5-89b374c7a9d5-config-data-custom\") pod \"heat-api-7f549dcc6f-fhw4x\" (UID: \"0e866b18-ec1e-4d54-acb5-89b374c7a9d5\") " pod="openstack/heat-api-7f549dcc6f-fhw4x" Dec 01 08:41:45 crc kubenswrapper[5004]: I1201 08:41:45.168135 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xdsx\" (UniqueName: \"kubernetes.io/projected/0e866b18-ec1e-4d54-acb5-89b374c7a9d5-kube-api-access-5xdsx\") pod \"heat-api-7f549dcc6f-fhw4x\" (UID: \"0e866b18-ec1e-4d54-acb5-89b374c7a9d5\") " pod="openstack/heat-api-7f549dcc6f-fhw4x" Dec 01 08:41:45 crc kubenswrapper[5004]: I1201 08:41:45.218302 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-7cfb9cfbf9-fqxms" Dec 01 08:41:45 crc kubenswrapper[5004]: I1201 08:41:45.231637 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-7f549dcc6f-fhw4x" Dec 01 08:41:45 crc kubenswrapper[5004]: I1201 08:41:45.251771 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-67bc95bb56-lkml4" Dec 01 08:41:45 crc kubenswrapper[5004]: W1201 08:41:45.958196 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod66885fce_0b69_4fc5_b4cd_f7b33bee9046.slice/crio-1fb43b408fe73f1ef3a012e55d07847dac54817931de9e8088ce2c850afcadc7 WatchSource:0}: Error finding container 1fb43b408fe73f1ef3a012e55d07847dac54817931de9e8088ce2c850afcadc7: Status 404 returned error can't find the container with id 1fb43b408fe73f1ef3a012e55d07847dac54817931de9e8088ce2c850afcadc7 Dec 01 08:41:45 crc kubenswrapper[5004]: I1201 08:41:45.958455 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-7cfb9cfbf9-fqxms"] Dec 01 08:41:45 crc kubenswrapper[5004]: W1201 08:41:45.961230 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0e866b18_ec1e_4d54_acb5_89b374c7a9d5.slice/crio-24036210d6eb5aed7ffce704d89da4631a3c0290b3ff7524fcd9a1988f9d850e WatchSource:0}: Error finding container 24036210d6eb5aed7ffce704d89da4631a3c0290b3ff7524fcd9a1988f9d850e: Status 404 returned error can't find the container with id 24036210d6eb5aed7ffce704d89da4631a3c0290b3ff7524fcd9a1988f9d850e Dec 01 08:41:45 crc kubenswrapper[5004]: I1201 08:41:45.970202 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-7f549dcc6f-fhw4x"] Dec 01 08:41:46 crc kubenswrapper[5004]: I1201 08:41:46.144615 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-67bc95bb56-lkml4"] Dec 01 08:41:46 crc kubenswrapper[5004]: I1201 08:41:46.808981 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-566b457cd5-4qc77"] Dec 01 08:41:46 crc kubenswrapper[5004]: I1201 08:41:46.809635 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-api-566b457cd5-4qc77" 
podUID="972d0223-e732-4bd0-99e3-dccb174a2511" containerName="heat-api" containerID="cri-o://87467b86401908b1737b6fcf86a15faf6b206ff876c1b0009a4062b3860d1475" gracePeriod=60 Dec 01 08:41:46 crc kubenswrapper[5004]: I1201 08:41:46.840312 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-67bc95bb56-lkml4" event={"ID":"244abe13-3114-4206-9505-f3b0fdda447e","Type":"ContainerStarted","Data":"c055793885623df684ea4da9faf40557172cb5ae2ee67606f7b9dc7ce964e369"} Dec 01 08:41:46 crc kubenswrapper[5004]: I1201 08:41:46.840358 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-67bc95bb56-lkml4" event={"ID":"244abe13-3114-4206-9505-f3b0fdda447e","Type":"ContainerStarted","Data":"a6aec3e56a4058bc861d9c00b25c2c2c075fd48b6cd6964d39c63e8c22c834c0"} Dec 01 08:41:46 crc kubenswrapper[5004]: I1201 08:41:46.841287 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-67bc95bb56-lkml4" Dec 01 08:41:46 crc kubenswrapper[5004]: I1201 08:41:46.844378 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-7cfb9cfbf9-fqxms" event={"ID":"66885fce-0b69-4fc5-b4cd-f7b33bee9046","Type":"ContainerStarted","Data":"e46324c44b100e99dbf3734ebb47701ffa88d1c729fc38d2cb362f220ab6edf1"} Dec 01 08:41:46 crc kubenswrapper[5004]: I1201 08:41:46.844426 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-7cfb9cfbf9-fqxms" event={"ID":"66885fce-0b69-4fc5-b4cd-f7b33bee9046","Type":"ContainerStarted","Data":"1fb43b408fe73f1ef3a012e55d07847dac54817931de9e8088ce2c850afcadc7"} Dec 01 08:41:46 crc kubenswrapper[5004]: I1201 08:41:46.845441 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-7cfb9cfbf9-fqxms" Dec 01 08:41:46 crc kubenswrapper[5004]: I1201 08:41:46.849073 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-557748dddd-qnftd"] Dec 01 08:41:46 crc kubenswrapper[5004]: I1201 
08:41:46.849262 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-cfnapi-557748dddd-qnftd" podUID="05a450a4-acc5-4c46-ae41-7ff4de029a72" containerName="heat-cfnapi" containerID="cri-o://abb2a140cbc457456e229bbda184c16b30744ce9421ebda600e9f106c0c18db0" gracePeriod=60 Dec 01 08:41:46 crc kubenswrapper[5004]: I1201 08:41:46.855738 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-7f549dcc6f-fhw4x" event={"ID":"0e866b18-ec1e-4d54-acb5-89b374c7a9d5","Type":"ContainerStarted","Data":"4e6eda218947e40b974204e1d0a78454f1b19593eff3b0c00f53c4ccd42da6e9"} Dec 01 08:41:46 crc kubenswrapper[5004]: I1201 08:41:46.855778 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-7f549dcc6f-fhw4x" event={"ID":"0e866b18-ec1e-4d54-acb5-89b374c7a9d5","Type":"ContainerStarted","Data":"24036210d6eb5aed7ffce704d89da4631a3c0290b3ff7524fcd9a1988f9d850e"} Dec 01 08:41:46 crc kubenswrapper[5004]: I1201 08:41:46.856706 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-7f549dcc6f-fhw4x" Dec 01 08:41:46 crc kubenswrapper[5004]: I1201 08:41:46.872540 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-77c6469896-8fmqx"] Dec 01 08:41:46 crc kubenswrapper[5004]: I1201 08:41:46.874974 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-77c6469896-8fmqx" Dec 01 08:41:46 crc kubenswrapper[5004]: I1201 08:41:46.881663 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-5b78cf6765-f2dsx"] Dec 01 08:41:46 crc kubenswrapper[5004]: I1201 08:41:46.883262 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-5b78cf6765-f2dsx" Dec 01 08:41:46 crc kubenswrapper[5004]: I1201 08:41:46.887068 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-api-internal-svc" Dec 01 08:41:46 crc kubenswrapper[5004]: I1201 08:41:46.887268 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-api-public-svc" Dec 01 08:41:46 crc kubenswrapper[5004]: I1201 08:41:46.903006 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-cfnapi-public-svc" Dec 01 08:41:46 crc kubenswrapper[5004]: I1201 08:41:46.903278 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-cfnapi-internal-svc" Dec 01 08:41:46 crc kubenswrapper[5004]: I1201 08:41:46.903616 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-77c6469896-8fmqx"] Dec 01 08:41:46 crc kubenswrapper[5004]: I1201 08:41:46.944223 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-5b78cf6765-f2dsx"] Dec 01 08:41:46 crc kubenswrapper[5004]: I1201 08:41:46.945068 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-67bc95bb56-lkml4" podStartSLOduration=2.945044611 podStartE2EDuration="2.945044611s" podCreationTimestamp="2025-12-01 08:41:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:41:46.869160376 +0000 UTC m=+1484.434152358" watchObservedRunningTime="2025-12-01 08:41:46.945044611 +0000 UTC m=+1484.510036603" Dec 01 08:41:46 crc kubenswrapper[5004]: I1201 08:41:46.976327 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a98e9ac8-7fc0-4778-823a-fb3d6b8e0e1a-public-tls-certs\") pod \"heat-cfnapi-5b78cf6765-f2dsx\" (UID: 
\"a98e9ac8-7fc0-4778-823a-fb3d6b8e0e1a\") " pod="openstack/heat-cfnapi-5b78cf6765-f2dsx" Dec 01 08:41:46 crc kubenswrapper[5004]: I1201 08:41:46.976461 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a98e9ac8-7fc0-4778-823a-fb3d6b8e0e1a-config-data-custom\") pod \"heat-cfnapi-5b78cf6765-f2dsx\" (UID: \"a98e9ac8-7fc0-4778-823a-fb3d6b8e0e1a\") " pod="openstack/heat-cfnapi-5b78cf6765-f2dsx" Dec 01 08:41:46 crc kubenswrapper[5004]: I1201 08:41:46.976499 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a98e9ac8-7fc0-4778-823a-fb3d6b8e0e1a-config-data\") pod \"heat-cfnapi-5b78cf6765-f2dsx\" (UID: \"a98e9ac8-7fc0-4778-823a-fb3d6b8e0e1a\") " pod="openstack/heat-cfnapi-5b78cf6765-f2dsx" Dec 01 08:41:46 crc kubenswrapper[5004]: I1201 08:41:46.976518 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6cda3888-d928-439e-9dfa-54e3535e4be9-config-data\") pod \"heat-api-77c6469896-8fmqx\" (UID: \"6cda3888-d928-439e-9dfa-54e3535e4be9\") " pod="openstack/heat-api-77c6469896-8fmqx" Dec 01 08:41:46 crc kubenswrapper[5004]: I1201 08:41:46.976660 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6cda3888-d928-439e-9dfa-54e3535e4be9-internal-tls-certs\") pod \"heat-api-77c6469896-8fmqx\" (UID: \"6cda3888-d928-439e-9dfa-54e3535e4be9\") " pod="openstack/heat-api-77c6469896-8fmqx" Dec 01 08:41:46 crc kubenswrapper[5004]: I1201 08:41:46.976679 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a98e9ac8-7fc0-4778-823a-fb3d6b8e0e1a-internal-tls-certs\") pod 
\"heat-cfnapi-5b78cf6765-f2dsx\" (UID: \"a98e9ac8-7fc0-4778-823a-fb3d6b8e0e1a\") " pod="openstack/heat-cfnapi-5b78cf6765-f2dsx" Dec 01 08:41:46 crc kubenswrapper[5004]: I1201 08:41:46.976749 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6cda3888-d928-439e-9dfa-54e3535e4be9-config-data-custom\") pod \"heat-api-77c6469896-8fmqx\" (UID: \"6cda3888-d928-439e-9dfa-54e3535e4be9\") " pod="openstack/heat-api-77c6469896-8fmqx" Dec 01 08:41:46 crc kubenswrapper[5004]: I1201 08:41:46.976821 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6cda3888-d928-439e-9dfa-54e3535e4be9-public-tls-certs\") pod \"heat-api-77c6469896-8fmqx\" (UID: \"6cda3888-d928-439e-9dfa-54e3535e4be9\") " pod="openstack/heat-api-77c6469896-8fmqx" Dec 01 08:41:46 crc kubenswrapper[5004]: I1201 08:41:46.976860 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a98e9ac8-7fc0-4778-823a-fb3d6b8e0e1a-combined-ca-bundle\") pod \"heat-cfnapi-5b78cf6765-f2dsx\" (UID: \"a98e9ac8-7fc0-4778-823a-fb3d6b8e0e1a\") " pod="openstack/heat-cfnapi-5b78cf6765-f2dsx" Dec 01 08:41:46 crc kubenswrapper[5004]: I1201 08:41:46.976916 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5znx2\" (UniqueName: \"kubernetes.io/projected/a98e9ac8-7fc0-4778-823a-fb3d6b8e0e1a-kube-api-access-5znx2\") pod \"heat-cfnapi-5b78cf6765-f2dsx\" (UID: \"a98e9ac8-7fc0-4778-823a-fb3d6b8e0e1a\") " pod="openstack/heat-cfnapi-5b78cf6765-f2dsx" Dec 01 08:41:46 crc kubenswrapper[5004]: I1201 08:41:46.976998 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6cda3888-d928-439e-9dfa-54e3535e4be9-combined-ca-bundle\") pod \"heat-api-77c6469896-8fmqx\" (UID: \"6cda3888-d928-439e-9dfa-54e3535e4be9\") " pod="openstack/heat-api-77c6469896-8fmqx" Dec 01 08:41:46 crc kubenswrapper[5004]: I1201 08:41:46.977052 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7wzx\" (UniqueName: \"kubernetes.io/projected/6cda3888-d928-439e-9dfa-54e3535e4be9-kube-api-access-q7wzx\") pod \"heat-api-77c6469896-8fmqx\" (UID: \"6cda3888-d928-439e-9dfa-54e3535e4be9\") " pod="openstack/heat-api-77c6469896-8fmqx" Dec 01 08:41:46 crc kubenswrapper[5004]: I1201 08:41:46.992161 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-7f549dcc6f-fhw4x" podStartSLOduration=2.992143761 podStartE2EDuration="2.992143761s" podCreationTimestamp="2025-12-01 08:41:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:41:46.903514396 +0000 UTC m=+1484.468506378" watchObservedRunningTime="2025-12-01 08:41:46.992143761 +0000 UTC m=+1484.557135743" Dec 01 08:41:47 crc kubenswrapper[5004]: I1201 08:41:47.011523 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-7cfb9cfbf9-fqxms" podStartSLOduration=3.011506264 podStartE2EDuration="3.011506264s" podCreationTimestamp="2025-12-01 08:41:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:41:46.92577389 +0000 UTC m=+1484.490765902" watchObservedRunningTime="2025-12-01 08:41:47.011506264 +0000 UTC m=+1484.576498236" Dec 01 08:41:47 crc kubenswrapper[5004]: I1201 08:41:47.078474 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/a98e9ac8-7fc0-4778-823a-fb3d6b8e0e1a-public-tls-certs\") pod \"heat-cfnapi-5b78cf6765-f2dsx\" (UID: \"a98e9ac8-7fc0-4778-823a-fb3d6b8e0e1a\") " pod="openstack/heat-cfnapi-5b78cf6765-f2dsx" Dec 01 08:41:47 crc kubenswrapper[5004]: I1201 08:41:47.078604 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a98e9ac8-7fc0-4778-823a-fb3d6b8e0e1a-config-data-custom\") pod \"heat-cfnapi-5b78cf6765-f2dsx\" (UID: \"a98e9ac8-7fc0-4778-823a-fb3d6b8e0e1a\") " pod="openstack/heat-cfnapi-5b78cf6765-f2dsx" Dec 01 08:41:47 crc kubenswrapper[5004]: I1201 08:41:47.078642 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a98e9ac8-7fc0-4778-823a-fb3d6b8e0e1a-config-data\") pod \"heat-cfnapi-5b78cf6765-f2dsx\" (UID: \"a98e9ac8-7fc0-4778-823a-fb3d6b8e0e1a\") " pod="openstack/heat-cfnapi-5b78cf6765-f2dsx" Dec 01 08:41:47 crc kubenswrapper[5004]: I1201 08:41:47.078665 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6cda3888-d928-439e-9dfa-54e3535e4be9-config-data\") pod \"heat-api-77c6469896-8fmqx\" (UID: \"6cda3888-d928-439e-9dfa-54e3535e4be9\") " pod="openstack/heat-api-77c6469896-8fmqx" Dec 01 08:41:47 crc kubenswrapper[5004]: I1201 08:41:47.078720 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6cda3888-d928-439e-9dfa-54e3535e4be9-internal-tls-certs\") pod \"heat-api-77c6469896-8fmqx\" (UID: \"6cda3888-d928-439e-9dfa-54e3535e4be9\") " pod="openstack/heat-api-77c6469896-8fmqx" Dec 01 08:41:47 crc kubenswrapper[5004]: I1201 08:41:47.078745 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/a98e9ac8-7fc0-4778-823a-fb3d6b8e0e1a-internal-tls-certs\") pod \"heat-cfnapi-5b78cf6765-f2dsx\" (UID: \"a98e9ac8-7fc0-4778-823a-fb3d6b8e0e1a\") " pod="openstack/heat-cfnapi-5b78cf6765-f2dsx" Dec 01 08:41:47 crc kubenswrapper[5004]: I1201 08:41:47.078782 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6cda3888-d928-439e-9dfa-54e3535e4be9-config-data-custom\") pod \"heat-api-77c6469896-8fmqx\" (UID: \"6cda3888-d928-439e-9dfa-54e3535e4be9\") " pod="openstack/heat-api-77c6469896-8fmqx" Dec 01 08:41:47 crc kubenswrapper[5004]: I1201 08:41:47.078825 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6cda3888-d928-439e-9dfa-54e3535e4be9-public-tls-certs\") pod \"heat-api-77c6469896-8fmqx\" (UID: \"6cda3888-d928-439e-9dfa-54e3535e4be9\") " pod="openstack/heat-api-77c6469896-8fmqx" Dec 01 08:41:47 crc kubenswrapper[5004]: I1201 08:41:47.078878 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a98e9ac8-7fc0-4778-823a-fb3d6b8e0e1a-combined-ca-bundle\") pod \"heat-cfnapi-5b78cf6765-f2dsx\" (UID: \"a98e9ac8-7fc0-4778-823a-fb3d6b8e0e1a\") " pod="openstack/heat-cfnapi-5b78cf6765-f2dsx" Dec 01 08:41:47 crc kubenswrapper[5004]: I1201 08:41:47.078909 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5znx2\" (UniqueName: \"kubernetes.io/projected/a98e9ac8-7fc0-4778-823a-fb3d6b8e0e1a-kube-api-access-5znx2\") pod \"heat-cfnapi-5b78cf6765-f2dsx\" (UID: \"a98e9ac8-7fc0-4778-823a-fb3d6b8e0e1a\") " pod="openstack/heat-cfnapi-5b78cf6765-f2dsx" Dec 01 08:41:47 crc kubenswrapper[5004]: I1201 08:41:47.078947 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6cda3888-d928-439e-9dfa-54e3535e4be9-combined-ca-bundle\") pod \"heat-api-77c6469896-8fmqx\" (UID: \"6cda3888-d928-439e-9dfa-54e3535e4be9\") " pod="openstack/heat-api-77c6469896-8fmqx" Dec 01 08:41:47 crc kubenswrapper[5004]: I1201 08:41:47.078972 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q7wzx\" (UniqueName: \"kubernetes.io/projected/6cda3888-d928-439e-9dfa-54e3535e4be9-kube-api-access-q7wzx\") pod \"heat-api-77c6469896-8fmqx\" (UID: \"6cda3888-d928-439e-9dfa-54e3535e4be9\") " pod="openstack/heat-api-77c6469896-8fmqx" Dec 01 08:41:47 crc kubenswrapper[5004]: I1201 08:41:47.083442 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a98e9ac8-7fc0-4778-823a-fb3d6b8e0e1a-public-tls-certs\") pod \"heat-cfnapi-5b78cf6765-f2dsx\" (UID: \"a98e9ac8-7fc0-4778-823a-fb3d6b8e0e1a\") " pod="openstack/heat-cfnapi-5b78cf6765-f2dsx" Dec 01 08:41:47 crc kubenswrapper[5004]: I1201 08:41:47.084389 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cda3888-d928-439e-9dfa-54e3535e4be9-combined-ca-bundle\") pod \"heat-api-77c6469896-8fmqx\" (UID: \"6cda3888-d928-439e-9dfa-54e3535e4be9\") " pod="openstack/heat-api-77c6469896-8fmqx" Dec 01 08:41:47 crc kubenswrapper[5004]: I1201 08:41:47.088336 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6cda3888-d928-439e-9dfa-54e3535e4be9-config-data-custom\") pod \"heat-api-77c6469896-8fmqx\" (UID: \"6cda3888-d928-439e-9dfa-54e3535e4be9\") " pod="openstack/heat-api-77c6469896-8fmqx" Dec 01 08:41:47 crc kubenswrapper[5004]: I1201 08:41:47.090071 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/a98e9ac8-7fc0-4778-823a-fb3d6b8e0e1a-config-data-custom\") pod \"heat-cfnapi-5b78cf6765-f2dsx\" (UID: \"a98e9ac8-7fc0-4778-823a-fb3d6b8e0e1a\") " pod="openstack/heat-cfnapi-5b78cf6765-f2dsx" Dec 01 08:41:47 crc kubenswrapper[5004]: I1201 08:41:47.091038 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a98e9ac8-7fc0-4778-823a-fb3d6b8e0e1a-combined-ca-bundle\") pod \"heat-cfnapi-5b78cf6765-f2dsx\" (UID: \"a98e9ac8-7fc0-4778-823a-fb3d6b8e0e1a\") " pod="openstack/heat-cfnapi-5b78cf6765-f2dsx" Dec 01 08:41:47 crc kubenswrapper[5004]: I1201 08:41:47.093540 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6cda3888-d928-439e-9dfa-54e3535e4be9-config-data\") pod \"heat-api-77c6469896-8fmqx\" (UID: \"6cda3888-d928-439e-9dfa-54e3535e4be9\") " pod="openstack/heat-api-77c6469896-8fmqx" Dec 01 08:41:47 crc kubenswrapper[5004]: I1201 08:41:47.094412 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a98e9ac8-7fc0-4778-823a-fb3d6b8e0e1a-internal-tls-certs\") pod \"heat-cfnapi-5b78cf6765-f2dsx\" (UID: \"a98e9ac8-7fc0-4778-823a-fb3d6b8e0e1a\") " pod="openstack/heat-cfnapi-5b78cf6765-f2dsx" Dec 01 08:41:47 crc kubenswrapper[5004]: I1201 08:41:47.094591 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a98e9ac8-7fc0-4778-823a-fb3d6b8e0e1a-config-data\") pod \"heat-cfnapi-5b78cf6765-f2dsx\" (UID: \"a98e9ac8-7fc0-4778-823a-fb3d6b8e0e1a\") " pod="openstack/heat-cfnapi-5b78cf6765-f2dsx" Dec 01 08:41:47 crc kubenswrapper[5004]: I1201 08:41:47.095913 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6cda3888-d928-439e-9dfa-54e3535e4be9-public-tls-certs\") pod 
\"heat-api-77c6469896-8fmqx\" (UID: \"6cda3888-d928-439e-9dfa-54e3535e4be9\") " pod="openstack/heat-api-77c6469896-8fmqx" Dec 01 08:41:47 crc kubenswrapper[5004]: I1201 08:41:47.097257 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5znx2\" (UniqueName: \"kubernetes.io/projected/a98e9ac8-7fc0-4778-823a-fb3d6b8e0e1a-kube-api-access-5znx2\") pod \"heat-cfnapi-5b78cf6765-f2dsx\" (UID: \"a98e9ac8-7fc0-4778-823a-fb3d6b8e0e1a\") " pod="openstack/heat-cfnapi-5b78cf6765-f2dsx" Dec 01 08:41:47 crc kubenswrapper[5004]: I1201 08:41:47.101212 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6cda3888-d928-439e-9dfa-54e3535e4be9-internal-tls-certs\") pod \"heat-api-77c6469896-8fmqx\" (UID: \"6cda3888-d928-439e-9dfa-54e3535e4be9\") " pod="openstack/heat-api-77c6469896-8fmqx" Dec 01 08:41:47 crc kubenswrapper[5004]: I1201 08:41:47.102552 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7wzx\" (UniqueName: \"kubernetes.io/projected/6cda3888-d928-439e-9dfa-54e3535e4be9-kube-api-access-q7wzx\") pod \"heat-api-77c6469896-8fmqx\" (UID: \"6cda3888-d928-439e-9dfa-54e3535e4be9\") " pod="openstack/heat-api-77c6469896-8fmqx" Dec 01 08:41:47 crc kubenswrapper[5004]: I1201 08:41:47.234255 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-77c6469896-8fmqx" Dec 01 08:41:47 crc kubenswrapper[5004]: I1201 08:41:47.255292 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-5b78cf6765-f2dsx" Dec 01 08:41:47 crc kubenswrapper[5004]: I1201 08:41:47.631144 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-566b457cd5-4qc77" Dec 01 08:41:47 crc kubenswrapper[5004]: I1201 08:41:47.697548 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/972d0223-e732-4bd0-99e3-dccb174a2511-config-data\") pod \"972d0223-e732-4bd0-99e3-dccb174a2511\" (UID: \"972d0223-e732-4bd0-99e3-dccb174a2511\") " Dec 01 08:41:47 crc kubenswrapper[5004]: I1201 08:41:47.697752 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/972d0223-e732-4bd0-99e3-dccb174a2511-combined-ca-bundle\") pod \"972d0223-e732-4bd0-99e3-dccb174a2511\" (UID: \"972d0223-e732-4bd0-99e3-dccb174a2511\") " Dec 01 08:41:47 crc kubenswrapper[5004]: I1201 08:41:47.697966 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bms4s\" (UniqueName: \"kubernetes.io/projected/972d0223-e732-4bd0-99e3-dccb174a2511-kube-api-access-bms4s\") pod \"972d0223-e732-4bd0-99e3-dccb174a2511\" (UID: \"972d0223-e732-4bd0-99e3-dccb174a2511\") " Dec 01 08:41:47 crc kubenswrapper[5004]: I1201 08:41:47.697994 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/972d0223-e732-4bd0-99e3-dccb174a2511-config-data-custom\") pod \"972d0223-e732-4bd0-99e3-dccb174a2511\" (UID: \"972d0223-e732-4bd0-99e3-dccb174a2511\") " Dec 01 08:41:47 crc kubenswrapper[5004]: I1201 08:41:47.707969 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-557748dddd-qnftd" Dec 01 08:41:47 crc kubenswrapper[5004]: I1201 08:41:47.708691 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/972d0223-e732-4bd0-99e3-dccb174a2511-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "972d0223-e732-4bd0-99e3-dccb174a2511" (UID: "972d0223-e732-4bd0-99e3-dccb174a2511"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:41:47 crc kubenswrapper[5004]: I1201 08:41:47.712490 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/972d0223-e732-4bd0-99e3-dccb174a2511-kube-api-access-bms4s" (OuterVolumeSpecName: "kube-api-access-bms4s") pod "972d0223-e732-4bd0-99e3-dccb174a2511" (UID: "972d0223-e732-4bd0-99e3-dccb174a2511"). InnerVolumeSpecName "kube-api-access-bms4s". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:41:47 crc kubenswrapper[5004]: I1201 08:41:47.744775 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/972d0223-e732-4bd0-99e3-dccb174a2511-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "972d0223-e732-4bd0-99e3-dccb174a2511" (UID: "972d0223-e732-4bd0-99e3-dccb174a2511"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:41:47 crc kubenswrapper[5004]: I1201 08:41:47.777669 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/972d0223-e732-4bd0-99e3-dccb174a2511-config-data" (OuterVolumeSpecName: "config-data") pod "972d0223-e732-4bd0-99e3-dccb174a2511" (UID: "972d0223-e732-4bd0-99e3-dccb174a2511"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:41:47 crc kubenswrapper[5004]: I1201 08:41:47.799519 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/05a450a4-acc5-4c46-ae41-7ff4de029a72-config-data-custom\") pod \"05a450a4-acc5-4c46-ae41-7ff4de029a72\" (UID: \"05a450a4-acc5-4c46-ae41-7ff4de029a72\") " Dec 01 08:41:47 crc kubenswrapper[5004]: I1201 08:41:47.799707 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05a450a4-acc5-4c46-ae41-7ff4de029a72-combined-ca-bundle\") pod \"05a450a4-acc5-4c46-ae41-7ff4de029a72\" (UID: \"05a450a4-acc5-4c46-ae41-7ff4de029a72\") " Dec 01 08:41:47 crc kubenswrapper[5004]: I1201 08:41:47.799757 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05a450a4-acc5-4c46-ae41-7ff4de029a72-config-data\") pod \"05a450a4-acc5-4c46-ae41-7ff4de029a72\" (UID: \"05a450a4-acc5-4c46-ae41-7ff4de029a72\") " Dec 01 08:41:47 crc kubenswrapper[5004]: I1201 08:41:47.799959 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j9zgw\" (UniqueName: \"kubernetes.io/projected/05a450a4-acc5-4c46-ae41-7ff4de029a72-kube-api-access-j9zgw\") pod \"05a450a4-acc5-4c46-ae41-7ff4de029a72\" (UID: \"05a450a4-acc5-4c46-ae41-7ff4de029a72\") " Dec 01 08:41:47 crc kubenswrapper[5004]: I1201 08:41:47.802103 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bms4s\" (UniqueName: \"kubernetes.io/projected/972d0223-e732-4bd0-99e3-dccb174a2511-kube-api-access-bms4s\") on node \"crc\" DevicePath \"\"" Dec 01 08:41:47 crc kubenswrapper[5004]: I1201 08:41:47.802147 5004 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/972d0223-e732-4bd0-99e3-dccb174a2511-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 01 08:41:47 crc kubenswrapper[5004]: I1201 08:41:47.802160 5004 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/972d0223-e732-4bd0-99e3-dccb174a2511-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 08:41:47 crc kubenswrapper[5004]: I1201 08:41:47.802173 5004 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/972d0223-e732-4bd0-99e3-dccb174a2511-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 08:41:47 crc kubenswrapper[5004]: I1201 08:41:47.805211 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05a450a4-acc5-4c46-ae41-7ff4de029a72-kube-api-access-j9zgw" (OuterVolumeSpecName: "kube-api-access-j9zgw") pod "05a450a4-acc5-4c46-ae41-7ff4de029a72" (UID: "05a450a4-acc5-4c46-ae41-7ff4de029a72"). InnerVolumeSpecName "kube-api-access-j9zgw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:41:47 crc kubenswrapper[5004]: I1201 08:41:47.824250 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05a450a4-acc5-4c46-ae41-7ff4de029a72-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "05a450a4-acc5-4c46-ae41-7ff4de029a72" (UID: "05a450a4-acc5-4c46-ae41-7ff4de029a72"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:41:47 crc kubenswrapper[5004]: W1201 08:41:47.850750 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda98e9ac8_7fc0_4778_823a_fb3d6b8e0e1a.slice/crio-54a55fc0d908cde346782c051fd460c62447ed402a487adde1b1354af012fff9 WatchSource:0}: Error finding container 54a55fc0d908cde346782c051fd460c62447ed402a487adde1b1354af012fff9: Status 404 returned error can't find the container with id 54a55fc0d908cde346782c051fd460c62447ed402a487adde1b1354af012fff9 Dec 01 08:41:47 crc kubenswrapper[5004]: I1201 08:41:47.872910 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-5b78cf6765-f2dsx"] Dec 01 08:41:47 crc kubenswrapper[5004]: I1201 08:41:47.886355 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05a450a4-acc5-4c46-ae41-7ff4de029a72-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "05a450a4-acc5-4c46-ae41-7ff4de029a72" (UID: "05a450a4-acc5-4c46-ae41-7ff4de029a72"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:41:47 crc kubenswrapper[5004]: I1201 08:41:47.891068 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-5b78cf6765-f2dsx" event={"ID":"a98e9ac8-7fc0-4778-823a-fb3d6b8e0e1a","Type":"ContainerStarted","Data":"54a55fc0d908cde346782c051fd460c62447ed402a487adde1b1354af012fff9"} Dec 01 08:41:47 crc kubenswrapper[5004]: I1201 08:41:47.893871 5004 generic.go:334] "Generic (PLEG): container finished" podID="05a450a4-acc5-4c46-ae41-7ff4de029a72" containerID="abb2a140cbc457456e229bbda184c16b30744ce9421ebda600e9f106c0c18db0" exitCode=0 Dec 01 08:41:47 crc kubenswrapper[5004]: I1201 08:41:47.894435 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-557748dddd-qnftd" Dec 01 08:41:47 crc kubenswrapper[5004]: I1201 08:41:47.894463 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-557748dddd-qnftd" event={"ID":"05a450a4-acc5-4c46-ae41-7ff4de029a72","Type":"ContainerDied","Data":"abb2a140cbc457456e229bbda184c16b30744ce9421ebda600e9f106c0c18db0"} Dec 01 08:41:47 crc kubenswrapper[5004]: I1201 08:41:47.896909 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-557748dddd-qnftd" event={"ID":"05a450a4-acc5-4c46-ae41-7ff4de029a72","Type":"ContainerDied","Data":"ad6122e54d7043f7e6a93267e3de71ebc76b42290e050a1104dbe43ba2f56b3d"} Dec 01 08:41:47 crc kubenswrapper[5004]: I1201 08:41:47.896971 5004 scope.go:117] "RemoveContainer" containerID="abb2a140cbc457456e229bbda184c16b30744ce9421ebda600e9f106c0c18db0" Dec 01 08:41:47 crc kubenswrapper[5004]: I1201 08:41:47.894716 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05a450a4-acc5-4c46-ae41-7ff4de029a72-config-data" (OuterVolumeSpecName: "config-data") pod "05a450a4-acc5-4c46-ae41-7ff4de029a72" (UID: "05a450a4-acc5-4c46-ae41-7ff4de029a72"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:41:47 crc kubenswrapper[5004]: W1201 08:41:47.896199 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6cda3888_d928_439e_9dfa_54e3535e4be9.slice/crio-1e824aafe6b218266be44356546689940a6ea769707fa852e39c7cdf6c7c5a62 WatchSource:0}: Error finding container 1e824aafe6b218266be44356546689940a6ea769707fa852e39c7cdf6c7c5a62: Status 404 returned error can't find the container with id 1e824aafe6b218266be44356546689940a6ea769707fa852e39c7cdf6c7c5a62 Dec 01 08:41:47 crc kubenswrapper[5004]: I1201 08:41:47.903417 5004 generic.go:334] "Generic (PLEG): container finished" podID="972d0223-e732-4bd0-99e3-dccb174a2511" containerID="87467b86401908b1737b6fcf86a15faf6b206ff876c1b0009a4062b3860d1475" exitCode=0 Dec 01 08:41:47 crc kubenswrapper[5004]: I1201 08:41:47.903493 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-566b457cd5-4qc77" event={"ID":"972d0223-e732-4bd0-99e3-dccb174a2511","Type":"ContainerDied","Data":"87467b86401908b1737b6fcf86a15faf6b206ff876c1b0009a4062b3860d1475"} Dec 01 08:41:47 crc kubenswrapper[5004]: I1201 08:41:47.903526 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-566b457cd5-4qc77" event={"ID":"972d0223-e732-4bd0-99e3-dccb174a2511","Type":"ContainerDied","Data":"3bad11a123a22b0f885f6627327db42ee0267efc915fc48564cee3612260787e"} Dec 01 08:41:47 crc kubenswrapper[5004]: I1201 08:41:47.903605 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-566b457cd5-4qc77" Dec 01 08:41:47 crc kubenswrapper[5004]: I1201 08:41:47.910601 5004 generic.go:334] "Generic (PLEG): container finished" podID="0e866b18-ec1e-4d54-acb5-89b374c7a9d5" containerID="4e6eda218947e40b974204e1d0a78454f1b19593eff3b0c00f53c4ccd42da6e9" exitCode=1 Dec 01 08:41:47 crc kubenswrapper[5004]: I1201 08:41:47.910653 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-7f549dcc6f-fhw4x" event={"ID":"0e866b18-ec1e-4d54-acb5-89b374c7a9d5","Type":"ContainerDied","Data":"4e6eda218947e40b974204e1d0a78454f1b19593eff3b0c00f53c4ccd42da6e9"} Dec 01 08:41:47 crc kubenswrapper[5004]: I1201 08:41:47.910674 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j9zgw\" (UniqueName: \"kubernetes.io/projected/05a450a4-acc5-4c46-ae41-7ff4de029a72-kube-api-access-j9zgw\") on node \"crc\" DevicePath \"\"" Dec 01 08:41:47 crc kubenswrapper[5004]: I1201 08:41:47.910701 5004 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/05a450a4-acc5-4c46-ae41-7ff4de029a72-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 01 08:41:47 crc kubenswrapper[5004]: I1201 08:41:47.910710 5004 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05a450a4-acc5-4c46-ae41-7ff4de029a72-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 08:41:47 crc kubenswrapper[5004]: I1201 08:41:47.910817 5004 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05a450a4-acc5-4c46-ae41-7ff4de029a72-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 08:41:47 crc kubenswrapper[5004]: I1201 08:41:47.911166 5004 scope.go:117] "RemoveContainer" containerID="4e6eda218947e40b974204e1d0a78454f1b19593eff3b0c00f53c4ccd42da6e9" Dec 01 08:41:47 crc kubenswrapper[5004]: I1201 08:41:47.912132 5004 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openstack/heat-api-77c6469896-8fmqx"] Dec 01 08:41:47 crc kubenswrapper[5004]: I1201 08:41:47.917290 5004 generic.go:334] "Generic (PLEG): container finished" podID="244abe13-3114-4206-9505-f3b0fdda447e" containerID="c055793885623df684ea4da9faf40557172cb5ae2ee67606f7b9dc7ce964e369" exitCode=1 Dec 01 08:41:47 crc kubenswrapper[5004]: I1201 08:41:47.918192 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-67bc95bb56-lkml4" event={"ID":"244abe13-3114-4206-9505-f3b0fdda447e","Type":"ContainerDied","Data":"c055793885623df684ea4da9faf40557172cb5ae2ee67606f7b9dc7ce964e369"} Dec 01 08:41:47 crc kubenswrapper[5004]: I1201 08:41:47.918997 5004 scope.go:117] "RemoveContainer" containerID="c055793885623df684ea4da9faf40557172cb5ae2ee67606f7b9dc7ce964e369" Dec 01 08:41:47 crc kubenswrapper[5004]: I1201 08:41:47.951213 5004 scope.go:117] "RemoveContainer" containerID="abb2a140cbc457456e229bbda184c16b30744ce9421ebda600e9f106c0c18db0" Dec 01 08:41:47 crc kubenswrapper[5004]: E1201 08:41:47.953121 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"abb2a140cbc457456e229bbda184c16b30744ce9421ebda600e9f106c0c18db0\": container with ID starting with abb2a140cbc457456e229bbda184c16b30744ce9421ebda600e9f106c0c18db0 not found: ID does not exist" containerID="abb2a140cbc457456e229bbda184c16b30744ce9421ebda600e9f106c0c18db0" Dec 01 08:41:47 crc kubenswrapper[5004]: I1201 08:41:47.953164 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abb2a140cbc457456e229bbda184c16b30744ce9421ebda600e9f106c0c18db0"} err="failed to get container status \"abb2a140cbc457456e229bbda184c16b30744ce9421ebda600e9f106c0c18db0\": rpc error: code = NotFound desc = could not find container \"abb2a140cbc457456e229bbda184c16b30744ce9421ebda600e9f106c0c18db0\": container with ID starting with 
abb2a140cbc457456e229bbda184c16b30744ce9421ebda600e9f106c0c18db0 not found: ID does not exist" Dec 01 08:41:47 crc kubenswrapper[5004]: I1201 08:41:47.953187 5004 scope.go:117] "RemoveContainer" containerID="87467b86401908b1737b6fcf86a15faf6b206ff876c1b0009a4062b3860d1475" Dec 01 08:41:47 crc kubenswrapper[5004]: I1201 08:41:47.968530 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-xclx6"] Dec 01 08:41:47 crc kubenswrapper[5004]: E1201 08:41:47.969912 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05a450a4-acc5-4c46-ae41-7ff4de029a72" containerName="heat-cfnapi" Dec 01 08:41:47 crc kubenswrapper[5004]: I1201 08:41:47.969934 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="05a450a4-acc5-4c46-ae41-7ff4de029a72" containerName="heat-cfnapi" Dec 01 08:41:47 crc kubenswrapper[5004]: E1201 08:41:47.969956 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="972d0223-e732-4bd0-99e3-dccb174a2511" containerName="heat-api" Dec 01 08:41:47 crc kubenswrapper[5004]: I1201 08:41:47.969962 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="972d0223-e732-4bd0-99e3-dccb174a2511" containerName="heat-api" Dec 01 08:41:47 crc kubenswrapper[5004]: I1201 08:41:47.970199 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="05a450a4-acc5-4c46-ae41-7ff4de029a72" containerName="heat-cfnapi" Dec 01 08:41:47 crc kubenswrapper[5004]: I1201 08:41:47.970215 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="972d0223-e732-4bd0-99e3-dccb174a2511" containerName="heat-api" Dec 01 08:41:47 crc kubenswrapper[5004]: I1201 08:41:47.972386 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-xclx6" Dec 01 08:41:47 crc kubenswrapper[5004]: I1201 08:41:47.988652 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-xclx6"] Dec 01 08:41:48 crc kubenswrapper[5004]: I1201 08:41:48.014749 5004 scope.go:117] "RemoveContainer" containerID="87467b86401908b1737b6fcf86a15faf6b206ff876c1b0009a4062b3860d1475" Dec 01 08:41:48 crc kubenswrapper[5004]: E1201 08:41:48.016237 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87467b86401908b1737b6fcf86a15faf6b206ff876c1b0009a4062b3860d1475\": container with ID starting with 87467b86401908b1737b6fcf86a15faf6b206ff876c1b0009a4062b3860d1475 not found: ID does not exist" containerID="87467b86401908b1737b6fcf86a15faf6b206ff876c1b0009a4062b3860d1475" Dec 01 08:41:48 crc kubenswrapper[5004]: I1201 08:41:48.016298 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87467b86401908b1737b6fcf86a15faf6b206ff876c1b0009a4062b3860d1475"} err="failed to get container status \"87467b86401908b1737b6fcf86a15faf6b206ff876c1b0009a4062b3860d1475\": rpc error: code = NotFound desc = could not find container \"87467b86401908b1737b6fcf86a15faf6b206ff876c1b0009a4062b3860d1475\": container with ID starting with 87467b86401908b1737b6fcf86a15faf6b206ff876c1b0009a4062b3860d1475 not found: ID does not exist" Dec 01 08:41:48 crc kubenswrapper[5004]: I1201 08:41:48.019421 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkrdx\" (UniqueName: \"kubernetes.io/projected/b50edddf-3daf-4bee-83da-8c44123a382f-kube-api-access-jkrdx\") pod \"nova-api-db-create-xclx6\" (UID: \"b50edddf-3daf-4bee-83da-8c44123a382f\") " pod="openstack/nova-api-db-create-xclx6" Dec 01 08:41:48 crc kubenswrapper[5004]: I1201 08:41:48.019736 5004 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b50edddf-3daf-4bee-83da-8c44123a382f-operator-scripts\") pod \"nova-api-db-create-xclx6\" (UID: \"b50edddf-3daf-4bee-83da-8c44123a382f\") " pod="openstack/nova-api-db-create-xclx6" Dec 01 08:41:48 crc kubenswrapper[5004]: I1201 08:41:48.028767 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-94kdj"] Dec 01 08:41:48 crc kubenswrapper[5004]: I1201 08:41:48.031444 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-94kdj" Dec 01 08:41:48 crc kubenswrapper[5004]: I1201 08:41:48.049650 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-94kdj"] Dec 01 08:41:48 crc kubenswrapper[5004]: I1201 08:41:48.099640 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-566b457cd5-4qc77"] Dec 01 08:41:48 crc kubenswrapper[5004]: I1201 08:41:48.119017 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-566b457cd5-4qc77"] Dec 01 08:41:48 crc kubenswrapper[5004]: I1201 08:41:48.121939 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jkrdx\" (UniqueName: \"kubernetes.io/projected/b50edddf-3daf-4bee-83da-8c44123a382f-kube-api-access-jkrdx\") pod \"nova-api-db-create-xclx6\" (UID: \"b50edddf-3daf-4bee-83da-8c44123a382f\") " pod="openstack/nova-api-db-create-xclx6" Dec 01 08:41:48 crc kubenswrapper[5004]: I1201 08:41:48.122039 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eede402f-9c2a-4dcf-9de9-33df959b5cd8-operator-scripts\") pod \"nova-cell0-db-create-94kdj\" (UID: \"eede402f-9c2a-4dcf-9de9-33df959b5cd8\") " pod="openstack/nova-cell0-db-create-94kdj" Dec 01 08:41:48 crc kubenswrapper[5004]: I1201 
08:41:48.122091 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z58h6\" (UniqueName: \"kubernetes.io/projected/eede402f-9c2a-4dcf-9de9-33df959b5cd8-kube-api-access-z58h6\") pod \"nova-cell0-db-create-94kdj\" (UID: \"eede402f-9c2a-4dcf-9de9-33df959b5cd8\") " pod="openstack/nova-cell0-db-create-94kdj" Dec 01 08:41:48 crc kubenswrapper[5004]: I1201 08:41:48.122113 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b50edddf-3daf-4bee-83da-8c44123a382f-operator-scripts\") pod \"nova-api-db-create-xclx6\" (UID: \"b50edddf-3daf-4bee-83da-8c44123a382f\") " pod="openstack/nova-api-db-create-xclx6" Dec 01 08:41:48 crc kubenswrapper[5004]: I1201 08:41:48.122864 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b50edddf-3daf-4bee-83da-8c44123a382f-operator-scripts\") pod \"nova-api-db-create-xclx6\" (UID: \"b50edddf-3daf-4bee-83da-8c44123a382f\") " pod="openstack/nova-api-db-create-xclx6" Dec 01 08:41:48 crc kubenswrapper[5004]: I1201 08:41:48.142198 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkrdx\" (UniqueName: \"kubernetes.io/projected/b50edddf-3daf-4bee-83da-8c44123a382f-kube-api-access-jkrdx\") pod \"nova-api-db-create-xclx6\" (UID: \"b50edddf-3daf-4bee-83da-8c44123a382f\") " pod="openstack/nova-api-db-create-xclx6" Dec 01 08:41:48 crc kubenswrapper[5004]: I1201 08:41:48.144413 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7d978555f9-fpvfz" Dec 01 08:41:48 crc kubenswrapper[5004]: I1201 08:41:48.170235 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-w9gjc"] Dec 01 08:41:48 crc kubenswrapper[5004]: I1201 08:41:48.171776 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-w9gjc" Dec 01 08:41:48 crc kubenswrapper[5004]: I1201 08:41:48.190250 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-12e3-account-create-update-2kzjl"] Dec 01 08:41:48 crc kubenswrapper[5004]: I1201 08:41:48.191866 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-12e3-account-create-update-2kzjl" Dec 01 08:41:48 crc kubenswrapper[5004]: I1201 08:41:48.199340 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Dec 01 08:41:48 crc kubenswrapper[5004]: I1201 08:41:48.218442 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-12e3-account-create-update-2kzjl"] Dec 01 08:41:48 crc kubenswrapper[5004]: I1201 08:41:48.224930 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e1b1e393-9cd1-4f89-8b3e-b02ee0d397bd-operator-scripts\") pod \"nova-cell1-db-create-w9gjc\" (UID: \"e1b1e393-9cd1-4f89-8b3e-b02ee0d397bd\") " pod="openstack/nova-cell1-db-create-w9gjc" Dec 01 08:41:48 crc kubenswrapper[5004]: I1201 08:41:48.225223 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eede402f-9c2a-4dcf-9de9-33df959b5cd8-operator-scripts\") pod \"nova-cell0-db-create-94kdj\" (UID: \"eede402f-9c2a-4dcf-9de9-33df959b5cd8\") " pod="openstack/nova-cell0-db-create-94kdj" Dec 01 08:41:48 crc kubenswrapper[5004]: I1201 08:41:48.225347 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4f01fa36-7d6e-48b6-a764-8aa381b5cf7a-operator-scripts\") pod \"nova-api-12e3-account-create-update-2kzjl\" (UID: \"4f01fa36-7d6e-48b6-a764-8aa381b5cf7a\") " 
pod="openstack/nova-api-12e3-account-create-update-2kzjl" Dec 01 08:41:48 crc kubenswrapper[5004]: I1201 08:41:48.225471 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z58h6\" (UniqueName: \"kubernetes.io/projected/eede402f-9c2a-4dcf-9de9-33df959b5cd8-kube-api-access-z58h6\") pod \"nova-cell0-db-create-94kdj\" (UID: \"eede402f-9c2a-4dcf-9de9-33df959b5cd8\") " pod="openstack/nova-cell0-db-create-94kdj" Dec 01 08:41:48 crc kubenswrapper[5004]: I1201 08:41:48.225628 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9gdg\" (UniqueName: \"kubernetes.io/projected/e1b1e393-9cd1-4f89-8b3e-b02ee0d397bd-kube-api-access-c9gdg\") pod \"nova-cell1-db-create-w9gjc\" (UID: \"e1b1e393-9cd1-4f89-8b3e-b02ee0d397bd\") " pod="openstack/nova-cell1-db-create-w9gjc" Dec 01 08:41:48 crc kubenswrapper[5004]: I1201 08:41:48.226295 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5dbs\" (UniqueName: \"kubernetes.io/projected/4f01fa36-7d6e-48b6-a764-8aa381b5cf7a-kube-api-access-c5dbs\") pod \"nova-api-12e3-account-create-update-2kzjl\" (UID: \"4f01fa36-7d6e-48b6-a764-8aa381b5cf7a\") " pod="openstack/nova-api-12e3-account-create-update-2kzjl" Dec 01 08:41:48 crc kubenswrapper[5004]: I1201 08:41:48.232745 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eede402f-9c2a-4dcf-9de9-33df959b5cd8-operator-scripts\") pod \"nova-cell0-db-create-94kdj\" (UID: \"eede402f-9c2a-4dcf-9de9-33df959b5cd8\") " pod="openstack/nova-cell0-db-create-94kdj" Dec 01 08:41:48 crc kubenswrapper[5004]: I1201 08:41:48.251846 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z58h6\" (UniqueName: \"kubernetes.io/projected/eede402f-9c2a-4dcf-9de9-33df959b5cd8-kube-api-access-z58h6\") pod 
\"nova-cell0-db-create-94kdj\" (UID: \"eede402f-9c2a-4dcf-9de9-33df959b5cd8\") " pod="openstack/nova-cell0-db-create-94kdj" Dec 01 08:41:48 crc kubenswrapper[5004]: I1201 08:41:48.257691 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-w9gjc"] Dec 01 08:41:48 crc kubenswrapper[5004]: I1201 08:41:48.290190 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-zqhkd"] Dec 01 08:41:48 crc kubenswrapper[5004]: I1201 08:41:48.290412 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6bb4fc677f-zqhkd" podUID="e56ced1f-e623-4c1a-8da6-944c91827cac" containerName="dnsmasq-dns" containerID="cri-o://821287477bfaf6931ed0af5b0546c9fd2cfc10b9a29881cd227e2471910721bb" gracePeriod=10 Dec 01 08:41:48 crc kubenswrapper[5004]: I1201 08:41:48.299087 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-xclx6" Dec 01 08:41:48 crc kubenswrapper[5004]: I1201 08:41:48.330049 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4f01fa36-7d6e-48b6-a764-8aa381b5cf7a-operator-scripts\") pod \"nova-api-12e3-account-create-update-2kzjl\" (UID: \"4f01fa36-7d6e-48b6-a764-8aa381b5cf7a\") " pod="openstack/nova-api-12e3-account-create-update-2kzjl" Dec 01 08:41:48 crc kubenswrapper[5004]: I1201 08:41:48.330159 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9gdg\" (UniqueName: \"kubernetes.io/projected/e1b1e393-9cd1-4f89-8b3e-b02ee0d397bd-kube-api-access-c9gdg\") pod \"nova-cell1-db-create-w9gjc\" (UID: \"e1b1e393-9cd1-4f89-8b3e-b02ee0d397bd\") " pod="openstack/nova-cell1-db-create-w9gjc" Dec 01 08:41:48 crc kubenswrapper[5004]: I1201 08:41:48.330223 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5dbs\" (UniqueName: 
\"kubernetes.io/projected/4f01fa36-7d6e-48b6-a764-8aa381b5cf7a-kube-api-access-c5dbs\") pod \"nova-api-12e3-account-create-update-2kzjl\" (UID: \"4f01fa36-7d6e-48b6-a764-8aa381b5cf7a\") " pod="openstack/nova-api-12e3-account-create-update-2kzjl" Dec 01 08:41:48 crc kubenswrapper[5004]: I1201 08:41:48.330255 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e1b1e393-9cd1-4f89-8b3e-b02ee0d397bd-operator-scripts\") pod \"nova-cell1-db-create-w9gjc\" (UID: \"e1b1e393-9cd1-4f89-8b3e-b02ee0d397bd\") " pod="openstack/nova-cell1-db-create-w9gjc" Dec 01 08:41:48 crc kubenswrapper[5004]: I1201 08:41:48.331080 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e1b1e393-9cd1-4f89-8b3e-b02ee0d397bd-operator-scripts\") pod \"nova-cell1-db-create-w9gjc\" (UID: \"e1b1e393-9cd1-4f89-8b3e-b02ee0d397bd\") " pod="openstack/nova-cell1-db-create-w9gjc" Dec 01 08:41:48 crc kubenswrapper[5004]: I1201 08:41:48.331126 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4f01fa36-7d6e-48b6-a764-8aa381b5cf7a-operator-scripts\") pod \"nova-api-12e3-account-create-update-2kzjl\" (UID: \"4f01fa36-7d6e-48b6-a764-8aa381b5cf7a\") " pod="openstack/nova-api-12e3-account-create-update-2kzjl" Dec 01 08:41:48 crc kubenswrapper[5004]: I1201 08:41:48.338667 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-ae05-account-create-update-fx8ms"] Dec 01 08:41:48 crc kubenswrapper[5004]: I1201 08:41:48.340387 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-ae05-account-create-update-fx8ms" Dec 01 08:41:48 crc kubenswrapper[5004]: I1201 08:41:48.349169 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Dec 01 08:41:48 crc kubenswrapper[5004]: I1201 08:41:48.356138 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9gdg\" (UniqueName: \"kubernetes.io/projected/e1b1e393-9cd1-4f89-8b3e-b02ee0d397bd-kube-api-access-c9gdg\") pod \"nova-cell1-db-create-w9gjc\" (UID: \"e1b1e393-9cd1-4f89-8b3e-b02ee0d397bd\") " pod="openstack/nova-cell1-db-create-w9gjc" Dec 01 08:41:48 crc kubenswrapper[5004]: I1201 08:41:48.358001 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-ae05-account-create-update-fx8ms"] Dec 01 08:41:48 crc kubenswrapper[5004]: I1201 08:41:48.386224 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5dbs\" (UniqueName: \"kubernetes.io/projected/4f01fa36-7d6e-48b6-a764-8aa381b5cf7a-kube-api-access-c5dbs\") pod \"nova-api-12e3-account-create-update-2kzjl\" (UID: \"4f01fa36-7d6e-48b6-a764-8aa381b5cf7a\") " pod="openstack/nova-api-12e3-account-create-update-2kzjl" Dec 01 08:41:48 crc kubenswrapper[5004]: I1201 08:41:48.386874 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-94kdj" Dec 01 08:41:48 crc kubenswrapper[5004]: I1201 08:41:48.432881 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7924d39b-830f-4f91-ae39-6b04c31d3f61-operator-scripts\") pod \"nova-cell0-ae05-account-create-update-fx8ms\" (UID: \"7924d39b-830f-4f91-ae39-6b04c31d3f61\") " pod="openstack/nova-cell0-ae05-account-create-update-fx8ms" Dec 01 08:41:48 crc kubenswrapper[5004]: I1201 08:41:48.433028 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrlgm\" (UniqueName: \"kubernetes.io/projected/7924d39b-830f-4f91-ae39-6b04c31d3f61-kube-api-access-qrlgm\") pod \"nova-cell0-ae05-account-create-update-fx8ms\" (UID: \"7924d39b-830f-4f91-ae39-6b04c31d3f61\") " pod="openstack/nova-cell0-ae05-account-create-update-fx8ms" Dec 01 08:41:48 crc kubenswrapper[5004]: I1201 08:41:48.494834 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-976b-account-create-update-8qblk"] Dec 01 08:41:48 crc kubenswrapper[5004]: I1201 08:41:48.496533 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-976b-account-create-update-8qblk" Dec 01 08:41:48 crc kubenswrapper[5004]: I1201 08:41:48.498479 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Dec 01 08:41:48 crc kubenswrapper[5004]: I1201 08:41:48.535799 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5135af8b-1b58-4816-ace1-424059c5267a-operator-scripts\") pod \"nova-cell1-976b-account-create-update-8qblk\" (UID: \"5135af8b-1b58-4816-ace1-424059c5267a\") " pod="openstack/nova-cell1-976b-account-create-update-8qblk" Dec 01 08:41:48 crc kubenswrapper[5004]: I1201 08:41:48.535931 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrlgm\" (UniqueName: \"kubernetes.io/projected/7924d39b-830f-4f91-ae39-6b04c31d3f61-kube-api-access-qrlgm\") pod \"nova-cell0-ae05-account-create-update-fx8ms\" (UID: \"7924d39b-830f-4f91-ae39-6b04c31d3f61\") " pod="openstack/nova-cell0-ae05-account-create-update-fx8ms" Dec 01 08:41:48 crc kubenswrapper[5004]: I1201 08:41:48.535959 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-976b-account-create-update-8qblk"] Dec 01 08:41:48 crc kubenswrapper[5004]: I1201 08:41:48.536028 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7c4c\" (UniqueName: \"kubernetes.io/projected/5135af8b-1b58-4816-ace1-424059c5267a-kube-api-access-c7c4c\") pod \"nova-cell1-976b-account-create-update-8qblk\" (UID: \"5135af8b-1b58-4816-ace1-424059c5267a\") " pod="openstack/nova-cell1-976b-account-create-update-8qblk" Dec 01 08:41:48 crc kubenswrapper[5004]: I1201 08:41:48.536189 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/7924d39b-830f-4f91-ae39-6b04c31d3f61-operator-scripts\") pod \"nova-cell0-ae05-account-create-update-fx8ms\" (UID: \"7924d39b-830f-4f91-ae39-6b04c31d3f61\") " pod="openstack/nova-cell0-ae05-account-create-update-fx8ms" Dec 01 08:41:48 crc kubenswrapper[5004]: I1201 08:41:48.541780 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7924d39b-830f-4f91-ae39-6b04c31d3f61-operator-scripts\") pod \"nova-cell0-ae05-account-create-update-fx8ms\" (UID: \"7924d39b-830f-4f91-ae39-6b04c31d3f61\") " pod="openstack/nova-cell0-ae05-account-create-update-fx8ms" Dec 01 08:41:48 crc kubenswrapper[5004]: I1201 08:41:48.587627 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrlgm\" (UniqueName: \"kubernetes.io/projected/7924d39b-830f-4f91-ae39-6b04c31d3f61-kube-api-access-qrlgm\") pod \"nova-cell0-ae05-account-create-update-fx8ms\" (UID: \"7924d39b-830f-4f91-ae39-6b04c31d3f61\") " pod="openstack/nova-cell0-ae05-account-create-update-fx8ms" Dec 01 08:41:48 crc kubenswrapper[5004]: I1201 08:41:48.637734 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c7c4c\" (UniqueName: \"kubernetes.io/projected/5135af8b-1b58-4816-ace1-424059c5267a-kube-api-access-c7c4c\") pod \"nova-cell1-976b-account-create-update-8qblk\" (UID: \"5135af8b-1b58-4816-ace1-424059c5267a\") " pod="openstack/nova-cell1-976b-account-create-update-8qblk" Dec 01 08:41:48 crc kubenswrapper[5004]: I1201 08:41:48.637836 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5135af8b-1b58-4816-ace1-424059c5267a-operator-scripts\") pod \"nova-cell1-976b-account-create-update-8qblk\" (UID: \"5135af8b-1b58-4816-ace1-424059c5267a\") " pod="openstack/nova-cell1-976b-account-create-update-8qblk" Dec 01 08:41:48 crc kubenswrapper[5004]: I1201 
08:41:48.638819 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5135af8b-1b58-4816-ace1-424059c5267a-operator-scripts\") pod \"nova-cell1-976b-account-create-update-8qblk\" (UID: \"5135af8b-1b58-4816-ace1-424059c5267a\") " pod="openstack/nova-cell1-976b-account-create-update-8qblk" Dec 01 08:41:48 crc kubenswrapper[5004]: I1201 08:41:48.655426 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7c4c\" (UniqueName: \"kubernetes.io/projected/5135af8b-1b58-4816-ace1-424059c5267a-kube-api-access-c7c4c\") pod \"nova-cell1-976b-account-create-update-8qblk\" (UID: \"5135af8b-1b58-4816-ace1-424059c5267a\") " pod="openstack/nova-cell1-976b-account-create-update-8qblk" Dec 01 08:41:48 crc kubenswrapper[5004]: I1201 08:41:48.712525 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-w9gjc" Dec 01 08:41:48 crc kubenswrapper[5004]: I1201 08:41:48.727707 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-12e3-account-create-update-2kzjl" Dec 01 08:41:48 crc kubenswrapper[5004]: I1201 08:41:48.742842 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-557748dddd-qnftd"] Dec 01 08:41:48 crc kubenswrapper[5004]: I1201 08:41:48.758089 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-557748dddd-qnftd"] Dec 01 08:41:48 crc kubenswrapper[5004]: I1201 08:41:48.779194 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05a450a4-acc5-4c46-ae41-7ff4de029a72" path="/var/lib/kubelet/pods/05a450a4-acc5-4c46-ae41-7ff4de029a72/volumes" Dec 01 08:41:48 crc kubenswrapper[5004]: I1201 08:41:48.779989 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="972d0223-e732-4bd0-99e3-dccb174a2511" path="/var/lib/kubelet/pods/972d0223-e732-4bd0-99e3-dccb174a2511/volumes" Dec 01 08:41:48 crc kubenswrapper[5004]: I1201 08:41:48.789532 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-ae05-account-create-update-fx8ms" Dec 01 08:41:48 crc kubenswrapper[5004]: I1201 08:41:48.807097 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-976b-account-create-update-8qblk" Dec 01 08:41:48 crc kubenswrapper[5004]: I1201 08:41:48.937330 5004 generic.go:334] "Generic (PLEG): container finished" podID="e56ced1f-e623-4c1a-8da6-944c91827cac" containerID="821287477bfaf6931ed0af5b0546c9fd2cfc10b9a29881cd227e2471910721bb" exitCode=0 Dec 01 08:41:48 crc kubenswrapper[5004]: I1201 08:41:48.937413 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-zqhkd" event={"ID":"e56ced1f-e623-4c1a-8da6-944c91827cac","Type":"ContainerDied","Data":"821287477bfaf6931ed0af5b0546c9fd2cfc10b9a29881cd227e2471910721bb"} Dec 01 08:41:48 crc kubenswrapper[5004]: I1201 08:41:48.983768 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-5b78cf6765-f2dsx" event={"ID":"a98e9ac8-7fc0-4778-823a-fb3d6b8e0e1a","Type":"ContainerStarted","Data":"a7674e31b785c47a0c5f335ab681e055f6c081ef07eb98adb40bb183d9b0987b"} Dec 01 08:41:48 crc kubenswrapper[5004]: I1201 08:41:48.985256 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-5b78cf6765-f2dsx" Dec 01 08:41:49 crc kubenswrapper[5004]: I1201 08:41:49.019956 5004 generic.go:334] "Generic (PLEG): container finished" podID="0e866b18-ec1e-4d54-acb5-89b374c7a9d5" containerID="11c240da9eb0d4278fdf2349444d2c6f8636a7477bbbb29db5b2bcdd9596e62f" exitCode=1 Dec 01 08:41:49 crc kubenswrapper[5004]: I1201 08:41:49.020223 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-7f549dcc6f-fhw4x" event={"ID":"0e866b18-ec1e-4d54-acb5-89b374c7a9d5","Type":"ContainerDied","Data":"11c240da9eb0d4278fdf2349444d2c6f8636a7477bbbb29db5b2bcdd9596e62f"} Dec 01 08:41:49 crc kubenswrapper[5004]: I1201 08:41:49.020257 5004 scope.go:117] "RemoveContainer" containerID="4e6eda218947e40b974204e1d0a78454f1b19593eff3b0c00f53c4ccd42da6e9" Dec 01 08:41:49 crc kubenswrapper[5004]: I1201 08:41:49.021037 5004 scope.go:117] "RemoveContainer" 
containerID="11c240da9eb0d4278fdf2349444d2c6f8636a7477bbbb29db5b2bcdd9596e62f" Dec 01 08:41:49 crc kubenswrapper[5004]: E1201 08:41:49.021285 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-7f549dcc6f-fhw4x_openstack(0e866b18-ec1e-4d54-acb5-89b374c7a9d5)\"" pod="openstack/heat-api-7f549dcc6f-fhw4x" podUID="0e866b18-ec1e-4d54-acb5-89b374c7a9d5" Dec 01 08:41:49 crc kubenswrapper[5004]: I1201 08:41:49.024489 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-5b78cf6765-f2dsx" podStartSLOduration=3.024470184 podStartE2EDuration="3.024470184s" podCreationTimestamp="2025-12-01 08:41:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:41:49.001206665 +0000 UTC m=+1486.566198647" watchObservedRunningTime="2025-12-01 08:41:49.024470184 +0000 UTC m=+1486.589462166" Dec 01 08:41:49 crc kubenswrapper[5004]: I1201 08:41:49.046981 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-77c6469896-8fmqx" event={"ID":"6cda3888-d928-439e-9dfa-54e3535e4be9","Type":"ContainerStarted","Data":"08449de0d68aad31145fdc1ede407b7192336817d4528c0aa1b7e6af9203d422"} Dec 01 08:41:49 crc kubenswrapper[5004]: I1201 08:41:49.047024 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-77c6469896-8fmqx" event={"ID":"6cda3888-d928-439e-9dfa-54e3535e4be9","Type":"ContainerStarted","Data":"1e824aafe6b218266be44356546689940a6ea769707fa852e39c7cdf6c7c5a62"} Dec 01 08:41:49 crc kubenswrapper[5004]: I1201 08:41:49.047219 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-77c6469896-8fmqx" Dec 01 08:41:49 crc kubenswrapper[5004]: I1201 08:41:49.135981 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/heat-cfnapi-67bc95bb56-lkml4" event={"ID":"244abe13-3114-4206-9505-f3b0fdda447e","Type":"ContainerStarted","Data":"a79f9e7e64a47dd07136cf47c21fcdd10ecfc38370970bd9af1f1ff18a1ab176"} Dec 01 08:41:49 crc kubenswrapper[5004]: I1201 08:41:49.136052 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-67bc95bb56-lkml4" Dec 01 08:41:49 crc kubenswrapper[5004]: I1201 08:41:49.161574 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-77c6469896-8fmqx" podStartSLOduration=3.161536912 podStartE2EDuration="3.161536912s" podCreationTimestamp="2025-12-01 08:41:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:41:49.135297801 +0000 UTC m=+1486.700289793" watchObservedRunningTime="2025-12-01 08:41:49.161536912 +0000 UTC m=+1486.726528884" Dec 01 08:41:49 crc kubenswrapper[5004]: I1201 08:41:49.235169 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-xclx6"] Dec 01 08:41:49 crc kubenswrapper[5004]: I1201 08:41:49.329555 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bb4fc677f-zqhkd" Dec 01 08:41:49 crc kubenswrapper[5004]: I1201 08:41:49.471969 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-94kdj"] Dec 01 08:41:49 crc kubenswrapper[5004]: I1201 08:41:49.485161 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e56ced1f-e623-4c1a-8da6-944c91827cac-ovsdbserver-sb\") pod \"e56ced1f-e623-4c1a-8da6-944c91827cac\" (UID: \"e56ced1f-e623-4c1a-8da6-944c91827cac\") " Dec 01 08:41:49 crc kubenswrapper[5004]: I1201 08:41:49.485419 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e56ced1f-e623-4c1a-8da6-944c91827cac-dns-svc\") pod \"e56ced1f-e623-4c1a-8da6-944c91827cac\" (UID: \"e56ced1f-e623-4c1a-8da6-944c91827cac\") " Dec 01 08:41:49 crc kubenswrapper[5004]: I1201 08:41:49.487838 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e56ced1f-e623-4c1a-8da6-944c91827cac-dns-swift-storage-0\") pod \"e56ced1f-e623-4c1a-8da6-944c91827cac\" (UID: \"e56ced1f-e623-4c1a-8da6-944c91827cac\") " Dec 01 08:41:49 crc kubenswrapper[5004]: I1201 08:41:49.487919 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e56ced1f-e623-4c1a-8da6-944c91827cac-config\") pod \"e56ced1f-e623-4c1a-8da6-944c91827cac\" (UID: \"e56ced1f-e623-4c1a-8da6-944c91827cac\") " Dec 01 08:41:49 crc kubenswrapper[5004]: I1201 08:41:49.488387 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e56ced1f-e623-4c1a-8da6-944c91827cac-ovsdbserver-nb\") pod \"e56ced1f-e623-4c1a-8da6-944c91827cac\" (UID: \"e56ced1f-e623-4c1a-8da6-944c91827cac\") " Dec 01 08:41:49 
crc kubenswrapper[5004]: I1201 08:41:49.488430 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fl5sp\" (UniqueName: \"kubernetes.io/projected/e56ced1f-e623-4c1a-8da6-944c91827cac-kube-api-access-fl5sp\") pod \"e56ced1f-e623-4c1a-8da6-944c91827cac\" (UID: \"e56ced1f-e623-4c1a-8da6-944c91827cac\") " Dec 01 08:41:49 crc kubenswrapper[5004]: I1201 08:41:49.514252 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e56ced1f-e623-4c1a-8da6-944c91827cac-kube-api-access-fl5sp" (OuterVolumeSpecName: "kube-api-access-fl5sp") pod "e56ced1f-e623-4c1a-8da6-944c91827cac" (UID: "e56ced1f-e623-4c1a-8da6-944c91827cac"). InnerVolumeSpecName "kube-api-access-fl5sp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:41:49 crc kubenswrapper[5004]: I1201 08:41:49.574218 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e56ced1f-e623-4c1a-8da6-944c91827cac-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e56ced1f-e623-4c1a-8da6-944c91827cac" (UID: "e56ced1f-e623-4c1a-8da6-944c91827cac"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:41:49 crc kubenswrapper[5004]: I1201 08:41:49.580811 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e56ced1f-e623-4c1a-8da6-944c91827cac-config" (OuterVolumeSpecName: "config") pod "e56ced1f-e623-4c1a-8da6-944c91827cac" (UID: "e56ced1f-e623-4c1a-8da6-944c91827cac"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:41:49 crc kubenswrapper[5004]: I1201 08:41:49.590767 5004 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e56ced1f-e623-4c1a-8da6-944c91827cac-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 01 08:41:49 crc kubenswrapper[5004]: I1201 08:41:49.590796 5004 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e56ced1f-e623-4c1a-8da6-944c91827cac-config\") on node \"crc\" DevicePath \"\"" Dec 01 08:41:49 crc kubenswrapper[5004]: I1201 08:41:49.590807 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fl5sp\" (UniqueName: \"kubernetes.io/projected/e56ced1f-e623-4c1a-8da6-944c91827cac-kube-api-access-fl5sp\") on node \"crc\" DevicePath \"\"" Dec 01 08:41:49 crc kubenswrapper[5004]: I1201 08:41:49.592451 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e56ced1f-e623-4c1a-8da6-944c91827cac-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e56ced1f-e623-4c1a-8da6-944c91827cac" (UID: "e56ced1f-e623-4c1a-8da6-944c91827cac"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:41:49 crc kubenswrapper[5004]: I1201 08:41:49.597195 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e56ced1f-e623-4c1a-8da6-944c91827cac-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e56ced1f-e623-4c1a-8da6-944c91827cac" (UID: "e56ced1f-e623-4c1a-8da6-944c91827cac"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:41:49 crc kubenswrapper[5004]: I1201 08:41:49.623233 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e56ced1f-e623-4c1a-8da6-944c91827cac-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "e56ced1f-e623-4c1a-8da6-944c91827cac" (UID: "e56ced1f-e623-4c1a-8da6-944c91827cac"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:41:49 crc kubenswrapper[5004]: I1201 08:41:49.672122 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-w9gjc"] Dec 01 08:41:49 crc kubenswrapper[5004]: I1201 08:41:49.693148 5004 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e56ced1f-e623-4c1a-8da6-944c91827cac-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 08:41:49 crc kubenswrapper[5004]: I1201 08:41:49.693176 5004 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e56ced1f-e623-4c1a-8da6-944c91827cac-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 01 08:41:49 crc kubenswrapper[5004]: I1201 08:41:49.693187 5004 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e56ced1f-e623-4c1a-8da6-944c91827cac-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 01 08:41:50 crc kubenswrapper[5004]: I1201 08:41:50.053004 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-12e3-account-create-update-2kzjl"] Dec 01 08:41:50 crc kubenswrapper[5004]: I1201 08:41:50.067377 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-ae05-account-create-update-fx8ms"] Dec 01 08:41:50 crc kubenswrapper[5004]: I1201 08:41:50.079568 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/nova-cell1-976b-account-create-update-8qblk"] Dec 01 08:41:50 crc kubenswrapper[5004]: W1201 08:41:50.100354 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4f01fa36_7d6e_48b6_a764_8aa381b5cf7a.slice/crio-05ac3201c9bfbaae50ff8f98bdc363f2b351ef44f12bed5d618dcfb103c87b80 WatchSource:0}: Error finding container 05ac3201c9bfbaae50ff8f98bdc363f2b351ef44f12bed5d618dcfb103c87b80: Status 404 returned error can't find the container with id 05ac3201c9bfbaae50ff8f98bdc363f2b351ef44f12bed5d618dcfb103c87b80 Dec 01 08:41:50 crc kubenswrapper[5004]: I1201 08:41:50.104087 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 01 08:41:50 crc kubenswrapper[5004]: I1201 08:41:50.104285 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="fa6628b9-be2f-4594-8767-5442e1d2f5b9" containerName="glance-log" containerID="cri-o://8e0d24717f6b77a9ff93f781693f1a7b385497e8d77549962f407e2ec0a74009" gracePeriod=30 Dec 01 08:41:50 crc kubenswrapper[5004]: I1201 08:41:50.104506 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="fa6628b9-be2f-4594-8767-5442e1d2f5b9" containerName="glance-httpd" containerID="cri-o://873b1fd641aba5c655ee027bc8a60117804b809c92280ec25c80829f55b1ab65" gracePeriod=30 Dec 01 08:41:50 crc kubenswrapper[5004]: W1201 08:41:50.127741 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5135af8b_1b58_4816_ace1_424059c5267a.slice/crio-622c5b3b58609975928ff9a838bfc925cc2e3c3b40d1f4ac21ae7910510dadf0 WatchSource:0}: Error finding container 622c5b3b58609975928ff9a838bfc925cc2e3c3b40d1f4ac21ae7910510dadf0: Status 404 returned error can't find the container with id 
622c5b3b58609975928ff9a838bfc925cc2e3c3b40d1f4ac21ae7910510dadf0 Dec 01 08:41:50 crc kubenswrapper[5004]: W1201 08:41:50.128470 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7924d39b_830f_4f91_ae39_6b04c31d3f61.slice/crio-c2026c5565741b6a599164a8219ccae4b981fcce2de6a6c3642fe2f74da2695c WatchSource:0}: Error finding container c2026c5565741b6a599164a8219ccae4b981fcce2de6a6c3642fe2f74da2695c: Status 404 returned error can't find the container with id c2026c5565741b6a599164a8219ccae4b981fcce2de6a6c3642fe2f74da2695c Dec 01 08:41:50 crc kubenswrapper[5004]: I1201 08:41:50.143715 5004 generic.go:334] "Generic (PLEG): container finished" podID="b50edddf-3daf-4bee-83da-8c44123a382f" containerID="b6b9ec3f129d85014f5e0cce074128a269bd4922e99b5e8fe4e00c1b8ca34694" exitCode=0 Dec 01 08:41:50 crc kubenswrapper[5004]: I1201 08:41:50.143776 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-xclx6" event={"ID":"b50edddf-3daf-4bee-83da-8c44123a382f","Type":"ContainerDied","Data":"b6b9ec3f129d85014f5e0cce074128a269bd4922e99b5e8fe4e00c1b8ca34694"} Dec 01 08:41:50 crc kubenswrapper[5004]: I1201 08:41:50.143800 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-xclx6" event={"ID":"b50edddf-3daf-4bee-83da-8c44123a382f","Type":"ContainerStarted","Data":"273c66ba75071a7405e0ac1cd87d86958720e15e8ead6253ced87c1d4795d34f"} Dec 01 08:41:50 crc kubenswrapper[5004]: I1201 08:41:50.145386 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-zqhkd" event={"ID":"e56ced1f-e623-4c1a-8da6-944c91827cac","Type":"ContainerDied","Data":"837d474be604f73d574bf8d5be6a68205ec3c8f7bea373704b39d6d2cfa37edc"} Dec 01 08:41:50 crc kubenswrapper[5004]: I1201 08:41:50.145419 5004 scope.go:117] "RemoveContainer" containerID="821287477bfaf6931ed0af5b0546c9fd2cfc10b9a29881cd227e2471910721bb" Dec 01 08:41:50 crc 
kubenswrapper[5004]: I1201 08:41:50.145494 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bb4fc677f-zqhkd" Dec 01 08:41:50 crc kubenswrapper[5004]: I1201 08:41:50.158165 5004 generic.go:334] "Generic (PLEG): container finished" podID="eede402f-9c2a-4dcf-9de9-33df959b5cd8" containerID="0884f39c38ab64c5732becfbe9732e9cafa3905f930cb554f1f9eff46823ef94" exitCode=0 Dec 01 08:41:50 crc kubenswrapper[5004]: I1201 08:41:50.158249 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-94kdj" event={"ID":"eede402f-9c2a-4dcf-9de9-33df959b5cd8","Type":"ContainerDied","Data":"0884f39c38ab64c5732becfbe9732e9cafa3905f930cb554f1f9eff46823ef94"} Dec 01 08:41:50 crc kubenswrapper[5004]: I1201 08:41:50.158277 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-94kdj" event={"ID":"eede402f-9c2a-4dcf-9de9-33df959b5cd8","Type":"ContainerStarted","Data":"dde661c76b643309bd5304eb02da31c6ffef51a780ae1bfc98f935ce637ee0eb"} Dec 01 08:41:50 crc kubenswrapper[5004]: I1201 08:41:50.168790 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-12e3-account-create-update-2kzjl" event={"ID":"4f01fa36-7d6e-48b6-a764-8aa381b5cf7a","Type":"ContainerStarted","Data":"05ac3201c9bfbaae50ff8f98bdc363f2b351ef44f12bed5d618dcfb103c87b80"} Dec 01 08:41:50 crc kubenswrapper[5004]: I1201 08:41:50.186431 5004 scope.go:117] "RemoveContainer" containerID="11c240da9eb0d4278fdf2349444d2c6f8636a7477bbbb29db5b2bcdd9596e62f" Dec 01 08:41:50 crc kubenswrapper[5004]: E1201 08:41:50.186821 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-7f549dcc6f-fhw4x_openstack(0e866b18-ec1e-4d54-acb5-89b374c7a9d5)\"" pod="openstack/heat-api-7f549dcc6f-fhw4x" podUID="0e866b18-ec1e-4d54-acb5-89b374c7a9d5" Dec 01 08:41:50 crc 
kubenswrapper[5004]: I1201 08:41:50.200419 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-976b-account-create-update-8qblk" event={"ID":"5135af8b-1b58-4816-ace1-424059c5267a","Type":"ContainerStarted","Data":"622c5b3b58609975928ff9a838bfc925cc2e3c3b40d1f4ac21ae7910510dadf0"} Dec 01 08:41:50 crc kubenswrapper[5004]: I1201 08:41:50.231131 5004 generic.go:334] "Generic (PLEG): container finished" podID="244abe13-3114-4206-9505-f3b0fdda447e" containerID="a79f9e7e64a47dd07136cf47c21fcdd10ecfc38370970bd9af1f1ff18a1ab176" exitCode=1 Dec 01 08:41:50 crc kubenswrapper[5004]: I1201 08:41:50.231219 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-67bc95bb56-lkml4" event={"ID":"244abe13-3114-4206-9505-f3b0fdda447e","Type":"ContainerDied","Data":"a79f9e7e64a47dd07136cf47c21fcdd10ecfc38370970bd9af1f1ff18a1ab176"} Dec 01 08:41:50 crc kubenswrapper[5004]: I1201 08:41:50.232062 5004 scope.go:117] "RemoveContainer" containerID="a79f9e7e64a47dd07136cf47c21fcdd10ecfc38370970bd9af1f1ff18a1ab176" Dec 01 08:41:50 crc kubenswrapper[5004]: E1201 08:41:50.232333 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-67bc95bb56-lkml4_openstack(244abe13-3114-4206-9505-f3b0fdda447e)\"" pod="openstack/heat-cfnapi-67bc95bb56-lkml4" podUID="244abe13-3114-4206-9505-f3b0fdda447e" Dec 01 08:41:50 crc kubenswrapper[5004]: I1201 08:41:50.232846 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-7f549dcc6f-fhw4x" Dec 01 08:41:50 crc kubenswrapper[5004]: I1201 08:41:50.232883 5004 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-api-7f549dcc6f-fhw4x" Dec 01 08:41:50 crc kubenswrapper[5004]: I1201 08:41:50.237274 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell0-ae05-account-create-update-fx8ms" event={"ID":"7924d39b-830f-4f91-ae39-6b04c31d3f61","Type":"ContainerStarted","Data":"c2026c5565741b6a599164a8219ccae4b981fcce2de6a6c3642fe2f74da2695c"} Dec 01 08:41:50 crc kubenswrapper[5004]: I1201 08:41:50.240047 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-w9gjc" event={"ID":"e1b1e393-9cd1-4f89-8b3e-b02ee0d397bd","Type":"ContainerStarted","Data":"e60e90156b1e6887f3e8343b122b6c71bd9eb9d3a25ce4ddb55b7de2aa02ce1c"} Dec 01 08:41:50 crc kubenswrapper[5004]: I1201 08:41:50.240074 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-w9gjc" event={"ID":"e1b1e393-9cd1-4f89-8b3e-b02ee0d397bd","Type":"ContainerStarted","Data":"04c44d9befc693c6aa29a233905b5fb6d3a69b95b5690036a0780707c3bb058d"} Dec 01 08:41:50 crc kubenswrapper[5004]: I1201 08:41:50.252804 5004 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-cfnapi-67bc95bb56-lkml4" Dec 01 08:41:50 crc kubenswrapper[5004]: I1201 08:41:50.326283 5004 scope.go:117] "RemoveContainer" containerID="002293e044aefc19d7b90b6c52c2446535cf25dc6a789457717e49c13553c0ad" Dec 01 08:41:50 crc kubenswrapper[5004]: I1201 08:41:50.356649 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-zqhkd"] Dec 01 08:41:50 crc kubenswrapper[5004]: I1201 08:41:50.366678 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-zqhkd"] Dec 01 08:41:50 crc kubenswrapper[5004]: I1201 08:41:50.385697 5004 scope.go:117] "RemoveContainer" containerID="c055793885623df684ea4da9faf40557172cb5ae2ee67606f7b9dc7ce964e369" Dec 01 08:41:50 crc kubenswrapper[5004]: I1201 08:41:50.771620 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e56ced1f-e623-4c1a-8da6-944c91827cac" path="/var/lib/kubelet/pods/e56ced1f-e623-4c1a-8da6-944c91827cac/volumes" Dec 01 08:41:51 crc kubenswrapper[5004]: 
E1201 08:41:51.148170 5004 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4f01fa36_7d6e_48b6_a764_8aa381b5cf7a.slice/crio-4fffe034b0d93a9159bf6c2b3f4a5ff39245208968004a8dd1c2f266ff170e85.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4f01fa36_7d6e_48b6_a764_8aa381b5cf7a.slice/crio-conmon-4fffe034b0d93a9159bf6c2b3f4a5ff39245208968004a8dd1c2f266ff170e85.scope\": RecentStats: unable to find data in memory cache]" Dec 01 08:41:51 crc kubenswrapper[5004]: I1201 08:41:51.251285 5004 generic.go:334] "Generic (PLEG): container finished" podID="7924d39b-830f-4f91-ae39-6b04c31d3f61" containerID="aba2e67a068c132261944ce48d830b7e582b9569e72ad46e903c39e66dac9188" exitCode=0 Dec 01 08:41:51 crc kubenswrapper[5004]: I1201 08:41:51.251344 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-ae05-account-create-update-fx8ms" event={"ID":"7924d39b-830f-4f91-ae39-6b04c31d3f61","Type":"ContainerDied","Data":"aba2e67a068c132261944ce48d830b7e582b9569e72ad46e903c39e66dac9188"} Dec 01 08:41:51 crc kubenswrapper[5004]: I1201 08:41:51.255980 5004 generic.go:334] "Generic (PLEG): container finished" podID="fa6628b9-be2f-4594-8767-5442e1d2f5b9" containerID="8e0d24717f6b77a9ff93f781693f1a7b385497e8d77549962f407e2ec0a74009" exitCode=143 Dec 01 08:41:51 crc kubenswrapper[5004]: I1201 08:41:51.256119 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fa6628b9-be2f-4594-8767-5442e1d2f5b9","Type":"ContainerDied","Data":"8e0d24717f6b77a9ff93f781693f1a7b385497e8d77549962f407e2ec0a74009"} Dec 01 08:41:51 crc kubenswrapper[5004]: I1201 08:41:51.257584 5004 generic.go:334] "Generic (PLEG): container finished" podID="4f01fa36-7d6e-48b6-a764-8aa381b5cf7a" 
containerID="4fffe034b0d93a9159bf6c2b3f4a5ff39245208968004a8dd1c2f266ff170e85" exitCode=0 Dec 01 08:41:51 crc kubenswrapper[5004]: I1201 08:41:51.257682 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-12e3-account-create-update-2kzjl" event={"ID":"4f01fa36-7d6e-48b6-a764-8aa381b5cf7a","Type":"ContainerDied","Data":"4fffe034b0d93a9159bf6c2b3f4a5ff39245208968004a8dd1c2f266ff170e85"} Dec 01 08:41:51 crc kubenswrapper[5004]: I1201 08:41:51.259147 5004 generic.go:334] "Generic (PLEG): container finished" podID="e1b1e393-9cd1-4f89-8b3e-b02ee0d397bd" containerID="e60e90156b1e6887f3e8343b122b6c71bd9eb9d3a25ce4ddb55b7de2aa02ce1c" exitCode=0 Dec 01 08:41:51 crc kubenswrapper[5004]: I1201 08:41:51.259258 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-w9gjc" event={"ID":"e1b1e393-9cd1-4f89-8b3e-b02ee0d397bd","Type":"ContainerDied","Data":"e60e90156b1e6887f3e8343b122b6c71bd9eb9d3a25ce4ddb55b7de2aa02ce1c"} Dec 01 08:41:51 crc kubenswrapper[5004]: I1201 08:41:51.261975 5004 generic.go:334] "Generic (PLEG): container finished" podID="5135af8b-1b58-4816-ace1-424059c5267a" containerID="7d48562efbf85d7916e4e50b6ae5391b54d57c9b3e1a0991748ecbc5e1047ec7" exitCode=0 Dec 01 08:41:51 crc kubenswrapper[5004]: I1201 08:41:51.262027 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-976b-account-create-update-8qblk" event={"ID":"5135af8b-1b58-4816-ace1-424059c5267a","Type":"ContainerDied","Data":"7d48562efbf85d7916e4e50b6ae5391b54d57c9b3e1a0991748ecbc5e1047ec7"} Dec 01 08:41:51 crc kubenswrapper[5004]: I1201 08:41:51.266004 5004 scope.go:117] "RemoveContainer" containerID="a79f9e7e64a47dd07136cf47c21fcdd10ecfc38370970bd9af1f1ff18a1ab176" Dec 01 08:41:51 crc kubenswrapper[5004]: E1201 08:41:51.266305 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi 
pod=heat-cfnapi-67bc95bb56-lkml4_openstack(244abe13-3114-4206-9505-f3b0fdda447e)\"" pod="openstack/heat-cfnapi-67bc95bb56-lkml4" podUID="244abe13-3114-4206-9505-f3b0fdda447e" Dec 01 08:41:51 crc kubenswrapper[5004]: I1201 08:41:51.270777 5004 scope.go:117] "RemoveContainer" containerID="11c240da9eb0d4278fdf2349444d2c6f8636a7477bbbb29db5b2bcdd9596e62f" Dec 01 08:41:51 crc kubenswrapper[5004]: E1201 08:41:51.272948 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-7f549dcc6f-fhw4x_openstack(0e866b18-ec1e-4d54-acb5-89b374c7a9d5)\"" pod="openstack/heat-api-7f549dcc6f-fhw4x" podUID="0e866b18-ec1e-4d54-acb5-89b374c7a9d5" Dec 01 08:41:51 crc kubenswrapper[5004]: I1201 08:41:51.884805 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-94kdj" Dec 01 08:41:51 crc kubenswrapper[5004]: I1201 08:41:51.956683 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eede402f-9c2a-4dcf-9de9-33df959b5cd8-operator-scripts\") pod \"eede402f-9c2a-4dcf-9de9-33df959b5cd8\" (UID: \"eede402f-9c2a-4dcf-9de9-33df959b5cd8\") " Dec 01 08:41:51 crc kubenswrapper[5004]: I1201 08:41:51.956745 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z58h6\" (UniqueName: \"kubernetes.io/projected/eede402f-9c2a-4dcf-9de9-33df959b5cd8-kube-api-access-z58h6\") pod \"eede402f-9c2a-4dcf-9de9-33df959b5cd8\" (UID: \"eede402f-9c2a-4dcf-9de9-33df959b5cd8\") " Dec 01 08:41:51 crc kubenswrapper[5004]: I1201 08:41:51.960402 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eede402f-9c2a-4dcf-9de9-33df959b5cd8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "eede402f-9c2a-4dcf-9de9-33df959b5cd8" (UID: 
"eede402f-9c2a-4dcf-9de9-33df959b5cd8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:41:51 crc kubenswrapper[5004]: I1201 08:41:51.963948 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eede402f-9c2a-4dcf-9de9-33df959b5cd8-kube-api-access-z58h6" (OuterVolumeSpecName: "kube-api-access-z58h6") pod "eede402f-9c2a-4dcf-9de9-33df959b5cd8" (UID: "eede402f-9c2a-4dcf-9de9-33df959b5cd8"). InnerVolumeSpecName "kube-api-access-z58h6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:41:52 crc kubenswrapper[5004]: I1201 08:41:52.059217 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z58h6\" (UniqueName: \"kubernetes.io/projected/eede402f-9c2a-4dcf-9de9-33df959b5cd8-kube-api-access-z58h6\") on node \"crc\" DevicePath \"\"" Dec 01 08:41:52 crc kubenswrapper[5004]: I1201 08:41:52.059523 5004 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eede402f-9c2a-4dcf-9de9-33df959b5cd8-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 08:41:52 crc kubenswrapper[5004]: I1201 08:41:52.069025 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-xclx6" Dec 01 08:41:52 crc kubenswrapper[5004]: I1201 08:41:52.078727 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-w9gjc" Dec 01 08:41:52 crc kubenswrapper[5004]: I1201 08:41:52.160498 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e1b1e393-9cd1-4f89-8b3e-b02ee0d397bd-operator-scripts\") pod \"e1b1e393-9cd1-4f89-8b3e-b02ee0d397bd\" (UID: \"e1b1e393-9cd1-4f89-8b3e-b02ee0d397bd\") " Dec 01 08:41:52 crc kubenswrapper[5004]: I1201 08:41:52.160549 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkrdx\" (UniqueName: \"kubernetes.io/projected/b50edddf-3daf-4bee-83da-8c44123a382f-kube-api-access-jkrdx\") pod \"b50edddf-3daf-4bee-83da-8c44123a382f\" (UID: \"b50edddf-3daf-4bee-83da-8c44123a382f\") " Dec 01 08:41:52 crc kubenswrapper[5004]: I1201 08:41:52.160898 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b50edddf-3daf-4bee-83da-8c44123a382f-operator-scripts\") pod \"b50edddf-3daf-4bee-83da-8c44123a382f\" (UID: \"b50edddf-3daf-4bee-83da-8c44123a382f\") " Dec 01 08:41:52 crc kubenswrapper[5004]: I1201 08:41:52.160992 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c9gdg\" (UniqueName: \"kubernetes.io/projected/e1b1e393-9cd1-4f89-8b3e-b02ee0d397bd-kube-api-access-c9gdg\") pod \"e1b1e393-9cd1-4f89-8b3e-b02ee0d397bd\" (UID: \"e1b1e393-9cd1-4f89-8b3e-b02ee0d397bd\") " Dec 01 08:41:52 crc kubenswrapper[5004]: I1201 08:41:52.160992 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1b1e393-9cd1-4f89-8b3e-b02ee0d397bd-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e1b1e393-9cd1-4f89-8b3e-b02ee0d397bd" (UID: "e1b1e393-9cd1-4f89-8b3e-b02ee0d397bd"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:41:52 crc kubenswrapper[5004]: I1201 08:41:52.161326 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b50edddf-3daf-4bee-83da-8c44123a382f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b50edddf-3daf-4bee-83da-8c44123a382f" (UID: "b50edddf-3daf-4bee-83da-8c44123a382f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:41:52 crc kubenswrapper[5004]: I1201 08:41:52.161557 5004 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b50edddf-3daf-4bee-83da-8c44123a382f-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 08:41:52 crc kubenswrapper[5004]: I1201 08:41:52.161584 5004 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e1b1e393-9cd1-4f89-8b3e-b02ee0d397bd-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 08:41:52 crc kubenswrapper[5004]: I1201 08:41:52.165417 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1b1e393-9cd1-4f89-8b3e-b02ee0d397bd-kube-api-access-c9gdg" (OuterVolumeSpecName: "kube-api-access-c9gdg") pod "e1b1e393-9cd1-4f89-8b3e-b02ee0d397bd" (UID: "e1b1e393-9cd1-4f89-8b3e-b02ee0d397bd"). InnerVolumeSpecName "kube-api-access-c9gdg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:41:52 crc kubenswrapper[5004]: I1201 08:41:52.165484 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b50edddf-3daf-4bee-83da-8c44123a382f-kube-api-access-jkrdx" (OuterVolumeSpecName: "kube-api-access-jkrdx") pod "b50edddf-3daf-4bee-83da-8c44123a382f" (UID: "b50edddf-3daf-4bee-83da-8c44123a382f"). InnerVolumeSpecName "kube-api-access-jkrdx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:41:52 crc kubenswrapper[5004]: I1201 08:41:52.270613 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 01 08:41:52 crc kubenswrapper[5004]: I1201 08:41:52.270910 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="64235e39-5760-4a2e-a164-7cb27ca906a3" containerName="glance-log" containerID="cri-o://a4de37a2a551c10a01fbe2aa77cf24579a8bbc8756ad332d58f177ae88a8f0c4" gracePeriod=30 Dec 01 08:41:52 crc kubenswrapper[5004]: I1201 08:41:52.271022 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="64235e39-5760-4a2e-a164-7cb27ca906a3" containerName="glance-httpd" containerID="cri-o://78ca9413ec0076671f4c8abfe49abffd1640b7fdb572e7e322984516b6e3783c" gracePeriod=30 Dec 01 08:41:52 crc kubenswrapper[5004]: I1201 08:41:52.277358 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c9gdg\" (UniqueName: \"kubernetes.io/projected/e1b1e393-9cd1-4f89-8b3e-b02ee0d397bd-kube-api-access-c9gdg\") on node \"crc\" DevicePath \"\"" Dec 01 08:41:52 crc kubenswrapper[5004]: I1201 08:41:52.277390 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkrdx\" (UniqueName: \"kubernetes.io/projected/b50edddf-3daf-4bee-83da-8c44123a382f-kube-api-access-jkrdx\") on node \"crc\" DevicePath \"\"" Dec 01 08:41:52 crc kubenswrapper[5004]: I1201 08:41:52.300544 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-94kdj" event={"ID":"eede402f-9c2a-4dcf-9de9-33df959b5cd8","Type":"ContainerDied","Data":"dde661c76b643309bd5304eb02da31c6ffef51a780ae1bfc98f935ce637ee0eb"} Dec 01 08:41:52 crc kubenswrapper[5004]: I1201 08:41:52.300616 5004 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="dde661c76b643309bd5304eb02da31c6ffef51a780ae1bfc98f935ce637ee0eb" Dec 01 08:41:52 crc kubenswrapper[5004]: I1201 08:41:52.300579 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-94kdj" Dec 01 08:41:52 crc kubenswrapper[5004]: I1201 08:41:52.301829 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-w9gjc" event={"ID":"e1b1e393-9cd1-4f89-8b3e-b02ee0d397bd","Type":"ContainerDied","Data":"04c44d9befc693c6aa29a233905b5fb6d3a69b95b5690036a0780707c3bb058d"} Dec 01 08:41:52 crc kubenswrapper[5004]: I1201 08:41:52.301856 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="04c44d9befc693c6aa29a233905b5fb6d3a69b95b5690036a0780707c3bb058d" Dec 01 08:41:52 crc kubenswrapper[5004]: I1201 08:41:52.301971 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-w9gjc" Dec 01 08:41:52 crc kubenswrapper[5004]: I1201 08:41:52.306998 5004 generic.go:334] "Generic (PLEG): container finished" podID="39fa01cf-5ba6-4a0a-8240-2b2eba0decf6" containerID="f85fe523c36185516b7a492fd599b5d479b15257da6a82e8a48292efd8203240" exitCode=0 Dec 01 08:41:52 crc kubenswrapper[5004]: I1201 08:41:52.307062 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"39fa01cf-5ba6-4a0a-8240-2b2eba0decf6","Type":"ContainerDied","Data":"f85fe523c36185516b7a492fd599b5d479b15257da6a82e8a48292efd8203240"} Dec 01 08:41:52 crc kubenswrapper[5004]: I1201 08:41:52.311788 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-xclx6" Dec 01 08:41:52 crc kubenswrapper[5004]: I1201 08:41:52.312058 5004 scope.go:117] "RemoveContainer" containerID="11c240da9eb0d4278fdf2349444d2c6f8636a7477bbbb29db5b2bcdd9596e62f" Dec 01 08:41:52 crc kubenswrapper[5004]: E1201 08:41:52.312488 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-7f549dcc6f-fhw4x_openstack(0e866b18-ec1e-4d54-acb5-89b374c7a9d5)\"" pod="openstack/heat-api-7f549dcc6f-fhw4x" podUID="0e866b18-ec1e-4d54-acb5-89b374c7a9d5" Dec 01 08:41:52 crc kubenswrapper[5004]: I1201 08:41:52.313404 5004 scope.go:117] "RemoveContainer" containerID="a79f9e7e64a47dd07136cf47c21fcdd10ecfc38370970bd9af1f1ff18a1ab176" Dec 01 08:41:52 crc kubenswrapper[5004]: E1201 08:41:52.313649 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-67bc95bb56-lkml4_openstack(244abe13-3114-4206-9505-f3b0fdda447e)\"" pod="openstack/heat-cfnapi-67bc95bb56-lkml4" podUID="244abe13-3114-4206-9505-f3b0fdda447e" Dec 01 08:41:52 crc kubenswrapper[5004]: I1201 08:41:52.314718 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-xclx6" event={"ID":"b50edddf-3daf-4bee-83da-8c44123a382f","Type":"ContainerDied","Data":"273c66ba75071a7405e0ac1cd87d86958720e15e8ead6253ced87c1d4795d34f"} Dec 01 08:41:52 crc kubenswrapper[5004]: I1201 08:41:52.314754 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="273c66ba75071a7405e0ac1cd87d86958720e15e8ead6253ced87c1d4795d34f" Dec 01 08:41:52 crc kubenswrapper[5004]: I1201 08:41:52.405207 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 01 08:41:52 crc kubenswrapper[5004]: I1201 08:41:52.480333 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/39fa01cf-5ba6-4a0a-8240-2b2eba0decf6-run-httpd\") pod \"39fa01cf-5ba6-4a0a-8240-2b2eba0decf6\" (UID: \"39fa01cf-5ba6-4a0a-8240-2b2eba0decf6\") " Dec 01 08:41:52 crc kubenswrapper[5004]: I1201 08:41:52.480454 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2cbpw\" (UniqueName: \"kubernetes.io/projected/39fa01cf-5ba6-4a0a-8240-2b2eba0decf6-kube-api-access-2cbpw\") pod \"39fa01cf-5ba6-4a0a-8240-2b2eba0decf6\" (UID: \"39fa01cf-5ba6-4a0a-8240-2b2eba0decf6\") " Dec 01 08:41:52 crc kubenswrapper[5004]: I1201 08:41:52.480508 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39fa01cf-5ba6-4a0a-8240-2b2eba0decf6-combined-ca-bundle\") pod \"39fa01cf-5ba6-4a0a-8240-2b2eba0decf6\" (UID: \"39fa01cf-5ba6-4a0a-8240-2b2eba0decf6\") " Dec 01 08:41:52 crc kubenswrapper[5004]: I1201 08:41:52.480607 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/39fa01cf-5ba6-4a0a-8240-2b2eba0decf6-log-httpd\") pod \"39fa01cf-5ba6-4a0a-8240-2b2eba0decf6\" (UID: \"39fa01cf-5ba6-4a0a-8240-2b2eba0decf6\") " Dec 01 08:41:52 crc kubenswrapper[5004]: I1201 08:41:52.480676 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/39fa01cf-5ba6-4a0a-8240-2b2eba0decf6-sg-core-conf-yaml\") pod \"39fa01cf-5ba6-4a0a-8240-2b2eba0decf6\" (UID: \"39fa01cf-5ba6-4a0a-8240-2b2eba0decf6\") " Dec 01 08:41:52 crc kubenswrapper[5004]: I1201 08:41:52.480699 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/39fa01cf-5ba6-4a0a-8240-2b2eba0decf6-scripts\") pod \"39fa01cf-5ba6-4a0a-8240-2b2eba0decf6\" (UID: \"39fa01cf-5ba6-4a0a-8240-2b2eba0decf6\") " Dec 01 08:41:52 crc kubenswrapper[5004]: I1201 08:41:52.480721 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39fa01cf-5ba6-4a0a-8240-2b2eba0decf6-config-data\") pod \"39fa01cf-5ba6-4a0a-8240-2b2eba0decf6\" (UID: \"39fa01cf-5ba6-4a0a-8240-2b2eba0decf6\") " Dec 01 08:41:52 crc kubenswrapper[5004]: I1201 08:41:52.480807 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/39fa01cf-5ba6-4a0a-8240-2b2eba0decf6-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "39fa01cf-5ba6-4a0a-8240-2b2eba0decf6" (UID: "39fa01cf-5ba6-4a0a-8240-2b2eba0decf6"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 08:41:52 crc kubenswrapper[5004]: I1201 08:41:52.481077 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/39fa01cf-5ba6-4a0a-8240-2b2eba0decf6-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "39fa01cf-5ba6-4a0a-8240-2b2eba0decf6" (UID: "39fa01cf-5ba6-4a0a-8240-2b2eba0decf6"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 08:41:52 crc kubenswrapper[5004]: I1201 08:41:52.481261 5004 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/39fa01cf-5ba6-4a0a-8240-2b2eba0decf6-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 01 08:41:52 crc kubenswrapper[5004]: I1201 08:41:52.481279 5004 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/39fa01cf-5ba6-4a0a-8240-2b2eba0decf6-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 01 08:41:52 crc kubenswrapper[5004]: I1201 08:41:52.484872 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39fa01cf-5ba6-4a0a-8240-2b2eba0decf6-scripts" (OuterVolumeSpecName: "scripts") pod "39fa01cf-5ba6-4a0a-8240-2b2eba0decf6" (UID: "39fa01cf-5ba6-4a0a-8240-2b2eba0decf6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:41:52 crc kubenswrapper[5004]: I1201 08:41:52.485019 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39fa01cf-5ba6-4a0a-8240-2b2eba0decf6-kube-api-access-2cbpw" (OuterVolumeSpecName: "kube-api-access-2cbpw") pod "39fa01cf-5ba6-4a0a-8240-2b2eba0decf6" (UID: "39fa01cf-5ba6-4a0a-8240-2b2eba0decf6"). InnerVolumeSpecName "kube-api-access-2cbpw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:41:52 crc kubenswrapper[5004]: I1201 08:41:52.515795 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39fa01cf-5ba6-4a0a-8240-2b2eba0decf6-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "39fa01cf-5ba6-4a0a-8240-2b2eba0decf6" (UID: "39fa01cf-5ba6-4a0a-8240-2b2eba0decf6"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:41:52 crc kubenswrapper[5004]: I1201 08:41:52.588902 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2cbpw\" (UniqueName: \"kubernetes.io/projected/39fa01cf-5ba6-4a0a-8240-2b2eba0decf6-kube-api-access-2cbpw\") on node \"crc\" DevicePath \"\"" Dec 01 08:41:52 crc kubenswrapper[5004]: I1201 08:41:52.588943 5004 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/39fa01cf-5ba6-4a0a-8240-2b2eba0decf6-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 01 08:41:52 crc kubenswrapper[5004]: I1201 08:41:52.588953 5004 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/39fa01cf-5ba6-4a0a-8240-2b2eba0decf6-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 08:41:52 crc kubenswrapper[5004]: I1201 08:41:52.612613 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39fa01cf-5ba6-4a0a-8240-2b2eba0decf6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "39fa01cf-5ba6-4a0a-8240-2b2eba0decf6" (UID: "39fa01cf-5ba6-4a0a-8240-2b2eba0decf6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:41:52 crc kubenswrapper[5004]: I1201 08:41:52.621664 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39fa01cf-5ba6-4a0a-8240-2b2eba0decf6-config-data" (OuterVolumeSpecName: "config-data") pod "39fa01cf-5ba6-4a0a-8240-2b2eba0decf6" (UID: "39fa01cf-5ba6-4a0a-8240-2b2eba0decf6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:41:52 crc kubenswrapper[5004]: I1201 08:41:52.690748 5004 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39fa01cf-5ba6-4a0a-8240-2b2eba0decf6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 08:41:52 crc kubenswrapper[5004]: I1201 08:41:52.690774 5004 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39fa01cf-5ba6-4a0a-8240-2b2eba0decf6-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 08:41:52 crc kubenswrapper[5004]: I1201 08:41:52.746066 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-ae05-account-create-update-fx8ms" Dec 01 08:41:52 crc kubenswrapper[5004]: I1201 08:41:52.794196 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qrlgm\" (UniqueName: \"kubernetes.io/projected/7924d39b-830f-4f91-ae39-6b04c31d3f61-kube-api-access-qrlgm\") pod \"7924d39b-830f-4f91-ae39-6b04c31d3f61\" (UID: \"7924d39b-830f-4f91-ae39-6b04c31d3f61\") " Dec 01 08:41:52 crc kubenswrapper[5004]: I1201 08:41:52.794226 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7924d39b-830f-4f91-ae39-6b04c31d3f61-operator-scripts\") pod \"7924d39b-830f-4f91-ae39-6b04c31d3f61\" (UID: \"7924d39b-830f-4f91-ae39-6b04c31d3f61\") " Dec 01 08:41:52 crc kubenswrapper[5004]: I1201 08:41:52.797254 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7924d39b-830f-4f91-ae39-6b04c31d3f61-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7924d39b-830f-4f91-ae39-6b04c31d3f61" (UID: "7924d39b-830f-4f91-ae39-6b04c31d3f61"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:41:52 crc kubenswrapper[5004]: I1201 08:41:52.803639 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7924d39b-830f-4f91-ae39-6b04c31d3f61-kube-api-access-qrlgm" (OuterVolumeSpecName: "kube-api-access-qrlgm") pod "7924d39b-830f-4f91-ae39-6b04c31d3f61" (UID: "7924d39b-830f-4f91-ae39-6b04c31d3f61"). InnerVolumeSpecName "kube-api-access-qrlgm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:41:52 crc kubenswrapper[5004]: I1201 08:41:52.898210 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qrlgm\" (UniqueName: \"kubernetes.io/projected/7924d39b-830f-4f91-ae39-6b04c31d3f61-kube-api-access-qrlgm\") on node \"crc\" DevicePath \"\"" Dec 01 08:41:52 crc kubenswrapper[5004]: I1201 08:41:52.898244 5004 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7924d39b-830f-4f91-ae39-6b04c31d3f61-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 08:41:52 crc kubenswrapper[5004]: I1201 08:41:52.975166 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-976b-account-create-update-8qblk" Dec 01 08:41:52 crc kubenswrapper[5004]: I1201 08:41:52.981356 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-12e3-account-create-update-2kzjl" Dec 01 08:41:53 crc kubenswrapper[5004]: I1201 08:41:53.000241 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5135af8b-1b58-4816-ace1-424059c5267a-operator-scripts\") pod \"5135af8b-1b58-4816-ace1-424059c5267a\" (UID: \"5135af8b-1b58-4816-ace1-424059c5267a\") " Dec 01 08:41:53 crc kubenswrapper[5004]: I1201 08:41:53.000300 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4f01fa36-7d6e-48b6-a764-8aa381b5cf7a-operator-scripts\") pod \"4f01fa36-7d6e-48b6-a764-8aa381b5cf7a\" (UID: \"4f01fa36-7d6e-48b6-a764-8aa381b5cf7a\") " Dec 01 08:41:53 crc kubenswrapper[5004]: I1201 08:41:53.000409 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c5dbs\" (UniqueName: \"kubernetes.io/projected/4f01fa36-7d6e-48b6-a764-8aa381b5cf7a-kube-api-access-c5dbs\") pod \"4f01fa36-7d6e-48b6-a764-8aa381b5cf7a\" (UID: \"4f01fa36-7d6e-48b6-a764-8aa381b5cf7a\") " Dec 01 08:41:53 crc kubenswrapper[5004]: I1201 08:41:53.000488 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c7c4c\" (UniqueName: \"kubernetes.io/projected/5135af8b-1b58-4816-ace1-424059c5267a-kube-api-access-c7c4c\") pod \"5135af8b-1b58-4816-ace1-424059c5267a\" (UID: \"5135af8b-1b58-4816-ace1-424059c5267a\") " Dec 01 08:41:53 crc kubenswrapper[5004]: I1201 08:41:53.000756 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5135af8b-1b58-4816-ace1-424059c5267a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5135af8b-1b58-4816-ace1-424059c5267a" (UID: "5135af8b-1b58-4816-ace1-424059c5267a"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:41:53 crc kubenswrapper[5004]: I1201 08:41:53.001039 5004 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5135af8b-1b58-4816-ace1-424059c5267a-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 08:41:53 crc kubenswrapper[5004]: I1201 08:41:53.006024 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5135af8b-1b58-4816-ace1-424059c5267a-kube-api-access-c7c4c" (OuterVolumeSpecName: "kube-api-access-c7c4c") pod "5135af8b-1b58-4816-ace1-424059c5267a" (UID: "5135af8b-1b58-4816-ace1-424059c5267a"). InnerVolumeSpecName "kube-api-access-c7c4c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:41:53 crc kubenswrapper[5004]: I1201 08:41:53.014722 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f01fa36-7d6e-48b6-a764-8aa381b5cf7a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4f01fa36-7d6e-48b6-a764-8aa381b5cf7a" (UID: "4f01fa36-7d6e-48b6-a764-8aa381b5cf7a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:41:53 crc kubenswrapper[5004]: I1201 08:41:53.025287 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f01fa36-7d6e-48b6-a764-8aa381b5cf7a-kube-api-access-c5dbs" (OuterVolumeSpecName: "kube-api-access-c5dbs") pod "4f01fa36-7d6e-48b6-a764-8aa381b5cf7a" (UID: "4f01fa36-7d6e-48b6-a764-8aa381b5cf7a"). InnerVolumeSpecName "kube-api-access-c5dbs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:41:53 crc kubenswrapper[5004]: I1201 08:41:53.103875 5004 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4f01fa36-7d6e-48b6-a764-8aa381b5cf7a-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 08:41:53 crc kubenswrapper[5004]: I1201 08:41:53.104067 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c5dbs\" (UniqueName: \"kubernetes.io/projected/4f01fa36-7d6e-48b6-a764-8aa381b5cf7a-kube-api-access-c5dbs\") on node \"crc\" DevicePath \"\"" Dec 01 08:41:53 crc kubenswrapper[5004]: I1201 08:41:53.104092 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c7c4c\" (UniqueName: \"kubernetes.io/projected/5135af8b-1b58-4816-ace1-424059c5267a-kube-api-access-c7c4c\") on node \"crc\" DevicePath \"\"" Dec 01 08:41:53 crc kubenswrapper[5004]: I1201 08:41:53.322493 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-976b-account-create-update-8qblk" event={"ID":"5135af8b-1b58-4816-ace1-424059c5267a","Type":"ContainerDied","Data":"622c5b3b58609975928ff9a838bfc925cc2e3c3b40d1f4ac21ae7910510dadf0"} Dec 01 08:41:53 crc kubenswrapper[5004]: I1201 08:41:53.322888 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="622c5b3b58609975928ff9a838bfc925cc2e3c3b40d1f4ac21ae7910510dadf0" Dec 01 08:41:53 crc kubenswrapper[5004]: I1201 08:41:53.322617 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-976b-account-create-update-8qblk" Dec 01 08:41:53 crc kubenswrapper[5004]: I1201 08:41:53.325886 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"39fa01cf-5ba6-4a0a-8240-2b2eba0decf6","Type":"ContainerDied","Data":"fea9b49602e197a763761422d6a8c36ad1af8c0dea50d4f727f197ea88646a2e"} Dec 01 08:41:53 crc kubenswrapper[5004]: I1201 08:41:53.325953 5004 scope.go:117] "RemoveContainer" containerID="837002c0fb6c00f56e148db9ac7b49b19a3039824aa05dbd99d9de35d418e360" Dec 01 08:41:53 crc kubenswrapper[5004]: I1201 08:41:53.326134 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 08:41:53 crc kubenswrapper[5004]: I1201 08:41:53.328993 5004 generic.go:334] "Generic (PLEG): container finished" podID="64235e39-5760-4a2e-a164-7cb27ca906a3" containerID="a4de37a2a551c10a01fbe2aa77cf24579a8bbc8756ad332d58f177ae88a8f0c4" exitCode=143 Dec 01 08:41:53 crc kubenswrapper[5004]: I1201 08:41:53.329030 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"64235e39-5760-4a2e-a164-7cb27ca906a3","Type":"ContainerDied","Data":"a4de37a2a551c10a01fbe2aa77cf24579a8bbc8756ad332d58f177ae88a8f0c4"} Dec 01 08:41:53 crc kubenswrapper[5004]: I1201 08:41:53.332196 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-ae05-account-create-update-fx8ms" Dec 01 08:41:53 crc kubenswrapper[5004]: I1201 08:41:53.334599 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-ae05-account-create-update-fx8ms" event={"ID":"7924d39b-830f-4f91-ae39-6b04c31d3f61","Type":"ContainerDied","Data":"c2026c5565741b6a599164a8219ccae4b981fcce2de6a6c3642fe2f74da2695c"} Dec 01 08:41:53 crc kubenswrapper[5004]: I1201 08:41:53.334644 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c2026c5565741b6a599164a8219ccae4b981fcce2de6a6c3642fe2f74da2695c" Dec 01 08:41:53 crc kubenswrapper[5004]: I1201 08:41:53.342840 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-12e3-account-create-update-2kzjl" event={"ID":"4f01fa36-7d6e-48b6-a764-8aa381b5cf7a","Type":"ContainerDied","Data":"05ac3201c9bfbaae50ff8f98bdc363f2b351ef44f12bed5d618dcfb103c87b80"} Dec 01 08:41:53 crc kubenswrapper[5004]: I1201 08:41:53.342875 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="05ac3201c9bfbaae50ff8f98bdc363f2b351ef44f12bed5d618dcfb103c87b80" Dec 01 08:41:53 crc kubenswrapper[5004]: I1201 08:41:53.342948 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-12e3-account-create-update-2kzjl" Dec 01 08:41:53 crc kubenswrapper[5004]: I1201 08:41:53.414804 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 08:41:53 crc kubenswrapper[5004]: I1201 08:41:53.430998 5004 scope.go:117] "RemoveContainer" containerID="87be489943345253cd96375601b91a7ebcde30a3d80175c025c2d064b4578e0a" Dec 01 08:41:53 crc kubenswrapper[5004]: I1201 08:41:53.442823 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 01 08:41:53 crc kubenswrapper[5004]: I1201 08:41:53.452670 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 01 08:41:53 crc kubenswrapper[5004]: E1201 08:41:53.453094 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f01fa36-7d6e-48b6-a764-8aa381b5cf7a" containerName="mariadb-account-create-update" Dec 01 08:41:53 crc kubenswrapper[5004]: I1201 08:41:53.453113 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f01fa36-7d6e-48b6-a764-8aa381b5cf7a" containerName="mariadb-account-create-update" Dec 01 08:41:53 crc kubenswrapper[5004]: E1201 08:41:53.453129 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1b1e393-9cd1-4f89-8b3e-b02ee0d397bd" containerName="mariadb-database-create" Dec 01 08:41:53 crc kubenswrapper[5004]: I1201 08:41:53.453135 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1b1e393-9cd1-4f89-8b3e-b02ee0d397bd" containerName="mariadb-database-create" Dec 01 08:41:53 crc kubenswrapper[5004]: E1201 08:41:53.453149 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39fa01cf-5ba6-4a0a-8240-2b2eba0decf6" containerName="proxy-httpd" Dec 01 08:41:53 crc kubenswrapper[5004]: I1201 08:41:53.453155 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="39fa01cf-5ba6-4a0a-8240-2b2eba0decf6" containerName="proxy-httpd" Dec 01 08:41:53 crc kubenswrapper[5004]: E1201 08:41:53.453178 5004 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eede402f-9c2a-4dcf-9de9-33df959b5cd8" containerName="mariadb-database-create" Dec 01 08:41:53 crc kubenswrapper[5004]: I1201 08:41:53.453186 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="eede402f-9c2a-4dcf-9de9-33df959b5cd8" containerName="mariadb-database-create" Dec 01 08:41:53 crc kubenswrapper[5004]: E1201 08:41:53.453198 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39fa01cf-5ba6-4a0a-8240-2b2eba0decf6" containerName="sg-core" Dec 01 08:41:53 crc kubenswrapper[5004]: I1201 08:41:53.453204 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="39fa01cf-5ba6-4a0a-8240-2b2eba0decf6" containerName="sg-core" Dec 01 08:41:53 crc kubenswrapper[5004]: E1201 08:41:53.453216 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39fa01cf-5ba6-4a0a-8240-2b2eba0decf6" containerName="ceilometer-notification-agent" Dec 01 08:41:53 crc kubenswrapper[5004]: I1201 08:41:53.453221 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="39fa01cf-5ba6-4a0a-8240-2b2eba0decf6" containerName="ceilometer-notification-agent" Dec 01 08:41:53 crc kubenswrapper[5004]: E1201 08:41:53.453237 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e56ced1f-e623-4c1a-8da6-944c91827cac" containerName="init" Dec 01 08:41:53 crc kubenswrapper[5004]: I1201 08:41:53.453243 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="e56ced1f-e623-4c1a-8da6-944c91827cac" containerName="init" Dec 01 08:41:53 crc kubenswrapper[5004]: E1201 08:41:53.453253 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e56ced1f-e623-4c1a-8da6-944c91827cac" containerName="dnsmasq-dns" Dec 01 08:41:53 crc kubenswrapper[5004]: I1201 08:41:53.453259 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="e56ced1f-e623-4c1a-8da6-944c91827cac" containerName="dnsmasq-dns" Dec 01 08:41:53 crc kubenswrapper[5004]: E1201 08:41:53.453269 5004 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39fa01cf-5ba6-4a0a-8240-2b2eba0decf6" containerName="ceilometer-central-agent" Dec 01 08:41:53 crc kubenswrapper[5004]: I1201 08:41:53.453276 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="39fa01cf-5ba6-4a0a-8240-2b2eba0decf6" containerName="ceilometer-central-agent" Dec 01 08:41:53 crc kubenswrapper[5004]: E1201 08:41:53.453288 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7924d39b-830f-4f91-ae39-6b04c31d3f61" containerName="mariadb-account-create-update" Dec 01 08:41:53 crc kubenswrapper[5004]: I1201 08:41:53.453293 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="7924d39b-830f-4f91-ae39-6b04c31d3f61" containerName="mariadb-account-create-update" Dec 01 08:41:53 crc kubenswrapper[5004]: E1201 08:41:53.453304 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b50edddf-3daf-4bee-83da-8c44123a382f" containerName="mariadb-database-create" Dec 01 08:41:53 crc kubenswrapper[5004]: I1201 08:41:53.453310 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="b50edddf-3daf-4bee-83da-8c44123a382f" containerName="mariadb-database-create" Dec 01 08:41:53 crc kubenswrapper[5004]: E1201 08:41:53.453322 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5135af8b-1b58-4816-ace1-424059c5267a" containerName="mariadb-account-create-update" Dec 01 08:41:53 crc kubenswrapper[5004]: I1201 08:41:53.453327 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="5135af8b-1b58-4816-ace1-424059c5267a" containerName="mariadb-account-create-update" Dec 01 08:41:53 crc kubenswrapper[5004]: I1201 08:41:53.453573 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="5135af8b-1b58-4816-ace1-424059c5267a" containerName="mariadb-account-create-update" Dec 01 08:41:53 crc kubenswrapper[5004]: I1201 08:41:53.453585 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f01fa36-7d6e-48b6-a764-8aa381b5cf7a" 
containerName="mariadb-account-create-update" Dec 01 08:41:53 crc kubenswrapper[5004]: I1201 08:41:53.453598 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="7924d39b-830f-4f91-ae39-6b04c31d3f61" containerName="mariadb-account-create-update" Dec 01 08:41:53 crc kubenswrapper[5004]: I1201 08:41:53.453608 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="39fa01cf-5ba6-4a0a-8240-2b2eba0decf6" containerName="ceilometer-notification-agent" Dec 01 08:41:53 crc kubenswrapper[5004]: I1201 08:41:53.453619 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1b1e393-9cd1-4f89-8b3e-b02ee0d397bd" containerName="mariadb-database-create" Dec 01 08:41:53 crc kubenswrapper[5004]: I1201 08:41:53.453629 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="b50edddf-3daf-4bee-83da-8c44123a382f" containerName="mariadb-database-create" Dec 01 08:41:53 crc kubenswrapper[5004]: I1201 08:41:53.453637 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="eede402f-9c2a-4dcf-9de9-33df959b5cd8" containerName="mariadb-database-create" Dec 01 08:41:53 crc kubenswrapper[5004]: I1201 08:41:53.453646 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="39fa01cf-5ba6-4a0a-8240-2b2eba0decf6" containerName="ceilometer-central-agent" Dec 01 08:41:53 crc kubenswrapper[5004]: I1201 08:41:53.453661 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="39fa01cf-5ba6-4a0a-8240-2b2eba0decf6" containerName="sg-core" Dec 01 08:41:53 crc kubenswrapper[5004]: I1201 08:41:53.453670 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="39fa01cf-5ba6-4a0a-8240-2b2eba0decf6" containerName="proxy-httpd" Dec 01 08:41:53 crc kubenswrapper[5004]: I1201 08:41:53.453682 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="e56ced1f-e623-4c1a-8da6-944c91827cac" containerName="dnsmasq-dns" Dec 01 08:41:53 crc kubenswrapper[5004]: I1201 08:41:53.456799 5004 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 08:41:53 crc kubenswrapper[5004]: I1201 08:41:53.460074 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 01 08:41:53 crc kubenswrapper[5004]: I1201 08:41:53.460282 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 01 08:41:53 crc kubenswrapper[5004]: I1201 08:41:53.466959 5004 scope.go:117] "RemoveContainer" containerID="0d3638419e618bdc1512483089151d9c94e2ca2076f244ebbfb1eef43d956740" Dec 01 08:41:53 crc kubenswrapper[5004]: I1201 08:41:53.479732 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 08:41:53 crc kubenswrapper[5004]: I1201 08:41:53.507418 5004 scope.go:117] "RemoveContainer" containerID="f85fe523c36185516b7a492fd599b5d479b15257da6a82e8a48292efd8203240" Dec 01 08:41:53 crc kubenswrapper[5004]: I1201 08:41:53.514079 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b4d72bbc-0042-4886-9c2a-4b6e6267be65-run-httpd\") pod \"ceilometer-0\" (UID: \"b4d72bbc-0042-4886-9c2a-4b6e6267be65\") " pod="openstack/ceilometer-0" Dec 01 08:41:53 crc kubenswrapper[5004]: I1201 08:41:53.514130 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wsjrt\" (UniqueName: \"kubernetes.io/projected/b4d72bbc-0042-4886-9c2a-4b6e6267be65-kube-api-access-wsjrt\") pod \"ceilometer-0\" (UID: \"b4d72bbc-0042-4886-9c2a-4b6e6267be65\") " pod="openstack/ceilometer-0" Dec 01 08:41:53 crc kubenswrapper[5004]: I1201 08:41:53.514164 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b4d72bbc-0042-4886-9c2a-4b6e6267be65-scripts\") pod \"ceilometer-0\" (UID: 
\"b4d72bbc-0042-4886-9c2a-4b6e6267be65\") " pod="openstack/ceilometer-0" Dec 01 08:41:53 crc kubenswrapper[5004]: I1201 08:41:53.514180 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b4d72bbc-0042-4886-9c2a-4b6e6267be65-log-httpd\") pod \"ceilometer-0\" (UID: \"b4d72bbc-0042-4886-9c2a-4b6e6267be65\") " pod="openstack/ceilometer-0" Dec 01 08:41:53 crc kubenswrapper[5004]: I1201 08:41:53.514236 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4d72bbc-0042-4886-9c2a-4b6e6267be65-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b4d72bbc-0042-4886-9c2a-4b6e6267be65\") " pod="openstack/ceilometer-0" Dec 01 08:41:53 crc kubenswrapper[5004]: I1201 08:41:53.514296 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4d72bbc-0042-4886-9c2a-4b6e6267be65-config-data\") pod \"ceilometer-0\" (UID: \"b4d72bbc-0042-4886-9c2a-4b6e6267be65\") " pod="openstack/ceilometer-0" Dec 01 08:41:53 crc kubenswrapper[5004]: I1201 08:41:53.514349 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b4d72bbc-0042-4886-9c2a-4b6e6267be65-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b4d72bbc-0042-4886-9c2a-4b6e6267be65\") " pod="openstack/ceilometer-0" Dec 01 08:41:53 crc kubenswrapper[5004]: I1201 08:41:53.617504 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b4d72bbc-0042-4886-9c2a-4b6e6267be65-run-httpd\") pod \"ceilometer-0\" (UID: \"b4d72bbc-0042-4886-9c2a-4b6e6267be65\") " pod="openstack/ceilometer-0" Dec 01 08:41:53 crc kubenswrapper[5004]: I1201 08:41:53.617588 5004 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wsjrt\" (UniqueName: \"kubernetes.io/projected/b4d72bbc-0042-4886-9c2a-4b6e6267be65-kube-api-access-wsjrt\") pod \"ceilometer-0\" (UID: \"b4d72bbc-0042-4886-9c2a-4b6e6267be65\") " pod="openstack/ceilometer-0" Dec 01 08:41:53 crc kubenswrapper[5004]: I1201 08:41:53.617629 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b4d72bbc-0042-4886-9c2a-4b6e6267be65-log-httpd\") pod \"ceilometer-0\" (UID: \"b4d72bbc-0042-4886-9c2a-4b6e6267be65\") " pod="openstack/ceilometer-0" Dec 01 08:41:53 crc kubenswrapper[5004]: I1201 08:41:53.617644 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b4d72bbc-0042-4886-9c2a-4b6e6267be65-scripts\") pod \"ceilometer-0\" (UID: \"b4d72bbc-0042-4886-9c2a-4b6e6267be65\") " pod="openstack/ceilometer-0" Dec 01 08:41:53 crc kubenswrapper[5004]: I1201 08:41:53.617668 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4d72bbc-0042-4886-9c2a-4b6e6267be65-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b4d72bbc-0042-4886-9c2a-4b6e6267be65\") " pod="openstack/ceilometer-0" Dec 01 08:41:53 crc kubenswrapper[5004]: I1201 08:41:53.617749 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4d72bbc-0042-4886-9c2a-4b6e6267be65-config-data\") pod \"ceilometer-0\" (UID: \"b4d72bbc-0042-4886-9c2a-4b6e6267be65\") " pod="openstack/ceilometer-0" Dec 01 08:41:53 crc kubenswrapper[5004]: I1201 08:41:53.618101 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b4d72bbc-0042-4886-9c2a-4b6e6267be65-run-httpd\") pod \"ceilometer-0\" (UID: 
\"b4d72bbc-0042-4886-9c2a-4b6e6267be65\") " pod="openstack/ceilometer-0" Dec 01 08:41:53 crc kubenswrapper[5004]: I1201 08:41:53.618142 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b4d72bbc-0042-4886-9c2a-4b6e6267be65-log-httpd\") pod \"ceilometer-0\" (UID: \"b4d72bbc-0042-4886-9c2a-4b6e6267be65\") " pod="openstack/ceilometer-0" Dec 01 08:41:53 crc kubenswrapper[5004]: I1201 08:41:53.619814 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b4d72bbc-0042-4886-9c2a-4b6e6267be65-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b4d72bbc-0042-4886-9c2a-4b6e6267be65\") " pod="openstack/ceilometer-0" Dec 01 08:41:53 crc kubenswrapper[5004]: I1201 08:41:53.623009 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4d72bbc-0042-4886-9c2a-4b6e6267be65-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b4d72bbc-0042-4886-9c2a-4b6e6267be65\") " pod="openstack/ceilometer-0" Dec 01 08:41:53 crc kubenswrapper[5004]: I1201 08:41:53.623613 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4d72bbc-0042-4886-9c2a-4b6e6267be65-config-data\") pod \"ceilometer-0\" (UID: \"b4d72bbc-0042-4886-9c2a-4b6e6267be65\") " pod="openstack/ceilometer-0" Dec 01 08:41:53 crc kubenswrapper[5004]: I1201 08:41:53.624112 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b4d72bbc-0042-4886-9c2a-4b6e6267be65-scripts\") pod \"ceilometer-0\" (UID: \"b4d72bbc-0042-4886-9c2a-4b6e6267be65\") " pod="openstack/ceilometer-0" Dec 01 08:41:53 crc kubenswrapper[5004]: I1201 08:41:53.625287 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/b4d72bbc-0042-4886-9c2a-4b6e6267be65-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b4d72bbc-0042-4886-9c2a-4b6e6267be65\") " pod="openstack/ceilometer-0" Dec 01 08:41:53 crc kubenswrapper[5004]: I1201 08:41:53.639505 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wsjrt\" (UniqueName: \"kubernetes.io/projected/b4d72bbc-0042-4886-9c2a-4b6e6267be65-kube-api-access-wsjrt\") pod \"ceilometer-0\" (UID: \"b4d72bbc-0042-4886-9c2a-4b6e6267be65\") " pod="openstack/ceilometer-0" Dec 01 08:41:53 crc kubenswrapper[5004]: I1201 08:41:53.776018 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 08:41:53 crc kubenswrapper[5004]: I1201 08:41:53.784354 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 01 08:41:53 crc kubenswrapper[5004]: I1201 08:41:53.824735 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fa6628b9-be2f-4594-8767-5442e1d2f5b9-logs\") pod \"fa6628b9-be2f-4594-8767-5442e1d2f5b9\" (UID: \"fa6628b9-be2f-4594-8767-5442e1d2f5b9\") " Dec 01 08:41:53 crc kubenswrapper[5004]: I1201 08:41:53.824771 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fa6628b9-be2f-4594-8767-5442e1d2f5b9-httpd-run\") pod \"fa6628b9-be2f-4594-8767-5442e1d2f5b9\" (UID: \"fa6628b9-be2f-4594-8767-5442e1d2f5b9\") " Dec 01 08:41:53 crc kubenswrapper[5004]: I1201 08:41:53.824845 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fa6628b9-be2f-4594-8767-5442e1d2f5b9-scripts\") pod \"fa6628b9-be2f-4594-8767-5442e1d2f5b9\" (UID: \"fa6628b9-be2f-4594-8767-5442e1d2f5b9\") " Dec 01 08:41:53 crc kubenswrapper[5004]: I1201 08:41:53.824993 5004 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nv98t\" (UniqueName: \"kubernetes.io/projected/fa6628b9-be2f-4594-8767-5442e1d2f5b9-kube-api-access-nv98t\") pod \"fa6628b9-be2f-4594-8767-5442e1d2f5b9\" (UID: \"fa6628b9-be2f-4594-8767-5442e1d2f5b9\") " Dec 01 08:41:53 crc kubenswrapper[5004]: I1201 08:41:53.825019 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa6628b9-be2f-4594-8767-5442e1d2f5b9-combined-ca-bundle\") pod \"fa6628b9-be2f-4594-8767-5442e1d2f5b9\" (UID: \"fa6628b9-be2f-4594-8767-5442e1d2f5b9\") " Dec 01 08:41:53 crc kubenswrapper[5004]: I1201 08:41:53.825036 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"fa6628b9-be2f-4594-8767-5442e1d2f5b9\" (UID: \"fa6628b9-be2f-4594-8767-5442e1d2f5b9\") " Dec 01 08:41:53 crc kubenswrapper[5004]: I1201 08:41:53.825057 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa6628b9-be2f-4594-8767-5442e1d2f5b9-config-data\") pod \"fa6628b9-be2f-4594-8767-5442e1d2f5b9\" (UID: \"fa6628b9-be2f-4594-8767-5442e1d2f5b9\") " Dec 01 08:41:53 crc kubenswrapper[5004]: I1201 08:41:53.825129 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa6628b9-be2f-4594-8767-5442e1d2f5b9-public-tls-certs\") pod \"fa6628b9-be2f-4594-8767-5442e1d2f5b9\" (UID: \"fa6628b9-be2f-4594-8767-5442e1d2f5b9\") " Dec 01 08:41:53 crc kubenswrapper[5004]: I1201 08:41:53.825787 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa6628b9-be2f-4594-8767-5442e1d2f5b9-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "fa6628b9-be2f-4594-8767-5442e1d2f5b9" (UID: 
"fa6628b9-be2f-4594-8767-5442e1d2f5b9"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 08:41:53 crc kubenswrapper[5004]: I1201 08:41:53.831647 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa6628b9-be2f-4594-8767-5442e1d2f5b9-scripts" (OuterVolumeSpecName: "scripts") pod "fa6628b9-be2f-4594-8767-5442e1d2f5b9" (UID: "fa6628b9-be2f-4594-8767-5442e1d2f5b9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:41:53 crc kubenswrapper[5004]: I1201 08:41:53.837520 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa6628b9-be2f-4594-8767-5442e1d2f5b9-logs" (OuterVolumeSpecName: "logs") pod "fa6628b9-be2f-4594-8767-5442e1d2f5b9" (UID: "fa6628b9-be2f-4594-8767-5442e1d2f5b9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 08:41:53 crc kubenswrapper[5004]: I1201 08:41:53.837699 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa6628b9-be2f-4594-8767-5442e1d2f5b9-kube-api-access-nv98t" (OuterVolumeSpecName: "kube-api-access-nv98t") pod "fa6628b9-be2f-4594-8767-5442e1d2f5b9" (UID: "fa6628b9-be2f-4594-8767-5442e1d2f5b9"). InnerVolumeSpecName "kube-api-access-nv98t". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:41:53 crc kubenswrapper[5004]: I1201 08:41:53.839645 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "glance") pod "fa6628b9-be2f-4594-8767-5442e1d2f5b9" (UID: "fa6628b9-be2f-4594-8767-5442e1d2f5b9"). InnerVolumeSpecName "local-storage12-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 01 08:41:53 crc kubenswrapper[5004]: I1201 08:41:53.877024 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa6628b9-be2f-4594-8767-5442e1d2f5b9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fa6628b9-be2f-4594-8767-5442e1d2f5b9" (UID: "fa6628b9-be2f-4594-8767-5442e1d2f5b9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:41:53 crc kubenswrapper[5004]: I1201 08:41:53.909553 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa6628b9-be2f-4594-8767-5442e1d2f5b9-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "fa6628b9-be2f-4594-8767-5442e1d2f5b9" (UID: "fa6628b9-be2f-4594-8767-5442e1d2f5b9"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:41:53 crc kubenswrapper[5004]: I1201 08:41:53.930089 5004 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa6628b9-be2f-4594-8767-5442e1d2f5b9-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 01 08:41:53 crc kubenswrapper[5004]: I1201 08:41:53.930119 5004 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fa6628b9-be2f-4594-8767-5442e1d2f5b9-logs\") on node \"crc\" DevicePath \"\"" Dec 01 08:41:53 crc kubenswrapper[5004]: I1201 08:41:53.930128 5004 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fa6628b9-be2f-4594-8767-5442e1d2f5b9-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 01 08:41:53 crc kubenswrapper[5004]: I1201 08:41:53.930136 5004 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fa6628b9-be2f-4594-8767-5442e1d2f5b9-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 08:41:53 crc 
kubenswrapper[5004]: I1201 08:41:53.930147 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nv98t\" (UniqueName: \"kubernetes.io/projected/fa6628b9-be2f-4594-8767-5442e1d2f5b9-kube-api-access-nv98t\") on node \"crc\" DevicePath \"\"" Dec 01 08:41:53 crc kubenswrapper[5004]: I1201 08:41:53.930159 5004 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa6628b9-be2f-4594-8767-5442e1d2f5b9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 08:41:53 crc kubenswrapper[5004]: I1201 08:41:53.930181 5004 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Dec 01 08:41:53 crc kubenswrapper[5004]: I1201 08:41:53.985722 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa6628b9-be2f-4594-8767-5442e1d2f5b9-config-data" (OuterVolumeSpecName: "config-data") pod "fa6628b9-be2f-4594-8767-5442e1d2f5b9" (UID: "fa6628b9-be2f-4594-8767-5442e1d2f5b9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:41:53 crc kubenswrapper[5004]: I1201 08:41:53.989675 5004 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Dec 01 08:41:54 crc kubenswrapper[5004]: I1201 08:41:54.035019 5004 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Dec 01 08:41:54 crc kubenswrapper[5004]: I1201 08:41:54.035047 5004 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa6628b9-be2f-4594-8767-5442e1d2f5b9-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 08:41:54 crc kubenswrapper[5004]: W1201 08:41:54.336220 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb4d72bbc_0042_4886_9c2a_4b6e6267be65.slice/crio-8fbd33e5499b2520aa6162d0228fe9a702544d08bf07ec0ccc3a6b42db44f184 WatchSource:0}: Error finding container 8fbd33e5499b2520aa6162d0228fe9a702544d08bf07ec0ccc3a6b42db44f184: Status 404 returned error can't find the container with id 8fbd33e5499b2520aa6162d0228fe9a702544d08bf07ec0ccc3a6b42db44f184 Dec 01 08:41:54 crc kubenswrapper[5004]: I1201 08:41:54.340170 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 08:41:54 crc kubenswrapper[5004]: I1201 08:41:54.356458 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b4d72bbc-0042-4886-9c2a-4b6e6267be65","Type":"ContainerStarted","Data":"8fbd33e5499b2520aa6162d0228fe9a702544d08bf07ec0ccc3a6b42db44f184"} Dec 01 08:41:54 crc kubenswrapper[5004]: I1201 08:41:54.358897 5004 generic.go:334] "Generic (PLEG): container finished" podID="fa6628b9-be2f-4594-8767-5442e1d2f5b9" 
containerID="873b1fd641aba5c655ee027bc8a60117804b809c92280ec25c80829f55b1ab65" exitCode=0 Dec 01 08:41:54 crc kubenswrapper[5004]: I1201 08:41:54.358936 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fa6628b9-be2f-4594-8767-5442e1d2f5b9","Type":"ContainerDied","Data":"873b1fd641aba5c655ee027bc8a60117804b809c92280ec25c80829f55b1ab65"} Dec 01 08:41:54 crc kubenswrapper[5004]: I1201 08:41:54.358956 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 01 08:41:54 crc kubenswrapper[5004]: I1201 08:41:54.358989 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fa6628b9-be2f-4594-8767-5442e1d2f5b9","Type":"ContainerDied","Data":"d5514074610abc7259416e8dda14f559675a9845afe697c489e3eb4e44819adc"} Dec 01 08:41:54 crc kubenswrapper[5004]: I1201 08:41:54.359012 5004 scope.go:117] "RemoveContainer" containerID="873b1fd641aba5c655ee027bc8a60117804b809c92280ec25c80829f55b1ab65" Dec 01 08:41:54 crc kubenswrapper[5004]: I1201 08:41:54.400993 5004 scope.go:117] "RemoveContainer" containerID="8e0d24717f6b77a9ff93f781693f1a7b385497e8d77549962f407e2ec0a74009" Dec 01 08:41:54 crc kubenswrapper[5004]: I1201 08:41:54.412172 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 01 08:41:54 crc kubenswrapper[5004]: I1201 08:41:54.429048 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 01 08:41:54 crc kubenswrapper[5004]: I1201 08:41:54.429459 5004 scope.go:117] "RemoveContainer" containerID="873b1fd641aba5c655ee027bc8a60117804b809c92280ec25c80829f55b1ab65" Dec 01 08:41:54 crc kubenswrapper[5004]: E1201 08:41:54.430431 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"873b1fd641aba5c655ee027bc8a60117804b809c92280ec25c80829f55b1ab65\": container with ID starting with 873b1fd641aba5c655ee027bc8a60117804b809c92280ec25c80829f55b1ab65 not found: ID does not exist" containerID="873b1fd641aba5c655ee027bc8a60117804b809c92280ec25c80829f55b1ab65" Dec 01 08:41:54 crc kubenswrapper[5004]: I1201 08:41:54.430460 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"873b1fd641aba5c655ee027bc8a60117804b809c92280ec25c80829f55b1ab65"} err="failed to get container status \"873b1fd641aba5c655ee027bc8a60117804b809c92280ec25c80829f55b1ab65\": rpc error: code = NotFound desc = could not find container \"873b1fd641aba5c655ee027bc8a60117804b809c92280ec25c80829f55b1ab65\": container with ID starting with 873b1fd641aba5c655ee027bc8a60117804b809c92280ec25c80829f55b1ab65 not found: ID does not exist" Dec 01 08:41:54 crc kubenswrapper[5004]: I1201 08:41:54.430481 5004 scope.go:117] "RemoveContainer" containerID="8e0d24717f6b77a9ff93f781693f1a7b385497e8d77549962f407e2ec0a74009" Dec 01 08:41:54 crc kubenswrapper[5004]: E1201 08:41:54.430966 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e0d24717f6b77a9ff93f781693f1a7b385497e8d77549962f407e2ec0a74009\": container with ID starting with 8e0d24717f6b77a9ff93f781693f1a7b385497e8d77549962f407e2ec0a74009 not found: ID does not exist" containerID="8e0d24717f6b77a9ff93f781693f1a7b385497e8d77549962f407e2ec0a74009" Dec 01 08:41:54 crc kubenswrapper[5004]: I1201 08:41:54.430990 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e0d24717f6b77a9ff93f781693f1a7b385497e8d77549962f407e2ec0a74009"} err="failed to get container status \"8e0d24717f6b77a9ff93f781693f1a7b385497e8d77549962f407e2ec0a74009\": rpc error: code = NotFound desc = could not find container \"8e0d24717f6b77a9ff93f781693f1a7b385497e8d77549962f407e2ec0a74009\": container with ID 
starting with 8e0d24717f6b77a9ff93f781693f1a7b385497e8d77549962f407e2ec0a74009 not found: ID does not exist" Dec 01 08:41:54 crc kubenswrapper[5004]: I1201 08:41:54.444424 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 01 08:41:54 crc kubenswrapper[5004]: E1201 08:41:54.444981 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa6628b9-be2f-4594-8767-5442e1d2f5b9" containerName="glance-httpd" Dec 01 08:41:54 crc kubenswrapper[5004]: I1201 08:41:54.444999 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa6628b9-be2f-4594-8767-5442e1d2f5b9" containerName="glance-httpd" Dec 01 08:41:54 crc kubenswrapper[5004]: E1201 08:41:54.445007 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa6628b9-be2f-4594-8767-5442e1d2f5b9" containerName="glance-log" Dec 01 08:41:54 crc kubenswrapper[5004]: I1201 08:41:54.445012 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa6628b9-be2f-4594-8767-5442e1d2f5b9" containerName="glance-log" Dec 01 08:41:54 crc kubenswrapper[5004]: I1201 08:41:54.445197 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa6628b9-be2f-4594-8767-5442e1d2f5b9" containerName="glance-httpd" Dec 01 08:41:54 crc kubenswrapper[5004]: I1201 08:41:54.445220 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa6628b9-be2f-4594-8767-5442e1d2f5b9" containerName="glance-log" Dec 01 08:41:54 crc kubenswrapper[5004]: I1201 08:41:54.446494 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 01 08:41:54 crc kubenswrapper[5004]: I1201 08:41:54.448689 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 01 08:41:54 crc kubenswrapper[5004]: I1201 08:41:54.448704 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 01 08:41:54 crc kubenswrapper[5004]: I1201 08:41:54.454027 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 01 08:41:54 crc kubenswrapper[5004]: I1201 08:41:54.543627 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bbd9e9ac-7391-43f2-9f0c-a10d4dd9f81a-scripts\") pod \"glance-default-external-api-0\" (UID: \"bbd9e9ac-7391-43f2-9f0c-a10d4dd9f81a\") " pod="openstack/glance-default-external-api-0" Dec 01 08:41:54 crc kubenswrapper[5004]: I1201 08:41:54.543681 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbd9e9ac-7391-43f2-9f0c-a10d4dd9f81a-config-data\") pod \"glance-default-external-api-0\" (UID: \"bbd9e9ac-7391-43f2-9f0c-a10d4dd9f81a\") " pod="openstack/glance-default-external-api-0" Dec 01 08:41:54 crc kubenswrapper[5004]: I1201 08:41:54.543702 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbd9e9ac-7391-43f2-9f0c-a10d4dd9f81a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"bbd9e9ac-7391-43f2-9f0c-a10d4dd9f81a\") " pod="openstack/glance-default-external-api-0" Dec 01 08:41:54 crc kubenswrapper[5004]: I1201 08:41:54.543905 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/bbd9e9ac-7391-43f2-9f0c-a10d4dd9f81a-logs\") pod \"glance-default-external-api-0\" (UID: \"bbd9e9ac-7391-43f2-9f0c-a10d4dd9f81a\") " pod="openstack/glance-default-external-api-0" Dec 01 08:41:54 crc kubenswrapper[5004]: I1201 08:41:54.544186 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bbd9e9ac-7391-43f2-9f0c-a10d4dd9f81a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"bbd9e9ac-7391-43f2-9f0c-a10d4dd9f81a\") " pod="openstack/glance-default-external-api-0" Dec 01 08:41:54 crc kubenswrapper[5004]: I1201 08:41:54.544236 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bbd9e9ac-7391-43f2-9f0c-a10d4dd9f81a-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"bbd9e9ac-7391-43f2-9f0c-a10d4dd9f81a\") " pod="openstack/glance-default-external-api-0" Dec 01 08:41:54 crc kubenswrapper[5004]: I1201 08:41:54.544404 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58nts\" (UniqueName: \"kubernetes.io/projected/bbd9e9ac-7391-43f2-9f0c-a10d4dd9f81a-kube-api-access-58nts\") pod \"glance-default-external-api-0\" (UID: \"bbd9e9ac-7391-43f2-9f0c-a10d4dd9f81a\") " pod="openstack/glance-default-external-api-0" Dec 01 08:41:54 crc kubenswrapper[5004]: I1201 08:41:54.544540 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"bbd9e9ac-7391-43f2-9f0c-a10d4dd9f81a\") " pod="openstack/glance-default-external-api-0" Dec 01 08:41:54 crc kubenswrapper[5004]: I1201 08:41:54.647203 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/bbd9e9ac-7391-43f2-9f0c-a10d4dd9f81a-logs\") pod \"glance-default-external-api-0\" (UID: \"bbd9e9ac-7391-43f2-9f0c-a10d4dd9f81a\") " pod="openstack/glance-default-external-api-0" Dec 01 08:41:54 crc kubenswrapper[5004]: I1201 08:41:54.647327 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bbd9e9ac-7391-43f2-9f0c-a10d4dd9f81a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"bbd9e9ac-7391-43f2-9f0c-a10d4dd9f81a\") " pod="openstack/glance-default-external-api-0" Dec 01 08:41:54 crc kubenswrapper[5004]: I1201 08:41:54.647359 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bbd9e9ac-7391-43f2-9f0c-a10d4dd9f81a-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"bbd9e9ac-7391-43f2-9f0c-a10d4dd9f81a\") " pod="openstack/glance-default-external-api-0" Dec 01 08:41:54 crc kubenswrapper[5004]: I1201 08:41:54.647424 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58nts\" (UniqueName: \"kubernetes.io/projected/bbd9e9ac-7391-43f2-9f0c-a10d4dd9f81a-kube-api-access-58nts\") pod \"glance-default-external-api-0\" (UID: \"bbd9e9ac-7391-43f2-9f0c-a10d4dd9f81a\") " pod="openstack/glance-default-external-api-0" Dec 01 08:41:54 crc kubenswrapper[5004]: I1201 08:41:54.647491 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"bbd9e9ac-7391-43f2-9f0c-a10d4dd9f81a\") " pod="openstack/glance-default-external-api-0" Dec 01 08:41:54 crc kubenswrapper[5004]: I1201 08:41:54.647548 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bbd9e9ac-7391-43f2-9f0c-a10d4dd9f81a-scripts\") pod 
\"glance-default-external-api-0\" (UID: \"bbd9e9ac-7391-43f2-9f0c-a10d4dd9f81a\") " pod="openstack/glance-default-external-api-0" Dec 01 08:41:54 crc kubenswrapper[5004]: I1201 08:41:54.647595 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbd9e9ac-7391-43f2-9f0c-a10d4dd9f81a-config-data\") pod \"glance-default-external-api-0\" (UID: \"bbd9e9ac-7391-43f2-9f0c-a10d4dd9f81a\") " pod="openstack/glance-default-external-api-0" Dec 01 08:41:54 crc kubenswrapper[5004]: I1201 08:41:54.647629 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbd9e9ac-7391-43f2-9f0c-a10d4dd9f81a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"bbd9e9ac-7391-43f2-9f0c-a10d4dd9f81a\") " pod="openstack/glance-default-external-api-0" Dec 01 08:41:54 crc kubenswrapper[5004]: I1201 08:41:54.647731 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bbd9e9ac-7391-43f2-9f0c-a10d4dd9f81a-logs\") pod \"glance-default-external-api-0\" (UID: \"bbd9e9ac-7391-43f2-9f0c-a10d4dd9f81a\") " pod="openstack/glance-default-external-api-0" Dec 01 08:41:54 crc kubenswrapper[5004]: I1201 08:41:54.647773 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bbd9e9ac-7391-43f2-9f0c-a10d4dd9f81a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"bbd9e9ac-7391-43f2-9f0c-a10d4dd9f81a\") " pod="openstack/glance-default-external-api-0" Dec 01 08:41:54 crc kubenswrapper[5004]: I1201 08:41:54.647813 5004 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"bbd9e9ac-7391-43f2-9f0c-a10d4dd9f81a\") device mount path 
\"/mnt/openstack/pv12\"" pod="openstack/glance-default-external-api-0" Dec 01 08:41:54 crc kubenswrapper[5004]: I1201 08:41:54.653870 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bbd9e9ac-7391-43f2-9f0c-a10d4dd9f81a-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"bbd9e9ac-7391-43f2-9f0c-a10d4dd9f81a\") " pod="openstack/glance-default-external-api-0" Dec 01 08:41:54 crc kubenswrapper[5004]: I1201 08:41:54.654050 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bbd9e9ac-7391-43f2-9f0c-a10d4dd9f81a-scripts\") pod \"glance-default-external-api-0\" (UID: \"bbd9e9ac-7391-43f2-9f0c-a10d4dd9f81a\") " pod="openstack/glance-default-external-api-0" Dec 01 08:41:54 crc kubenswrapper[5004]: I1201 08:41:54.654223 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbd9e9ac-7391-43f2-9f0c-a10d4dd9f81a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"bbd9e9ac-7391-43f2-9f0c-a10d4dd9f81a\") " pod="openstack/glance-default-external-api-0" Dec 01 08:41:54 crc kubenswrapper[5004]: I1201 08:41:54.655388 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbd9e9ac-7391-43f2-9f0c-a10d4dd9f81a-config-data\") pod \"glance-default-external-api-0\" (UID: \"bbd9e9ac-7391-43f2-9f0c-a10d4dd9f81a\") " pod="openstack/glance-default-external-api-0" Dec 01 08:41:54 crc kubenswrapper[5004]: I1201 08:41:54.669284 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58nts\" (UniqueName: \"kubernetes.io/projected/bbd9e9ac-7391-43f2-9f0c-a10d4dd9f81a-kube-api-access-58nts\") pod \"glance-default-external-api-0\" (UID: \"bbd9e9ac-7391-43f2-9f0c-a10d4dd9f81a\") " pod="openstack/glance-default-external-api-0" Dec 01 08:41:54 crc 
kubenswrapper[5004]: I1201 08:41:54.689993 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"bbd9e9ac-7391-43f2-9f0c-a10d4dd9f81a\") " pod="openstack/glance-default-external-api-0" Dec 01 08:41:54 crc kubenswrapper[5004]: I1201 08:41:54.779152 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39fa01cf-5ba6-4a0a-8240-2b2eba0decf6" path="/var/lib/kubelet/pods/39fa01cf-5ba6-4a0a-8240-2b2eba0decf6/volumes" Dec 01 08:41:54 crc kubenswrapper[5004]: I1201 08:41:54.780435 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa6628b9-be2f-4594-8767-5442e1d2f5b9" path="/var/lib/kubelet/pods/fa6628b9-be2f-4594-8767-5442e1d2f5b9/volumes" Dec 01 08:41:54 crc kubenswrapper[5004]: I1201 08:41:54.783134 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 01 08:41:55 crc kubenswrapper[5004]: W1201 08:41:55.329238 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbbd9e9ac_7391_43f2_9f0c_a10d4dd9f81a.slice/crio-6d109f5048c3bd7a33dd52e272484815a6945817f1f7fdc581a4084ff37ac058 WatchSource:0}: Error finding container 6d109f5048c3bd7a33dd52e272484815a6945817f1f7fdc581a4084ff37ac058: Status 404 returned error can't find the container with id 6d109f5048c3bd7a33dd52e272484815a6945817f1f7fdc581a4084ff37ac058 Dec 01 08:41:55 crc kubenswrapper[5004]: I1201 08:41:55.332082 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 01 08:41:55 crc kubenswrapper[5004]: I1201 08:41:55.368433 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"bbd9e9ac-7391-43f2-9f0c-a10d4dd9f81a","Type":"ContainerStarted","Data":"6d109f5048c3bd7a33dd52e272484815a6945817f1f7fdc581a4084ff37ac058"} Dec 01 08:41:55 crc kubenswrapper[5004]: I1201 08:41:55.371199 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b4d72bbc-0042-4886-9c2a-4b6e6267be65","Type":"ContainerStarted","Data":"080966e70d6fb040c67da27f631647c6eeb3d335f9e02faa0f46df4b3174c846"} Dec 01 08:41:56 crc kubenswrapper[5004]: I1201 08:41:56.208016 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 01 08:41:56 crc kubenswrapper[5004]: I1201 08:41:56.289460 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/64235e39-5760-4a2e-a164-7cb27ca906a3-httpd-run\") pod \"64235e39-5760-4a2e-a164-7cb27ca906a3\" (UID: \"64235e39-5760-4a2e-a164-7cb27ca906a3\") " Dec 01 08:41:56 crc kubenswrapper[5004]: I1201 08:41:56.289601 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/64235e39-5760-4a2e-a164-7cb27ca906a3-logs\") pod \"64235e39-5760-4a2e-a164-7cb27ca906a3\" (UID: \"64235e39-5760-4a2e-a164-7cb27ca906a3\") " Dec 01 08:41:56 crc kubenswrapper[5004]: I1201 08:41:56.289631 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/64235e39-5760-4a2e-a164-7cb27ca906a3-scripts\") pod \"64235e39-5760-4a2e-a164-7cb27ca906a3\" (UID: \"64235e39-5760-4a2e-a164-7cb27ca906a3\") " Dec 01 08:41:56 crc kubenswrapper[5004]: I1201 08:41:56.289700 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64235e39-5760-4a2e-a164-7cb27ca906a3-config-data\") pod \"64235e39-5760-4a2e-a164-7cb27ca906a3\" (UID: \"64235e39-5760-4a2e-a164-7cb27ca906a3\") 
" Dec 01 08:41:56 crc kubenswrapper[5004]: I1201 08:41:56.289892 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64235e39-5760-4a2e-a164-7cb27ca906a3-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "64235e39-5760-4a2e-a164-7cb27ca906a3" (UID: "64235e39-5760-4a2e-a164-7cb27ca906a3"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 08:41:56 crc kubenswrapper[5004]: I1201 08:41:56.290107 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64235e39-5760-4a2e-a164-7cb27ca906a3-logs" (OuterVolumeSpecName: "logs") pod "64235e39-5760-4a2e-a164-7cb27ca906a3" (UID: "64235e39-5760-4a2e-a164-7cb27ca906a3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 08:41:56 crc kubenswrapper[5004]: I1201 08:41:56.290407 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-twpdz\" (UniqueName: \"kubernetes.io/projected/64235e39-5760-4a2e-a164-7cb27ca906a3-kube-api-access-twpdz\") pod \"64235e39-5760-4a2e-a164-7cb27ca906a3\" (UID: \"64235e39-5760-4a2e-a164-7cb27ca906a3\") " Dec 01 08:41:56 crc kubenswrapper[5004]: I1201 08:41:56.290536 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"64235e39-5760-4a2e-a164-7cb27ca906a3\" (UID: \"64235e39-5760-4a2e-a164-7cb27ca906a3\") " Dec 01 08:41:56 crc kubenswrapper[5004]: I1201 08:41:56.290634 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/64235e39-5760-4a2e-a164-7cb27ca906a3-internal-tls-certs\") pod \"64235e39-5760-4a2e-a164-7cb27ca906a3\" (UID: \"64235e39-5760-4a2e-a164-7cb27ca906a3\") " Dec 01 08:41:56 crc kubenswrapper[5004]: I1201 08:41:56.290672 5004 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64235e39-5760-4a2e-a164-7cb27ca906a3-combined-ca-bundle\") pod \"64235e39-5760-4a2e-a164-7cb27ca906a3\" (UID: \"64235e39-5760-4a2e-a164-7cb27ca906a3\") " Dec 01 08:41:56 crc kubenswrapper[5004]: I1201 08:41:56.291382 5004 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/64235e39-5760-4a2e-a164-7cb27ca906a3-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 01 08:41:56 crc kubenswrapper[5004]: I1201 08:41:56.291413 5004 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/64235e39-5760-4a2e-a164-7cb27ca906a3-logs\") on node \"crc\" DevicePath \"\"" Dec 01 08:41:56 crc kubenswrapper[5004]: I1201 08:41:56.294143 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "glance") pod "64235e39-5760-4a2e-a164-7cb27ca906a3" (UID: "64235e39-5760-4a2e-a164-7cb27ca906a3"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 01 08:41:56 crc kubenswrapper[5004]: I1201 08:41:56.294652 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64235e39-5760-4a2e-a164-7cb27ca906a3-scripts" (OuterVolumeSpecName: "scripts") pod "64235e39-5760-4a2e-a164-7cb27ca906a3" (UID: "64235e39-5760-4a2e-a164-7cb27ca906a3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:41:56 crc kubenswrapper[5004]: I1201 08:41:56.297351 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64235e39-5760-4a2e-a164-7cb27ca906a3-kube-api-access-twpdz" (OuterVolumeSpecName: "kube-api-access-twpdz") pod "64235e39-5760-4a2e-a164-7cb27ca906a3" (UID: "64235e39-5760-4a2e-a164-7cb27ca906a3"). 
InnerVolumeSpecName "kube-api-access-twpdz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:41:56 crc kubenswrapper[5004]: I1201 08:41:56.332701 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64235e39-5760-4a2e-a164-7cb27ca906a3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "64235e39-5760-4a2e-a164-7cb27ca906a3" (UID: "64235e39-5760-4a2e-a164-7cb27ca906a3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:41:56 crc kubenswrapper[5004]: I1201 08:41:56.375122 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64235e39-5760-4a2e-a164-7cb27ca906a3-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "64235e39-5760-4a2e-a164-7cb27ca906a3" (UID: "64235e39-5760-4a2e-a164-7cb27ca906a3"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:41:56 crc kubenswrapper[5004]: I1201 08:41:56.376445 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64235e39-5760-4a2e-a164-7cb27ca906a3-config-data" (OuterVolumeSpecName: "config-data") pod "64235e39-5760-4a2e-a164-7cb27ca906a3" (UID: "64235e39-5760-4a2e-a164-7cb27ca906a3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:41:56 crc kubenswrapper[5004]: I1201 08:41:56.385415 5004 generic.go:334] "Generic (PLEG): container finished" podID="64235e39-5760-4a2e-a164-7cb27ca906a3" containerID="78ca9413ec0076671f4c8abfe49abffd1640b7fdb572e7e322984516b6e3783c" exitCode=0 Dec 01 08:41:56 crc kubenswrapper[5004]: I1201 08:41:56.385511 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"64235e39-5760-4a2e-a164-7cb27ca906a3","Type":"ContainerDied","Data":"78ca9413ec0076671f4c8abfe49abffd1640b7fdb572e7e322984516b6e3783c"} Dec 01 08:41:56 crc kubenswrapper[5004]: I1201 08:41:56.385539 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"64235e39-5760-4a2e-a164-7cb27ca906a3","Type":"ContainerDied","Data":"a8a4f0712474d3b24c57adbe9519f964cb29fc5e0b4e8cc281ffbe1f19bbe5da"} Dec 01 08:41:56 crc kubenswrapper[5004]: I1201 08:41:56.385601 5004 scope.go:117] "RemoveContainer" containerID="78ca9413ec0076671f4c8abfe49abffd1640b7fdb572e7e322984516b6e3783c" Dec 01 08:41:56 crc kubenswrapper[5004]: I1201 08:41:56.385745 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 01 08:41:56 crc kubenswrapper[5004]: I1201 08:41:56.390175 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b4d72bbc-0042-4886-9c2a-4b6e6267be65","Type":"ContainerStarted","Data":"3ccbacce6de7615c7d17e7e92b934338112c49b50467e778319e9a0d13af61d6"} Dec 01 08:41:56 crc kubenswrapper[5004]: I1201 08:41:56.392036 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"bbd9e9ac-7391-43f2-9f0c-a10d4dd9f81a","Type":"ContainerStarted","Data":"3fc389ade7c8522a2fb126cec8c3d96d3951e5f61e6dab2a8da385e0ac68b1a7"} Dec 01 08:41:56 crc kubenswrapper[5004]: I1201 08:41:56.393826 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-twpdz\" (UniqueName: \"kubernetes.io/projected/64235e39-5760-4a2e-a164-7cb27ca906a3-kube-api-access-twpdz\") on node \"crc\" DevicePath \"\"" Dec 01 08:41:56 crc kubenswrapper[5004]: I1201 08:41:56.393877 5004 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Dec 01 08:41:56 crc kubenswrapper[5004]: I1201 08:41:56.393889 5004 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/64235e39-5760-4a2e-a164-7cb27ca906a3-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 01 08:41:56 crc kubenswrapper[5004]: I1201 08:41:56.405653 5004 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64235e39-5760-4a2e-a164-7cb27ca906a3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 08:41:56 crc kubenswrapper[5004]: I1201 08:41:56.405679 5004 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/64235e39-5760-4a2e-a164-7cb27ca906a3-scripts\") on node 
\"crc\" DevicePath \"\"" Dec 01 08:41:56 crc kubenswrapper[5004]: I1201 08:41:56.405689 5004 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64235e39-5760-4a2e-a164-7cb27ca906a3-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 08:41:56 crc kubenswrapper[5004]: I1201 08:41:56.415784 5004 scope.go:117] "RemoveContainer" containerID="a4de37a2a551c10a01fbe2aa77cf24579a8bbc8756ad332d58f177ae88a8f0c4" Dec 01 08:41:56 crc kubenswrapper[5004]: I1201 08:41:56.443480 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 01 08:41:56 crc kubenswrapper[5004]: I1201 08:41:56.459593 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 01 08:41:56 crc kubenswrapper[5004]: I1201 08:41:56.473659 5004 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Dec 01 08:41:56 crc kubenswrapper[5004]: I1201 08:41:56.484102 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 01 08:41:56 crc kubenswrapper[5004]: E1201 08:41:56.484529 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64235e39-5760-4a2e-a164-7cb27ca906a3" containerName="glance-httpd" Dec 01 08:41:56 crc kubenswrapper[5004]: I1201 08:41:56.484540 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="64235e39-5760-4a2e-a164-7cb27ca906a3" containerName="glance-httpd" Dec 01 08:41:56 crc kubenswrapper[5004]: E1201 08:41:56.484862 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64235e39-5760-4a2e-a164-7cb27ca906a3" containerName="glance-log" Dec 01 08:41:56 crc kubenswrapper[5004]: I1201 08:41:56.484874 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="64235e39-5760-4a2e-a164-7cb27ca906a3" containerName="glance-log" Dec 01 08:41:56 crc 
kubenswrapper[5004]: I1201 08:41:56.485082 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="64235e39-5760-4a2e-a164-7cb27ca906a3" containerName="glance-log" Dec 01 08:41:56 crc kubenswrapper[5004]: I1201 08:41:56.485097 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="64235e39-5760-4a2e-a164-7cb27ca906a3" containerName="glance-httpd" Dec 01 08:41:56 crc kubenswrapper[5004]: I1201 08:41:56.486239 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 01 08:41:56 crc kubenswrapper[5004]: I1201 08:41:56.492440 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 01 08:41:56 crc kubenswrapper[5004]: I1201 08:41:56.495879 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 01 08:41:56 crc kubenswrapper[5004]: I1201 08:41:56.498340 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 01 08:41:56 crc kubenswrapper[5004]: I1201 08:41:56.511916 5004 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Dec 01 08:41:56 crc kubenswrapper[5004]: I1201 08:41:56.544534 5004 scope.go:117] "RemoveContainer" containerID="78ca9413ec0076671f4c8abfe49abffd1640b7fdb572e7e322984516b6e3783c" Dec 01 08:41:56 crc kubenswrapper[5004]: E1201 08:41:56.549374 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78ca9413ec0076671f4c8abfe49abffd1640b7fdb572e7e322984516b6e3783c\": container with ID starting with 78ca9413ec0076671f4c8abfe49abffd1640b7fdb572e7e322984516b6e3783c not found: ID does not exist" containerID="78ca9413ec0076671f4c8abfe49abffd1640b7fdb572e7e322984516b6e3783c" Dec 01 08:41:56 crc 
kubenswrapper[5004]: I1201 08:41:56.549413 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78ca9413ec0076671f4c8abfe49abffd1640b7fdb572e7e322984516b6e3783c"} err="failed to get container status \"78ca9413ec0076671f4c8abfe49abffd1640b7fdb572e7e322984516b6e3783c\": rpc error: code = NotFound desc = could not find container \"78ca9413ec0076671f4c8abfe49abffd1640b7fdb572e7e322984516b6e3783c\": container with ID starting with 78ca9413ec0076671f4c8abfe49abffd1640b7fdb572e7e322984516b6e3783c not found: ID does not exist" Dec 01 08:41:56 crc kubenswrapper[5004]: I1201 08:41:56.549435 5004 scope.go:117] "RemoveContainer" containerID="a4de37a2a551c10a01fbe2aa77cf24579a8bbc8756ad332d58f177ae88a8f0c4" Dec 01 08:41:56 crc kubenswrapper[5004]: E1201 08:41:56.552077 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4de37a2a551c10a01fbe2aa77cf24579a8bbc8756ad332d58f177ae88a8f0c4\": container with ID starting with a4de37a2a551c10a01fbe2aa77cf24579a8bbc8756ad332d58f177ae88a8f0c4 not found: ID does not exist" containerID="a4de37a2a551c10a01fbe2aa77cf24579a8bbc8756ad332d58f177ae88a8f0c4" Dec 01 08:41:56 crc kubenswrapper[5004]: I1201 08:41:56.552105 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4de37a2a551c10a01fbe2aa77cf24579a8bbc8756ad332d58f177ae88a8f0c4"} err="failed to get container status \"a4de37a2a551c10a01fbe2aa77cf24579a8bbc8756ad332d58f177ae88a8f0c4\": rpc error: code = NotFound desc = could not find container \"a4de37a2a551c10a01fbe2aa77cf24579a8bbc8756ad332d58f177ae88a8f0c4\": container with ID starting with a4de37a2a551c10a01fbe2aa77cf24579a8bbc8756ad332d58f177ae88a8f0c4 not found: ID does not exist" Dec 01 08:41:56 crc kubenswrapper[5004]: I1201 08:41:56.614893 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"2394661d-62f2-4367-bd1e-662a248df799\") " pod="openstack/glance-default-internal-api-0" Dec 01 08:41:56 crc kubenswrapper[5004]: I1201 08:41:56.614982 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2394661d-62f2-4367-bd1e-662a248df799-scripts\") pod \"glance-default-internal-api-0\" (UID: \"2394661d-62f2-4367-bd1e-662a248df799\") " pod="openstack/glance-default-internal-api-0" Dec 01 08:41:56 crc kubenswrapper[5004]: I1201 08:41:56.615009 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2394661d-62f2-4367-bd1e-662a248df799-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"2394661d-62f2-4367-bd1e-662a248df799\") " pod="openstack/glance-default-internal-api-0" Dec 01 08:41:56 crc kubenswrapper[5004]: I1201 08:41:56.615044 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2394661d-62f2-4367-bd1e-662a248df799-config-data\") pod \"glance-default-internal-api-0\" (UID: \"2394661d-62f2-4367-bd1e-662a248df799\") " pod="openstack/glance-default-internal-api-0" Dec 01 08:41:56 crc kubenswrapper[5004]: I1201 08:41:56.615090 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2394661d-62f2-4367-bd1e-662a248df799-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"2394661d-62f2-4367-bd1e-662a248df799\") " pod="openstack/glance-default-internal-api-0" Dec 01 08:41:56 crc kubenswrapper[5004]: I1201 08:41:56.615110 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-csqj4\" (UniqueName: \"kubernetes.io/projected/2394661d-62f2-4367-bd1e-662a248df799-kube-api-access-csqj4\") pod \"glance-default-internal-api-0\" (UID: \"2394661d-62f2-4367-bd1e-662a248df799\") " pod="openstack/glance-default-internal-api-0" Dec 01 08:41:56 crc kubenswrapper[5004]: I1201 08:41:56.615160 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2394661d-62f2-4367-bd1e-662a248df799-logs\") pod \"glance-default-internal-api-0\" (UID: \"2394661d-62f2-4367-bd1e-662a248df799\") " pod="openstack/glance-default-internal-api-0" Dec 01 08:41:56 crc kubenswrapper[5004]: I1201 08:41:56.615179 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2394661d-62f2-4367-bd1e-662a248df799-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"2394661d-62f2-4367-bd1e-662a248df799\") " pod="openstack/glance-default-internal-api-0" Dec 01 08:41:56 crc kubenswrapper[5004]: I1201 08:41:56.716929 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"2394661d-62f2-4367-bd1e-662a248df799\") " pod="openstack/glance-default-internal-api-0" Dec 01 08:41:56 crc kubenswrapper[5004]: I1201 08:41:56.717013 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2394661d-62f2-4367-bd1e-662a248df799-scripts\") pod \"glance-default-internal-api-0\" (UID: \"2394661d-62f2-4367-bd1e-662a248df799\") " pod="openstack/glance-default-internal-api-0" Dec 01 08:41:56 crc kubenswrapper[5004]: I1201 08:41:56.717049 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/2394661d-62f2-4367-bd1e-662a248df799-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"2394661d-62f2-4367-bd1e-662a248df799\") " pod="openstack/glance-default-internal-api-0" Dec 01 08:41:56 crc kubenswrapper[5004]: I1201 08:41:56.717088 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2394661d-62f2-4367-bd1e-662a248df799-config-data\") pod \"glance-default-internal-api-0\" (UID: \"2394661d-62f2-4367-bd1e-662a248df799\") " pod="openstack/glance-default-internal-api-0" Dec 01 08:41:56 crc kubenswrapper[5004]: I1201 08:41:56.717147 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2394661d-62f2-4367-bd1e-662a248df799-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"2394661d-62f2-4367-bd1e-662a248df799\") " pod="openstack/glance-default-internal-api-0" Dec 01 08:41:56 crc kubenswrapper[5004]: I1201 08:41:56.717176 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-csqj4\" (UniqueName: \"kubernetes.io/projected/2394661d-62f2-4367-bd1e-662a248df799-kube-api-access-csqj4\") pod \"glance-default-internal-api-0\" (UID: \"2394661d-62f2-4367-bd1e-662a248df799\") " pod="openstack/glance-default-internal-api-0" Dec 01 08:41:56 crc kubenswrapper[5004]: I1201 08:41:56.717230 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2394661d-62f2-4367-bd1e-662a248df799-logs\") pod \"glance-default-internal-api-0\" (UID: \"2394661d-62f2-4367-bd1e-662a248df799\") " pod="openstack/glance-default-internal-api-0" Dec 01 08:41:56 crc kubenswrapper[5004]: I1201 08:41:56.717258 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/2394661d-62f2-4367-bd1e-662a248df799-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"2394661d-62f2-4367-bd1e-662a248df799\") " pod="openstack/glance-default-internal-api-0" Dec 01 08:41:56 crc kubenswrapper[5004]: I1201 08:41:56.717635 5004 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"2394661d-62f2-4367-bd1e-662a248df799\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/glance-default-internal-api-0" Dec 01 08:41:56 crc kubenswrapper[5004]: I1201 08:41:56.717734 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2394661d-62f2-4367-bd1e-662a248df799-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"2394661d-62f2-4367-bd1e-662a248df799\") " pod="openstack/glance-default-internal-api-0" Dec 01 08:41:56 crc kubenswrapper[5004]: I1201 08:41:56.717851 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2394661d-62f2-4367-bd1e-662a248df799-logs\") pod \"glance-default-internal-api-0\" (UID: \"2394661d-62f2-4367-bd1e-662a248df799\") " pod="openstack/glance-default-internal-api-0" Dec 01 08:41:56 crc kubenswrapper[5004]: I1201 08:41:56.728680 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2394661d-62f2-4367-bd1e-662a248df799-scripts\") pod \"glance-default-internal-api-0\" (UID: \"2394661d-62f2-4367-bd1e-662a248df799\") " pod="openstack/glance-default-internal-api-0" Dec 01 08:41:56 crc kubenswrapper[5004]: I1201 08:41:56.729752 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2394661d-62f2-4367-bd1e-662a248df799-config-data\") pod \"glance-default-internal-api-0\" 
(UID: \"2394661d-62f2-4367-bd1e-662a248df799\") " pod="openstack/glance-default-internal-api-0" Dec 01 08:41:56 crc kubenswrapper[5004]: I1201 08:41:56.731112 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2394661d-62f2-4367-bd1e-662a248df799-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"2394661d-62f2-4367-bd1e-662a248df799\") " pod="openstack/glance-default-internal-api-0" Dec 01 08:41:56 crc kubenswrapper[5004]: I1201 08:41:56.731365 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2394661d-62f2-4367-bd1e-662a248df799-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"2394661d-62f2-4367-bd1e-662a248df799\") " pod="openstack/glance-default-internal-api-0" Dec 01 08:41:56 crc kubenswrapper[5004]: I1201 08:41:56.735983 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-csqj4\" (UniqueName: \"kubernetes.io/projected/2394661d-62f2-4367-bd1e-662a248df799-kube-api-access-csqj4\") pod \"glance-default-internal-api-0\" (UID: \"2394661d-62f2-4367-bd1e-662a248df799\") " pod="openstack/glance-default-internal-api-0" Dec 01 08:41:56 crc kubenswrapper[5004]: I1201 08:41:56.766935 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"2394661d-62f2-4367-bd1e-662a248df799\") " pod="openstack/glance-default-internal-api-0" Dec 01 08:41:56 crc kubenswrapper[5004]: I1201 08:41:56.784127 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64235e39-5760-4a2e-a164-7cb27ca906a3" path="/var/lib/kubelet/pods/64235e39-5760-4a2e-a164-7cb27ca906a3/volumes" Dec 01 08:41:56 crc kubenswrapper[5004]: I1201 08:41:56.822400 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 01 08:41:57 crc kubenswrapper[5004]: I1201 08:41:57.403853 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b4d72bbc-0042-4886-9c2a-4b6e6267be65","Type":"ContainerStarted","Data":"0c51692d57f61cf3d1cc91ae9776de254395d2bef599628de1dd0b0a3cc00229"} Dec 01 08:41:57 crc kubenswrapper[5004]: I1201 08:41:57.407039 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"bbd9e9ac-7391-43f2-9f0c-a10d4dd9f81a","Type":"ContainerStarted","Data":"79e67f6f78e2a017ca94cc627fdd77f209b64fafbd64e198ccf8bb6500b0d755"} Dec 01 08:41:57 crc kubenswrapper[5004]: I1201 08:41:57.437776 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.4377568800000002 podStartE2EDuration="3.43775688s" podCreationTimestamp="2025-12-01 08:41:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:41:57.427350835 +0000 UTC m=+1494.992342827" watchObservedRunningTime="2025-12-01 08:41:57.43775688 +0000 UTC m=+1495.002748852" Dec 01 08:41:57 crc kubenswrapper[5004]: I1201 08:41:57.458935 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 01 08:41:57 crc kubenswrapper[5004]: W1201 08:41:57.459184 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2394661d_62f2_4367_bd1e_662a248df799.slice/crio-c95d85066f4b3b02ae16fbeeddea94a1413053759f71e08b031e723b88c654e1 WatchSource:0}: Error finding container c95d85066f4b3b02ae16fbeeddea94a1413053759f71e08b031e723b88c654e1: Status 404 returned error can't find the container with id c95d85066f4b3b02ae16fbeeddea94a1413053759f71e08b031e723b88c654e1 Dec 01 08:41:57 crc 
kubenswrapper[5004]: I1201 08:41:57.926820 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-5d95bb6c5-cljg2" Dec 01 08:41:58 crc kubenswrapper[5004]: I1201 08:41:58.436947 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2394661d-62f2-4367-bd1e-662a248df799","Type":"ContainerStarted","Data":"3efc8210c3fba197586cdf2ca4b1b174c13764cdd6e9d15098d79606b13dd031"} Dec 01 08:41:58 crc kubenswrapper[5004]: I1201 08:41:58.437435 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2394661d-62f2-4367-bd1e-662a248df799","Type":"ContainerStarted","Data":"c95d85066f4b3b02ae16fbeeddea94a1413053759f71e08b031e723b88c654e1"} Dec 01 08:41:58 crc kubenswrapper[5004]: I1201 08:41:58.681743 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-598mn"] Dec 01 08:41:58 crc kubenswrapper[5004]: I1201 08:41:58.683210 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-598mn" Dec 01 08:41:58 crc kubenswrapper[5004]: I1201 08:41:58.686224 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-vqbz7" Dec 01 08:41:58 crc kubenswrapper[5004]: I1201 08:41:58.686333 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Dec 01 08:41:58 crc kubenswrapper[5004]: I1201 08:41:58.686494 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 01 08:41:58 crc kubenswrapper[5004]: I1201 08:41:58.692245 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-598mn"] Dec 01 08:41:58 crc kubenswrapper[5004]: I1201 08:41:58.770012 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/058c0853-8613-4681-b617-fb985abce304-scripts\") pod \"nova-cell0-conductor-db-sync-598mn\" (UID: \"058c0853-8613-4681-b617-fb985abce304\") " pod="openstack/nova-cell0-conductor-db-sync-598mn" Dec 01 08:41:58 crc kubenswrapper[5004]: I1201 08:41:58.770201 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2w2t5\" (UniqueName: \"kubernetes.io/projected/058c0853-8613-4681-b617-fb985abce304-kube-api-access-2w2t5\") pod \"nova-cell0-conductor-db-sync-598mn\" (UID: \"058c0853-8613-4681-b617-fb985abce304\") " pod="openstack/nova-cell0-conductor-db-sync-598mn" Dec 01 08:41:58 crc kubenswrapper[5004]: I1201 08:41:58.770237 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/058c0853-8613-4681-b617-fb985abce304-config-data\") pod \"nova-cell0-conductor-db-sync-598mn\" (UID: \"058c0853-8613-4681-b617-fb985abce304\") " 
pod="openstack/nova-cell0-conductor-db-sync-598mn" Dec 01 08:41:58 crc kubenswrapper[5004]: I1201 08:41:58.770257 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/058c0853-8613-4681-b617-fb985abce304-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-598mn\" (UID: \"058c0853-8613-4681-b617-fb985abce304\") " pod="openstack/nova-cell0-conductor-db-sync-598mn" Dec 01 08:41:58 crc kubenswrapper[5004]: I1201 08:41:58.872622 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/058c0853-8613-4681-b617-fb985abce304-scripts\") pod \"nova-cell0-conductor-db-sync-598mn\" (UID: \"058c0853-8613-4681-b617-fb985abce304\") " pod="openstack/nova-cell0-conductor-db-sync-598mn" Dec 01 08:41:58 crc kubenswrapper[5004]: I1201 08:41:58.872806 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2w2t5\" (UniqueName: \"kubernetes.io/projected/058c0853-8613-4681-b617-fb985abce304-kube-api-access-2w2t5\") pod \"nova-cell0-conductor-db-sync-598mn\" (UID: \"058c0853-8613-4681-b617-fb985abce304\") " pod="openstack/nova-cell0-conductor-db-sync-598mn" Dec 01 08:41:58 crc kubenswrapper[5004]: I1201 08:41:58.872836 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/058c0853-8613-4681-b617-fb985abce304-config-data\") pod \"nova-cell0-conductor-db-sync-598mn\" (UID: \"058c0853-8613-4681-b617-fb985abce304\") " pod="openstack/nova-cell0-conductor-db-sync-598mn" Dec 01 08:41:58 crc kubenswrapper[5004]: I1201 08:41:58.872857 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/058c0853-8613-4681-b617-fb985abce304-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-598mn\" (UID: 
\"058c0853-8613-4681-b617-fb985abce304\") " pod="openstack/nova-cell0-conductor-db-sync-598mn" Dec 01 08:41:58 crc kubenswrapper[5004]: I1201 08:41:58.878167 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/058c0853-8613-4681-b617-fb985abce304-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-598mn\" (UID: \"058c0853-8613-4681-b617-fb985abce304\") " pod="openstack/nova-cell0-conductor-db-sync-598mn" Dec 01 08:41:58 crc kubenswrapper[5004]: I1201 08:41:58.878617 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/058c0853-8613-4681-b617-fb985abce304-scripts\") pod \"nova-cell0-conductor-db-sync-598mn\" (UID: \"058c0853-8613-4681-b617-fb985abce304\") " pod="openstack/nova-cell0-conductor-db-sync-598mn" Dec 01 08:41:58 crc kubenswrapper[5004]: I1201 08:41:58.878664 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/058c0853-8613-4681-b617-fb985abce304-config-data\") pod \"nova-cell0-conductor-db-sync-598mn\" (UID: \"058c0853-8613-4681-b617-fb985abce304\") " pod="openstack/nova-cell0-conductor-db-sync-598mn" Dec 01 08:41:58 crc kubenswrapper[5004]: I1201 08:41:58.889145 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2w2t5\" (UniqueName: \"kubernetes.io/projected/058c0853-8613-4681-b617-fb985abce304-kube-api-access-2w2t5\") pod \"nova-cell0-conductor-db-sync-598mn\" (UID: \"058c0853-8613-4681-b617-fb985abce304\") " pod="openstack/nova-cell0-conductor-db-sync-598mn" Dec 01 08:41:58 crc kubenswrapper[5004]: I1201 08:41:58.911745 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-cfnapi-5b78cf6765-f2dsx" Dec 01 08:41:58 crc kubenswrapper[5004]: I1201 08:41:58.967271 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/heat-cfnapi-67bc95bb56-lkml4"] Dec 01 08:41:59 crc kubenswrapper[5004]: I1201 08:41:59.014967 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-598mn" Dec 01 08:41:59 crc kubenswrapper[5004]: I1201 08:41:59.018475 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-api-77c6469896-8fmqx" Dec 01 08:41:59 crc kubenswrapper[5004]: I1201 08:41:59.080327 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-7f549dcc6f-fhw4x"] Dec 01 08:41:59 crc kubenswrapper[5004]: I1201 08:41:59.457619 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b4d72bbc-0042-4886-9c2a-4b6e6267be65","Type":"ContainerStarted","Data":"e0a094639d8bdc73ef49a7aff999acb00e8a1093ed995e306a787688b6edb9f9"} Dec 01 08:41:59 crc kubenswrapper[5004]: I1201 08:41:59.457803 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 01 08:41:59 crc kubenswrapper[5004]: I1201 08:41:59.460768 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2394661d-62f2-4367-bd1e-662a248df799","Type":"ContainerStarted","Data":"355d2e8be218446e87ca9f2afc3386db92696bd388d7e93f7d6f34bbed57bd39"} Dec 01 08:41:59 crc kubenswrapper[5004]: I1201 08:41:59.464843 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-67bc95bb56-lkml4" event={"ID":"244abe13-3114-4206-9505-f3b0fdda447e","Type":"ContainerDied","Data":"a6aec3e56a4058bc861d9c00b25c2c2c075fd48b6cd6964d39c63e8c22c834c0"} Dec 01 08:41:59 crc kubenswrapper[5004]: I1201 08:41:59.464879 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a6aec3e56a4058bc861d9c00b25c2c2c075fd48b6cd6964d39c63e8c22c834c0" Dec 01 08:41:59 crc kubenswrapper[5004]: I1201 08:41:59.481806 5004 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.22992127 podStartE2EDuration="6.481790768s" podCreationTimestamp="2025-12-01 08:41:53 +0000 UTC" firstStartedPulling="2025-12-01 08:41:54.338407054 +0000 UTC m=+1491.903399036" lastFinishedPulling="2025-12-01 08:41:58.590276562 +0000 UTC m=+1496.155268534" observedRunningTime="2025-12-01 08:41:59.479653856 +0000 UTC m=+1497.044645838" watchObservedRunningTime="2025-12-01 08:41:59.481790768 +0000 UTC m=+1497.046782740" Dec 01 08:41:59 crc kubenswrapper[5004]: I1201 08:41:59.524632 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.524614304 podStartE2EDuration="3.524614304s" podCreationTimestamp="2025-12-01 08:41:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:41:59.502283678 +0000 UTC m=+1497.067275660" watchObservedRunningTime="2025-12-01 08:41:59.524614304 +0000 UTC m=+1497.089606286" Dec 01 08:41:59 crc kubenswrapper[5004]: I1201 08:41:59.548670 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-67bc95bb56-lkml4" Dec 01 08:41:59 crc kubenswrapper[5004]: I1201 08:41:59.652454 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-7f549dcc6f-fhw4x" Dec 01 08:41:59 crc kubenswrapper[5004]: I1201 08:41:59.696362 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/244abe13-3114-4206-9505-f3b0fdda447e-config-data-custom\") pod \"244abe13-3114-4206-9505-f3b0fdda447e\" (UID: \"244abe13-3114-4206-9505-f3b0fdda447e\") " Dec 01 08:41:59 crc kubenswrapper[5004]: I1201 08:41:59.696436 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/244abe13-3114-4206-9505-f3b0fdda447e-config-data\") pod \"244abe13-3114-4206-9505-f3b0fdda447e\" (UID: \"244abe13-3114-4206-9505-f3b0fdda447e\") " Dec 01 08:41:59 crc kubenswrapper[5004]: I1201 08:41:59.696510 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fwt5z\" (UniqueName: \"kubernetes.io/projected/244abe13-3114-4206-9505-f3b0fdda447e-kube-api-access-fwt5z\") pod \"244abe13-3114-4206-9505-f3b0fdda447e\" (UID: \"244abe13-3114-4206-9505-f3b0fdda447e\") " Dec 01 08:41:59 crc kubenswrapper[5004]: I1201 08:41:59.696747 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/244abe13-3114-4206-9505-f3b0fdda447e-combined-ca-bundle\") pod \"244abe13-3114-4206-9505-f3b0fdda447e\" (UID: \"244abe13-3114-4206-9505-f3b0fdda447e\") " Dec 01 08:41:59 crc kubenswrapper[5004]: I1201 08:41:59.702108 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/244abe13-3114-4206-9505-f3b0fdda447e-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "244abe13-3114-4206-9505-f3b0fdda447e" (UID: "244abe13-3114-4206-9505-f3b0fdda447e"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:41:59 crc kubenswrapper[5004]: I1201 08:41:59.705802 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/244abe13-3114-4206-9505-f3b0fdda447e-kube-api-access-fwt5z" (OuterVolumeSpecName: "kube-api-access-fwt5z") pod "244abe13-3114-4206-9505-f3b0fdda447e" (UID: "244abe13-3114-4206-9505-f3b0fdda447e"). InnerVolumeSpecName "kube-api-access-fwt5z". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:41:59 crc kubenswrapper[5004]: I1201 08:41:59.765781 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/244abe13-3114-4206-9505-f3b0fdda447e-config-data" (OuterVolumeSpecName: "config-data") pod "244abe13-3114-4206-9505-f3b0fdda447e" (UID: "244abe13-3114-4206-9505-f3b0fdda447e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:41:59 crc kubenswrapper[5004]: I1201 08:41:59.765894 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/244abe13-3114-4206-9505-f3b0fdda447e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "244abe13-3114-4206-9505-f3b0fdda447e" (UID: "244abe13-3114-4206-9505-f3b0fdda447e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:41:59 crc kubenswrapper[5004]: I1201 08:41:59.782327 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-598mn"] Dec 01 08:41:59 crc kubenswrapper[5004]: I1201 08:41:59.798687 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e866b18-ec1e-4d54-acb5-89b374c7a9d5-combined-ca-bundle\") pod \"0e866b18-ec1e-4d54-acb5-89b374c7a9d5\" (UID: \"0e866b18-ec1e-4d54-acb5-89b374c7a9d5\") " Dec 01 08:41:59 crc kubenswrapper[5004]: I1201 08:41:59.798732 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5xdsx\" (UniqueName: \"kubernetes.io/projected/0e866b18-ec1e-4d54-acb5-89b374c7a9d5-kube-api-access-5xdsx\") pod \"0e866b18-ec1e-4d54-acb5-89b374c7a9d5\" (UID: \"0e866b18-ec1e-4d54-acb5-89b374c7a9d5\") " Dec 01 08:41:59 crc kubenswrapper[5004]: I1201 08:41:59.798752 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e866b18-ec1e-4d54-acb5-89b374c7a9d5-config-data\") pod \"0e866b18-ec1e-4d54-acb5-89b374c7a9d5\" (UID: \"0e866b18-ec1e-4d54-acb5-89b374c7a9d5\") " Dec 01 08:41:59 crc kubenswrapper[5004]: I1201 08:41:59.798917 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0e866b18-ec1e-4d54-acb5-89b374c7a9d5-config-data-custom\") pod \"0e866b18-ec1e-4d54-acb5-89b374c7a9d5\" (UID: \"0e866b18-ec1e-4d54-acb5-89b374c7a9d5\") " Dec 01 08:41:59 crc kubenswrapper[5004]: I1201 08:41:59.799349 5004 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/244abe13-3114-4206-9505-f3b0fdda447e-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 01 08:41:59 crc kubenswrapper[5004]: I1201 08:41:59.799366 5004 
reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/244abe13-3114-4206-9505-f3b0fdda447e-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 08:41:59 crc kubenswrapper[5004]: I1201 08:41:59.799376 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fwt5z\" (UniqueName: \"kubernetes.io/projected/244abe13-3114-4206-9505-f3b0fdda447e-kube-api-access-fwt5z\") on node \"crc\" DevicePath \"\"" Dec 01 08:41:59 crc kubenswrapper[5004]: I1201 08:41:59.799386 5004 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/244abe13-3114-4206-9505-f3b0fdda447e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 08:41:59 crc kubenswrapper[5004]: I1201 08:41:59.801772 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e866b18-ec1e-4d54-acb5-89b374c7a9d5-kube-api-access-5xdsx" (OuterVolumeSpecName: "kube-api-access-5xdsx") pod "0e866b18-ec1e-4d54-acb5-89b374c7a9d5" (UID: "0e866b18-ec1e-4d54-acb5-89b374c7a9d5"). InnerVolumeSpecName "kube-api-access-5xdsx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:41:59 crc kubenswrapper[5004]: I1201 08:41:59.805886 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e866b18-ec1e-4d54-acb5-89b374c7a9d5-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "0e866b18-ec1e-4d54-acb5-89b374c7a9d5" (UID: "0e866b18-ec1e-4d54-acb5-89b374c7a9d5"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:41:59 crc kubenswrapper[5004]: I1201 08:41:59.841240 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e866b18-ec1e-4d54-acb5-89b374c7a9d5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0e866b18-ec1e-4d54-acb5-89b374c7a9d5" (UID: "0e866b18-ec1e-4d54-acb5-89b374c7a9d5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:41:59 crc kubenswrapper[5004]: I1201 08:41:59.869283 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e866b18-ec1e-4d54-acb5-89b374c7a9d5-config-data" (OuterVolumeSpecName: "config-data") pod "0e866b18-ec1e-4d54-acb5-89b374c7a9d5" (UID: "0e866b18-ec1e-4d54-acb5-89b374c7a9d5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:41:59 crc kubenswrapper[5004]: I1201 08:41:59.901481 5004 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0e866b18-ec1e-4d54-acb5-89b374c7a9d5-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 01 08:41:59 crc kubenswrapper[5004]: I1201 08:41:59.901523 5004 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e866b18-ec1e-4d54-acb5-89b374c7a9d5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 08:41:59 crc kubenswrapper[5004]: I1201 08:41:59.901535 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5xdsx\" (UniqueName: \"kubernetes.io/projected/0e866b18-ec1e-4d54-acb5-89b374c7a9d5-kube-api-access-5xdsx\") on node \"crc\" DevicePath \"\"" Dec 01 08:41:59 crc kubenswrapper[5004]: I1201 08:41:59.901550 5004 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e866b18-ec1e-4d54-acb5-89b374c7a9d5-config-data\") on node \"crc\" 
DevicePath \"\"" Dec 01 08:42:00 crc kubenswrapper[5004]: I1201 08:42:00.475400 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-598mn" event={"ID":"058c0853-8613-4681-b617-fb985abce304","Type":"ContainerStarted","Data":"3f95466b6783581424d931dfa98d093704a23c2fb75f3a3df17f6f349eb29597"} Dec 01 08:42:00 crc kubenswrapper[5004]: I1201 08:42:00.477137 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-7f549dcc6f-fhw4x" event={"ID":"0e866b18-ec1e-4d54-acb5-89b374c7a9d5","Type":"ContainerDied","Data":"24036210d6eb5aed7ffce704d89da4631a3c0290b3ff7524fcd9a1988f9d850e"} Dec 01 08:42:00 crc kubenswrapper[5004]: I1201 08:42:00.477196 5004 scope.go:117] "RemoveContainer" containerID="11c240da9eb0d4278fdf2349444d2c6f8636a7477bbbb29db5b2bcdd9596e62f" Dec 01 08:42:00 crc kubenswrapper[5004]: I1201 08:42:00.477307 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-7f549dcc6f-fhw4x" Dec 01 08:42:00 crc kubenswrapper[5004]: I1201 08:42:00.479032 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-67bc95bb56-lkml4" Dec 01 08:42:00 crc kubenswrapper[5004]: I1201 08:42:00.578621 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-67bc95bb56-lkml4"] Dec 01 08:42:00 crc kubenswrapper[5004]: I1201 08:42:00.587025 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-67bc95bb56-lkml4"] Dec 01 08:42:00 crc kubenswrapper[5004]: I1201 08:42:00.598608 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-7f549dcc6f-fhw4x"] Dec 01 08:42:00 crc kubenswrapper[5004]: I1201 08:42:00.608464 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-7f549dcc6f-fhw4x"] Dec 01 08:42:00 crc kubenswrapper[5004]: I1201 08:42:00.771751 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e866b18-ec1e-4d54-acb5-89b374c7a9d5" path="/var/lib/kubelet/pods/0e866b18-ec1e-4d54-acb5-89b374c7a9d5/volumes" Dec 01 08:42:00 crc kubenswrapper[5004]: I1201 08:42:00.772371 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="244abe13-3114-4206-9505-f3b0fdda447e" path="/var/lib/kubelet/pods/244abe13-3114-4206-9505-f3b0fdda447e/volumes" Dec 01 08:42:04 crc kubenswrapper[5004]: I1201 08:42:04.784188 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 01 08:42:04 crc kubenswrapper[5004]: I1201 08:42:04.784628 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 01 08:42:04 crc kubenswrapper[5004]: I1201 08:42:04.832328 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 01 08:42:04 crc kubenswrapper[5004]: I1201 08:42:04.844314 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 01 08:42:05 crc kubenswrapper[5004]: 
I1201 08:42:05.265959 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-7cfb9cfbf9-fqxms" Dec 01 08:42:05 crc kubenswrapper[5004]: I1201 08:42:05.318341 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-5d95bb6c5-cljg2"] Dec 01 08:42:05 crc kubenswrapper[5004]: I1201 08:42:05.318619 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-engine-5d95bb6c5-cljg2" podUID="3ecc657f-21cf-40d3-8f77-29a9f9c4f5f2" containerName="heat-engine" containerID="cri-o://347b335225f2c0e10e2c63c058f0135ca01ba3b404f9d2161dd6a7d8bc0f56e9" gracePeriod=60 Dec 01 08:42:05 crc kubenswrapper[5004]: I1201 08:42:05.582343 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 01 08:42:05 crc kubenswrapper[5004]: I1201 08:42:05.582390 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 01 08:42:06 crc kubenswrapper[5004]: I1201 08:42:06.823015 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 01 08:42:06 crc kubenswrapper[5004]: I1201 08:42:06.823309 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 01 08:42:06 crc kubenswrapper[5004]: I1201 08:42:06.859431 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 01 08:42:06 crc kubenswrapper[5004]: I1201 08:42:06.868502 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 01 08:42:07 crc kubenswrapper[5004]: I1201 08:42:07.606608 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 01 08:42:07 crc kubenswrapper[5004]: I1201 08:42:07.606645 5004 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 01 08:42:07 crc kubenswrapper[5004]: E1201 08:42:07.890846 5004 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="347b335225f2c0e10e2c63c058f0135ca01ba3b404f9d2161dd6a7d8bc0f56e9" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Dec 01 08:42:07 crc kubenswrapper[5004]: E1201 08:42:07.894189 5004 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="347b335225f2c0e10e2c63c058f0135ca01ba3b404f9d2161dd6a7d8bc0f56e9" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Dec 01 08:42:07 crc kubenswrapper[5004]: E1201 08:42:07.896378 5004 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="347b335225f2c0e10e2c63c058f0135ca01ba3b404f9d2161dd6a7d8bc0f56e9" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Dec 01 08:42:07 crc kubenswrapper[5004]: E1201 08:42:07.896454 5004 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/heat-engine-5d95bb6c5-cljg2" podUID="3ecc657f-21cf-40d3-8f77-29a9f9c4f5f2" containerName="heat-engine" Dec 01 08:42:07 crc kubenswrapper[5004]: I1201 08:42:07.969697 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 01 08:42:07 crc kubenswrapper[5004]: I1201 08:42:07.969754 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/glance-default-external-api-0" Dec 01 08:42:09 crc kubenswrapper[5004]: I1201 08:42:09.600842 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 01 08:42:09 crc kubenswrapper[5004]: I1201 08:42:09.632829 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-598mn" event={"ID":"058c0853-8613-4681-b617-fb985abce304","Type":"ContainerStarted","Data":"4936719999b16beb7679b4dd16c55d0eafcd9b7145a203c8432b4cf062a9f454"} Dec 01 08:42:09 crc kubenswrapper[5004]: I1201 08:42:09.632856 5004 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 01 08:42:09 crc kubenswrapper[5004]: I1201 08:42:09.656633 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 01 08:42:09 crc kubenswrapper[5004]: I1201 08:42:09.660578 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-598mn" podStartSLOduration=2.6858914289999998 podStartE2EDuration="11.660535848s" podCreationTimestamp="2025-12-01 08:41:58 +0000 UTC" firstStartedPulling="2025-12-01 08:41:59.765774534 +0000 UTC m=+1497.330766516" lastFinishedPulling="2025-12-01 08:42:08.740418953 +0000 UTC m=+1506.305410935" observedRunningTime="2025-12-01 08:42:09.649706084 +0000 UTC m=+1507.214698066" watchObservedRunningTime="2025-12-01 08:42:09.660535848 +0000 UTC m=+1507.225527830" Dec 01 08:42:10 crc kubenswrapper[5004]: I1201 08:42:10.472356 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 08:42:10 crc kubenswrapper[5004]: I1201 08:42:10.472649 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b4d72bbc-0042-4886-9c2a-4b6e6267be65" containerName="ceilometer-central-agent" 
containerID="cri-o://080966e70d6fb040c67da27f631647c6eeb3d335f9e02faa0f46df4b3174c846" gracePeriod=30 Dec 01 08:42:10 crc kubenswrapper[5004]: I1201 08:42:10.473685 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b4d72bbc-0042-4886-9c2a-4b6e6267be65" containerName="proxy-httpd" containerID="cri-o://e0a094639d8bdc73ef49a7aff999acb00e8a1093ed995e306a787688b6edb9f9" gracePeriod=30 Dec 01 08:42:10 crc kubenswrapper[5004]: I1201 08:42:10.473737 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b4d72bbc-0042-4886-9c2a-4b6e6267be65" containerName="sg-core" containerID="cri-o://0c51692d57f61cf3d1cc91ae9776de254395d2bef599628de1dd0b0a3cc00229" gracePeriod=30 Dec 01 08:42:10 crc kubenswrapper[5004]: I1201 08:42:10.473766 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b4d72bbc-0042-4886-9c2a-4b6e6267be65" containerName="ceilometer-notification-agent" containerID="cri-o://3ccbacce6de7615c7d17e7e92b934338112c49b50467e778319e9a0d13af61d6" gracePeriod=30 Dec 01 08:42:10 crc kubenswrapper[5004]: I1201 08:42:10.587155 5004 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="b4d72bbc-0042-4886-9c2a-4b6e6267be65" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.224:3000/\": read tcp 10.217.0.2:51870->10.217.0.224:3000: read: connection reset by peer" Dec 01 08:42:10 crc kubenswrapper[5004]: I1201 08:42:10.656122 5004 generic.go:334] "Generic (PLEG): container finished" podID="b4d72bbc-0042-4886-9c2a-4b6e6267be65" containerID="0c51692d57f61cf3d1cc91ae9776de254395d2bef599628de1dd0b0a3cc00229" exitCode=2 Dec 01 08:42:10 crc kubenswrapper[5004]: I1201 08:42:10.656220 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"b4d72bbc-0042-4886-9c2a-4b6e6267be65","Type":"ContainerDied","Data":"0c51692d57f61cf3d1cc91ae9776de254395d2bef599628de1dd0b0a3cc00229"} Dec 01 08:42:11 crc kubenswrapper[5004]: I1201 08:42:11.670037 5004 generic.go:334] "Generic (PLEG): container finished" podID="b4d72bbc-0042-4886-9c2a-4b6e6267be65" containerID="e0a094639d8bdc73ef49a7aff999acb00e8a1093ed995e306a787688b6edb9f9" exitCode=0 Dec 01 08:42:11 crc kubenswrapper[5004]: I1201 08:42:11.671107 5004 generic.go:334] "Generic (PLEG): container finished" podID="b4d72bbc-0042-4886-9c2a-4b6e6267be65" containerID="080966e70d6fb040c67da27f631647c6eeb3d335f9e02faa0f46df4b3174c846" exitCode=0 Dec 01 08:42:11 crc kubenswrapper[5004]: I1201 08:42:11.670115 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b4d72bbc-0042-4886-9c2a-4b6e6267be65","Type":"ContainerDied","Data":"e0a094639d8bdc73ef49a7aff999acb00e8a1093ed995e306a787688b6edb9f9"} Dec 01 08:42:11 crc kubenswrapper[5004]: I1201 08:42:11.671252 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b4d72bbc-0042-4886-9c2a-4b6e6267be65","Type":"ContainerDied","Data":"080966e70d6fb040c67da27f631647c6eeb3d335f9e02faa0f46df4b3174c846"} Dec 01 08:42:11 crc kubenswrapper[5004]: I1201 08:42:11.978746 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-1bcc-account-create-update-99f9v"] Dec 01 08:42:11 crc kubenswrapper[5004]: E1201 08:42:11.982826 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="244abe13-3114-4206-9505-f3b0fdda447e" containerName="heat-cfnapi" Dec 01 08:42:11 crc kubenswrapper[5004]: I1201 08:42:11.982851 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="244abe13-3114-4206-9505-f3b0fdda447e" containerName="heat-cfnapi" Dec 01 08:42:11 crc kubenswrapper[5004]: E1201 08:42:11.982889 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e866b18-ec1e-4d54-acb5-89b374c7a9d5" 
containerName="heat-api" Dec 01 08:42:11 crc kubenswrapper[5004]: I1201 08:42:11.982896 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e866b18-ec1e-4d54-acb5-89b374c7a9d5" containerName="heat-api" Dec 01 08:42:11 crc kubenswrapper[5004]: I1201 08:42:11.983353 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e866b18-ec1e-4d54-acb5-89b374c7a9d5" containerName="heat-api" Dec 01 08:42:11 crc kubenswrapper[5004]: I1201 08:42:11.983372 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="244abe13-3114-4206-9505-f3b0fdda447e" containerName="heat-cfnapi" Dec 01 08:42:11 crc kubenswrapper[5004]: I1201 08:42:11.983393 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e866b18-ec1e-4d54-acb5-89b374c7a9d5" containerName="heat-api" Dec 01 08:42:11 crc kubenswrapper[5004]: I1201 08:42:11.983408 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="244abe13-3114-4206-9505-f3b0fdda447e" containerName="heat-cfnapi" Dec 01 08:42:11 crc kubenswrapper[5004]: I1201 08:42:11.984496 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-1bcc-account-create-update-99f9v" Dec 01 08:42:11 crc kubenswrapper[5004]: I1201 08:42:11.993273 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-db-secret" Dec 01 08:42:11 crc kubenswrapper[5004]: I1201 08:42:11.995531 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-create-rpt6t"] Dec 01 08:42:11 crc kubenswrapper[5004]: E1201 08:42:11.996212 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="244abe13-3114-4206-9505-f3b0fdda447e" containerName="heat-cfnapi" Dec 01 08:42:11 crc kubenswrapper[5004]: I1201 08:42:11.996240 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="244abe13-3114-4206-9505-f3b0fdda447e" containerName="heat-cfnapi" Dec 01 08:42:11 crc kubenswrapper[5004]: E1201 08:42:11.996269 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e866b18-ec1e-4d54-acb5-89b374c7a9d5" containerName="heat-api" Dec 01 08:42:11 crc kubenswrapper[5004]: I1201 08:42:11.996277 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e866b18-ec1e-4d54-acb5-89b374c7a9d5" containerName="heat-api" Dec 01 08:42:12 crc kubenswrapper[5004]: I1201 08:42:12.019099 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-1bcc-account-create-update-99f9v"] Dec 01 08:42:12 crc kubenswrapper[5004]: I1201 08:42:12.019217 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-create-rpt6t" Dec 01 08:42:12 crc kubenswrapper[5004]: I1201 08:42:12.021221 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-rpt6t"] Dec 01 08:42:12 crc kubenswrapper[5004]: I1201 08:42:12.029806 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nplbl\" (UniqueName: \"kubernetes.io/projected/47a280b1-dda9-451b-8791-990446098df5-kube-api-access-nplbl\") pod \"aodh-1bcc-account-create-update-99f9v\" (UID: \"47a280b1-dda9-451b-8791-990446098df5\") " pod="openstack/aodh-1bcc-account-create-update-99f9v" Dec 01 08:42:12 crc kubenswrapper[5004]: I1201 08:42:12.029933 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/47a280b1-dda9-451b-8791-990446098df5-operator-scripts\") pod \"aodh-1bcc-account-create-update-99f9v\" (UID: \"47a280b1-dda9-451b-8791-990446098df5\") " pod="openstack/aodh-1bcc-account-create-update-99f9v" Dec 01 08:42:12 crc kubenswrapper[5004]: I1201 08:42:12.131538 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8cbde161-56a4-49c2-9e44-e86fc7c4a82f-operator-scripts\") pod \"aodh-db-create-rpt6t\" (UID: \"8cbde161-56a4-49c2-9e44-e86fc7c4a82f\") " pod="openstack/aodh-db-create-rpt6t" Dec 01 08:42:12 crc kubenswrapper[5004]: I1201 08:42:12.131644 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8x56b\" (UniqueName: \"kubernetes.io/projected/8cbde161-56a4-49c2-9e44-e86fc7c4a82f-kube-api-access-8x56b\") pod \"aodh-db-create-rpt6t\" (UID: \"8cbde161-56a4-49c2-9e44-e86fc7c4a82f\") " pod="openstack/aodh-db-create-rpt6t" Dec 01 08:42:12 crc kubenswrapper[5004]: I1201 08:42:12.131675 5004 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-nplbl\" (UniqueName: \"kubernetes.io/projected/47a280b1-dda9-451b-8791-990446098df5-kube-api-access-nplbl\") pod \"aodh-1bcc-account-create-update-99f9v\" (UID: \"47a280b1-dda9-451b-8791-990446098df5\") " pod="openstack/aodh-1bcc-account-create-update-99f9v" Dec 01 08:42:12 crc kubenswrapper[5004]: I1201 08:42:12.131723 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/47a280b1-dda9-451b-8791-990446098df5-operator-scripts\") pod \"aodh-1bcc-account-create-update-99f9v\" (UID: \"47a280b1-dda9-451b-8791-990446098df5\") " pod="openstack/aodh-1bcc-account-create-update-99f9v" Dec 01 08:42:12 crc kubenswrapper[5004]: I1201 08:42:12.132842 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/47a280b1-dda9-451b-8791-990446098df5-operator-scripts\") pod \"aodh-1bcc-account-create-update-99f9v\" (UID: \"47a280b1-dda9-451b-8791-990446098df5\") " pod="openstack/aodh-1bcc-account-create-update-99f9v" Dec 01 08:42:12 crc kubenswrapper[5004]: I1201 08:42:12.153419 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nplbl\" (UniqueName: \"kubernetes.io/projected/47a280b1-dda9-451b-8791-990446098df5-kube-api-access-nplbl\") pod \"aodh-1bcc-account-create-update-99f9v\" (UID: \"47a280b1-dda9-451b-8791-990446098df5\") " pod="openstack/aodh-1bcc-account-create-update-99f9v" Dec 01 08:42:12 crc kubenswrapper[5004]: I1201 08:42:12.233202 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8cbde161-56a4-49c2-9e44-e86fc7c4a82f-operator-scripts\") pod \"aodh-db-create-rpt6t\" (UID: \"8cbde161-56a4-49c2-9e44-e86fc7c4a82f\") " pod="openstack/aodh-db-create-rpt6t" Dec 01 08:42:12 crc kubenswrapper[5004]: I1201 08:42:12.233294 5004 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8x56b\" (UniqueName: \"kubernetes.io/projected/8cbde161-56a4-49c2-9e44-e86fc7c4a82f-kube-api-access-8x56b\") pod \"aodh-db-create-rpt6t\" (UID: \"8cbde161-56a4-49c2-9e44-e86fc7c4a82f\") " pod="openstack/aodh-db-create-rpt6t" Dec 01 08:42:12 crc kubenswrapper[5004]: I1201 08:42:12.234304 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8cbde161-56a4-49c2-9e44-e86fc7c4a82f-operator-scripts\") pod \"aodh-db-create-rpt6t\" (UID: \"8cbde161-56a4-49c2-9e44-e86fc7c4a82f\") " pod="openstack/aodh-db-create-rpt6t" Dec 01 08:42:12 crc kubenswrapper[5004]: I1201 08:42:12.250933 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8x56b\" (UniqueName: \"kubernetes.io/projected/8cbde161-56a4-49c2-9e44-e86fc7c4a82f-kube-api-access-8x56b\") pod \"aodh-db-create-rpt6t\" (UID: \"8cbde161-56a4-49c2-9e44-e86fc7c4a82f\") " pod="openstack/aodh-db-create-rpt6t" Dec 01 08:42:12 crc kubenswrapper[5004]: I1201 08:42:12.320340 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-1bcc-account-create-update-99f9v" Dec 01 08:42:12 crc kubenswrapper[5004]: I1201 08:42:12.358064 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-rpt6t" Dec 01 08:42:12 crc kubenswrapper[5004]: I1201 08:42:12.630114 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-5d95bb6c5-cljg2" Dec 01 08:42:12 crc kubenswrapper[5004]: I1201 08:42:12.708830 5004 generic.go:334] "Generic (PLEG): container finished" podID="3ecc657f-21cf-40d3-8f77-29a9f9c4f5f2" containerID="347b335225f2c0e10e2c63c058f0135ca01ba3b404f9d2161dd6a7d8bc0f56e9" exitCode=0 Dec 01 08:42:12 crc kubenswrapper[5004]: I1201 08:42:12.708870 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-5d95bb6c5-cljg2" event={"ID":"3ecc657f-21cf-40d3-8f77-29a9f9c4f5f2","Type":"ContainerDied","Data":"347b335225f2c0e10e2c63c058f0135ca01ba3b404f9d2161dd6a7d8bc0f56e9"} Dec 01 08:42:12 crc kubenswrapper[5004]: I1201 08:42:12.708895 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-5d95bb6c5-cljg2" event={"ID":"3ecc657f-21cf-40d3-8f77-29a9f9c4f5f2","Type":"ContainerDied","Data":"e7c1e8a89251856a63c98529f409523d5cc3fe18ece5baa926844fd4b60da735"} Dec 01 08:42:12 crc kubenswrapper[5004]: I1201 08:42:12.708913 5004 scope.go:117] "RemoveContainer" containerID="347b335225f2c0e10e2c63c058f0135ca01ba3b404f9d2161dd6a7d8bc0f56e9" Dec 01 08:42:12 crc kubenswrapper[5004]: I1201 08:42:12.709029 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-5d95bb6c5-cljg2" Dec 01 08:42:12 crc kubenswrapper[5004]: I1201 08:42:12.737598 5004 scope.go:117] "RemoveContainer" containerID="347b335225f2c0e10e2c63c058f0135ca01ba3b404f9d2161dd6a7d8bc0f56e9" Dec 01 08:42:12 crc kubenswrapper[5004]: E1201 08:42:12.738360 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"347b335225f2c0e10e2c63c058f0135ca01ba3b404f9d2161dd6a7d8bc0f56e9\": container with ID starting with 347b335225f2c0e10e2c63c058f0135ca01ba3b404f9d2161dd6a7d8bc0f56e9 not found: ID does not exist" containerID="347b335225f2c0e10e2c63c058f0135ca01ba3b404f9d2161dd6a7d8bc0f56e9" Dec 01 08:42:12 crc kubenswrapper[5004]: I1201 08:42:12.738401 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"347b335225f2c0e10e2c63c058f0135ca01ba3b404f9d2161dd6a7d8bc0f56e9"} err="failed to get container status \"347b335225f2c0e10e2c63c058f0135ca01ba3b404f9d2161dd6a7d8bc0f56e9\": rpc error: code = NotFound desc = could not find container \"347b335225f2c0e10e2c63c058f0135ca01ba3b404f9d2161dd6a7d8bc0f56e9\": container with ID starting with 347b335225f2c0e10e2c63c058f0135ca01ba3b404f9d2161dd6a7d8bc0f56e9 not found: ID does not exist" Dec 01 08:42:12 crc kubenswrapper[5004]: I1201 08:42:12.751517 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m4jhq\" (UniqueName: \"kubernetes.io/projected/3ecc657f-21cf-40d3-8f77-29a9f9c4f5f2-kube-api-access-m4jhq\") pod \"3ecc657f-21cf-40d3-8f77-29a9f9c4f5f2\" (UID: \"3ecc657f-21cf-40d3-8f77-29a9f9c4f5f2\") " Dec 01 08:42:12 crc kubenswrapper[5004]: I1201 08:42:12.751830 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3ecc657f-21cf-40d3-8f77-29a9f9c4f5f2-config-data-custom\") pod \"3ecc657f-21cf-40d3-8f77-29a9f9c4f5f2\" (UID: 
\"3ecc657f-21cf-40d3-8f77-29a9f9c4f5f2\") " Dec 01 08:42:12 crc kubenswrapper[5004]: I1201 08:42:12.751976 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ecc657f-21cf-40d3-8f77-29a9f9c4f5f2-config-data\") pod \"3ecc657f-21cf-40d3-8f77-29a9f9c4f5f2\" (UID: \"3ecc657f-21cf-40d3-8f77-29a9f9c4f5f2\") " Dec 01 08:42:12 crc kubenswrapper[5004]: I1201 08:42:12.752012 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ecc657f-21cf-40d3-8f77-29a9f9c4f5f2-combined-ca-bundle\") pod \"3ecc657f-21cf-40d3-8f77-29a9f9c4f5f2\" (UID: \"3ecc657f-21cf-40d3-8f77-29a9f9c4f5f2\") " Dec 01 08:42:12 crc kubenswrapper[5004]: I1201 08:42:12.759419 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ecc657f-21cf-40d3-8f77-29a9f9c4f5f2-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "3ecc657f-21cf-40d3-8f77-29a9f9c4f5f2" (UID: "3ecc657f-21cf-40d3-8f77-29a9f9c4f5f2"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:42:12 crc kubenswrapper[5004]: I1201 08:42:12.759987 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ecc657f-21cf-40d3-8f77-29a9f9c4f5f2-kube-api-access-m4jhq" (OuterVolumeSpecName: "kube-api-access-m4jhq") pod "3ecc657f-21cf-40d3-8f77-29a9f9c4f5f2" (UID: "3ecc657f-21cf-40d3-8f77-29a9f9c4f5f2"). InnerVolumeSpecName "kube-api-access-m4jhq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:42:12 crc kubenswrapper[5004]: I1201 08:42:12.854642 5004 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3ecc657f-21cf-40d3-8f77-29a9f9c4f5f2-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 01 08:42:12 crc kubenswrapper[5004]: I1201 08:42:12.854676 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m4jhq\" (UniqueName: \"kubernetes.io/projected/3ecc657f-21cf-40d3-8f77-29a9f9c4f5f2-kube-api-access-m4jhq\") on node \"crc\" DevicePath \"\"" Dec 01 08:42:12 crc kubenswrapper[5004]: I1201 08:42:12.921691 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ecc657f-21cf-40d3-8f77-29a9f9c4f5f2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3ecc657f-21cf-40d3-8f77-29a9f9c4f5f2" (UID: "3ecc657f-21cf-40d3-8f77-29a9f9c4f5f2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:42:12 crc kubenswrapper[5004]: I1201 08:42:12.956902 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-1bcc-account-create-update-99f9v"] Dec 01 08:42:12 crc kubenswrapper[5004]: I1201 08:42:12.959201 5004 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ecc657f-21cf-40d3-8f77-29a9f9c4f5f2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 08:42:12 crc kubenswrapper[5004]: I1201 08:42:12.970550 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ecc657f-21cf-40d3-8f77-29a9f9c4f5f2-config-data" (OuterVolumeSpecName: "config-data") pod "3ecc657f-21cf-40d3-8f77-29a9f9c4f5f2" (UID: "3ecc657f-21cf-40d3-8f77-29a9f9c4f5f2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:42:13 crc kubenswrapper[5004]: I1201 08:42:13.061161 5004 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ecc657f-21cf-40d3-8f77-29a9f9c4f5f2-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 08:42:13 crc kubenswrapper[5004]: I1201 08:42:13.085458 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-5d95bb6c5-cljg2"] Dec 01 08:42:13 crc kubenswrapper[5004]: I1201 08:42:13.110275 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-engine-5d95bb6c5-cljg2"] Dec 01 08:42:13 crc kubenswrapper[5004]: I1201 08:42:13.124900 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-rpt6t"] Dec 01 08:42:13 crc kubenswrapper[5004]: I1201 08:42:13.317829 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 08:42:13 crc kubenswrapper[5004]: I1201 08:42:13.386165 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b4d72bbc-0042-4886-9c2a-4b6e6267be65-sg-core-conf-yaml\") pod \"b4d72bbc-0042-4886-9c2a-4b6e6267be65\" (UID: \"b4d72bbc-0042-4886-9c2a-4b6e6267be65\") " Dec 01 08:42:13 crc kubenswrapper[5004]: I1201 08:42:13.386758 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b4d72bbc-0042-4886-9c2a-4b6e6267be65-scripts\") pod \"b4d72bbc-0042-4886-9c2a-4b6e6267be65\" (UID: \"b4d72bbc-0042-4886-9c2a-4b6e6267be65\") " Dec 01 08:42:13 crc kubenswrapper[5004]: I1201 08:42:13.386898 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4d72bbc-0042-4886-9c2a-4b6e6267be65-combined-ca-bundle\") pod \"b4d72bbc-0042-4886-9c2a-4b6e6267be65\" (UID: 
\"b4d72bbc-0042-4886-9c2a-4b6e6267be65\") " Dec 01 08:42:13 crc kubenswrapper[5004]: I1201 08:42:13.386995 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b4d72bbc-0042-4886-9c2a-4b6e6267be65-run-httpd\") pod \"b4d72bbc-0042-4886-9c2a-4b6e6267be65\" (UID: \"b4d72bbc-0042-4886-9c2a-4b6e6267be65\") " Dec 01 08:42:13 crc kubenswrapper[5004]: I1201 08:42:13.387127 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b4d72bbc-0042-4886-9c2a-4b6e6267be65-log-httpd\") pod \"b4d72bbc-0042-4886-9c2a-4b6e6267be65\" (UID: \"b4d72bbc-0042-4886-9c2a-4b6e6267be65\") " Dec 01 08:42:13 crc kubenswrapper[5004]: I1201 08:42:13.387270 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wsjrt\" (UniqueName: \"kubernetes.io/projected/b4d72bbc-0042-4886-9c2a-4b6e6267be65-kube-api-access-wsjrt\") pod \"b4d72bbc-0042-4886-9c2a-4b6e6267be65\" (UID: \"b4d72bbc-0042-4886-9c2a-4b6e6267be65\") " Dec 01 08:42:13 crc kubenswrapper[5004]: I1201 08:42:13.387351 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4d72bbc-0042-4886-9c2a-4b6e6267be65-config-data\") pod \"b4d72bbc-0042-4886-9c2a-4b6e6267be65\" (UID: \"b4d72bbc-0042-4886-9c2a-4b6e6267be65\") " Dec 01 08:42:13 crc kubenswrapper[5004]: I1201 08:42:13.387978 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4d72bbc-0042-4886-9c2a-4b6e6267be65-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "b4d72bbc-0042-4886-9c2a-4b6e6267be65" (UID: "b4d72bbc-0042-4886-9c2a-4b6e6267be65"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 08:42:13 crc kubenswrapper[5004]: I1201 08:42:13.388138 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4d72bbc-0042-4886-9c2a-4b6e6267be65-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "b4d72bbc-0042-4886-9c2a-4b6e6267be65" (UID: "b4d72bbc-0042-4886-9c2a-4b6e6267be65"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 08:42:13 crc kubenswrapper[5004]: I1201 08:42:13.400682 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4d72bbc-0042-4886-9c2a-4b6e6267be65-kube-api-access-wsjrt" (OuterVolumeSpecName: "kube-api-access-wsjrt") pod "b4d72bbc-0042-4886-9c2a-4b6e6267be65" (UID: "b4d72bbc-0042-4886-9c2a-4b6e6267be65"). InnerVolumeSpecName "kube-api-access-wsjrt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:42:13 crc kubenswrapper[5004]: I1201 08:42:13.400706 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4d72bbc-0042-4886-9c2a-4b6e6267be65-scripts" (OuterVolumeSpecName: "scripts") pod "b4d72bbc-0042-4886-9c2a-4b6e6267be65" (UID: "b4d72bbc-0042-4886-9c2a-4b6e6267be65"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:42:13 crc kubenswrapper[5004]: I1201 08:42:13.452774 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4d72bbc-0042-4886-9c2a-4b6e6267be65-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "b4d72bbc-0042-4886-9c2a-4b6e6267be65" (UID: "b4d72bbc-0042-4886-9c2a-4b6e6267be65"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:42:13 crc kubenswrapper[5004]: I1201 08:42:13.491337 5004 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b4d72bbc-0042-4886-9c2a-4b6e6267be65-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 01 08:42:13 crc kubenswrapper[5004]: I1201 08:42:13.491375 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wsjrt\" (UniqueName: \"kubernetes.io/projected/b4d72bbc-0042-4886-9c2a-4b6e6267be65-kube-api-access-wsjrt\") on node \"crc\" DevicePath \"\"" Dec 01 08:42:13 crc kubenswrapper[5004]: I1201 08:42:13.491386 5004 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b4d72bbc-0042-4886-9c2a-4b6e6267be65-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 01 08:42:13 crc kubenswrapper[5004]: I1201 08:42:13.491396 5004 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b4d72bbc-0042-4886-9c2a-4b6e6267be65-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 08:42:13 crc kubenswrapper[5004]: I1201 08:42:13.491404 5004 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b4d72bbc-0042-4886-9c2a-4b6e6267be65-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 01 08:42:13 crc kubenswrapper[5004]: I1201 08:42:13.539124 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4d72bbc-0042-4886-9c2a-4b6e6267be65-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b4d72bbc-0042-4886-9c2a-4b6e6267be65" (UID: "b4d72bbc-0042-4886-9c2a-4b6e6267be65"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:42:13 crc kubenswrapper[5004]: I1201 08:42:13.547370 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4d72bbc-0042-4886-9c2a-4b6e6267be65-config-data" (OuterVolumeSpecName: "config-data") pod "b4d72bbc-0042-4886-9c2a-4b6e6267be65" (UID: "b4d72bbc-0042-4886-9c2a-4b6e6267be65"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:42:13 crc kubenswrapper[5004]: I1201 08:42:13.593837 5004 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4d72bbc-0042-4886-9c2a-4b6e6267be65-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 08:42:13 crc kubenswrapper[5004]: I1201 08:42:13.593876 5004 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4d72bbc-0042-4886-9c2a-4b6e6267be65-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 08:42:13 crc kubenswrapper[5004]: I1201 08:42:13.723887 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-1bcc-account-create-update-99f9v" event={"ID":"47a280b1-dda9-451b-8791-990446098df5","Type":"ContainerStarted","Data":"6974c52a424b6d086dc89af8391e8f1b387a3788995ab7b740b0374b7bf52360"} Dec 01 08:42:13 crc kubenswrapper[5004]: I1201 08:42:13.723929 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-1bcc-account-create-update-99f9v" event={"ID":"47a280b1-dda9-451b-8791-990446098df5","Type":"ContainerStarted","Data":"b2c3f4558a0991a9872f4cee19797d525511435afae2d17f3fa90cc568940423"} Dec 01 08:42:13 crc kubenswrapper[5004]: I1201 08:42:13.733947 5004 generic.go:334] "Generic (PLEG): container finished" podID="b4d72bbc-0042-4886-9c2a-4b6e6267be65" containerID="3ccbacce6de7615c7d17e7e92b934338112c49b50467e778319e9a0d13af61d6" exitCode=0 Dec 01 08:42:13 crc kubenswrapper[5004]: I1201 08:42:13.734022 5004 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b4d72bbc-0042-4886-9c2a-4b6e6267be65","Type":"ContainerDied","Data":"3ccbacce6de7615c7d17e7e92b934338112c49b50467e778319e9a0d13af61d6"} Dec 01 08:42:13 crc kubenswrapper[5004]: I1201 08:42:13.734054 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b4d72bbc-0042-4886-9c2a-4b6e6267be65","Type":"ContainerDied","Data":"8fbd33e5499b2520aa6162d0228fe9a702544d08bf07ec0ccc3a6b42db44f184"} Dec 01 08:42:13 crc kubenswrapper[5004]: I1201 08:42:13.734071 5004 scope.go:117] "RemoveContainer" containerID="e0a094639d8bdc73ef49a7aff999acb00e8a1093ed995e306a787688b6edb9f9" Dec 01 08:42:13 crc kubenswrapper[5004]: I1201 08:42:13.734180 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 08:42:13 crc kubenswrapper[5004]: I1201 08:42:13.746897 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-rpt6t" event={"ID":"8cbde161-56a4-49c2-9e44-e86fc7c4a82f","Type":"ContainerStarted","Data":"e638f1b500c8ca992c0b749a358bda55f4b30342cbfb24291a3875f8aea1bdf8"} Dec 01 08:42:13 crc kubenswrapper[5004]: I1201 08:42:13.746929 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-rpt6t" event={"ID":"8cbde161-56a4-49c2-9e44-e86fc7c4a82f","Type":"ContainerStarted","Data":"d7a10a6b95410ada0db8463e9e3201dd06786cc183b724c09c6242af68d91f00"} Dec 01 08:42:13 crc kubenswrapper[5004]: I1201 08:42:13.751766 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-1bcc-account-create-update-99f9v" podStartSLOduration=2.751745781 podStartE2EDuration="2.751745781s" podCreationTimestamp="2025-12-01 08:42:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:42:13.738877917 +0000 UTC m=+1511.303869899" 
watchObservedRunningTime="2025-12-01 08:42:13.751745781 +0000 UTC m=+1511.316737763" Dec 01 08:42:13 crc kubenswrapper[5004]: I1201 08:42:13.765712 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-create-rpt6t" podStartSLOduration=2.765694922 podStartE2EDuration="2.765694922s" podCreationTimestamp="2025-12-01 08:42:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:42:13.758919756 +0000 UTC m=+1511.323911758" watchObservedRunningTime="2025-12-01 08:42:13.765694922 +0000 UTC m=+1511.330686904" Dec 01 08:42:13 crc kubenswrapper[5004]: I1201 08:42:13.809303 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 08:42:13 crc kubenswrapper[5004]: I1201 08:42:13.823710 5004 scope.go:117] "RemoveContainer" containerID="0c51692d57f61cf3d1cc91ae9776de254395d2bef599628de1dd0b0a3cc00229" Dec 01 08:42:13 crc kubenswrapper[5004]: I1201 08:42:13.824981 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 01 08:42:13 crc kubenswrapper[5004]: I1201 08:42:13.839680 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 01 08:42:13 crc kubenswrapper[5004]: E1201 08:42:13.840261 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4d72bbc-0042-4886-9c2a-4b6e6267be65" containerName="sg-core" Dec 01 08:42:13 crc kubenswrapper[5004]: I1201 08:42:13.840290 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4d72bbc-0042-4886-9c2a-4b6e6267be65" containerName="sg-core" Dec 01 08:42:13 crc kubenswrapper[5004]: E1201 08:42:13.840304 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4d72bbc-0042-4886-9c2a-4b6e6267be65" containerName="proxy-httpd" Dec 01 08:42:13 crc kubenswrapper[5004]: I1201 08:42:13.840312 5004 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="b4d72bbc-0042-4886-9c2a-4b6e6267be65" containerName="proxy-httpd" Dec 01 08:42:13 crc kubenswrapper[5004]: E1201 08:42:13.840328 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ecc657f-21cf-40d3-8f77-29a9f9c4f5f2" containerName="heat-engine" Dec 01 08:42:13 crc kubenswrapper[5004]: I1201 08:42:13.840335 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ecc657f-21cf-40d3-8f77-29a9f9c4f5f2" containerName="heat-engine" Dec 01 08:42:13 crc kubenswrapper[5004]: E1201 08:42:13.840355 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4d72bbc-0042-4886-9c2a-4b6e6267be65" containerName="ceilometer-central-agent" Dec 01 08:42:13 crc kubenswrapper[5004]: I1201 08:42:13.840362 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4d72bbc-0042-4886-9c2a-4b6e6267be65" containerName="ceilometer-central-agent" Dec 01 08:42:13 crc kubenswrapper[5004]: E1201 08:42:13.840396 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4d72bbc-0042-4886-9c2a-4b6e6267be65" containerName="ceilometer-notification-agent" Dec 01 08:42:13 crc kubenswrapper[5004]: I1201 08:42:13.840406 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4d72bbc-0042-4886-9c2a-4b6e6267be65" containerName="ceilometer-notification-agent" Dec 01 08:42:13 crc kubenswrapper[5004]: I1201 08:42:13.840658 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4d72bbc-0042-4886-9c2a-4b6e6267be65" containerName="sg-core" Dec 01 08:42:13 crc kubenswrapper[5004]: I1201 08:42:13.840711 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4d72bbc-0042-4886-9c2a-4b6e6267be65" containerName="ceilometer-central-agent" Dec 01 08:42:13 crc kubenswrapper[5004]: I1201 08:42:13.840728 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ecc657f-21cf-40d3-8f77-29a9f9c4f5f2" containerName="heat-engine" Dec 01 08:42:13 crc kubenswrapper[5004]: I1201 08:42:13.840741 5004 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="b4d72bbc-0042-4886-9c2a-4b6e6267be65" containerName="proxy-httpd" Dec 01 08:42:13 crc kubenswrapper[5004]: I1201 08:42:13.840755 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4d72bbc-0042-4886-9c2a-4b6e6267be65" containerName="ceilometer-notification-agent" Dec 01 08:42:13 crc kubenswrapper[5004]: I1201 08:42:13.842786 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 08:42:13 crc kubenswrapper[5004]: I1201 08:42:13.846621 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 01 08:42:13 crc kubenswrapper[5004]: I1201 08:42:13.893028 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 08:42:13 crc kubenswrapper[5004]: I1201 08:42:13.893167 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 01 08:42:13 crc kubenswrapper[5004]: I1201 08:42:13.942765 5004 scope.go:117] "RemoveContainer" containerID="3ccbacce6de7615c7d17e7e92b934338112c49b50467e778319e9a0d13af61d6" Dec 01 08:42:14 crc kubenswrapper[5004]: I1201 08:42:14.000645 5004 scope.go:117] "RemoveContainer" containerID="080966e70d6fb040c67da27f631647c6eeb3d335f9e02faa0f46df4b3174c846" Dec 01 08:42:14 crc kubenswrapper[5004]: I1201 08:42:14.017828 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a28b451e-9081-498f-9ba4-4aac7a872d5a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a28b451e-9081-498f-9ba4-4aac7a872d5a\") " pod="openstack/ceilometer-0" Dec 01 08:42:14 crc kubenswrapper[5004]: I1201 08:42:14.018089 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a28b451e-9081-498f-9ba4-4aac7a872d5a-combined-ca-bundle\") pod 
\"ceilometer-0\" (UID: \"a28b451e-9081-498f-9ba4-4aac7a872d5a\") " pod="openstack/ceilometer-0" Dec 01 08:42:14 crc kubenswrapper[5004]: I1201 08:42:14.018235 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a28b451e-9081-498f-9ba4-4aac7a872d5a-config-data\") pod \"ceilometer-0\" (UID: \"a28b451e-9081-498f-9ba4-4aac7a872d5a\") " pod="openstack/ceilometer-0" Dec 01 08:42:14 crc kubenswrapper[5004]: I1201 08:42:14.018306 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a28b451e-9081-498f-9ba4-4aac7a872d5a-log-httpd\") pod \"ceilometer-0\" (UID: \"a28b451e-9081-498f-9ba4-4aac7a872d5a\") " pod="openstack/ceilometer-0" Dec 01 08:42:14 crc kubenswrapper[5004]: I1201 08:42:14.018407 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a28b451e-9081-498f-9ba4-4aac7a872d5a-scripts\") pod \"ceilometer-0\" (UID: \"a28b451e-9081-498f-9ba4-4aac7a872d5a\") " pod="openstack/ceilometer-0" Dec 01 08:42:14 crc kubenswrapper[5004]: I1201 08:42:14.018479 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a28b451e-9081-498f-9ba4-4aac7a872d5a-run-httpd\") pod \"ceilometer-0\" (UID: \"a28b451e-9081-498f-9ba4-4aac7a872d5a\") " pod="openstack/ceilometer-0" Dec 01 08:42:14 crc kubenswrapper[5004]: I1201 08:42:14.018582 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbj5c\" (UniqueName: \"kubernetes.io/projected/a28b451e-9081-498f-9ba4-4aac7a872d5a-kube-api-access-vbj5c\") pod \"ceilometer-0\" (UID: \"a28b451e-9081-498f-9ba4-4aac7a872d5a\") " pod="openstack/ceilometer-0" Dec 01 08:42:14 crc kubenswrapper[5004]: I1201 
08:42:14.027273 5004 scope.go:117] "RemoveContainer" containerID="e0a094639d8bdc73ef49a7aff999acb00e8a1093ed995e306a787688b6edb9f9" Dec 01 08:42:14 crc kubenswrapper[5004]: E1201 08:42:14.033748 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0a094639d8bdc73ef49a7aff999acb00e8a1093ed995e306a787688b6edb9f9\": container with ID starting with e0a094639d8bdc73ef49a7aff999acb00e8a1093ed995e306a787688b6edb9f9 not found: ID does not exist" containerID="e0a094639d8bdc73ef49a7aff999acb00e8a1093ed995e306a787688b6edb9f9" Dec 01 08:42:14 crc kubenswrapper[5004]: I1201 08:42:14.033788 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0a094639d8bdc73ef49a7aff999acb00e8a1093ed995e306a787688b6edb9f9"} err="failed to get container status \"e0a094639d8bdc73ef49a7aff999acb00e8a1093ed995e306a787688b6edb9f9\": rpc error: code = NotFound desc = could not find container \"e0a094639d8bdc73ef49a7aff999acb00e8a1093ed995e306a787688b6edb9f9\": container with ID starting with e0a094639d8bdc73ef49a7aff999acb00e8a1093ed995e306a787688b6edb9f9 not found: ID does not exist" Dec 01 08:42:14 crc kubenswrapper[5004]: I1201 08:42:14.033813 5004 scope.go:117] "RemoveContainer" containerID="0c51692d57f61cf3d1cc91ae9776de254395d2bef599628de1dd0b0a3cc00229" Dec 01 08:42:14 crc kubenswrapper[5004]: E1201 08:42:14.035495 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c51692d57f61cf3d1cc91ae9776de254395d2bef599628de1dd0b0a3cc00229\": container with ID starting with 0c51692d57f61cf3d1cc91ae9776de254395d2bef599628de1dd0b0a3cc00229 not found: ID does not exist" containerID="0c51692d57f61cf3d1cc91ae9776de254395d2bef599628de1dd0b0a3cc00229" Dec 01 08:42:14 crc kubenswrapper[5004]: I1201 08:42:14.035516 5004 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"0c51692d57f61cf3d1cc91ae9776de254395d2bef599628de1dd0b0a3cc00229"} err="failed to get container status \"0c51692d57f61cf3d1cc91ae9776de254395d2bef599628de1dd0b0a3cc00229\": rpc error: code = NotFound desc = could not find container \"0c51692d57f61cf3d1cc91ae9776de254395d2bef599628de1dd0b0a3cc00229\": container with ID starting with 0c51692d57f61cf3d1cc91ae9776de254395d2bef599628de1dd0b0a3cc00229 not found: ID does not exist" Dec 01 08:42:14 crc kubenswrapper[5004]: I1201 08:42:14.035530 5004 scope.go:117] "RemoveContainer" containerID="3ccbacce6de7615c7d17e7e92b934338112c49b50467e778319e9a0d13af61d6" Dec 01 08:42:14 crc kubenswrapper[5004]: E1201 08:42:14.035835 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ccbacce6de7615c7d17e7e92b934338112c49b50467e778319e9a0d13af61d6\": container with ID starting with 3ccbacce6de7615c7d17e7e92b934338112c49b50467e778319e9a0d13af61d6 not found: ID does not exist" containerID="3ccbacce6de7615c7d17e7e92b934338112c49b50467e778319e9a0d13af61d6" Dec 01 08:42:14 crc kubenswrapper[5004]: I1201 08:42:14.035854 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ccbacce6de7615c7d17e7e92b934338112c49b50467e778319e9a0d13af61d6"} err="failed to get container status \"3ccbacce6de7615c7d17e7e92b934338112c49b50467e778319e9a0d13af61d6\": rpc error: code = NotFound desc = could not find container \"3ccbacce6de7615c7d17e7e92b934338112c49b50467e778319e9a0d13af61d6\": container with ID starting with 3ccbacce6de7615c7d17e7e92b934338112c49b50467e778319e9a0d13af61d6 not found: ID does not exist" Dec 01 08:42:14 crc kubenswrapper[5004]: I1201 08:42:14.035866 5004 scope.go:117] "RemoveContainer" containerID="080966e70d6fb040c67da27f631647c6eeb3d335f9e02faa0f46df4b3174c846" Dec 01 08:42:14 crc kubenswrapper[5004]: E1201 08:42:14.036028 5004 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"080966e70d6fb040c67da27f631647c6eeb3d335f9e02faa0f46df4b3174c846\": container with ID starting with 080966e70d6fb040c67da27f631647c6eeb3d335f9e02faa0f46df4b3174c846 not found: ID does not exist" containerID="080966e70d6fb040c67da27f631647c6eeb3d335f9e02faa0f46df4b3174c846" Dec 01 08:42:14 crc kubenswrapper[5004]: I1201 08:42:14.036048 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"080966e70d6fb040c67da27f631647c6eeb3d335f9e02faa0f46df4b3174c846"} err="failed to get container status \"080966e70d6fb040c67da27f631647c6eeb3d335f9e02faa0f46df4b3174c846\": rpc error: code = NotFound desc = could not find container \"080966e70d6fb040c67da27f631647c6eeb3d335f9e02faa0f46df4b3174c846\": container with ID starting with 080966e70d6fb040c67da27f631647c6eeb3d335f9e02faa0f46df4b3174c846 not found: ID does not exist" Dec 01 08:42:14 crc kubenswrapper[5004]: I1201 08:42:14.120330 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a28b451e-9081-498f-9ba4-4aac7a872d5a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a28b451e-9081-498f-9ba4-4aac7a872d5a\") " pod="openstack/ceilometer-0" Dec 01 08:42:14 crc kubenswrapper[5004]: I1201 08:42:14.120423 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a28b451e-9081-498f-9ba4-4aac7a872d5a-config-data\") pod \"ceilometer-0\" (UID: \"a28b451e-9081-498f-9ba4-4aac7a872d5a\") " pod="openstack/ceilometer-0" Dec 01 08:42:14 crc kubenswrapper[5004]: I1201 08:42:14.120447 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a28b451e-9081-498f-9ba4-4aac7a872d5a-log-httpd\") pod \"ceilometer-0\" (UID: \"a28b451e-9081-498f-9ba4-4aac7a872d5a\") " pod="openstack/ceilometer-0" 
Dec 01 08:42:14 crc kubenswrapper[5004]: I1201 08:42:14.120506 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a28b451e-9081-498f-9ba4-4aac7a872d5a-scripts\") pod \"ceilometer-0\" (UID: \"a28b451e-9081-498f-9ba4-4aac7a872d5a\") " pod="openstack/ceilometer-0" Dec 01 08:42:14 crc kubenswrapper[5004]: I1201 08:42:14.120526 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a28b451e-9081-498f-9ba4-4aac7a872d5a-run-httpd\") pod \"ceilometer-0\" (UID: \"a28b451e-9081-498f-9ba4-4aac7a872d5a\") " pod="openstack/ceilometer-0" Dec 01 08:42:14 crc kubenswrapper[5004]: I1201 08:42:14.120545 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vbj5c\" (UniqueName: \"kubernetes.io/projected/a28b451e-9081-498f-9ba4-4aac7a872d5a-kube-api-access-vbj5c\") pod \"ceilometer-0\" (UID: \"a28b451e-9081-498f-9ba4-4aac7a872d5a\") " pod="openstack/ceilometer-0" Dec 01 08:42:14 crc kubenswrapper[5004]: I1201 08:42:14.120607 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a28b451e-9081-498f-9ba4-4aac7a872d5a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a28b451e-9081-498f-9ba4-4aac7a872d5a\") " pod="openstack/ceilometer-0" Dec 01 08:42:14 crc kubenswrapper[5004]: I1201 08:42:14.121257 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a28b451e-9081-498f-9ba4-4aac7a872d5a-log-httpd\") pod \"ceilometer-0\" (UID: \"a28b451e-9081-498f-9ba4-4aac7a872d5a\") " pod="openstack/ceilometer-0" Dec 01 08:42:14 crc kubenswrapper[5004]: I1201 08:42:14.121440 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a28b451e-9081-498f-9ba4-4aac7a872d5a-run-httpd\") pod 
\"ceilometer-0\" (UID: \"a28b451e-9081-498f-9ba4-4aac7a872d5a\") " pod="openstack/ceilometer-0" Dec 01 08:42:14 crc kubenswrapper[5004]: I1201 08:42:14.123902 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a28b451e-9081-498f-9ba4-4aac7a872d5a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a28b451e-9081-498f-9ba4-4aac7a872d5a\") " pod="openstack/ceilometer-0" Dec 01 08:42:14 crc kubenswrapper[5004]: I1201 08:42:14.124118 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a28b451e-9081-498f-9ba4-4aac7a872d5a-scripts\") pod \"ceilometer-0\" (UID: \"a28b451e-9081-498f-9ba4-4aac7a872d5a\") " pod="openstack/ceilometer-0" Dec 01 08:42:14 crc kubenswrapper[5004]: I1201 08:42:14.125235 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a28b451e-9081-498f-9ba4-4aac7a872d5a-config-data\") pod \"ceilometer-0\" (UID: \"a28b451e-9081-498f-9ba4-4aac7a872d5a\") " pod="openstack/ceilometer-0" Dec 01 08:42:14 crc kubenswrapper[5004]: I1201 08:42:14.130087 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a28b451e-9081-498f-9ba4-4aac7a872d5a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a28b451e-9081-498f-9ba4-4aac7a872d5a\") " pod="openstack/ceilometer-0" Dec 01 08:42:14 crc kubenswrapper[5004]: I1201 08:42:14.135648 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbj5c\" (UniqueName: \"kubernetes.io/projected/a28b451e-9081-498f-9ba4-4aac7a872d5a-kube-api-access-vbj5c\") pod \"ceilometer-0\" (UID: \"a28b451e-9081-498f-9ba4-4aac7a872d5a\") " pod="openstack/ceilometer-0" Dec 01 08:42:14 crc kubenswrapper[5004]: I1201 08:42:14.222615 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 01 08:42:14 crc kubenswrapper[5004]: I1201 08:42:14.709515 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 08:42:14 crc kubenswrapper[5004]: W1201 08:42:14.720241 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda28b451e_9081_498f_9ba4_4aac7a872d5a.slice/crio-df7f64eb8968b6ab87edbfb444f30ed68b746a02e89a0f4b32be21de46659f39 WatchSource:0}: Error finding container df7f64eb8968b6ab87edbfb444f30ed68b746a02e89a0f4b32be21de46659f39: Status 404 returned error can't find the container with id df7f64eb8968b6ab87edbfb444f30ed68b746a02e89a0f4b32be21de46659f39 Dec 01 08:42:14 crc kubenswrapper[5004]: I1201 08:42:14.767768 5004 generic.go:334] "Generic (PLEG): container finished" podID="47a280b1-dda9-451b-8791-990446098df5" containerID="6974c52a424b6d086dc89af8391e8f1b387a3788995ab7b740b0374b7bf52360" exitCode=0 Dec 01 08:42:14 crc kubenswrapper[5004]: I1201 08:42:14.774721 5004 generic.go:334] "Generic (PLEG): container finished" podID="8cbde161-56a4-49c2-9e44-e86fc7c4a82f" containerID="e638f1b500c8ca992c0b749a358bda55f4b30342cbfb24291a3875f8aea1bdf8" exitCode=0 Dec 01 08:42:14 crc kubenswrapper[5004]: I1201 08:42:14.793793 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ecc657f-21cf-40d3-8f77-29a9f9c4f5f2" path="/var/lib/kubelet/pods/3ecc657f-21cf-40d3-8f77-29a9f9c4f5f2/volumes" Dec 01 08:42:14 crc kubenswrapper[5004]: I1201 08:42:14.794718 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4d72bbc-0042-4886-9c2a-4b6e6267be65" path="/var/lib/kubelet/pods/b4d72bbc-0042-4886-9c2a-4b6e6267be65/volumes" Dec 01 08:42:14 crc kubenswrapper[5004]: I1201 08:42:14.796596 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-1bcc-account-create-update-99f9v" 
event={"ID":"47a280b1-dda9-451b-8791-990446098df5","Type":"ContainerDied","Data":"6974c52a424b6d086dc89af8391e8f1b387a3788995ab7b740b0374b7bf52360"} Dec 01 08:42:14 crc kubenswrapper[5004]: I1201 08:42:14.796640 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a28b451e-9081-498f-9ba4-4aac7a872d5a","Type":"ContainerStarted","Data":"df7f64eb8968b6ab87edbfb444f30ed68b746a02e89a0f4b32be21de46659f39"} Dec 01 08:42:14 crc kubenswrapper[5004]: I1201 08:42:14.796690 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-rpt6t" event={"ID":"8cbde161-56a4-49c2-9e44-e86fc7c4a82f","Type":"ContainerDied","Data":"e638f1b500c8ca992c0b749a358bda55f4b30342cbfb24291a3875f8aea1bdf8"} Dec 01 08:42:15 crc kubenswrapper[5004]: I1201 08:42:15.785826 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a28b451e-9081-498f-9ba4-4aac7a872d5a","Type":"ContainerStarted","Data":"eb32c682c6f68d69be6b3d8361afd845a1e6777e3c611266e61d0f19c5bae580"} Dec 01 08:42:16 crc kubenswrapper[5004]: I1201 08:42:16.260597 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-rpt6t" Dec 01 08:42:16 crc kubenswrapper[5004]: I1201 08:42:16.269484 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-1bcc-account-create-update-99f9v" Dec 01 08:42:16 crc kubenswrapper[5004]: I1201 08:42:16.372378 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nplbl\" (UniqueName: \"kubernetes.io/projected/47a280b1-dda9-451b-8791-990446098df5-kube-api-access-nplbl\") pod \"47a280b1-dda9-451b-8791-990446098df5\" (UID: \"47a280b1-dda9-451b-8791-990446098df5\") " Dec 01 08:42:16 crc kubenswrapper[5004]: I1201 08:42:16.372509 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8x56b\" (UniqueName: \"kubernetes.io/projected/8cbde161-56a4-49c2-9e44-e86fc7c4a82f-kube-api-access-8x56b\") pod \"8cbde161-56a4-49c2-9e44-e86fc7c4a82f\" (UID: \"8cbde161-56a4-49c2-9e44-e86fc7c4a82f\") " Dec 01 08:42:16 crc kubenswrapper[5004]: I1201 08:42:16.372651 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/47a280b1-dda9-451b-8791-990446098df5-operator-scripts\") pod \"47a280b1-dda9-451b-8791-990446098df5\" (UID: \"47a280b1-dda9-451b-8791-990446098df5\") " Dec 01 08:42:16 crc kubenswrapper[5004]: I1201 08:42:16.372693 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8cbde161-56a4-49c2-9e44-e86fc7c4a82f-operator-scripts\") pod \"8cbde161-56a4-49c2-9e44-e86fc7c4a82f\" (UID: \"8cbde161-56a4-49c2-9e44-e86fc7c4a82f\") " Dec 01 08:42:16 crc kubenswrapper[5004]: I1201 08:42:16.373669 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/47a280b1-dda9-451b-8791-990446098df5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "47a280b1-dda9-451b-8791-990446098df5" (UID: "47a280b1-dda9-451b-8791-990446098df5"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:42:16 crc kubenswrapper[5004]: I1201 08:42:16.373760 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cbde161-56a4-49c2-9e44-e86fc7c4a82f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8cbde161-56a4-49c2-9e44-e86fc7c4a82f" (UID: "8cbde161-56a4-49c2-9e44-e86fc7c4a82f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:42:16 crc kubenswrapper[5004]: I1201 08:42:16.379414 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47a280b1-dda9-451b-8791-990446098df5-kube-api-access-nplbl" (OuterVolumeSpecName: "kube-api-access-nplbl") pod "47a280b1-dda9-451b-8791-990446098df5" (UID: "47a280b1-dda9-451b-8791-990446098df5"). InnerVolumeSpecName "kube-api-access-nplbl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:42:16 crc kubenswrapper[5004]: I1201 08:42:16.379957 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cbde161-56a4-49c2-9e44-e86fc7c4a82f-kube-api-access-8x56b" (OuterVolumeSpecName: "kube-api-access-8x56b") pod "8cbde161-56a4-49c2-9e44-e86fc7c4a82f" (UID: "8cbde161-56a4-49c2-9e44-e86fc7c4a82f"). InnerVolumeSpecName "kube-api-access-8x56b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:42:16 crc kubenswrapper[5004]: I1201 08:42:16.475776 5004 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/47a280b1-dda9-451b-8791-990446098df5-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 08:42:16 crc kubenswrapper[5004]: I1201 08:42:16.476129 5004 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8cbde161-56a4-49c2-9e44-e86fc7c4a82f-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 08:42:16 crc kubenswrapper[5004]: I1201 08:42:16.476146 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nplbl\" (UniqueName: \"kubernetes.io/projected/47a280b1-dda9-451b-8791-990446098df5-kube-api-access-nplbl\") on node \"crc\" DevicePath \"\"" Dec 01 08:42:16 crc kubenswrapper[5004]: I1201 08:42:16.476159 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8x56b\" (UniqueName: \"kubernetes.io/projected/8cbde161-56a4-49c2-9e44-e86fc7c4a82f-kube-api-access-8x56b\") on node \"crc\" DevicePath \"\"" Dec 01 08:42:16 crc kubenswrapper[5004]: I1201 08:42:16.826804 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-1bcc-account-create-update-99f9v" event={"ID":"47a280b1-dda9-451b-8791-990446098df5","Type":"ContainerDied","Data":"b2c3f4558a0991a9872f4cee19797d525511435afae2d17f3fa90cc568940423"} Dec 01 08:42:16 crc kubenswrapper[5004]: I1201 08:42:16.826848 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b2c3f4558a0991a9872f4cee19797d525511435afae2d17f3fa90cc568940423" Dec 01 08:42:16 crc kubenswrapper[5004]: I1201 08:42:16.826890 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-1bcc-account-create-update-99f9v" Dec 01 08:42:16 crc kubenswrapper[5004]: I1201 08:42:16.831275 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a28b451e-9081-498f-9ba4-4aac7a872d5a","Type":"ContainerStarted","Data":"964994c321d2a343de9c030ec175e1032f3eba9d0727c7f7411ddf24be7ffb3d"} Dec 01 08:42:16 crc kubenswrapper[5004]: I1201 08:42:16.833771 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-rpt6t" event={"ID":"8cbde161-56a4-49c2-9e44-e86fc7c4a82f","Type":"ContainerDied","Data":"d7a10a6b95410ada0db8463e9e3201dd06786cc183b724c09c6242af68d91f00"} Dec 01 08:42:16 crc kubenswrapper[5004]: I1201 08:42:16.833828 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d7a10a6b95410ada0db8463e9e3201dd06786cc183b724c09c6242af68d91f00" Dec 01 08:42:16 crc kubenswrapper[5004]: I1201 08:42:16.833974 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-create-rpt6t" Dec 01 08:42:17 crc kubenswrapper[5004]: I1201 08:42:17.847194 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a28b451e-9081-498f-9ba4-4aac7a872d5a","Type":"ContainerStarted","Data":"8c8bd315dc30d6ec8f0fc0860aea5f624ba72b7e94565e563a4a64a38a8369de"} Dec 01 08:42:18 crc kubenswrapper[5004]: I1201 08:42:18.867951 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a28b451e-9081-498f-9ba4-4aac7a872d5a","Type":"ContainerStarted","Data":"b0375f9fe61f3bdd15e1bf95b851c577fd28e5f351525752302efabd86b25234"} Dec 01 08:42:18 crc kubenswrapper[5004]: I1201 08:42:18.868637 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 01 08:42:18 crc kubenswrapper[5004]: I1201 08:42:18.894394 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.373998827 podStartE2EDuration="5.894374057s" podCreationTimestamp="2025-12-01 08:42:13 +0000 UTC" firstStartedPulling="2025-12-01 08:42:14.723110678 +0000 UTC m=+1512.288102660" lastFinishedPulling="2025-12-01 08:42:18.243485878 +0000 UTC m=+1515.808477890" observedRunningTime="2025-12-01 08:42:18.891799314 +0000 UTC m=+1516.456791316" watchObservedRunningTime="2025-12-01 08:42:18.894374057 +0000 UTC m=+1516.459366049" Dec 01 08:42:19 crc kubenswrapper[5004]: I1201 08:42:19.881368 5004 generic.go:334] "Generic (PLEG): container finished" podID="058c0853-8613-4681-b617-fb985abce304" containerID="4936719999b16beb7679b4dd16c55d0eafcd9b7145a203c8432b4cf062a9f454" exitCode=0 Dec 01 08:42:19 crc kubenswrapper[5004]: I1201 08:42:19.881467 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-598mn" 
event={"ID":"058c0853-8613-4681-b617-fb985abce304","Type":"ContainerDied","Data":"4936719999b16beb7679b4dd16c55d0eafcd9b7145a203c8432b4cf062a9f454"} Dec 01 08:42:21 crc kubenswrapper[5004]: I1201 08:42:21.359261 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-598mn" Dec 01 08:42:21 crc kubenswrapper[5004]: I1201 08:42:21.504979 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/058c0853-8613-4681-b617-fb985abce304-combined-ca-bundle\") pod \"058c0853-8613-4681-b617-fb985abce304\" (UID: \"058c0853-8613-4681-b617-fb985abce304\") " Dec 01 08:42:21 crc kubenswrapper[5004]: I1201 08:42:21.505587 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/058c0853-8613-4681-b617-fb985abce304-scripts\") pod \"058c0853-8613-4681-b617-fb985abce304\" (UID: \"058c0853-8613-4681-b617-fb985abce304\") " Dec 01 08:42:21 crc kubenswrapper[5004]: I1201 08:42:21.505685 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/058c0853-8613-4681-b617-fb985abce304-config-data\") pod \"058c0853-8613-4681-b617-fb985abce304\" (UID: \"058c0853-8613-4681-b617-fb985abce304\") " Dec 01 08:42:21 crc kubenswrapper[5004]: I1201 08:42:21.505763 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w2t5\" (UniqueName: \"kubernetes.io/projected/058c0853-8613-4681-b617-fb985abce304-kube-api-access-2w2t5\") pod \"058c0853-8613-4681-b617-fb985abce304\" (UID: \"058c0853-8613-4681-b617-fb985abce304\") " Dec 01 08:42:21 crc kubenswrapper[5004]: I1201 08:42:21.510615 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/058c0853-8613-4681-b617-fb985abce304-scripts" (OuterVolumeSpecName: "scripts") pod 
"058c0853-8613-4681-b617-fb985abce304" (UID: "058c0853-8613-4681-b617-fb985abce304"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:42:21 crc kubenswrapper[5004]: I1201 08:42:21.513522 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/058c0853-8613-4681-b617-fb985abce304-kube-api-access-2w2t5" (OuterVolumeSpecName: "kube-api-access-2w2t5") pod "058c0853-8613-4681-b617-fb985abce304" (UID: "058c0853-8613-4681-b617-fb985abce304"). InnerVolumeSpecName "kube-api-access-2w2t5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:42:21 crc kubenswrapper[5004]: I1201 08:42:21.536791 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/058c0853-8613-4681-b617-fb985abce304-config-data" (OuterVolumeSpecName: "config-data") pod "058c0853-8613-4681-b617-fb985abce304" (UID: "058c0853-8613-4681-b617-fb985abce304"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:42:21 crc kubenswrapper[5004]: I1201 08:42:21.547982 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/058c0853-8613-4681-b617-fb985abce304-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "058c0853-8613-4681-b617-fb985abce304" (UID: "058c0853-8613-4681-b617-fb985abce304"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:42:21 crc kubenswrapper[5004]: I1201 08:42:21.607837 5004 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/058c0853-8613-4681-b617-fb985abce304-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 08:42:21 crc kubenswrapper[5004]: I1201 08:42:21.607863 5004 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/058c0853-8613-4681-b617-fb985abce304-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 08:42:21 crc kubenswrapper[5004]: I1201 08:42:21.607876 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w2t5\" (UniqueName: \"kubernetes.io/projected/058c0853-8613-4681-b617-fb985abce304-kube-api-access-2w2t5\") on node \"crc\" DevicePath \"\"" Dec 01 08:42:21 crc kubenswrapper[5004]: I1201 08:42:21.607886 5004 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/058c0853-8613-4681-b617-fb985abce304-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 08:42:21 crc kubenswrapper[5004]: I1201 08:42:21.907483 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-598mn" event={"ID":"058c0853-8613-4681-b617-fb985abce304","Type":"ContainerDied","Data":"3f95466b6783581424d931dfa98d093704a23c2fb75f3a3df17f6f349eb29597"} Dec 01 08:42:21 crc kubenswrapper[5004]: I1201 08:42:21.907540 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3f95466b6783581424d931dfa98d093704a23c2fb75f3a3df17f6f349eb29597" Dec 01 08:42:21 crc kubenswrapper[5004]: I1201 08:42:21.907613 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-598mn" Dec 01 08:42:22 crc kubenswrapper[5004]: I1201 08:42:22.070114 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 01 08:42:22 crc kubenswrapper[5004]: E1201 08:42:22.070880 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="058c0853-8613-4681-b617-fb985abce304" containerName="nova-cell0-conductor-db-sync" Dec 01 08:42:22 crc kubenswrapper[5004]: I1201 08:42:22.070911 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="058c0853-8613-4681-b617-fb985abce304" containerName="nova-cell0-conductor-db-sync" Dec 01 08:42:22 crc kubenswrapper[5004]: E1201 08:42:22.071000 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47a280b1-dda9-451b-8791-990446098df5" containerName="mariadb-account-create-update" Dec 01 08:42:22 crc kubenswrapper[5004]: I1201 08:42:22.071013 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="47a280b1-dda9-451b-8791-990446098df5" containerName="mariadb-account-create-update" Dec 01 08:42:22 crc kubenswrapper[5004]: E1201 08:42:22.071044 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cbde161-56a4-49c2-9e44-e86fc7c4a82f" containerName="mariadb-database-create" Dec 01 08:42:22 crc kubenswrapper[5004]: I1201 08:42:22.071060 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cbde161-56a4-49c2-9e44-e86fc7c4a82f" containerName="mariadb-database-create" Dec 01 08:42:22 crc kubenswrapper[5004]: I1201 08:42:22.071471 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="8cbde161-56a4-49c2-9e44-e86fc7c4a82f" containerName="mariadb-database-create" Dec 01 08:42:22 crc kubenswrapper[5004]: I1201 08:42:22.071520 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="058c0853-8613-4681-b617-fb985abce304" containerName="nova-cell0-conductor-db-sync" Dec 01 08:42:22 crc kubenswrapper[5004]: I1201 08:42:22.071555 5004 
memory_manager.go:354] "RemoveStaleState removing state" podUID="47a280b1-dda9-451b-8791-990446098df5" containerName="mariadb-account-create-update" Dec 01 08:42:22 crc kubenswrapper[5004]: I1201 08:42:22.072860 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 01 08:42:22 crc kubenswrapper[5004]: I1201 08:42:22.076023 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-vqbz7" Dec 01 08:42:22 crc kubenswrapper[5004]: I1201 08:42:22.084148 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 01 08:42:22 crc kubenswrapper[5004]: I1201 08:42:22.084269 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 01 08:42:22 crc kubenswrapper[5004]: I1201 08:42:22.220092 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87fd3c52-cd06-4af8-9399-7ed081bc8799-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"87fd3c52-cd06-4af8-9399-7ed081bc8799\") " pod="openstack/nova-cell0-conductor-0" Dec 01 08:42:22 crc kubenswrapper[5004]: I1201 08:42:22.220297 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7pts\" (UniqueName: \"kubernetes.io/projected/87fd3c52-cd06-4af8-9399-7ed081bc8799-kube-api-access-t7pts\") pod \"nova-cell0-conductor-0\" (UID: \"87fd3c52-cd06-4af8-9399-7ed081bc8799\") " pod="openstack/nova-cell0-conductor-0" Dec 01 08:42:22 crc kubenswrapper[5004]: I1201 08:42:22.220332 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87fd3c52-cd06-4af8-9399-7ed081bc8799-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"87fd3c52-cd06-4af8-9399-7ed081bc8799\") " 
pod="openstack/nova-cell0-conductor-0" Dec 01 08:42:22 crc kubenswrapper[5004]: I1201 08:42:22.321882 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87fd3c52-cd06-4af8-9399-7ed081bc8799-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"87fd3c52-cd06-4af8-9399-7ed081bc8799\") " pod="openstack/nova-cell0-conductor-0" Dec 01 08:42:22 crc kubenswrapper[5004]: I1201 08:42:22.322589 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7pts\" (UniqueName: \"kubernetes.io/projected/87fd3c52-cd06-4af8-9399-7ed081bc8799-kube-api-access-t7pts\") pod \"nova-cell0-conductor-0\" (UID: \"87fd3c52-cd06-4af8-9399-7ed081bc8799\") " pod="openstack/nova-cell0-conductor-0" Dec 01 08:42:22 crc kubenswrapper[5004]: I1201 08:42:22.322725 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87fd3c52-cd06-4af8-9399-7ed081bc8799-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"87fd3c52-cd06-4af8-9399-7ed081bc8799\") " pod="openstack/nova-cell0-conductor-0" Dec 01 08:42:22 crc kubenswrapper[5004]: I1201 08:42:22.328606 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87fd3c52-cd06-4af8-9399-7ed081bc8799-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"87fd3c52-cd06-4af8-9399-7ed081bc8799\") " pod="openstack/nova-cell0-conductor-0" Dec 01 08:42:22 crc kubenswrapper[5004]: I1201 08:42:22.328639 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87fd3c52-cd06-4af8-9399-7ed081bc8799-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"87fd3c52-cd06-4af8-9399-7ed081bc8799\") " pod="openstack/nova-cell0-conductor-0" Dec 01 08:42:22 crc kubenswrapper[5004]: I1201 08:42:22.340286 5004 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7pts\" (UniqueName: \"kubernetes.io/projected/87fd3c52-cd06-4af8-9399-7ed081bc8799-kube-api-access-t7pts\") pod \"nova-cell0-conductor-0\" (UID: \"87fd3c52-cd06-4af8-9399-7ed081bc8799\") " pod="openstack/nova-cell0-conductor-0" Dec 01 08:42:22 crc kubenswrapper[5004]: I1201 08:42:22.415381 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 01 08:42:22 crc kubenswrapper[5004]: I1201 08:42:22.499426 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-sync-hxtzx"] Dec 01 08:42:22 crc kubenswrapper[5004]: I1201 08:42:22.500775 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-hxtzx" Dec 01 08:42:22 crc kubenswrapper[5004]: I1201 08:42:22.509257 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-hxtzx"] Dec 01 08:42:22 crc kubenswrapper[5004]: I1201 08:42:22.514366 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Dec 01 08:42:22 crc kubenswrapper[5004]: I1201 08:42:22.514532 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-hrc7d" Dec 01 08:42:22 crc kubenswrapper[5004]: I1201 08:42:22.514691 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Dec 01 08:42:22 crc kubenswrapper[5004]: I1201 08:42:22.514790 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 01 08:42:22 crc kubenswrapper[5004]: I1201 08:42:22.631063 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34e30621-736a-4bfd-8b6d-fbbb4350e4ad-combined-ca-bundle\") pod \"aodh-db-sync-hxtzx\" (UID: \"34e30621-736a-4bfd-8b6d-fbbb4350e4ad\") " 
pod="openstack/aodh-db-sync-hxtzx" Dec 01 08:42:22 crc kubenswrapper[5004]: I1201 08:42:22.631117 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34e30621-736a-4bfd-8b6d-fbbb4350e4ad-config-data\") pod \"aodh-db-sync-hxtzx\" (UID: \"34e30621-736a-4bfd-8b6d-fbbb4350e4ad\") " pod="openstack/aodh-db-sync-hxtzx" Dec 01 08:42:22 crc kubenswrapper[5004]: I1201 08:42:22.631268 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34e30621-736a-4bfd-8b6d-fbbb4350e4ad-scripts\") pod \"aodh-db-sync-hxtzx\" (UID: \"34e30621-736a-4bfd-8b6d-fbbb4350e4ad\") " pod="openstack/aodh-db-sync-hxtzx" Dec 01 08:42:22 crc kubenswrapper[5004]: I1201 08:42:22.631306 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5r7ft\" (UniqueName: \"kubernetes.io/projected/34e30621-736a-4bfd-8b6d-fbbb4350e4ad-kube-api-access-5r7ft\") pod \"aodh-db-sync-hxtzx\" (UID: \"34e30621-736a-4bfd-8b6d-fbbb4350e4ad\") " pod="openstack/aodh-db-sync-hxtzx" Dec 01 08:42:22 crc kubenswrapper[5004]: I1201 08:42:22.734072 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34e30621-736a-4bfd-8b6d-fbbb4350e4ad-scripts\") pod \"aodh-db-sync-hxtzx\" (UID: \"34e30621-736a-4bfd-8b6d-fbbb4350e4ad\") " pod="openstack/aodh-db-sync-hxtzx" Dec 01 08:42:22 crc kubenswrapper[5004]: I1201 08:42:22.734142 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5r7ft\" (UniqueName: \"kubernetes.io/projected/34e30621-736a-4bfd-8b6d-fbbb4350e4ad-kube-api-access-5r7ft\") pod \"aodh-db-sync-hxtzx\" (UID: \"34e30621-736a-4bfd-8b6d-fbbb4350e4ad\") " pod="openstack/aodh-db-sync-hxtzx" Dec 01 08:42:22 crc kubenswrapper[5004]: I1201 08:42:22.734300 5004 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34e30621-736a-4bfd-8b6d-fbbb4350e4ad-combined-ca-bundle\") pod \"aodh-db-sync-hxtzx\" (UID: \"34e30621-736a-4bfd-8b6d-fbbb4350e4ad\") " pod="openstack/aodh-db-sync-hxtzx" Dec 01 08:42:22 crc kubenswrapper[5004]: I1201 08:42:22.734329 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34e30621-736a-4bfd-8b6d-fbbb4350e4ad-config-data\") pod \"aodh-db-sync-hxtzx\" (UID: \"34e30621-736a-4bfd-8b6d-fbbb4350e4ad\") " pod="openstack/aodh-db-sync-hxtzx" Dec 01 08:42:22 crc kubenswrapper[5004]: I1201 08:42:22.739736 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34e30621-736a-4bfd-8b6d-fbbb4350e4ad-scripts\") pod \"aodh-db-sync-hxtzx\" (UID: \"34e30621-736a-4bfd-8b6d-fbbb4350e4ad\") " pod="openstack/aodh-db-sync-hxtzx" Dec 01 08:42:22 crc kubenswrapper[5004]: I1201 08:42:22.740393 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34e30621-736a-4bfd-8b6d-fbbb4350e4ad-combined-ca-bundle\") pod \"aodh-db-sync-hxtzx\" (UID: \"34e30621-736a-4bfd-8b6d-fbbb4350e4ad\") " pod="openstack/aodh-db-sync-hxtzx" Dec 01 08:42:22 crc kubenswrapper[5004]: I1201 08:42:22.744176 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34e30621-736a-4bfd-8b6d-fbbb4350e4ad-config-data\") pod \"aodh-db-sync-hxtzx\" (UID: \"34e30621-736a-4bfd-8b6d-fbbb4350e4ad\") " pod="openstack/aodh-db-sync-hxtzx" Dec 01 08:42:22 crc kubenswrapper[5004]: I1201 08:42:22.749996 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5r7ft\" (UniqueName: \"kubernetes.io/projected/34e30621-736a-4bfd-8b6d-fbbb4350e4ad-kube-api-access-5r7ft\") pod 
\"aodh-db-sync-hxtzx\" (UID: \"34e30621-736a-4bfd-8b6d-fbbb4350e4ad\") " pod="openstack/aodh-db-sync-hxtzx" Dec 01 08:42:22 crc kubenswrapper[5004]: I1201 08:42:22.882621 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-hxtzx" Dec 01 08:42:22 crc kubenswrapper[5004]: I1201 08:42:22.945000 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 01 08:42:23 crc kubenswrapper[5004]: I1201 08:42:23.393354 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-hxtzx"] Dec 01 08:42:23 crc kubenswrapper[5004]: W1201 08:42:23.395710 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod34e30621_736a_4bfd_8b6d_fbbb4350e4ad.slice/crio-7ac00c7bee974d624e7ef2c1e7deb7837d8872405f7f0a5c7ed6b960d1d8d871 WatchSource:0}: Error finding container 7ac00c7bee974d624e7ef2c1e7deb7837d8872405f7f0a5c7ed6b960d1d8d871: Status 404 returned error can't find the container with id 7ac00c7bee974d624e7ef2c1e7deb7837d8872405f7f0a5c7ed6b960d1d8d871 Dec 01 08:42:23 crc kubenswrapper[5004]: I1201 08:42:23.939718 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-hxtzx" event={"ID":"34e30621-736a-4bfd-8b6d-fbbb4350e4ad","Type":"ContainerStarted","Data":"7ac00c7bee974d624e7ef2c1e7deb7837d8872405f7f0a5c7ed6b960d1d8d871"} Dec 01 08:42:23 crc kubenswrapper[5004]: I1201 08:42:23.942356 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"87fd3c52-cd06-4af8-9399-7ed081bc8799","Type":"ContainerStarted","Data":"92291e782d18bd10f79705a4dc5665560e1535196a8d2da88b08317f1259a751"} Dec 01 08:42:23 crc kubenswrapper[5004]: I1201 08:42:23.942380 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" 
event={"ID":"87fd3c52-cd06-4af8-9399-7ed081bc8799","Type":"ContainerStarted","Data":"ed2ea6c0dcf1c1826ed847a3f76a416e26c71cc596062fedc99af828932a48a4"} Dec 01 08:42:23 crc kubenswrapper[5004]: I1201 08:42:23.942544 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Dec 01 08:42:23 crc kubenswrapper[5004]: I1201 08:42:23.963728 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=1.963707153 podStartE2EDuration="1.963707153s" podCreationTimestamp="2025-12-01 08:42:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:42:23.955544544 +0000 UTC m=+1521.520536526" watchObservedRunningTime="2025-12-01 08:42:23.963707153 +0000 UTC m=+1521.528699145" Dec 01 08:42:30 crc kubenswrapper[5004]: I1201 08:42:30.033781 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-hxtzx" event={"ID":"34e30621-736a-4bfd-8b6d-fbbb4350e4ad","Type":"ContainerStarted","Data":"a410c8306ba4e6c1633fed1ba85921a2fdf9711c53f980b87e3693d9b9adb70a"} Dec 01 08:42:32 crc kubenswrapper[5004]: I1201 08:42:32.062733 5004 generic.go:334] "Generic (PLEG): container finished" podID="34e30621-736a-4bfd-8b6d-fbbb4350e4ad" containerID="a410c8306ba4e6c1633fed1ba85921a2fdf9711c53f980b87e3693d9b9adb70a" exitCode=0 Dec 01 08:42:32 crc kubenswrapper[5004]: I1201 08:42:32.062838 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-hxtzx" event={"ID":"34e30621-736a-4bfd-8b6d-fbbb4350e4ad","Type":"ContainerDied","Data":"a410c8306ba4e6c1633fed1ba85921a2fdf9711c53f980b87e3693d9b9adb70a"} Dec 01 08:42:32 crc kubenswrapper[5004]: I1201 08:42:32.467411 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Dec 01 08:42:33 crc kubenswrapper[5004]: I1201 08:42:33.023094 5004 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-6rnpl"] Dec 01 08:42:33 crc kubenswrapper[5004]: I1201 08:42:33.025893 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-6rnpl" Dec 01 08:42:33 crc kubenswrapper[5004]: I1201 08:42:33.034076 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Dec 01 08:42:33 crc kubenswrapper[5004]: I1201 08:42:33.034339 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Dec 01 08:42:33 crc kubenswrapper[5004]: I1201 08:42:33.061521 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-6rnpl"] Dec 01 08:42:33 crc kubenswrapper[5004]: I1201 08:42:33.102093 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/902842db-d6a2-4ae1-8d5e-25b637f4db2c-config-data\") pod \"nova-cell0-cell-mapping-6rnpl\" (UID: \"902842db-d6a2-4ae1-8d5e-25b637f4db2c\") " pod="openstack/nova-cell0-cell-mapping-6rnpl" Dec 01 08:42:33 crc kubenswrapper[5004]: I1201 08:42:33.102199 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/902842db-d6a2-4ae1-8d5e-25b637f4db2c-scripts\") pod \"nova-cell0-cell-mapping-6rnpl\" (UID: \"902842db-d6a2-4ae1-8d5e-25b637f4db2c\") " pod="openstack/nova-cell0-cell-mapping-6rnpl" Dec 01 08:42:33 crc kubenswrapper[5004]: I1201 08:42:33.102292 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/902842db-d6a2-4ae1-8d5e-25b637f4db2c-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-6rnpl\" (UID: \"902842db-d6a2-4ae1-8d5e-25b637f4db2c\") " pod="openstack/nova-cell0-cell-mapping-6rnpl" 
Dec 01 08:42:33 crc kubenswrapper[5004]: I1201 08:42:33.102389 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f68x4\" (UniqueName: \"kubernetes.io/projected/902842db-d6a2-4ae1-8d5e-25b637f4db2c-kube-api-access-f68x4\") pod \"nova-cell0-cell-mapping-6rnpl\" (UID: \"902842db-d6a2-4ae1-8d5e-25b637f4db2c\") " pod="openstack/nova-cell0-cell-mapping-6rnpl" Dec 01 08:42:33 crc kubenswrapper[5004]: I1201 08:42:33.211862 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f68x4\" (UniqueName: \"kubernetes.io/projected/902842db-d6a2-4ae1-8d5e-25b637f4db2c-kube-api-access-f68x4\") pod \"nova-cell0-cell-mapping-6rnpl\" (UID: \"902842db-d6a2-4ae1-8d5e-25b637f4db2c\") " pod="openstack/nova-cell0-cell-mapping-6rnpl" Dec 01 08:42:33 crc kubenswrapper[5004]: I1201 08:42:33.212291 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/902842db-d6a2-4ae1-8d5e-25b637f4db2c-config-data\") pod \"nova-cell0-cell-mapping-6rnpl\" (UID: \"902842db-d6a2-4ae1-8d5e-25b637f4db2c\") " pod="openstack/nova-cell0-cell-mapping-6rnpl" Dec 01 08:42:33 crc kubenswrapper[5004]: I1201 08:42:33.212354 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/902842db-d6a2-4ae1-8d5e-25b637f4db2c-scripts\") pod \"nova-cell0-cell-mapping-6rnpl\" (UID: \"902842db-d6a2-4ae1-8d5e-25b637f4db2c\") " pod="openstack/nova-cell0-cell-mapping-6rnpl" Dec 01 08:42:33 crc kubenswrapper[5004]: I1201 08:42:33.212467 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/902842db-d6a2-4ae1-8d5e-25b637f4db2c-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-6rnpl\" (UID: \"902842db-d6a2-4ae1-8d5e-25b637f4db2c\") " pod="openstack/nova-cell0-cell-mapping-6rnpl" Dec 01 08:42:33 
crc kubenswrapper[5004]: I1201 08:42:33.236701 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/902842db-d6a2-4ae1-8d5e-25b637f4db2c-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-6rnpl\" (UID: \"902842db-d6a2-4ae1-8d5e-25b637f4db2c\") " pod="openstack/nova-cell0-cell-mapping-6rnpl" Dec 01 08:42:33 crc kubenswrapper[5004]: I1201 08:42:33.236767 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/902842db-d6a2-4ae1-8d5e-25b637f4db2c-config-data\") pod \"nova-cell0-cell-mapping-6rnpl\" (UID: \"902842db-d6a2-4ae1-8d5e-25b637f4db2c\") " pod="openstack/nova-cell0-cell-mapping-6rnpl" Dec 01 08:42:33 crc kubenswrapper[5004]: I1201 08:42:33.244274 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/902842db-d6a2-4ae1-8d5e-25b637f4db2c-scripts\") pod \"nova-cell0-cell-mapping-6rnpl\" (UID: \"902842db-d6a2-4ae1-8d5e-25b637f4db2c\") " pod="openstack/nova-cell0-cell-mapping-6rnpl" Dec 01 08:42:33 crc kubenswrapper[5004]: I1201 08:42:33.270247 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 01 08:42:33 crc kubenswrapper[5004]: I1201 08:42:33.280166 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f68x4\" (UniqueName: \"kubernetes.io/projected/902842db-d6a2-4ae1-8d5e-25b637f4db2c-kube-api-access-f68x4\") pod \"nova-cell0-cell-mapping-6rnpl\" (UID: \"902842db-d6a2-4ae1-8d5e-25b637f4db2c\") " pod="openstack/nova-cell0-cell-mapping-6rnpl" Dec 01 08:42:33 crc kubenswrapper[5004]: I1201 08:42:33.294293 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 01 08:42:33 crc kubenswrapper[5004]: I1201 08:42:33.306959 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 01 08:42:33 crc kubenswrapper[5004]: I1201 08:42:33.339148 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 01 08:42:33 crc kubenswrapper[5004]: I1201 08:42:33.416468 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-6rnpl" Dec 01 08:42:33 crc kubenswrapper[5004]: I1201 08:42:33.421687 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cf491dc-7bad-4373-ba55-e59710d8c05f-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"0cf491dc-7bad-4373-ba55-e59710d8c05f\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 08:42:33 crc kubenswrapper[5004]: I1201 08:42:33.421753 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vf64w\" (UniqueName: \"kubernetes.io/projected/0cf491dc-7bad-4373-ba55-e59710d8c05f-kube-api-access-vf64w\") pod \"nova-cell1-novncproxy-0\" (UID: \"0cf491dc-7bad-4373-ba55-e59710d8c05f\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 08:42:33 crc kubenswrapper[5004]: I1201 08:42:33.421894 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0cf491dc-7bad-4373-ba55-e59710d8c05f-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"0cf491dc-7bad-4373-ba55-e59710d8c05f\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 08:42:33 crc kubenswrapper[5004]: I1201 08:42:33.435729 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 08:42:33 crc kubenswrapper[5004]: I1201 
08:42:33.437293 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 01 08:42:33 crc kubenswrapper[5004]: I1201 08:42:33.442826 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 01 08:42:33 crc kubenswrapper[5004]: I1201 08:42:33.453021 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 08:42:33 crc kubenswrapper[5004]: I1201 08:42:33.471633 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 01 08:42:33 crc kubenswrapper[5004]: I1201 08:42:33.473806 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 01 08:42:33 crc kubenswrapper[5004]: I1201 08:42:33.486681 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 01 08:42:33 crc kubenswrapper[5004]: I1201 08:42:33.523436 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/948b1e6b-150b-44f9-8dc9-d6731e08f6e8-config-data\") pod \"nova-scheduler-0\" (UID: \"948b1e6b-150b-44f9-8dc9-d6731e08f6e8\") " pod="openstack/nova-scheduler-0" Dec 01 08:42:33 crc kubenswrapper[5004]: I1201 08:42:33.523491 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvs5w\" (UniqueName: \"kubernetes.io/projected/9ae06efb-c5ea-45b8-9c2a-c7c43dcca1c7-kube-api-access-cvs5w\") pod \"nova-api-0\" (UID: \"9ae06efb-c5ea-45b8-9c2a-c7c43dcca1c7\") " pod="openstack/nova-api-0" Dec 01 08:42:33 crc kubenswrapper[5004]: I1201 08:42:33.523519 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ae06efb-c5ea-45b8-9c2a-c7c43dcca1c7-config-data\") pod \"nova-api-0\" (UID: 
\"9ae06efb-c5ea-45b8-9c2a-c7c43dcca1c7\") " pod="openstack/nova-api-0" Dec 01 08:42:33 crc kubenswrapper[5004]: I1201 08:42:33.531346 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 01 08:42:33 crc kubenswrapper[5004]: I1201 08:42:33.542727 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cf491dc-7bad-4373-ba55-e59710d8c05f-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"0cf491dc-7bad-4373-ba55-e59710d8c05f\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 08:42:33 crc kubenswrapper[5004]: I1201 08:42:33.542834 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vf64w\" (UniqueName: \"kubernetes.io/projected/0cf491dc-7bad-4373-ba55-e59710d8c05f-kube-api-access-vf64w\") pod \"nova-cell1-novncproxy-0\" (UID: \"0cf491dc-7bad-4373-ba55-e59710d8c05f\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 08:42:33 crc kubenswrapper[5004]: I1201 08:42:33.542900 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ae06efb-c5ea-45b8-9c2a-c7c43dcca1c7-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9ae06efb-c5ea-45b8-9c2a-c7c43dcca1c7\") " pod="openstack/nova-api-0" Dec 01 08:42:33 crc kubenswrapper[5004]: I1201 08:42:33.542972 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltzj8\" (UniqueName: \"kubernetes.io/projected/948b1e6b-150b-44f9-8dc9-d6731e08f6e8-kube-api-access-ltzj8\") pod \"nova-scheduler-0\" (UID: \"948b1e6b-150b-44f9-8dc9-d6731e08f6e8\") " pod="openstack/nova-scheduler-0" Dec 01 08:42:33 crc kubenswrapper[5004]: I1201 08:42:33.543102 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/948b1e6b-150b-44f9-8dc9-d6731e08f6e8-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"948b1e6b-150b-44f9-8dc9-d6731e08f6e8\") " pod="openstack/nova-scheduler-0" Dec 01 08:42:33 crc kubenswrapper[5004]: I1201 08:42:33.543147 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9ae06efb-c5ea-45b8-9c2a-c7c43dcca1c7-logs\") pod \"nova-api-0\" (UID: \"9ae06efb-c5ea-45b8-9c2a-c7c43dcca1c7\") " pod="openstack/nova-api-0" Dec 01 08:42:33 crc kubenswrapper[5004]: I1201 08:42:33.543258 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0cf491dc-7bad-4373-ba55-e59710d8c05f-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"0cf491dc-7bad-4373-ba55-e59710d8c05f\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 08:42:33 crc kubenswrapper[5004]: I1201 08:42:33.569769 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0cf491dc-7bad-4373-ba55-e59710d8c05f-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"0cf491dc-7bad-4373-ba55-e59710d8c05f\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 08:42:33 crc kubenswrapper[5004]: I1201 08:42:33.578319 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cf491dc-7bad-4373-ba55-e59710d8c05f-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"0cf491dc-7bad-4373-ba55-e59710d8c05f\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 08:42:33 crc kubenswrapper[5004]: I1201 08:42:33.582418 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vf64w\" (UniqueName: \"kubernetes.io/projected/0cf491dc-7bad-4373-ba55-e59710d8c05f-kube-api-access-vf64w\") pod \"nova-cell1-novncproxy-0\" (UID: \"0cf491dc-7bad-4373-ba55-e59710d8c05f\") " 
pod="openstack/nova-cell1-novncproxy-0" Dec 01 08:42:33 crc kubenswrapper[5004]: I1201 08:42:33.587387 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 01 08:42:33 crc kubenswrapper[5004]: I1201 08:42:33.592183 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 08:42:33 crc kubenswrapper[5004]: I1201 08:42:33.601316 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 01 08:42:33 crc kubenswrapper[5004]: I1201 08:42:33.635845 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 08:42:33 crc kubenswrapper[5004]: I1201 08:42:33.645915 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dec4937a-6e5e-4612-b0d6-44b774704a2d-logs\") pod \"nova-metadata-0\" (UID: \"dec4937a-6e5e-4612-b0d6-44b774704a2d\") " pod="openstack/nova-metadata-0" Dec 01 08:42:33 crc kubenswrapper[5004]: I1201 08:42:33.645978 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/948b1e6b-150b-44f9-8dc9-d6731e08f6e8-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"948b1e6b-150b-44f9-8dc9-d6731e08f6e8\") " pod="openstack/nova-scheduler-0" Dec 01 08:42:33 crc kubenswrapper[5004]: I1201 08:42:33.646004 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dec4937a-6e5e-4612-b0d6-44b774704a2d-config-data\") pod \"nova-metadata-0\" (UID: \"dec4937a-6e5e-4612-b0d6-44b774704a2d\") " pod="openstack/nova-metadata-0" Dec 01 08:42:33 crc kubenswrapper[5004]: I1201 08:42:33.646035 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/dec4937a-6e5e-4612-b0d6-44b774704a2d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"dec4937a-6e5e-4612-b0d6-44b774704a2d\") " pod="openstack/nova-metadata-0" Dec 01 08:42:33 crc kubenswrapper[5004]: I1201 08:42:33.646071 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9ae06efb-c5ea-45b8-9c2a-c7c43dcca1c7-logs\") pod \"nova-api-0\" (UID: \"9ae06efb-c5ea-45b8-9c2a-c7c43dcca1c7\") " pod="openstack/nova-api-0" Dec 01 08:42:33 crc kubenswrapper[5004]: I1201 08:42:33.646117 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngf8v\" (UniqueName: \"kubernetes.io/projected/dec4937a-6e5e-4612-b0d6-44b774704a2d-kube-api-access-ngf8v\") pod \"nova-metadata-0\" (UID: \"dec4937a-6e5e-4612-b0d6-44b774704a2d\") " pod="openstack/nova-metadata-0" Dec 01 08:42:33 crc kubenswrapper[5004]: I1201 08:42:33.646206 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/948b1e6b-150b-44f9-8dc9-d6731e08f6e8-config-data\") pod \"nova-scheduler-0\" (UID: \"948b1e6b-150b-44f9-8dc9-d6731e08f6e8\") " pod="openstack/nova-scheduler-0" Dec 01 08:42:33 crc kubenswrapper[5004]: I1201 08:42:33.646237 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvs5w\" (UniqueName: \"kubernetes.io/projected/9ae06efb-c5ea-45b8-9c2a-c7c43dcca1c7-kube-api-access-cvs5w\") pod \"nova-api-0\" (UID: \"9ae06efb-c5ea-45b8-9c2a-c7c43dcca1c7\") " pod="openstack/nova-api-0" Dec 01 08:42:33 crc kubenswrapper[5004]: I1201 08:42:33.646270 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ae06efb-c5ea-45b8-9c2a-c7c43dcca1c7-config-data\") pod \"nova-api-0\" (UID: \"9ae06efb-c5ea-45b8-9c2a-c7c43dcca1c7\") " pod="openstack/nova-api-0" Dec 01 08:42:33 
crc kubenswrapper[5004]: I1201 08:42:33.646357 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ae06efb-c5ea-45b8-9c2a-c7c43dcca1c7-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9ae06efb-c5ea-45b8-9c2a-c7c43dcca1c7\") " pod="openstack/nova-api-0" Dec 01 08:42:33 crc kubenswrapper[5004]: I1201 08:42:33.646401 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ltzj8\" (UniqueName: \"kubernetes.io/projected/948b1e6b-150b-44f9-8dc9-d6731e08f6e8-kube-api-access-ltzj8\") pod \"nova-scheduler-0\" (UID: \"948b1e6b-150b-44f9-8dc9-d6731e08f6e8\") " pod="openstack/nova-scheduler-0" Dec 01 08:42:33 crc kubenswrapper[5004]: I1201 08:42:33.663319 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/948b1e6b-150b-44f9-8dc9-d6731e08f6e8-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"948b1e6b-150b-44f9-8dc9-d6731e08f6e8\") " pod="openstack/nova-scheduler-0" Dec 01 08:42:33 crc kubenswrapper[5004]: I1201 08:42:33.665713 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9ae06efb-c5ea-45b8-9c2a-c7c43dcca1c7-logs\") pod \"nova-api-0\" (UID: \"9ae06efb-c5ea-45b8-9c2a-c7c43dcca1c7\") " pod="openstack/nova-api-0" Dec 01 08:42:33 crc kubenswrapper[5004]: I1201 08:42:33.670175 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ae06efb-c5ea-45b8-9c2a-c7c43dcca1c7-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9ae06efb-c5ea-45b8-9c2a-c7c43dcca1c7\") " pod="openstack/nova-api-0" Dec 01 08:42:33 crc kubenswrapper[5004]: I1201 08:42:33.670894 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ae06efb-c5ea-45b8-9c2a-c7c43dcca1c7-config-data\") 
pod \"nova-api-0\" (UID: \"9ae06efb-c5ea-45b8-9c2a-c7c43dcca1c7\") " pod="openstack/nova-api-0" Dec 01 08:42:33 crc kubenswrapper[5004]: I1201 08:42:33.682271 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/948b1e6b-150b-44f9-8dc9-d6731e08f6e8-config-data\") pod \"nova-scheduler-0\" (UID: \"948b1e6b-150b-44f9-8dc9-d6731e08f6e8\") " pod="openstack/nova-scheduler-0" Dec 01 08:42:33 crc kubenswrapper[5004]: I1201 08:42:33.709103 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 01 08:42:33 crc kubenswrapper[5004]: I1201 08:42:33.722593 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvs5w\" (UniqueName: \"kubernetes.io/projected/9ae06efb-c5ea-45b8-9c2a-c7c43dcca1c7-kube-api-access-cvs5w\") pod \"nova-api-0\" (UID: \"9ae06efb-c5ea-45b8-9c2a-c7c43dcca1c7\") " pod="openstack/nova-api-0" Dec 01 08:42:33 crc kubenswrapper[5004]: I1201 08:42:33.723182 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltzj8\" (UniqueName: \"kubernetes.io/projected/948b1e6b-150b-44f9-8dc9-d6731e08f6e8-kube-api-access-ltzj8\") pod \"nova-scheduler-0\" (UID: \"948b1e6b-150b-44f9-8dc9-d6731e08f6e8\") " pod="openstack/nova-scheduler-0" Dec 01 08:42:33 crc kubenswrapper[5004]: I1201 08:42:33.735354 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 01 08:42:33 crc kubenswrapper[5004]: I1201 08:42:33.751610 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dec4937a-6e5e-4612-b0d6-44b774704a2d-logs\") pod \"nova-metadata-0\" (UID: \"dec4937a-6e5e-4612-b0d6-44b774704a2d\") " pod="openstack/nova-metadata-0" Dec 01 08:42:33 crc kubenswrapper[5004]: I1201 08:42:33.751674 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dec4937a-6e5e-4612-b0d6-44b774704a2d-config-data\") pod \"nova-metadata-0\" (UID: \"dec4937a-6e5e-4612-b0d6-44b774704a2d\") " pod="openstack/nova-metadata-0" Dec 01 08:42:33 crc kubenswrapper[5004]: I1201 08:42:33.751697 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dec4937a-6e5e-4612-b0d6-44b774704a2d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"dec4937a-6e5e-4612-b0d6-44b774704a2d\") " pod="openstack/nova-metadata-0" Dec 01 08:42:33 crc kubenswrapper[5004]: I1201 08:42:33.751739 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngf8v\" (UniqueName: \"kubernetes.io/projected/dec4937a-6e5e-4612-b0d6-44b774704a2d-kube-api-access-ngf8v\") pod \"nova-metadata-0\" (UID: \"dec4937a-6e5e-4612-b0d6-44b774704a2d\") " pod="openstack/nova-metadata-0" Dec 01 08:42:33 crc kubenswrapper[5004]: I1201 08:42:33.752801 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dec4937a-6e5e-4612-b0d6-44b774704a2d-logs\") pod \"nova-metadata-0\" (UID: \"dec4937a-6e5e-4612-b0d6-44b774704a2d\") " pod="openstack/nova-metadata-0" Dec 01 08:42:33 crc kubenswrapper[5004]: I1201 08:42:33.756229 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/dec4937a-6e5e-4612-b0d6-44b774704a2d-config-data\") pod \"nova-metadata-0\" (UID: \"dec4937a-6e5e-4612-b0d6-44b774704a2d\") " pod="openstack/nova-metadata-0" Dec 01 08:42:33 crc kubenswrapper[5004]: I1201 08:42:33.774275 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dec4937a-6e5e-4612-b0d6-44b774704a2d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"dec4937a-6e5e-4612-b0d6-44b774704a2d\") " pod="openstack/nova-metadata-0" Dec 01 08:42:33 crc kubenswrapper[5004]: I1201 08:42:33.792539 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 01 08:42:33 crc kubenswrapper[5004]: I1201 08:42:33.802983 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngf8v\" (UniqueName: \"kubernetes.io/projected/dec4937a-6e5e-4612-b0d6-44b774704a2d-kube-api-access-ngf8v\") pod \"nova-metadata-0\" (UID: \"dec4937a-6e5e-4612-b0d6-44b774704a2d\") " pod="openstack/nova-metadata-0" Dec 01 08:42:33 crc kubenswrapper[5004]: I1201 08:42:33.814361 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7877d89589-z7vkq"] Dec 01 08:42:33 crc kubenswrapper[5004]: I1201 08:42:33.821833 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7877d89589-z7vkq" Dec 01 08:42:33 crc kubenswrapper[5004]: I1201 08:42:33.836096 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7877d89589-z7vkq"] Dec 01 08:42:33 crc kubenswrapper[5004]: I1201 08:42:33.964401 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2c39c876-65b4-4e2d-83c8-e239417edbf5-ovsdbserver-sb\") pod \"dnsmasq-dns-7877d89589-z7vkq\" (UID: \"2c39c876-65b4-4e2d-83c8-e239417edbf5\") " pod="openstack/dnsmasq-dns-7877d89589-z7vkq" Dec 01 08:42:33 crc kubenswrapper[5004]: I1201 08:42:33.964755 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2c39c876-65b4-4e2d-83c8-e239417edbf5-ovsdbserver-nb\") pod \"dnsmasq-dns-7877d89589-z7vkq\" (UID: \"2c39c876-65b4-4e2d-83c8-e239417edbf5\") " pod="openstack/dnsmasq-dns-7877d89589-z7vkq" Dec 01 08:42:33 crc kubenswrapper[5004]: I1201 08:42:33.964832 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2c39c876-65b4-4e2d-83c8-e239417edbf5-dns-svc\") pod \"dnsmasq-dns-7877d89589-z7vkq\" (UID: \"2c39c876-65b4-4e2d-83c8-e239417edbf5\") " pod="openstack/dnsmasq-dns-7877d89589-z7vkq" Dec 01 08:42:33 crc kubenswrapper[5004]: I1201 08:42:33.964853 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c39c876-65b4-4e2d-83c8-e239417edbf5-config\") pod \"dnsmasq-dns-7877d89589-z7vkq\" (UID: \"2c39c876-65b4-4e2d-83c8-e239417edbf5\") " pod="openstack/dnsmasq-dns-7877d89589-z7vkq" Dec 01 08:42:33 crc kubenswrapper[5004]: I1201 08:42:33.964875 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2c39c876-65b4-4e2d-83c8-e239417edbf5-dns-swift-storage-0\") pod \"dnsmasq-dns-7877d89589-z7vkq\" (UID: \"2c39c876-65b4-4e2d-83c8-e239417edbf5\") " pod="openstack/dnsmasq-dns-7877d89589-z7vkq" Dec 01 08:42:33 crc kubenswrapper[5004]: I1201 08:42:33.964961 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mj7cq\" (UniqueName: \"kubernetes.io/projected/2c39c876-65b4-4e2d-83c8-e239417edbf5-kube-api-access-mj7cq\") pod \"dnsmasq-dns-7877d89589-z7vkq\" (UID: \"2c39c876-65b4-4e2d-83c8-e239417edbf5\") " pod="openstack/dnsmasq-dns-7877d89589-z7vkq" Dec 01 08:42:34 crc kubenswrapper[5004]: I1201 08:42:34.064810 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 08:42:34 crc kubenswrapper[5004]: I1201 08:42:34.066388 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2c39c876-65b4-4e2d-83c8-e239417edbf5-ovsdbserver-sb\") pod \"dnsmasq-dns-7877d89589-z7vkq\" (UID: \"2c39c876-65b4-4e2d-83c8-e239417edbf5\") " pod="openstack/dnsmasq-dns-7877d89589-z7vkq" Dec 01 08:42:34 crc kubenswrapper[5004]: I1201 08:42:34.066425 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2c39c876-65b4-4e2d-83c8-e239417edbf5-ovsdbserver-nb\") pod \"dnsmasq-dns-7877d89589-z7vkq\" (UID: \"2c39c876-65b4-4e2d-83c8-e239417edbf5\") " pod="openstack/dnsmasq-dns-7877d89589-z7vkq" Dec 01 08:42:34 crc kubenswrapper[5004]: I1201 08:42:34.066481 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2c39c876-65b4-4e2d-83c8-e239417edbf5-dns-svc\") pod \"dnsmasq-dns-7877d89589-z7vkq\" (UID: \"2c39c876-65b4-4e2d-83c8-e239417edbf5\") " 
pod="openstack/dnsmasq-dns-7877d89589-z7vkq" Dec 01 08:42:34 crc kubenswrapper[5004]: I1201 08:42:34.066500 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c39c876-65b4-4e2d-83c8-e239417edbf5-config\") pod \"dnsmasq-dns-7877d89589-z7vkq\" (UID: \"2c39c876-65b4-4e2d-83c8-e239417edbf5\") " pod="openstack/dnsmasq-dns-7877d89589-z7vkq" Dec 01 08:42:34 crc kubenswrapper[5004]: I1201 08:42:34.066516 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2c39c876-65b4-4e2d-83c8-e239417edbf5-dns-swift-storage-0\") pod \"dnsmasq-dns-7877d89589-z7vkq\" (UID: \"2c39c876-65b4-4e2d-83c8-e239417edbf5\") " pod="openstack/dnsmasq-dns-7877d89589-z7vkq" Dec 01 08:42:34 crc kubenswrapper[5004]: I1201 08:42:34.066604 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mj7cq\" (UniqueName: \"kubernetes.io/projected/2c39c876-65b4-4e2d-83c8-e239417edbf5-kube-api-access-mj7cq\") pod \"dnsmasq-dns-7877d89589-z7vkq\" (UID: \"2c39c876-65b4-4e2d-83c8-e239417edbf5\") " pod="openstack/dnsmasq-dns-7877d89589-z7vkq" Dec 01 08:42:34 crc kubenswrapper[5004]: I1201 08:42:34.069904 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2c39c876-65b4-4e2d-83c8-e239417edbf5-dns-svc\") pod \"dnsmasq-dns-7877d89589-z7vkq\" (UID: \"2c39c876-65b4-4e2d-83c8-e239417edbf5\") " pod="openstack/dnsmasq-dns-7877d89589-z7vkq" Dec 01 08:42:34 crc kubenswrapper[5004]: I1201 08:42:34.070458 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2c39c876-65b4-4e2d-83c8-e239417edbf5-ovsdbserver-sb\") pod \"dnsmasq-dns-7877d89589-z7vkq\" (UID: \"2c39c876-65b4-4e2d-83c8-e239417edbf5\") " pod="openstack/dnsmasq-dns-7877d89589-z7vkq" Dec 01 08:42:34 crc 
kubenswrapper[5004]: I1201 08:42:34.070483 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c39c876-65b4-4e2d-83c8-e239417edbf5-config\") pod \"dnsmasq-dns-7877d89589-z7vkq\" (UID: \"2c39c876-65b4-4e2d-83c8-e239417edbf5\") " pod="openstack/dnsmasq-dns-7877d89589-z7vkq" Dec 01 08:42:34 crc kubenswrapper[5004]: I1201 08:42:34.070591 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2c39c876-65b4-4e2d-83c8-e239417edbf5-ovsdbserver-nb\") pod \"dnsmasq-dns-7877d89589-z7vkq\" (UID: \"2c39c876-65b4-4e2d-83c8-e239417edbf5\") " pod="openstack/dnsmasq-dns-7877d89589-z7vkq" Dec 01 08:42:34 crc kubenswrapper[5004]: I1201 08:42:34.071741 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2c39c876-65b4-4e2d-83c8-e239417edbf5-dns-swift-storage-0\") pod \"dnsmasq-dns-7877d89589-z7vkq\" (UID: \"2c39c876-65b4-4e2d-83c8-e239417edbf5\") " pod="openstack/dnsmasq-dns-7877d89589-z7vkq" Dec 01 08:42:34 crc kubenswrapper[5004]: I1201 08:42:34.084885 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mj7cq\" (UniqueName: \"kubernetes.io/projected/2c39c876-65b4-4e2d-83c8-e239417edbf5-kube-api-access-mj7cq\") pod \"dnsmasq-dns-7877d89589-z7vkq\" (UID: \"2c39c876-65b4-4e2d-83c8-e239417edbf5\") " pod="openstack/dnsmasq-dns-7877d89589-z7vkq" Dec 01 08:42:34 crc kubenswrapper[5004]: I1201 08:42:34.110698 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-hxtzx" Dec 01 08:42:34 crc kubenswrapper[5004]: I1201 08:42:34.162927 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-hxtzx" event={"ID":"34e30621-736a-4bfd-8b6d-fbbb4350e4ad","Type":"ContainerDied","Data":"7ac00c7bee974d624e7ef2c1e7deb7837d8872405f7f0a5c7ed6b960d1d8d871"} Dec 01 08:42:34 crc kubenswrapper[5004]: I1201 08:42:34.162981 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7ac00c7bee974d624e7ef2c1e7deb7837d8872405f7f0a5c7ed6b960d1d8d871" Dec 01 08:42:34 crc kubenswrapper[5004]: I1201 08:42:34.163045 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-hxtzx" Dec 01 08:42:34 crc kubenswrapper[5004]: I1201 08:42:34.168323 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34e30621-736a-4bfd-8b6d-fbbb4350e4ad-combined-ca-bundle\") pod \"34e30621-736a-4bfd-8b6d-fbbb4350e4ad\" (UID: \"34e30621-736a-4bfd-8b6d-fbbb4350e4ad\") " Dec 01 08:42:34 crc kubenswrapper[5004]: I1201 08:42:34.168383 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34e30621-736a-4bfd-8b6d-fbbb4350e4ad-scripts\") pod \"34e30621-736a-4bfd-8b6d-fbbb4350e4ad\" (UID: \"34e30621-736a-4bfd-8b6d-fbbb4350e4ad\") " Dec 01 08:42:34 crc kubenswrapper[5004]: I1201 08:42:34.168621 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5r7ft\" (UniqueName: \"kubernetes.io/projected/34e30621-736a-4bfd-8b6d-fbbb4350e4ad-kube-api-access-5r7ft\") pod \"34e30621-736a-4bfd-8b6d-fbbb4350e4ad\" (UID: \"34e30621-736a-4bfd-8b6d-fbbb4350e4ad\") " Dec 01 08:42:34 crc kubenswrapper[5004]: I1201 08:42:34.168692 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/34e30621-736a-4bfd-8b6d-fbbb4350e4ad-config-data\") pod \"34e30621-736a-4bfd-8b6d-fbbb4350e4ad\" (UID: \"34e30621-736a-4bfd-8b6d-fbbb4350e4ad\") " Dec 01 08:42:34 crc kubenswrapper[5004]: I1201 08:42:34.183943 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7877d89589-z7vkq" Dec 01 08:42:34 crc kubenswrapper[5004]: I1201 08:42:34.189626 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34e30621-736a-4bfd-8b6d-fbbb4350e4ad-scripts" (OuterVolumeSpecName: "scripts") pod "34e30621-736a-4bfd-8b6d-fbbb4350e4ad" (UID: "34e30621-736a-4bfd-8b6d-fbbb4350e4ad"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:42:34 crc kubenswrapper[5004]: I1201 08:42:34.191851 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34e30621-736a-4bfd-8b6d-fbbb4350e4ad-kube-api-access-5r7ft" (OuterVolumeSpecName: "kube-api-access-5r7ft") pod "34e30621-736a-4bfd-8b6d-fbbb4350e4ad" (UID: "34e30621-736a-4bfd-8b6d-fbbb4350e4ad"). InnerVolumeSpecName "kube-api-access-5r7ft". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:42:34 crc kubenswrapper[5004]: I1201 08:42:34.230804 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34e30621-736a-4bfd-8b6d-fbbb4350e4ad-config-data" (OuterVolumeSpecName: "config-data") pod "34e30621-736a-4bfd-8b6d-fbbb4350e4ad" (UID: "34e30621-736a-4bfd-8b6d-fbbb4350e4ad"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:42:34 crc kubenswrapper[5004]: I1201 08:42:34.267325 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34e30621-736a-4bfd-8b6d-fbbb4350e4ad-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "34e30621-736a-4bfd-8b6d-fbbb4350e4ad" (UID: "34e30621-736a-4bfd-8b6d-fbbb4350e4ad"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:42:34 crc kubenswrapper[5004]: I1201 08:42:34.278380 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5r7ft\" (UniqueName: \"kubernetes.io/projected/34e30621-736a-4bfd-8b6d-fbbb4350e4ad-kube-api-access-5r7ft\") on node \"crc\" DevicePath \"\"" Dec 01 08:42:34 crc kubenswrapper[5004]: I1201 08:42:34.278438 5004 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34e30621-736a-4bfd-8b6d-fbbb4350e4ad-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 08:42:34 crc kubenswrapper[5004]: I1201 08:42:34.278450 5004 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34e30621-736a-4bfd-8b6d-fbbb4350e4ad-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 08:42:34 crc kubenswrapper[5004]: I1201 08:42:34.278462 5004 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34e30621-736a-4bfd-8b6d-fbbb4350e4ad-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 08:42:34 crc kubenswrapper[5004]: I1201 08:42:34.581253 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-6rnpl"] Dec 01 08:42:34 crc kubenswrapper[5004]: I1201 08:42:34.725005 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-5gbbc"] Dec 01 08:42:34 crc kubenswrapper[5004]: E1201 08:42:34.725940 5004 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="34e30621-736a-4bfd-8b6d-fbbb4350e4ad" containerName="aodh-db-sync" Dec 01 08:42:34 crc kubenswrapper[5004]: I1201 08:42:34.726017 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="34e30621-736a-4bfd-8b6d-fbbb4350e4ad" containerName="aodh-db-sync" Dec 01 08:42:34 crc kubenswrapper[5004]: I1201 08:42:34.726332 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="34e30621-736a-4bfd-8b6d-fbbb4350e4ad" containerName="aodh-db-sync" Dec 01 08:42:34 crc kubenswrapper[5004]: I1201 08:42:34.727151 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-5gbbc" Dec 01 08:42:34 crc kubenswrapper[5004]: I1201 08:42:34.730333 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Dec 01 08:42:34 crc kubenswrapper[5004]: I1201 08:42:34.730547 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 01 08:42:34 crc kubenswrapper[5004]: I1201 08:42:34.753978 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-5gbbc"] Dec 01 08:42:34 crc kubenswrapper[5004]: I1201 08:42:34.800764 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6f2z4\" (UniqueName: \"kubernetes.io/projected/a31898e1-9571-43cb-b754-5544a7898213-kube-api-access-6f2z4\") pod \"nova-cell1-conductor-db-sync-5gbbc\" (UID: \"a31898e1-9571-43cb-b754-5544a7898213\") " pod="openstack/nova-cell1-conductor-db-sync-5gbbc" Dec 01 08:42:34 crc kubenswrapper[5004]: I1201 08:42:34.801166 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a31898e1-9571-43cb-b754-5544a7898213-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-5gbbc\" (UID: 
\"a31898e1-9571-43cb-b754-5544a7898213\") " pod="openstack/nova-cell1-conductor-db-sync-5gbbc" Dec 01 08:42:34 crc kubenswrapper[5004]: I1201 08:42:34.801350 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a31898e1-9571-43cb-b754-5544a7898213-config-data\") pod \"nova-cell1-conductor-db-sync-5gbbc\" (UID: \"a31898e1-9571-43cb-b754-5544a7898213\") " pod="openstack/nova-cell1-conductor-db-sync-5gbbc" Dec 01 08:42:34 crc kubenswrapper[5004]: I1201 08:42:34.801422 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a31898e1-9571-43cb-b754-5544a7898213-scripts\") pod \"nova-cell1-conductor-db-sync-5gbbc\" (UID: \"a31898e1-9571-43cb-b754-5544a7898213\") " pod="openstack/nova-cell1-conductor-db-sync-5gbbc" Dec 01 08:42:34 crc kubenswrapper[5004]: I1201 08:42:34.811020 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 01 08:42:34 crc kubenswrapper[5004]: I1201 08:42:34.824114 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 08:42:34 crc kubenswrapper[5004]: I1201 08:42:34.838075 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 01 08:42:34 crc kubenswrapper[5004]: I1201 08:42:34.906168 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6f2z4\" (UniqueName: \"kubernetes.io/projected/a31898e1-9571-43cb-b754-5544a7898213-kube-api-access-6f2z4\") pod \"nova-cell1-conductor-db-sync-5gbbc\" (UID: \"a31898e1-9571-43cb-b754-5544a7898213\") " pod="openstack/nova-cell1-conductor-db-sync-5gbbc" Dec 01 08:42:34 crc kubenswrapper[5004]: I1201 08:42:34.906788 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a31898e1-9571-43cb-b754-5544a7898213-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-5gbbc\" (UID: \"a31898e1-9571-43cb-b754-5544a7898213\") " pod="openstack/nova-cell1-conductor-db-sync-5gbbc" Dec 01 08:42:34 crc kubenswrapper[5004]: I1201 08:42:34.906889 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a31898e1-9571-43cb-b754-5544a7898213-config-data\") pod \"nova-cell1-conductor-db-sync-5gbbc\" (UID: \"a31898e1-9571-43cb-b754-5544a7898213\") " pod="openstack/nova-cell1-conductor-db-sync-5gbbc" Dec 01 08:42:34 crc kubenswrapper[5004]: I1201 08:42:34.906929 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a31898e1-9571-43cb-b754-5544a7898213-scripts\") pod \"nova-cell1-conductor-db-sync-5gbbc\" (UID: \"a31898e1-9571-43cb-b754-5544a7898213\") " pod="openstack/nova-cell1-conductor-db-sync-5gbbc" Dec 01 08:42:34 crc kubenswrapper[5004]: I1201 08:42:34.920573 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a31898e1-9571-43cb-b754-5544a7898213-scripts\") pod \"nova-cell1-conductor-db-sync-5gbbc\" (UID: \"a31898e1-9571-43cb-b754-5544a7898213\") " pod="openstack/nova-cell1-conductor-db-sync-5gbbc" Dec 01 08:42:34 crc kubenswrapper[5004]: I1201 08:42:34.921037 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a31898e1-9571-43cb-b754-5544a7898213-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-5gbbc\" (UID: \"a31898e1-9571-43cb-b754-5544a7898213\") " pod="openstack/nova-cell1-conductor-db-sync-5gbbc" Dec 01 08:42:34 crc kubenswrapper[5004]: I1201 08:42:34.922826 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/a31898e1-9571-43cb-b754-5544a7898213-config-data\") pod \"nova-cell1-conductor-db-sync-5gbbc\" (UID: \"a31898e1-9571-43cb-b754-5544a7898213\") " pod="openstack/nova-cell1-conductor-db-sync-5gbbc" Dec 01 08:42:34 crc kubenswrapper[5004]: I1201 08:42:34.948224 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6f2z4\" (UniqueName: \"kubernetes.io/projected/a31898e1-9571-43cb-b754-5544a7898213-kube-api-access-6f2z4\") pod \"nova-cell1-conductor-db-sync-5gbbc\" (UID: \"a31898e1-9571-43cb-b754-5544a7898213\") " pod="openstack/nova-cell1-conductor-db-sync-5gbbc" Dec 01 08:42:35 crc kubenswrapper[5004]: I1201 08:42:35.100220 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 08:42:35 crc kubenswrapper[5004]: I1201 08:42:35.115709 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-5gbbc" Dec 01 08:42:35 crc kubenswrapper[5004]: I1201 08:42:35.142503 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7877d89589-z7vkq"] Dec 01 08:42:35 crc kubenswrapper[5004]: W1201 08:42:35.195703 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2c39c876_65b4_4e2d_83c8_e239417edbf5.slice/crio-7341e488eac2c60bd0e4ba72eec610b452ffedda199ffebe3a798b1d7bcd66a0 WatchSource:0}: Error finding container 7341e488eac2c60bd0e4ba72eec610b452ffedda199ffebe3a798b1d7bcd66a0: Status 404 returned error can't find the container with id 7341e488eac2c60bd0e4ba72eec610b452ffedda199ffebe3a798b1d7bcd66a0 Dec 01 08:42:35 crc kubenswrapper[5004]: I1201 08:42:35.220628 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-6rnpl" event={"ID":"902842db-d6a2-4ae1-8d5e-25b637f4db2c","Type":"ContainerStarted","Data":"0c678a7b2a88c2a73b210510389daca02a40d24ae6cd27fe05e33d2154b3e83d"} Dec 01 
08:42:35 crc kubenswrapper[5004]: I1201 08:42:35.244713 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"0cf491dc-7bad-4373-ba55-e59710d8c05f","Type":"ContainerStarted","Data":"1298ee3ffcaf2a14b357b355d13f12e4d9977f18621a00feaa89bcfd08e885f4"} Dec 01 08:42:35 crc kubenswrapper[5004]: I1201 08:42:35.295084 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9ae06efb-c5ea-45b8-9c2a-c7c43dcca1c7","Type":"ContainerStarted","Data":"969483b3e1bec5f7973065db919612f9e59be2aaa24abc5b667a973cd03a3366"} Dec 01 08:42:35 crc kubenswrapper[5004]: I1201 08:42:35.318969 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"dec4937a-6e5e-4612-b0d6-44b774704a2d","Type":"ContainerStarted","Data":"aced9ee4e0cfd4ef04c77ffd92edf37ea6f98f28bb877810014df86c2c900409"} Dec 01 08:42:35 crc kubenswrapper[5004]: I1201 08:42:35.325719 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"948b1e6b-150b-44f9-8dc9-d6731e08f6e8","Type":"ContainerStarted","Data":"ec843f516d26844c01c54fcc628d52a88f13f3343bc6af33cd66dc08e85e2202"} Dec 01 08:42:35 crc kubenswrapper[5004]: I1201 08:42:35.500282 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-svs2q"] Dec 01 08:42:35 crc kubenswrapper[5004]: I1201 08:42:35.509455 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-svs2q" Dec 01 08:42:35 crc kubenswrapper[5004]: I1201 08:42:35.538029 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-svs2q"] Dec 01 08:42:35 crc kubenswrapper[5004]: I1201 08:42:35.661346 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac4a2842-3b0b-45d4-b400-ce7be95df717-utilities\") pod \"community-operators-svs2q\" (UID: \"ac4a2842-3b0b-45d4-b400-ce7be95df717\") " pod="openshift-marketplace/community-operators-svs2q" Dec 01 08:42:35 crc kubenswrapper[5004]: I1201 08:42:35.661756 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac4a2842-3b0b-45d4-b400-ce7be95df717-catalog-content\") pod \"community-operators-svs2q\" (UID: \"ac4a2842-3b0b-45d4-b400-ce7be95df717\") " pod="openshift-marketplace/community-operators-svs2q" Dec 01 08:42:35 crc kubenswrapper[5004]: I1201 08:42:35.661884 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7cwn\" (UniqueName: \"kubernetes.io/projected/ac4a2842-3b0b-45d4-b400-ce7be95df717-kube-api-access-t7cwn\") pod \"community-operators-svs2q\" (UID: \"ac4a2842-3b0b-45d4-b400-ce7be95df717\") " pod="openshift-marketplace/community-operators-svs2q" Dec 01 08:42:35 crc kubenswrapper[5004]: I1201 08:42:35.765829 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7cwn\" (UniqueName: \"kubernetes.io/projected/ac4a2842-3b0b-45d4-b400-ce7be95df717-kube-api-access-t7cwn\") pod \"community-operators-svs2q\" (UID: \"ac4a2842-3b0b-45d4-b400-ce7be95df717\") " pod="openshift-marketplace/community-operators-svs2q" Dec 01 08:42:35 crc kubenswrapper[5004]: I1201 08:42:35.765963 5004 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac4a2842-3b0b-45d4-b400-ce7be95df717-utilities\") pod \"community-operators-svs2q\" (UID: \"ac4a2842-3b0b-45d4-b400-ce7be95df717\") " pod="openshift-marketplace/community-operators-svs2q" Dec 01 08:42:35 crc kubenswrapper[5004]: I1201 08:42:35.765989 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac4a2842-3b0b-45d4-b400-ce7be95df717-catalog-content\") pod \"community-operators-svs2q\" (UID: \"ac4a2842-3b0b-45d4-b400-ce7be95df717\") " pod="openshift-marketplace/community-operators-svs2q" Dec 01 08:42:35 crc kubenswrapper[5004]: I1201 08:42:35.766485 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac4a2842-3b0b-45d4-b400-ce7be95df717-catalog-content\") pod \"community-operators-svs2q\" (UID: \"ac4a2842-3b0b-45d4-b400-ce7be95df717\") " pod="openshift-marketplace/community-operators-svs2q" Dec 01 08:42:35 crc kubenswrapper[5004]: I1201 08:42:35.766488 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac4a2842-3b0b-45d4-b400-ce7be95df717-utilities\") pod \"community-operators-svs2q\" (UID: \"ac4a2842-3b0b-45d4-b400-ce7be95df717\") " pod="openshift-marketplace/community-operators-svs2q" Dec 01 08:42:35 crc kubenswrapper[5004]: I1201 08:42:35.785325 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7cwn\" (UniqueName: \"kubernetes.io/projected/ac4a2842-3b0b-45d4-b400-ce7be95df717-kube-api-access-t7cwn\") pod \"community-operators-svs2q\" (UID: \"ac4a2842-3b0b-45d4-b400-ce7be95df717\") " pod="openshift-marketplace/community-operators-svs2q" Dec 01 08:42:35 crc kubenswrapper[5004]: I1201 08:42:35.835627 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-svs2q" Dec 01 08:42:35 crc kubenswrapper[5004]: I1201 08:42:35.857477 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-5gbbc"] Dec 01 08:42:35 crc kubenswrapper[5004]: W1201 08:42:35.875990 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda31898e1_9571_43cb_b754_5544a7898213.slice/crio-8da75b95a188065c5f6aafe48f7040747aa7c972a53ca9b23822d6a4b7fbec6c WatchSource:0}: Error finding container 8da75b95a188065c5f6aafe48f7040747aa7c972a53ca9b23822d6a4b7fbec6c: Status 404 returned error can't find the container with id 8da75b95a188065c5f6aafe48f7040747aa7c972a53ca9b23822d6a4b7fbec6c Dec 01 08:42:36 crc kubenswrapper[5004]: I1201 08:42:36.350064 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-6rnpl" event={"ID":"902842db-d6a2-4ae1-8d5e-25b637f4db2c","Type":"ContainerStarted","Data":"75604ee40e80496e4599aae3f1fb7cc18dd761c466f5be26677f5ac186554dbd"} Dec 01 08:42:36 crc kubenswrapper[5004]: I1201 08:42:36.365168 5004 generic.go:334] "Generic (PLEG): container finished" podID="2c39c876-65b4-4e2d-83c8-e239417edbf5" containerID="49a3135120f1f483799c32df427157579566586b76f8d4c632bcacea2f004914" exitCode=0 Dec 01 08:42:36 crc kubenswrapper[5004]: I1201 08:42:36.365264 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7877d89589-z7vkq" event={"ID":"2c39c876-65b4-4e2d-83c8-e239417edbf5","Type":"ContainerDied","Data":"49a3135120f1f483799c32df427157579566586b76f8d4c632bcacea2f004914"} Dec 01 08:42:36 crc kubenswrapper[5004]: I1201 08:42:36.365311 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7877d89589-z7vkq" event={"ID":"2c39c876-65b4-4e2d-83c8-e239417edbf5","Type":"ContainerStarted","Data":"7341e488eac2c60bd0e4ba72eec610b452ffedda199ffebe3a798b1d7bcd66a0"} Dec 01 
08:42:36 crc kubenswrapper[5004]: I1201 08:42:36.373461 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-5gbbc" event={"ID":"a31898e1-9571-43cb-b754-5544a7898213","Type":"ContainerStarted","Data":"d396d1b5dd78f4ad9d79e792b9b18ece3e9a8e63b83164dbc75f56468b206893"} Dec 01 08:42:36 crc kubenswrapper[5004]: I1201 08:42:36.373502 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-5gbbc" event={"ID":"a31898e1-9571-43cb-b754-5544a7898213","Type":"ContainerStarted","Data":"8da75b95a188065c5f6aafe48f7040747aa7c972a53ca9b23822d6a4b7fbec6c"} Dec 01 08:42:36 crc kubenswrapper[5004]: I1201 08:42:36.386816 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-6rnpl" podStartSLOduration=4.386783575 podStartE2EDuration="4.386783575s" podCreationTimestamp="2025-12-01 08:42:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:42:36.368001215 +0000 UTC m=+1533.932993197" watchObservedRunningTime="2025-12-01 08:42:36.386783575 +0000 UTC m=+1533.951775557" Dec 01 08:42:36 crc kubenswrapper[5004]: I1201 08:42:36.470414 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-5gbbc" podStartSLOduration=2.470391226 podStartE2EDuration="2.470391226s" podCreationTimestamp="2025-12-01 08:42:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:42:36.446247797 +0000 UTC m=+1534.011239779" watchObservedRunningTime="2025-12-01 08:42:36.470391226 +0000 UTC m=+1534.035383208" Dec 01 08:42:36 crc kubenswrapper[5004]: I1201 08:42:36.510000 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-svs2q"] Dec 01 08:42:36 crc kubenswrapper[5004]: 
W1201 08:42:36.525057 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac4a2842_3b0b_45d4_b400_ce7be95df717.slice/crio-35122448ca9a01e43a66d90e63d44cc667a119f216f461d17966c746878e0b07 WatchSource:0}: Error finding container 35122448ca9a01e43a66d90e63d44cc667a119f216f461d17966c746878e0b07: Status 404 returned error can't find the container with id 35122448ca9a01e43a66d90e63d44cc667a119f216f461d17966c746878e0b07 Dec 01 08:42:37 crc kubenswrapper[5004]: I1201 08:42:37.394662 5004 generic.go:334] "Generic (PLEG): container finished" podID="ac4a2842-3b0b-45d4-b400-ce7be95df717" containerID="65aaef056ee3bf8e63cda250c1d72ea11ef78d60ef1bdde7cca1fa885a4b8af5" exitCode=0 Dec 01 08:42:37 crc kubenswrapper[5004]: I1201 08:42:37.395019 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-svs2q" event={"ID":"ac4a2842-3b0b-45d4-b400-ce7be95df717","Type":"ContainerDied","Data":"65aaef056ee3bf8e63cda250c1d72ea11ef78d60ef1bdde7cca1fa885a4b8af5"} Dec 01 08:42:37 crc kubenswrapper[5004]: I1201 08:42:37.395050 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-svs2q" event={"ID":"ac4a2842-3b0b-45d4-b400-ce7be95df717","Type":"ContainerStarted","Data":"35122448ca9a01e43a66d90e63d44cc667a119f216f461d17966c746878e0b07"} Dec 01 08:42:37 crc kubenswrapper[5004]: I1201 08:42:37.398927 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7877d89589-z7vkq" event={"ID":"2c39c876-65b4-4e2d-83c8-e239417edbf5","Type":"ContainerStarted","Data":"93ae33f03605ec6547e3243a238e3683cce0f366c93b879a96c37a4e5da0618f"} Dec 01 08:42:37 crc kubenswrapper[5004]: I1201 08:42:37.399480 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7877d89589-z7vkq" Dec 01 08:42:37 crc kubenswrapper[5004]: I1201 08:42:37.438432 5004 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/dnsmasq-dns-7877d89589-z7vkq" podStartSLOduration=4.438400652 podStartE2EDuration="4.438400652s" podCreationTimestamp="2025-12-01 08:42:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:42:37.428713395 +0000 UTC m=+1534.993705377" watchObservedRunningTime="2025-12-01 08:42:37.438400652 +0000 UTC m=+1535.003392634" Dec 01 08:42:37 crc kubenswrapper[5004]: I1201 08:42:37.729349 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Dec 01 08:42:37 crc kubenswrapper[5004]: I1201 08:42:37.733767 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Dec 01 08:42:37 crc kubenswrapper[5004]: I1201 08:42:37.735915 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-hrc7d" Dec 01 08:42:37 crc kubenswrapper[5004]: I1201 08:42:37.736237 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Dec 01 08:42:37 crc kubenswrapper[5004]: I1201 08:42:37.736585 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Dec 01 08:42:37 crc kubenswrapper[5004]: I1201 08:42:37.742504 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Dec 01 08:42:37 crc kubenswrapper[5004]: I1201 08:42:37.837110 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6792a811-4156-4b0a-b6b4-6ff3280229d2-config-data\") pod \"aodh-0\" (UID: \"6792a811-4156-4b0a-b6b4-6ff3280229d2\") " pod="openstack/aodh-0" Dec 01 08:42:37 crc kubenswrapper[5004]: I1201 08:42:37.837179 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/6792a811-4156-4b0a-b6b4-6ff3280229d2-scripts\") pod \"aodh-0\" (UID: \"6792a811-4156-4b0a-b6b4-6ff3280229d2\") " pod="openstack/aodh-0" Dec 01 08:42:37 crc kubenswrapper[5004]: I1201 08:42:37.837208 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnsxn\" (UniqueName: \"kubernetes.io/projected/6792a811-4156-4b0a-b6b4-6ff3280229d2-kube-api-access-vnsxn\") pod \"aodh-0\" (UID: \"6792a811-4156-4b0a-b6b4-6ff3280229d2\") " pod="openstack/aodh-0" Dec 01 08:42:37 crc kubenswrapper[5004]: I1201 08:42:37.837336 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6792a811-4156-4b0a-b6b4-6ff3280229d2-combined-ca-bundle\") pod \"aodh-0\" (UID: \"6792a811-4156-4b0a-b6b4-6ff3280229d2\") " pod="openstack/aodh-0" Dec 01 08:42:37 crc kubenswrapper[5004]: I1201 08:42:37.942494 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6792a811-4156-4b0a-b6b4-6ff3280229d2-combined-ca-bundle\") pod \"aodh-0\" (UID: \"6792a811-4156-4b0a-b6b4-6ff3280229d2\") " pod="openstack/aodh-0" Dec 01 08:42:37 crc kubenswrapper[5004]: I1201 08:42:37.942618 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6792a811-4156-4b0a-b6b4-6ff3280229d2-config-data\") pod \"aodh-0\" (UID: \"6792a811-4156-4b0a-b6b4-6ff3280229d2\") " pod="openstack/aodh-0" Dec 01 08:42:37 crc kubenswrapper[5004]: I1201 08:42:37.942660 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6792a811-4156-4b0a-b6b4-6ff3280229d2-scripts\") pod \"aodh-0\" (UID: \"6792a811-4156-4b0a-b6b4-6ff3280229d2\") " pod="openstack/aodh-0" Dec 01 08:42:37 crc kubenswrapper[5004]: I1201 08:42:37.942684 5004 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vnsxn\" (UniqueName: \"kubernetes.io/projected/6792a811-4156-4b0a-b6b4-6ff3280229d2-kube-api-access-vnsxn\") pod \"aodh-0\" (UID: \"6792a811-4156-4b0a-b6b4-6ff3280229d2\") " pod="openstack/aodh-0" Dec 01 08:42:37 crc kubenswrapper[5004]: I1201 08:42:37.961266 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6792a811-4156-4b0a-b6b4-6ff3280229d2-combined-ca-bundle\") pod \"aodh-0\" (UID: \"6792a811-4156-4b0a-b6b4-6ff3280229d2\") " pod="openstack/aodh-0" Dec 01 08:42:37 crc kubenswrapper[5004]: I1201 08:42:37.963900 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6792a811-4156-4b0a-b6b4-6ff3280229d2-config-data\") pod \"aodh-0\" (UID: \"6792a811-4156-4b0a-b6b4-6ff3280229d2\") " pod="openstack/aodh-0" Dec 01 08:42:37 crc kubenswrapper[5004]: I1201 08:42:37.966994 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnsxn\" (UniqueName: \"kubernetes.io/projected/6792a811-4156-4b0a-b6b4-6ff3280229d2-kube-api-access-vnsxn\") pod \"aodh-0\" (UID: \"6792a811-4156-4b0a-b6b4-6ff3280229d2\") " pod="openstack/aodh-0" Dec 01 08:42:37 crc kubenswrapper[5004]: I1201 08:42:37.986009 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6792a811-4156-4b0a-b6b4-6ff3280229d2-scripts\") pod \"aodh-0\" (UID: \"6792a811-4156-4b0a-b6b4-6ff3280229d2\") " pod="openstack/aodh-0" Dec 01 08:42:38 crc kubenswrapper[5004]: I1201 08:42:38.013201 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 01 08:42:38 crc kubenswrapper[5004]: I1201 08:42:38.037352 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 08:42:38 crc kubenswrapper[5004]: I1201 
08:42:38.071160 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Dec 01 08:42:40 crc kubenswrapper[5004]: I1201 08:42:40.473325 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Dec 01 08:42:40 crc kubenswrapper[5004]: I1201 08:42:40.756497 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 08:42:40 crc kubenswrapper[5004]: I1201 08:42:40.757581 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a28b451e-9081-498f-9ba4-4aac7a872d5a" containerName="ceilometer-central-agent" containerID="cri-o://eb32c682c6f68d69be6b3d8361afd845a1e6777e3c611266e61d0f19c5bae580" gracePeriod=30 Dec 01 08:42:40 crc kubenswrapper[5004]: I1201 08:42:40.758461 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a28b451e-9081-498f-9ba4-4aac7a872d5a" containerName="sg-core" containerID="cri-o://8c8bd315dc30d6ec8f0fc0860aea5f624ba72b7e94565e563a4a64a38a8369de" gracePeriod=30 Dec 01 08:42:40 crc kubenswrapper[5004]: I1201 08:42:40.758691 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a28b451e-9081-498f-9ba4-4aac7a872d5a" containerName="proxy-httpd" containerID="cri-o://b0375f9fe61f3bdd15e1bf95b851c577fd28e5f351525752302efabd86b25234" gracePeriod=30 Dec 01 08:42:40 crc kubenswrapper[5004]: I1201 08:42:40.758814 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a28b451e-9081-498f-9ba4-4aac7a872d5a" containerName="ceilometer-notification-agent" containerID="cri-o://964994c321d2a343de9c030ec175e1032f3eba9d0727c7f7411ddf24be7ffb3d" gracePeriod=30 Dec 01 08:42:40 crc kubenswrapper[5004]: I1201 08:42:40.780968 5004 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="a28b451e-9081-498f-9ba4-4aac7a872d5a" 
containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.230:3000/\": EOF" Dec 01 08:42:41 crc kubenswrapper[5004]: I1201 08:42:41.491409 5004 generic.go:334] "Generic (PLEG): container finished" podID="a28b451e-9081-498f-9ba4-4aac7a872d5a" containerID="b0375f9fe61f3bdd15e1bf95b851c577fd28e5f351525752302efabd86b25234" exitCode=0 Dec 01 08:42:41 crc kubenswrapper[5004]: I1201 08:42:41.514139 5004 generic.go:334] "Generic (PLEG): container finished" podID="a28b451e-9081-498f-9ba4-4aac7a872d5a" containerID="8c8bd315dc30d6ec8f0fc0860aea5f624ba72b7e94565e563a4a64a38a8369de" exitCode=2 Dec 01 08:42:41 crc kubenswrapper[5004]: I1201 08:42:41.514171 5004 generic.go:334] "Generic (PLEG): container finished" podID="a28b451e-9081-498f-9ba4-4aac7a872d5a" containerID="eb32c682c6f68d69be6b3d8361afd845a1e6777e3c611266e61d0f19c5bae580" exitCode=0 Dec 01 08:42:41 crc kubenswrapper[5004]: I1201 08:42:41.501685 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a28b451e-9081-498f-9ba4-4aac7a872d5a","Type":"ContainerDied","Data":"b0375f9fe61f3bdd15e1bf95b851c577fd28e5f351525752302efabd86b25234"} Dec 01 08:42:41 crc kubenswrapper[5004]: I1201 08:42:41.514323 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a28b451e-9081-498f-9ba4-4aac7a872d5a","Type":"ContainerDied","Data":"8c8bd315dc30d6ec8f0fc0860aea5f624ba72b7e94565e563a4a64a38a8369de"} Dec 01 08:42:41 crc kubenswrapper[5004]: I1201 08:42:41.514342 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a28b451e-9081-498f-9ba4-4aac7a872d5a","Type":"ContainerDied","Data":"eb32c682c6f68d69be6b3d8361afd845a1e6777e3c611266e61d0f19c5bae580"} Dec 01 08:42:41 crc kubenswrapper[5004]: I1201 08:42:41.535253 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-svs2q" 
event={"ID":"ac4a2842-3b0b-45d4-b400-ce7be95df717","Type":"ContainerStarted","Data":"2303c4f6039dff84a524587888c0bc3a95b9b34da511d23004f020aecbbe9345"} Dec 01 08:42:41 crc kubenswrapper[5004]: I1201 08:42:41.544709 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"948b1e6b-150b-44f9-8dc9-d6731e08f6e8","Type":"ContainerStarted","Data":"6334c3e913314b751db05ebfd38b595352ae3df0deb67fd3e8513e67bc61f3f3"} Dec 01 08:42:41 crc kubenswrapper[5004]: I1201 08:42:41.558255 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"0cf491dc-7bad-4373-ba55-e59710d8c05f","Type":"ContainerStarted","Data":"88d6d3368227731978b43a0b27e8daa6ef83cb94f13725e7e296d80e42311c4b"} Dec 01 08:42:41 crc kubenswrapper[5004]: I1201 08:42:41.558404 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="0cf491dc-7bad-4373-ba55-e59710d8c05f" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://88d6d3368227731978b43a0b27e8daa6ef83cb94f13725e7e296d80e42311c4b" gracePeriod=30 Dec 01 08:42:41 crc kubenswrapper[5004]: I1201 08:42:41.581164 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.587005055 podStartE2EDuration="8.581145613s" podCreationTimestamp="2025-12-01 08:42:33 +0000 UTC" firstStartedPulling="2025-12-01 08:42:34.833537574 +0000 UTC m=+1532.398529556" lastFinishedPulling="2025-12-01 08:42:39.827678132 +0000 UTC m=+1537.392670114" observedRunningTime="2025-12-01 08:42:41.572312877 +0000 UTC m=+1539.137304859" watchObservedRunningTime="2025-12-01 08:42:41.581145613 +0000 UTC m=+1539.146137595" Dec 01 08:42:41 crc kubenswrapper[5004]: I1201 08:42:41.586806 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"9ae06efb-c5ea-45b8-9c2a-c7c43dcca1c7","Type":"ContainerStarted","Data":"31e1c0b0831e8b10a469f9a8414e345f4fc81d6a710a1a517964ea830fd4ef68"} Dec 01 08:42:41 crc kubenswrapper[5004]: I1201 08:42:41.586849 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9ae06efb-c5ea-45b8-9c2a-c7c43dcca1c7","Type":"ContainerStarted","Data":"5ac15de2db274c854f4ae19c9240c2c49ff5d606a1545de00284aab70223f4cf"} Dec 01 08:42:41 crc kubenswrapper[5004]: I1201 08:42:41.609329 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.615457011 podStartE2EDuration="8.609305312s" podCreationTimestamp="2025-12-01 08:42:33 +0000 UTC" firstStartedPulling="2025-12-01 08:42:34.83378633 +0000 UTC m=+1532.398778312" lastFinishedPulling="2025-12-01 08:42:39.827634631 +0000 UTC m=+1537.392626613" observedRunningTime="2025-12-01 08:42:41.601845229 +0000 UTC m=+1539.166837211" watchObservedRunningTime="2025-12-01 08:42:41.609305312 +0000 UTC m=+1539.174297294" Dec 01 08:42:41 crc kubenswrapper[5004]: I1201 08:42:41.622641 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"dec4937a-6e5e-4612-b0d6-44b774704a2d","Type":"ContainerStarted","Data":"376212e4f743b3174fc9269d3341e5107dbf00c310c82542eee81ab91a23c27a"} Dec 01 08:42:41 crc kubenswrapper[5004]: I1201 08:42:41.622690 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"dec4937a-6e5e-4612-b0d6-44b774704a2d","Type":"ContainerStarted","Data":"4c5eedbaec57618042d5afc59279bbeeb6d6b437a3cd6a36910207a0839ea822"} Dec 01 08:42:41 crc kubenswrapper[5004]: I1201 08:42:41.622822 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="dec4937a-6e5e-4612-b0d6-44b774704a2d" containerName="nova-metadata-log" 
containerID="cri-o://4c5eedbaec57618042d5afc59279bbeeb6d6b437a3cd6a36910207a0839ea822" gracePeriod=30 Dec 01 08:42:41 crc kubenswrapper[5004]: I1201 08:42:41.623286 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="dec4937a-6e5e-4612-b0d6-44b774704a2d" containerName="nova-metadata-metadata" containerID="cri-o://376212e4f743b3174fc9269d3341e5107dbf00c310c82542eee81ab91a23c27a" gracePeriod=30 Dec 01 08:42:41 crc kubenswrapper[5004]: I1201 08:42:41.631032 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"6792a811-4156-4b0a-b6b4-6ff3280229d2","Type":"ContainerStarted","Data":"96da3a799c73646c16fc155df5e8fd5c622c1f9a4d244cbfbef65a50cdeb0f90"} Dec 01 08:42:41 crc kubenswrapper[5004]: I1201 08:42:41.640450 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.633570723 podStartE2EDuration="8.640429582s" podCreationTimestamp="2025-12-01 08:42:33 +0000 UTC" firstStartedPulling="2025-12-01 08:42:34.81863093 +0000 UTC m=+1532.383622912" lastFinishedPulling="2025-12-01 08:42:39.825489789 +0000 UTC m=+1537.390481771" observedRunningTime="2025-12-01 08:42:41.620296959 +0000 UTC m=+1539.185288941" watchObservedRunningTime="2025-12-01 08:42:41.640429582 +0000 UTC m=+1539.205421564" Dec 01 08:42:41 crc kubenswrapper[5004]: I1201 08:42:41.651625 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=4.004503063 podStartE2EDuration="8.651605795s" podCreationTimestamp="2025-12-01 08:42:33 +0000 UTC" firstStartedPulling="2025-12-01 08:42:35.181429102 +0000 UTC m=+1532.746421084" lastFinishedPulling="2025-12-01 08:42:39.828531834 +0000 UTC m=+1537.393523816" observedRunningTime="2025-12-01 08:42:41.643132418 +0000 UTC m=+1539.208124410" watchObservedRunningTime="2025-12-01 08:42:41.651605795 +0000 UTC m=+1539.216597767" Dec 01 08:42:42 crc 
kubenswrapper[5004]: I1201 08:42:42.026760 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Dec 01 08:42:42 crc kubenswrapper[5004]: I1201 08:42:42.650470 5004 generic.go:334] "Generic (PLEG): container finished" podID="dec4937a-6e5e-4612-b0d6-44b774704a2d" containerID="376212e4f743b3174fc9269d3341e5107dbf00c310c82542eee81ab91a23c27a" exitCode=0 Dec 01 08:42:42 crc kubenswrapper[5004]: I1201 08:42:42.650502 5004 generic.go:334] "Generic (PLEG): container finished" podID="dec4937a-6e5e-4612-b0d6-44b774704a2d" containerID="4c5eedbaec57618042d5afc59279bbeeb6d6b437a3cd6a36910207a0839ea822" exitCode=143 Dec 01 08:42:42 crc kubenswrapper[5004]: I1201 08:42:42.650590 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"dec4937a-6e5e-4612-b0d6-44b774704a2d","Type":"ContainerDied","Data":"376212e4f743b3174fc9269d3341e5107dbf00c310c82542eee81ab91a23c27a"} Dec 01 08:42:42 crc kubenswrapper[5004]: I1201 08:42:42.650647 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"dec4937a-6e5e-4612-b0d6-44b774704a2d","Type":"ContainerDied","Data":"4c5eedbaec57618042d5afc59279bbeeb6d6b437a3cd6a36910207a0839ea822"} Dec 01 08:42:42 crc kubenswrapper[5004]: I1201 08:42:42.651836 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"6792a811-4156-4b0a-b6b4-6ff3280229d2","Type":"ContainerStarted","Data":"9ab23b6a648f9483fa5184cfcbb95b1a2b97c9bcd584bcdead439cfe601c29e3"} Dec 01 08:42:42 crc kubenswrapper[5004]: I1201 08:42:42.655253 5004 generic.go:334] "Generic (PLEG): container finished" podID="a28b451e-9081-498f-9ba4-4aac7a872d5a" containerID="964994c321d2a343de9c030ec175e1032f3eba9d0727c7f7411ddf24be7ffb3d" exitCode=0 Dec 01 08:42:42 crc kubenswrapper[5004]: I1201 08:42:42.655304 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"a28b451e-9081-498f-9ba4-4aac7a872d5a","Type":"ContainerDied","Data":"964994c321d2a343de9c030ec175e1032f3eba9d0727c7f7411ddf24be7ffb3d"} Dec 01 08:42:42 crc kubenswrapper[5004]: I1201 08:42:42.669618 5004 generic.go:334] "Generic (PLEG): container finished" podID="ac4a2842-3b0b-45d4-b400-ce7be95df717" containerID="2303c4f6039dff84a524587888c0bc3a95b9b34da511d23004f020aecbbe9345" exitCode=0 Dec 01 08:42:42 crc kubenswrapper[5004]: I1201 08:42:42.669783 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-svs2q" event={"ID":"ac4a2842-3b0b-45d4-b400-ce7be95df717","Type":"ContainerDied","Data":"2303c4f6039dff84a524587888c0bc3a95b9b34da511d23004f020aecbbe9345"} Dec 01 08:42:43 crc kubenswrapper[5004]: I1201 08:42:43.084486 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 08:42:43 crc kubenswrapper[5004]: I1201 08:42:43.089463 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 08:42:43 crc kubenswrapper[5004]: I1201 08:42:43.217501 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a28b451e-9081-498f-9ba4-4aac7a872d5a-combined-ca-bundle\") pod \"a28b451e-9081-498f-9ba4-4aac7a872d5a\" (UID: \"a28b451e-9081-498f-9ba4-4aac7a872d5a\") " Dec 01 08:42:43 crc kubenswrapper[5004]: I1201 08:42:43.217587 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dec4937a-6e5e-4612-b0d6-44b774704a2d-combined-ca-bundle\") pod \"dec4937a-6e5e-4612-b0d6-44b774704a2d\" (UID: \"dec4937a-6e5e-4612-b0d6-44b774704a2d\") " Dec 01 08:42:43 crc kubenswrapper[5004]: I1201 08:42:43.217644 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a28b451e-9081-498f-9ba4-4aac7a872d5a-log-httpd\") pod \"a28b451e-9081-498f-9ba4-4aac7a872d5a\" (UID: \"a28b451e-9081-498f-9ba4-4aac7a872d5a\") " Dec 01 08:42:43 crc kubenswrapper[5004]: I1201 08:42:43.217698 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a28b451e-9081-498f-9ba4-4aac7a872d5a-scripts\") pod \"a28b451e-9081-498f-9ba4-4aac7a872d5a\" (UID: \"a28b451e-9081-498f-9ba4-4aac7a872d5a\") " Dec 01 08:42:43 crc kubenswrapper[5004]: I1201 08:42:43.217734 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngf8v\" (UniqueName: \"kubernetes.io/projected/dec4937a-6e5e-4612-b0d6-44b774704a2d-kube-api-access-ngf8v\") pod \"dec4937a-6e5e-4612-b0d6-44b774704a2d\" (UID: \"dec4937a-6e5e-4612-b0d6-44b774704a2d\") " Dec 01 08:42:43 crc kubenswrapper[5004]: I1201 08:42:43.217770 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vbj5c\" 
(UniqueName: \"kubernetes.io/projected/a28b451e-9081-498f-9ba4-4aac7a872d5a-kube-api-access-vbj5c\") pod \"a28b451e-9081-498f-9ba4-4aac7a872d5a\" (UID: \"a28b451e-9081-498f-9ba4-4aac7a872d5a\") " Dec 01 08:42:43 crc kubenswrapper[5004]: I1201 08:42:43.217835 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a28b451e-9081-498f-9ba4-4aac7a872d5a-sg-core-conf-yaml\") pod \"a28b451e-9081-498f-9ba4-4aac7a872d5a\" (UID: \"a28b451e-9081-498f-9ba4-4aac7a872d5a\") " Dec 01 08:42:43 crc kubenswrapper[5004]: I1201 08:42:43.217902 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dec4937a-6e5e-4612-b0d6-44b774704a2d-config-data\") pod \"dec4937a-6e5e-4612-b0d6-44b774704a2d\" (UID: \"dec4937a-6e5e-4612-b0d6-44b774704a2d\") " Dec 01 08:42:43 crc kubenswrapper[5004]: I1201 08:42:43.217926 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a28b451e-9081-498f-9ba4-4aac7a872d5a-config-data\") pod \"a28b451e-9081-498f-9ba4-4aac7a872d5a\" (UID: \"a28b451e-9081-498f-9ba4-4aac7a872d5a\") " Dec 01 08:42:43 crc kubenswrapper[5004]: I1201 08:42:43.217977 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a28b451e-9081-498f-9ba4-4aac7a872d5a-run-httpd\") pod \"a28b451e-9081-498f-9ba4-4aac7a872d5a\" (UID: \"a28b451e-9081-498f-9ba4-4aac7a872d5a\") " Dec 01 08:42:43 crc kubenswrapper[5004]: I1201 08:42:43.218016 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dec4937a-6e5e-4612-b0d6-44b774704a2d-logs\") pod \"dec4937a-6e5e-4612-b0d6-44b774704a2d\" (UID: \"dec4937a-6e5e-4612-b0d6-44b774704a2d\") " Dec 01 08:42:43 crc kubenswrapper[5004]: I1201 08:42:43.218596 5004 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a28b451e-9081-498f-9ba4-4aac7a872d5a-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "a28b451e-9081-498f-9ba4-4aac7a872d5a" (UID: "a28b451e-9081-498f-9ba4-4aac7a872d5a"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 08:42:43 crc kubenswrapper[5004]: I1201 08:42:43.218680 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a28b451e-9081-498f-9ba4-4aac7a872d5a-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "a28b451e-9081-498f-9ba4-4aac7a872d5a" (UID: "a28b451e-9081-498f-9ba4-4aac7a872d5a"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 08:42:43 crc kubenswrapper[5004]: I1201 08:42:43.218763 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dec4937a-6e5e-4612-b0d6-44b774704a2d-logs" (OuterVolumeSpecName: "logs") pod "dec4937a-6e5e-4612-b0d6-44b774704a2d" (UID: "dec4937a-6e5e-4612-b0d6-44b774704a2d"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 08:42:43 crc kubenswrapper[5004]: I1201 08:42:43.218884 5004 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a28b451e-9081-498f-9ba4-4aac7a872d5a-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 01 08:42:43 crc kubenswrapper[5004]: I1201 08:42:43.218901 5004 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dec4937a-6e5e-4612-b0d6-44b774704a2d-logs\") on node \"crc\" DevicePath \"\"" Dec 01 08:42:43 crc kubenswrapper[5004]: I1201 08:42:43.218909 5004 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a28b451e-9081-498f-9ba4-4aac7a872d5a-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 01 08:42:43 crc kubenswrapper[5004]: I1201 08:42:43.236210 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a28b451e-9081-498f-9ba4-4aac7a872d5a-kube-api-access-vbj5c" (OuterVolumeSpecName: "kube-api-access-vbj5c") pod "a28b451e-9081-498f-9ba4-4aac7a872d5a" (UID: "a28b451e-9081-498f-9ba4-4aac7a872d5a"). InnerVolumeSpecName "kube-api-access-vbj5c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:42:43 crc kubenswrapper[5004]: I1201 08:42:43.243120 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a28b451e-9081-498f-9ba4-4aac7a872d5a-scripts" (OuterVolumeSpecName: "scripts") pod "a28b451e-9081-498f-9ba4-4aac7a872d5a" (UID: "a28b451e-9081-498f-9ba4-4aac7a872d5a"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:42:43 crc kubenswrapper[5004]: I1201 08:42:43.244940 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dec4937a-6e5e-4612-b0d6-44b774704a2d-kube-api-access-ngf8v" (OuterVolumeSpecName: "kube-api-access-ngf8v") pod "dec4937a-6e5e-4612-b0d6-44b774704a2d" (UID: "dec4937a-6e5e-4612-b0d6-44b774704a2d"). InnerVolumeSpecName "kube-api-access-ngf8v". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:42:43 crc kubenswrapper[5004]: I1201 08:42:43.320810 5004 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a28b451e-9081-498f-9ba4-4aac7a872d5a-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 08:42:43 crc kubenswrapper[5004]: I1201 08:42:43.320839 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngf8v\" (UniqueName: \"kubernetes.io/projected/dec4937a-6e5e-4612-b0d6-44b774704a2d-kube-api-access-ngf8v\") on node \"crc\" DevicePath \"\"" Dec 01 08:42:43 crc kubenswrapper[5004]: I1201 08:42:43.320850 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vbj5c\" (UniqueName: \"kubernetes.io/projected/a28b451e-9081-498f-9ba4-4aac7a872d5a-kube-api-access-vbj5c\") on node \"crc\" DevicePath \"\"" Dec 01 08:42:43 crc kubenswrapper[5004]: I1201 08:42:43.339525 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a28b451e-9081-498f-9ba4-4aac7a872d5a-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "a28b451e-9081-498f-9ba4-4aac7a872d5a" (UID: "a28b451e-9081-498f-9ba4-4aac7a872d5a"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:42:43 crc kubenswrapper[5004]: I1201 08:42:43.376733 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dec4937a-6e5e-4612-b0d6-44b774704a2d-config-data" (OuterVolumeSpecName: "config-data") pod "dec4937a-6e5e-4612-b0d6-44b774704a2d" (UID: "dec4937a-6e5e-4612-b0d6-44b774704a2d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:42:43 crc kubenswrapper[5004]: I1201 08:42:43.386897 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dec4937a-6e5e-4612-b0d6-44b774704a2d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dec4937a-6e5e-4612-b0d6-44b774704a2d" (UID: "dec4937a-6e5e-4612-b0d6-44b774704a2d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:42:43 crc kubenswrapper[5004]: I1201 08:42:43.414999 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a28b451e-9081-498f-9ba4-4aac7a872d5a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a28b451e-9081-498f-9ba4-4aac7a872d5a" (UID: "a28b451e-9081-498f-9ba4-4aac7a872d5a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:42:43 crc kubenswrapper[5004]: I1201 08:42:43.423469 5004 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dec4937a-6e5e-4612-b0d6-44b774704a2d-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 08:42:43 crc kubenswrapper[5004]: I1201 08:42:43.423498 5004 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a28b451e-9081-498f-9ba4-4aac7a872d5a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 08:42:43 crc kubenswrapper[5004]: I1201 08:42:43.423508 5004 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dec4937a-6e5e-4612-b0d6-44b774704a2d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 08:42:43 crc kubenswrapper[5004]: I1201 08:42:43.423519 5004 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a28b451e-9081-498f-9ba4-4aac7a872d5a-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 01 08:42:43 crc kubenswrapper[5004]: I1201 08:42:43.492488 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a28b451e-9081-498f-9ba4-4aac7a872d5a-config-data" (OuterVolumeSpecName: "config-data") pod "a28b451e-9081-498f-9ba4-4aac7a872d5a" (UID: "a28b451e-9081-498f-9ba4-4aac7a872d5a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:42:43 crc kubenswrapper[5004]: I1201 08:42:43.525497 5004 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a28b451e-9081-498f-9ba4-4aac7a872d5a-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 08:42:43 crc kubenswrapper[5004]: I1201 08:42:43.682636 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"dec4937a-6e5e-4612-b0d6-44b774704a2d","Type":"ContainerDied","Data":"aced9ee4e0cfd4ef04c77ffd92edf37ea6f98f28bb877810014df86c2c900409"} Dec 01 08:42:43 crc kubenswrapper[5004]: I1201 08:42:43.682694 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 08:42:43 crc kubenswrapper[5004]: I1201 08:42:43.682701 5004 scope.go:117] "RemoveContainer" containerID="376212e4f743b3174fc9269d3341e5107dbf00c310c82542eee81ab91a23c27a" Dec 01 08:42:43 crc kubenswrapper[5004]: I1201 08:42:43.687653 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a28b451e-9081-498f-9ba4-4aac7a872d5a","Type":"ContainerDied","Data":"df7f64eb8968b6ab87edbfb444f30ed68b746a02e89a0f4b32be21de46659f39"} Dec 01 08:42:43 crc kubenswrapper[5004]: I1201 08:42:43.687668 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 01 08:42:43 crc kubenswrapper[5004]: I1201 08:42:43.691492 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-svs2q" event={"ID":"ac4a2842-3b0b-45d4-b400-ce7be95df717","Type":"ContainerStarted","Data":"528c2d1956dfcbfef6946d3280cde4c12c86ac089d5eb27229f063d41fcb827e"} Dec 01 08:42:43 crc kubenswrapper[5004]: I1201 08:42:43.709142 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-svs2q" podStartSLOduration=3.929839032 podStartE2EDuration="8.709127922s" podCreationTimestamp="2025-12-01 08:42:35 +0000 UTC" firstStartedPulling="2025-12-01 08:42:38.46224745 +0000 UTC m=+1536.027239432" lastFinishedPulling="2025-12-01 08:42:43.24153634 +0000 UTC m=+1540.806528322" observedRunningTime="2025-12-01 08:42:43.705018871 +0000 UTC m=+1541.270010853" watchObservedRunningTime="2025-12-01 08:42:43.709127922 +0000 UTC m=+1541.274119904" Dec 01 08:42:43 crc kubenswrapper[5004]: I1201 08:42:43.709649 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 01 08:42:43 crc kubenswrapper[5004]: I1201 08:42:43.729671 5004 scope.go:117] "RemoveContainer" containerID="4c5eedbaec57618042d5afc59279bbeeb6d6b437a3cd6a36910207a0839ea822" Dec 01 08:42:43 crc kubenswrapper[5004]: I1201 08:42:43.738651 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 01 08:42:43 crc kubenswrapper[5004]: I1201 08:42:43.738686 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 01 08:42:43 crc kubenswrapper[5004]: I1201 08:42:43.744177 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 08:42:43 crc kubenswrapper[5004]: I1201 08:42:43.764395 5004 scope.go:117] "RemoveContainer" 
containerID="b0375f9fe61f3bdd15e1bf95b851c577fd28e5f351525752302efabd86b25234" Dec 01 08:42:43 crc kubenswrapper[5004]: I1201 08:42:43.769531 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 08:42:43 crc kubenswrapper[5004]: I1201 08:42:43.787335 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 08:42:43 crc kubenswrapper[5004]: I1201 08:42:43.822845 5004 scope.go:117] "RemoveContainer" containerID="8c8bd315dc30d6ec8f0fc0860aea5f624ba72b7e94565e563a4a64a38a8369de" Dec 01 08:42:43 crc kubenswrapper[5004]: I1201 08:42:43.824636 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 01 08:42:43 crc kubenswrapper[5004]: I1201 08:42:43.824713 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 01 08:42:43 crc kubenswrapper[5004]: I1201 08:42:43.824731 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 01 08:42:43 crc kubenswrapper[5004]: I1201 08:42:43.872353 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 01 08:42:43 crc kubenswrapper[5004]: E1201 08:42:43.875029 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dec4937a-6e5e-4612-b0d6-44b774704a2d" containerName="nova-metadata-metadata" Dec 01 08:42:43 crc kubenswrapper[5004]: I1201 08:42:43.875048 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="dec4937a-6e5e-4612-b0d6-44b774704a2d" containerName="nova-metadata-metadata" Dec 01 08:42:43 crc kubenswrapper[5004]: E1201 08:42:43.875072 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a28b451e-9081-498f-9ba4-4aac7a872d5a" containerName="ceilometer-notification-agent" Dec 01 08:42:43 crc kubenswrapper[5004]: I1201 08:42:43.875082 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="a28b451e-9081-498f-9ba4-4aac7a872d5a" 
containerName="ceilometer-notification-agent" Dec 01 08:42:43 crc kubenswrapper[5004]: E1201 08:42:43.875107 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a28b451e-9081-498f-9ba4-4aac7a872d5a" containerName="ceilometer-central-agent" Dec 01 08:42:43 crc kubenswrapper[5004]: I1201 08:42:43.875115 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="a28b451e-9081-498f-9ba4-4aac7a872d5a" containerName="ceilometer-central-agent" Dec 01 08:42:43 crc kubenswrapper[5004]: E1201 08:42:43.875141 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a28b451e-9081-498f-9ba4-4aac7a872d5a" containerName="proxy-httpd" Dec 01 08:42:43 crc kubenswrapper[5004]: I1201 08:42:43.875149 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="a28b451e-9081-498f-9ba4-4aac7a872d5a" containerName="proxy-httpd" Dec 01 08:42:43 crc kubenswrapper[5004]: E1201 08:42:43.875181 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a28b451e-9081-498f-9ba4-4aac7a872d5a" containerName="sg-core" Dec 01 08:42:43 crc kubenswrapper[5004]: I1201 08:42:43.875187 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="a28b451e-9081-498f-9ba4-4aac7a872d5a" containerName="sg-core" Dec 01 08:42:43 crc kubenswrapper[5004]: E1201 08:42:43.875206 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dec4937a-6e5e-4612-b0d6-44b774704a2d" containerName="nova-metadata-log" Dec 01 08:42:43 crc kubenswrapper[5004]: I1201 08:42:43.875212 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="dec4937a-6e5e-4612-b0d6-44b774704a2d" containerName="nova-metadata-log" Dec 01 08:42:43 crc kubenswrapper[5004]: I1201 08:42:43.875872 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="dec4937a-6e5e-4612-b0d6-44b774704a2d" containerName="nova-metadata-metadata" Dec 01 08:42:43 crc kubenswrapper[5004]: I1201 08:42:43.876172 5004 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="a28b451e-9081-498f-9ba4-4aac7a872d5a" containerName="proxy-httpd" Dec 01 08:42:43 crc kubenswrapper[5004]: I1201 08:42:43.876209 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="dec4937a-6e5e-4612-b0d6-44b774704a2d" containerName="nova-metadata-log" Dec 01 08:42:43 crc kubenswrapper[5004]: I1201 08:42:43.876234 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="a28b451e-9081-498f-9ba4-4aac7a872d5a" containerName="ceilometer-central-agent" Dec 01 08:42:43 crc kubenswrapper[5004]: I1201 08:42:43.876245 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="a28b451e-9081-498f-9ba4-4aac7a872d5a" containerName="ceilometer-notification-agent" Dec 01 08:42:43 crc kubenswrapper[5004]: I1201 08:42:43.876255 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="a28b451e-9081-498f-9ba4-4aac7a872d5a" containerName="sg-core" Dec 01 08:42:43 crc kubenswrapper[5004]: I1201 08:42:43.877991 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 08:42:43 crc kubenswrapper[5004]: I1201 08:42:43.881071 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 01 08:42:43 crc kubenswrapper[5004]: I1201 08:42:43.881706 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 01 08:42:43 crc kubenswrapper[5004]: I1201 08:42:43.887824 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 01 08:42:43 crc kubenswrapper[5004]: I1201 08:42:43.904521 5004 scope.go:117] "RemoveContainer" containerID="964994c321d2a343de9c030ec175e1032f3eba9d0727c7f7411ddf24be7ffb3d" Dec 01 08:42:43 crc kubenswrapper[5004]: I1201 08:42:43.913777 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 01 08:42:43 crc kubenswrapper[5004]: I1201 08:42:43.916992 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 01 08:42:43 crc kubenswrapper[5004]: I1201 08:42:43.920013 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 01 08:42:43 crc kubenswrapper[5004]: I1201 08:42:43.920201 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 01 08:42:43 crc kubenswrapper[5004]: I1201 08:42:43.955840 5004 scope.go:117] "RemoveContainer" containerID="eb32c682c6f68d69be6b3d8361afd845a1e6777e3c611266e61d0f19c5bae580" Dec 01 08:42:43 crc kubenswrapper[5004]: I1201 08:42:43.955984 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 08:42:43 crc kubenswrapper[5004]: I1201 08:42:43.986266 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 08:42:44 crc kubenswrapper[5004]: I1201 08:42:44.038262 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0eebd6d-9db9-4e42-b0bf-e98a99f77f96-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"c0eebd6d-9db9-4e42-b0bf-e98a99f77f96\") " pod="openstack/nova-metadata-0" Dec 01 08:42:44 crc kubenswrapper[5004]: I1201 08:42:44.038331 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f55f02c-5371-4258-b5cf-9565b29798b9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7f55f02c-5371-4258-b5cf-9565b29798b9\") " pod="openstack/ceilometer-0" Dec 01 08:42:44 crc kubenswrapper[5004]: I1201 08:42:44.038374 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7f55f02c-5371-4258-b5cf-9565b29798b9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7f55f02c-5371-4258-b5cf-9565b29798b9\") " 
pod="openstack/ceilometer-0" Dec 01 08:42:44 crc kubenswrapper[5004]: I1201 08:42:44.038415 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f55f02c-5371-4258-b5cf-9565b29798b9-scripts\") pod \"ceilometer-0\" (UID: \"7f55f02c-5371-4258-b5cf-9565b29798b9\") " pod="openstack/ceilometer-0" Dec 01 08:42:44 crc kubenswrapper[5004]: I1201 08:42:44.038597 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0eebd6d-9db9-4e42-b0bf-e98a99f77f96-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c0eebd6d-9db9-4e42-b0bf-e98a99f77f96\") " pod="openstack/nova-metadata-0" Dec 01 08:42:44 crc kubenswrapper[5004]: I1201 08:42:44.038626 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thh9b\" (UniqueName: \"kubernetes.io/projected/7f55f02c-5371-4258-b5cf-9565b29798b9-kube-api-access-thh9b\") pod \"ceilometer-0\" (UID: \"7f55f02c-5371-4258-b5cf-9565b29798b9\") " pod="openstack/ceilometer-0" Dec 01 08:42:44 crc kubenswrapper[5004]: I1201 08:42:44.038661 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0eebd6d-9db9-4e42-b0bf-e98a99f77f96-logs\") pod \"nova-metadata-0\" (UID: \"c0eebd6d-9db9-4e42-b0bf-e98a99f77f96\") " pod="openstack/nova-metadata-0" Dec 01 08:42:44 crc kubenswrapper[5004]: I1201 08:42:44.038685 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfnz2\" (UniqueName: \"kubernetes.io/projected/c0eebd6d-9db9-4e42-b0bf-e98a99f77f96-kube-api-access-tfnz2\") pod \"nova-metadata-0\" (UID: \"c0eebd6d-9db9-4e42-b0bf-e98a99f77f96\") " pod="openstack/nova-metadata-0" Dec 01 08:42:44 crc kubenswrapper[5004]: I1201 08:42:44.038854 5004 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7f55f02c-5371-4258-b5cf-9565b29798b9-run-httpd\") pod \"ceilometer-0\" (UID: \"7f55f02c-5371-4258-b5cf-9565b29798b9\") " pod="openstack/ceilometer-0" Dec 01 08:42:44 crc kubenswrapper[5004]: I1201 08:42:44.038878 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7f55f02c-5371-4258-b5cf-9565b29798b9-log-httpd\") pod \"ceilometer-0\" (UID: \"7f55f02c-5371-4258-b5cf-9565b29798b9\") " pod="openstack/ceilometer-0" Dec 01 08:42:44 crc kubenswrapper[5004]: I1201 08:42:44.038945 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f55f02c-5371-4258-b5cf-9565b29798b9-config-data\") pod \"ceilometer-0\" (UID: \"7f55f02c-5371-4258-b5cf-9565b29798b9\") " pod="openstack/ceilometer-0" Dec 01 08:42:44 crc kubenswrapper[5004]: I1201 08:42:44.039120 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0eebd6d-9db9-4e42-b0bf-e98a99f77f96-config-data\") pod \"nova-metadata-0\" (UID: \"c0eebd6d-9db9-4e42-b0bf-e98a99f77f96\") " pod="openstack/nova-metadata-0" Dec 01 08:42:44 crc kubenswrapper[5004]: I1201 08:42:44.141478 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tfnz2\" (UniqueName: \"kubernetes.io/projected/c0eebd6d-9db9-4e42-b0bf-e98a99f77f96-kube-api-access-tfnz2\") pod \"nova-metadata-0\" (UID: \"c0eebd6d-9db9-4e42-b0bf-e98a99f77f96\") " pod="openstack/nova-metadata-0" Dec 01 08:42:44 crc kubenswrapper[5004]: I1201 08:42:44.141577 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/7f55f02c-5371-4258-b5cf-9565b29798b9-run-httpd\") pod \"ceilometer-0\" (UID: \"7f55f02c-5371-4258-b5cf-9565b29798b9\") " pod="openstack/ceilometer-0" Dec 01 08:42:44 crc kubenswrapper[5004]: I1201 08:42:44.141599 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7f55f02c-5371-4258-b5cf-9565b29798b9-log-httpd\") pod \"ceilometer-0\" (UID: \"7f55f02c-5371-4258-b5cf-9565b29798b9\") " pod="openstack/ceilometer-0" Dec 01 08:42:44 crc kubenswrapper[5004]: I1201 08:42:44.141632 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f55f02c-5371-4258-b5cf-9565b29798b9-config-data\") pod \"ceilometer-0\" (UID: \"7f55f02c-5371-4258-b5cf-9565b29798b9\") " pod="openstack/ceilometer-0" Dec 01 08:42:44 crc kubenswrapper[5004]: I1201 08:42:44.141659 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0eebd6d-9db9-4e42-b0bf-e98a99f77f96-config-data\") pod \"nova-metadata-0\" (UID: \"c0eebd6d-9db9-4e42-b0bf-e98a99f77f96\") " pod="openstack/nova-metadata-0" Dec 01 08:42:44 crc kubenswrapper[5004]: I1201 08:42:44.141697 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0eebd6d-9db9-4e42-b0bf-e98a99f77f96-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"c0eebd6d-9db9-4e42-b0bf-e98a99f77f96\") " pod="openstack/nova-metadata-0" Dec 01 08:42:44 crc kubenswrapper[5004]: I1201 08:42:44.141737 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f55f02c-5371-4258-b5cf-9565b29798b9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7f55f02c-5371-4258-b5cf-9565b29798b9\") " pod="openstack/ceilometer-0" Dec 01 08:42:44 crc 
kubenswrapper[5004]: I1201 08:42:44.141759 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7f55f02c-5371-4258-b5cf-9565b29798b9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7f55f02c-5371-4258-b5cf-9565b29798b9\") " pod="openstack/ceilometer-0" Dec 01 08:42:44 crc kubenswrapper[5004]: I1201 08:42:44.141795 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f55f02c-5371-4258-b5cf-9565b29798b9-scripts\") pod \"ceilometer-0\" (UID: \"7f55f02c-5371-4258-b5cf-9565b29798b9\") " pod="openstack/ceilometer-0" Dec 01 08:42:44 crc kubenswrapper[5004]: I1201 08:42:44.141859 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0eebd6d-9db9-4e42-b0bf-e98a99f77f96-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c0eebd6d-9db9-4e42-b0bf-e98a99f77f96\") " pod="openstack/nova-metadata-0" Dec 01 08:42:44 crc kubenswrapper[5004]: I1201 08:42:44.141878 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thh9b\" (UniqueName: \"kubernetes.io/projected/7f55f02c-5371-4258-b5cf-9565b29798b9-kube-api-access-thh9b\") pod \"ceilometer-0\" (UID: \"7f55f02c-5371-4258-b5cf-9565b29798b9\") " pod="openstack/ceilometer-0" Dec 01 08:42:44 crc kubenswrapper[5004]: I1201 08:42:44.141909 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0eebd6d-9db9-4e42-b0bf-e98a99f77f96-logs\") pod \"nova-metadata-0\" (UID: \"c0eebd6d-9db9-4e42-b0bf-e98a99f77f96\") " pod="openstack/nova-metadata-0" Dec 01 08:42:44 crc kubenswrapper[5004]: I1201 08:42:44.142292 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0eebd6d-9db9-4e42-b0bf-e98a99f77f96-logs\") pod 
\"nova-metadata-0\" (UID: \"c0eebd6d-9db9-4e42-b0bf-e98a99f77f96\") " pod="openstack/nova-metadata-0" Dec 01 08:42:44 crc kubenswrapper[5004]: I1201 08:42:44.143435 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7f55f02c-5371-4258-b5cf-9565b29798b9-log-httpd\") pod \"ceilometer-0\" (UID: \"7f55f02c-5371-4258-b5cf-9565b29798b9\") " pod="openstack/ceilometer-0" Dec 01 08:42:44 crc kubenswrapper[5004]: I1201 08:42:44.144019 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7f55f02c-5371-4258-b5cf-9565b29798b9-run-httpd\") pod \"ceilometer-0\" (UID: \"7f55f02c-5371-4258-b5cf-9565b29798b9\") " pod="openstack/ceilometer-0" Dec 01 08:42:44 crc kubenswrapper[5004]: I1201 08:42:44.151398 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0eebd6d-9db9-4e42-b0bf-e98a99f77f96-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"c0eebd6d-9db9-4e42-b0bf-e98a99f77f96\") " pod="openstack/nova-metadata-0" Dec 01 08:42:44 crc kubenswrapper[5004]: I1201 08:42:44.152000 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f55f02c-5371-4258-b5cf-9565b29798b9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7f55f02c-5371-4258-b5cf-9565b29798b9\") " pod="openstack/ceilometer-0" Dec 01 08:42:44 crc kubenswrapper[5004]: I1201 08:42:44.152038 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f55f02c-5371-4258-b5cf-9565b29798b9-scripts\") pod \"ceilometer-0\" (UID: \"7f55f02c-5371-4258-b5cf-9565b29798b9\") " pod="openstack/ceilometer-0" Dec 01 08:42:44 crc kubenswrapper[5004]: I1201 08:42:44.153029 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/7f55f02c-5371-4258-b5cf-9565b29798b9-config-data\") pod \"ceilometer-0\" (UID: \"7f55f02c-5371-4258-b5cf-9565b29798b9\") " pod="openstack/ceilometer-0" Dec 01 08:42:44 crc kubenswrapper[5004]: I1201 08:42:44.153874 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7f55f02c-5371-4258-b5cf-9565b29798b9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7f55f02c-5371-4258-b5cf-9565b29798b9\") " pod="openstack/ceilometer-0" Dec 01 08:42:44 crc kubenswrapper[5004]: I1201 08:42:44.154636 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0eebd6d-9db9-4e42-b0bf-e98a99f77f96-config-data\") pod \"nova-metadata-0\" (UID: \"c0eebd6d-9db9-4e42-b0bf-e98a99f77f96\") " pod="openstack/nova-metadata-0" Dec 01 08:42:44 crc kubenswrapper[5004]: I1201 08:42:44.159837 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0eebd6d-9db9-4e42-b0bf-e98a99f77f96-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c0eebd6d-9db9-4e42-b0bf-e98a99f77f96\") " pod="openstack/nova-metadata-0" Dec 01 08:42:44 crc kubenswrapper[5004]: I1201 08:42:44.168482 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thh9b\" (UniqueName: \"kubernetes.io/projected/7f55f02c-5371-4258-b5cf-9565b29798b9-kube-api-access-thh9b\") pod \"ceilometer-0\" (UID: \"7f55f02c-5371-4258-b5cf-9565b29798b9\") " pod="openstack/ceilometer-0" Dec 01 08:42:44 crc kubenswrapper[5004]: I1201 08:42:44.171190 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfnz2\" (UniqueName: \"kubernetes.io/projected/c0eebd6d-9db9-4e42-b0bf-e98a99f77f96-kube-api-access-tfnz2\") pod \"nova-metadata-0\" (UID: \"c0eebd6d-9db9-4e42-b0bf-e98a99f77f96\") " pod="openstack/nova-metadata-0" Dec 01 08:42:44 crc 
kubenswrapper[5004]: I1201 08:42:44.185742 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7877d89589-z7vkq" Dec 01 08:42:44 crc kubenswrapper[5004]: I1201 08:42:44.205450 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 08:42:44 crc kubenswrapper[5004]: I1201 08:42:44.272449 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d978555f9-fpvfz"] Dec 01 08:42:44 crc kubenswrapper[5004]: I1201 08:42:44.272755 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7d978555f9-fpvfz" podUID="6d31297b-7a09-4806-b488-1c6e7ea17ab0" containerName="dnsmasq-dns" containerID="cri-o://0b81a733c8f514f09eb04912b4c63f52c6900f05c44b4ef3f29bffb86206f57b" gracePeriod=10 Dec 01 08:42:44 crc kubenswrapper[5004]: I1201 08:42:44.372125 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 08:42:44 crc kubenswrapper[5004]: I1201 08:42:44.718974 5004 generic.go:334] "Generic (PLEG): container finished" podID="6d31297b-7a09-4806-b488-1c6e7ea17ab0" containerID="0b81a733c8f514f09eb04912b4c63f52c6900f05c44b4ef3f29bffb86206f57b" exitCode=0 Dec 01 08:42:44 crc kubenswrapper[5004]: I1201 08:42:44.719146 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d978555f9-fpvfz" event={"ID":"6d31297b-7a09-4806-b488-1c6e7ea17ab0","Type":"ContainerDied","Data":"0b81a733c8f514f09eb04912b4c63f52c6900f05c44b4ef3f29bffb86206f57b"} Dec 01 08:42:44 crc kubenswrapper[5004]: I1201 08:42:44.788407 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a28b451e-9081-498f-9ba4-4aac7a872d5a" path="/var/lib/kubelet/pods/a28b451e-9081-498f-9ba4-4aac7a872d5a/volumes" Dec 01 08:42:44 crc kubenswrapper[5004]: I1201 08:42:44.789632 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="dec4937a-6e5e-4612-b0d6-44b774704a2d" path="/var/lib/kubelet/pods/dec4937a-6e5e-4612-b0d6-44b774704a2d/volumes" Dec 01 08:42:44 crc kubenswrapper[5004]: I1201 08:42:44.794937 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 01 08:42:44 crc kubenswrapper[5004]: I1201 08:42:44.823017 5004 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="9ae06efb-c5ea-45b8-9c2a-c7c43dcca1c7" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.236:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 01 08:42:44 crc kubenswrapper[5004]: I1201 08:42:44.823263 5004 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="9ae06efb-c5ea-45b8-9c2a-c7c43dcca1c7" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.236:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 01 08:42:44 crc kubenswrapper[5004]: I1201 08:42:44.875719 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 08:42:45 crc kubenswrapper[5004]: I1201 08:42:45.099304 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 08:42:45 crc kubenswrapper[5004]: W1201 08:42:45.740098 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc0eebd6d_9db9_4e42_b0bf_e98a99f77f96.slice/crio-54dae97069387d32db9291f645b42a728001f74aa2222da2bdf1d19220279e88 WatchSource:0}: Error finding container 54dae97069387d32db9291f645b42a728001f74aa2222da2bdf1d19220279e88: Status 404 returned error can't find the container with id 54dae97069387d32db9291f645b42a728001f74aa2222da2bdf1d19220279e88 Dec 01 08:42:45 crc kubenswrapper[5004]: I1201 08:42:45.836141 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/community-operators-svs2q" Dec 01 08:42:45 crc kubenswrapper[5004]: I1201 08:42:45.837361 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-svs2q" Dec 01 08:42:46 crc kubenswrapper[5004]: I1201 08:42:46.196285 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d978555f9-fpvfz" Dec 01 08:42:46 crc kubenswrapper[5004]: I1201 08:42:46.302957 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6mj89\" (UniqueName: \"kubernetes.io/projected/6d31297b-7a09-4806-b488-1c6e7ea17ab0-kube-api-access-6mj89\") pod \"6d31297b-7a09-4806-b488-1c6e7ea17ab0\" (UID: \"6d31297b-7a09-4806-b488-1c6e7ea17ab0\") " Dec 01 08:42:46 crc kubenswrapper[5004]: I1201 08:42:46.303023 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6d31297b-7a09-4806-b488-1c6e7ea17ab0-dns-swift-storage-0\") pod \"6d31297b-7a09-4806-b488-1c6e7ea17ab0\" (UID: \"6d31297b-7a09-4806-b488-1c6e7ea17ab0\") " Dec 01 08:42:46 crc kubenswrapper[5004]: I1201 08:42:46.303074 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d31297b-7a09-4806-b488-1c6e7ea17ab0-config\") pod \"6d31297b-7a09-4806-b488-1c6e7ea17ab0\" (UID: \"6d31297b-7a09-4806-b488-1c6e7ea17ab0\") " Dec 01 08:42:46 crc kubenswrapper[5004]: I1201 08:42:46.303265 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6d31297b-7a09-4806-b488-1c6e7ea17ab0-ovsdbserver-sb\") pod \"6d31297b-7a09-4806-b488-1c6e7ea17ab0\" (UID: \"6d31297b-7a09-4806-b488-1c6e7ea17ab0\") " Dec 01 08:42:46 crc kubenswrapper[5004]: I1201 08:42:46.303290 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6d31297b-7a09-4806-b488-1c6e7ea17ab0-ovsdbserver-nb\") pod \"6d31297b-7a09-4806-b488-1c6e7ea17ab0\" (UID: \"6d31297b-7a09-4806-b488-1c6e7ea17ab0\") " Dec 01 08:42:46 crc kubenswrapper[5004]: I1201 08:42:46.303406 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6d31297b-7a09-4806-b488-1c6e7ea17ab0-dns-svc\") pod \"6d31297b-7a09-4806-b488-1c6e7ea17ab0\" (UID: \"6d31297b-7a09-4806-b488-1c6e7ea17ab0\") " Dec 01 08:42:46 crc kubenswrapper[5004]: I1201 08:42:46.313654 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d31297b-7a09-4806-b488-1c6e7ea17ab0-kube-api-access-6mj89" (OuterVolumeSpecName: "kube-api-access-6mj89") pod "6d31297b-7a09-4806-b488-1c6e7ea17ab0" (UID: "6d31297b-7a09-4806-b488-1c6e7ea17ab0"). InnerVolumeSpecName "kube-api-access-6mj89". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:42:46 crc kubenswrapper[5004]: I1201 08:42:46.408426 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6mj89\" (UniqueName: \"kubernetes.io/projected/6d31297b-7a09-4806-b488-1c6e7ea17ab0-kube-api-access-6mj89\") on node \"crc\" DevicePath \"\"" Dec 01 08:42:46 crc kubenswrapper[5004]: I1201 08:42:46.462206 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 08:42:46 crc kubenswrapper[5004]: I1201 08:42:46.477252 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d31297b-7a09-4806-b488-1c6e7ea17ab0-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6d31297b-7a09-4806-b488-1c6e7ea17ab0" (UID: "6d31297b-7a09-4806-b488-1c6e7ea17ab0"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:42:46 crc kubenswrapper[5004]: I1201 08:42:46.494454 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d31297b-7a09-4806-b488-1c6e7ea17ab0-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "6d31297b-7a09-4806-b488-1c6e7ea17ab0" (UID: "6d31297b-7a09-4806-b488-1c6e7ea17ab0"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:42:46 crc kubenswrapper[5004]: I1201 08:42:46.513522 5004 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6d31297b-7a09-4806-b488-1c6e7ea17ab0-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 01 08:42:46 crc kubenswrapper[5004]: I1201 08:42:46.513551 5004 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6d31297b-7a09-4806-b488-1c6e7ea17ab0-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 01 08:42:46 crc kubenswrapper[5004]: I1201 08:42:46.538138 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d31297b-7a09-4806-b488-1c6e7ea17ab0-config" (OuterVolumeSpecName: "config") pod "6d31297b-7a09-4806-b488-1c6e7ea17ab0" (UID: "6d31297b-7a09-4806-b488-1c6e7ea17ab0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:42:46 crc kubenswrapper[5004]: I1201 08:42:46.552528 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d31297b-7a09-4806-b488-1c6e7ea17ab0-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6d31297b-7a09-4806-b488-1c6e7ea17ab0" (UID: "6d31297b-7a09-4806-b488-1c6e7ea17ab0"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:42:46 crc kubenswrapper[5004]: I1201 08:42:46.612673 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d31297b-7a09-4806-b488-1c6e7ea17ab0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6d31297b-7a09-4806-b488-1c6e7ea17ab0" (UID: "6d31297b-7a09-4806-b488-1c6e7ea17ab0"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:42:46 crc kubenswrapper[5004]: I1201 08:42:46.615388 5004 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d31297b-7a09-4806-b488-1c6e7ea17ab0-config\") on node \"crc\" DevicePath \"\"" Dec 01 08:42:46 crc kubenswrapper[5004]: I1201 08:42:46.615418 5004 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6d31297b-7a09-4806-b488-1c6e7ea17ab0-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 01 08:42:46 crc kubenswrapper[5004]: I1201 08:42:46.615430 5004 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6d31297b-7a09-4806-b488-1c6e7ea17ab0-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 08:42:46 crc kubenswrapper[5004]: I1201 08:42:46.776729 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c0eebd6d-9db9-4e42-b0bf-e98a99f77f96","Type":"ContainerStarted","Data":"9a25b2f9d039312daa158f7a5afbd0cf906f0d6766aa7e29e90f28390bd8ea5a"} Dec 01 08:42:46 crc kubenswrapper[5004]: I1201 08:42:46.776765 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c0eebd6d-9db9-4e42-b0bf-e98a99f77f96","Type":"ContainerStarted","Data":"537214a11e8a15750d36fc700bc7fa066aa2267e43f394864938d035fc4c63d2"} Dec 01 08:42:46 crc kubenswrapper[5004]: I1201 08:42:46.776775 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-metadata-0" event={"ID":"c0eebd6d-9db9-4e42-b0bf-e98a99f77f96","Type":"ContainerStarted","Data":"54dae97069387d32db9291f645b42a728001f74aa2222da2bdf1d19220279e88"} Dec 01 08:42:46 crc kubenswrapper[5004]: I1201 08:42:46.785290 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d978555f9-fpvfz" event={"ID":"6d31297b-7a09-4806-b488-1c6e7ea17ab0","Type":"ContainerDied","Data":"fb3a125a6d540161eefbcb6bffdfbe144352dd3650ec23d544e6eecf4d45b36e"} Dec 01 08:42:46 crc kubenswrapper[5004]: I1201 08:42:46.785431 5004 scope.go:117] "RemoveContainer" containerID="0b81a733c8f514f09eb04912b4c63f52c6900f05c44b4ef3f29bffb86206f57b" Dec 01 08:42:46 crc kubenswrapper[5004]: I1201 08:42:46.785707 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d978555f9-fpvfz" Dec 01 08:42:46 crc kubenswrapper[5004]: I1201 08:42:46.788063 5004 generic.go:334] "Generic (PLEG): container finished" podID="902842db-d6a2-4ae1-8d5e-25b637f4db2c" containerID="75604ee40e80496e4599aae3f1fb7cc18dd761c466f5be26677f5ac186554dbd" exitCode=0 Dec 01 08:42:46 crc kubenswrapper[5004]: I1201 08:42:46.788119 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-6rnpl" event={"ID":"902842db-d6a2-4ae1-8d5e-25b637f4db2c","Type":"ContainerDied","Data":"75604ee40e80496e4599aae3f1fb7cc18dd761c466f5be26677f5ac186554dbd"} Dec 01 08:42:46 crc kubenswrapper[5004]: I1201 08:42:46.795392 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7f55f02c-5371-4258-b5cf-9565b29798b9","Type":"ContainerStarted","Data":"b4cecaeb7fc761e8c8b511ee1a5f16d8dcafcbce8eb0d59b501027a28c49dab7"} Dec 01 08:42:46 crc kubenswrapper[5004]: I1201 08:42:46.797617 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" 
event={"ID":"6792a811-4156-4b0a-b6b4-6ff3280229d2","Type":"ContainerStarted","Data":"2f13ef7709655a3d8d5548c58b0d41f1559b92eed686d7880bfceca816c21aad"} Dec 01 08:42:46 crc kubenswrapper[5004]: I1201 08:42:46.810975 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.810955929 podStartE2EDuration="3.810955929s" podCreationTimestamp="2025-12-01 08:42:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:42:46.796780812 +0000 UTC m=+1544.361772784" watchObservedRunningTime="2025-12-01 08:42:46.810955929 +0000 UTC m=+1544.375947911" Dec 01 08:42:46 crc kubenswrapper[5004]: I1201 08:42:46.898292 5004 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-svs2q" podUID="ac4a2842-3b0b-45d4-b400-ce7be95df717" containerName="registry-server" probeResult="failure" output=< Dec 01 08:42:46 crc kubenswrapper[5004]: timeout: failed to connect service ":50051" within 1s Dec 01 08:42:46 crc kubenswrapper[5004]: > Dec 01 08:42:46 crc kubenswrapper[5004]: I1201 08:42:46.903022 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d978555f9-fpvfz"] Dec 01 08:42:46 crc kubenswrapper[5004]: I1201 08:42:46.903107 5004 scope.go:117] "RemoveContainer" containerID="4e026f98251c490003dc4784baabd4babacba50787c6e7159e604e1bea8ba9a2" Dec 01 08:42:46 crc kubenswrapper[5004]: I1201 08:42:46.919194 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7d978555f9-fpvfz"] Dec 01 08:42:47 crc kubenswrapper[5004]: E1201 08:42:47.089636 5004 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod902842db_d6a2_4ae1_8d5e_25b637f4db2c.slice/crio-75604ee40e80496e4599aae3f1fb7cc18dd761c466f5be26677f5ac186554dbd.scope\": 
RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod902842db_d6a2_4ae1_8d5e_25b637f4db2c.slice/crio-conmon-75604ee40e80496e4599aae3f1fb7cc18dd761c466f5be26677f5ac186554dbd.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6d31297b_7a09_4806_b488_1c6e7ea17ab0.slice\": RecentStats: unable to find data in memory cache]" Dec 01 08:42:47 crc kubenswrapper[5004]: I1201 08:42:47.817497 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7f55f02c-5371-4258-b5cf-9565b29798b9","Type":"ContainerStarted","Data":"7b2e343e254fa87c3ee0984676d24fda0f1fa1803053782002fc004659c8c7bc"} Dec 01 08:42:48 crc kubenswrapper[5004]: I1201 08:42:48.788491 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-6rnpl" Dec 01 08:42:48 crc kubenswrapper[5004]: I1201 08:42:48.863814 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d31297b-7a09-4806-b488-1c6e7ea17ab0" path="/var/lib/kubelet/pods/6d31297b-7a09-4806-b488-1c6e7ea17ab0/volumes" Dec 01 08:42:48 crc kubenswrapper[5004]: I1201 08:42:48.864219 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-6rnpl" Dec 01 08:42:48 crc kubenswrapper[5004]: I1201 08:42:48.869019 5004 generic.go:334] "Generic (PLEG): container finished" podID="a31898e1-9571-43cb-b754-5544a7898213" containerID="d396d1b5dd78f4ad9d79e792b9b18ece3e9a8e63b83164dbc75f56468b206893" exitCode=0 Dec 01 08:42:48 crc kubenswrapper[5004]: I1201 08:42:48.869641 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-6rnpl" event={"ID":"902842db-d6a2-4ae1-8d5e-25b637f4db2c","Type":"ContainerDied","Data":"0c678a7b2a88c2a73b210510389daca02a40d24ae6cd27fe05e33d2154b3e83d"} Dec 01 08:42:48 crc kubenswrapper[5004]: I1201 08:42:48.869696 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0c678a7b2a88c2a73b210510389daca02a40d24ae6cd27fe05e33d2154b3e83d" Dec 01 08:42:48 crc kubenswrapper[5004]: I1201 08:42:48.869708 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-5gbbc" event={"ID":"a31898e1-9571-43cb-b754-5544a7898213","Type":"ContainerDied","Data":"d396d1b5dd78f4ad9d79e792b9b18ece3e9a8e63b83164dbc75f56468b206893"} Dec 01 08:42:48 crc kubenswrapper[5004]: I1201 08:42:48.899000 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/902842db-d6a2-4ae1-8d5e-25b637f4db2c-config-data\") pod \"902842db-d6a2-4ae1-8d5e-25b637f4db2c\" (UID: \"902842db-d6a2-4ae1-8d5e-25b637f4db2c\") " Dec 01 08:42:48 crc kubenswrapper[5004]: I1201 08:42:48.899131 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/902842db-d6a2-4ae1-8d5e-25b637f4db2c-combined-ca-bundle\") pod \"902842db-d6a2-4ae1-8d5e-25b637f4db2c\" (UID: \"902842db-d6a2-4ae1-8d5e-25b637f4db2c\") " Dec 01 08:42:48 crc kubenswrapper[5004]: I1201 08:42:48.899181 5004 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-f68x4\" (UniqueName: \"kubernetes.io/projected/902842db-d6a2-4ae1-8d5e-25b637f4db2c-kube-api-access-f68x4\") pod \"902842db-d6a2-4ae1-8d5e-25b637f4db2c\" (UID: \"902842db-d6a2-4ae1-8d5e-25b637f4db2c\") " Dec 01 08:42:48 crc kubenswrapper[5004]: I1201 08:42:48.899226 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/902842db-d6a2-4ae1-8d5e-25b637f4db2c-scripts\") pod \"902842db-d6a2-4ae1-8d5e-25b637f4db2c\" (UID: \"902842db-d6a2-4ae1-8d5e-25b637f4db2c\") " Dec 01 08:42:48 crc kubenswrapper[5004]: I1201 08:42:48.906259 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/902842db-d6a2-4ae1-8d5e-25b637f4db2c-scripts" (OuterVolumeSpecName: "scripts") pod "902842db-d6a2-4ae1-8d5e-25b637f4db2c" (UID: "902842db-d6a2-4ae1-8d5e-25b637f4db2c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:42:48 crc kubenswrapper[5004]: I1201 08:42:48.906322 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/902842db-d6a2-4ae1-8d5e-25b637f4db2c-kube-api-access-f68x4" (OuterVolumeSpecName: "kube-api-access-f68x4") pod "902842db-d6a2-4ae1-8d5e-25b637f4db2c" (UID: "902842db-d6a2-4ae1-8d5e-25b637f4db2c"). InnerVolumeSpecName "kube-api-access-f68x4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:42:48 crc kubenswrapper[5004]: I1201 08:42:48.939828 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/902842db-d6a2-4ae1-8d5e-25b637f4db2c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "902842db-d6a2-4ae1-8d5e-25b637f4db2c" (UID: "902842db-d6a2-4ae1-8d5e-25b637f4db2c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:42:48 crc kubenswrapper[5004]: I1201 08:42:48.940490 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/902842db-d6a2-4ae1-8d5e-25b637f4db2c-config-data" (OuterVolumeSpecName: "config-data") pod "902842db-d6a2-4ae1-8d5e-25b637f4db2c" (UID: "902842db-d6a2-4ae1-8d5e-25b637f4db2c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:42:48 crc kubenswrapper[5004]: I1201 08:42:48.997952 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 01 08:42:48 crc kubenswrapper[5004]: I1201 08:42:48.998510 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="9ae06efb-c5ea-45b8-9c2a-c7c43dcca1c7" containerName="nova-api-log" containerID="cri-o://5ac15de2db274c854f4ae19c9240c2c49ff5d606a1545de00284aab70223f4cf" gracePeriod=30 Dec 01 08:42:48 crc kubenswrapper[5004]: I1201 08:42:48.998591 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="9ae06efb-c5ea-45b8-9c2a-c7c43dcca1c7" containerName="nova-api-api" containerID="cri-o://31e1c0b0831e8b10a469f9a8414e345f4fc81d6a710a1a517964ea830fd4ef68" gracePeriod=30 Dec 01 08:42:49 crc kubenswrapper[5004]: I1201 08:42:49.001830 5004 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/902842db-d6a2-4ae1-8d5e-25b637f4db2c-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 08:42:49 crc kubenswrapper[5004]: I1201 08:42:49.001865 5004 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/902842db-d6a2-4ae1-8d5e-25b637f4db2c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 08:42:49 crc kubenswrapper[5004]: I1201 08:42:49.001882 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f68x4\" 
(UniqueName: \"kubernetes.io/projected/902842db-d6a2-4ae1-8d5e-25b637f4db2c-kube-api-access-f68x4\") on node \"crc\" DevicePath \"\"" Dec 01 08:42:49 crc kubenswrapper[5004]: I1201 08:42:49.001894 5004 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/902842db-d6a2-4ae1-8d5e-25b637f4db2c-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 08:42:49 crc kubenswrapper[5004]: I1201 08:42:49.021213 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 08:42:49 crc kubenswrapper[5004]: I1201 08:42:49.021445 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="948b1e6b-150b-44f9-8dc9-d6731e08f6e8" containerName="nova-scheduler-scheduler" containerID="cri-o://6334c3e913314b751db05ebfd38b595352ae3df0deb67fd3e8513e67bc61f3f3" gracePeriod=30 Dec 01 08:42:49 crc kubenswrapper[5004]: I1201 08:42:49.051360 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 08:42:49 crc kubenswrapper[5004]: I1201 08:42:49.051584 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="c0eebd6d-9db9-4e42-b0bf-e98a99f77f96" containerName="nova-metadata-log" containerID="cri-o://537214a11e8a15750d36fc700bc7fa066aa2267e43f394864938d035fc4c63d2" gracePeriod=30 Dec 01 08:42:49 crc kubenswrapper[5004]: I1201 08:42:49.052033 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="c0eebd6d-9db9-4e42-b0bf-e98a99f77f96" containerName="nova-metadata-metadata" containerID="cri-o://9a25b2f9d039312daa158f7a5afbd0cf906f0d6766aa7e29e90f28390bd8ea5a" gracePeriod=30 Dec 01 08:42:49 crc kubenswrapper[5004]: I1201 08:42:49.205744 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 01 08:42:49 crc kubenswrapper[5004]: I1201 08:42:49.205797 5004 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 01 08:42:49 crc kubenswrapper[5004]: I1201 08:42:49.883802 5004 generic.go:334] "Generic (PLEG): container finished" podID="9ae06efb-c5ea-45b8-9c2a-c7c43dcca1c7" containerID="5ac15de2db274c854f4ae19c9240c2c49ff5d606a1545de00284aab70223f4cf" exitCode=143 Dec 01 08:42:49 crc kubenswrapper[5004]: I1201 08:42:49.884029 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9ae06efb-c5ea-45b8-9c2a-c7c43dcca1c7","Type":"ContainerDied","Data":"5ac15de2db274c854f4ae19c9240c2c49ff5d606a1545de00284aab70223f4cf"} Dec 01 08:42:49 crc kubenswrapper[5004]: I1201 08:42:49.904969 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7f55f02c-5371-4258-b5cf-9565b29798b9","Type":"ContainerStarted","Data":"42f9b588d70d3af67504f05f69f5fe67ff5311a5aff39254ca127ec5ef25cb05"} Dec 01 08:42:49 crc kubenswrapper[5004]: I1201 08:42:49.910445 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"6792a811-4156-4b0a-b6b4-6ff3280229d2","Type":"ContainerStarted","Data":"3997889ffc5431fc2825ff310e60d0300319a23df2695880a877b33ad1ff73a7"} Dec 01 08:42:49 crc kubenswrapper[5004]: I1201 08:42:49.914153 5004 generic.go:334] "Generic (PLEG): container finished" podID="c0eebd6d-9db9-4e42-b0bf-e98a99f77f96" containerID="9a25b2f9d039312daa158f7a5afbd0cf906f0d6766aa7e29e90f28390bd8ea5a" exitCode=0 Dec 01 08:42:49 crc kubenswrapper[5004]: I1201 08:42:49.914191 5004 generic.go:334] "Generic (PLEG): container finished" podID="c0eebd6d-9db9-4e42-b0bf-e98a99f77f96" containerID="537214a11e8a15750d36fc700bc7fa066aa2267e43f394864938d035fc4c63d2" exitCode=143 Dec 01 08:42:49 crc kubenswrapper[5004]: I1201 08:42:49.914350 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"c0eebd6d-9db9-4e42-b0bf-e98a99f77f96","Type":"ContainerDied","Data":"9a25b2f9d039312daa158f7a5afbd0cf906f0d6766aa7e29e90f28390bd8ea5a"} Dec 01 08:42:49 crc kubenswrapper[5004]: I1201 08:42:49.914375 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c0eebd6d-9db9-4e42-b0bf-e98a99f77f96","Type":"ContainerDied","Data":"537214a11e8a15750d36fc700bc7fa066aa2267e43f394864938d035fc4c63d2"} Dec 01 08:42:50 crc kubenswrapper[5004]: I1201 08:42:50.236542 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 08:42:50 crc kubenswrapper[5004]: I1201 08:42:50.331615 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tfnz2\" (UniqueName: \"kubernetes.io/projected/c0eebd6d-9db9-4e42-b0bf-e98a99f77f96-kube-api-access-tfnz2\") pod \"c0eebd6d-9db9-4e42-b0bf-e98a99f77f96\" (UID: \"c0eebd6d-9db9-4e42-b0bf-e98a99f77f96\") " Dec 01 08:42:50 crc kubenswrapper[5004]: I1201 08:42:50.331810 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0eebd6d-9db9-4e42-b0bf-e98a99f77f96-combined-ca-bundle\") pod \"c0eebd6d-9db9-4e42-b0bf-e98a99f77f96\" (UID: \"c0eebd6d-9db9-4e42-b0bf-e98a99f77f96\") " Dec 01 08:42:50 crc kubenswrapper[5004]: I1201 08:42:50.331864 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0eebd6d-9db9-4e42-b0bf-e98a99f77f96-nova-metadata-tls-certs\") pod \"c0eebd6d-9db9-4e42-b0bf-e98a99f77f96\" (UID: \"c0eebd6d-9db9-4e42-b0bf-e98a99f77f96\") " Dec 01 08:42:50 crc kubenswrapper[5004]: I1201 08:42:50.331997 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0eebd6d-9db9-4e42-b0bf-e98a99f77f96-logs\") pod 
\"c0eebd6d-9db9-4e42-b0bf-e98a99f77f96\" (UID: \"c0eebd6d-9db9-4e42-b0bf-e98a99f77f96\") " Dec 01 08:42:50 crc kubenswrapper[5004]: I1201 08:42:50.332013 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0eebd6d-9db9-4e42-b0bf-e98a99f77f96-config-data\") pod \"c0eebd6d-9db9-4e42-b0bf-e98a99f77f96\" (UID: \"c0eebd6d-9db9-4e42-b0bf-e98a99f77f96\") " Dec 01 08:42:50 crc kubenswrapper[5004]: I1201 08:42:50.338114 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0eebd6d-9db9-4e42-b0bf-e98a99f77f96-kube-api-access-tfnz2" (OuterVolumeSpecName: "kube-api-access-tfnz2") pod "c0eebd6d-9db9-4e42-b0bf-e98a99f77f96" (UID: "c0eebd6d-9db9-4e42-b0bf-e98a99f77f96"). InnerVolumeSpecName "kube-api-access-tfnz2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:42:50 crc kubenswrapper[5004]: I1201 08:42:50.353384 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0eebd6d-9db9-4e42-b0bf-e98a99f77f96-logs" (OuterVolumeSpecName: "logs") pod "c0eebd6d-9db9-4e42-b0bf-e98a99f77f96" (UID: "c0eebd6d-9db9-4e42-b0bf-e98a99f77f96"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 08:42:50 crc kubenswrapper[5004]: I1201 08:42:50.434302 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tfnz2\" (UniqueName: \"kubernetes.io/projected/c0eebd6d-9db9-4e42-b0bf-e98a99f77f96-kube-api-access-tfnz2\") on node \"crc\" DevicePath \"\"" Dec 01 08:42:50 crc kubenswrapper[5004]: I1201 08:42:50.434334 5004 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0eebd6d-9db9-4e42-b0bf-e98a99f77f96-logs\") on node \"crc\" DevicePath \"\"" Dec 01 08:42:50 crc kubenswrapper[5004]: I1201 08:42:50.441880 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0eebd6d-9db9-4e42-b0bf-e98a99f77f96-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c0eebd6d-9db9-4e42-b0bf-e98a99f77f96" (UID: "c0eebd6d-9db9-4e42-b0bf-e98a99f77f96"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:42:50 crc kubenswrapper[5004]: I1201 08:42:50.467523 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0eebd6d-9db9-4e42-b0bf-e98a99f77f96-config-data" (OuterVolumeSpecName: "config-data") pod "c0eebd6d-9db9-4e42-b0bf-e98a99f77f96" (UID: "c0eebd6d-9db9-4e42-b0bf-e98a99f77f96"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:42:50 crc kubenswrapper[5004]: I1201 08:42:50.484199 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0eebd6d-9db9-4e42-b0bf-e98a99f77f96-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "c0eebd6d-9db9-4e42-b0bf-e98a99f77f96" (UID: "c0eebd6d-9db9-4e42-b0bf-e98a99f77f96"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:42:50 crc kubenswrapper[5004]: I1201 08:42:50.536096 5004 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0eebd6d-9db9-4e42-b0bf-e98a99f77f96-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 08:42:50 crc kubenswrapper[5004]: I1201 08:42:50.536126 5004 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0eebd6d-9db9-4e42-b0bf-e98a99f77f96-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 08:42:50 crc kubenswrapper[5004]: I1201 08:42:50.536140 5004 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0eebd6d-9db9-4e42-b0bf-e98a99f77f96-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 01 08:42:50 crc kubenswrapper[5004]: I1201 08:42:50.556966 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-5gbbc" Dec 01 08:42:50 crc kubenswrapper[5004]: I1201 08:42:50.638793 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6f2z4\" (UniqueName: \"kubernetes.io/projected/a31898e1-9571-43cb-b754-5544a7898213-kube-api-access-6f2z4\") pod \"a31898e1-9571-43cb-b754-5544a7898213\" (UID: \"a31898e1-9571-43cb-b754-5544a7898213\") " Dec 01 08:42:50 crc kubenswrapper[5004]: I1201 08:42:50.638855 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a31898e1-9571-43cb-b754-5544a7898213-combined-ca-bundle\") pod \"a31898e1-9571-43cb-b754-5544a7898213\" (UID: \"a31898e1-9571-43cb-b754-5544a7898213\") " Dec 01 08:42:50 crc kubenswrapper[5004]: I1201 08:42:50.638964 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/a31898e1-9571-43cb-b754-5544a7898213-scripts\") pod \"a31898e1-9571-43cb-b754-5544a7898213\" (UID: \"a31898e1-9571-43cb-b754-5544a7898213\") " Dec 01 08:42:50 crc kubenswrapper[5004]: I1201 08:42:50.639186 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a31898e1-9571-43cb-b754-5544a7898213-config-data\") pod \"a31898e1-9571-43cb-b754-5544a7898213\" (UID: \"a31898e1-9571-43cb-b754-5544a7898213\") " Dec 01 08:42:50 crc kubenswrapper[5004]: I1201 08:42:50.647988 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31898e1-9571-43cb-b754-5544a7898213-kube-api-access-6f2z4" (OuterVolumeSpecName: "kube-api-access-6f2z4") pod "a31898e1-9571-43cb-b754-5544a7898213" (UID: "a31898e1-9571-43cb-b754-5544a7898213"). InnerVolumeSpecName "kube-api-access-6f2z4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:42:50 crc kubenswrapper[5004]: I1201 08:42:50.651760 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31898e1-9571-43cb-b754-5544a7898213-scripts" (OuterVolumeSpecName: "scripts") pod "a31898e1-9571-43cb-b754-5544a7898213" (UID: "a31898e1-9571-43cb-b754-5544a7898213"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:42:50 crc kubenswrapper[5004]: I1201 08:42:50.682746 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31898e1-9571-43cb-b754-5544a7898213-config-data" (OuterVolumeSpecName: "config-data") pod "a31898e1-9571-43cb-b754-5544a7898213" (UID: "a31898e1-9571-43cb-b754-5544a7898213"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:42:50 crc kubenswrapper[5004]: I1201 08:42:50.695066 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31898e1-9571-43cb-b754-5544a7898213-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a31898e1-9571-43cb-b754-5544a7898213" (UID: "a31898e1-9571-43cb-b754-5544a7898213"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:42:50 crc kubenswrapper[5004]: I1201 08:42:50.742654 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6f2z4\" (UniqueName: \"kubernetes.io/projected/a31898e1-9571-43cb-b754-5544a7898213-kube-api-access-6f2z4\") on node \"crc\" DevicePath \"\"" Dec 01 08:42:50 crc kubenswrapper[5004]: I1201 08:42:50.742688 5004 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a31898e1-9571-43cb-b754-5544a7898213-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 08:42:50 crc kubenswrapper[5004]: I1201 08:42:50.742698 5004 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a31898e1-9571-43cb-b754-5544a7898213-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 08:42:50 crc kubenswrapper[5004]: I1201 08:42:50.742708 5004 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a31898e1-9571-43cb-b754-5544a7898213-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 08:42:50 crc kubenswrapper[5004]: I1201 08:42:50.957047 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c0eebd6d-9db9-4e42-b0bf-e98a99f77f96","Type":"ContainerDied","Data":"54dae97069387d32db9291f645b42a728001f74aa2222da2bdf1d19220279e88"} Dec 01 08:42:50 crc kubenswrapper[5004]: I1201 08:42:50.961199 5004 scope.go:117] "RemoveContainer" 
containerID="9a25b2f9d039312daa158f7a5afbd0cf906f0d6766aa7e29e90f28390bd8ea5a" Dec 01 08:42:50 crc kubenswrapper[5004]: I1201 08:42:50.957062 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 08:42:50 crc kubenswrapper[5004]: I1201 08:42:50.967964 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 01 08:42:50 crc kubenswrapper[5004]: I1201 08:42:50.968074 5004 generic.go:334] "Generic (PLEG): container finished" podID="948b1e6b-150b-44f9-8dc9-d6731e08f6e8" containerID="6334c3e913314b751db05ebfd38b595352ae3df0deb67fd3e8513e67bc61f3f3" exitCode=0 Dec 01 08:42:50 crc kubenswrapper[5004]: I1201 08:42:50.968176 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"948b1e6b-150b-44f9-8dc9-d6731e08f6e8","Type":"ContainerDied","Data":"6334c3e913314b751db05ebfd38b595352ae3df0deb67fd3e8513e67bc61f3f3"} Dec 01 08:42:50 crc kubenswrapper[5004]: I1201 08:42:50.993993 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 01 08:42:50 crc kubenswrapper[5004]: E1201 08:42:50.994757 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0eebd6d-9db9-4e42-b0bf-e98a99f77f96" containerName="nova-metadata-metadata" Dec 01 08:42:50 crc kubenswrapper[5004]: I1201 08:42:50.994775 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0eebd6d-9db9-4e42-b0bf-e98a99f77f96" containerName="nova-metadata-metadata" Dec 01 08:42:50 crc kubenswrapper[5004]: E1201 08:42:50.994801 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="902842db-d6a2-4ae1-8d5e-25b637f4db2c" containerName="nova-manage" Dec 01 08:42:50 crc kubenswrapper[5004]: I1201 08:42:50.994808 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="902842db-d6a2-4ae1-8d5e-25b637f4db2c" containerName="nova-manage" Dec 01 08:42:50 crc kubenswrapper[5004]: E1201 
08:42:50.994814 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a31898e1-9571-43cb-b754-5544a7898213" containerName="nova-cell1-conductor-db-sync" Dec 01 08:42:50 crc kubenswrapper[5004]: I1201 08:42:50.994821 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="a31898e1-9571-43cb-b754-5544a7898213" containerName="nova-cell1-conductor-db-sync" Dec 01 08:42:50 crc kubenswrapper[5004]: E1201 08:42:50.994833 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0eebd6d-9db9-4e42-b0bf-e98a99f77f96" containerName="nova-metadata-log" Dec 01 08:42:50 crc kubenswrapper[5004]: I1201 08:42:50.994839 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0eebd6d-9db9-4e42-b0bf-e98a99f77f96" containerName="nova-metadata-log" Dec 01 08:42:50 crc kubenswrapper[5004]: E1201 08:42:50.994865 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d31297b-7a09-4806-b488-1c6e7ea17ab0" containerName="init" Dec 01 08:42:50 crc kubenswrapper[5004]: I1201 08:42:50.994870 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d31297b-7a09-4806-b488-1c6e7ea17ab0" containerName="init" Dec 01 08:42:50 crc kubenswrapper[5004]: E1201 08:42:50.994881 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d31297b-7a09-4806-b488-1c6e7ea17ab0" containerName="dnsmasq-dns" Dec 01 08:42:51 crc kubenswrapper[5004]: I1201 08:42:50.994888 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d31297b-7a09-4806-b488-1c6e7ea17ab0" containerName="dnsmasq-dns" Dec 01 08:42:51 crc kubenswrapper[5004]: E1201 08:42:50.994914 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="948b1e6b-150b-44f9-8dc9-d6731e08f6e8" containerName="nova-scheduler-scheduler" Dec 01 08:42:51 crc kubenswrapper[5004]: I1201 08:42:50.994920 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="948b1e6b-150b-44f9-8dc9-d6731e08f6e8" containerName="nova-scheduler-scheduler" Dec 01 08:42:51 crc kubenswrapper[5004]: 
I1201 08:42:50.995261 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="948b1e6b-150b-44f9-8dc9-d6731e08f6e8" containerName="nova-scheduler-scheduler" Dec 01 08:42:51 crc kubenswrapper[5004]: I1201 08:42:50.995284 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="902842db-d6a2-4ae1-8d5e-25b637f4db2c" containerName="nova-manage" Dec 01 08:42:51 crc kubenswrapper[5004]: I1201 08:42:50.995295 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0eebd6d-9db9-4e42-b0bf-e98a99f77f96" containerName="nova-metadata-log" Dec 01 08:42:51 crc kubenswrapper[5004]: I1201 08:42:50.995308 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0eebd6d-9db9-4e42-b0bf-e98a99f77f96" containerName="nova-metadata-metadata" Dec 01 08:42:51 crc kubenswrapper[5004]: I1201 08:42:50.995320 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="a31898e1-9571-43cb-b754-5544a7898213" containerName="nova-cell1-conductor-db-sync" Dec 01 08:42:51 crc kubenswrapper[5004]: I1201 08:42:50.995330 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d31297b-7a09-4806-b488-1c6e7ea17ab0" containerName="dnsmasq-dns" Dec 01 08:42:51 crc kubenswrapper[5004]: I1201 08:42:51.001183 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 01 08:42:51 crc kubenswrapper[5004]: I1201 08:42:51.017871 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 01 08:42:51 crc kubenswrapper[5004]: I1201 08:42:51.021017 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7f55f02c-5371-4258-b5cf-9565b29798b9","Type":"ContainerStarted","Data":"2e1530cdf061e5fb1b8209f839d146c3c7c4f1bfe7b77d7de57decf909fe873d"} Dec 01 08:42:51 crc kubenswrapper[5004]: I1201 08:42:51.032120 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 08:42:51 crc kubenswrapper[5004]: I1201 08:42:51.040267 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-5gbbc" event={"ID":"a31898e1-9571-43cb-b754-5544a7898213","Type":"ContainerDied","Data":"8da75b95a188065c5f6aafe48f7040747aa7c972a53ca9b23822d6a4b7fbec6c"} Dec 01 08:42:51 crc kubenswrapper[5004]: I1201 08:42:51.040305 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8da75b95a188065c5f6aafe48f7040747aa7c972a53ca9b23822d6a4b7fbec6c" Dec 01 08:42:51 crc kubenswrapper[5004]: I1201 08:42:51.040364 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-5gbbc" Dec 01 08:42:51 crc kubenswrapper[5004]: I1201 08:42:51.051913 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 08:42:51 crc kubenswrapper[5004]: I1201 08:42:51.052902 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/948b1e6b-150b-44f9-8dc9-d6731e08f6e8-combined-ca-bundle\") pod \"948b1e6b-150b-44f9-8dc9-d6731e08f6e8\" (UID: \"948b1e6b-150b-44f9-8dc9-d6731e08f6e8\") " Dec 01 08:42:51 crc kubenswrapper[5004]: I1201 08:42:51.053088 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ltzj8\" (UniqueName: \"kubernetes.io/projected/948b1e6b-150b-44f9-8dc9-d6731e08f6e8-kube-api-access-ltzj8\") pod \"948b1e6b-150b-44f9-8dc9-d6731e08f6e8\" (UID: \"948b1e6b-150b-44f9-8dc9-d6731e08f6e8\") " Dec 01 08:42:51 crc kubenswrapper[5004]: I1201 08:42:51.053138 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/948b1e6b-150b-44f9-8dc9-d6731e08f6e8-config-data\") pod \"948b1e6b-150b-44f9-8dc9-d6731e08f6e8\" (UID: \"948b1e6b-150b-44f9-8dc9-d6731e08f6e8\") " Dec 01 08:42:51 crc kubenswrapper[5004]: I1201 08:42:51.061671 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/948b1e6b-150b-44f9-8dc9-d6731e08f6e8-kube-api-access-ltzj8" (OuterVolumeSpecName: "kube-api-access-ltzj8") pod "948b1e6b-150b-44f9-8dc9-d6731e08f6e8" (UID: "948b1e6b-150b-44f9-8dc9-d6731e08f6e8"). InnerVolumeSpecName "kube-api-access-ltzj8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:42:51 crc kubenswrapper[5004]: I1201 08:42:51.061734 5004 scope.go:117] "RemoveContainer" containerID="537214a11e8a15750d36fc700bc7fa066aa2267e43f394864938d035fc4c63d2" Dec 01 08:42:51 crc kubenswrapper[5004]: I1201 08:42:51.095643 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 01 08:42:51 crc kubenswrapper[5004]: I1201 08:42:51.097409 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 08:42:51 crc kubenswrapper[5004]: I1201 08:42:51.102521 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 01 08:42:51 crc kubenswrapper[5004]: I1201 08:42:51.102747 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 01 08:42:51 crc kubenswrapper[5004]: I1201 08:42:51.103610 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/948b1e6b-150b-44f9-8dc9-d6731e08f6e8-config-data" (OuterVolumeSpecName: "config-data") pod "948b1e6b-150b-44f9-8dc9-d6731e08f6e8" (UID: "948b1e6b-150b-44f9-8dc9-d6731e08f6e8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:42:51 crc kubenswrapper[5004]: I1201 08:42:51.165761 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 08:42:51 crc kubenswrapper[5004]: I1201 08:42:51.168827 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6stfb\" (UniqueName: \"kubernetes.io/projected/6ac2f2ad-2604-4972-896b-07e23c74ac8d-kube-api-access-6stfb\") pod \"nova-cell1-conductor-0\" (UID: \"6ac2f2ad-2604-4972-896b-07e23c74ac8d\") " pod="openstack/nova-cell1-conductor-0" Dec 01 08:42:51 crc kubenswrapper[5004]: I1201 08:42:51.169237 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ac2f2ad-2604-4972-896b-07e23c74ac8d-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"6ac2f2ad-2604-4972-896b-07e23c74ac8d\") " pod="openstack/nova-cell1-conductor-0" Dec 01 08:42:51 crc kubenswrapper[5004]: I1201 08:42:51.169375 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ac2f2ad-2604-4972-896b-07e23c74ac8d-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"6ac2f2ad-2604-4972-896b-07e23c74ac8d\") " pod="openstack/nova-cell1-conductor-0" Dec 01 08:42:51 crc kubenswrapper[5004]: I1201 08:42:51.170006 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ltzj8\" (UniqueName: \"kubernetes.io/projected/948b1e6b-150b-44f9-8dc9-d6731e08f6e8-kube-api-access-ltzj8\") on node \"crc\" DevicePath \"\"" Dec 01 08:42:51 crc kubenswrapper[5004]: I1201 08:42:51.170051 5004 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/948b1e6b-150b-44f9-8dc9-d6731e08f6e8-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 08:42:51 crc kubenswrapper[5004]: I1201 
08:42:51.201699 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/948b1e6b-150b-44f9-8dc9-d6731e08f6e8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "948b1e6b-150b-44f9-8dc9-d6731e08f6e8" (UID: "948b1e6b-150b-44f9-8dc9-d6731e08f6e8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:42:51 crc kubenswrapper[5004]: I1201 08:42:51.271919 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/855d076a-eacb-40d9-9135-8a1a3cf64f59-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"855d076a-eacb-40d9-9135-8a1a3cf64f59\") " pod="openstack/nova-metadata-0" Dec 01 08:42:51 crc kubenswrapper[5004]: I1201 08:42:51.271982 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6stfb\" (UniqueName: \"kubernetes.io/projected/6ac2f2ad-2604-4972-896b-07e23c74ac8d-kube-api-access-6stfb\") pod \"nova-cell1-conductor-0\" (UID: \"6ac2f2ad-2604-4972-896b-07e23c74ac8d\") " pod="openstack/nova-cell1-conductor-0" Dec 01 08:42:51 crc kubenswrapper[5004]: I1201 08:42:51.272042 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/855d076a-eacb-40d9-9135-8a1a3cf64f59-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"855d076a-eacb-40d9-9135-8a1a3cf64f59\") " pod="openstack/nova-metadata-0" Dec 01 08:42:51 crc kubenswrapper[5004]: I1201 08:42:51.272076 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/855d076a-eacb-40d9-9135-8a1a3cf64f59-config-data\") pod \"nova-metadata-0\" (UID: \"855d076a-eacb-40d9-9135-8a1a3cf64f59\") " pod="openstack/nova-metadata-0" Dec 01 08:42:51 crc kubenswrapper[5004]: I1201 
08:42:51.272142 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/855d076a-eacb-40d9-9135-8a1a3cf64f59-logs\") pod \"nova-metadata-0\" (UID: \"855d076a-eacb-40d9-9135-8a1a3cf64f59\") " pod="openstack/nova-metadata-0" Dec 01 08:42:51 crc kubenswrapper[5004]: I1201 08:42:51.272200 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ac2f2ad-2604-4972-896b-07e23c74ac8d-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"6ac2f2ad-2604-4972-896b-07e23c74ac8d\") " pod="openstack/nova-cell1-conductor-0" Dec 01 08:42:51 crc kubenswrapper[5004]: I1201 08:42:51.272249 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ac2f2ad-2604-4972-896b-07e23c74ac8d-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"6ac2f2ad-2604-4972-896b-07e23c74ac8d\") " pod="openstack/nova-cell1-conductor-0" Dec 01 08:42:51 crc kubenswrapper[5004]: I1201 08:42:51.272295 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qft5\" (UniqueName: \"kubernetes.io/projected/855d076a-eacb-40d9-9135-8a1a3cf64f59-kube-api-access-9qft5\") pod \"nova-metadata-0\" (UID: \"855d076a-eacb-40d9-9135-8a1a3cf64f59\") " pod="openstack/nova-metadata-0" Dec 01 08:42:51 crc kubenswrapper[5004]: I1201 08:42:51.272362 5004 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/948b1e6b-150b-44f9-8dc9-d6731e08f6e8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 08:42:51 crc kubenswrapper[5004]: I1201 08:42:51.276762 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ac2f2ad-2604-4972-896b-07e23c74ac8d-combined-ca-bundle\") pod 
\"nova-cell1-conductor-0\" (UID: \"6ac2f2ad-2604-4972-896b-07e23c74ac8d\") " pod="openstack/nova-cell1-conductor-0" Dec 01 08:42:51 crc kubenswrapper[5004]: I1201 08:42:51.277800 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ac2f2ad-2604-4972-896b-07e23c74ac8d-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"6ac2f2ad-2604-4972-896b-07e23c74ac8d\") " pod="openstack/nova-cell1-conductor-0" Dec 01 08:42:51 crc kubenswrapper[5004]: I1201 08:42:51.291298 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6stfb\" (UniqueName: \"kubernetes.io/projected/6ac2f2ad-2604-4972-896b-07e23c74ac8d-kube-api-access-6stfb\") pod \"nova-cell1-conductor-0\" (UID: \"6ac2f2ad-2604-4972-896b-07e23c74ac8d\") " pod="openstack/nova-cell1-conductor-0" Dec 01 08:42:51 crc kubenswrapper[5004]: I1201 08:42:51.296498 5004 scope.go:117] "RemoveContainer" containerID="6334c3e913314b751db05ebfd38b595352ae3df0deb67fd3e8513e67bc61f3f3" Dec 01 08:42:51 crc kubenswrapper[5004]: I1201 08:42:51.343229 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 01 08:42:51 crc kubenswrapper[5004]: I1201 08:42:51.374225 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/855d076a-eacb-40d9-9135-8a1a3cf64f59-logs\") pod \"nova-metadata-0\" (UID: \"855d076a-eacb-40d9-9135-8a1a3cf64f59\") " pod="openstack/nova-metadata-0" Dec 01 08:42:51 crc kubenswrapper[5004]: I1201 08:42:51.374358 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qft5\" (UniqueName: \"kubernetes.io/projected/855d076a-eacb-40d9-9135-8a1a3cf64f59-kube-api-access-9qft5\") pod \"nova-metadata-0\" (UID: \"855d076a-eacb-40d9-9135-8a1a3cf64f59\") " pod="openstack/nova-metadata-0" Dec 01 08:42:51 crc kubenswrapper[5004]: I1201 08:42:51.374437 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/855d076a-eacb-40d9-9135-8a1a3cf64f59-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"855d076a-eacb-40d9-9135-8a1a3cf64f59\") " pod="openstack/nova-metadata-0" Dec 01 08:42:51 crc kubenswrapper[5004]: I1201 08:42:51.374496 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/855d076a-eacb-40d9-9135-8a1a3cf64f59-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"855d076a-eacb-40d9-9135-8a1a3cf64f59\") " pod="openstack/nova-metadata-0" Dec 01 08:42:51 crc kubenswrapper[5004]: I1201 08:42:51.374522 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/855d076a-eacb-40d9-9135-8a1a3cf64f59-config-data\") pod \"nova-metadata-0\" (UID: \"855d076a-eacb-40d9-9135-8a1a3cf64f59\") " pod="openstack/nova-metadata-0" Dec 01 08:42:51 crc kubenswrapper[5004]: I1201 08:42:51.375383 5004 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/855d076a-eacb-40d9-9135-8a1a3cf64f59-logs\") pod \"nova-metadata-0\" (UID: \"855d076a-eacb-40d9-9135-8a1a3cf64f59\") " pod="openstack/nova-metadata-0" Dec 01 08:42:51 crc kubenswrapper[5004]: I1201 08:42:51.379180 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/855d076a-eacb-40d9-9135-8a1a3cf64f59-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"855d076a-eacb-40d9-9135-8a1a3cf64f59\") " pod="openstack/nova-metadata-0" Dec 01 08:42:51 crc kubenswrapper[5004]: I1201 08:42:51.379189 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/855d076a-eacb-40d9-9135-8a1a3cf64f59-config-data\") pod \"nova-metadata-0\" (UID: \"855d076a-eacb-40d9-9135-8a1a3cf64f59\") " pod="openstack/nova-metadata-0" Dec 01 08:42:51 crc kubenswrapper[5004]: I1201 08:42:51.381339 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/855d076a-eacb-40d9-9135-8a1a3cf64f59-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"855d076a-eacb-40d9-9135-8a1a3cf64f59\") " pod="openstack/nova-metadata-0" Dec 01 08:42:51 crc kubenswrapper[5004]: I1201 08:42:51.396575 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qft5\" (UniqueName: \"kubernetes.io/projected/855d076a-eacb-40d9-9135-8a1a3cf64f59-kube-api-access-9qft5\") pod \"nova-metadata-0\" (UID: \"855d076a-eacb-40d9-9135-8a1a3cf64f59\") " pod="openstack/nova-metadata-0" Dec 01 08:42:51 crc kubenswrapper[5004]: I1201 08:42:51.652770 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 08:42:52 crc kubenswrapper[5004]: I1201 08:42:52.072130 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"6792a811-4156-4b0a-b6b4-6ff3280229d2","Type":"ContainerStarted","Data":"e789ac7dbb636b11509724961c9b1b5f2335aa9d96fb8d46c0eba9d201fadfd9"} Dec 01 08:42:52 crc kubenswrapper[5004]: I1201 08:42:52.072220 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="6792a811-4156-4b0a-b6b4-6ff3280229d2" containerName="aodh-api" containerID="cri-o://9ab23b6a648f9483fa5184cfcbb95b1a2b97c9bcd584bcdead439cfe601c29e3" gracePeriod=30 Dec 01 08:42:52 crc kubenswrapper[5004]: I1201 08:42:52.072237 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="6792a811-4156-4b0a-b6b4-6ff3280229d2" containerName="aodh-listener" containerID="cri-o://e789ac7dbb636b11509724961c9b1b5f2335aa9d96fb8d46c0eba9d201fadfd9" gracePeriod=30 Dec 01 08:42:52 crc kubenswrapper[5004]: I1201 08:42:52.072392 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="6792a811-4156-4b0a-b6b4-6ff3280229d2" containerName="aodh-evaluator" containerID="cri-o://2f13ef7709655a3d8d5548c58b0d41f1559b92eed686d7880bfceca816c21aad" gracePeriod=30 Dec 01 08:42:52 crc kubenswrapper[5004]: I1201 08:42:52.072433 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="6792a811-4156-4b0a-b6b4-6ff3280229d2" containerName="aodh-notifier" containerID="cri-o://3997889ffc5431fc2825ff310e60d0300319a23df2695880a877b33ad1ff73a7" gracePeriod=30 Dec 01 08:42:52 crc kubenswrapper[5004]: I1201 08:42:52.078079 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"948b1e6b-150b-44f9-8dc9-d6731e08f6e8","Type":"ContainerDied","Data":"ec843f516d26844c01c54fcc628d52a88f13f3343bc6af33cd66dc08e85e2202"} Dec 01 08:42:52 crc kubenswrapper[5004]: I1201 08:42:52.078297 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 01 08:42:52 crc kubenswrapper[5004]: I1201 08:42:52.112009 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=3.794458628 podStartE2EDuration="15.111990794s" podCreationTimestamp="2025-12-01 08:42:37 +0000 UTC" firstStartedPulling="2025-12-01 08:42:40.493339872 +0000 UTC m=+1538.058331854" lastFinishedPulling="2025-12-01 08:42:51.810872018 +0000 UTC m=+1549.375864020" observedRunningTime="2025-12-01 08:42:52.101724942 +0000 UTC m=+1549.666716964" watchObservedRunningTime="2025-12-01 08:42:52.111990794 +0000 UTC m=+1549.676982776" Dec 01 08:42:52 crc kubenswrapper[5004]: I1201 08:42:52.191641 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 08:42:52 crc kubenswrapper[5004]: I1201 08:42:52.217771 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 08:42:52 crc kubenswrapper[5004]: I1201 08:42:52.250626 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 08:42:52 crc kubenswrapper[5004]: I1201 08:42:52.252130 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 01 08:42:52 crc kubenswrapper[5004]: I1201 08:42:52.264669 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 01 08:42:52 crc kubenswrapper[5004]: I1201 08:42:52.270744 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 08:42:52 crc kubenswrapper[5004]: I1201 08:42:52.366399 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 01 08:42:52 crc kubenswrapper[5004]: I1201 08:42:52.404936 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrh9n\" (UniqueName: \"kubernetes.io/projected/d16020b5-7cf1-492f-b129-5054bcbd8427-kube-api-access-wrh9n\") pod \"nova-scheduler-0\" (UID: \"d16020b5-7cf1-492f-b129-5054bcbd8427\") " pod="openstack/nova-scheduler-0" Dec 01 08:42:52 crc kubenswrapper[5004]: I1201 08:42:52.404994 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d16020b5-7cf1-492f-b129-5054bcbd8427-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d16020b5-7cf1-492f-b129-5054bcbd8427\") " pod="openstack/nova-scheduler-0" Dec 01 08:42:52 crc kubenswrapper[5004]: I1201 08:42:52.405036 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d16020b5-7cf1-492f-b129-5054bcbd8427-config-data\") pod \"nova-scheduler-0\" (UID: \"d16020b5-7cf1-492f-b129-5054bcbd8427\") " pod="openstack/nova-scheduler-0" Dec 01 08:42:52 crc kubenswrapper[5004]: I1201 08:42:52.507029 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrh9n\" (UniqueName: \"kubernetes.io/projected/d16020b5-7cf1-492f-b129-5054bcbd8427-kube-api-access-wrh9n\") pod \"nova-scheduler-0\" 
(UID: \"d16020b5-7cf1-492f-b129-5054bcbd8427\") " pod="openstack/nova-scheduler-0" Dec 01 08:42:52 crc kubenswrapper[5004]: I1201 08:42:52.507289 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d16020b5-7cf1-492f-b129-5054bcbd8427-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d16020b5-7cf1-492f-b129-5054bcbd8427\") " pod="openstack/nova-scheduler-0" Dec 01 08:42:52 crc kubenswrapper[5004]: I1201 08:42:52.507328 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d16020b5-7cf1-492f-b129-5054bcbd8427-config-data\") pod \"nova-scheduler-0\" (UID: \"d16020b5-7cf1-492f-b129-5054bcbd8427\") " pod="openstack/nova-scheduler-0" Dec 01 08:42:52 crc kubenswrapper[5004]: I1201 08:42:52.518999 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 08:42:52 crc kubenswrapper[5004]: I1201 08:42:52.519206 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d16020b5-7cf1-492f-b129-5054bcbd8427-config-data\") pod \"nova-scheduler-0\" (UID: \"d16020b5-7cf1-492f-b129-5054bcbd8427\") " pod="openstack/nova-scheduler-0" Dec 01 08:42:52 crc kubenswrapper[5004]: I1201 08:42:52.520337 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d16020b5-7cf1-492f-b129-5054bcbd8427-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d16020b5-7cf1-492f-b129-5054bcbd8427\") " pod="openstack/nova-scheduler-0" Dec 01 08:42:52 crc kubenswrapper[5004]: I1201 08:42:52.547183 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrh9n\" (UniqueName: \"kubernetes.io/projected/d16020b5-7cf1-492f-b129-5054bcbd8427-kube-api-access-wrh9n\") pod \"nova-scheduler-0\" (UID: 
\"d16020b5-7cf1-492f-b129-5054bcbd8427\") " pod="openstack/nova-scheduler-0" Dec 01 08:42:52 crc kubenswrapper[5004]: W1201 08:42:52.585718 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod855d076a_eacb_40d9_9135_8a1a3cf64f59.slice/crio-d8d8a3dcc2c19acc24ec50d9ce59f972502927fcb112181c54be8a276f306af2 WatchSource:0}: Error finding container d8d8a3dcc2c19acc24ec50d9ce59f972502927fcb112181c54be8a276f306af2: Status 404 returned error can't find the container with id d8d8a3dcc2c19acc24ec50d9ce59f972502927fcb112181c54be8a276f306af2 Dec 01 08:42:52 crc kubenswrapper[5004]: I1201 08:42:52.653836 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 01 08:42:52 crc kubenswrapper[5004]: I1201 08:42:52.785167 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="948b1e6b-150b-44f9-8dc9-d6731e08f6e8" path="/var/lib/kubelet/pods/948b1e6b-150b-44f9-8dc9-d6731e08f6e8/volumes" Dec 01 08:42:52 crc kubenswrapper[5004]: I1201 08:42:52.786147 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0eebd6d-9db9-4e42-b0bf-e98a99f77f96" path="/var/lib/kubelet/pods/c0eebd6d-9db9-4e42-b0bf-e98a99f77f96/volumes" Dec 01 08:42:52 crc kubenswrapper[5004]: I1201 08:42:52.956627 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 01 08:42:53 crc kubenswrapper[5004]: I1201 08:42:53.021462 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ae06efb-c5ea-45b8-9c2a-c7c43dcca1c7-config-data\") pod \"9ae06efb-c5ea-45b8-9c2a-c7c43dcca1c7\" (UID: \"9ae06efb-c5ea-45b8-9c2a-c7c43dcca1c7\") " Dec 01 08:42:53 crc kubenswrapper[5004]: I1201 08:42:53.021735 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9ae06efb-c5ea-45b8-9c2a-c7c43dcca1c7-logs\") pod \"9ae06efb-c5ea-45b8-9c2a-c7c43dcca1c7\" (UID: \"9ae06efb-c5ea-45b8-9c2a-c7c43dcca1c7\") " Dec 01 08:42:53 crc kubenswrapper[5004]: I1201 08:42:53.021905 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cvs5w\" (UniqueName: \"kubernetes.io/projected/9ae06efb-c5ea-45b8-9c2a-c7c43dcca1c7-kube-api-access-cvs5w\") pod \"9ae06efb-c5ea-45b8-9c2a-c7c43dcca1c7\" (UID: \"9ae06efb-c5ea-45b8-9c2a-c7c43dcca1c7\") " Dec 01 08:42:53 crc kubenswrapper[5004]: I1201 08:42:53.022139 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ae06efb-c5ea-45b8-9c2a-c7c43dcca1c7-combined-ca-bundle\") pod \"9ae06efb-c5ea-45b8-9c2a-c7c43dcca1c7\" (UID: \"9ae06efb-c5ea-45b8-9c2a-c7c43dcca1c7\") " Dec 01 08:42:53 crc kubenswrapper[5004]: I1201 08:42:53.022175 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ae06efb-c5ea-45b8-9c2a-c7c43dcca1c7-logs" (OuterVolumeSpecName: "logs") pod "9ae06efb-c5ea-45b8-9c2a-c7c43dcca1c7" (UID: "9ae06efb-c5ea-45b8-9c2a-c7c43dcca1c7"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 08:42:53 crc kubenswrapper[5004]: I1201 08:42:53.022784 5004 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9ae06efb-c5ea-45b8-9c2a-c7c43dcca1c7-logs\") on node \"crc\" DevicePath \"\"" Dec 01 08:42:53 crc kubenswrapper[5004]: I1201 08:42:53.049791 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ae06efb-c5ea-45b8-9c2a-c7c43dcca1c7-kube-api-access-cvs5w" (OuterVolumeSpecName: "kube-api-access-cvs5w") pod "9ae06efb-c5ea-45b8-9c2a-c7c43dcca1c7" (UID: "9ae06efb-c5ea-45b8-9c2a-c7c43dcca1c7"). InnerVolumeSpecName "kube-api-access-cvs5w". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:42:53 crc kubenswrapper[5004]: I1201 08:42:53.088671 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ae06efb-c5ea-45b8-9c2a-c7c43dcca1c7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9ae06efb-c5ea-45b8-9c2a-c7c43dcca1c7" (UID: "9ae06efb-c5ea-45b8-9c2a-c7c43dcca1c7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:42:53 crc kubenswrapper[5004]: I1201 08:42:53.130575 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"855d076a-eacb-40d9-9135-8a1a3cf64f59","Type":"ContainerStarted","Data":"639a91cddffd8c36fa3b41966ba1065c7bd09ff518d776caad8ffe332c1198df"} Dec 01 08:42:53 crc kubenswrapper[5004]: I1201 08:42:53.130774 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"855d076a-eacb-40d9-9135-8a1a3cf64f59","Type":"ContainerStarted","Data":"d8d8a3dcc2c19acc24ec50d9ce59f972502927fcb112181c54be8a276f306af2"} Dec 01 08:42:53 crc kubenswrapper[5004]: I1201 08:42:53.135307 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cvs5w\" (UniqueName: \"kubernetes.io/projected/9ae06efb-c5ea-45b8-9c2a-c7c43dcca1c7-kube-api-access-cvs5w\") on node \"crc\" DevicePath \"\"" Dec 01 08:42:53 crc kubenswrapper[5004]: I1201 08:42:53.135333 5004 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ae06efb-c5ea-45b8-9c2a-c7c43dcca1c7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 08:42:53 crc kubenswrapper[5004]: I1201 08:42:53.136831 5004 generic.go:334] "Generic (PLEG): container finished" podID="6792a811-4156-4b0a-b6b4-6ff3280229d2" containerID="3997889ffc5431fc2825ff310e60d0300319a23df2695880a877b33ad1ff73a7" exitCode=0 Dec 01 08:42:53 crc kubenswrapper[5004]: I1201 08:42:53.136958 5004 generic.go:334] "Generic (PLEG): container finished" podID="6792a811-4156-4b0a-b6b4-6ff3280229d2" containerID="2f13ef7709655a3d8d5548c58b0d41f1559b92eed686d7880bfceca816c21aad" exitCode=0 Dec 01 08:42:53 crc kubenswrapper[5004]: I1201 08:42:53.137011 5004 generic.go:334] "Generic (PLEG): container finished" podID="6792a811-4156-4b0a-b6b4-6ff3280229d2" containerID="9ab23b6a648f9483fa5184cfcbb95b1a2b97c9bcd584bcdead439cfe601c29e3" exitCode=0 Dec 01 08:42:53 crc 
kubenswrapper[5004]: I1201 08:42:53.136935 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"6792a811-4156-4b0a-b6b4-6ff3280229d2","Type":"ContainerDied","Data":"3997889ffc5431fc2825ff310e60d0300319a23df2695880a877b33ad1ff73a7"} Dec 01 08:42:53 crc kubenswrapper[5004]: I1201 08:42:53.137165 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"6792a811-4156-4b0a-b6b4-6ff3280229d2","Type":"ContainerDied","Data":"2f13ef7709655a3d8d5548c58b0d41f1559b92eed686d7880bfceca816c21aad"} Dec 01 08:42:53 crc kubenswrapper[5004]: I1201 08:42:53.137221 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"6792a811-4156-4b0a-b6b4-6ff3280229d2","Type":"ContainerDied","Data":"9ab23b6a648f9483fa5184cfcbb95b1a2b97c9bcd584bcdead439cfe601c29e3"} Dec 01 08:42:53 crc kubenswrapper[5004]: I1201 08:42:53.140199 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"6ac2f2ad-2604-4972-896b-07e23c74ac8d","Type":"ContainerStarted","Data":"c81a40a7344f33addf5351a129f0e16e4aa9b1067defd89268c09747673b16e7"} Dec 01 08:42:53 crc kubenswrapper[5004]: I1201 08:42:53.140698 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Dec 01 08:42:53 crc kubenswrapper[5004]: I1201 08:42:53.140782 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"6ac2f2ad-2604-4972-896b-07e23c74ac8d","Type":"ContainerStarted","Data":"edb4ba76983b157ddf17713aea7132f9e834270e8b6b7741d02f827a37c8d8de"} Dec 01 08:42:53 crc kubenswrapper[5004]: I1201 08:42:53.150785 5004 generic.go:334] "Generic (PLEG): container finished" podID="9ae06efb-c5ea-45b8-9c2a-c7c43dcca1c7" containerID="31e1c0b0831e8b10a469f9a8414e345f4fc81d6a710a1a517964ea830fd4ef68" exitCode=0 Dec 01 08:42:53 crc kubenswrapper[5004]: I1201 08:42:53.150850 5004 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/nova-api-0" event={"ID":"9ae06efb-c5ea-45b8-9c2a-c7c43dcca1c7","Type":"ContainerDied","Data":"31e1c0b0831e8b10a469f9a8414e345f4fc81d6a710a1a517964ea830fd4ef68"} Dec 01 08:42:53 crc kubenswrapper[5004]: I1201 08:42:53.150874 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9ae06efb-c5ea-45b8-9c2a-c7c43dcca1c7","Type":"ContainerDied","Data":"969483b3e1bec5f7973065db919612f9e59be2aaa24abc5b667a973cd03a3366"} Dec 01 08:42:53 crc kubenswrapper[5004]: I1201 08:42:53.150891 5004 scope.go:117] "RemoveContainer" containerID="31e1c0b0831e8b10a469f9a8414e345f4fc81d6a710a1a517964ea830fd4ef68" Dec 01 08:42:53 crc kubenswrapper[5004]: I1201 08:42:53.151426 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 01 08:42:53 crc kubenswrapper[5004]: I1201 08:42:53.154697 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ae06efb-c5ea-45b8-9c2a-c7c43dcca1c7-config-data" (OuterVolumeSpecName: "config-data") pod "9ae06efb-c5ea-45b8-9c2a-c7c43dcca1c7" (UID: "9ae06efb-c5ea-45b8-9c2a-c7c43dcca1c7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:42:53 crc kubenswrapper[5004]: I1201 08:42:53.169737 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=3.16971506 podStartE2EDuration="3.16971506s" podCreationTimestamp="2025-12-01 08:42:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:42:53.158871054 +0000 UTC m=+1550.723863036" watchObservedRunningTime="2025-12-01 08:42:53.16971506 +0000 UTC m=+1550.734707042" Dec 01 08:42:53 crc kubenswrapper[5004]: I1201 08:42:53.172821 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7f55f02c-5371-4258-b5cf-9565b29798b9","Type":"ContainerStarted","Data":"408d4b7c490f8fa6f226e1183de84ca77e47a016ca539196c842aa42a405b28c"} Dec 01 08:42:53 crc kubenswrapper[5004]: I1201 08:42:53.172978 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7f55f02c-5371-4258-b5cf-9565b29798b9" containerName="ceilometer-central-agent" containerID="cri-o://7b2e343e254fa87c3ee0984676d24fda0f1fa1803053782002fc004659c8c7bc" gracePeriod=30 Dec 01 08:42:53 crc kubenswrapper[5004]: I1201 08:42:53.173012 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 01 08:42:53 crc kubenswrapper[5004]: I1201 08:42:53.173045 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7f55f02c-5371-4258-b5cf-9565b29798b9" containerName="proxy-httpd" containerID="cri-o://408d4b7c490f8fa6f226e1183de84ca77e47a016ca539196c842aa42a405b28c" gracePeriod=30 Dec 01 08:42:53 crc kubenswrapper[5004]: I1201 08:42:53.173076 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7f55f02c-5371-4258-b5cf-9565b29798b9" 
containerName="ceilometer-notification-agent" containerID="cri-o://42f9b588d70d3af67504f05f69f5fe67ff5311a5aff39254ca127ec5ef25cb05" gracePeriod=30 Dec 01 08:42:53 crc kubenswrapper[5004]: I1201 08:42:53.173063 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7f55f02c-5371-4258-b5cf-9565b29798b9" containerName="sg-core" containerID="cri-o://2e1530cdf061e5fb1b8209f839d146c3c7c4f1bfe7b77d7de57decf909fe873d" gracePeriod=30 Dec 01 08:42:53 crc kubenswrapper[5004]: I1201 08:42:53.210721 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=4.902868219 podStartE2EDuration="10.21069935s" podCreationTimestamp="2025-12-01 08:42:43 +0000 UTC" firstStartedPulling="2025-12-01 08:42:46.502402921 +0000 UTC m=+1544.067394903" lastFinishedPulling="2025-12-01 08:42:51.810234042 +0000 UTC m=+1549.375226034" observedRunningTime="2025-12-01 08:42:53.198904893 +0000 UTC m=+1550.763896885" watchObservedRunningTime="2025-12-01 08:42:53.21069935 +0000 UTC m=+1550.775691332" Dec 01 08:42:53 crc kubenswrapper[5004]: I1201 08:42:53.237700 5004 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ae06efb-c5ea-45b8-9c2a-c7c43dcca1c7-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 08:42:53 crc kubenswrapper[5004]: I1201 08:42:53.276029 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 08:42:53 crc kubenswrapper[5004]: I1201 08:42:53.280134 5004 scope.go:117] "RemoveContainer" containerID="5ac15de2db274c854f4ae19c9240c2c49ff5d606a1545de00284aab70223f4cf" Dec 01 08:42:53 crc kubenswrapper[5004]: I1201 08:42:53.434967 5004 scope.go:117] "RemoveContainer" containerID="31e1c0b0831e8b10a469f9a8414e345f4fc81d6a710a1a517964ea830fd4ef68" Dec 01 08:42:53 crc kubenswrapper[5004]: E1201 08:42:53.435529 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc 
error: code = NotFound desc = could not find container \"31e1c0b0831e8b10a469f9a8414e345f4fc81d6a710a1a517964ea830fd4ef68\": container with ID starting with 31e1c0b0831e8b10a469f9a8414e345f4fc81d6a710a1a517964ea830fd4ef68 not found: ID does not exist" containerID="31e1c0b0831e8b10a469f9a8414e345f4fc81d6a710a1a517964ea830fd4ef68" Dec 01 08:42:53 crc kubenswrapper[5004]: I1201 08:42:53.435572 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31e1c0b0831e8b10a469f9a8414e345f4fc81d6a710a1a517964ea830fd4ef68"} err="failed to get container status \"31e1c0b0831e8b10a469f9a8414e345f4fc81d6a710a1a517964ea830fd4ef68\": rpc error: code = NotFound desc = could not find container \"31e1c0b0831e8b10a469f9a8414e345f4fc81d6a710a1a517964ea830fd4ef68\": container with ID starting with 31e1c0b0831e8b10a469f9a8414e345f4fc81d6a710a1a517964ea830fd4ef68 not found: ID does not exist" Dec 01 08:42:53 crc kubenswrapper[5004]: I1201 08:42:53.435595 5004 scope.go:117] "RemoveContainer" containerID="5ac15de2db274c854f4ae19c9240c2c49ff5d606a1545de00284aab70223f4cf" Dec 01 08:42:53 crc kubenswrapper[5004]: E1201 08:42:53.436830 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ac15de2db274c854f4ae19c9240c2c49ff5d606a1545de00284aab70223f4cf\": container with ID starting with 5ac15de2db274c854f4ae19c9240c2c49ff5d606a1545de00284aab70223f4cf not found: ID does not exist" containerID="5ac15de2db274c854f4ae19c9240c2c49ff5d606a1545de00284aab70223f4cf" Dec 01 08:42:53 crc kubenswrapper[5004]: I1201 08:42:53.436883 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ac15de2db274c854f4ae19c9240c2c49ff5d606a1545de00284aab70223f4cf"} err="failed to get container status \"5ac15de2db274c854f4ae19c9240c2c49ff5d606a1545de00284aab70223f4cf\": rpc error: code = NotFound desc = could not find container 
\"5ac15de2db274c854f4ae19c9240c2c49ff5d606a1545de00284aab70223f4cf\": container with ID starting with 5ac15de2db274c854f4ae19c9240c2c49ff5d606a1545de00284aab70223f4cf not found: ID does not exist" Dec 01 08:42:53 crc kubenswrapper[5004]: I1201 08:42:53.510688 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 01 08:42:53 crc kubenswrapper[5004]: I1201 08:42:53.529294 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 01 08:42:53 crc kubenswrapper[5004]: I1201 08:42:53.534420 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 01 08:42:53 crc kubenswrapper[5004]: E1201 08:42:53.535132 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ae06efb-c5ea-45b8-9c2a-c7c43dcca1c7" containerName="nova-api-api" Dec 01 08:42:53 crc kubenswrapper[5004]: I1201 08:42:53.535224 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ae06efb-c5ea-45b8-9c2a-c7c43dcca1c7" containerName="nova-api-api" Dec 01 08:42:53 crc kubenswrapper[5004]: E1201 08:42:53.535315 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ae06efb-c5ea-45b8-9c2a-c7c43dcca1c7" containerName="nova-api-log" Dec 01 08:42:53 crc kubenswrapper[5004]: I1201 08:42:53.535364 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ae06efb-c5ea-45b8-9c2a-c7c43dcca1c7" containerName="nova-api-log" Dec 01 08:42:53 crc kubenswrapper[5004]: I1201 08:42:53.535696 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ae06efb-c5ea-45b8-9c2a-c7c43dcca1c7" containerName="nova-api-api" Dec 01 08:42:53 crc kubenswrapper[5004]: I1201 08:42:53.535796 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ae06efb-c5ea-45b8-9c2a-c7c43dcca1c7" containerName="nova-api-log" Dec 01 08:42:53 crc kubenswrapper[5004]: I1201 08:42:53.537235 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 01 08:42:53 crc kubenswrapper[5004]: I1201 08:42:53.541955 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 01 08:42:53 crc kubenswrapper[5004]: I1201 08:42:53.543519 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 01 08:42:53 crc kubenswrapper[5004]: I1201 08:42:53.545358 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7673aa49-60cb-427a-b089-42db27e176ee-logs\") pod \"nova-api-0\" (UID: \"7673aa49-60cb-427a-b089-42db27e176ee\") " pod="openstack/nova-api-0" Dec 01 08:42:53 crc kubenswrapper[5004]: I1201 08:42:53.545421 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7673aa49-60cb-427a-b089-42db27e176ee-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7673aa49-60cb-427a-b089-42db27e176ee\") " pod="openstack/nova-api-0" Dec 01 08:42:53 crc kubenswrapper[5004]: I1201 08:42:53.545446 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7673aa49-60cb-427a-b089-42db27e176ee-config-data\") pod \"nova-api-0\" (UID: \"7673aa49-60cb-427a-b089-42db27e176ee\") " pod="openstack/nova-api-0" Dec 01 08:42:53 crc kubenswrapper[5004]: I1201 08:42:53.545539 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5r24w\" (UniqueName: \"kubernetes.io/projected/7673aa49-60cb-427a-b089-42db27e176ee-kube-api-access-5r24w\") pod \"nova-api-0\" (UID: \"7673aa49-60cb-427a-b089-42db27e176ee\") " pod="openstack/nova-api-0" Dec 01 08:42:53 crc kubenswrapper[5004]: I1201 08:42:53.663028 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-5r24w\" (UniqueName: \"kubernetes.io/projected/7673aa49-60cb-427a-b089-42db27e176ee-kube-api-access-5r24w\") pod \"nova-api-0\" (UID: \"7673aa49-60cb-427a-b089-42db27e176ee\") " pod="openstack/nova-api-0" Dec 01 08:42:53 crc kubenswrapper[5004]: I1201 08:42:53.663219 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7673aa49-60cb-427a-b089-42db27e176ee-logs\") pod \"nova-api-0\" (UID: \"7673aa49-60cb-427a-b089-42db27e176ee\") " pod="openstack/nova-api-0" Dec 01 08:42:53 crc kubenswrapper[5004]: I1201 08:42:53.663292 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7673aa49-60cb-427a-b089-42db27e176ee-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7673aa49-60cb-427a-b089-42db27e176ee\") " pod="openstack/nova-api-0" Dec 01 08:42:53 crc kubenswrapper[5004]: I1201 08:42:53.663330 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7673aa49-60cb-427a-b089-42db27e176ee-config-data\") pod \"nova-api-0\" (UID: \"7673aa49-60cb-427a-b089-42db27e176ee\") " pod="openstack/nova-api-0" Dec 01 08:42:53 crc kubenswrapper[5004]: I1201 08:42:53.665085 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7673aa49-60cb-427a-b089-42db27e176ee-logs\") pod \"nova-api-0\" (UID: \"7673aa49-60cb-427a-b089-42db27e176ee\") " pod="openstack/nova-api-0" Dec 01 08:42:53 crc kubenswrapper[5004]: I1201 08:42:53.668500 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7673aa49-60cb-427a-b089-42db27e176ee-config-data\") pod \"nova-api-0\" (UID: \"7673aa49-60cb-427a-b089-42db27e176ee\") " pod="openstack/nova-api-0" Dec 01 08:42:53 crc kubenswrapper[5004]: I1201 08:42:53.672260 5004 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7673aa49-60cb-427a-b089-42db27e176ee-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7673aa49-60cb-427a-b089-42db27e176ee\") " pod="openstack/nova-api-0" Dec 01 08:42:53 crc kubenswrapper[5004]: I1201 08:42:53.685208 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5r24w\" (UniqueName: \"kubernetes.io/projected/7673aa49-60cb-427a-b089-42db27e176ee-kube-api-access-5r24w\") pod \"nova-api-0\" (UID: \"7673aa49-60cb-427a-b089-42db27e176ee\") " pod="openstack/nova-api-0" Dec 01 08:42:53 crc kubenswrapper[5004]: I1201 08:42:53.873418 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 01 08:42:54 crc kubenswrapper[5004]: I1201 08:42:54.185258 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"855d076a-eacb-40d9-9135-8a1a3cf64f59","Type":"ContainerStarted","Data":"d8a63febb8a8556f14ad5f947b65573a3ffe986bedf429596c00b97ca667ffca"} Dec 01 08:42:54 crc kubenswrapper[5004]: I1201 08:42:54.186894 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d16020b5-7cf1-492f-b129-5054bcbd8427","Type":"ContainerStarted","Data":"6827a7e54d43df41de9a37d9daa31f6e85299558d37f5eff5f15ec59ef888bc7"} Dec 01 08:42:54 crc kubenswrapper[5004]: I1201 08:42:54.186917 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d16020b5-7cf1-492f-b129-5054bcbd8427","Type":"ContainerStarted","Data":"fdf1098de1e0e3ed52fdefe45d17e365f2230bad4f48833ffaf950a19b667815"} Dec 01 08:42:54 crc kubenswrapper[5004]: I1201 08:42:54.191922 5004 generic.go:334] "Generic (PLEG): container finished" podID="7f55f02c-5371-4258-b5cf-9565b29798b9" containerID="408d4b7c490f8fa6f226e1183de84ca77e47a016ca539196c842aa42a405b28c" exitCode=0 Dec 01 08:42:54 crc 
kubenswrapper[5004]: I1201 08:42:54.191948 5004 generic.go:334] "Generic (PLEG): container finished" podID="7f55f02c-5371-4258-b5cf-9565b29798b9" containerID="2e1530cdf061e5fb1b8209f839d146c3c7c4f1bfe7b77d7de57decf909fe873d" exitCode=2 Dec 01 08:42:54 crc kubenswrapper[5004]: I1201 08:42:54.191959 5004 generic.go:334] "Generic (PLEG): container finished" podID="7f55f02c-5371-4258-b5cf-9565b29798b9" containerID="42f9b588d70d3af67504f05f69f5fe67ff5311a5aff39254ca127ec5ef25cb05" exitCode=0 Dec 01 08:42:54 crc kubenswrapper[5004]: I1201 08:42:54.192156 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7f55f02c-5371-4258-b5cf-9565b29798b9","Type":"ContainerDied","Data":"408d4b7c490f8fa6f226e1183de84ca77e47a016ca539196c842aa42a405b28c"} Dec 01 08:42:54 crc kubenswrapper[5004]: I1201 08:42:54.192209 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7f55f02c-5371-4258-b5cf-9565b29798b9","Type":"ContainerDied","Data":"2e1530cdf061e5fb1b8209f839d146c3c7c4f1bfe7b77d7de57decf909fe873d"} Dec 01 08:42:54 crc kubenswrapper[5004]: I1201 08:42:54.192223 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7f55f02c-5371-4258-b5cf-9565b29798b9","Type":"ContainerDied","Data":"42f9b588d70d3af67504f05f69f5fe67ff5311a5aff39254ca127ec5ef25cb05"} Dec 01 08:42:54 crc kubenswrapper[5004]: I1201 08:42:54.237405 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=4.237382459 podStartE2EDuration="4.237382459s" podCreationTimestamp="2025-12-01 08:42:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:42:54.203869461 +0000 UTC m=+1551.768861493" watchObservedRunningTime="2025-12-01 08:42:54.237382459 +0000 UTC m=+1551.802374461" Dec 01 08:42:54 crc kubenswrapper[5004]: I1201 08:42:54.245194 
5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.245177489 podStartE2EDuration="2.245177489s" podCreationTimestamp="2025-12-01 08:42:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:42:54.225225522 +0000 UTC m=+1551.790217504" watchObservedRunningTime="2025-12-01 08:42:54.245177489 +0000 UTC m=+1551.810169471" Dec 01 08:42:54 crc kubenswrapper[5004]: I1201 08:42:54.351017 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 01 08:42:54 crc kubenswrapper[5004]: I1201 08:42:54.773485 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ae06efb-c5ea-45b8-9c2a-c7c43dcca1c7" path="/var/lib/kubelet/pods/9ae06efb-c5ea-45b8-9c2a-c7c43dcca1c7/volumes" Dec 01 08:42:55 crc kubenswrapper[5004]: I1201 08:42:55.204461 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7673aa49-60cb-427a-b089-42db27e176ee","Type":"ContainerStarted","Data":"97088024db1f02e2cfefe40f95b46e731ecfdf607491dd4db34ba16d64e22e64"} Dec 01 08:42:55 crc kubenswrapper[5004]: I1201 08:42:55.206028 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7673aa49-60cb-427a-b089-42db27e176ee","Type":"ContainerStarted","Data":"60706ff456d9fc6048e6bf6d99347395c9dc671747d43be8126c7a9cb0e22c40"} Dec 01 08:42:55 crc kubenswrapper[5004]: I1201 08:42:55.206065 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7673aa49-60cb-427a-b089-42db27e176ee","Type":"ContainerStarted","Data":"71891789afc64ef3e2a0482dad7386a08cf6dd2ba50b75f23ecbdf12e89f7114"} Dec 01 08:42:55 crc kubenswrapper[5004]: I1201 08:42:55.896995 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-svs2q" Dec 01 08:42:55 crc 
kubenswrapper[5004]: I1201 08:42:55.932434 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.932392242 podStartE2EDuration="2.932392242s" podCreationTimestamp="2025-12-01 08:42:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:42:55.225701511 +0000 UTC m=+1552.790693503" watchObservedRunningTime="2025-12-01 08:42:55.932392242 +0000 UTC m=+1553.497384264" Dec 01 08:42:55 crc kubenswrapper[5004]: I1201 08:42:55.965329 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-svs2q" Dec 01 08:42:56 crc kubenswrapper[5004]: I1201 08:42:56.159933 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-svs2q"] Dec 01 08:42:56 crc kubenswrapper[5004]: I1201 08:42:56.652338 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 01 08:42:56 crc kubenswrapper[5004]: I1201 08:42:56.653304 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 01 08:42:56 crc kubenswrapper[5004]: I1201 08:42:56.924490 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 01 08:42:57 crc kubenswrapper[5004]: I1201 08:42:57.040995 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7f55f02c-5371-4258-b5cf-9565b29798b9-sg-core-conf-yaml\") pod \"7f55f02c-5371-4258-b5cf-9565b29798b9\" (UID: \"7f55f02c-5371-4258-b5cf-9565b29798b9\") " Dec 01 08:42:57 crc kubenswrapper[5004]: I1201 08:42:57.041098 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-thh9b\" (UniqueName: \"kubernetes.io/projected/7f55f02c-5371-4258-b5cf-9565b29798b9-kube-api-access-thh9b\") pod \"7f55f02c-5371-4258-b5cf-9565b29798b9\" (UID: \"7f55f02c-5371-4258-b5cf-9565b29798b9\") " Dec 01 08:42:57 crc kubenswrapper[5004]: I1201 08:42:57.041176 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f55f02c-5371-4258-b5cf-9565b29798b9-scripts\") pod \"7f55f02c-5371-4258-b5cf-9565b29798b9\" (UID: \"7f55f02c-5371-4258-b5cf-9565b29798b9\") " Dec 01 08:42:57 crc kubenswrapper[5004]: I1201 08:42:57.041283 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f55f02c-5371-4258-b5cf-9565b29798b9-combined-ca-bundle\") pod \"7f55f02c-5371-4258-b5cf-9565b29798b9\" (UID: \"7f55f02c-5371-4258-b5cf-9565b29798b9\") " Dec 01 08:42:57 crc kubenswrapper[5004]: I1201 08:42:57.041427 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7f55f02c-5371-4258-b5cf-9565b29798b9-log-httpd\") pod \"7f55f02c-5371-4258-b5cf-9565b29798b9\" (UID: \"7f55f02c-5371-4258-b5cf-9565b29798b9\") " Dec 01 08:42:57 crc kubenswrapper[5004]: I1201 08:42:57.041512 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/7f55f02c-5371-4258-b5cf-9565b29798b9-config-data\") pod \"7f55f02c-5371-4258-b5cf-9565b29798b9\" (UID: \"7f55f02c-5371-4258-b5cf-9565b29798b9\") " Dec 01 08:42:57 crc kubenswrapper[5004]: I1201 08:42:57.041554 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7f55f02c-5371-4258-b5cf-9565b29798b9-run-httpd\") pod \"7f55f02c-5371-4258-b5cf-9565b29798b9\" (UID: \"7f55f02c-5371-4258-b5cf-9565b29798b9\") " Dec 01 08:42:57 crc kubenswrapper[5004]: I1201 08:42:57.042077 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f55f02c-5371-4258-b5cf-9565b29798b9-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "7f55f02c-5371-4258-b5cf-9565b29798b9" (UID: "7f55f02c-5371-4258-b5cf-9565b29798b9"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 08:42:57 crc kubenswrapper[5004]: I1201 08:42:57.042337 5004 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7f55f02c-5371-4258-b5cf-9565b29798b9-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 01 08:42:57 crc kubenswrapper[5004]: I1201 08:42:57.042522 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f55f02c-5371-4258-b5cf-9565b29798b9-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "7f55f02c-5371-4258-b5cf-9565b29798b9" (UID: "7f55f02c-5371-4258-b5cf-9565b29798b9"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 08:42:57 crc kubenswrapper[5004]: I1201 08:42:57.047731 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f55f02c-5371-4258-b5cf-9565b29798b9-scripts" (OuterVolumeSpecName: "scripts") pod "7f55f02c-5371-4258-b5cf-9565b29798b9" (UID: "7f55f02c-5371-4258-b5cf-9565b29798b9"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:42:57 crc kubenswrapper[5004]: I1201 08:42:57.049032 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f55f02c-5371-4258-b5cf-9565b29798b9-kube-api-access-thh9b" (OuterVolumeSpecName: "kube-api-access-thh9b") pod "7f55f02c-5371-4258-b5cf-9565b29798b9" (UID: "7f55f02c-5371-4258-b5cf-9565b29798b9"). InnerVolumeSpecName "kube-api-access-thh9b". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:42:57 crc kubenswrapper[5004]: I1201 08:42:57.081259 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f55f02c-5371-4258-b5cf-9565b29798b9-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "7f55f02c-5371-4258-b5cf-9565b29798b9" (UID: "7f55f02c-5371-4258-b5cf-9565b29798b9"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:42:57 crc kubenswrapper[5004]: I1201 08:42:57.143644 5004 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7f55f02c-5371-4258-b5cf-9565b29798b9-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 01 08:42:57 crc kubenswrapper[5004]: I1201 08:42:57.143671 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-thh9b\" (UniqueName: \"kubernetes.io/projected/7f55f02c-5371-4258-b5cf-9565b29798b9-kube-api-access-thh9b\") on node \"crc\" DevicePath \"\"" Dec 01 08:42:57 crc kubenswrapper[5004]: I1201 08:42:57.143681 5004 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f55f02c-5371-4258-b5cf-9565b29798b9-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 08:42:57 crc kubenswrapper[5004]: I1201 08:42:57.143692 5004 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7f55f02c-5371-4258-b5cf-9565b29798b9-run-httpd\") on node 
\"crc\" DevicePath \"\"" Dec 01 08:42:57 crc kubenswrapper[5004]: I1201 08:42:57.150406 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f55f02c-5371-4258-b5cf-9565b29798b9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7f55f02c-5371-4258-b5cf-9565b29798b9" (UID: "7f55f02c-5371-4258-b5cf-9565b29798b9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:42:57 crc kubenswrapper[5004]: I1201 08:42:57.223740 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f55f02c-5371-4258-b5cf-9565b29798b9-config-data" (OuterVolumeSpecName: "config-data") pod "7f55f02c-5371-4258-b5cf-9565b29798b9" (UID: "7f55f02c-5371-4258-b5cf-9565b29798b9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:42:57 crc kubenswrapper[5004]: I1201 08:42:57.239065 5004 generic.go:334] "Generic (PLEG): container finished" podID="7f55f02c-5371-4258-b5cf-9565b29798b9" containerID="7b2e343e254fa87c3ee0984676d24fda0f1fa1803053782002fc004659c8c7bc" exitCode=0 Dec 01 08:42:57 crc kubenswrapper[5004]: I1201 08:42:57.240314 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 01 08:42:57 crc kubenswrapper[5004]: I1201 08:42:57.240773 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7f55f02c-5371-4258-b5cf-9565b29798b9","Type":"ContainerDied","Data":"7b2e343e254fa87c3ee0984676d24fda0f1fa1803053782002fc004659c8c7bc"} Dec 01 08:42:57 crc kubenswrapper[5004]: I1201 08:42:57.240851 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-svs2q" podUID="ac4a2842-3b0b-45d4-b400-ce7be95df717" containerName="registry-server" containerID="cri-o://528c2d1956dfcbfef6946d3280cde4c12c86ac089d5eb27229f063d41fcb827e" gracePeriod=2 Dec 01 08:42:57 crc kubenswrapper[5004]: I1201 08:42:57.245738 5004 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f55f02c-5371-4258-b5cf-9565b29798b9-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 08:42:57 crc kubenswrapper[5004]: I1201 08:42:57.245763 5004 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f55f02c-5371-4258-b5cf-9565b29798b9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 08:42:57 crc kubenswrapper[5004]: I1201 08:42:57.248316 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7f55f02c-5371-4258-b5cf-9565b29798b9","Type":"ContainerDied","Data":"b4cecaeb7fc761e8c8b511ee1a5f16d8dcafcbce8eb0d59b501027a28c49dab7"} Dec 01 08:42:57 crc kubenswrapper[5004]: I1201 08:42:57.248388 5004 scope.go:117] "RemoveContainer" containerID="408d4b7c490f8fa6f226e1183de84ca77e47a016ca539196c842aa42a405b28c" Dec 01 08:42:57 crc kubenswrapper[5004]: I1201 08:42:57.281966 5004 scope.go:117] "RemoveContainer" containerID="2e1530cdf061e5fb1b8209f839d146c3c7c4f1bfe7b77d7de57decf909fe873d" Dec 01 08:42:57 crc kubenswrapper[5004]: I1201 08:42:57.297200 5004 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 08:42:57 crc kubenswrapper[5004]: I1201 08:42:57.317796 5004 scope.go:117] "RemoveContainer" containerID="42f9b588d70d3af67504f05f69f5fe67ff5311a5aff39254ca127ec5ef25cb05" Dec 01 08:42:57 crc kubenswrapper[5004]: I1201 08:42:57.327420 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 01 08:42:57 crc kubenswrapper[5004]: I1201 08:42:57.338368 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 01 08:42:57 crc kubenswrapper[5004]: E1201 08:42:57.338919 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f55f02c-5371-4258-b5cf-9565b29798b9" containerName="proxy-httpd" Dec 01 08:42:57 crc kubenswrapper[5004]: I1201 08:42:57.338938 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f55f02c-5371-4258-b5cf-9565b29798b9" containerName="proxy-httpd" Dec 01 08:42:57 crc kubenswrapper[5004]: E1201 08:42:57.338961 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f55f02c-5371-4258-b5cf-9565b29798b9" containerName="ceilometer-notification-agent" Dec 01 08:42:57 crc kubenswrapper[5004]: I1201 08:42:57.338969 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f55f02c-5371-4258-b5cf-9565b29798b9" containerName="ceilometer-notification-agent" Dec 01 08:42:57 crc kubenswrapper[5004]: E1201 08:42:57.338991 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f55f02c-5371-4258-b5cf-9565b29798b9" containerName="ceilometer-central-agent" Dec 01 08:42:57 crc kubenswrapper[5004]: I1201 08:42:57.338997 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f55f02c-5371-4258-b5cf-9565b29798b9" containerName="ceilometer-central-agent" Dec 01 08:42:57 crc kubenswrapper[5004]: E1201 08:42:57.339024 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f55f02c-5371-4258-b5cf-9565b29798b9" containerName="sg-core" Dec 01 08:42:57 crc 
kubenswrapper[5004]: I1201 08:42:57.339031 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f55f02c-5371-4258-b5cf-9565b29798b9" containerName="sg-core" Dec 01 08:42:57 crc kubenswrapper[5004]: I1201 08:42:57.339252 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f55f02c-5371-4258-b5cf-9565b29798b9" containerName="ceilometer-central-agent" Dec 01 08:42:57 crc kubenswrapper[5004]: I1201 08:42:57.339270 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f55f02c-5371-4258-b5cf-9565b29798b9" containerName="ceilometer-notification-agent" Dec 01 08:42:57 crc kubenswrapper[5004]: I1201 08:42:57.339292 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f55f02c-5371-4258-b5cf-9565b29798b9" containerName="sg-core" Dec 01 08:42:57 crc kubenswrapper[5004]: I1201 08:42:57.339307 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f55f02c-5371-4258-b5cf-9565b29798b9" containerName="proxy-httpd" Dec 01 08:42:57 crc kubenswrapper[5004]: I1201 08:42:57.341440 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 01 08:42:57 crc kubenswrapper[5004]: I1201 08:42:57.343504 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 01 08:42:57 crc kubenswrapper[5004]: I1201 08:42:57.343701 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 01 08:42:57 crc kubenswrapper[5004]: I1201 08:42:57.353375 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 08:42:57 crc kubenswrapper[5004]: I1201 08:42:57.449754 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4772\" (UniqueName: \"kubernetes.io/projected/cf129260-3776-4aef-908f-17defb18949d-kube-api-access-x4772\") pod \"ceilometer-0\" (UID: \"cf129260-3776-4aef-908f-17defb18949d\") " pod="openstack/ceilometer-0" Dec 01 08:42:57 crc kubenswrapper[5004]: I1201 08:42:57.449794 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf129260-3776-4aef-908f-17defb18949d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cf129260-3776-4aef-908f-17defb18949d\") " pod="openstack/ceilometer-0" Dec 01 08:42:57 crc kubenswrapper[5004]: I1201 08:42:57.449816 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf129260-3776-4aef-908f-17defb18949d-config-data\") pod \"ceilometer-0\" (UID: \"cf129260-3776-4aef-908f-17defb18949d\") " pod="openstack/ceilometer-0" Dec 01 08:42:57 crc kubenswrapper[5004]: I1201 08:42:57.449850 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cf129260-3776-4aef-908f-17defb18949d-log-httpd\") pod \"ceilometer-0\" (UID: 
\"cf129260-3776-4aef-908f-17defb18949d\") " pod="openstack/ceilometer-0" Dec 01 08:42:57 crc kubenswrapper[5004]: I1201 08:42:57.449875 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf129260-3776-4aef-908f-17defb18949d-scripts\") pod \"ceilometer-0\" (UID: \"cf129260-3776-4aef-908f-17defb18949d\") " pod="openstack/ceilometer-0" Dec 01 08:42:57 crc kubenswrapper[5004]: I1201 08:42:57.449930 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cf129260-3776-4aef-908f-17defb18949d-run-httpd\") pod \"ceilometer-0\" (UID: \"cf129260-3776-4aef-908f-17defb18949d\") " pod="openstack/ceilometer-0" Dec 01 08:42:57 crc kubenswrapper[5004]: I1201 08:42:57.450033 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cf129260-3776-4aef-908f-17defb18949d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cf129260-3776-4aef-908f-17defb18949d\") " pod="openstack/ceilometer-0" Dec 01 08:42:57 crc kubenswrapper[5004]: I1201 08:42:57.496827 5004 scope.go:117] "RemoveContainer" containerID="7b2e343e254fa87c3ee0984676d24fda0f1fa1803053782002fc004659c8c7bc" Dec 01 08:42:57 crc kubenswrapper[5004]: I1201 08:42:57.518335 5004 scope.go:117] "RemoveContainer" containerID="408d4b7c490f8fa6f226e1183de84ca77e47a016ca539196c842aa42a405b28c" Dec 01 08:42:57 crc kubenswrapper[5004]: E1201 08:42:57.518755 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"408d4b7c490f8fa6f226e1183de84ca77e47a016ca539196c842aa42a405b28c\": container with ID starting with 408d4b7c490f8fa6f226e1183de84ca77e47a016ca539196c842aa42a405b28c not found: ID does not exist" containerID="408d4b7c490f8fa6f226e1183de84ca77e47a016ca539196c842aa42a405b28c" 
Dec 01 08:42:57 crc kubenswrapper[5004]: I1201 08:42:57.518781 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"408d4b7c490f8fa6f226e1183de84ca77e47a016ca539196c842aa42a405b28c"} err="failed to get container status \"408d4b7c490f8fa6f226e1183de84ca77e47a016ca539196c842aa42a405b28c\": rpc error: code = NotFound desc = could not find container \"408d4b7c490f8fa6f226e1183de84ca77e47a016ca539196c842aa42a405b28c\": container with ID starting with 408d4b7c490f8fa6f226e1183de84ca77e47a016ca539196c842aa42a405b28c not found: ID does not exist" Dec 01 08:42:57 crc kubenswrapper[5004]: I1201 08:42:57.518801 5004 scope.go:117] "RemoveContainer" containerID="2e1530cdf061e5fb1b8209f839d146c3c7c4f1bfe7b77d7de57decf909fe873d" Dec 01 08:42:57 crc kubenswrapper[5004]: E1201 08:42:57.519049 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e1530cdf061e5fb1b8209f839d146c3c7c4f1bfe7b77d7de57decf909fe873d\": container with ID starting with 2e1530cdf061e5fb1b8209f839d146c3c7c4f1bfe7b77d7de57decf909fe873d not found: ID does not exist" containerID="2e1530cdf061e5fb1b8209f839d146c3c7c4f1bfe7b77d7de57decf909fe873d" Dec 01 08:42:57 crc kubenswrapper[5004]: I1201 08:42:57.519069 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e1530cdf061e5fb1b8209f839d146c3c7c4f1bfe7b77d7de57decf909fe873d"} err="failed to get container status \"2e1530cdf061e5fb1b8209f839d146c3c7c4f1bfe7b77d7de57decf909fe873d\": rpc error: code = NotFound desc = could not find container \"2e1530cdf061e5fb1b8209f839d146c3c7c4f1bfe7b77d7de57decf909fe873d\": container with ID starting with 2e1530cdf061e5fb1b8209f839d146c3c7c4f1bfe7b77d7de57decf909fe873d not found: ID does not exist" Dec 01 08:42:57 crc kubenswrapper[5004]: I1201 08:42:57.519083 5004 scope.go:117] "RemoveContainer" 
containerID="42f9b588d70d3af67504f05f69f5fe67ff5311a5aff39254ca127ec5ef25cb05" Dec 01 08:42:57 crc kubenswrapper[5004]: E1201 08:42:57.519331 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42f9b588d70d3af67504f05f69f5fe67ff5311a5aff39254ca127ec5ef25cb05\": container with ID starting with 42f9b588d70d3af67504f05f69f5fe67ff5311a5aff39254ca127ec5ef25cb05 not found: ID does not exist" containerID="42f9b588d70d3af67504f05f69f5fe67ff5311a5aff39254ca127ec5ef25cb05" Dec 01 08:42:57 crc kubenswrapper[5004]: I1201 08:42:57.519382 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42f9b588d70d3af67504f05f69f5fe67ff5311a5aff39254ca127ec5ef25cb05"} err="failed to get container status \"42f9b588d70d3af67504f05f69f5fe67ff5311a5aff39254ca127ec5ef25cb05\": rpc error: code = NotFound desc = could not find container \"42f9b588d70d3af67504f05f69f5fe67ff5311a5aff39254ca127ec5ef25cb05\": container with ID starting with 42f9b588d70d3af67504f05f69f5fe67ff5311a5aff39254ca127ec5ef25cb05 not found: ID does not exist" Dec 01 08:42:57 crc kubenswrapper[5004]: I1201 08:42:57.519410 5004 scope.go:117] "RemoveContainer" containerID="7b2e343e254fa87c3ee0984676d24fda0f1fa1803053782002fc004659c8c7bc" Dec 01 08:42:57 crc kubenswrapper[5004]: E1201 08:42:57.519791 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b2e343e254fa87c3ee0984676d24fda0f1fa1803053782002fc004659c8c7bc\": container with ID starting with 7b2e343e254fa87c3ee0984676d24fda0f1fa1803053782002fc004659c8c7bc not found: ID does not exist" containerID="7b2e343e254fa87c3ee0984676d24fda0f1fa1803053782002fc004659c8c7bc" Dec 01 08:42:57 crc kubenswrapper[5004]: I1201 08:42:57.519812 5004 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"7b2e343e254fa87c3ee0984676d24fda0f1fa1803053782002fc004659c8c7bc"} err="failed to get container status \"7b2e343e254fa87c3ee0984676d24fda0f1fa1803053782002fc004659c8c7bc\": rpc error: code = NotFound desc = could not find container \"7b2e343e254fa87c3ee0984676d24fda0f1fa1803053782002fc004659c8c7bc\": container with ID starting with 7b2e343e254fa87c3ee0984676d24fda0f1fa1803053782002fc004659c8c7bc not found: ID does not exist" Dec 01 08:42:57 crc kubenswrapper[5004]: I1201 08:42:57.552115 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cf129260-3776-4aef-908f-17defb18949d-run-httpd\") pod \"ceilometer-0\" (UID: \"cf129260-3776-4aef-908f-17defb18949d\") " pod="openstack/ceilometer-0" Dec 01 08:42:57 crc kubenswrapper[5004]: I1201 08:42:57.552216 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cf129260-3776-4aef-908f-17defb18949d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cf129260-3776-4aef-908f-17defb18949d\") " pod="openstack/ceilometer-0" Dec 01 08:42:57 crc kubenswrapper[5004]: I1201 08:42:57.552288 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4772\" (UniqueName: \"kubernetes.io/projected/cf129260-3776-4aef-908f-17defb18949d-kube-api-access-x4772\") pod \"ceilometer-0\" (UID: \"cf129260-3776-4aef-908f-17defb18949d\") " pod="openstack/ceilometer-0" Dec 01 08:42:57 crc kubenswrapper[5004]: I1201 08:42:57.552309 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf129260-3776-4aef-908f-17defb18949d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cf129260-3776-4aef-908f-17defb18949d\") " pod="openstack/ceilometer-0" Dec 01 08:42:57 crc kubenswrapper[5004]: I1201 08:42:57.552323 5004 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf129260-3776-4aef-908f-17defb18949d-config-data\") pod \"ceilometer-0\" (UID: \"cf129260-3776-4aef-908f-17defb18949d\") " pod="openstack/ceilometer-0" Dec 01 08:42:57 crc kubenswrapper[5004]: I1201 08:42:57.552353 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cf129260-3776-4aef-908f-17defb18949d-log-httpd\") pod \"ceilometer-0\" (UID: \"cf129260-3776-4aef-908f-17defb18949d\") " pod="openstack/ceilometer-0" Dec 01 08:42:57 crc kubenswrapper[5004]: I1201 08:42:57.552380 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf129260-3776-4aef-908f-17defb18949d-scripts\") pod \"ceilometer-0\" (UID: \"cf129260-3776-4aef-908f-17defb18949d\") " pod="openstack/ceilometer-0" Dec 01 08:42:57 crc kubenswrapper[5004]: I1201 08:42:57.553470 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cf129260-3776-4aef-908f-17defb18949d-run-httpd\") pod \"ceilometer-0\" (UID: \"cf129260-3776-4aef-908f-17defb18949d\") " pod="openstack/ceilometer-0" Dec 01 08:42:57 crc kubenswrapper[5004]: I1201 08:42:57.553470 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cf129260-3776-4aef-908f-17defb18949d-log-httpd\") pod \"ceilometer-0\" (UID: \"cf129260-3776-4aef-908f-17defb18949d\") " pod="openstack/ceilometer-0" Dec 01 08:42:57 crc kubenswrapper[5004]: I1201 08:42:57.558454 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf129260-3776-4aef-908f-17defb18949d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cf129260-3776-4aef-908f-17defb18949d\") " pod="openstack/ceilometer-0" Dec 01 08:42:57 crc 
kubenswrapper[5004]: I1201 08:42:57.558986 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf129260-3776-4aef-908f-17defb18949d-scripts\") pod \"ceilometer-0\" (UID: \"cf129260-3776-4aef-908f-17defb18949d\") " pod="openstack/ceilometer-0" Dec 01 08:42:57 crc kubenswrapper[5004]: I1201 08:42:57.569111 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf129260-3776-4aef-908f-17defb18949d-config-data\") pod \"ceilometer-0\" (UID: \"cf129260-3776-4aef-908f-17defb18949d\") " pod="openstack/ceilometer-0" Dec 01 08:42:57 crc kubenswrapper[5004]: I1201 08:42:57.574763 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cf129260-3776-4aef-908f-17defb18949d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cf129260-3776-4aef-908f-17defb18949d\") " pod="openstack/ceilometer-0" Dec 01 08:42:57 crc kubenswrapper[5004]: I1201 08:42:57.575692 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4772\" (UniqueName: \"kubernetes.io/projected/cf129260-3776-4aef-908f-17defb18949d-kube-api-access-x4772\") pod \"ceilometer-0\" (UID: \"cf129260-3776-4aef-908f-17defb18949d\") " pod="openstack/ceilometer-0" Dec 01 08:42:57 crc kubenswrapper[5004]: I1201 08:42:57.655633 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 01 08:42:57 crc kubenswrapper[5004]: I1201 08:42:57.796785 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 08:42:57 crc kubenswrapper[5004]: I1201 08:42:57.848134 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-svs2q" Dec 01 08:42:57 crc kubenswrapper[5004]: I1201 08:42:57.960475 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t7cwn\" (UniqueName: \"kubernetes.io/projected/ac4a2842-3b0b-45d4-b400-ce7be95df717-kube-api-access-t7cwn\") pod \"ac4a2842-3b0b-45d4-b400-ce7be95df717\" (UID: \"ac4a2842-3b0b-45d4-b400-ce7be95df717\") " Dec 01 08:42:57 crc kubenswrapper[5004]: I1201 08:42:57.960647 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac4a2842-3b0b-45d4-b400-ce7be95df717-catalog-content\") pod \"ac4a2842-3b0b-45d4-b400-ce7be95df717\" (UID: \"ac4a2842-3b0b-45d4-b400-ce7be95df717\") " Dec 01 08:42:57 crc kubenswrapper[5004]: I1201 08:42:57.960702 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac4a2842-3b0b-45d4-b400-ce7be95df717-utilities\") pod \"ac4a2842-3b0b-45d4-b400-ce7be95df717\" (UID: \"ac4a2842-3b0b-45d4-b400-ce7be95df717\") " Dec 01 08:42:57 crc kubenswrapper[5004]: I1201 08:42:57.961800 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac4a2842-3b0b-45d4-b400-ce7be95df717-utilities" (OuterVolumeSpecName: "utilities") pod "ac4a2842-3b0b-45d4-b400-ce7be95df717" (UID: "ac4a2842-3b0b-45d4-b400-ce7be95df717"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 08:42:57 crc kubenswrapper[5004]: I1201 08:42:57.969759 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac4a2842-3b0b-45d4-b400-ce7be95df717-kube-api-access-t7cwn" (OuterVolumeSpecName: "kube-api-access-t7cwn") pod "ac4a2842-3b0b-45d4-b400-ce7be95df717" (UID: "ac4a2842-3b0b-45d4-b400-ce7be95df717"). InnerVolumeSpecName "kube-api-access-t7cwn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:42:58 crc kubenswrapper[5004]: I1201 08:42:58.063111 5004 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac4a2842-3b0b-45d4-b400-ce7be95df717-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 08:42:58 crc kubenswrapper[5004]: I1201 08:42:58.063143 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t7cwn\" (UniqueName: \"kubernetes.io/projected/ac4a2842-3b0b-45d4-b400-ce7be95df717-kube-api-access-t7cwn\") on node \"crc\" DevicePath \"\"" Dec 01 08:42:58 crc kubenswrapper[5004]: I1201 08:42:58.065749 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac4a2842-3b0b-45d4-b400-ce7be95df717-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ac4a2842-3b0b-45d4-b400-ce7be95df717" (UID: "ac4a2842-3b0b-45d4-b400-ce7be95df717"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 08:42:58 crc kubenswrapper[5004]: I1201 08:42:58.164852 5004 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac4a2842-3b0b-45d4-b400-ce7be95df717-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 08:42:58 crc kubenswrapper[5004]: I1201 08:42:58.258936 5004 generic.go:334] "Generic (PLEG): container finished" podID="ac4a2842-3b0b-45d4-b400-ce7be95df717" containerID="528c2d1956dfcbfef6946d3280cde4c12c86ac089d5eb27229f063d41fcb827e" exitCode=0 Dec 01 08:42:58 crc kubenswrapper[5004]: I1201 08:42:58.259036 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-svs2q" Dec 01 08:42:58 crc kubenswrapper[5004]: I1201 08:42:58.259057 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-svs2q" event={"ID":"ac4a2842-3b0b-45d4-b400-ce7be95df717","Type":"ContainerDied","Data":"528c2d1956dfcbfef6946d3280cde4c12c86ac089d5eb27229f063d41fcb827e"} Dec 01 08:42:58 crc kubenswrapper[5004]: I1201 08:42:58.260382 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-svs2q" event={"ID":"ac4a2842-3b0b-45d4-b400-ce7be95df717","Type":"ContainerDied","Data":"35122448ca9a01e43a66d90e63d44cc667a119f216f461d17966c746878e0b07"} Dec 01 08:42:58 crc kubenswrapper[5004]: I1201 08:42:58.260424 5004 scope.go:117] "RemoveContainer" containerID="528c2d1956dfcbfef6946d3280cde4c12c86ac089d5eb27229f063d41fcb827e" Dec 01 08:42:58 crc kubenswrapper[5004]: I1201 08:42:58.287909 5004 scope.go:117] "RemoveContainer" containerID="2303c4f6039dff84a524587888c0bc3a95b9b34da511d23004f020aecbbe9345" Dec 01 08:42:58 crc kubenswrapper[5004]: I1201 08:42:58.305289 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-svs2q"] Dec 01 08:42:58 crc kubenswrapper[5004]: I1201 08:42:58.316126 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-svs2q"] Dec 01 08:42:58 crc kubenswrapper[5004]: I1201 08:42:58.352324 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 08:42:58 crc kubenswrapper[5004]: I1201 08:42:58.437199 5004 scope.go:117] "RemoveContainer" containerID="65aaef056ee3bf8e63cda250c1d72ea11ef78d60ef1bdde7cca1fa885a4b8af5" Dec 01 08:42:58 crc kubenswrapper[5004]: I1201 08:42:58.497607 5004 scope.go:117] "RemoveContainer" containerID="528c2d1956dfcbfef6946d3280cde4c12c86ac089d5eb27229f063d41fcb827e" Dec 01 08:42:58 crc kubenswrapper[5004]: E1201 08:42:58.498057 5004 
log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"528c2d1956dfcbfef6946d3280cde4c12c86ac089d5eb27229f063d41fcb827e\": container with ID starting with 528c2d1956dfcbfef6946d3280cde4c12c86ac089d5eb27229f063d41fcb827e not found: ID does not exist" containerID="528c2d1956dfcbfef6946d3280cde4c12c86ac089d5eb27229f063d41fcb827e" Dec 01 08:42:58 crc kubenswrapper[5004]: I1201 08:42:58.498129 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"528c2d1956dfcbfef6946d3280cde4c12c86ac089d5eb27229f063d41fcb827e"} err="failed to get container status \"528c2d1956dfcbfef6946d3280cde4c12c86ac089d5eb27229f063d41fcb827e\": rpc error: code = NotFound desc = could not find container \"528c2d1956dfcbfef6946d3280cde4c12c86ac089d5eb27229f063d41fcb827e\": container with ID starting with 528c2d1956dfcbfef6946d3280cde4c12c86ac089d5eb27229f063d41fcb827e not found: ID does not exist" Dec 01 08:42:58 crc kubenswrapper[5004]: I1201 08:42:58.498157 5004 scope.go:117] "RemoveContainer" containerID="2303c4f6039dff84a524587888c0bc3a95b9b34da511d23004f020aecbbe9345" Dec 01 08:42:58 crc kubenswrapper[5004]: E1201 08:42:58.498751 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2303c4f6039dff84a524587888c0bc3a95b9b34da511d23004f020aecbbe9345\": container with ID starting with 2303c4f6039dff84a524587888c0bc3a95b9b34da511d23004f020aecbbe9345 not found: ID does not exist" containerID="2303c4f6039dff84a524587888c0bc3a95b9b34da511d23004f020aecbbe9345" Dec 01 08:42:58 crc kubenswrapper[5004]: I1201 08:42:58.499138 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2303c4f6039dff84a524587888c0bc3a95b9b34da511d23004f020aecbbe9345"} err="failed to get container status \"2303c4f6039dff84a524587888c0bc3a95b9b34da511d23004f020aecbbe9345\": rpc error: code = NotFound 
desc = could not find container \"2303c4f6039dff84a524587888c0bc3a95b9b34da511d23004f020aecbbe9345\": container with ID starting with 2303c4f6039dff84a524587888c0bc3a95b9b34da511d23004f020aecbbe9345 not found: ID does not exist" Dec 01 08:42:58 crc kubenswrapper[5004]: I1201 08:42:58.499196 5004 scope.go:117] "RemoveContainer" containerID="65aaef056ee3bf8e63cda250c1d72ea11ef78d60ef1bdde7cca1fa885a4b8af5" Dec 01 08:42:58 crc kubenswrapper[5004]: E1201 08:42:58.499612 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65aaef056ee3bf8e63cda250c1d72ea11ef78d60ef1bdde7cca1fa885a4b8af5\": container with ID starting with 65aaef056ee3bf8e63cda250c1d72ea11ef78d60ef1bdde7cca1fa885a4b8af5 not found: ID does not exist" containerID="65aaef056ee3bf8e63cda250c1d72ea11ef78d60ef1bdde7cca1fa885a4b8af5" Dec 01 08:42:58 crc kubenswrapper[5004]: I1201 08:42:58.499694 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65aaef056ee3bf8e63cda250c1d72ea11ef78d60ef1bdde7cca1fa885a4b8af5"} err="failed to get container status \"65aaef056ee3bf8e63cda250c1d72ea11ef78d60ef1bdde7cca1fa885a4b8af5\": rpc error: code = NotFound desc = could not find container \"65aaef056ee3bf8e63cda250c1d72ea11ef78d60ef1bdde7cca1fa885a4b8af5\": container with ID starting with 65aaef056ee3bf8e63cda250c1d72ea11ef78d60ef1bdde7cca1fa885a4b8af5 not found: ID does not exist" Dec 01 08:42:58 crc kubenswrapper[5004]: I1201 08:42:58.781517 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f55f02c-5371-4258-b5cf-9565b29798b9" path="/var/lib/kubelet/pods/7f55f02c-5371-4258-b5cf-9565b29798b9/volumes" Dec 01 08:42:58 crc kubenswrapper[5004]: I1201 08:42:58.782448 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac4a2842-3b0b-45d4-b400-ce7be95df717" path="/var/lib/kubelet/pods/ac4a2842-3b0b-45d4-b400-ce7be95df717/volumes" Dec 01 08:42:59 crc 
kubenswrapper[5004]: I1201 08:42:59.273886 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cf129260-3776-4aef-908f-17defb18949d","Type":"ContainerStarted","Data":"fc776daf0808f78130eee654c714b6c4ee668c18424442260d9af29df0f5d80a"} Dec 01 08:42:59 crc kubenswrapper[5004]: I1201 08:42:59.274196 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cf129260-3776-4aef-908f-17defb18949d","Type":"ContainerStarted","Data":"345c303513ce1d47ab0c4c8c7bcb58dcf32cb398e38b23efc795fcd1bb8a9d4f"} Dec 01 08:43:00 crc kubenswrapper[5004]: I1201 08:43:00.286510 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cf129260-3776-4aef-908f-17defb18949d","Type":"ContainerStarted","Data":"9d21786cfdb3689b03b08cae03431c6830bde6f9ee45b65f26bd50894c4acca3"} Dec 01 08:43:01 crc kubenswrapper[5004]: I1201 08:43:01.299504 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cf129260-3776-4aef-908f-17defb18949d","Type":"ContainerStarted","Data":"83caf7713f77ee17e875f3fe279b1d1afa85abc52ab689445cd0bfad842e8d0c"} Dec 01 08:43:01 crc kubenswrapper[5004]: I1201 08:43:01.378472 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Dec 01 08:43:01 crc kubenswrapper[5004]: I1201 08:43:01.652707 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 01 08:43:01 crc kubenswrapper[5004]: I1201 08:43:01.653022 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 01 08:43:02 crc kubenswrapper[5004]: I1201 08:43:02.312844 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cf129260-3776-4aef-908f-17defb18949d","Type":"ContainerStarted","Data":"22ffa35445670733c02423becdd59f1ce90f1b9c2d0992fd12cb54d9c79e693a"} Dec 01 
08:43:02 crc kubenswrapper[5004]: I1201 08:43:02.313259 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 01 08:43:02 crc kubenswrapper[5004]: I1201 08:43:02.343202 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.733518413 podStartE2EDuration="5.343177114s" podCreationTimestamp="2025-12-01 08:42:57 +0000 UTC" firstStartedPulling="2025-12-01 08:42:58.353215944 +0000 UTC m=+1555.918207916" lastFinishedPulling="2025-12-01 08:43:01.962874635 +0000 UTC m=+1559.527866617" observedRunningTime="2025-12-01 08:43:02.333245312 +0000 UTC m=+1559.898237304" watchObservedRunningTime="2025-12-01 08:43:02.343177114 +0000 UTC m=+1559.908169096" Dec 01 08:43:02 crc kubenswrapper[5004]: I1201 08:43:02.654658 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 01 08:43:02 crc kubenswrapper[5004]: I1201 08:43:02.675736 5004 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="855d076a-eacb-40d9-9135-8a1a3cf64f59" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.245:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 01 08:43:02 crc kubenswrapper[5004]: I1201 08:43:02.675774 5004 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="855d076a-eacb-40d9-9135-8a1a3cf64f59" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.245:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 01 08:43:02 crc kubenswrapper[5004]: I1201 08:43:02.714950 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 01 08:43:03 crc kubenswrapper[5004]: I1201 08:43:03.361218 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/nova-scheduler-0" Dec 01 08:43:03 crc kubenswrapper[5004]: I1201 08:43:03.873736 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 01 08:43:03 crc kubenswrapper[5004]: I1201 08:43:03.873968 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 01 08:43:04 crc kubenswrapper[5004]: I1201 08:43:04.916068 5004 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="7673aa49-60cb-427a-b089-42db27e176ee" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.247:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 01 08:43:04 crc kubenswrapper[5004]: I1201 08:43:04.956859 5004 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="7673aa49-60cb-427a-b089-42db27e176ee" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.247:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 01 08:43:11 crc kubenswrapper[5004]: I1201 08:43:11.658821 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 01 08:43:11 crc kubenswrapper[5004]: I1201 08:43:11.664868 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 01 08:43:11 crc kubenswrapper[5004]: I1201 08:43:11.670891 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 01 08:43:12 crc kubenswrapper[5004]: I1201 08:43:12.146110 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 01 08:43:12 crc kubenswrapper[5004]: I1201 08:43:12.339771 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cf491dc-7bad-4373-ba55-e59710d8c05f-combined-ca-bundle\") pod \"0cf491dc-7bad-4373-ba55-e59710d8c05f\" (UID: \"0cf491dc-7bad-4373-ba55-e59710d8c05f\") " Dec 01 08:43:12 crc kubenswrapper[5004]: I1201 08:43:12.339959 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0cf491dc-7bad-4373-ba55-e59710d8c05f-config-data\") pod \"0cf491dc-7bad-4373-ba55-e59710d8c05f\" (UID: \"0cf491dc-7bad-4373-ba55-e59710d8c05f\") " Dec 01 08:43:12 crc kubenswrapper[5004]: I1201 08:43:12.340063 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vf64w\" (UniqueName: \"kubernetes.io/projected/0cf491dc-7bad-4373-ba55-e59710d8c05f-kube-api-access-vf64w\") pod \"0cf491dc-7bad-4373-ba55-e59710d8c05f\" (UID: \"0cf491dc-7bad-4373-ba55-e59710d8c05f\") " Dec 01 08:43:12 crc kubenswrapper[5004]: I1201 08:43:12.351720 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0cf491dc-7bad-4373-ba55-e59710d8c05f-kube-api-access-vf64w" (OuterVolumeSpecName: "kube-api-access-vf64w") pod "0cf491dc-7bad-4373-ba55-e59710d8c05f" (UID: "0cf491dc-7bad-4373-ba55-e59710d8c05f"). InnerVolumeSpecName "kube-api-access-vf64w". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:43:12 crc kubenswrapper[5004]: I1201 08:43:12.372553 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cf491dc-7bad-4373-ba55-e59710d8c05f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0cf491dc-7bad-4373-ba55-e59710d8c05f" (UID: "0cf491dc-7bad-4373-ba55-e59710d8c05f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:43:12 crc kubenswrapper[5004]: I1201 08:43:12.397950 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cf491dc-7bad-4373-ba55-e59710d8c05f-config-data" (OuterVolumeSpecName: "config-data") pod "0cf491dc-7bad-4373-ba55-e59710d8c05f" (UID: "0cf491dc-7bad-4373-ba55-e59710d8c05f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:43:12 crc kubenswrapper[5004]: I1201 08:43:12.451323 5004 generic.go:334] "Generic (PLEG): container finished" podID="0cf491dc-7bad-4373-ba55-e59710d8c05f" containerID="88d6d3368227731978b43a0b27e8daa6ef83cb94f13725e7e296d80e42311c4b" exitCode=137 Dec 01 08:43:12 crc kubenswrapper[5004]: I1201 08:43:12.451854 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 01 08:43:12 crc kubenswrapper[5004]: I1201 08:43:12.452613 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"0cf491dc-7bad-4373-ba55-e59710d8c05f","Type":"ContainerDied","Data":"88d6d3368227731978b43a0b27e8daa6ef83cb94f13725e7e296d80e42311c4b"} Dec 01 08:43:12 crc kubenswrapper[5004]: I1201 08:43:12.452729 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"0cf491dc-7bad-4373-ba55-e59710d8c05f","Type":"ContainerDied","Data":"1298ee3ffcaf2a14b357b355d13f12e4d9977f18621a00feaa89bcfd08e885f4"} Dec 01 08:43:12 crc kubenswrapper[5004]: I1201 08:43:12.452789 5004 scope.go:117] "RemoveContainer" containerID="88d6d3368227731978b43a0b27e8daa6ef83cb94f13725e7e296d80e42311c4b" Dec 01 08:43:12 crc kubenswrapper[5004]: I1201 08:43:12.453037 5004 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cf491dc-7bad-4373-ba55-e59710d8c05f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 
08:43:12 crc kubenswrapper[5004]: I1201 08:43:12.455224 5004 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0cf491dc-7bad-4373-ba55-e59710d8c05f-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 08:43:12 crc kubenswrapper[5004]: I1201 08:43:12.455267 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vf64w\" (UniqueName: \"kubernetes.io/projected/0cf491dc-7bad-4373-ba55-e59710d8c05f-kube-api-access-vf64w\") on node \"crc\" DevicePath \"\"" Dec 01 08:43:12 crc kubenswrapper[5004]: I1201 08:43:12.501421 5004 scope.go:117] "RemoveContainer" containerID="88d6d3368227731978b43a0b27e8daa6ef83cb94f13725e7e296d80e42311c4b" Dec 01 08:43:12 crc kubenswrapper[5004]: E1201 08:43:12.517143 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88d6d3368227731978b43a0b27e8daa6ef83cb94f13725e7e296d80e42311c4b\": container with ID starting with 88d6d3368227731978b43a0b27e8daa6ef83cb94f13725e7e296d80e42311c4b not found: ID does not exist" containerID="88d6d3368227731978b43a0b27e8daa6ef83cb94f13725e7e296d80e42311c4b" Dec 01 08:43:12 crc kubenswrapper[5004]: I1201 08:43:12.517221 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88d6d3368227731978b43a0b27e8daa6ef83cb94f13725e7e296d80e42311c4b"} err="failed to get container status \"88d6d3368227731978b43a0b27e8daa6ef83cb94f13725e7e296d80e42311c4b\": rpc error: code = NotFound desc = could not find container \"88d6d3368227731978b43a0b27e8daa6ef83cb94f13725e7e296d80e42311c4b\": container with ID starting with 88d6d3368227731978b43a0b27e8daa6ef83cb94f13725e7e296d80e42311c4b not found: ID does not exist" Dec 01 08:43:12 crc kubenswrapper[5004]: I1201 08:43:12.525774 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 01 08:43:12 crc kubenswrapper[5004]: I1201 
08:43:12.537206 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 01 08:43:12 crc kubenswrapper[5004]: I1201 08:43:12.573696 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 01 08:43:12 crc kubenswrapper[5004]: I1201 08:43:12.592263 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 01 08:43:12 crc kubenswrapper[5004]: E1201 08:43:12.593175 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac4a2842-3b0b-45d4-b400-ce7be95df717" containerName="extract-content" Dec 01 08:43:12 crc kubenswrapper[5004]: I1201 08:43:12.593195 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac4a2842-3b0b-45d4-b400-ce7be95df717" containerName="extract-content" Dec 01 08:43:12 crc kubenswrapper[5004]: E1201 08:43:12.593227 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac4a2842-3b0b-45d4-b400-ce7be95df717" containerName="registry-server" Dec 01 08:43:12 crc kubenswrapper[5004]: I1201 08:43:12.593233 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac4a2842-3b0b-45d4-b400-ce7be95df717" containerName="registry-server" Dec 01 08:43:12 crc kubenswrapper[5004]: E1201 08:43:12.593244 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cf491dc-7bad-4373-ba55-e59710d8c05f" containerName="nova-cell1-novncproxy-novncproxy" Dec 01 08:43:12 crc kubenswrapper[5004]: I1201 08:43:12.593251 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cf491dc-7bad-4373-ba55-e59710d8c05f" containerName="nova-cell1-novncproxy-novncproxy" Dec 01 08:43:12 crc kubenswrapper[5004]: E1201 08:43:12.593263 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac4a2842-3b0b-45d4-b400-ce7be95df717" containerName="extract-utilities" Dec 01 08:43:12 crc kubenswrapper[5004]: I1201 08:43:12.593269 5004 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="ac4a2842-3b0b-45d4-b400-ce7be95df717" containerName="extract-utilities" Dec 01 08:43:12 crc kubenswrapper[5004]: I1201 08:43:12.593624 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="0cf491dc-7bad-4373-ba55-e59710d8c05f" containerName="nova-cell1-novncproxy-novncproxy" Dec 01 08:43:12 crc kubenswrapper[5004]: I1201 08:43:12.593637 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac4a2842-3b0b-45d4-b400-ce7be95df717" containerName="registry-server" Dec 01 08:43:12 crc kubenswrapper[5004]: I1201 08:43:12.594435 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 01 08:43:12 crc kubenswrapper[5004]: I1201 08:43:12.597244 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Dec 01 08:43:12 crc kubenswrapper[5004]: I1201 08:43:12.597413 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 01 08:43:12 crc kubenswrapper[5004]: I1201 08:43:12.597421 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Dec 01 08:43:12 crc kubenswrapper[5004]: I1201 08:43:12.607695 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 01 08:43:12 crc kubenswrapper[5004]: I1201 08:43:12.664489 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/95dcb289-4473-451f-bd94-18d680b4f4f0-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"95dcb289-4473-451f-bd94-18d680b4f4f0\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 08:43:12 crc kubenswrapper[5004]: I1201 08:43:12.664608 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/95dcb289-4473-451f-bd94-18d680b4f4f0-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"95dcb289-4473-451f-bd94-18d680b4f4f0\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 08:43:12 crc kubenswrapper[5004]: I1201 08:43:12.664658 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95dcb289-4473-451f-bd94-18d680b4f4f0-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"95dcb289-4473-451f-bd94-18d680b4f4f0\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 08:43:12 crc kubenswrapper[5004]: I1201 08:43:12.664676 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgdwc\" (UniqueName: \"kubernetes.io/projected/95dcb289-4473-451f-bd94-18d680b4f4f0-kube-api-access-tgdwc\") pod \"nova-cell1-novncproxy-0\" (UID: \"95dcb289-4473-451f-bd94-18d680b4f4f0\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 08:43:12 crc kubenswrapper[5004]: I1201 08:43:12.664705 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/95dcb289-4473-451f-bd94-18d680b4f4f0-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"95dcb289-4473-451f-bd94-18d680b4f4f0\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 08:43:12 crc kubenswrapper[5004]: I1201 08:43:12.766647 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/95dcb289-4473-451f-bd94-18d680b4f4f0-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"95dcb289-4473-451f-bd94-18d680b4f4f0\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 08:43:12 crc kubenswrapper[5004]: I1201 08:43:12.766713 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/95dcb289-4473-451f-bd94-18d680b4f4f0-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"95dcb289-4473-451f-bd94-18d680b4f4f0\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 08:43:12 crc kubenswrapper[5004]: I1201 08:43:12.766740 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95dcb289-4473-451f-bd94-18d680b4f4f0-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"95dcb289-4473-451f-bd94-18d680b4f4f0\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 08:43:12 crc kubenswrapper[5004]: I1201 08:43:12.766755 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgdwc\" (UniqueName: \"kubernetes.io/projected/95dcb289-4473-451f-bd94-18d680b4f4f0-kube-api-access-tgdwc\") pod \"nova-cell1-novncproxy-0\" (UID: \"95dcb289-4473-451f-bd94-18d680b4f4f0\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 08:43:12 crc kubenswrapper[5004]: I1201 08:43:12.766785 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/95dcb289-4473-451f-bd94-18d680b4f4f0-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"95dcb289-4473-451f-bd94-18d680b4f4f0\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 08:43:12 crc kubenswrapper[5004]: I1201 08:43:12.771013 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95dcb289-4473-451f-bd94-18d680b4f4f0-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"95dcb289-4473-451f-bd94-18d680b4f4f0\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 08:43:12 crc kubenswrapper[5004]: I1201 08:43:12.771709 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/95dcb289-4473-451f-bd94-18d680b4f4f0-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" 
(UID: \"95dcb289-4473-451f-bd94-18d680b4f4f0\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 08:43:12 crc kubenswrapper[5004]: I1201 08:43:12.772796 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95dcb289-4473-451f-bd94-18d680b4f4f0-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"95dcb289-4473-451f-bd94-18d680b4f4f0\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 08:43:12 crc kubenswrapper[5004]: I1201 08:43:12.774685 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/95dcb289-4473-451f-bd94-18d680b4f4f0-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"95dcb289-4473-451f-bd94-18d680b4f4f0\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 08:43:12 crc kubenswrapper[5004]: I1201 08:43:12.784674 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0cf491dc-7bad-4373-ba55-e59710d8c05f" path="/var/lib/kubelet/pods/0cf491dc-7bad-4373-ba55-e59710d8c05f/volumes" Dec 01 08:43:12 crc kubenswrapper[5004]: I1201 08:43:12.802483 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgdwc\" (UniqueName: \"kubernetes.io/projected/95dcb289-4473-451f-bd94-18d680b4f4f0-kube-api-access-tgdwc\") pod \"nova-cell1-novncproxy-0\" (UID: \"95dcb289-4473-451f-bd94-18d680b4f4f0\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 08:43:12 crc kubenswrapper[5004]: I1201 08:43:12.929091 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 01 08:43:13 crc kubenswrapper[5004]: I1201 08:43:13.434438 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 01 08:43:13 crc kubenswrapper[5004]: I1201 08:43:13.465755 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"95dcb289-4473-451f-bd94-18d680b4f4f0","Type":"ContainerStarted","Data":"db26fb06ed8e93fb4c531bfe7538e96b6b0695310142457293311d3b150105f0"} Dec 01 08:43:13 crc kubenswrapper[5004]: I1201 08:43:13.876976 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 01 08:43:13 crc kubenswrapper[5004]: I1201 08:43:13.877625 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 01 08:43:13 crc kubenswrapper[5004]: I1201 08:43:13.878494 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 01 08:43:13 crc kubenswrapper[5004]: I1201 08:43:13.881111 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 01 08:43:14 crc kubenswrapper[5004]: I1201 08:43:14.497390 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"95dcb289-4473-451f-bd94-18d680b4f4f0","Type":"ContainerStarted","Data":"ad9305fa9df052d462c17b5bb56029e666dfa44df595a3f85cea745c0ff7e8f7"} Dec 01 08:43:14 crc kubenswrapper[5004]: I1201 08:43:14.497852 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 01 08:43:14 crc kubenswrapper[5004]: I1201 08:43:14.510509 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 01 08:43:14 crc kubenswrapper[5004]: I1201 08:43:14.530807 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" 
podStartSLOduration=2.530784934 podStartE2EDuration="2.530784934s" podCreationTimestamp="2025-12-01 08:43:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:43:14.519428706 +0000 UTC m=+1572.084420708" watchObservedRunningTime="2025-12-01 08:43:14.530784934 +0000 UTC m=+1572.095776936" Dec 01 08:43:14 crc kubenswrapper[5004]: I1201 08:43:14.919600 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6d99f6bc7f-nk8fb"] Dec 01 08:43:14 crc kubenswrapper[5004]: I1201 08:43:14.923955 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d99f6bc7f-nk8fb"] Dec 01 08:43:14 crc kubenswrapper[5004]: I1201 08:43:14.924091 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d99f6bc7f-nk8fb" Dec 01 08:43:14 crc kubenswrapper[5004]: I1201 08:43:14.980634 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8b89ff03-e585-46b1-8656-fad2acbeaeaf-dns-svc\") pod \"dnsmasq-dns-6d99f6bc7f-nk8fb\" (UID: \"8b89ff03-e585-46b1-8656-fad2acbeaeaf\") " pod="openstack/dnsmasq-dns-6d99f6bc7f-nk8fb" Dec 01 08:43:14 crc kubenswrapper[5004]: I1201 08:43:14.980684 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmdzp\" (UniqueName: \"kubernetes.io/projected/8b89ff03-e585-46b1-8656-fad2acbeaeaf-kube-api-access-zmdzp\") pod \"dnsmasq-dns-6d99f6bc7f-nk8fb\" (UID: \"8b89ff03-e585-46b1-8656-fad2acbeaeaf\") " pod="openstack/dnsmasq-dns-6d99f6bc7f-nk8fb" Dec 01 08:43:14 crc kubenswrapper[5004]: I1201 08:43:14.980737 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8b89ff03-e585-46b1-8656-fad2acbeaeaf-ovsdbserver-nb\") pod 
\"dnsmasq-dns-6d99f6bc7f-nk8fb\" (UID: \"8b89ff03-e585-46b1-8656-fad2acbeaeaf\") " pod="openstack/dnsmasq-dns-6d99f6bc7f-nk8fb" Dec 01 08:43:14 crc kubenswrapper[5004]: I1201 08:43:14.980780 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b89ff03-e585-46b1-8656-fad2acbeaeaf-config\") pod \"dnsmasq-dns-6d99f6bc7f-nk8fb\" (UID: \"8b89ff03-e585-46b1-8656-fad2acbeaeaf\") " pod="openstack/dnsmasq-dns-6d99f6bc7f-nk8fb" Dec 01 08:43:14 crc kubenswrapper[5004]: I1201 08:43:14.980800 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8b89ff03-e585-46b1-8656-fad2acbeaeaf-ovsdbserver-sb\") pod \"dnsmasq-dns-6d99f6bc7f-nk8fb\" (UID: \"8b89ff03-e585-46b1-8656-fad2acbeaeaf\") " pod="openstack/dnsmasq-dns-6d99f6bc7f-nk8fb" Dec 01 08:43:14 crc kubenswrapper[5004]: I1201 08:43:14.980830 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8b89ff03-e585-46b1-8656-fad2acbeaeaf-dns-swift-storage-0\") pod \"dnsmasq-dns-6d99f6bc7f-nk8fb\" (UID: \"8b89ff03-e585-46b1-8656-fad2acbeaeaf\") " pod="openstack/dnsmasq-dns-6d99f6bc7f-nk8fb" Dec 01 08:43:15 crc kubenswrapper[5004]: I1201 08:43:15.082413 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8b89ff03-e585-46b1-8656-fad2acbeaeaf-ovsdbserver-nb\") pod \"dnsmasq-dns-6d99f6bc7f-nk8fb\" (UID: \"8b89ff03-e585-46b1-8656-fad2acbeaeaf\") " pod="openstack/dnsmasq-dns-6d99f6bc7f-nk8fb" Dec 01 08:43:15 crc kubenswrapper[5004]: I1201 08:43:15.082522 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b89ff03-e585-46b1-8656-fad2acbeaeaf-config\") pod 
\"dnsmasq-dns-6d99f6bc7f-nk8fb\" (UID: \"8b89ff03-e585-46b1-8656-fad2acbeaeaf\") " pod="openstack/dnsmasq-dns-6d99f6bc7f-nk8fb" Dec 01 08:43:15 crc kubenswrapper[5004]: I1201 08:43:15.082551 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8b89ff03-e585-46b1-8656-fad2acbeaeaf-ovsdbserver-sb\") pod \"dnsmasq-dns-6d99f6bc7f-nk8fb\" (UID: \"8b89ff03-e585-46b1-8656-fad2acbeaeaf\") " pod="openstack/dnsmasq-dns-6d99f6bc7f-nk8fb" Dec 01 08:43:15 crc kubenswrapper[5004]: I1201 08:43:15.082615 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8b89ff03-e585-46b1-8656-fad2acbeaeaf-dns-swift-storage-0\") pod \"dnsmasq-dns-6d99f6bc7f-nk8fb\" (UID: \"8b89ff03-e585-46b1-8656-fad2acbeaeaf\") " pod="openstack/dnsmasq-dns-6d99f6bc7f-nk8fb" Dec 01 08:43:15 crc kubenswrapper[5004]: I1201 08:43:15.082767 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8b89ff03-e585-46b1-8656-fad2acbeaeaf-dns-svc\") pod \"dnsmasq-dns-6d99f6bc7f-nk8fb\" (UID: \"8b89ff03-e585-46b1-8656-fad2acbeaeaf\") " pod="openstack/dnsmasq-dns-6d99f6bc7f-nk8fb" Dec 01 08:43:15 crc kubenswrapper[5004]: I1201 08:43:15.082843 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zmdzp\" (UniqueName: \"kubernetes.io/projected/8b89ff03-e585-46b1-8656-fad2acbeaeaf-kube-api-access-zmdzp\") pod \"dnsmasq-dns-6d99f6bc7f-nk8fb\" (UID: \"8b89ff03-e585-46b1-8656-fad2acbeaeaf\") " pod="openstack/dnsmasq-dns-6d99f6bc7f-nk8fb" Dec 01 08:43:15 crc kubenswrapper[5004]: I1201 08:43:15.084157 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8b89ff03-e585-46b1-8656-fad2acbeaeaf-ovsdbserver-nb\") pod \"dnsmasq-dns-6d99f6bc7f-nk8fb\" (UID: 
\"8b89ff03-e585-46b1-8656-fad2acbeaeaf\") " pod="openstack/dnsmasq-dns-6d99f6bc7f-nk8fb" Dec 01 08:43:15 crc kubenswrapper[5004]: I1201 08:43:15.084900 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8b89ff03-e585-46b1-8656-fad2acbeaeaf-ovsdbserver-sb\") pod \"dnsmasq-dns-6d99f6bc7f-nk8fb\" (UID: \"8b89ff03-e585-46b1-8656-fad2acbeaeaf\") " pod="openstack/dnsmasq-dns-6d99f6bc7f-nk8fb" Dec 01 08:43:15 crc kubenswrapper[5004]: I1201 08:43:15.085300 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8b89ff03-e585-46b1-8656-fad2acbeaeaf-dns-swift-storage-0\") pod \"dnsmasq-dns-6d99f6bc7f-nk8fb\" (UID: \"8b89ff03-e585-46b1-8656-fad2acbeaeaf\") " pod="openstack/dnsmasq-dns-6d99f6bc7f-nk8fb" Dec 01 08:43:15 crc kubenswrapper[5004]: I1201 08:43:15.085549 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b89ff03-e585-46b1-8656-fad2acbeaeaf-config\") pod \"dnsmasq-dns-6d99f6bc7f-nk8fb\" (UID: \"8b89ff03-e585-46b1-8656-fad2acbeaeaf\") " pod="openstack/dnsmasq-dns-6d99f6bc7f-nk8fb" Dec 01 08:43:15 crc kubenswrapper[5004]: I1201 08:43:15.085792 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8b89ff03-e585-46b1-8656-fad2acbeaeaf-dns-svc\") pod \"dnsmasq-dns-6d99f6bc7f-nk8fb\" (UID: \"8b89ff03-e585-46b1-8656-fad2acbeaeaf\") " pod="openstack/dnsmasq-dns-6d99f6bc7f-nk8fb" Dec 01 08:43:15 crc kubenswrapper[5004]: I1201 08:43:15.106357 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmdzp\" (UniqueName: \"kubernetes.io/projected/8b89ff03-e585-46b1-8656-fad2acbeaeaf-kube-api-access-zmdzp\") pod \"dnsmasq-dns-6d99f6bc7f-nk8fb\" (UID: \"8b89ff03-e585-46b1-8656-fad2acbeaeaf\") " pod="openstack/dnsmasq-dns-6d99f6bc7f-nk8fb" Dec 01 
08:43:15 crc kubenswrapper[5004]: I1201 08:43:15.331641 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d99f6bc7f-nk8fb" Dec 01 08:43:15 crc kubenswrapper[5004]: I1201 08:43:15.874249 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d99f6bc7f-nk8fb"] Dec 01 08:43:15 crc kubenswrapper[5004]: W1201 08:43:15.874980 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8b89ff03_e585_46b1_8656_fad2acbeaeaf.slice/crio-0cb6dcc9790d156c27828ff5cab6ae630b97e92700342b613333f6c4e44f53b4 WatchSource:0}: Error finding container 0cb6dcc9790d156c27828ff5cab6ae630b97e92700342b613333f6c4e44f53b4: Status 404 returned error can't find the container with id 0cb6dcc9790d156c27828ff5cab6ae630b97e92700342b613333f6c4e44f53b4 Dec 01 08:43:16 crc kubenswrapper[5004]: I1201 08:43:16.526433 5004 generic.go:334] "Generic (PLEG): container finished" podID="8b89ff03-e585-46b1-8656-fad2acbeaeaf" containerID="7dc69bc8e5c3093480200580f2a842cdbda452efa11d1d59ca6c66592f30215e" exitCode=0 Dec 01 08:43:16 crc kubenswrapper[5004]: I1201 08:43:16.526504 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d99f6bc7f-nk8fb" event={"ID":"8b89ff03-e585-46b1-8656-fad2acbeaeaf","Type":"ContainerDied","Data":"7dc69bc8e5c3093480200580f2a842cdbda452efa11d1d59ca6c66592f30215e"} Dec 01 08:43:16 crc kubenswrapper[5004]: I1201 08:43:16.526832 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d99f6bc7f-nk8fb" event={"ID":"8b89ff03-e585-46b1-8656-fad2acbeaeaf","Type":"ContainerStarted","Data":"0cb6dcc9790d156c27828ff5cab6ae630b97e92700342b613333f6c4e44f53b4"} Dec 01 08:43:16 crc kubenswrapper[5004]: I1201 08:43:16.875570 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 08:43:16 crc kubenswrapper[5004]: I1201 08:43:16.876060 5004 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cf129260-3776-4aef-908f-17defb18949d" containerName="ceilometer-central-agent" containerID="cri-o://fc776daf0808f78130eee654c714b6c4ee668c18424442260d9af29df0f5d80a" gracePeriod=30 Dec 01 08:43:16 crc kubenswrapper[5004]: I1201 08:43:16.876169 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cf129260-3776-4aef-908f-17defb18949d" containerName="proxy-httpd" containerID="cri-o://22ffa35445670733c02423becdd59f1ce90f1b9c2d0992fd12cb54d9c79e693a" gracePeriod=30 Dec 01 08:43:16 crc kubenswrapper[5004]: I1201 08:43:16.876210 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cf129260-3776-4aef-908f-17defb18949d" containerName="sg-core" containerID="cri-o://83caf7713f77ee17e875f3fe279b1d1afa85abc52ab689445cd0bfad842e8d0c" gracePeriod=30 Dec 01 08:43:16 crc kubenswrapper[5004]: I1201 08:43:16.876288 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cf129260-3776-4aef-908f-17defb18949d" containerName="ceilometer-notification-agent" containerID="cri-o://9d21786cfdb3689b03b08cae03431c6830bde6f9ee45b65f26bd50894c4acca3" gracePeriod=30 Dec 01 08:43:16 crc kubenswrapper[5004]: I1201 08:43:16.889864 5004 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="cf129260-3776-4aef-908f-17defb18949d" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.248:3000/\": EOF" Dec 01 08:43:17 crc kubenswrapper[5004]: I1201 08:43:17.541728 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d99f6bc7f-nk8fb" event={"ID":"8b89ff03-e585-46b1-8656-fad2acbeaeaf","Type":"ContainerStarted","Data":"bf15c83552d127166c6c320e5f2fa50fd23d5e0335b939e809ad780efec1fcb6"} Dec 01 08:43:17 crc kubenswrapper[5004]: I1201 
08:43:17.542313 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6d99f6bc7f-nk8fb" Dec 01 08:43:17 crc kubenswrapper[5004]: I1201 08:43:17.548781 5004 generic.go:334] "Generic (PLEG): container finished" podID="cf129260-3776-4aef-908f-17defb18949d" containerID="22ffa35445670733c02423becdd59f1ce90f1b9c2d0992fd12cb54d9c79e693a" exitCode=0 Dec 01 08:43:17 crc kubenswrapper[5004]: I1201 08:43:17.548819 5004 generic.go:334] "Generic (PLEG): container finished" podID="cf129260-3776-4aef-908f-17defb18949d" containerID="83caf7713f77ee17e875f3fe279b1d1afa85abc52ab689445cd0bfad842e8d0c" exitCode=2 Dec 01 08:43:17 crc kubenswrapper[5004]: I1201 08:43:17.548831 5004 generic.go:334] "Generic (PLEG): container finished" podID="cf129260-3776-4aef-908f-17defb18949d" containerID="fc776daf0808f78130eee654c714b6c4ee668c18424442260d9af29df0f5d80a" exitCode=0 Dec 01 08:43:17 crc kubenswrapper[5004]: I1201 08:43:17.548855 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cf129260-3776-4aef-908f-17defb18949d","Type":"ContainerDied","Data":"22ffa35445670733c02423becdd59f1ce90f1b9c2d0992fd12cb54d9c79e693a"} Dec 01 08:43:17 crc kubenswrapper[5004]: I1201 08:43:17.548899 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cf129260-3776-4aef-908f-17defb18949d","Type":"ContainerDied","Data":"83caf7713f77ee17e875f3fe279b1d1afa85abc52ab689445cd0bfad842e8d0c"} Dec 01 08:43:17 crc kubenswrapper[5004]: I1201 08:43:17.548910 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cf129260-3776-4aef-908f-17defb18949d","Type":"ContainerDied","Data":"fc776daf0808f78130eee654c714b6c4ee668c18424442260d9af29df0f5d80a"} Dec 01 08:43:17 crc kubenswrapper[5004]: I1201 08:43:17.566917 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6d99f6bc7f-nk8fb" podStartSLOduration=3.566901015 
podStartE2EDuration="3.566901015s" podCreationTimestamp="2025-12-01 08:43:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:43:17.562308333 +0000 UTC m=+1575.127300315" watchObservedRunningTime="2025-12-01 08:43:17.566901015 +0000 UTC m=+1575.131892997" Dec 01 08:43:17 crc kubenswrapper[5004]: I1201 08:43:17.605864 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 01 08:43:17 crc kubenswrapper[5004]: I1201 08:43:17.606137 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="7673aa49-60cb-427a-b089-42db27e176ee" containerName="nova-api-log" containerID="cri-o://60706ff456d9fc6048e6bf6d99347395c9dc671747d43be8126c7a9cb0e22c40" gracePeriod=30 Dec 01 08:43:17 crc kubenswrapper[5004]: I1201 08:43:17.606224 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="7673aa49-60cb-427a-b089-42db27e176ee" containerName="nova-api-api" containerID="cri-o://97088024db1f02e2cfefe40f95b46e731ecfdf607491dd4db34ba16d64e22e64" gracePeriod=30 Dec 01 08:43:17 crc kubenswrapper[5004]: I1201 08:43:17.930703 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 01 08:43:18 crc kubenswrapper[5004]: I1201 08:43:18.563324 5004 generic.go:334] "Generic (PLEG): container finished" podID="7673aa49-60cb-427a-b089-42db27e176ee" containerID="60706ff456d9fc6048e6bf6d99347395c9dc671747d43be8126c7a9cb0e22c40" exitCode=143 Dec 01 08:43:18 crc kubenswrapper[5004]: I1201 08:43:18.563781 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7673aa49-60cb-427a-b089-42db27e176ee","Type":"ContainerDied","Data":"60706ff456d9fc6048e6bf6d99347395c9dc671747d43be8126c7a9cb0e22c40"} Dec 01 08:43:21 crc kubenswrapper[5004]: I1201 08:43:21.373862 5004 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 01 08:43:21 crc kubenswrapper[5004]: I1201 08:43:21.539639 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5r24w\" (UniqueName: \"kubernetes.io/projected/7673aa49-60cb-427a-b089-42db27e176ee-kube-api-access-5r24w\") pod \"7673aa49-60cb-427a-b089-42db27e176ee\" (UID: \"7673aa49-60cb-427a-b089-42db27e176ee\") " Dec 01 08:43:21 crc kubenswrapper[5004]: I1201 08:43:21.540135 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7673aa49-60cb-427a-b089-42db27e176ee-config-data\") pod \"7673aa49-60cb-427a-b089-42db27e176ee\" (UID: \"7673aa49-60cb-427a-b089-42db27e176ee\") " Dec 01 08:43:21 crc kubenswrapper[5004]: I1201 08:43:21.540164 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7673aa49-60cb-427a-b089-42db27e176ee-combined-ca-bundle\") pod \"7673aa49-60cb-427a-b089-42db27e176ee\" (UID: \"7673aa49-60cb-427a-b089-42db27e176ee\") " Dec 01 08:43:21 crc kubenswrapper[5004]: I1201 08:43:21.540219 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7673aa49-60cb-427a-b089-42db27e176ee-logs\") pod \"7673aa49-60cb-427a-b089-42db27e176ee\" (UID: \"7673aa49-60cb-427a-b089-42db27e176ee\") " Dec 01 08:43:21 crc kubenswrapper[5004]: I1201 08:43:21.546013 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7673aa49-60cb-427a-b089-42db27e176ee-logs" (OuterVolumeSpecName: "logs") pod "7673aa49-60cb-427a-b089-42db27e176ee" (UID: "7673aa49-60cb-427a-b089-42db27e176ee"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 08:43:21 crc kubenswrapper[5004]: I1201 08:43:21.559787 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7673aa49-60cb-427a-b089-42db27e176ee-kube-api-access-5r24w" (OuterVolumeSpecName: "kube-api-access-5r24w") pod "7673aa49-60cb-427a-b089-42db27e176ee" (UID: "7673aa49-60cb-427a-b089-42db27e176ee"). InnerVolumeSpecName "kube-api-access-5r24w". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:43:21 crc kubenswrapper[5004]: I1201 08:43:21.610064 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7673aa49-60cb-427a-b089-42db27e176ee-config-data" (OuterVolumeSpecName: "config-data") pod "7673aa49-60cb-427a-b089-42db27e176ee" (UID: "7673aa49-60cb-427a-b089-42db27e176ee"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:43:21 crc kubenswrapper[5004]: I1201 08:43:21.614614 5004 generic.go:334] "Generic (PLEG): container finished" podID="7673aa49-60cb-427a-b089-42db27e176ee" containerID="97088024db1f02e2cfefe40f95b46e731ecfdf607491dd4db34ba16d64e22e64" exitCode=0 Dec 01 08:43:21 crc kubenswrapper[5004]: I1201 08:43:21.614700 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7673aa49-60cb-427a-b089-42db27e176ee","Type":"ContainerDied","Data":"97088024db1f02e2cfefe40f95b46e731ecfdf607491dd4db34ba16d64e22e64"} Dec 01 08:43:21 crc kubenswrapper[5004]: I1201 08:43:21.614734 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7673aa49-60cb-427a-b089-42db27e176ee","Type":"ContainerDied","Data":"71891789afc64ef3e2a0482dad7386a08cf6dd2ba50b75f23ecbdf12e89f7114"} Dec 01 08:43:21 crc kubenswrapper[5004]: I1201 08:43:21.614756 5004 scope.go:117] "RemoveContainer" containerID="97088024db1f02e2cfefe40f95b46e731ecfdf607491dd4db34ba16d64e22e64" Dec 01 08:43:21 crc 
kubenswrapper[5004]: I1201 08:43:21.614780 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 01 08:43:21 crc kubenswrapper[5004]: I1201 08:43:21.624699 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7673aa49-60cb-427a-b089-42db27e176ee-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7673aa49-60cb-427a-b089-42db27e176ee" (UID: "7673aa49-60cb-427a-b089-42db27e176ee"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:43:21 crc kubenswrapper[5004]: I1201 08:43:21.629863 5004 generic.go:334] "Generic (PLEG): container finished" podID="cf129260-3776-4aef-908f-17defb18949d" containerID="9d21786cfdb3689b03b08cae03431c6830bde6f9ee45b65f26bd50894c4acca3" exitCode=0 Dec 01 08:43:21 crc kubenswrapper[5004]: I1201 08:43:21.629925 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cf129260-3776-4aef-908f-17defb18949d","Type":"ContainerDied","Data":"9d21786cfdb3689b03b08cae03431c6830bde6f9ee45b65f26bd50894c4acca3"} Dec 01 08:43:21 crc kubenswrapper[5004]: I1201 08:43:21.644176 5004 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7673aa49-60cb-427a-b089-42db27e176ee-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 08:43:21 crc kubenswrapper[5004]: I1201 08:43:21.644204 5004 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7673aa49-60cb-427a-b089-42db27e176ee-logs\") on node \"crc\" DevicePath \"\"" Dec 01 08:43:21 crc kubenswrapper[5004]: I1201 08:43:21.644215 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5r24w\" (UniqueName: \"kubernetes.io/projected/7673aa49-60cb-427a-b089-42db27e176ee-kube-api-access-5r24w\") on node \"crc\" DevicePath \"\"" Dec 01 08:43:21 crc kubenswrapper[5004]: 
I1201 08:43:21.644226 5004 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7673aa49-60cb-427a-b089-42db27e176ee-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 08:43:21 crc kubenswrapper[5004]: I1201 08:43:21.644379 5004 scope.go:117] "RemoveContainer" containerID="60706ff456d9fc6048e6bf6d99347395c9dc671747d43be8126c7a9cb0e22c40" Dec 01 08:43:21 crc kubenswrapper[5004]: I1201 08:43:21.670836 5004 scope.go:117] "RemoveContainer" containerID="97088024db1f02e2cfefe40f95b46e731ecfdf607491dd4db34ba16d64e22e64" Dec 01 08:43:21 crc kubenswrapper[5004]: E1201 08:43:21.672069 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97088024db1f02e2cfefe40f95b46e731ecfdf607491dd4db34ba16d64e22e64\": container with ID starting with 97088024db1f02e2cfefe40f95b46e731ecfdf607491dd4db34ba16d64e22e64 not found: ID does not exist" containerID="97088024db1f02e2cfefe40f95b46e731ecfdf607491dd4db34ba16d64e22e64" Dec 01 08:43:21 crc kubenswrapper[5004]: I1201 08:43:21.672103 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97088024db1f02e2cfefe40f95b46e731ecfdf607491dd4db34ba16d64e22e64"} err="failed to get container status \"97088024db1f02e2cfefe40f95b46e731ecfdf607491dd4db34ba16d64e22e64\": rpc error: code = NotFound desc = could not find container \"97088024db1f02e2cfefe40f95b46e731ecfdf607491dd4db34ba16d64e22e64\": container with ID starting with 97088024db1f02e2cfefe40f95b46e731ecfdf607491dd4db34ba16d64e22e64 not found: ID does not exist" Dec 01 08:43:21 crc kubenswrapper[5004]: I1201 08:43:21.672124 5004 scope.go:117] "RemoveContainer" containerID="60706ff456d9fc6048e6bf6d99347395c9dc671747d43be8126c7a9cb0e22c40" Dec 01 08:43:21 crc kubenswrapper[5004]: E1201 08:43:21.672424 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find 
container \"60706ff456d9fc6048e6bf6d99347395c9dc671747d43be8126c7a9cb0e22c40\": container with ID starting with 60706ff456d9fc6048e6bf6d99347395c9dc671747d43be8126c7a9cb0e22c40 not found: ID does not exist" containerID="60706ff456d9fc6048e6bf6d99347395c9dc671747d43be8126c7a9cb0e22c40" Dec 01 08:43:21 crc kubenswrapper[5004]: I1201 08:43:21.672445 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60706ff456d9fc6048e6bf6d99347395c9dc671747d43be8126c7a9cb0e22c40"} err="failed to get container status \"60706ff456d9fc6048e6bf6d99347395c9dc671747d43be8126c7a9cb0e22c40\": rpc error: code = NotFound desc = could not find container \"60706ff456d9fc6048e6bf6d99347395c9dc671747d43be8126c7a9cb0e22c40\": container with ID starting with 60706ff456d9fc6048e6bf6d99347395c9dc671747d43be8126c7a9cb0e22c40 not found: ID does not exist" Dec 01 08:43:21 crc kubenswrapper[5004]: I1201 08:43:21.892248 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 08:43:21 crc kubenswrapper[5004]: I1201 08:43:21.957522 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 01 08:43:21 crc kubenswrapper[5004]: I1201 08:43:21.975526 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 01 08:43:21 crc kubenswrapper[5004]: I1201 08:43:21.989527 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 01 08:43:21 crc kubenswrapper[5004]: E1201 08:43:21.990065 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7673aa49-60cb-427a-b089-42db27e176ee" containerName="nova-api-log" Dec 01 08:43:21 crc kubenswrapper[5004]: I1201 08:43:21.990081 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="7673aa49-60cb-427a-b089-42db27e176ee" containerName="nova-api-log" Dec 01 08:43:21 crc kubenswrapper[5004]: E1201 08:43:21.990106 5004 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="cf129260-3776-4aef-908f-17defb18949d" containerName="proxy-httpd" Dec 01 08:43:21 crc kubenswrapper[5004]: I1201 08:43:21.990112 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf129260-3776-4aef-908f-17defb18949d" containerName="proxy-httpd" Dec 01 08:43:21 crc kubenswrapper[5004]: E1201 08:43:21.990126 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf129260-3776-4aef-908f-17defb18949d" containerName="sg-core" Dec 01 08:43:21 crc kubenswrapper[5004]: I1201 08:43:21.990133 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf129260-3776-4aef-908f-17defb18949d" containerName="sg-core" Dec 01 08:43:21 crc kubenswrapper[5004]: E1201 08:43:21.990150 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf129260-3776-4aef-908f-17defb18949d" containerName="ceilometer-central-agent" Dec 01 08:43:21 crc kubenswrapper[5004]: I1201 08:43:21.990155 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf129260-3776-4aef-908f-17defb18949d" containerName="ceilometer-central-agent" Dec 01 08:43:21 crc kubenswrapper[5004]: E1201 08:43:21.990164 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf129260-3776-4aef-908f-17defb18949d" containerName="ceilometer-notification-agent" Dec 01 08:43:21 crc kubenswrapper[5004]: I1201 08:43:21.990170 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf129260-3776-4aef-908f-17defb18949d" containerName="ceilometer-notification-agent" Dec 01 08:43:21 crc kubenswrapper[5004]: E1201 08:43:21.990189 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7673aa49-60cb-427a-b089-42db27e176ee" containerName="nova-api-api" Dec 01 08:43:21 crc kubenswrapper[5004]: I1201 08:43:21.990194 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="7673aa49-60cb-427a-b089-42db27e176ee" containerName="nova-api-api" Dec 01 08:43:21 crc kubenswrapper[5004]: I1201 08:43:21.990420 5004 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="cf129260-3776-4aef-908f-17defb18949d" containerName="ceilometer-central-agent" Dec 01 08:43:21 crc kubenswrapper[5004]: I1201 08:43:21.990435 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="7673aa49-60cb-427a-b089-42db27e176ee" containerName="nova-api-log" Dec 01 08:43:21 crc kubenswrapper[5004]: I1201 08:43:21.990443 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf129260-3776-4aef-908f-17defb18949d" containerName="sg-core" Dec 01 08:43:21 crc kubenswrapper[5004]: I1201 08:43:21.990451 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf129260-3776-4aef-908f-17defb18949d" containerName="ceilometer-notification-agent" Dec 01 08:43:21 crc kubenswrapper[5004]: I1201 08:43:21.990466 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="7673aa49-60cb-427a-b089-42db27e176ee" containerName="nova-api-api" Dec 01 08:43:21 crc kubenswrapper[5004]: I1201 08:43:21.990475 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf129260-3776-4aef-908f-17defb18949d" containerName="proxy-httpd" Dec 01 08:43:21 crc kubenswrapper[5004]: I1201 08:43:21.991727 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 01 08:43:21 crc kubenswrapper[5004]: I1201 08:43:21.994219 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 01 08:43:21 crc kubenswrapper[5004]: I1201 08:43:21.994656 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Dec 01 08:43:21 crc kubenswrapper[5004]: I1201 08:43:21.994828 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Dec 01 08:43:22 crc kubenswrapper[5004]: I1201 08:43:22.002848 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 01 08:43:22 crc kubenswrapper[5004]: I1201 08:43:22.052787 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cf129260-3776-4aef-908f-17defb18949d-run-httpd\") pod \"cf129260-3776-4aef-908f-17defb18949d\" (UID: \"cf129260-3776-4aef-908f-17defb18949d\") " Dec 01 08:43:22 crc kubenswrapper[5004]: I1201 08:43:22.052849 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf129260-3776-4aef-908f-17defb18949d-combined-ca-bundle\") pod \"cf129260-3776-4aef-908f-17defb18949d\" (UID: \"cf129260-3776-4aef-908f-17defb18949d\") " Dec 01 08:43:22 crc kubenswrapper[5004]: I1201 08:43:22.052924 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cf129260-3776-4aef-908f-17defb18949d-log-httpd\") pod \"cf129260-3776-4aef-908f-17defb18949d\" (UID: \"cf129260-3776-4aef-908f-17defb18949d\") " Dec 01 08:43:22 crc kubenswrapper[5004]: I1201 08:43:22.053015 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf129260-3776-4aef-908f-17defb18949d-scripts\") pod 
\"cf129260-3776-4aef-908f-17defb18949d\" (UID: \"cf129260-3776-4aef-908f-17defb18949d\") " Dec 01 08:43:22 crc kubenswrapper[5004]: I1201 08:43:22.053078 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf129260-3776-4aef-908f-17defb18949d-config-data\") pod \"cf129260-3776-4aef-908f-17defb18949d\" (UID: \"cf129260-3776-4aef-908f-17defb18949d\") " Dec 01 08:43:22 crc kubenswrapper[5004]: I1201 08:43:22.053145 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf129260-3776-4aef-908f-17defb18949d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "cf129260-3776-4aef-908f-17defb18949d" (UID: "cf129260-3776-4aef-908f-17defb18949d"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 08:43:22 crc kubenswrapper[5004]: I1201 08:43:22.053263 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4772\" (UniqueName: \"kubernetes.io/projected/cf129260-3776-4aef-908f-17defb18949d-kube-api-access-x4772\") pod \"cf129260-3776-4aef-908f-17defb18949d\" (UID: \"cf129260-3776-4aef-908f-17defb18949d\") " Dec 01 08:43:22 crc kubenswrapper[5004]: I1201 08:43:22.053346 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf129260-3776-4aef-908f-17defb18949d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "cf129260-3776-4aef-908f-17defb18949d" (UID: "cf129260-3776-4aef-908f-17defb18949d"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 08:43:22 crc kubenswrapper[5004]: I1201 08:43:22.053401 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cf129260-3776-4aef-908f-17defb18949d-sg-core-conf-yaml\") pod \"cf129260-3776-4aef-908f-17defb18949d\" (UID: \"cf129260-3776-4aef-908f-17defb18949d\") " Dec 01 08:43:22 crc kubenswrapper[5004]: I1201 08:43:22.054108 5004 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cf129260-3776-4aef-908f-17defb18949d-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 01 08:43:22 crc kubenswrapper[5004]: I1201 08:43:22.054147 5004 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cf129260-3776-4aef-908f-17defb18949d-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 01 08:43:22 crc kubenswrapper[5004]: I1201 08:43:22.060755 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf129260-3776-4aef-908f-17defb18949d-kube-api-access-x4772" (OuterVolumeSpecName: "kube-api-access-x4772") pod "cf129260-3776-4aef-908f-17defb18949d" (UID: "cf129260-3776-4aef-908f-17defb18949d"). InnerVolumeSpecName "kube-api-access-x4772". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:43:22 crc kubenswrapper[5004]: I1201 08:43:22.073724 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf129260-3776-4aef-908f-17defb18949d-scripts" (OuterVolumeSpecName: "scripts") pod "cf129260-3776-4aef-908f-17defb18949d" (UID: "cf129260-3776-4aef-908f-17defb18949d"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:43:22 crc kubenswrapper[5004]: I1201 08:43:22.095143 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf129260-3776-4aef-908f-17defb18949d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "cf129260-3776-4aef-908f-17defb18949d" (UID: "cf129260-3776-4aef-908f-17defb18949d"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:43:22 crc kubenswrapper[5004]: I1201 08:43:22.156478 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce6cfc91-0fab-40a6-ae94-9cd690cfde01-public-tls-certs\") pod \"nova-api-0\" (UID: \"ce6cfc91-0fab-40a6-ae94-9cd690cfde01\") " pod="openstack/nova-api-0" Dec 01 08:43:22 crc kubenswrapper[5004]: I1201 08:43:22.156586 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce6cfc91-0fab-40a6-ae94-9cd690cfde01-internal-tls-certs\") pod \"nova-api-0\" (UID: \"ce6cfc91-0fab-40a6-ae94-9cd690cfde01\") " pod="openstack/nova-api-0" Dec 01 08:43:22 crc kubenswrapper[5004]: I1201 08:43:22.156683 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tfdh\" (UniqueName: \"kubernetes.io/projected/ce6cfc91-0fab-40a6-ae94-9cd690cfde01-kube-api-access-8tfdh\") pod \"nova-api-0\" (UID: \"ce6cfc91-0fab-40a6-ae94-9cd690cfde01\") " pod="openstack/nova-api-0" Dec 01 08:43:22 crc kubenswrapper[5004]: I1201 08:43:22.156715 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce6cfc91-0fab-40a6-ae94-9cd690cfde01-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ce6cfc91-0fab-40a6-ae94-9cd690cfde01\") " 
pod="openstack/nova-api-0" Dec 01 08:43:22 crc kubenswrapper[5004]: I1201 08:43:22.156815 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ce6cfc91-0fab-40a6-ae94-9cd690cfde01-logs\") pod \"nova-api-0\" (UID: \"ce6cfc91-0fab-40a6-ae94-9cd690cfde01\") " pod="openstack/nova-api-0" Dec 01 08:43:22 crc kubenswrapper[5004]: I1201 08:43:22.157040 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce6cfc91-0fab-40a6-ae94-9cd690cfde01-config-data\") pod \"nova-api-0\" (UID: \"ce6cfc91-0fab-40a6-ae94-9cd690cfde01\") " pod="openstack/nova-api-0" Dec 01 08:43:22 crc kubenswrapper[5004]: I1201 08:43:22.157132 5004 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf129260-3776-4aef-908f-17defb18949d-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 08:43:22 crc kubenswrapper[5004]: I1201 08:43:22.157150 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4772\" (UniqueName: \"kubernetes.io/projected/cf129260-3776-4aef-908f-17defb18949d-kube-api-access-x4772\") on node \"crc\" DevicePath \"\"" Dec 01 08:43:22 crc kubenswrapper[5004]: I1201 08:43:22.157197 5004 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cf129260-3776-4aef-908f-17defb18949d-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 01 08:43:22 crc kubenswrapper[5004]: I1201 08:43:22.162359 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf129260-3776-4aef-908f-17defb18949d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cf129260-3776-4aef-908f-17defb18949d" (UID: "cf129260-3776-4aef-908f-17defb18949d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:43:22 crc kubenswrapper[5004]: I1201 08:43:22.212138 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf129260-3776-4aef-908f-17defb18949d-config-data" (OuterVolumeSpecName: "config-data") pod "cf129260-3776-4aef-908f-17defb18949d" (UID: "cf129260-3776-4aef-908f-17defb18949d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:43:22 crc kubenswrapper[5004]: I1201 08:43:22.259228 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ce6cfc91-0fab-40a6-ae94-9cd690cfde01-logs\") pod \"nova-api-0\" (UID: \"ce6cfc91-0fab-40a6-ae94-9cd690cfde01\") " pod="openstack/nova-api-0" Dec 01 08:43:22 crc kubenswrapper[5004]: I1201 08:43:22.259493 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce6cfc91-0fab-40a6-ae94-9cd690cfde01-config-data\") pod \"nova-api-0\" (UID: \"ce6cfc91-0fab-40a6-ae94-9cd690cfde01\") " pod="openstack/nova-api-0" Dec 01 08:43:22 crc kubenswrapper[5004]: I1201 08:43:22.261747 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ce6cfc91-0fab-40a6-ae94-9cd690cfde01-logs\") pod \"nova-api-0\" (UID: \"ce6cfc91-0fab-40a6-ae94-9cd690cfde01\") " pod="openstack/nova-api-0" Dec 01 08:43:22 crc kubenswrapper[5004]: I1201 08:43:22.262574 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce6cfc91-0fab-40a6-ae94-9cd690cfde01-public-tls-certs\") pod \"nova-api-0\" (UID: \"ce6cfc91-0fab-40a6-ae94-9cd690cfde01\") " pod="openstack/nova-api-0" Dec 01 08:43:22 crc kubenswrapper[5004]: I1201 08:43:22.262666 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce6cfc91-0fab-40a6-ae94-9cd690cfde01-internal-tls-certs\") pod \"nova-api-0\" (UID: \"ce6cfc91-0fab-40a6-ae94-9cd690cfde01\") " pod="openstack/nova-api-0" Dec 01 08:43:22 crc kubenswrapper[5004]: I1201 08:43:22.262811 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8tfdh\" (UniqueName: \"kubernetes.io/projected/ce6cfc91-0fab-40a6-ae94-9cd690cfde01-kube-api-access-8tfdh\") pod \"nova-api-0\" (UID: \"ce6cfc91-0fab-40a6-ae94-9cd690cfde01\") " pod="openstack/nova-api-0" Dec 01 08:43:22 crc kubenswrapper[5004]: I1201 08:43:22.262848 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce6cfc91-0fab-40a6-ae94-9cd690cfde01-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ce6cfc91-0fab-40a6-ae94-9cd690cfde01\") " pod="openstack/nova-api-0" Dec 01 08:43:22 crc kubenswrapper[5004]: I1201 08:43:22.263211 5004 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf129260-3776-4aef-908f-17defb18949d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 08:43:22 crc kubenswrapper[5004]: I1201 08:43:22.264245 5004 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf129260-3776-4aef-908f-17defb18949d-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 08:43:22 crc kubenswrapper[5004]: I1201 08:43:22.267168 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce6cfc91-0fab-40a6-ae94-9cd690cfde01-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ce6cfc91-0fab-40a6-ae94-9cd690cfde01\") " pod="openstack/nova-api-0" Dec 01 08:43:22 crc kubenswrapper[5004]: I1201 08:43:22.267194 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/ce6cfc91-0fab-40a6-ae94-9cd690cfde01-internal-tls-certs\") pod \"nova-api-0\" (UID: \"ce6cfc91-0fab-40a6-ae94-9cd690cfde01\") " pod="openstack/nova-api-0" Dec 01 08:43:22 crc kubenswrapper[5004]: I1201 08:43:22.267473 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce6cfc91-0fab-40a6-ae94-9cd690cfde01-config-data\") pod \"nova-api-0\" (UID: \"ce6cfc91-0fab-40a6-ae94-9cd690cfde01\") " pod="openstack/nova-api-0" Dec 01 08:43:22 crc kubenswrapper[5004]: I1201 08:43:22.268585 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce6cfc91-0fab-40a6-ae94-9cd690cfde01-public-tls-certs\") pod \"nova-api-0\" (UID: \"ce6cfc91-0fab-40a6-ae94-9cd690cfde01\") " pod="openstack/nova-api-0" Dec 01 08:43:22 crc kubenswrapper[5004]: I1201 08:43:22.300513 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8tfdh\" (UniqueName: \"kubernetes.io/projected/ce6cfc91-0fab-40a6-ae94-9cd690cfde01-kube-api-access-8tfdh\") pod \"nova-api-0\" (UID: \"ce6cfc91-0fab-40a6-ae94-9cd690cfde01\") " pod="openstack/nova-api-0" Dec 01 08:43:22 crc kubenswrapper[5004]: I1201 08:43:22.324936 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 01 08:43:22 crc kubenswrapper[5004]: I1201 08:43:22.553959 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Dec 01 08:43:22 crc kubenswrapper[5004]: I1201 08:43:22.656929 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cf129260-3776-4aef-908f-17defb18949d","Type":"ContainerDied","Data":"345c303513ce1d47ab0c4c8c7bcb58dcf32cb398e38b23efc795fcd1bb8a9d4f"} Dec 01 08:43:22 crc kubenswrapper[5004]: I1201 08:43:22.657008 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 01 08:43:22 crc kubenswrapper[5004]: I1201 08:43:22.657023 5004 scope.go:117] "RemoveContainer" containerID="22ffa35445670733c02423becdd59f1ce90f1b9c2d0992fd12cb54d9c79e693a" Dec 01 08:43:22 crc kubenswrapper[5004]: I1201 08:43:22.664544 5004 generic.go:334] "Generic (PLEG): container finished" podID="6792a811-4156-4b0a-b6b4-6ff3280229d2" containerID="e789ac7dbb636b11509724961c9b1b5f2335aa9d96fb8d46c0eba9d201fadfd9" exitCode=137 Dec 01 08:43:22 crc kubenswrapper[5004]: I1201 08:43:22.664613 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"6792a811-4156-4b0a-b6b4-6ff3280229d2","Type":"ContainerDied","Data":"e789ac7dbb636b11509724961c9b1b5f2335aa9d96fb8d46c0eba9d201fadfd9"} Dec 01 08:43:22 crc kubenswrapper[5004]: I1201 08:43:22.664650 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Dec 01 08:43:22 crc kubenswrapper[5004]: I1201 08:43:22.664672 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"6792a811-4156-4b0a-b6b4-6ff3280229d2","Type":"ContainerDied","Data":"96da3a799c73646c16fc155df5e8fd5c622c1f9a4d244cbfbef65a50cdeb0f90"} Dec 01 08:43:22 crc kubenswrapper[5004]: I1201 08:43:22.679977 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6792a811-4156-4b0a-b6b4-6ff3280229d2-config-data\") pod \"6792a811-4156-4b0a-b6b4-6ff3280229d2\" (UID: \"6792a811-4156-4b0a-b6b4-6ff3280229d2\") " Dec 01 08:43:22 crc kubenswrapper[5004]: I1201 08:43:22.680060 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6792a811-4156-4b0a-b6b4-6ff3280229d2-combined-ca-bundle\") pod \"6792a811-4156-4b0a-b6b4-6ff3280229d2\" (UID: \"6792a811-4156-4b0a-b6b4-6ff3280229d2\") " Dec 01 08:43:22 crc kubenswrapper[5004]: I1201 
08:43:22.680089 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vnsxn\" (UniqueName: \"kubernetes.io/projected/6792a811-4156-4b0a-b6b4-6ff3280229d2-kube-api-access-vnsxn\") pod \"6792a811-4156-4b0a-b6b4-6ff3280229d2\" (UID: \"6792a811-4156-4b0a-b6b4-6ff3280229d2\") " Dec 01 08:43:22 crc kubenswrapper[5004]: I1201 08:43:22.680227 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6792a811-4156-4b0a-b6b4-6ff3280229d2-scripts\") pod \"6792a811-4156-4b0a-b6b4-6ff3280229d2\" (UID: \"6792a811-4156-4b0a-b6b4-6ff3280229d2\") " Dec 01 08:43:22 crc kubenswrapper[5004]: I1201 08:43:22.685775 5004 scope.go:117] "RemoveContainer" containerID="83caf7713f77ee17e875f3fe279b1d1afa85abc52ab689445cd0bfad842e8d0c" Dec 01 08:43:22 crc kubenswrapper[5004]: I1201 08:43:22.701174 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6792a811-4156-4b0a-b6b4-6ff3280229d2-scripts" (OuterVolumeSpecName: "scripts") pod "6792a811-4156-4b0a-b6b4-6ff3280229d2" (UID: "6792a811-4156-4b0a-b6b4-6ff3280229d2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:43:22 crc kubenswrapper[5004]: I1201 08:43:22.713349 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 08:43:22 crc kubenswrapper[5004]: I1201 08:43:22.726905 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6792a811-4156-4b0a-b6b4-6ff3280229d2-kube-api-access-vnsxn" (OuterVolumeSpecName: "kube-api-access-vnsxn") pod "6792a811-4156-4b0a-b6b4-6ff3280229d2" (UID: "6792a811-4156-4b0a-b6b4-6ff3280229d2"). InnerVolumeSpecName "kube-api-access-vnsxn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:43:22 crc kubenswrapper[5004]: I1201 08:43:22.734257 5004 scope.go:117] "RemoveContainer" containerID="9d21786cfdb3689b03b08cae03431c6830bde6f9ee45b65f26bd50894c4acca3" Dec 01 08:43:22 crc kubenswrapper[5004]: I1201 08:43:22.753969 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 01 08:43:22 crc kubenswrapper[5004]: I1201 08:43:22.773074 5004 scope.go:117] "RemoveContainer" containerID="fc776daf0808f78130eee654c714b6c4ee668c18424442260d9af29df0f5d80a" Dec 01 08:43:22 crc kubenswrapper[5004]: I1201 08:43:22.784946 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vnsxn\" (UniqueName: \"kubernetes.io/projected/6792a811-4156-4b0a-b6b4-6ff3280229d2-kube-api-access-vnsxn\") on node \"crc\" DevicePath \"\"" Dec 01 08:43:22 crc kubenswrapper[5004]: I1201 08:43:22.784979 5004 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6792a811-4156-4b0a-b6b4-6ff3280229d2-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 08:43:22 crc kubenswrapper[5004]: I1201 08:43:22.798746 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7673aa49-60cb-427a-b089-42db27e176ee" path="/var/lib/kubelet/pods/7673aa49-60cb-427a-b089-42db27e176ee/volumes" Dec 01 08:43:22 crc kubenswrapper[5004]: I1201 08:43:22.800445 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf129260-3776-4aef-908f-17defb18949d" path="/var/lib/kubelet/pods/cf129260-3776-4aef-908f-17defb18949d/volumes" Dec 01 08:43:22 crc kubenswrapper[5004]: I1201 08:43:22.811267 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 01 08:43:22 crc kubenswrapper[5004]: E1201 08:43:22.811937 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6792a811-4156-4b0a-b6b4-6ff3280229d2" containerName="aodh-api" Dec 01 08:43:22 crc kubenswrapper[5004]: I1201 
08:43:22.811951 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="6792a811-4156-4b0a-b6b4-6ff3280229d2" containerName="aodh-api" Dec 01 08:43:22 crc kubenswrapper[5004]: E1201 08:43:22.811984 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6792a811-4156-4b0a-b6b4-6ff3280229d2" containerName="aodh-evaluator" Dec 01 08:43:22 crc kubenswrapper[5004]: I1201 08:43:22.811990 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="6792a811-4156-4b0a-b6b4-6ff3280229d2" containerName="aodh-evaluator" Dec 01 08:43:22 crc kubenswrapper[5004]: E1201 08:43:22.812006 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6792a811-4156-4b0a-b6b4-6ff3280229d2" containerName="aodh-notifier" Dec 01 08:43:22 crc kubenswrapper[5004]: I1201 08:43:22.812012 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="6792a811-4156-4b0a-b6b4-6ff3280229d2" containerName="aodh-notifier" Dec 01 08:43:22 crc kubenswrapper[5004]: E1201 08:43:22.812020 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6792a811-4156-4b0a-b6b4-6ff3280229d2" containerName="aodh-listener" Dec 01 08:43:22 crc kubenswrapper[5004]: I1201 08:43:22.812026 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="6792a811-4156-4b0a-b6b4-6ff3280229d2" containerName="aodh-listener" Dec 01 08:43:22 crc kubenswrapper[5004]: I1201 08:43:22.812245 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="6792a811-4156-4b0a-b6b4-6ff3280229d2" containerName="aodh-evaluator" Dec 01 08:43:22 crc kubenswrapper[5004]: I1201 08:43:22.812257 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="6792a811-4156-4b0a-b6b4-6ff3280229d2" containerName="aodh-api" Dec 01 08:43:22 crc kubenswrapper[5004]: I1201 08:43:22.812278 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="6792a811-4156-4b0a-b6b4-6ff3280229d2" containerName="aodh-listener" Dec 01 08:43:22 crc kubenswrapper[5004]: I1201 08:43:22.812288 5004 
memory_manager.go:354] "RemoveStaleState removing state" podUID="6792a811-4156-4b0a-b6b4-6ff3280229d2" containerName="aodh-notifier" Dec 01 08:43:22 crc kubenswrapper[5004]: I1201 08:43:22.814724 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 08:43:22 crc kubenswrapper[5004]: I1201 08:43:22.814832 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 08:43:22 crc kubenswrapper[5004]: I1201 08:43:22.819509 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 01 08:43:22 crc kubenswrapper[5004]: I1201 08:43:22.819744 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 01 08:43:22 crc kubenswrapper[5004]: I1201 08:43:22.833649 5004 scope.go:117] "RemoveContainer" containerID="e789ac7dbb636b11509724961c9b1b5f2335aa9d96fb8d46c0eba9d201fadfd9" Dec 01 08:43:22 crc kubenswrapper[5004]: I1201 08:43:22.863868 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 01 08:43:22 crc kubenswrapper[5004]: I1201 08:43:22.873747 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6792a811-4156-4b0a-b6b4-6ff3280229d2-config-data" (OuterVolumeSpecName: "config-data") pod "6792a811-4156-4b0a-b6b4-6ff3280229d2" (UID: "6792a811-4156-4b0a-b6b4-6ff3280229d2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:43:22 crc kubenswrapper[5004]: I1201 08:43:22.886992 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65ef6d1f-ede0-4db1-b3a4-005921b1ce6b-scripts\") pod \"ceilometer-0\" (UID: \"65ef6d1f-ede0-4db1-b3a4-005921b1ce6b\") " pod="openstack/ceilometer-0" Dec 01 08:43:22 crc kubenswrapper[5004]: I1201 08:43:22.887259 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65ef6d1f-ede0-4db1-b3a4-005921b1ce6b-config-data\") pod \"ceilometer-0\" (UID: \"65ef6d1f-ede0-4db1-b3a4-005921b1ce6b\") " pod="openstack/ceilometer-0" Dec 01 08:43:22 crc kubenswrapper[5004]: I1201 08:43:22.887297 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/65ef6d1f-ede0-4db1-b3a4-005921b1ce6b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"65ef6d1f-ede0-4db1-b3a4-005921b1ce6b\") " pod="openstack/ceilometer-0" Dec 01 08:43:22 crc kubenswrapper[5004]: I1201 08:43:22.887375 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65ef6d1f-ede0-4db1-b3a4-005921b1ce6b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"65ef6d1f-ede0-4db1-b3a4-005921b1ce6b\") " pod="openstack/ceilometer-0" Dec 01 08:43:22 crc kubenswrapper[5004]: I1201 08:43:22.887471 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/65ef6d1f-ede0-4db1-b3a4-005921b1ce6b-log-httpd\") pod \"ceilometer-0\" (UID: \"65ef6d1f-ede0-4db1-b3a4-005921b1ce6b\") " pod="openstack/ceilometer-0" Dec 01 08:43:22 crc kubenswrapper[5004]: I1201 08:43:22.887529 5004 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/65ef6d1f-ede0-4db1-b3a4-005921b1ce6b-run-httpd\") pod \"ceilometer-0\" (UID: \"65ef6d1f-ede0-4db1-b3a4-005921b1ce6b\") " pod="openstack/ceilometer-0" Dec 01 08:43:22 crc kubenswrapper[5004]: I1201 08:43:22.887607 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5bjr\" (UniqueName: \"kubernetes.io/projected/65ef6d1f-ede0-4db1-b3a4-005921b1ce6b-kube-api-access-n5bjr\") pod \"ceilometer-0\" (UID: \"65ef6d1f-ede0-4db1-b3a4-005921b1ce6b\") " pod="openstack/ceilometer-0" Dec 01 08:43:22 crc kubenswrapper[5004]: I1201 08:43:22.887656 5004 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6792a811-4156-4b0a-b6b4-6ff3280229d2-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 08:43:22 crc kubenswrapper[5004]: I1201 08:43:22.908685 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6792a811-4156-4b0a-b6b4-6ff3280229d2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6792a811-4156-4b0a-b6b4-6ff3280229d2" (UID: "6792a811-4156-4b0a-b6b4-6ff3280229d2"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:43:22 crc kubenswrapper[5004]: I1201 08:43:22.930225 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Dec 01 08:43:22 crc kubenswrapper[5004]: I1201 08:43:22.952037 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Dec 01 08:43:22 crc kubenswrapper[5004]: I1201 08:43:22.989677 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/65ef6d1f-ede0-4db1-b3a4-005921b1ce6b-log-httpd\") pod \"ceilometer-0\" (UID: \"65ef6d1f-ede0-4db1-b3a4-005921b1ce6b\") " pod="openstack/ceilometer-0" Dec 01 08:43:22 crc kubenswrapper[5004]: I1201 08:43:22.989784 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/65ef6d1f-ede0-4db1-b3a4-005921b1ce6b-run-httpd\") pod \"ceilometer-0\" (UID: \"65ef6d1f-ede0-4db1-b3a4-005921b1ce6b\") " pod="openstack/ceilometer-0" Dec 01 08:43:22 crc kubenswrapper[5004]: I1201 08:43:22.989868 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5bjr\" (UniqueName: \"kubernetes.io/projected/65ef6d1f-ede0-4db1-b3a4-005921b1ce6b-kube-api-access-n5bjr\") pod \"ceilometer-0\" (UID: \"65ef6d1f-ede0-4db1-b3a4-005921b1ce6b\") " pod="openstack/ceilometer-0" Dec 01 08:43:22 crc kubenswrapper[5004]: I1201 08:43:22.989927 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65ef6d1f-ede0-4db1-b3a4-005921b1ce6b-scripts\") pod \"ceilometer-0\" (UID: \"65ef6d1f-ede0-4db1-b3a4-005921b1ce6b\") " pod="openstack/ceilometer-0" Dec 01 08:43:22 crc kubenswrapper[5004]: I1201 08:43:22.989951 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/65ef6d1f-ede0-4db1-b3a4-005921b1ce6b-config-data\") pod \"ceilometer-0\" (UID: \"65ef6d1f-ede0-4db1-b3a4-005921b1ce6b\") " pod="openstack/ceilometer-0" Dec 01 08:43:22 crc kubenswrapper[5004]: I1201 08:43:22.989975 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/65ef6d1f-ede0-4db1-b3a4-005921b1ce6b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"65ef6d1f-ede0-4db1-b3a4-005921b1ce6b\") " pod="openstack/ceilometer-0" Dec 01 08:43:22 crc kubenswrapper[5004]: I1201 08:43:22.990039 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65ef6d1f-ede0-4db1-b3a4-005921b1ce6b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"65ef6d1f-ede0-4db1-b3a4-005921b1ce6b\") " pod="openstack/ceilometer-0" Dec 01 08:43:22 crc kubenswrapper[5004]: I1201 08:43:22.990153 5004 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6792a811-4156-4b0a-b6b4-6ff3280229d2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 08:43:22 crc kubenswrapper[5004]: I1201 08:43:22.991113 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/65ef6d1f-ede0-4db1-b3a4-005921b1ce6b-run-httpd\") pod \"ceilometer-0\" (UID: \"65ef6d1f-ede0-4db1-b3a4-005921b1ce6b\") " pod="openstack/ceilometer-0" Dec 01 08:43:22 crc kubenswrapper[5004]: I1201 08:43:22.991162 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/65ef6d1f-ede0-4db1-b3a4-005921b1ce6b-log-httpd\") pod \"ceilometer-0\" (UID: \"65ef6d1f-ede0-4db1-b3a4-005921b1ce6b\") " pod="openstack/ceilometer-0" Dec 01 08:43:22 crc kubenswrapper[5004]: I1201 08:43:22.994596 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65ef6d1f-ede0-4db1-b3a4-005921b1ce6b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"65ef6d1f-ede0-4db1-b3a4-005921b1ce6b\") " pod="openstack/ceilometer-0" Dec 01 08:43:22 crc kubenswrapper[5004]: I1201 08:43:22.994855 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65ef6d1f-ede0-4db1-b3a4-005921b1ce6b-scripts\") pod \"ceilometer-0\" (UID: \"65ef6d1f-ede0-4db1-b3a4-005921b1ce6b\") " pod="openstack/ceilometer-0" Dec 01 08:43:22 crc kubenswrapper[5004]: I1201 08:43:22.995158 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65ef6d1f-ede0-4db1-b3a4-005921b1ce6b-config-data\") pod \"ceilometer-0\" (UID: \"65ef6d1f-ede0-4db1-b3a4-005921b1ce6b\") " pod="openstack/ceilometer-0" Dec 01 08:43:22 crc kubenswrapper[5004]: I1201 08:43:22.997962 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/65ef6d1f-ede0-4db1-b3a4-005921b1ce6b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"65ef6d1f-ede0-4db1-b3a4-005921b1ce6b\") " pod="openstack/ceilometer-0" Dec 01 08:43:23 crc kubenswrapper[5004]: I1201 08:43:23.007831 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5bjr\" (UniqueName: \"kubernetes.io/projected/65ef6d1f-ede0-4db1-b3a4-005921b1ce6b-kube-api-access-n5bjr\") pod \"ceilometer-0\" (UID: \"65ef6d1f-ede0-4db1-b3a4-005921b1ce6b\") " pod="openstack/ceilometer-0" Dec 01 08:43:23 crc kubenswrapper[5004]: I1201 08:43:23.098767 5004 scope.go:117] "RemoveContainer" containerID="3997889ffc5431fc2825ff310e60d0300319a23df2695880a877b33ad1ff73a7" Dec 01 08:43:23 crc kubenswrapper[5004]: I1201 08:43:23.108357 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0"
Dec 01 08:43:23 crc kubenswrapper[5004]: I1201 08:43:23.165978 5004 scope.go:117] "RemoveContainer" containerID="2f13ef7709655a3d8d5548c58b0d41f1559b92eed686d7880bfceca816c21aad"
Dec 01 08:43:23 crc kubenswrapper[5004]: I1201 08:43:23.166109 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"]
Dec 01 08:43:23 crc kubenswrapper[5004]: I1201 08:43:23.177568 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-0"]
Dec 01 08:43:23 crc kubenswrapper[5004]: I1201 08:43:23.188952 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"]
Dec 01 08:43:23 crc kubenswrapper[5004]: I1201 08:43:23.214593 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0"
Dec 01 08:43:23 crc kubenswrapper[5004]: I1201 08:43:23.220718 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts"
Dec 01 08:43:23 crc kubenswrapper[5004]: I1201 08:43:23.220923 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-hrc7d"
Dec 01 08:43:23 crc kubenswrapper[5004]: I1201 08:43:23.221906 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-public-svc"
Dec 01 08:43:23 crc kubenswrapper[5004]: I1201 08:43:23.222114 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data"
Dec 01 08:43:23 crc kubenswrapper[5004]: I1201 08:43:23.222430 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-internal-svc"
Dec 01 08:43:23 crc kubenswrapper[5004]: I1201 08:43:23.253046 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"]
Dec 01 08:43:23 crc kubenswrapper[5004]: I1201 08:43:23.270350 5004 scope.go:117] "RemoveContainer" containerID="9ab23b6a648f9483fa5184cfcbb95b1a2b97c9bcd584bcdead439cfe601c29e3"
Dec 01 08:43:23 crc kubenswrapper[5004]: I1201 08:43:23.319095 5004 scope.go:117] "RemoveContainer" containerID="e789ac7dbb636b11509724961c9b1b5f2335aa9d96fb8d46c0eba9d201fadfd9"
Dec 01 08:43:23 crc kubenswrapper[5004]: E1201 08:43:23.320094 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e789ac7dbb636b11509724961c9b1b5f2335aa9d96fb8d46c0eba9d201fadfd9\": container with ID starting with e789ac7dbb636b11509724961c9b1b5f2335aa9d96fb8d46c0eba9d201fadfd9 not found: ID does not exist" containerID="e789ac7dbb636b11509724961c9b1b5f2335aa9d96fb8d46c0eba9d201fadfd9"
Dec 01 08:43:23 crc kubenswrapper[5004]: I1201 08:43:23.320129 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e789ac7dbb636b11509724961c9b1b5f2335aa9d96fb8d46c0eba9d201fadfd9"} err="failed to get container status \"e789ac7dbb636b11509724961c9b1b5f2335aa9d96fb8d46c0eba9d201fadfd9\": rpc error: code = NotFound desc = could not find container \"e789ac7dbb636b11509724961c9b1b5f2335aa9d96fb8d46c0eba9d201fadfd9\": container with ID starting with e789ac7dbb636b11509724961c9b1b5f2335aa9d96fb8d46c0eba9d201fadfd9 not found: ID does not exist"
Dec 01 08:43:23 crc kubenswrapper[5004]: I1201 08:43:23.320150 5004 scope.go:117] "RemoveContainer" containerID="3997889ffc5431fc2825ff310e60d0300319a23df2695880a877b33ad1ff73a7"
Dec 01 08:43:23 crc kubenswrapper[5004]: E1201 08:43:23.320543 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3997889ffc5431fc2825ff310e60d0300319a23df2695880a877b33ad1ff73a7\": container with ID starting with 3997889ffc5431fc2825ff310e60d0300319a23df2695880a877b33ad1ff73a7 not found: ID does not exist" containerID="3997889ffc5431fc2825ff310e60d0300319a23df2695880a877b33ad1ff73a7"
Dec 01 08:43:23 crc kubenswrapper[5004]: I1201 08:43:23.320639 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3997889ffc5431fc2825ff310e60d0300319a23df2695880a877b33ad1ff73a7"} err="failed to get container status \"3997889ffc5431fc2825ff310e60d0300319a23df2695880a877b33ad1ff73a7\": rpc error: code = NotFound desc = could not find container \"3997889ffc5431fc2825ff310e60d0300319a23df2695880a877b33ad1ff73a7\": container with ID starting with 3997889ffc5431fc2825ff310e60d0300319a23df2695880a877b33ad1ff73a7 not found: ID does not exist"
Dec 01 08:43:23 crc kubenswrapper[5004]: I1201 08:43:23.320666 5004 scope.go:117] "RemoveContainer" containerID="2f13ef7709655a3d8d5548c58b0d41f1559b92eed686d7880bfceca816c21aad"
Dec 01 08:43:23 crc kubenswrapper[5004]: E1201 08:43:23.321152 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f13ef7709655a3d8d5548c58b0d41f1559b92eed686d7880bfceca816c21aad\": container with ID starting with 2f13ef7709655a3d8d5548c58b0d41f1559b92eed686d7880bfceca816c21aad not found: ID does not exist" containerID="2f13ef7709655a3d8d5548c58b0d41f1559b92eed686d7880bfceca816c21aad"
Dec 01 08:43:23 crc kubenswrapper[5004]: I1201 08:43:23.321215 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f13ef7709655a3d8d5548c58b0d41f1559b92eed686d7880bfceca816c21aad"} err="failed to get container status \"2f13ef7709655a3d8d5548c58b0d41f1559b92eed686d7880bfceca816c21aad\": rpc error: code = NotFound desc = could not find container \"2f13ef7709655a3d8d5548c58b0d41f1559b92eed686d7880bfceca816c21aad\": container with ID starting with 2f13ef7709655a3d8d5548c58b0d41f1559b92eed686d7880bfceca816c21aad not found: ID does not exist"
Dec 01 08:43:23 crc kubenswrapper[5004]: I1201 08:43:23.321243 5004 scope.go:117] "RemoveContainer" containerID="9ab23b6a648f9483fa5184cfcbb95b1a2b97c9bcd584bcdead439cfe601c29e3"
Dec 01 08:43:23 crc kubenswrapper[5004]: E1201 08:43:23.321541 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ab23b6a648f9483fa5184cfcbb95b1a2b97c9bcd584bcdead439cfe601c29e3\": container with ID starting with 9ab23b6a648f9483fa5184cfcbb95b1a2b97c9bcd584bcdead439cfe601c29e3 not found: ID does not exist" containerID="9ab23b6a648f9483fa5184cfcbb95b1a2b97c9bcd584bcdead439cfe601c29e3"
Dec 01 08:43:23 crc kubenswrapper[5004]: I1201 08:43:23.321582 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ab23b6a648f9483fa5184cfcbb95b1a2b97c9bcd584bcdead439cfe601c29e3"} err="failed to get container status \"9ab23b6a648f9483fa5184cfcbb95b1a2b97c9bcd584bcdead439cfe601c29e3\": rpc error: code = NotFound desc = could not find container \"9ab23b6a648f9483fa5184cfcbb95b1a2b97c9bcd584bcdead439cfe601c29e3\": container with ID starting with 9ab23b6a648f9483fa5184cfcbb95b1a2b97c9bcd584bcdead439cfe601c29e3 not found: ID does not exist"
Dec 01 08:43:23 crc kubenswrapper[5004]: I1201 08:43:23.334990 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/254314c7-5d69-4a31-b624-d985125bacee-internal-tls-certs\") pod \"aodh-0\" (UID: \"254314c7-5d69-4a31-b624-d985125bacee\") " pod="openstack/aodh-0"
Dec 01 08:43:23 crc kubenswrapper[5004]: I1201 08:43:23.335039 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/254314c7-5d69-4a31-b624-d985125bacee-scripts\") pod \"aodh-0\" (UID: \"254314c7-5d69-4a31-b624-d985125bacee\") " pod="openstack/aodh-0"
Dec 01 08:43:23 crc kubenswrapper[5004]: I1201 08:43:23.335071 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/254314c7-5d69-4a31-b624-d985125bacee-combined-ca-bundle\") pod \"aodh-0\" (UID: \"254314c7-5d69-4a31-b624-d985125bacee\") " pod="openstack/aodh-0"
Dec 01 08:43:23 crc kubenswrapper[5004]: I1201 08:43:23.335114 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/254314c7-5d69-4a31-b624-d985125bacee-public-tls-certs\") pod \"aodh-0\" (UID: \"254314c7-5d69-4a31-b624-d985125bacee\") " pod="openstack/aodh-0"
Dec 01 08:43:23 crc kubenswrapper[5004]: I1201 08:43:23.335180 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dswgk\" (UniqueName: \"kubernetes.io/projected/254314c7-5d69-4a31-b624-d985125bacee-kube-api-access-dswgk\") pod \"aodh-0\" (UID: \"254314c7-5d69-4a31-b624-d985125bacee\") " pod="openstack/aodh-0"
Dec 01 08:43:23 crc kubenswrapper[5004]: I1201 08:43:23.335195 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/254314c7-5d69-4a31-b624-d985125bacee-config-data\") pod \"aodh-0\" (UID: \"254314c7-5d69-4a31-b624-d985125bacee\") " pod="openstack/aodh-0"
Dec 01 08:43:23 crc kubenswrapper[5004]: I1201 08:43:23.436883 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/254314c7-5d69-4a31-b624-d985125bacee-internal-tls-certs\") pod \"aodh-0\" (UID: \"254314c7-5d69-4a31-b624-d985125bacee\") " pod="openstack/aodh-0"
Dec 01 08:43:23 crc kubenswrapper[5004]: I1201 08:43:23.436938 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/254314c7-5d69-4a31-b624-d985125bacee-scripts\") pod \"aodh-0\" (UID: \"254314c7-5d69-4a31-b624-d985125bacee\") " pod="openstack/aodh-0"
Dec 01 08:43:23 crc kubenswrapper[5004]: I1201 08:43:23.436971 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/254314c7-5d69-4a31-b624-d985125bacee-combined-ca-bundle\") pod \"aodh-0\" (UID: \"254314c7-5d69-4a31-b624-d985125bacee\") " pod="openstack/aodh-0"
Dec 01 08:43:23 crc kubenswrapper[5004]: I1201 08:43:23.437008 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/254314c7-5d69-4a31-b624-d985125bacee-public-tls-certs\") pod \"aodh-0\" (UID: \"254314c7-5d69-4a31-b624-d985125bacee\") " pod="openstack/aodh-0"
Dec 01 08:43:23 crc kubenswrapper[5004]: I1201 08:43:23.437071 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dswgk\" (UniqueName: \"kubernetes.io/projected/254314c7-5d69-4a31-b624-d985125bacee-kube-api-access-dswgk\") pod \"aodh-0\" (UID: \"254314c7-5d69-4a31-b624-d985125bacee\") " pod="openstack/aodh-0"
Dec 01 08:43:23 crc kubenswrapper[5004]: I1201 08:43:23.437087 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/254314c7-5d69-4a31-b624-d985125bacee-config-data\") pod \"aodh-0\" (UID: \"254314c7-5d69-4a31-b624-d985125bacee\") " pod="openstack/aodh-0"
Dec 01 08:43:23 crc kubenswrapper[5004]: I1201 08:43:23.442101 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/254314c7-5d69-4a31-b624-d985125bacee-scripts\") pod \"aodh-0\" (UID: \"254314c7-5d69-4a31-b624-d985125bacee\") " pod="openstack/aodh-0"
Dec 01 08:43:23 crc kubenswrapper[5004]: I1201 08:43:23.449281 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/254314c7-5d69-4a31-b624-d985125bacee-combined-ca-bundle\") pod \"aodh-0\" (UID: \"254314c7-5d69-4a31-b624-d985125bacee\") " pod="openstack/aodh-0"
Dec 01 08:43:23 crc kubenswrapper[5004]: I1201 08:43:23.450387 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/254314c7-5d69-4a31-b624-d985125bacee-config-data\") pod \"aodh-0\" (UID: \"254314c7-5d69-4a31-b624-d985125bacee\") " pod="openstack/aodh-0"
Dec 01 08:43:23 crc kubenswrapper[5004]: I1201 08:43:23.452535 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/254314c7-5d69-4a31-b624-d985125bacee-internal-tls-certs\") pod \"aodh-0\" (UID: \"254314c7-5d69-4a31-b624-d985125bacee\") " pod="openstack/aodh-0"
Dec 01 08:43:23 crc kubenswrapper[5004]: I1201 08:43:23.462126 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/254314c7-5d69-4a31-b624-d985125bacee-public-tls-certs\") pod \"aodh-0\" (UID: \"254314c7-5d69-4a31-b624-d985125bacee\") " pod="openstack/aodh-0"
Dec 01 08:43:23 crc kubenswrapper[5004]: I1201 08:43:23.466083 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dswgk\" (UniqueName: \"kubernetes.io/projected/254314c7-5d69-4a31-b624-d985125bacee-kube-api-access-dswgk\") pod \"aodh-0\" (UID: \"254314c7-5d69-4a31-b624-d985125bacee\") " pod="openstack/aodh-0"
Dec 01 08:43:23 crc kubenswrapper[5004]: I1201 08:43:23.569787 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0"
Dec 01 08:43:23 crc kubenswrapper[5004]: I1201 08:43:23.698799 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ce6cfc91-0fab-40a6-ae94-9cd690cfde01","Type":"ContainerStarted","Data":"abfbf195cf6c0fba0b21972049382e546af7fd71d0fa9667d1da8fd583500ac5"}
Dec 01 08:43:23 crc kubenswrapper[5004]: I1201 08:43:23.698849 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ce6cfc91-0fab-40a6-ae94-9cd690cfde01","Type":"ContainerStarted","Data":"49f507d93b89c7803cbc7444933536ad98034a5df1c825ae62350a0b75a0940f"}
Dec 01 08:43:23 crc kubenswrapper[5004]: I1201 08:43:23.698861 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ce6cfc91-0fab-40a6-ae94-9cd690cfde01","Type":"ContainerStarted","Data":"d65308ab58b2dfafabe5f43f896de657de1af0ed5946de9bbf0c0ea0ceaa2f24"}
Dec 01 08:43:23 crc kubenswrapper[5004]: I1201 08:43:23.723392 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Dec 01 08:43:23 crc kubenswrapper[5004]: W1201 08:43:23.724170 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod65ef6d1f_ede0_4db1_b3a4_005921b1ce6b.slice/crio-8e68ee861d90139410a028be3bdd8c2c047df9436ca19d7d47b5998909212296 WatchSource:0}: Error finding container 8e68ee861d90139410a028be3bdd8c2c047df9436ca19d7d47b5998909212296: Status 404 returned error can't find the container with id 8e68ee861d90139410a028be3bdd8c2c047df9436ca19d7d47b5998909212296
Dec 01 08:43:23 crc kubenswrapper[5004]: I1201 08:43:23.730621 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0"
Dec 01 08:43:23 crc kubenswrapper[5004]: I1201 08:43:23.731890 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.731869344 podStartE2EDuration="2.731869344s" podCreationTimestamp="2025-12-01 08:43:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:43:23.72396684 +0000 UTC m=+1581.288958812" watchObservedRunningTime="2025-12-01 08:43:23.731869344 +0000 UTC m=+1581.296861326"
Dec 01 08:43:23 crc kubenswrapper[5004]: I1201 08:43:23.945719 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-tvtkb"]
Dec 01 08:43:23 crc kubenswrapper[5004]: I1201 08:43:23.947488 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-tvtkb"
Dec 01 08:43:23 crc kubenswrapper[5004]: I1201 08:43:23.950459 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts"
Dec 01 08:43:23 crc kubenswrapper[5004]: I1201 08:43:23.951173 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data"
Dec 01 08:43:23 crc kubenswrapper[5004]: I1201 08:43:23.957763 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-tvtkb"]
Dec 01 08:43:24 crc kubenswrapper[5004]: I1201 08:43:24.054482 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03552a97-36f1-421c-8dfe-359fe79b8a7f-scripts\") pod \"nova-cell1-cell-mapping-tvtkb\" (UID: \"03552a97-36f1-421c-8dfe-359fe79b8a7f\") " pod="openstack/nova-cell1-cell-mapping-tvtkb"
Dec 01 08:43:24 crc kubenswrapper[5004]: I1201 08:43:24.054673 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03552a97-36f1-421c-8dfe-359fe79b8a7f-config-data\") pod \"nova-cell1-cell-mapping-tvtkb\" (UID: \"03552a97-36f1-421c-8dfe-359fe79b8a7f\") " pod="openstack/nova-cell1-cell-mapping-tvtkb"
Dec 01 08:43:24 crc kubenswrapper[5004]: I1201 08:43:24.054739 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtn4n\" (UniqueName: \"kubernetes.io/projected/03552a97-36f1-421c-8dfe-359fe79b8a7f-kube-api-access-wtn4n\") pod \"nova-cell1-cell-mapping-tvtkb\" (UID: \"03552a97-36f1-421c-8dfe-359fe79b8a7f\") " pod="openstack/nova-cell1-cell-mapping-tvtkb"
Dec 01 08:43:24 crc kubenswrapper[5004]: I1201 08:43:24.054831 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03552a97-36f1-421c-8dfe-359fe79b8a7f-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-tvtkb\" (UID: \"03552a97-36f1-421c-8dfe-359fe79b8a7f\") " pod="openstack/nova-cell1-cell-mapping-tvtkb"
Dec 01 08:43:24 crc kubenswrapper[5004]: I1201 08:43:24.067379 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"]
Dec 01 08:43:24 crc kubenswrapper[5004]: I1201 08:43:24.156806 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03552a97-36f1-421c-8dfe-359fe79b8a7f-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-tvtkb\" (UID: \"03552a97-36f1-421c-8dfe-359fe79b8a7f\") " pod="openstack/nova-cell1-cell-mapping-tvtkb"
Dec 01 08:43:24 crc kubenswrapper[5004]: I1201 08:43:24.156869 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03552a97-36f1-421c-8dfe-359fe79b8a7f-scripts\") pod \"nova-cell1-cell-mapping-tvtkb\" (UID: \"03552a97-36f1-421c-8dfe-359fe79b8a7f\") " pod="openstack/nova-cell1-cell-mapping-tvtkb"
Dec 01 08:43:24 crc kubenswrapper[5004]: I1201 08:43:24.156961 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03552a97-36f1-421c-8dfe-359fe79b8a7f-config-data\") pod \"nova-cell1-cell-mapping-tvtkb\" (UID: \"03552a97-36f1-421c-8dfe-359fe79b8a7f\") " pod="openstack/nova-cell1-cell-mapping-tvtkb"
Dec 01 08:43:24 crc kubenswrapper[5004]: I1201 08:43:24.157014 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wtn4n\" (UniqueName: \"kubernetes.io/projected/03552a97-36f1-421c-8dfe-359fe79b8a7f-kube-api-access-wtn4n\") pod \"nova-cell1-cell-mapping-tvtkb\" (UID: \"03552a97-36f1-421c-8dfe-359fe79b8a7f\") " pod="openstack/nova-cell1-cell-mapping-tvtkb"
Dec 01 08:43:24 crc kubenswrapper[5004]: I1201 08:43:24.162463 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03552a97-36f1-421c-8dfe-359fe79b8a7f-config-data\") pod \"nova-cell1-cell-mapping-tvtkb\" (UID: \"03552a97-36f1-421c-8dfe-359fe79b8a7f\") " pod="openstack/nova-cell1-cell-mapping-tvtkb"
Dec 01 08:43:24 crc kubenswrapper[5004]: I1201 08:43:24.162906 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03552a97-36f1-421c-8dfe-359fe79b8a7f-scripts\") pod \"nova-cell1-cell-mapping-tvtkb\" (UID: \"03552a97-36f1-421c-8dfe-359fe79b8a7f\") " pod="openstack/nova-cell1-cell-mapping-tvtkb"
Dec 01 08:43:24 crc kubenswrapper[5004]: I1201 08:43:24.163550 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03552a97-36f1-421c-8dfe-359fe79b8a7f-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-tvtkb\" (UID: \"03552a97-36f1-421c-8dfe-359fe79b8a7f\") " pod="openstack/nova-cell1-cell-mapping-tvtkb"
Dec 01 08:43:24 crc kubenswrapper[5004]: I1201 08:43:24.173261 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtn4n\" (UniqueName: \"kubernetes.io/projected/03552a97-36f1-421c-8dfe-359fe79b8a7f-kube-api-access-wtn4n\") pod \"nova-cell1-cell-mapping-tvtkb\" (UID: \"03552a97-36f1-421c-8dfe-359fe79b8a7f\") " pod="openstack/nova-cell1-cell-mapping-tvtkb"
Dec 01 08:43:24 crc kubenswrapper[5004]: I1201 08:43:24.279143 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-tvtkb"
Dec 01 08:43:24 crc kubenswrapper[5004]: I1201 08:43:24.717145 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"254314c7-5d69-4a31-b624-d985125bacee","Type":"ContainerStarted","Data":"b39670f2d5f8a299b202368dbff731a29c2a6e7c91eba916211742d49d35967d"}
Dec 01 08:43:24 crc kubenswrapper[5004]: I1201 08:43:24.719534 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"65ef6d1f-ede0-4db1-b3a4-005921b1ce6b","Type":"ContainerStarted","Data":"d4b07553eb9a29d74c32d8259f59105e2125c612dc02287aa874b3cb7f3d8197"}
Dec 01 08:43:24 crc kubenswrapper[5004]: I1201 08:43:24.719583 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"65ef6d1f-ede0-4db1-b3a4-005921b1ce6b","Type":"ContainerStarted","Data":"8e68ee861d90139410a028be3bdd8c2c047df9436ca19d7d47b5998909212296"}
Dec 01 08:43:24 crc kubenswrapper[5004]: I1201 08:43:24.776428 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6792a811-4156-4b0a-b6b4-6ff3280229d2" path="/var/lib/kubelet/pods/6792a811-4156-4b0a-b6b4-6ff3280229d2/volumes"
Dec 01 08:43:24 crc kubenswrapper[5004]: W1201 08:43:24.819534 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod03552a97_36f1_421c_8dfe_359fe79b8a7f.slice/crio-3da9a7ae874dd63fe7a055d87037c45844d005a6bbafaf2c4a216c506b87d30b WatchSource:0}: Error finding container 3da9a7ae874dd63fe7a055d87037c45844d005a6bbafaf2c4a216c506b87d30b: Status 404 returned error can't find the container with id 3da9a7ae874dd63fe7a055d87037c45844d005a6bbafaf2c4a216c506b87d30b
Dec 01 08:43:24 crc kubenswrapper[5004]: I1201 08:43:24.836763 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-tvtkb"]
Dec 01 08:43:25 crc kubenswrapper[5004]: I1201 08:43:25.332754 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6d99f6bc7f-nk8fb"
Dec 01 08:43:25 crc kubenswrapper[5004]: I1201 08:43:25.429092 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7877d89589-z7vkq"]
Dec 01 08:43:25 crc kubenswrapper[5004]: I1201 08:43:25.429381 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7877d89589-z7vkq" podUID="2c39c876-65b4-4e2d-83c8-e239417edbf5" containerName="dnsmasq-dns" containerID="cri-o://93ae33f03605ec6547e3243a238e3683cce0f366c93b879a96c37a4e5da0618f" gracePeriod=10
Dec 01 08:43:25 crc kubenswrapper[5004]: I1201 08:43:25.738731 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-tvtkb" event={"ID":"03552a97-36f1-421c-8dfe-359fe79b8a7f","Type":"ContainerStarted","Data":"eefbec68660f5a6a8770dc5ee6f71f3c7b18cef614bc4087bac9a2369ec84460"}
Dec 01 08:43:25 crc kubenswrapper[5004]: I1201 08:43:25.738961 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-tvtkb" event={"ID":"03552a97-36f1-421c-8dfe-359fe79b8a7f","Type":"ContainerStarted","Data":"3da9a7ae874dd63fe7a055d87037c45844d005a6bbafaf2c4a216c506b87d30b"}
Dec 01 08:43:25 crc kubenswrapper[5004]: I1201 08:43:25.746088 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"254314c7-5d69-4a31-b624-d985125bacee","Type":"ContainerStarted","Data":"108bc634745220ffabaaad3cea124a02ba4a300a2ab78a8e63f1c5b6c285e421"}
Dec 01 08:43:25 crc kubenswrapper[5004]: I1201 08:43:25.769267 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-tvtkb" podStartSLOduration=2.76924761 podStartE2EDuration="2.76924761s" podCreationTimestamp="2025-12-01 08:43:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:43:25.75411157 +0000 UTC m=+1583.319103552" watchObservedRunningTime="2025-12-01 08:43:25.76924761 +0000 UTC m=+1583.334239592"
Dec 01 08:43:25 crc kubenswrapper[5004]: I1201 08:43:25.786754 5004 generic.go:334] "Generic (PLEG): container finished" podID="2c39c876-65b4-4e2d-83c8-e239417edbf5" containerID="93ae33f03605ec6547e3243a238e3683cce0f366c93b879a96c37a4e5da0618f" exitCode=0
Dec 01 08:43:25 crc kubenswrapper[5004]: I1201 08:43:25.786799 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7877d89589-z7vkq" event={"ID":"2c39c876-65b4-4e2d-83c8-e239417edbf5","Type":"ContainerDied","Data":"93ae33f03605ec6547e3243a238e3683cce0f366c93b879a96c37a4e5da0618f"}
Dec 01 08:43:26 crc kubenswrapper[5004]: I1201 08:43:26.178883 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7877d89589-z7vkq"
Dec 01 08:43:26 crc kubenswrapper[5004]: I1201 08:43:26.318087 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2c39c876-65b4-4e2d-83c8-e239417edbf5-ovsdbserver-sb\") pod \"2c39c876-65b4-4e2d-83c8-e239417edbf5\" (UID: \"2c39c876-65b4-4e2d-83c8-e239417edbf5\") "
Dec 01 08:43:26 crc kubenswrapper[5004]: I1201 08:43:26.318443 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c39c876-65b4-4e2d-83c8-e239417edbf5-config\") pod \"2c39c876-65b4-4e2d-83c8-e239417edbf5\" (UID: \"2c39c876-65b4-4e2d-83c8-e239417edbf5\") "
Dec 01 08:43:26 crc kubenswrapper[5004]: I1201 08:43:26.318618 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2c39c876-65b4-4e2d-83c8-e239417edbf5-ovsdbserver-nb\") pod \"2c39c876-65b4-4e2d-83c8-e239417edbf5\" (UID: \"2c39c876-65b4-4e2d-83c8-e239417edbf5\") "
Dec 01 08:43:26 crc kubenswrapper[5004]: I1201 08:43:26.318733 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2c39c876-65b4-4e2d-83c8-e239417edbf5-dns-swift-storage-0\") pod \"2c39c876-65b4-4e2d-83c8-e239417edbf5\" (UID: \"2c39c876-65b4-4e2d-83c8-e239417edbf5\") "
Dec 01 08:43:26 crc kubenswrapper[5004]: I1201 08:43:26.318954 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mj7cq\" (UniqueName: \"kubernetes.io/projected/2c39c876-65b4-4e2d-83c8-e239417edbf5-kube-api-access-mj7cq\") pod \"2c39c876-65b4-4e2d-83c8-e239417edbf5\" (UID: \"2c39c876-65b4-4e2d-83c8-e239417edbf5\") "
Dec 01 08:43:26 crc kubenswrapper[5004]: I1201 08:43:26.319201 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2c39c876-65b4-4e2d-83c8-e239417edbf5-dns-svc\") pod \"2c39c876-65b4-4e2d-83c8-e239417edbf5\" (UID: \"2c39c876-65b4-4e2d-83c8-e239417edbf5\") "
Dec 01 08:43:26 crc kubenswrapper[5004]: I1201 08:43:26.328160 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c39c876-65b4-4e2d-83c8-e239417edbf5-kube-api-access-mj7cq" (OuterVolumeSpecName: "kube-api-access-mj7cq") pod "2c39c876-65b4-4e2d-83c8-e239417edbf5" (UID: "2c39c876-65b4-4e2d-83c8-e239417edbf5"). InnerVolumeSpecName "kube-api-access-mj7cq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 08:43:26 crc kubenswrapper[5004]: I1201 08:43:26.421879 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mj7cq\" (UniqueName: \"kubernetes.io/projected/2c39c876-65b4-4e2d-83c8-e239417edbf5-kube-api-access-mj7cq\") on node \"crc\" DevicePath \"\""
Dec 01 08:43:26 crc kubenswrapper[5004]: I1201 08:43:26.491779 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c39c876-65b4-4e2d-83c8-e239417edbf5-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2c39c876-65b4-4e2d-83c8-e239417edbf5" (UID: "2c39c876-65b4-4e2d-83c8-e239417edbf5"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 08:43:26 crc kubenswrapper[5004]: I1201 08:43:26.530629 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c39c876-65b4-4e2d-83c8-e239417edbf5-config" (OuterVolumeSpecName: "config") pod "2c39c876-65b4-4e2d-83c8-e239417edbf5" (UID: "2c39c876-65b4-4e2d-83c8-e239417edbf5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 08:43:26 crc kubenswrapper[5004]: I1201 08:43:26.537819 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c39c876-65b4-4e2d-83c8-e239417edbf5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2c39c876-65b4-4e2d-83c8-e239417edbf5" (UID: "2c39c876-65b4-4e2d-83c8-e239417edbf5"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 08:43:26 crc kubenswrapper[5004]: I1201 08:43:26.541354 5004 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2c39c876-65b4-4e2d-83c8-e239417edbf5-dns-svc\") on node \"crc\" DevicePath \"\""
Dec 01 08:43:26 crc kubenswrapper[5004]: I1201 08:43:26.541383 5004 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c39c876-65b4-4e2d-83c8-e239417edbf5-config\") on node \"crc\" DevicePath \"\""
Dec 01 08:43:26 crc kubenswrapper[5004]: I1201 08:43:26.541392 5004 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2c39c876-65b4-4e2d-83c8-e239417edbf5-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Dec 01 08:43:26 crc kubenswrapper[5004]: I1201 08:43:26.552189 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c39c876-65b4-4e2d-83c8-e239417edbf5-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2c39c876-65b4-4e2d-83c8-e239417edbf5" (UID: "2c39c876-65b4-4e2d-83c8-e239417edbf5"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 08:43:26 crc kubenswrapper[5004]: I1201 08:43:26.552980 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c39c876-65b4-4e2d-83c8-e239417edbf5-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "2c39c876-65b4-4e2d-83c8-e239417edbf5" (UID: "2c39c876-65b4-4e2d-83c8-e239417edbf5"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 08:43:26 crc kubenswrapper[5004]: I1201 08:43:26.644147 5004 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2c39c876-65b4-4e2d-83c8-e239417edbf5-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Dec 01 08:43:26 crc kubenswrapper[5004]: I1201 08:43:26.644182 5004 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2c39c876-65b4-4e2d-83c8-e239417edbf5-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Dec 01 08:43:26 crc kubenswrapper[5004]: I1201 08:43:26.797720 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"65ef6d1f-ede0-4db1-b3a4-005921b1ce6b","Type":"ContainerStarted","Data":"d2a87718597afef8ab679f9456c35bf14d823b9b70960ddfb480d3b23a76e94e"}
Dec 01 08:43:26 crc kubenswrapper[5004]: I1201 08:43:26.797762 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"65ef6d1f-ede0-4db1-b3a4-005921b1ce6b","Type":"ContainerStarted","Data":"33802e98e906ac47a4ba440d97522895ac7fe8fc1a101d9b3d7fe685a13c1e95"}
Dec 01 08:43:26 crc kubenswrapper[5004]: I1201 08:43:26.799712 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"254314c7-5d69-4a31-b624-d985125bacee","Type":"ContainerStarted","Data":"bfa91a9ee828473fdb3b0b3758debed03550479d13e5f986d09d96bfcaece393"}
Dec 01 08:43:26 crc kubenswrapper[5004]: I1201 08:43:26.799758 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"254314c7-5d69-4a31-b624-d985125bacee","Type":"ContainerStarted","Data":"eae44cb9d6865a271883cb2bc942df42834c1acbe6e2299c68d4e66399188c6a"}
Dec 01 08:43:26 crc kubenswrapper[5004]: I1201 08:43:26.801918 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7877d89589-z7vkq"
Dec 01 08:43:26 crc kubenswrapper[5004]: I1201 08:43:26.801928 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7877d89589-z7vkq" event={"ID":"2c39c876-65b4-4e2d-83c8-e239417edbf5","Type":"ContainerDied","Data":"7341e488eac2c60bd0e4ba72eec610b452ffedda199ffebe3a798b1d7bcd66a0"}
Dec 01 08:43:26 crc kubenswrapper[5004]: I1201 08:43:26.801963 5004 scope.go:117] "RemoveContainer" containerID="93ae33f03605ec6547e3243a238e3683cce0f366c93b879a96c37a4e5da0618f"
Dec 01 08:43:26 crc kubenswrapper[5004]: I1201 08:43:26.843828 5004 scope.go:117] "RemoveContainer" containerID="49a3135120f1f483799c32df427157579566586b76f8d4c632bcacea2f004914"
Dec 01 08:43:26 crc kubenswrapper[5004]: I1201 08:43:26.845446 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7877d89589-z7vkq"]
Dec 01 08:43:26 crc kubenswrapper[5004]: I1201 08:43:26.854961 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7877d89589-z7vkq"]
Dec 01 08:43:27 crc kubenswrapper[5004]: I1201 08:43:27.824723 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"254314c7-5d69-4a31-b624-d985125bacee","Type":"ContainerStarted","Data":"c7165fc56293d9f4bc95eb11bffcafa2f0bdacd92c7af0ff7c6c6039eca8db9c"}
Dec 01 08:43:27 crc kubenswrapper[5004]: I1201 08:43:27.854788 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=1.620691335 podStartE2EDuration="4.852151778s" podCreationTimestamp="2025-12-01 08:43:23 +0000 UTC" firstStartedPulling="2025-12-01 08:43:24.077379544 +0000 UTC m=+1581.642371526" lastFinishedPulling="2025-12-01 08:43:27.308839977 +0000 UTC m=+1584.873831969" observedRunningTime="2025-12-01 08:43:27.842504402 +0000 UTC m=+1585.407496384" watchObservedRunningTime="2025-12-01 08:43:27.852151778 +0000 UTC m=+1585.417143760"
Dec 01 08:43:28 crc kubenswrapper[5004]: I1201 08:43:28.774524 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c39c876-65b4-4e2d-83c8-e239417edbf5" path="/var/lib/kubelet/pods/2c39c876-65b4-4e2d-83c8-e239417edbf5/volumes"
Dec 01 08:43:28 crc kubenswrapper[5004]: I1201 08:43:28.843617 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"65ef6d1f-ede0-4db1-b3a4-005921b1ce6b","Type":"ContainerStarted","Data":"bb6c0b74cd9415897eabd78a157ae825d3e68d67450773ef51a02a56788fcf60"}
Dec 01 08:43:28 crc kubenswrapper[5004]: I1201 08:43:28.843742 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Dec 01 08:43:28 crc kubenswrapper[5004]: I1201 08:43:28.874607 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.548233005 podStartE2EDuration="6.874587382s" podCreationTimestamp="2025-12-01 08:43:22 +0000 UTC" firstStartedPulling="2025-12-01 08:43:23.726912863 +0000 UTC m=+1581.291904845" lastFinishedPulling="2025-12-01 08:43:28.05326724 +0000 UTC m=+1585.618259222" observedRunningTime="2025-12-01 08:43:28.865681325 +0000 UTC m=+1586.430673327" watchObservedRunningTime="2025-12-01 08:43:28.874587382 +0000 UTC m=+1586.439579364"
Dec 01 08:43:30 crc kubenswrapper[5004]: I1201 08:43:30.871228 5004 generic.go:334] "Generic (PLEG): container finished" podID="03552a97-36f1-421c-8dfe-359fe79b8a7f" containerID="eefbec68660f5a6a8770dc5ee6f71f3c7b18cef614bc4087bac9a2369ec84460" exitCode=0
Dec 01 08:43:30 crc kubenswrapper[5004]: I1201 08:43:30.871403 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-tvtkb" event={"ID":"03552a97-36f1-421c-8dfe-359fe79b8a7f","Type":"ContainerDied","Data":"eefbec68660f5a6a8770dc5ee6f71f3c7b18cef614bc4087bac9a2369ec84460"}
Dec 01 08:43:32 crc kubenswrapper[5004]: I1201 08:43:32.326178 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Dec 01 08:43:32 crc kubenswrapper[5004]: I1201 08:43:32.326229 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Dec 01 08:43:32 crc kubenswrapper[5004]: I1201 08:43:32.406589 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-tvtkb"
Dec 01 08:43:32 crc kubenswrapper[5004]: I1201 08:43:32.514471 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03552a97-36f1-421c-8dfe-359fe79b8a7f-combined-ca-bundle\") pod \"03552a97-36f1-421c-8dfe-359fe79b8a7f\" (UID: \"03552a97-36f1-421c-8dfe-359fe79b8a7f\") "
Dec 01 08:43:32 crc kubenswrapper[5004]: I1201 08:43:32.514851 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03552a97-36f1-421c-8dfe-359fe79b8a7f-config-data\") pod \"03552a97-36f1-421c-8dfe-359fe79b8a7f\" (UID: \"03552a97-36f1-421c-8dfe-359fe79b8a7f\") "
Dec 01 08:43:32 crc kubenswrapper[5004]: I1201 08:43:32.514905 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wtn4n\" (UniqueName: \"kubernetes.io/projected/03552a97-36f1-421c-8dfe-359fe79b8a7f-kube-api-access-wtn4n\") pod \"03552a97-36f1-421c-8dfe-359fe79b8a7f\" (UID: \"03552a97-36f1-421c-8dfe-359fe79b8a7f\") "
Dec 01 08:43:32 crc kubenswrapper[5004]: I1201 08:43:32.514943 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03552a97-36f1-421c-8dfe-359fe79b8a7f-scripts\") pod \"03552a97-36f1-421c-8dfe-359fe79b8a7f\" (UID: \"03552a97-36f1-421c-8dfe-359fe79b8a7f\") "
Dec 01 08:43:32 crc kubenswrapper[5004]: I1201 08:43:32.525298 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03552a97-36f1-421c-8dfe-359fe79b8a7f-kube-api-access-wtn4n" (OuterVolumeSpecName: "kube-api-access-wtn4n") pod "03552a97-36f1-421c-8dfe-359fe79b8a7f" (UID: "03552a97-36f1-421c-8dfe-359fe79b8a7f"). InnerVolumeSpecName "kube-api-access-wtn4n". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 08:43:32 crc kubenswrapper[5004]: I1201 08:43:32.530442 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03552a97-36f1-421c-8dfe-359fe79b8a7f-scripts" (OuterVolumeSpecName: "scripts") pod "03552a97-36f1-421c-8dfe-359fe79b8a7f" (UID: "03552a97-36f1-421c-8dfe-359fe79b8a7f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 08:43:32 crc kubenswrapper[5004]: I1201 08:43:32.552805 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03552a97-36f1-421c-8dfe-359fe79b8a7f-config-data" (OuterVolumeSpecName: "config-data") pod "03552a97-36f1-421c-8dfe-359fe79b8a7f" (UID: "03552a97-36f1-421c-8dfe-359fe79b8a7f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 08:43:32 crc kubenswrapper[5004]: I1201 08:43:32.562094 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03552a97-36f1-421c-8dfe-359fe79b8a7f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "03552a97-36f1-421c-8dfe-359fe79b8a7f" (UID: "03552a97-36f1-421c-8dfe-359fe79b8a7f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:43:32 crc kubenswrapper[5004]: I1201 08:43:32.622922 5004 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03552a97-36f1-421c-8dfe-359fe79b8a7f-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 08:43:32 crc kubenswrapper[5004]: I1201 08:43:32.622957 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wtn4n\" (UniqueName: \"kubernetes.io/projected/03552a97-36f1-421c-8dfe-359fe79b8a7f-kube-api-access-wtn4n\") on node \"crc\" DevicePath \"\"" Dec 01 08:43:32 crc kubenswrapper[5004]: I1201 08:43:32.622970 5004 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03552a97-36f1-421c-8dfe-359fe79b8a7f-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 08:43:32 crc kubenswrapper[5004]: I1201 08:43:32.622984 5004 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03552a97-36f1-421c-8dfe-359fe79b8a7f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 08:43:32 crc kubenswrapper[5004]: I1201 08:43:32.908846 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-tvtkb" event={"ID":"03552a97-36f1-421c-8dfe-359fe79b8a7f","Type":"ContainerDied","Data":"3da9a7ae874dd63fe7a055d87037c45844d005a6bbafaf2c4a216c506b87d30b"} Dec 01 08:43:32 crc kubenswrapper[5004]: I1201 08:43:32.908904 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3da9a7ae874dd63fe7a055d87037c45844d005a6bbafaf2c4a216c506b87d30b" Dec 01 08:43:32 crc kubenswrapper[5004]: I1201 08:43:32.909009 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-tvtkb" Dec 01 08:43:33 crc kubenswrapper[5004]: I1201 08:43:33.092997 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 01 08:43:33 crc kubenswrapper[5004]: I1201 08:43:33.093365 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="ce6cfc91-0fab-40a6-ae94-9cd690cfde01" containerName="nova-api-log" containerID="cri-o://49f507d93b89c7803cbc7444933536ad98034a5df1c825ae62350a0b75a0940f" gracePeriod=30 Dec 01 08:43:33 crc kubenswrapper[5004]: I1201 08:43:33.094011 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="ce6cfc91-0fab-40a6-ae94-9cd690cfde01" containerName="nova-api-api" containerID="cri-o://abfbf195cf6c0fba0b21972049382e546af7fd71d0fa9667d1da8fd583500ac5" gracePeriod=30 Dec 01 08:43:33 crc kubenswrapper[5004]: I1201 08:43:33.098974 5004 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="ce6cfc91-0fab-40a6-ae94-9cd690cfde01" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.251:8774/\": EOF" Dec 01 08:43:33 crc kubenswrapper[5004]: I1201 08:43:33.099137 5004 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="ce6cfc91-0fab-40a6-ae94-9cd690cfde01" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.251:8774/\": EOF" Dec 01 08:43:33 crc kubenswrapper[5004]: I1201 08:43:33.113088 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 08:43:33 crc kubenswrapper[5004]: I1201 08:43:33.113392 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="d16020b5-7cf1-492f-b129-5054bcbd8427" containerName="nova-scheduler-scheduler" containerID="cri-o://6827a7e54d43df41de9a37d9daa31f6e85299558d37f5eff5f15ec59ef888bc7" gracePeriod=30 Dec 
01 08:43:33 crc kubenswrapper[5004]: I1201 08:43:33.148967 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 08:43:33 crc kubenswrapper[5004]: I1201 08:43:33.149233 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="855d076a-eacb-40d9-9135-8a1a3cf64f59" containerName="nova-metadata-log" containerID="cri-o://639a91cddffd8c36fa3b41966ba1065c7bd09ff518d776caad8ffe332c1198df" gracePeriod=30 Dec 01 08:43:33 crc kubenswrapper[5004]: I1201 08:43:33.149397 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="855d076a-eacb-40d9-9135-8a1a3cf64f59" containerName="nova-metadata-metadata" containerID="cri-o://d8a63febb8a8556f14ad5f947b65573a3ffe986bedf429596c00b97ca667ffca" gracePeriod=30 Dec 01 08:43:33 crc kubenswrapper[5004]: I1201 08:43:33.924542 5004 generic.go:334] "Generic (PLEG): container finished" podID="ce6cfc91-0fab-40a6-ae94-9cd690cfde01" containerID="49f507d93b89c7803cbc7444933536ad98034a5df1c825ae62350a0b75a0940f" exitCode=143 Dec 01 08:43:33 crc kubenswrapper[5004]: I1201 08:43:33.924675 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ce6cfc91-0fab-40a6-ae94-9cd690cfde01","Type":"ContainerDied","Data":"49f507d93b89c7803cbc7444933536ad98034a5df1c825ae62350a0b75a0940f"} Dec 01 08:43:33 crc kubenswrapper[5004]: I1201 08:43:33.927669 5004 generic.go:334] "Generic (PLEG): container finished" podID="855d076a-eacb-40d9-9135-8a1a3cf64f59" containerID="639a91cddffd8c36fa3b41966ba1065c7bd09ff518d776caad8ffe332c1198df" exitCode=143 Dec 01 08:43:33 crc kubenswrapper[5004]: I1201 08:43:33.927735 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"855d076a-eacb-40d9-9135-8a1a3cf64f59","Type":"ContainerDied","Data":"639a91cddffd8c36fa3b41966ba1065c7bd09ff518d776caad8ffe332c1198df"} Dec 01 08:43:36 crc 
kubenswrapper[5004]: I1201 08:43:36.967775 5004 generic.go:334] "Generic (PLEG): container finished" podID="855d076a-eacb-40d9-9135-8a1a3cf64f59" containerID="d8a63febb8a8556f14ad5f947b65573a3ffe986bedf429596c00b97ca667ffca" exitCode=0 Dec 01 08:43:36 crc kubenswrapper[5004]: I1201 08:43:36.967996 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"855d076a-eacb-40d9-9135-8a1a3cf64f59","Type":"ContainerDied","Data":"d8a63febb8a8556f14ad5f947b65573a3ffe986bedf429596c00b97ca667ffca"} Dec 01 08:43:36 crc kubenswrapper[5004]: I1201 08:43:36.968325 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"855d076a-eacb-40d9-9135-8a1a3cf64f59","Type":"ContainerDied","Data":"d8d8a3dcc2c19acc24ec50d9ce59f972502927fcb112181c54be8a276f306af2"} Dec 01 08:43:36 crc kubenswrapper[5004]: I1201 08:43:36.968343 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d8d8a3dcc2c19acc24ec50d9ce59f972502927fcb112181c54be8a276f306af2" Dec 01 08:43:36 crc kubenswrapper[5004]: I1201 08:43:36.972182 5004 generic.go:334] "Generic (PLEG): container finished" podID="d16020b5-7cf1-492f-b129-5054bcbd8427" containerID="6827a7e54d43df41de9a37d9daa31f6e85299558d37f5eff5f15ec59ef888bc7" exitCode=0 Dec 01 08:43:36 crc kubenswrapper[5004]: I1201 08:43:36.972258 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d16020b5-7cf1-492f-b129-5054bcbd8427","Type":"ContainerDied","Data":"6827a7e54d43df41de9a37d9daa31f6e85299558d37f5eff5f15ec59ef888bc7"} Dec 01 08:43:36 crc kubenswrapper[5004]: I1201 08:43:36.972345 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d16020b5-7cf1-492f-b129-5054bcbd8427","Type":"ContainerDied","Data":"fdf1098de1e0e3ed52fdefe45d17e365f2230bad4f48833ffaf950a19b667815"} Dec 01 08:43:36 crc kubenswrapper[5004]: I1201 08:43:36.972362 5004 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fdf1098de1e0e3ed52fdefe45d17e365f2230bad4f48833ffaf950a19b667815" Dec 01 08:43:36 crc kubenswrapper[5004]: I1201 08:43:36.979186 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 08:43:36 crc kubenswrapper[5004]: I1201 08:43:36.994533 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 01 08:43:37 crc kubenswrapper[5004]: I1201 08:43:37.135705 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/855d076a-eacb-40d9-9135-8a1a3cf64f59-combined-ca-bundle\") pod \"855d076a-eacb-40d9-9135-8a1a3cf64f59\" (UID: \"855d076a-eacb-40d9-9135-8a1a3cf64f59\") " Dec 01 08:43:37 crc kubenswrapper[5004]: I1201 08:43:37.135853 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/855d076a-eacb-40d9-9135-8a1a3cf64f59-nova-metadata-tls-certs\") pod \"855d076a-eacb-40d9-9135-8a1a3cf64f59\" (UID: \"855d076a-eacb-40d9-9135-8a1a3cf64f59\") " Dec 01 08:43:37 crc kubenswrapper[5004]: I1201 08:43:37.135885 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9qft5\" (UniqueName: \"kubernetes.io/projected/855d076a-eacb-40d9-9135-8a1a3cf64f59-kube-api-access-9qft5\") pod \"855d076a-eacb-40d9-9135-8a1a3cf64f59\" (UID: \"855d076a-eacb-40d9-9135-8a1a3cf64f59\") " Dec 01 08:43:37 crc kubenswrapper[5004]: I1201 08:43:37.135947 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/855d076a-eacb-40d9-9135-8a1a3cf64f59-logs\") pod \"855d076a-eacb-40d9-9135-8a1a3cf64f59\" (UID: \"855d076a-eacb-40d9-9135-8a1a3cf64f59\") " Dec 01 08:43:37 crc kubenswrapper[5004]: I1201 08:43:37.136051 
5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/855d076a-eacb-40d9-9135-8a1a3cf64f59-config-data\") pod \"855d076a-eacb-40d9-9135-8a1a3cf64f59\" (UID: \"855d076a-eacb-40d9-9135-8a1a3cf64f59\") " Dec 01 08:43:37 crc kubenswrapper[5004]: I1201 08:43:37.136112 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d16020b5-7cf1-492f-b129-5054bcbd8427-combined-ca-bundle\") pod \"d16020b5-7cf1-492f-b129-5054bcbd8427\" (UID: \"d16020b5-7cf1-492f-b129-5054bcbd8427\") " Dec 01 08:43:37 crc kubenswrapper[5004]: I1201 08:43:37.136192 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d16020b5-7cf1-492f-b129-5054bcbd8427-config-data\") pod \"d16020b5-7cf1-492f-b129-5054bcbd8427\" (UID: \"d16020b5-7cf1-492f-b129-5054bcbd8427\") " Dec 01 08:43:37 crc kubenswrapper[5004]: I1201 08:43:37.136390 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wrh9n\" (UniqueName: \"kubernetes.io/projected/d16020b5-7cf1-492f-b129-5054bcbd8427-kube-api-access-wrh9n\") pod \"d16020b5-7cf1-492f-b129-5054bcbd8427\" (UID: \"d16020b5-7cf1-492f-b129-5054bcbd8427\") " Dec 01 08:43:37 crc kubenswrapper[5004]: I1201 08:43:37.139350 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/855d076a-eacb-40d9-9135-8a1a3cf64f59-logs" (OuterVolumeSpecName: "logs") pod "855d076a-eacb-40d9-9135-8a1a3cf64f59" (UID: "855d076a-eacb-40d9-9135-8a1a3cf64f59"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 08:43:37 crc kubenswrapper[5004]: I1201 08:43:37.141452 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/855d076a-eacb-40d9-9135-8a1a3cf64f59-kube-api-access-9qft5" (OuterVolumeSpecName: "kube-api-access-9qft5") pod "855d076a-eacb-40d9-9135-8a1a3cf64f59" (UID: "855d076a-eacb-40d9-9135-8a1a3cf64f59"). InnerVolumeSpecName "kube-api-access-9qft5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:43:37 crc kubenswrapper[5004]: I1201 08:43:37.144390 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d16020b5-7cf1-492f-b129-5054bcbd8427-kube-api-access-wrh9n" (OuterVolumeSpecName: "kube-api-access-wrh9n") pod "d16020b5-7cf1-492f-b129-5054bcbd8427" (UID: "d16020b5-7cf1-492f-b129-5054bcbd8427"). InnerVolumeSpecName "kube-api-access-wrh9n". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:43:37 crc kubenswrapper[5004]: I1201 08:43:37.182530 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d16020b5-7cf1-492f-b129-5054bcbd8427-config-data" (OuterVolumeSpecName: "config-data") pod "d16020b5-7cf1-492f-b129-5054bcbd8427" (UID: "d16020b5-7cf1-492f-b129-5054bcbd8427"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:43:37 crc kubenswrapper[5004]: I1201 08:43:37.184880 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/855d076a-eacb-40d9-9135-8a1a3cf64f59-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "855d076a-eacb-40d9-9135-8a1a3cf64f59" (UID: "855d076a-eacb-40d9-9135-8a1a3cf64f59"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:43:37 crc kubenswrapper[5004]: I1201 08:43:37.201539 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/855d076a-eacb-40d9-9135-8a1a3cf64f59-config-data" (OuterVolumeSpecName: "config-data") pod "855d076a-eacb-40d9-9135-8a1a3cf64f59" (UID: "855d076a-eacb-40d9-9135-8a1a3cf64f59"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:43:37 crc kubenswrapper[5004]: I1201 08:43:37.211590 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d16020b5-7cf1-492f-b129-5054bcbd8427-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d16020b5-7cf1-492f-b129-5054bcbd8427" (UID: "d16020b5-7cf1-492f-b129-5054bcbd8427"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:43:37 crc kubenswrapper[5004]: I1201 08:43:37.218907 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/855d076a-eacb-40d9-9135-8a1a3cf64f59-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "855d076a-eacb-40d9-9135-8a1a3cf64f59" (UID: "855d076a-eacb-40d9-9135-8a1a3cf64f59"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:43:37 crc kubenswrapper[5004]: I1201 08:43:37.239007 5004 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d16020b5-7cf1-492f-b129-5054bcbd8427-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 08:43:37 crc kubenswrapper[5004]: I1201 08:43:37.239053 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wrh9n\" (UniqueName: \"kubernetes.io/projected/d16020b5-7cf1-492f-b129-5054bcbd8427-kube-api-access-wrh9n\") on node \"crc\" DevicePath \"\"" Dec 01 08:43:37 crc kubenswrapper[5004]: I1201 08:43:37.239064 5004 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/855d076a-eacb-40d9-9135-8a1a3cf64f59-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 08:43:37 crc kubenswrapper[5004]: I1201 08:43:37.239073 5004 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/855d076a-eacb-40d9-9135-8a1a3cf64f59-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 01 08:43:37 crc kubenswrapper[5004]: I1201 08:43:37.239082 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9qft5\" (UniqueName: \"kubernetes.io/projected/855d076a-eacb-40d9-9135-8a1a3cf64f59-kube-api-access-9qft5\") on node \"crc\" DevicePath \"\"" Dec 01 08:43:37 crc kubenswrapper[5004]: I1201 08:43:37.239092 5004 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/855d076a-eacb-40d9-9135-8a1a3cf64f59-logs\") on node \"crc\" DevicePath \"\"" Dec 01 08:43:37 crc kubenswrapper[5004]: I1201 08:43:37.239101 5004 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/855d076a-eacb-40d9-9135-8a1a3cf64f59-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 08:43:37 crc kubenswrapper[5004]: I1201 
08:43:37.239109 5004 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d16020b5-7cf1-492f-b129-5054bcbd8427-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 08:43:37 crc kubenswrapper[5004]: I1201 08:43:37.983228 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 01 08:43:37 crc kubenswrapper[5004]: I1201 08:43:37.983518 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 08:43:38 crc kubenswrapper[5004]: I1201 08:43:38.021437 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 08:43:38 crc kubenswrapper[5004]: I1201 08:43:38.032780 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 08:43:38 crc kubenswrapper[5004]: I1201 08:43:38.043627 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 08:43:38 crc kubenswrapper[5004]: I1201 08:43:38.060634 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 08:43:38 crc kubenswrapper[5004]: I1201 08:43:38.076187 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 08:43:38 crc kubenswrapper[5004]: E1201 08:43:38.076641 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="855d076a-eacb-40d9-9135-8a1a3cf64f59" containerName="nova-metadata-metadata" Dec 01 08:43:38 crc kubenswrapper[5004]: I1201 08:43:38.076658 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="855d076a-eacb-40d9-9135-8a1a3cf64f59" containerName="nova-metadata-metadata" Dec 01 08:43:38 crc kubenswrapper[5004]: E1201 08:43:38.076679 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03552a97-36f1-421c-8dfe-359fe79b8a7f" containerName="nova-manage" Dec 01 08:43:38 crc kubenswrapper[5004]: I1201 
08:43:38.076686 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="03552a97-36f1-421c-8dfe-359fe79b8a7f" containerName="nova-manage" Dec 01 08:43:38 crc kubenswrapper[5004]: E1201 08:43:38.076703 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d16020b5-7cf1-492f-b129-5054bcbd8427" containerName="nova-scheduler-scheduler" Dec 01 08:43:38 crc kubenswrapper[5004]: I1201 08:43:38.076711 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="d16020b5-7cf1-492f-b129-5054bcbd8427" containerName="nova-scheduler-scheduler" Dec 01 08:43:38 crc kubenswrapper[5004]: E1201 08:43:38.076735 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c39c876-65b4-4e2d-83c8-e239417edbf5" containerName="init" Dec 01 08:43:38 crc kubenswrapper[5004]: I1201 08:43:38.076741 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c39c876-65b4-4e2d-83c8-e239417edbf5" containerName="init" Dec 01 08:43:38 crc kubenswrapper[5004]: E1201 08:43:38.076757 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="855d076a-eacb-40d9-9135-8a1a3cf64f59" containerName="nova-metadata-log" Dec 01 08:43:38 crc kubenswrapper[5004]: I1201 08:43:38.076764 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="855d076a-eacb-40d9-9135-8a1a3cf64f59" containerName="nova-metadata-log" Dec 01 08:43:38 crc kubenswrapper[5004]: E1201 08:43:38.076779 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c39c876-65b4-4e2d-83c8-e239417edbf5" containerName="dnsmasq-dns" Dec 01 08:43:38 crc kubenswrapper[5004]: I1201 08:43:38.076785 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c39c876-65b4-4e2d-83c8-e239417edbf5" containerName="dnsmasq-dns" Dec 01 08:43:38 crc kubenswrapper[5004]: I1201 08:43:38.076975 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="855d076a-eacb-40d9-9135-8a1a3cf64f59" containerName="nova-metadata-log" Dec 01 08:43:38 crc kubenswrapper[5004]: I1201 08:43:38.076993 5004 
memory_manager.go:354] "RemoveStaleState removing state" podUID="855d076a-eacb-40d9-9135-8a1a3cf64f59" containerName="nova-metadata-metadata" Dec 01 08:43:38 crc kubenswrapper[5004]: I1201 08:43:38.077011 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="d16020b5-7cf1-492f-b129-5054bcbd8427" containerName="nova-scheduler-scheduler" Dec 01 08:43:38 crc kubenswrapper[5004]: I1201 08:43:38.077025 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c39c876-65b4-4e2d-83c8-e239417edbf5" containerName="dnsmasq-dns" Dec 01 08:43:38 crc kubenswrapper[5004]: I1201 08:43:38.077037 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="03552a97-36f1-421c-8dfe-359fe79b8a7f" containerName="nova-manage" Dec 01 08:43:38 crc kubenswrapper[5004]: I1201 08:43:38.078787 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 01 08:43:38 crc kubenswrapper[5004]: I1201 08:43:38.082688 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 01 08:43:38 crc kubenswrapper[5004]: I1201 08:43:38.083917 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 08:43:38 crc kubenswrapper[5004]: I1201 08:43:38.103327 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 01 08:43:38 crc kubenswrapper[5004]: I1201 08:43:38.105541 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 08:43:38 crc kubenswrapper[5004]: I1201 08:43:38.107448 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 01 08:43:38 crc kubenswrapper[5004]: I1201 08:43:38.108582 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 01 08:43:38 crc kubenswrapper[5004]: I1201 08:43:38.129968 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 08:43:38 crc kubenswrapper[5004]: I1201 08:43:38.160680 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gjvf\" (UniqueName: \"kubernetes.io/projected/d68b7ff7-f7f8-442f-870d-94f82b0842c1-kube-api-access-2gjvf\") pod \"nova-scheduler-0\" (UID: \"d68b7ff7-f7f8-442f-870d-94f82b0842c1\") " pod="openstack/nova-scheduler-0" Dec 01 08:43:38 crc kubenswrapper[5004]: I1201 08:43:38.160746 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d68b7ff7-f7f8-442f-870d-94f82b0842c1-config-data\") pod \"nova-scheduler-0\" (UID: \"d68b7ff7-f7f8-442f-870d-94f82b0842c1\") " pod="openstack/nova-scheduler-0" Dec 01 08:43:38 crc kubenswrapper[5004]: I1201 08:43:38.160795 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d68b7ff7-f7f8-442f-870d-94f82b0842c1-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d68b7ff7-f7f8-442f-870d-94f82b0842c1\") " pod="openstack/nova-scheduler-0" Dec 01 08:43:38 crc kubenswrapper[5004]: I1201 08:43:38.263267 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c51ed16a-b6da-4bc9-9d47-18ec809ba124-logs\") pod \"nova-metadata-0\" 
(UID: \"c51ed16a-b6da-4bc9-9d47-18ec809ba124\") " pod="openstack/nova-metadata-0" Dec 01 08:43:38 crc kubenswrapper[5004]: I1201 08:43:38.263378 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8r6b6\" (UniqueName: \"kubernetes.io/projected/c51ed16a-b6da-4bc9-9d47-18ec809ba124-kube-api-access-8r6b6\") pod \"nova-metadata-0\" (UID: \"c51ed16a-b6da-4bc9-9d47-18ec809ba124\") " pod="openstack/nova-metadata-0" Dec 01 08:43:38 crc kubenswrapper[5004]: I1201 08:43:38.263498 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c51ed16a-b6da-4bc9-9d47-18ec809ba124-config-data\") pod \"nova-metadata-0\" (UID: \"c51ed16a-b6da-4bc9-9d47-18ec809ba124\") " pod="openstack/nova-metadata-0" Dec 01 08:43:38 crc kubenswrapper[5004]: I1201 08:43:38.263547 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c51ed16a-b6da-4bc9-9d47-18ec809ba124-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c51ed16a-b6da-4bc9-9d47-18ec809ba124\") " pod="openstack/nova-metadata-0" Dec 01 08:43:38 crc kubenswrapper[5004]: I1201 08:43:38.263602 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2gjvf\" (UniqueName: \"kubernetes.io/projected/d68b7ff7-f7f8-442f-870d-94f82b0842c1-kube-api-access-2gjvf\") pod \"nova-scheduler-0\" (UID: \"d68b7ff7-f7f8-442f-870d-94f82b0842c1\") " pod="openstack/nova-scheduler-0" Dec 01 08:43:38 crc kubenswrapper[5004]: I1201 08:43:38.263684 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d68b7ff7-f7f8-442f-870d-94f82b0842c1-config-data\") pod \"nova-scheduler-0\" (UID: \"d68b7ff7-f7f8-442f-870d-94f82b0842c1\") " pod="openstack/nova-scheduler-0" Dec 01 08:43:38 crc 
kubenswrapper[5004]: I1201 08:43:38.263711 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c51ed16a-b6da-4bc9-9d47-18ec809ba124-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"c51ed16a-b6da-4bc9-9d47-18ec809ba124\") " pod="openstack/nova-metadata-0" Dec 01 08:43:38 crc kubenswrapper[5004]: I1201 08:43:38.263814 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d68b7ff7-f7f8-442f-870d-94f82b0842c1-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d68b7ff7-f7f8-442f-870d-94f82b0842c1\") " pod="openstack/nova-scheduler-0" Dec 01 08:43:38 crc kubenswrapper[5004]: I1201 08:43:38.268850 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d68b7ff7-f7f8-442f-870d-94f82b0842c1-config-data\") pod \"nova-scheduler-0\" (UID: \"d68b7ff7-f7f8-442f-870d-94f82b0842c1\") " pod="openstack/nova-scheduler-0" Dec 01 08:43:38 crc kubenswrapper[5004]: I1201 08:43:38.268930 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d68b7ff7-f7f8-442f-870d-94f82b0842c1-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d68b7ff7-f7f8-442f-870d-94f82b0842c1\") " pod="openstack/nova-scheduler-0" Dec 01 08:43:38 crc kubenswrapper[5004]: I1201 08:43:38.280779 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gjvf\" (UniqueName: \"kubernetes.io/projected/d68b7ff7-f7f8-442f-870d-94f82b0842c1-kube-api-access-2gjvf\") pod \"nova-scheduler-0\" (UID: \"d68b7ff7-f7f8-442f-870d-94f82b0842c1\") " pod="openstack/nova-scheduler-0" Dec 01 08:43:38 crc kubenswrapper[5004]: I1201 08:43:38.365614 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/c51ed16a-b6da-4bc9-9d47-18ec809ba124-logs\") pod \"nova-metadata-0\" (UID: \"c51ed16a-b6da-4bc9-9d47-18ec809ba124\") " pod="openstack/nova-metadata-0" Dec 01 08:43:38 crc kubenswrapper[5004]: I1201 08:43:38.365710 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8r6b6\" (UniqueName: \"kubernetes.io/projected/c51ed16a-b6da-4bc9-9d47-18ec809ba124-kube-api-access-8r6b6\") pod \"nova-metadata-0\" (UID: \"c51ed16a-b6da-4bc9-9d47-18ec809ba124\") " pod="openstack/nova-metadata-0" Dec 01 08:43:38 crc kubenswrapper[5004]: I1201 08:43:38.365806 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c51ed16a-b6da-4bc9-9d47-18ec809ba124-config-data\") pod \"nova-metadata-0\" (UID: \"c51ed16a-b6da-4bc9-9d47-18ec809ba124\") " pod="openstack/nova-metadata-0" Dec 01 08:43:38 crc kubenswrapper[5004]: I1201 08:43:38.365832 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c51ed16a-b6da-4bc9-9d47-18ec809ba124-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c51ed16a-b6da-4bc9-9d47-18ec809ba124\") " pod="openstack/nova-metadata-0" Dec 01 08:43:38 crc kubenswrapper[5004]: I1201 08:43:38.365896 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c51ed16a-b6da-4bc9-9d47-18ec809ba124-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"c51ed16a-b6da-4bc9-9d47-18ec809ba124\") " pod="openstack/nova-metadata-0" Dec 01 08:43:38 crc kubenswrapper[5004]: I1201 08:43:38.366267 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c51ed16a-b6da-4bc9-9d47-18ec809ba124-logs\") pod \"nova-metadata-0\" (UID: \"c51ed16a-b6da-4bc9-9d47-18ec809ba124\") " pod="openstack/nova-metadata-0" Dec 01 
08:43:38 crc kubenswrapper[5004]: I1201 08:43:38.369998 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c51ed16a-b6da-4bc9-9d47-18ec809ba124-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c51ed16a-b6da-4bc9-9d47-18ec809ba124\") " pod="openstack/nova-metadata-0" Dec 01 08:43:38 crc kubenswrapper[5004]: I1201 08:43:38.370048 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c51ed16a-b6da-4bc9-9d47-18ec809ba124-config-data\") pod \"nova-metadata-0\" (UID: \"c51ed16a-b6da-4bc9-9d47-18ec809ba124\") " pod="openstack/nova-metadata-0" Dec 01 08:43:38 crc kubenswrapper[5004]: I1201 08:43:38.370050 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c51ed16a-b6da-4bc9-9d47-18ec809ba124-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"c51ed16a-b6da-4bc9-9d47-18ec809ba124\") " pod="openstack/nova-metadata-0" Dec 01 08:43:38 crc kubenswrapper[5004]: I1201 08:43:38.388397 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8r6b6\" (UniqueName: \"kubernetes.io/projected/c51ed16a-b6da-4bc9-9d47-18ec809ba124-kube-api-access-8r6b6\") pod \"nova-metadata-0\" (UID: \"c51ed16a-b6da-4bc9-9d47-18ec809ba124\") " pod="openstack/nova-metadata-0" Dec 01 08:43:38 crc kubenswrapper[5004]: I1201 08:43:38.394724 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 01 08:43:38 crc kubenswrapper[5004]: I1201 08:43:38.428259 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 08:43:38 crc kubenswrapper[5004]: I1201 08:43:38.729277 5004 patch_prober.go:28] interesting pod/machine-config-daemon-fvdgt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 08:43:38 crc kubenswrapper[5004]: I1201 08:43:38.729613 5004 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 08:43:38 crc kubenswrapper[5004]: I1201 08:43:38.789423 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="855d076a-eacb-40d9-9135-8a1a3cf64f59" path="/var/lib/kubelet/pods/855d076a-eacb-40d9-9135-8a1a3cf64f59/volumes" Dec 01 08:43:38 crc kubenswrapper[5004]: I1201 08:43:38.790626 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d16020b5-7cf1-492f-b129-5054bcbd8427" path="/var/lib/kubelet/pods/d16020b5-7cf1-492f-b129-5054bcbd8427/volumes" Dec 01 08:43:38 crc kubenswrapper[5004]: I1201 08:43:38.912957 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 08:43:38 crc kubenswrapper[5004]: I1201 08:43:38.997147 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 01 08:43:39 crc kubenswrapper[5004]: I1201 08:43:39.000680 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d68b7ff7-f7f8-442f-870d-94f82b0842c1","Type":"ContainerStarted","Data":"94e054528232b3a8a36008eb44163e1c81b6722d422a2c7802f31684ebbf5c7d"} Dec 01 08:43:39 crc kubenswrapper[5004]: I1201 08:43:39.003770 5004 generic.go:334] "Generic (PLEG): container finished" podID="ce6cfc91-0fab-40a6-ae94-9cd690cfde01" containerID="abfbf195cf6c0fba0b21972049382e546af7fd71d0fa9667d1da8fd583500ac5" exitCode=0 Dec 01 08:43:39 crc kubenswrapper[5004]: I1201 08:43:39.003804 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ce6cfc91-0fab-40a6-ae94-9cd690cfde01","Type":"ContainerDied","Data":"abfbf195cf6c0fba0b21972049382e546af7fd71d0fa9667d1da8fd583500ac5"} Dec 01 08:43:39 crc kubenswrapper[5004]: I1201 08:43:39.003827 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ce6cfc91-0fab-40a6-ae94-9cd690cfde01","Type":"ContainerDied","Data":"d65308ab58b2dfafabe5f43f896de657de1af0ed5946de9bbf0c0ea0ceaa2f24"} Dec 01 08:43:39 crc kubenswrapper[5004]: I1201 08:43:39.003842 5004 scope.go:117] "RemoveContainer" containerID="abfbf195cf6c0fba0b21972049382e546af7fd71d0fa9667d1da8fd583500ac5" Dec 01 08:43:39 crc kubenswrapper[5004]: I1201 08:43:39.004060 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 01 08:43:39 crc kubenswrapper[5004]: I1201 08:43:39.045015 5004 scope.go:117] "RemoveContainer" containerID="49f507d93b89c7803cbc7444933536ad98034a5df1c825ae62350a0b75a0940f" Dec 01 08:43:39 crc kubenswrapper[5004]: I1201 08:43:39.064534 5004 scope.go:117] "RemoveContainer" containerID="abfbf195cf6c0fba0b21972049382e546af7fd71d0fa9667d1da8fd583500ac5" Dec 01 08:43:39 crc kubenswrapper[5004]: E1201 08:43:39.065657 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"abfbf195cf6c0fba0b21972049382e546af7fd71d0fa9667d1da8fd583500ac5\": container with ID starting with abfbf195cf6c0fba0b21972049382e546af7fd71d0fa9667d1da8fd583500ac5 not found: ID does not exist" containerID="abfbf195cf6c0fba0b21972049382e546af7fd71d0fa9667d1da8fd583500ac5" Dec 01 08:43:39 crc kubenswrapper[5004]: I1201 08:43:39.065699 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abfbf195cf6c0fba0b21972049382e546af7fd71d0fa9667d1da8fd583500ac5"} err="failed to get container status \"abfbf195cf6c0fba0b21972049382e546af7fd71d0fa9667d1da8fd583500ac5\": rpc error: code = NotFound desc = could not find container \"abfbf195cf6c0fba0b21972049382e546af7fd71d0fa9667d1da8fd583500ac5\": container with ID starting with abfbf195cf6c0fba0b21972049382e546af7fd71d0fa9667d1da8fd583500ac5 not found: ID does not exist" Dec 01 08:43:39 crc kubenswrapper[5004]: I1201 08:43:39.065753 5004 scope.go:117] "RemoveContainer" containerID="49f507d93b89c7803cbc7444933536ad98034a5df1c825ae62350a0b75a0940f" Dec 01 08:43:39 crc kubenswrapper[5004]: E1201 08:43:39.066738 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49f507d93b89c7803cbc7444933536ad98034a5df1c825ae62350a0b75a0940f\": container with ID starting with 
49f507d93b89c7803cbc7444933536ad98034a5df1c825ae62350a0b75a0940f not found: ID does not exist" containerID="49f507d93b89c7803cbc7444933536ad98034a5df1c825ae62350a0b75a0940f" Dec 01 08:43:39 crc kubenswrapper[5004]: I1201 08:43:39.066762 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49f507d93b89c7803cbc7444933536ad98034a5df1c825ae62350a0b75a0940f"} err="failed to get container status \"49f507d93b89c7803cbc7444933536ad98034a5df1c825ae62350a0b75a0940f\": rpc error: code = NotFound desc = could not find container \"49f507d93b89c7803cbc7444933536ad98034a5df1c825ae62350a0b75a0940f\": container with ID starting with 49f507d93b89c7803cbc7444933536ad98034a5df1c825ae62350a0b75a0940f not found: ID does not exist" Dec 01 08:43:39 crc kubenswrapper[5004]: I1201 08:43:39.087063 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce6cfc91-0fab-40a6-ae94-9cd690cfde01-internal-tls-certs\") pod \"ce6cfc91-0fab-40a6-ae94-9cd690cfde01\" (UID: \"ce6cfc91-0fab-40a6-ae94-9cd690cfde01\") " Dec 01 08:43:39 crc kubenswrapper[5004]: I1201 08:43:39.087414 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce6cfc91-0fab-40a6-ae94-9cd690cfde01-combined-ca-bundle\") pod \"ce6cfc91-0fab-40a6-ae94-9cd690cfde01\" (UID: \"ce6cfc91-0fab-40a6-ae94-9cd690cfde01\") " Dec 01 08:43:39 crc kubenswrapper[5004]: I1201 08:43:39.087599 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ce6cfc91-0fab-40a6-ae94-9cd690cfde01-logs\") pod \"ce6cfc91-0fab-40a6-ae94-9cd690cfde01\" (UID: \"ce6cfc91-0fab-40a6-ae94-9cd690cfde01\") " Dec 01 08:43:39 crc kubenswrapper[5004]: I1201 08:43:39.087710 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/ce6cfc91-0fab-40a6-ae94-9cd690cfde01-public-tls-certs\") pod \"ce6cfc91-0fab-40a6-ae94-9cd690cfde01\" (UID: \"ce6cfc91-0fab-40a6-ae94-9cd690cfde01\") " Dec 01 08:43:39 crc kubenswrapper[5004]: I1201 08:43:39.087735 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tfdh\" (UniqueName: \"kubernetes.io/projected/ce6cfc91-0fab-40a6-ae94-9cd690cfde01-kube-api-access-8tfdh\") pod \"ce6cfc91-0fab-40a6-ae94-9cd690cfde01\" (UID: \"ce6cfc91-0fab-40a6-ae94-9cd690cfde01\") " Dec 01 08:43:39 crc kubenswrapper[5004]: I1201 08:43:39.087800 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce6cfc91-0fab-40a6-ae94-9cd690cfde01-config-data\") pod \"ce6cfc91-0fab-40a6-ae94-9cd690cfde01\" (UID: \"ce6cfc91-0fab-40a6-ae94-9cd690cfde01\") " Dec 01 08:43:39 crc kubenswrapper[5004]: I1201 08:43:39.088238 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce6cfc91-0fab-40a6-ae94-9cd690cfde01-logs" (OuterVolumeSpecName: "logs") pod "ce6cfc91-0fab-40a6-ae94-9cd690cfde01" (UID: "ce6cfc91-0fab-40a6-ae94-9cd690cfde01"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 08:43:39 crc kubenswrapper[5004]: I1201 08:43:39.088361 5004 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ce6cfc91-0fab-40a6-ae94-9cd690cfde01-logs\") on node \"crc\" DevicePath \"\"" Dec 01 08:43:39 crc kubenswrapper[5004]: I1201 08:43:39.093931 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce6cfc91-0fab-40a6-ae94-9cd690cfde01-kube-api-access-8tfdh" (OuterVolumeSpecName: "kube-api-access-8tfdh") pod "ce6cfc91-0fab-40a6-ae94-9cd690cfde01" (UID: "ce6cfc91-0fab-40a6-ae94-9cd690cfde01"). InnerVolumeSpecName "kube-api-access-8tfdh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:43:39 crc kubenswrapper[5004]: I1201 08:43:39.125427 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce6cfc91-0fab-40a6-ae94-9cd690cfde01-config-data" (OuterVolumeSpecName: "config-data") pod "ce6cfc91-0fab-40a6-ae94-9cd690cfde01" (UID: "ce6cfc91-0fab-40a6-ae94-9cd690cfde01"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:43:39 crc kubenswrapper[5004]: I1201 08:43:39.133700 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce6cfc91-0fab-40a6-ae94-9cd690cfde01-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ce6cfc91-0fab-40a6-ae94-9cd690cfde01" (UID: "ce6cfc91-0fab-40a6-ae94-9cd690cfde01"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:43:39 crc kubenswrapper[5004]: I1201 08:43:39.137515 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 08:43:39 crc kubenswrapper[5004]: I1201 08:43:39.167058 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce6cfc91-0fab-40a6-ae94-9cd690cfde01-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "ce6cfc91-0fab-40a6-ae94-9cd690cfde01" (UID: "ce6cfc91-0fab-40a6-ae94-9cd690cfde01"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:43:39 crc kubenswrapper[5004]: I1201 08:43:39.174540 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce6cfc91-0fab-40a6-ae94-9cd690cfde01-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "ce6cfc91-0fab-40a6-ae94-9cd690cfde01" (UID: "ce6cfc91-0fab-40a6-ae94-9cd690cfde01"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:43:39 crc kubenswrapper[5004]: I1201 08:43:39.190164 5004 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce6cfc91-0fab-40a6-ae94-9cd690cfde01-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 08:43:39 crc kubenswrapper[5004]: I1201 08:43:39.190198 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tfdh\" (UniqueName: \"kubernetes.io/projected/ce6cfc91-0fab-40a6-ae94-9cd690cfde01-kube-api-access-8tfdh\") on node \"crc\" DevicePath \"\"" Dec 01 08:43:39 crc kubenswrapper[5004]: I1201 08:43:39.190209 5004 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce6cfc91-0fab-40a6-ae94-9cd690cfde01-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 01 08:43:39 crc kubenswrapper[5004]: I1201 08:43:39.190218 5004 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce6cfc91-0fab-40a6-ae94-9cd690cfde01-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 08:43:39 crc kubenswrapper[5004]: I1201 08:43:39.190226 5004 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce6cfc91-0fab-40a6-ae94-9cd690cfde01-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 01 08:43:39 crc kubenswrapper[5004]: I1201 08:43:39.337848 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 01 08:43:39 crc kubenswrapper[5004]: I1201 08:43:39.354961 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 01 08:43:39 crc kubenswrapper[5004]: I1201 08:43:39.371583 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 01 08:43:39 crc kubenswrapper[5004]: E1201 08:43:39.372129 5004 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="ce6cfc91-0fab-40a6-ae94-9cd690cfde01" containerName="nova-api-log" Dec 01 08:43:39 crc kubenswrapper[5004]: I1201 08:43:39.372149 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce6cfc91-0fab-40a6-ae94-9cd690cfde01" containerName="nova-api-log" Dec 01 08:43:39 crc kubenswrapper[5004]: E1201 08:43:39.372172 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce6cfc91-0fab-40a6-ae94-9cd690cfde01" containerName="nova-api-api" Dec 01 08:43:39 crc kubenswrapper[5004]: I1201 08:43:39.372178 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce6cfc91-0fab-40a6-ae94-9cd690cfde01" containerName="nova-api-api" Dec 01 08:43:39 crc kubenswrapper[5004]: I1201 08:43:39.372434 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce6cfc91-0fab-40a6-ae94-9cd690cfde01" containerName="nova-api-api" Dec 01 08:43:39 crc kubenswrapper[5004]: I1201 08:43:39.372457 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce6cfc91-0fab-40a6-ae94-9cd690cfde01" containerName="nova-api-log" Dec 01 08:43:39 crc kubenswrapper[5004]: I1201 08:43:39.373707 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 01 08:43:39 crc kubenswrapper[5004]: I1201 08:43:39.376269 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 01 08:43:39 crc kubenswrapper[5004]: I1201 08:43:39.376446 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Dec 01 08:43:39 crc kubenswrapper[5004]: I1201 08:43:39.376578 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Dec 01 08:43:39 crc kubenswrapper[5004]: I1201 08:43:39.382728 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 01 08:43:39 crc kubenswrapper[5004]: I1201 08:43:39.511145 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5481ba31-c298-4eb4-ab74-e4fd53d46316-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5481ba31-c298-4eb4-ab74-e4fd53d46316\") " pod="openstack/nova-api-0" Dec 01 08:43:39 crc kubenswrapper[5004]: I1201 08:43:39.511316 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d426s\" (UniqueName: \"kubernetes.io/projected/5481ba31-c298-4eb4-ab74-e4fd53d46316-kube-api-access-d426s\") pod \"nova-api-0\" (UID: \"5481ba31-c298-4eb4-ab74-e4fd53d46316\") " pod="openstack/nova-api-0" Dec 01 08:43:39 crc kubenswrapper[5004]: I1201 08:43:39.511419 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5481ba31-c298-4eb4-ab74-e4fd53d46316-config-data\") pod \"nova-api-0\" (UID: \"5481ba31-c298-4eb4-ab74-e4fd53d46316\") " pod="openstack/nova-api-0" Dec 01 08:43:39 crc kubenswrapper[5004]: I1201 08:43:39.511495 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5481ba31-c298-4eb4-ab74-e4fd53d46316-internal-tls-certs\") pod \"nova-api-0\" (UID: \"5481ba31-c298-4eb4-ab74-e4fd53d46316\") " pod="openstack/nova-api-0" Dec 01 08:43:39 crc kubenswrapper[5004]: I1201 08:43:39.511601 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5481ba31-c298-4eb4-ab74-e4fd53d46316-public-tls-certs\") pod \"nova-api-0\" (UID: \"5481ba31-c298-4eb4-ab74-e4fd53d46316\") " pod="openstack/nova-api-0" Dec 01 08:43:39 crc kubenswrapper[5004]: I1201 08:43:39.511662 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5481ba31-c298-4eb4-ab74-e4fd53d46316-logs\") pod \"nova-api-0\" (UID: \"5481ba31-c298-4eb4-ab74-e4fd53d46316\") " pod="openstack/nova-api-0" Dec 01 08:43:39 crc kubenswrapper[5004]: I1201 08:43:39.614421 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5481ba31-c298-4eb4-ab74-e4fd53d46316-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5481ba31-c298-4eb4-ab74-e4fd53d46316\") " pod="openstack/nova-api-0" Dec 01 08:43:39 crc kubenswrapper[5004]: I1201 08:43:39.614481 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d426s\" (UniqueName: \"kubernetes.io/projected/5481ba31-c298-4eb4-ab74-e4fd53d46316-kube-api-access-d426s\") pod \"nova-api-0\" (UID: \"5481ba31-c298-4eb4-ab74-e4fd53d46316\") " pod="openstack/nova-api-0" Dec 01 08:43:39 crc kubenswrapper[5004]: I1201 08:43:39.614530 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5481ba31-c298-4eb4-ab74-e4fd53d46316-config-data\") pod \"nova-api-0\" (UID: \"5481ba31-c298-4eb4-ab74-e4fd53d46316\") " 
pod="openstack/nova-api-0" Dec 01 08:43:39 crc kubenswrapper[5004]: I1201 08:43:39.614581 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5481ba31-c298-4eb4-ab74-e4fd53d46316-internal-tls-certs\") pod \"nova-api-0\" (UID: \"5481ba31-c298-4eb4-ab74-e4fd53d46316\") " pod="openstack/nova-api-0" Dec 01 08:43:39 crc kubenswrapper[5004]: I1201 08:43:39.614625 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5481ba31-c298-4eb4-ab74-e4fd53d46316-public-tls-certs\") pod \"nova-api-0\" (UID: \"5481ba31-c298-4eb4-ab74-e4fd53d46316\") " pod="openstack/nova-api-0" Dec 01 08:43:39 crc kubenswrapper[5004]: I1201 08:43:39.614657 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5481ba31-c298-4eb4-ab74-e4fd53d46316-logs\") pod \"nova-api-0\" (UID: \"5481ba31-c298-4eb4-ab74-e4fd53d46316\") " pod="openstack/nova-api-0" Dec 01 08:43:39 crc kubenswrapper[5004]: I1201 08:43:39.615078 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5481ba31-c298-4eb4-ab74-e4fd53d46316-logs\") pod \"nova-api-0\" (UID: \"5481ba31-c298-4eb4-ab74-e4fd53d46316\") " pod="openstack/nova-api-0" Dec 01 08:43:39 crc kubenswrapper[5004]: I1201 08:43:39.619909 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5481ba31-c298-4eb4-ab74-e4fd53d46316-public-tls-certs\") pod \"nova-api-0\" (UID: \"5481ba31-c298-4eb4-ab74-e4fd53d46316\") " pod="openstack/nova-api-0" Dec 01 08:43:39 crc kubenswrapper[5004]: I1201 08:43:39.620166 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5481ba31-c298-4eb4-ab74-e4fd53d46316-internal-tls-certs\") 
pod \"nova-api-0\" (UID: \"5481ba31-c298-4eb4-ab74-e4fd53d46316\") " pod="openstack/nova-api-0" Dec 01 08:43:39 crc kubenswrapper[5004]: I1201 08:43:39.621334 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5481ba31-c298-4eb4-ab74-e4fd53d46316-config-data\") pod \"nova-api-0\" (UID: \"5481ba31-c298-4eb4-ab74-e4fd53d46316\") " pod="openstack/nova-api-0" Dec 01 08:43:39 crc kubenswrapper[5004]: I1201 08:43:39.648332 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5481ba31-c298-4eb4-ab74-e4fd53d46316-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5481ba31-c298-4eb4-ab74-e4fd53d46316\") " pod="openstack/nova-api-0" Dec 01 08:43:39 crc kubenswrapper[5004]: I1201 08:43:39.651975 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d426s\" (UniqueName: \"kubernetes.io/projected/5481ba31-c298-4eb4-ab74-e4fd53d46316-kube-api-access-d426s\") pod \"nova-api-0\" (UID: \"5481ba31-c298-4eb4-ab74-e4fd53d46316\") " pod="openstack/nova-api-0" Dec 01 08:43:39 crc kubenswrapper[5004]: I1201 08:43:39.862199 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 01 08:43:40 crc kubenswrapper[5004]: I1201 08:43:40.048202 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d68b7ff7-f7f8-442f-870d-94f82b0842c1","Type":"ContainerStarted","Data":"585d0458ab888a75207ea9d85c7cb11da660723533d17ec09b6cc05f935535eb"} Dec 01 08:43:40 crc kubenswrapper[5004]: I1201 08:43:40.059191 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c51ed16a-b6da-4bc9-9d47-18ec809ba124","Type":"ContainerStarted","Data":"3539dcbccbf35e7a1610bbb3b23d31d51cf473761ac69e6cb383cdeaa25ae71b"} Dec 01 08:43:40 crc kubenswrapper[5004]: I1201 08:43:40.059237 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c51ed16a-b6da-4bc9-9d47-18ec809ba124","Type":"ContainerStarted","Data":"ce5ea97119abd68fb6e6c3532116a3148d6137e187fcb118ed41b6041f7bc8cf"} Dec 01 08:43:40 crc kubenswrapper[5004]: I1201 08:43:40.059247 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c51ed16a-b6da-4bc9-9d47-18ec809ba124","Type":"ContainerStarted","Data":"3d05d74e9a28034921896c488478f834eedcfd092c85ba2df67d9530504db72f"} Dec 01 08:43:40 crc kubenswrapper[5004]: I1201 08:43:40.072102 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.072085546 podStartE2EDuration="2.072085546s" podCreationTimestamp="2025-12-01 08:43:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:43:40.063271641 +0000 UTC m=+1597.628263633" watchObservedRunningTime="2025-12-01 08:43:40.072085546 +0000 UTC m=+1597.637077528" Dec 01 08:43:40 crc kubenswrapper[5004]: I1201 08:43:40.103874 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" 
podStartSLOduration=2.103857702 podStartE2EDuration="2.103857702s" podCreationTimestamp="2025-12-01 08:43:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:43:40.090012134 +0000 UTC m=+1597.655004116" watchObservedRunningTime="2025-12-01 08:43:40.103857702 +0000 UTC m=+1597.668849684" Dec 01 08:43:40 crc kubenswrapper[5004]: I1201 08:43:40.426431 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 01 08:43:40 crc kubenswrapper[5004]: I1201 08:43:40.799915 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce6cfc91-0fab-40a6-ae94-9cd690cfde01" path="/var/lib/kubelet/pods/ce6cfc91-0fab-40a6-ae94-9cd690cfde01/volumes" Dec 01 08:43:41 crc kubenswrapper[5004]: I1201 08:43:41.076125 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5481ba31-c298-4eb4-ab74-e4fd53d46316","Type":"ContainerStarted","Data":"c2cf75dc50aad90e8be073bbed765e1528fc3fe949b9b1b5fbf93921e9fc4532"} Dec 01 08:43:41 crc kubenswrapper[5004]: I1201 08:43:41.076421 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5481ba31-c298-4eb4-ab74-e4fd53d46316","Type":"ContainerStarted","Data":"6e8a35ba73d6d34dcd373ad324720471734d72d48d1409d98338d65649f36acb"} Dec 01 08:43:41 crc kubenswrapper[5004]: I1201 08:43:41.076437 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5481ba31-c298-4eb4-ab74-e4fd53d46316","Type":"ContainerStarted","Data":"edbf1c3910576a98631b2472adb4e766608067996bf2cff2be2299765e82b933"} Dec 01 08:43:41 crc kubenswrapper[5004]: I1201 08:43:41.105987 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.105966491 podStartE2EDuration="2.105966491s" podCreationTimestamp="2025-12-01 08:43:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:43:41.101353488 +0000 UTC m=+1598.666345480" watchObservedRunningTime="2025-12-01 08:43:41.105966491 +0000 UTC m=+1598.670958483" Dec 01 08:43:41 crc kubenswrapper[5004]: I1201 08:43:41.654943 5004 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="855d076a-eacb-40d9-9135-8a1a3cf64f59" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.245:8775/\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 01 08:43:41 crc kubenswrapper[5004]: I1201 08:43:41.654965 5004 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="855d076a-eacb-40d9-9135-8a1a3cf64f59" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.245:8775/\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 01 08:43:43 crc kubenswrapper[5004]: I1201 08:43:43.394906 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 01 08:43:43 crc kubenswrapper[5004]: I1201 08:43:43.429328 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 01 08:43:43 crc kubenswrapper[5004]: I1201 08:43:43.429462 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 01 08:43:44 crc kubenswrapper[5004]: I1201 08:43:44.529434 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-f9ffl"] Dec 01 08:43:44 crc kubenswrapper[5004]: I1201 08:43:44.533064 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-f9ffl" Dec 01 08:43:44 crc kubenswrapper[5004]: I1201 08:43:44.557148 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-f9ffl"] Dec 01 08:43:44 crc kubenswrapper[5004]: I1201 08:43:44.644022 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f80c1b8-13d7-4769-9bfd-6286aa4f7d82-catalog-content\") pod \"certified-operators-f9ffl\" (UID: \"3f80c1b8-13d7-4769-9bfd-6286aa4f7d82\") " pod="openshift-marketplace/certified-operators-f9ffl" Dec 01 08:43:44 crc kubenswrapper[5004]: I1201 08:43:44.644139 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f80c1b8-13d7-4769-9bfd-6286aa4f7d82-utilities\") pod \"certified-operators-f9ffl\" (UID: \"3f80c1b8-13d7-4769-9bfd-6286aa4f7d82\") " pod="openshift-marketplace/certified-operators-f9ffl" Dec 01 08:43:44 crc kubenswrapper[5004]: I1201 08:43:44.644182 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ks8zv\" (UniqueName: \"kubernetes.io/projected/3f80c1b8-13d7-4769-9bfd-6286aa4f7d82-kube-api-access-ks8zv\") pod \"certified-operators-f9ffl\" (UID: \"3f80c1b8-13d7-4769-9bfd-6286aa4f7d82\") " pod="openshift-marketplace/certified-operators-f9ffl" Dec 01 08:43:44 crc kubenswrapper[5004]: I1201 08:43:44.746518 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f80c1b8-13d7-4769-9bfd-6286aa4f7d82-catalog-content\") pod \"certified-operators-f9ffl\" (UID: \"3f80c1b8-13d7-4769-9bfd-6286aa4f7d82\") " pod="openshift-marketplace/certified-operators-f9ffl" Dec 01 08:43:44 crc kubenswrapper[5004]: I1201 08:43:44.746677 5004 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f80c1b8-13d7-4769-9bfd-6286aa4f7d82-utilities\") pod \"certified-operators-f9ffl\" (UID: \"3f80c1b8-13d7-4769-9bfd-6286aa4f7d82\") " pod="openshift-marketplace/certified-operators-f9ffl" Dec 01 08:43:44 crc kubenswrapper[5004]: I1201 08:43:44.746735 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ks8zv\" (UniqueName: \"kubernetes.io/projected/3f80c1b8-13d7-4769-9bfd-6286aa4f7d82-kube-api-access-ks8zv\") pod \"certified-operators-f9ffl\" (UID: \"3f80c1b8-13d7-4769-9bfd-6286aa4f7d82\") " pod="openshift-marketplace/certified-operators-f9ffl" Dec 01 08:43:44 crc kubenswrapper[5004]: I1201 08:43:44.747159 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f80c1b8-13d7-4769-9bfd-6286aa4f7d82-catalog-content\") pod \"certified-operators-f9ffl\" (UID: \"3f80c1b8-13d7-4769-9bfd-6286aa4f7d82\") " pod="openshift-marketplace/certified-operators-f9ffl" Dec 01 08:43:44 crc kubenswrapper[5004]: I1201 08:43:44.747461 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f80c1b8-13d7-4769-9bfd-6286aa4f7d82-utilities\") pod \"certified-operators-f9ffl\" (UID: \"3f80c1b8-13d7-4769-9bfd-6286aa4f7d82\") " pod="openshift-marketplace/certified-operators-f9ffl" Dec 01 08:43:44 crc kubenswrapper[5004]: I1201 08:43:44.768906 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ks8zv\" (UniqueName: \"kubernetes.io/projected/3f80c1b8-13d7-4769-9bfd-6286aa4f7d82-kube-api-access-ks8zv\") pod \"certified-operators-f9ffl\" (UID: \"3f80c1b8-13d7-4769-9bfd-6286aa4f7d82\") " pod="openshift-marketplace/certified-operators-f9ffl" Dec 01 08:43:44 crc kubenswrapper[5004]: I1201 08:43:44.869387 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-f9ffl" Dec 01 08:43:45 crc kubenswrapper[5004]: I1201 08:43:45.372979 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-f9ffl"] Dec 01 08:43:46 crc kubenswrapper[5004]: I1201 08:43:46.158283 5004 generic.go:334] "Generic (PLEG): container finished" podID="3f80c1b8-13d7-4769-9bfd-6286aa4f7d82" containerID="031a6e9bbc7850c81b31b201c2d580355757202b966aa2f15b7fd4b6910be46b" exitCode=0 Dec 01 08:43:46 crc kubenswrapper[5004]: I1201 08:43:46.158375 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f9ffl" event={"ID":"3f80c1b8-13d7-4769-9bfd-6286aa4f7d82","Type":"ContainerDied","Data":"031a6e9bbc7850c81b31b201c2d580355757202b966aa2f15b7fd4b6910be46b"} Dec 01 08:43:46 crc kubenswrapper[5004]: I1201 08:43:46.158993 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f9ffl" event={"ID":"3f80c1b8-13d7-4769-9bfd-6286aa4f7d82","Type":"ContainerStarted","Data":"42a0eb1040fbbcc2a2605f706296ae7b590f1ebda88547ab2432f11a5ffffd3e"} Dec 01 08:43:48 crc kubenswrapper[5004]: I1201 08:43:48.182502 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f9ffl" event={"ID":"3f80c1b8-13d7-4769-9bfd-6286aa4f7d82","Type":"ContainerStarted","Data":"a40bb9433fba39770211a3b6bb76b2db8bcd1515733778eab275db406ada5c8b"} Dec 01 08:43:48 crc kubenswrapper[5004]: I1201 08:43:48.408378 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 01 08:43:48 crc kubenswrapper[5004]: I1201 08:43:48.432703 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 01 08:43:48 crc kubenswrapper[5004]: I1201 08:43:48.433350 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 01 
08:43:48 crc kubenswrapper[5004]: I1201 08:43:48.450833 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 01 08:43:49 crc kubenswrapper[5004]: I1201 08:43:49.202107 5004 generic.go:334] "Generic (PLEG): container finished" podID="3f80c1b8-13d7-4769-9bfd-6286aa4f7d82" containerID="a40bb9433fba39770211a3b6bb76b2db8bcd1515733778eab275db406ada5c8b" exitCode=0 Dec 01 08:43:49 crc kubenswrapper[5004]: I1201 08:43:49.204838 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f9ffl" event={"ID":"3f80c1b8-13d7-4769-9bfd-6286aa4f7d82","Type":"ContainerDied","Data":"a40bb9433fba39770211a3b6bb76b2db8bcd1515733778eab275db406ada5c8b"} Dec 01 08:43:49 crc kubenswrapper[5004]: I1201 08:43:49.257287 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 01 08:43:49 crc kubenswrapper[5004]: I1201 08:43:49.468773 5004 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="c51ed16a-b6da-4bc9-9d47-18ec809ba124" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.0:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 01 08:43:49 crc kubenswrapper[5004]: I1201 08:43:49.468820 5004 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="c51ed16a-b6da-4bc9-9d47-18ec809ba124" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.0:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 01 08:43:49 crc kubenswrapper[5004]: I1201 08:43:49.863085 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 01 08:43:49 crc kubenswrapper[5004]: I1201 08:43:49.863463 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 01 
08:43:50 crc kubenswrapper[5004]: I1201 08:43:50.217208 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f9ffl" event={"ID":"3f80c1b8-13d7-4769-9bfd-6286aa4f7d82","Type":"ContainerStarted","Data":"99d2ac178af949b7d6ebcde9702db7a077adf0733567a0956e13916c4564c37d"} Dec 01 08:43:50 crc kubenswrapper[5004]: I1201 08:43:50.243925 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-f9ffl" podStartSLOduration=2.680916246 podStartE2EDuration="6.243904997s" podCreationTimestamp="2025-12-01 08:43:44 +0000 UTC" firstStartedPulling="2025-12-01 08:43:46.161908149 +0000 UTC m=+1603.726900151" lastFinishedPulling="2025-12-01 08:43:49.72489692 +0000 UTC m=+1607.289888902" observedRunningTime="2025-12-01 08:43:50.235387719 +0000 UTC m=+1607.800379701" watchObservedRunningTime="2025-12-01 08:43:50.243904997 +0000 UTC m=+1607.808896979" Dec 01 08:43:51 crc kubenswrapper[5004]: I1201 08:43:51.357031 5004 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="5481ba31-c298-4eb4-ab74-e4fd53d46316" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.1.1:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 01 08:43:51 crc kubenswrapper[5004]: I1201 08:43:51.357740 5004 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="5481ba31-c298-4eb4-ab74-e4fd53d46316" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.1.1:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 01 08:43:53 crc kubenswrapper[5004]: I1201 08:43:53.116089 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 01 08:43:54 crc kubenswrapper[5004]: I1201 08:43:54.869516 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/certified-operators-f9ffl" Dec 01 08:43:54 crc kubenswrapper[5004]: I1201 08:43:54.870428 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-f9ffl" Dec 01 08:43:55 crc kubenswrapper[5004]: I1201 08:43:55.950874 5004 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-f9ffl" podUID="3f80c1b8-13d7-4769-9bfd-6286aa4f7d82" containerName="registry-server" probeResult="failure" output=< Dec 01 08:43:55 crc kubenswrapper[5004]: timeout: failed to connect service ":50051" within 1s Dec 01 08:43:55 crc kubenswrapper[5004]: > Dec 01 08:43:58 crc kubenswrapper[5004]: I1201 08:43:58.435100 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 01 08:43:58 crc kubenswrapper[5004]: I1201 08:43:58.435691 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 01 08:43:58 crc kubenswrapper[5004]: I1201 08:43:58.439616 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 01 08:43:58 crc kubenswrapper[5004]: I1201 08:43:58.440448 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 01 08:43:58 crc kubenswrapper[5004]: I1201 08:43:58.849521 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 01 08:43:58 crc kubenswrapper[5004]: I1201 08:43:58.849900 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="5fcd0c1b-820e-4bc1-b3fd-22ac13415e3c" containerName="kube-state-metrics" containerID="cri-o://a3c90aa3298773adaa0bf995f7b18049ea6e437f21b642b2f55dbc4a2b40eb07" gracePeriod=30 Dec 01 08:43:59 crc kubenswrapper[5004]: I1201 08:43:59.010491 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/mysqld-exporter-0"] Dec 01 08:43:59 crc kubenswrapper[5004]: I1201 08:43:59.011110 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/mysqld-exporter-0" podUID="110193dd-2d1c-4ae1-86b3-985b039a16f0" containerName="mysqld-exporter" containerID="cri-o://1ed44d3c6abede1281aa4a670ff6d7d20fd5b8f0daf2f9f0651e0beb75b0f07f" gracePeriod=30 Dec 01 08:43:59 crc kubenswrapper[5004]: I1201 08:43:59.335959 5004 generic.go:334] "Generic (PLEG): container finished" podID="5fcd0c1b-820e-4bc1-b3fd-22ac13415e3c" containerID="a3c90aa3298773adaa0bf995f7b18049ea6e437f21b642b2f55dbc4a2b40eb07" exitCode=2 Dec 01 08:43:59 crc kubenswrapper[5004]: I1201 08:43:59.336043 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"5fcd0c1b-820e-4bc1-b3fd-22ac13415e3c","Type":"ContainerDied","Data":"a3c90aa3298773adaa0bf995f7b18049ea6e437f21b642b2f55dbc4a2b40eb07"} Dec 01 08:43:59 crc kubenswrapper[5004]: I1201 08:43:59.344322 5004 generic.go:334] "Generic (PLEG): container finished" podID="110193dd-2d1c-4ae1-86b3-985b039a16f0" containerID="1ed44d3c6abede1281aa4a670ff6d7d20fd5b8f0daf2f9f0651e0beb75b0f07f" exitCode=2 Dec 01 08:43:59 crc kubenswrapper[5004]: I1201 08:43:59.344428 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"110193dd-2d1c-4ae1-86b3-985b039a16f0","Type":"ContainerDied","Data":"1ed44d3c6abede1281aa4a670ff6d7d20fd5b8f0daf2f9f0651e0beb75b0f07f"} Dec 01 08:43:59 crc kubenswrapper[5004]: I1201 08:43:59.554960 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 01 08:43:59 crc kubenswrapper[5004]: I1201 08:43:59.561373 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-0" Dec 01 08:43:59 crc kubenswrapper[5004]: I1201 08:43:59.625481 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/110193dd-2d1c-4ae1-86b3-985b039a16f0-combined-ca-bundle\") pod \"110193dd-2d1c-4ae1-86b3-985b039a16f0\" (UID: \"110193dd-2d1c-4ae1-86b3-985b039a16f0\") " Dec 01 08:43:59 crc kubenswrapper[5004]: I1201 08:43:59.625547 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/110193dd-2d1c-4ae1-86b3-985b039a16f0-config-data\") pod \"110193dd-2d1c-4ae1-86b3-985b039a16f0\" (UID: \"110193dd-2d1c-4ae1-86b3-985b039a16f0\") " Dec 01 08:43:59 crc kubenswrapper[5004]: I1201 08:43:59.625610 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c6swp\" (UniqueName: \"kubernetes.io/projected/5fcd0c1b-820e-4bc1-b3fd-22ac13415e3c-kube-api-access-c6swp\") pod \"5fcd0c1b-820e-4bc1-b3fd-22ac13415e3c\" (UID: \"5fcd0c1b-820e-4bc1-b3fd-22ac13415e3c\") " Dec 01 08:43:59 crc kubenswrapper[5004]: I1201 08:43:59.625708 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q4xhx\" (UniqueName: \"kubernetes.io/projected/110193dd-2d1c-4ae1-86b3-985b039a16f0-kube-api-access-q4xhx\") pod \"110193dd-2d1c-4ae1-86b3-985b039a16f0\" (UID: \"110193dd-2d1c-4ae1-86b3-985b039a16f0\") " Dec 01 08:43:59 crc kubenswrapper[5004]: I1201 08:43:59.649945 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/110193dd-2d1c-4ae1-86b3-985b039a16f0-kube-api-access-q4xhx" (OuterVolumeSpecName: "kube-api-access-q4xhx") pod "110193dd-2d1c-4ae1-86b3-985b039a16f0" (UID: "110193dd-2d1c-4ae1-86b3-985b039a16f0"). InnerVolumeSpecName "kube-api-access-q4xhx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:43:59 crc kubenswrapper[5004]: I1201 08:43:59.651694 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fcd0c1b-820e-4bc1-b3fd-22ac13415e3c-kube-api-access-c6swp" (OuterVolumeSpecName: "kube-api-access-c6swp") pod "5fcd0c1b-820e-4bc1-b3fd-22ac13415e3c" (UID: "5fcd0c1b-820e-4bc1-b3fd-22ac13415e3c"). InnerVolumeSpecName "kube-api-access-c6swp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:43:59 crc kubenswrapper[5004]: I1201 08:43:59.728377 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c6swp\" (UniqueName: \"kubernetes.io/projected/5fcd0c1b-820e-4bc1-b3fd-22ac13415e3c-kube-api-access-c6swp\") on node \"crc\" DevicePath \"\"" Dec 01 08:43:59 crc kubenswrapper[5004]: I1201 08:43:59.728413 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q4xhx\" (UniqueName: \"kubernetes.io/projected/110193dd-2d1c-4ae1-86b3-985b039a16f0-kube-api-access-q4xhx\") on node \"crc\" DevicePath \"\"" Dec 01 08:43:59 crc kubenswrapper[5004]: I1201 08:43:59.741933 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/110193dd-2d1c-4ae1-86b3-985b039a16f0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "110193dd-2d1c-4ae1-86b3-985b039a16f0" (UID: "110193dd-2d1c-4ae1-86b3-985b039a16f0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:43:59 crc kubenswrapper[5004]: I1201 08:43:59.808682 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/110193dd-2d1c-4ae1-86b3-985b039a16f0-config-data" (OuterVolumeSpecName: "config-data") pod "110193dd-2d1c-4ae1-86b3-985b039a16f0" (UID: "110193dd-2d1c-4ae1-86b3-985b039a16f0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:43:59 crc kubenswrapper[5004]: I1201 08:43:59.830184 5004 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/110193dd-2d1c-4ae1-86b3-985b039a16f0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 08:43:59 crc kubenswrapper[5004]: I1201 08:43:59.830231 5004 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/110193dd-2d1c-4ae1-86b3-985b039a16f0-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 08:43:59 crc kubenswrapper[5004]: I1201 08:43:59.872967 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 01 08:43:59 crc kubenswrapper[5004]: I1201 08:43:59.874857 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 01 08:43:59 crc kubenswrapper[5004]: I1201 08:43:59.895413 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 01 08:43:59 crc kubenswrapper[5004]: I1201 08:43:59.913684 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 01 08:44:00 crc kubenswrapper[5004]: I1201 08:44:00.356207 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"5fcd0c1b-820e-4bc1-b3fd-22ac13415e3c","Type":"ContainerDied","Data":"2402adbdc1bb8af82f683494045bbbfce5c45c16acc1dd2ccf1770c5ec08bdac"} Dec 01 08:44:00 crc kubenswrapper[5004]: I1201 08:44:00.356248 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 01 08:44:00 crc kubenswrapper[5004]: I1201 08:44:00.356508 5004 scope.go:117] "RemoveContainer" containerID="a3c90aa3298773adaa0bf995f7b18049ea6e437f21b642b2f55dbc4a2b40eb07" Dec 01 08:44:00 crc kubenswrapper[5004]: I1201 08:44:00.359535 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"110193dd-2d1c-4ae1-86b3-985b039a16f0","Type":"ContainerDied","Data":"4e9ac53ce4f5cae561b79ba8b1c2d86a347f4912461ebf56af1b8c2be1c9d7eb"} Dec 01 08:44:00 crc kubenswrapper[5004]: I1201 08:44:00.359550 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-0" Dec 01 08:44:00 crc kubenswrapper[5004]: I1201 08:44:00.360381 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 01 08:44:00 crc kubenswrapper[5004]: I1201 08:44:00.386019 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 01 08:44:00 crc kubenswrapper[5004]: I1201 08:44:00.419446 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 01 08:44:00 crc kubenswrapper[5004]: I1201 08:44:00.433528 5004 scope.go:117] "RemoveContainer" containerID="1ed44d3c6abede1281aa4a670ff6d7d20fd5b8f0daf2f9f0651e0beb75b0f07f" Dec 01 08:44:00 crc kubenswrapper[5004]: I1201 08:44:00.449400 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 01 08:44:00 crc kubenswrapper[5004]: I1201 08:44:00.496743 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 01 08:44:00 crc kubenswrapper[5004]: E1201 08:44:00.497373 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="110193dd-2d1c-4ae1-86b3-985b039a16f0" containerName="mysqld-exporter" Dec 01 08:44:00 crc kubenswrapper[5004]: I1201 08:44:00.497398 5004 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="110193dd-2d1c-4ae1-86b3-985b039a16f0" containerName="mysqld-exporter" Dec 01 08:44:00 crc kubenswrapper[5004]: E1201 08:44:00.497426 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fcd0c1b-820e-4bc1-b3fd-22ac13415e3c" containerName="kube-state-metrics" Dec 01 08:44:00 crc kubenswrapper[5004]: I1201 08:44:00.497434 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fcd0c1b-820e-4bc1-b3fd-22ac13415e3c" containerName="kube-state-metrics" Dec 01 08:44:00 crc kubenswrapper[5004]: I1201 08:44:00.497741 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="5fcd0c1b-820e-4bc1-b3fd-22ac13415e3c" containerName="kube-state-metrics" Dec 01 08:44:00 crc kubenswrapper[5004]: I1201 08:44:00.497782 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="110193dd-2d1c-4ae1-86b3-985b039a16f0" containerName="mysqld-exporter" Dec 01 08:44:00 crc kubenswrapper[5004]: I1201 08:44:00.499824 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 01 08:44:00 crc kubenswrapper[5004]: I1201 08:44:00.510097 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Dec 01 08:44:00 crc kubenswrapper[5004]: I1201 08:44:00.510155 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Dec 01 08:44:00 crc kubenswrapper[5004]: I1201 08:44:00.552410 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/7e571289-bc5b-4304-bca2-994e61086e68-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"7e571289-bc5b-4304-bca2-994e61086e68\") " pod="openstack/kube-state-metrics-0" Dec 01 08:44:00 crc kubenswrapper[5004]: I1201 08:44:00.552779 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npg4c\" (UniqueName: \"kubernetes.io/projected/7e571289-bc5b-4304-bca2-994e61086e68-kube-api-access-npg4c\") pod \"kube-state-metrics-0\" (UID: \"7e571289-bc5b-4304-bca2-994e61086e68\") " pod="openstack/kube-state-metrics-0" Dec 01 08:44:00 crc kubenswrapper[5004]: I1201 08:44:00.553551 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e571289-bc5b-4304-bca2-994e61086e68-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"7e571289-bc5b-4304-bca2-994e61086e68\") " pod="openstack/kube-state-metrics-0" Dec 01 08:44:00 crc kubenswrapper[5004]: I1201 08:44:00.553845 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e571289-bc5b-4304-bca2-994e61086e68-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: 
\"7e571289-bc5b-4304-bca2-994e61086e68\") " pod="openstack/kube-state-metrics-0" Dec 01 08:44:00 crc kubenswrapper[5004]: I1201 08:44:00.554316 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 01 08:44:00 crc kubenswrapper[5004]: I1201 08:44:00.569543 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-0"] Dec 01 08:44:00 crc kubenswrapper[5004]: I1201 08:44:00.592768 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-0"] Dec 01 08:44:00 crc kubenswrapper[5004]: I1201 08:44:00.623275 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-0"] Dec 01 08:44:00 crc kubenswrapper[5004]: I1201 08:44:00.625126 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-0" Dec 01 08:44:00 crc kubenswrapper[5004]: I1201 08:44:00.627999 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"mysqld-exporter-config-data" Dec 01 08:44:00 crc kubenswrapper[5004]: I1201 08:44:00.628185 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-mysqld-exporter-svc" Dec 01 08:44:00 crc kubenswrapper[5004]: I1201 08:44:00.655675 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-0"] Dec 01 08:44:00 crc kubenswrapper[5004]: I1201 08:44:00.656118 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e571289-bc5b-4304-bca2-994e61086e68-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"7e571289-bc5b-4304-bca2-994e61086e68\") " pod="openstack/kube-state-metrics-0" Dec 01 08:44:00 crc kubenswrapper[5004]: I1201 08:44:00.656193 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cls9x\" (UniqueName: 
\"kubernetes.io/projected/4941cf08-e742-4085-a85d-5d64305aec32-kube-api-access-cls9x\") pod \"mysqld-exporter-0\" (UID: \"4941cf08-e742-4085-a85d-5d64305aec32\") " pod="openstack/mysqld-exporter-0" Dec 01 08:44:00 crc kubenswrapper[5004]: I1201 08:44:00.656275 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e571289-bc5b-4304-bca2-994e61086e68-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"7e571289-bc5b-4304-bca2-994e61086e68\") " pod="openstack/kube-state-metrics-0" Dec 01 08:44:00 crc kubenswrapper[5004]: I1201 08:44:00.656314 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4941cf08-e742-4085-a85d-5d64305aec32-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"4941cf08-e742-4085-a85d-5d64305aec32\") " pod="openstack/mysqld-exporter-0" Dec 01 08:44:00 crc kubenswrapper[5004]: I1201 08:44:00.656351 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4941cf08-e742-4085-a85d-5d64305aec32-config-data\") pod \"mysqld-exporter-0\" (UID: \"4941cf08-e742-4085-a85d-5d64305aec32\") " pod="openstack/mysqld-exporter-0" Dec 01 08:44:00 crc kubenswrapper[5004]: I1201 08:44:00.656438 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/7e571289-bc5b-4304-bca2-994e61086e68-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"7e571289-bc5b-4304-bca2-994e61086e68\") " pod="openstack/kube-state-metrics-0" Dec 01 08:44:00 crc kubenswrapper[5004]: I1201 08:44:00.656508 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-npg4c\" (UniqueName: 
\"kubernetes.io/projected/7e571289-bc5b-4304-bca2-994e61086e68-kube-api-access-npg4c\") pod \"kube-state-metrics-0\" (UID: \"7e571289-bc5b-4304-bca2-994e61086e68\") " pod="openstack/kube-state-metrics-0" Dec 01 08:44:00 crc kubenswrapper[5004]: I1201 08:44:00.656551 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mysqld-exporter-tls-certs\" (UniqueName: \"kubernetes.io/secret/4941cf08-e742-4085-a85d-5d64305aec32-mysqld-exporter-tls-certs\") pod \"mysqld-exporter-0\" (UID: \"4941cf08-e742-4085-a85d-5d64305aec32\") " pod="openstack/mysqld-exporter-0" Dec 01 08:44:00 crc kubenswrapper[5004]: I1201 08:44:00.665523 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/7e571289-bc5b-4304-bca2-994e61086e68-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"7e571289-bc5b-4304-bca2-994e61086e68\") " pod="openstack/kube-state-metrics-0" Dec 01 08:44:00 crc kubenswrapper[5004]: I1201 08:44:00.666855 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e571289-bc5b-4304-bca2-994e61086e68-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"7e571289-bc5b-4304-bca2-994e61086e68\") " pod="openstack/kube-state-metrics-0" Dec 01 08:44:00 crc kubenswrapper[5004]: I1201 08:44:00.667530 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e571289-bc5b-4304-bca2-994e61086e68-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"7e571289-bc5b-4304-bca2-994e61086e68\") " pod="openstack/kube-state-metrics-0" Dec 01 08:44:00 crc kubenswrapper[5004]: I1201 08:44:00.674421 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-npg4c\" (UniqueName: 
\"kubernetes.io/projected/7e571289-bc5b-4304-bca2-994e61086e68-kube-api-access-npg4c\") pod \"kube-state-metrics-0\" (UID: \"7e571289-bc5b-4304-bca2-994e61086e68\") " pod="openstack/kube-state-metrics-0" Dec 01 08:44:00 crc kubenswrapper[5004]: I1201 08:44:00.758374 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4941cf08-e742-4085-a85d-5d64305aec32-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"4941cf08-e742-4085-a85d-5d64305aec32\") " pod="openstack/mysqld-exporter-0" Dec 01 08:44:00 crc kubenswrapper[5004]: I1201 08:44:00.758727 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4941cf08-e742-4085-a85d-5d64305aec32-config-data\") pod \"mysqld-exporter-0\" (UID: \"4941cf08-e742-4085-a85d-5d64305aec32\") " pod="openstack/mysqld-exporter-0" Dec 01 08:44:00 crc kubenswrapper[5004]: I1201 08:44:00.758833 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mysqld-exporter-tls-certs\" (UniqueName: \"kubernetes.io/secret/4941cf08-e742-4085-a85d-5d64305aec32-mysqld-exporter-tls-certs\") pod \"mysqld-exporter-0\" (UID: \"4941cf08-e742-4085-a85d-5d64305aec32\") " pod="openstack/mysqld-exporter-0" Dec 01 08:44:00 crc kubenswrapper[5004]: I1201 08:44:00.758933 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cls9x\" (UniqueName: \"kubernetes.io/projected/4941cf08-e742-4085-a85d-5d64305aec32-kube-api-access-cls9x\") pod \"mysqld-exporter-0\" (UID: \"4941cf08-e742-4085-a85d-5d64305aec32\") " pod="openstack/mysqld-exporter-0" Dec 01 08:44:00 crc kubenswrapper[5004]: I1201 08:44:00.762771 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4941cf08-e742-4085-a85d-5d64305aec32-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: 
\"4941cf08-e742-4085-a85d-5d64305aec32\") " pod="openstack/mysqld-exporter-0" Dec 01 08:44:00 crc kubenswrapper[5004]: I1201 08:44:00.763291 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mysqld-exporter-tls-certs\" (UniqueName: \"kubernetes.io/secret/4941cf08-e742-4085-a85d-5d64305aec32-mysqld-exporter-tls-certs\") pod \"mysqld-exporter-0\" (UID: \"4941cf08-e742-4085-a85d-5d64305aec32\") " pod="openstack/mysqld-exporter-0" Dec 01 08:44:00 crc kubenswrapper[5004]: I1201 08:44:00.765825 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4941cf08-e742-4085-a85d-5d64305aec32-config-data\") pod \"mysqld-exporter-0\" (UID: \"4941cf08-e742-4085-a85d-5d64305aec32\") " pod="openstack/mysqld-exporter-0" Dec 01 08:44:00 crc kubenswrapper[5004]: I1201 08:44:00.774368 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="110193dd-2d1c-4ae1-86b3-985b039a16f0" path="/var/lib/kubelet/pods/110193dd-2d1c-4ae1-86b3-985b039a16f0/volumes" Dec 01 08:44:00 crc kubenswrapper[5004]: I1201 08:44:00.775537 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fcd0c1b-820e-4bc1-b3fd-22ac13415e3c" path="/var/lib/kubelet/pods/5fcd0c1b-820e-4bc1-b3fd-22ac13415e3c/volumes" Dec 01 08:44:00 crc kubenswrapper[5004]: I1201 08:44:00.782323 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cls9x\" (UniqueName: \"kubernetes.io/projected/4941cf08-e742-4085-a85d-5d64305aec32-kube-api-access-cls9x\") pod \"mysqld-exporter-0\" (UID: \"4941cf08-e742-4085-a85d-5d64305aec32\") " pod="openstack/mysqld-exporter-0" Dec 01 08:44:00 crc kubenswrapper[5004]: I1201 08:44:00.837957 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 01 08:44:00 crc kubenswrapper[5004]: I1201 08:44:00.947480 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-0" Dec 01 08:44:01 crc kubenswrapper[5004]: W1201 08:44:01.298919 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7e571289_bc5b_4304_bca2_994e61086e68.slice/crio-c715167271e3a95cb60d10570806e66dd50ff704ba961102c87ea3b75ff6a880 WatchSource:0}: Error finding container c715167271e3a95cb60d10570806e66dd50ff704ba961102c87ea3b75ff6a880: Status 404 returned error can't find the container with id c715167271e3a95cb60d10570806e66dd50ff704ba961102c87ea3b75ff6a880 Dec 01 08:44:01 crc kubenswrapper[5004]: I1201 08:44:01.306298 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 01 08:44:01 crc kubenswrapper[5004]: I1201 08:44:01.383141 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 08:44:01 crc kubenswrapper[5004]: I1201 08:44:01.383395 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="65ef6d1f-ede0-4db1-b3a4-005921b1ce6b" containerName="ceilometer-central-agent" containerID="cri-o://d4b07553eb9a29d74c32d8259f59105e2125c612dc02287aa874b3cb7f3d8197" gracePeriod=30 Dec 01 08:44:01 crc kubenswrapper[5004]: I1201 08:44:01.383443 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"7e571289-bc5b-4304-bca2-994e61086e68","Type":"ContainerStarted","Data":"c715167271e3a95cb60d10570806e66dd50ff704ba961102c87ea3b75ff6a880"} Dec 01 08:44:01 crc kubenswrapper[5004]: I1201 08:44:01.383527 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="65ef6d1f-ede0-4db1-b3a4-005921b1ce6b" containerName="sg-core" containerID="cri-o://33802e98e906ac47a4ba440d97522895ac7fe8fc1a101d9b3d7fe685a13c1e95" gracePeriod=30 Dec 01 08:44:01 crc kubenswrapper[5004]: I1201 08:44:01.383628 5004 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="65ef6d1f-ede0-4db1-b3a4-005921b1ce6b" containerName="proxy-httpd" containerID="cri-o://bb6c0b74cd9415897eabd78a157ae825d3e68d67450773ef51a02a56788fcf60" gracePeriod=30 Dec 01 08:44:01 crc kubenswrapper[5004]: I1201 08:44:01.383699 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="65ef6d1f-ede0-4db1-b3a4-005921b1ce6b" containerName="ceilometer-notification-agent" containerID="cri-o://d2a87718597afef8ab679f9456c35bf14d823b9b70960ddfb480d3b23a76e94e" gracePeriod=30 Dec 01 08:44:01 crc kubenswrapper[5004]: I1201 08:44:01.440353 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-0"] Dec 01 08:44:01 crc kubenswrapper[5004]: W1201 08:44:01.443805 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4941cf08_e742_4085_a85d_5d64305aec32.slice/crio-aac8600bcc029001ac315f84aaffa197ce7ba360cc8537ba2bea2d764481782f WatchSource:0}: Error finding container aac8600bcc029001ac315f84aaffa197ce7ba360cc8537ba2bea2d764481782f: Status 404 returned error can't find the container with id aac8600bcc029001ac315f84aaffa197ce7ba360cc8537ba2bea2d764481782f Dec 01 08:44:02 crc kubenswrapper[5004]: I1201 08:44:02.440386 5004 generic.go:334] "Generic (PLEG): container finished" podID="65ef6d1f-ede0-4db1-b3a4-005921b1ce6b" containerID="bb6c0b74cd9415897eabd78a157ae825d3e68d67450773ef51a02a56788fcf60" exitCode=0 Dec 01 08:44:02 crc kubenswrapper[5004]: I1201 08:44:02.441100 5004 generic.go:334] "Generic (PLEG): container finished" podID="65ef6d1f-ede0-4db1-b3a4-005921b1ce6b" containerID="33802e98e906ac47a4ba440d97522895ac7fe8fc1a101d9b3d7fe685a13c1e95" exitCode=2 Dec 01 08:44:02 crc kubenswrapper[5004]: I1201 08:44:02.441112 5004 generic.go:334] "Generic (PLEG): container finished" 
podID="65ef6d1f-ede0-4db1-b3a4-005921b1ce6b" containerID="d4b07553eb9a29d74c32d8259f59105e2125c612dc02287aa874b3cb7f3d8197" exitCode=0 Dec 01 08:44:02 crc kubenswrapper[5004]: I1201 08:44:02.440606 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"65ef6d1f-ede0-4db1-b3a4-005921b1ce6b","Type":"ContainerDied","Data":"bb6c0b74cd9415897eabd78a157ae825d3e68d67450773ef51a02a56788fcf60"} Dec 01 08:44:02 crc kubenswrapper[5004]: I1201 08:44:02.441182 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"65ef6d1f-ede0-4db1-b3a4-005921b1ce6b","Type":"ContainerDied","Data":"33802e98e906ac47a4ba440d97522895ac7fe8fc1a101d9b3d7fe685a13c1e95"} Dec 01 08:44:02 crc kubenswrapper[5004]: I1201 08:44:02.441196 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"65ef6d1f-ede0-4db1-b3a4-005921b1ce6b","Type":"ContainerDied","Data":"d4b07553eb9a29d74c32d8259f59105e2125c612dc02287aa874b3cb7f3d8197"} Dec 01 08:44:02 crc kubenswrapper[5004]: I1201 08:44:02.451258 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"4941cf08-e742-4085-a85d-5d64305aec32","Type":"ContainerStarted","Data":"aac8600bcc029001ac315f84aaffa197ce7ba360cc8537ba2bea2d764481782f"} Dec 01 08:44:02 crc kubenswrapper[5004]: I1201 08:44:02.469252 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"7e571289-bc5b-4304-bca2-994e61086e68","Type":"ContainerStarted","Data":"53f736aaedf8c107538ebfe3271977534af28874bdeb0a6e3e6ab086bb3d2080"} Dec 01 08:44:02 crc kubenswrapper[5004]: I1201 08:44:02.469474 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Dec 01 08:44:02 crc kubenswrapper[5004]: I1201 08:44:02.546434 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" 
podStartSLOduration=2.131372026 podStartE2EDuration="2.546410337s" podCreationTimestamp="2025-12-01 08:44:00 +0000 UTC" firstStartedPulling="2025-12-01 08:44:01.301888979 +0000 UTC m=+1618.866880961" lastFinishedPulling="2025-12-01 08:44:01.71692729 +0000 UTC m=+1619.281919272" observedRunningTime="2025-12-01 08:44:02.494604573 +0000 UTC m=+1620.059596555" watchObservedRunningTime="2025-12-01 08:44:02.546410337 +0000 UTC m=+1620.111402319" Dec 01 08:44:03 crc kubenswrapper[5004]: I1201 08:44:03.136413 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 08:44:03 crc kubenswrapper[5004]: I1201 08:44:03.225838 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/65ef6d1f-ede0-4db1-b3a4-005921b1ce6b-sg-core-conf-yaml\") pod \"65ef6d1f-ede0-4db1-b3a4-005921b1ce6b\" (UID: \"65ef6d1f-ede0-4db1-b3a4-005921b1ce6b\") " Dec 01 08:44:03 crc kubenswrapper[5004]: I1201 08:44:03.226137 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/65ef6d1f-ede0-4db1-b3a4-005921b1ce6b-run-httpd\") pod \"65ef6d1f-ede0-4db1-b3a4-005921b1ce6b\" (UID: \"65ef6d1f-ede0-4db1-b3a4-005921b1ce6b\") " Dec 01 08:44:03 crc kubenswrapper[5004]: I1201 08:44:03.226173 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/65ef6d1f-ede0-4db1-b3a4-005921b1ce6b-log-httpd\") pod \"65ef6d1f-ede0-4db1-b3a4-005921b1ce6b\" (UID: \"65ef6d1f-ede0-4db1-b3a4-005921b1ce6b\") " Dec 01 08:44:03 crc kubenswrapper[5004]: I1201 08:44:03.226251 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65ef6d1f-ede0-4db1-b3a4-005921b1ce6b-combined-ca-bundle\") pod \"65ef6d1f-ede0-4db1-b3a4-005921b1ce6b\" (UID: 
\"65ef6d1f-ede0-4db1-b3a4-005921b1ce6b\") " Dec 01 08:44:03 crc kubenswrapper[5004]: I1201 08:44:03.226309 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65ef6d1f-ede0-4db1-b3a4-005921b1ce6b-scripts\") pod \"65ef6d1f-ede0-4db1-b3a4-005921b1ce6b\" (UID: \"65ef6d1f-ede0-4db1-b3a4-005921b1ce6b\") " Dec 01 08:44:03 crc kubenswrapper[5004]: I1201 08:44:03.226337 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65ef6d1f-ede0-4db1-b3a4-005921b1ce6b-config-data\") pod \"65ef6d1f-ede0-4db1-b3a4-005921b1ce6b\" (UID: \"65ef6d1f-ede0-4db1-b3a4-005921b1ce6b\") " Dec 01 08:44:03 crc kubenswrapper[5004]: I1201 08:44:03.226382 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n5bjr\" (UniqueName: \"kubernetes.io/projected/65ef6d1f-ede0-4db1-b3a4-005921b1ce6b-kube-api-access-n5bjr\") pod \"65ef6d1f-ede0-4db1-b3a4-005921b1ce6b\" (UID: \"65ef6d1f-ede0-4db1-b3a4-005921b1ce6b\") " Dec 01 08:44:03 crc kubenswrapper[5004]: I1201 08:44:03.227397 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65ef6d1f-ede0-4db1-b3a4-005921b1ce6b-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "65ef6d1f-ede0-4db1-b3a4-005921b1ce6b" (UID: "65ef6d1f-ede0-4db1-b3a4-005921b1ce6b"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 08:44:03 crc kubenswrapper[5004]: I1201 08:44:03.230699 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65ef6d1f-ede0-4db1-b3a4-005921b1ce6b-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "65ef6d1f-ede0-4db1-b3a4-005921b1ce6b" (UID: "65ef6d1f-ede0-4db1-b3a4-005921b1ce6b"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 08:44:03 crc kubenswrapper[5004]: I1201 08:44:03.232636 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65ef6d1f-ede0-4db1-b3a4-005921b1ce6b-kube-api-access-n5bjr" (OuterVolumeSpecName: "kube-api-access-n5bjr") pod "65ef6d1f-ede0-4db1-b3a4-005921b1ce6b" (UID: "65ef6d1f-ede0-4db1-b3a4-005921b1ce6b"). InnerVolumeSpecName "kube-api-access-n5bjr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:44:03 crc kubenswrapper[5004]: I1201 08:44:03.240740 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65ef6d1f-ede0-4db1-b3a4-005921b1ce6b-scripts" (OuterVolumeSpecName: "scripts") pod "65ef6d1f-ede0-4db1-b3a4-005921b1ce6b" (UID: "65ef6d1f-ede0-4db1-b3a4-005921b1ce6b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:44:03 crc kubenswrapper[5004]: I1201 08:44:03.282273 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65ef6d1f-ede0-4db1-b3a4-005921b1ce6b-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "65ef6d1f-ede0-4db1-b3a4-005921b1ce6b" (UID: "65ef6d1f-ede0-4db1-b3a4-005921b1ce6b"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:44:03 crc kubenswrapper[5004]: I1201 08:44:03.329477 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n5bjr\" (UniqueName: \"kubernetes.io/projected/65ef6d1f-ede0-4db1-b3a4-005921b1ce6b-kube-api-access-n5bjr\") on node \"crc\" DevicePath \"\"" Dec 01 08:44:03 crc kubenswrapper[5004]: I1201 08:44:03.329816 5004 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/65ef6d1f-ede0-4db1-b3a4-005921b1ce6b-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 01 08:44:03 crc kubenswrapper[5004]: I1201 08:44:03.329829 5004 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/65ef6d1f-ede0-4db1-b3a4-005921b1ce6b-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 01 08:44:03 crc kubenswrapper[5004]: I1201 08:44:03.329837 5004 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/65ef6d1f-ede0-4db1-b3a4-005921b1ce6b-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 01 08:44:03 crc kubenswrapper[5004]: I1201 08:44:03.329846 5004 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65ef6d1f-ede0-4db1-b3a4-005921b1ce6b-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 08:44:03 crc kubenswrapper[5004]: I1201 08:44:03.340662 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65ef6d1f-ede0-4db1-b3a4-005921b1ce6b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "65ef6d1f-ede0-4db1-b3a4-005921b1ce6b" (UID: "65ef6d1f-ede0-4db1-b3a4-005921b1ce6b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:44:03 crc kubenswrapper[5004]: I1201 08:44:03.386452 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65ef6d1f-ede0-4db1-b3a4-005921b1ce6b-config-data" (OuterVolumeSpecName: "config-data") pod "65ef6d1f-ede0-4db1-b3a4-005921b1ce6b" (UID: "65ef6d1f-ede0-4db1-b3a4-005921b1ce6b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:44:03 crc kubenswrapper[5004]: I1201 08:44:03.432555 5004 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65ef6d1f-ede0-4db1-b3a4-005921b1ce6b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 08:44:03 crc kubenswrapper[5004]: I1201 08:44:03.432713 5004 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65ef6d1f-ede0-4db1-b3a4-005921b1ce6b-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 08:44:03 crc kubenswrapper[5004]: I1201 08:44:03.481306 5004 generic.go:334] "Generic (PLEG): container finished" podID="65ef6d1f-ede0-4db1-b3a4-005921b1ce6b" containerID="d2a87718597afef8ab679f9456c35bf14d823b9b70960ddfb480d3b23a76e94e" exitCode=0 Dec 01 08:44:03 crc kubenswrapper[5004]: I1201 08:44:03.481385 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 01 08:44:03 crc kubenswrapper[5004]: I1201 08:44:03.481394 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"65ef6d1f-ede0-4db1-b3a4-005921b1ce6b","Type":"ContainerDied","Data":"d2a87718597afef8ab679f9456c35bf14d823b9b70960ddfb480d3b23a76e94e"} Dec 01 08:44:03 crc kubenswrapper[5004]: I1201 08:44:03.482102 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"65ef6d1f-ede0-4db1-b3a4-005921b1ce6b","Type":"ContainerDied","Data":"8e68ee861d90139410a028be3bdd8c2c047df9436ca19d7d47b5998909212296"} Dec 01 08:44:03 crc kubenswrapper[5004]: I1201 08:44:03.482134 5004 scope.go:117] "RemoveContainer" containerID="bb6c0b74cd9415897eabd78a157ae825d3e68d67450773ef51a02a56788fcf60" Dec 01 08:44:03 crc kubenswrapper[5004]: I1201 08:44:03.485013 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"4941cf08-e742-4085-a85d-5d64305aec32","Type":"ContainerStarted","Data":"f08dbbe6f4b286bb405a88652278c39e502861816c426b5aaef01bbfdc78f349"} Dec 01 08:44:03 crc kubenswrapper[5004]: I1201 08:44:03.509014 5004 scope.go:117] "RemoveContainer" containerID="33802e98e906ac47a4ba440d97522895ac7fe8fc1a101d9b3d7fe685a13c1e95" Dec 01 08:44:03 crc kubenswrapper[5004]: I1201 08:44:03.528711 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mysqld-exporter-0" podStartSLOduration=2.574924879 podStartE2EDuration="3.528691491s" podCreationTimestamp="2025-12-01 08:44:00 +0000 UTC" firstStartedPulling="2025-12-01 08:44:01.451989723 +0000 UTC m=+1619.016981705" lastFinishedPulling="2025-12-01 08:44:02.405756335 +0000 UTC m=+1619.970748317" observedRunningTime="2025-12-01 08:44:03.500491605 +0000 UTC m=+1621.065483587" watchObservedRunningTime="2025-12-01 08:44:03.528691491 +0000 UTC m=+1621.093683473" Dec 01 08:44:03 crc kubenswrapper[5004]: I1201 08:44:03.561244 5004 
scope.go:117] "RemoveContainer" containerID="d2a87718597afef8ab679f9456c35bf14d823b9b70960ddfb480d3b23a76e94e" Dec 01 08:44:03 crc kubenswrapper[5004]: I1201 08:44:03.591687 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 08:44:03 crc kubenswrapper[5004]: I1201 08:44:03.598572 5004 scope.go:117] "RemoveContainer" containerID="d4b07553eb9a29d74c32d8259f59105e2125c612dc02287aa874b3cb7f3d8197" Dec 01 08:44:03 crc kubenswrapper[5004]: I1201 08:44:03.607494 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 01 08:44:03 crc kubenswrapper[5004]: I1201 08:44:03.620353 5004 scope.go:117] "RemoveContainer" containerID="bb6c0b74cd9415897eabd78a157ae825d3e68d67450773ef51a02a56788fcf60" Dec 01 08:44:03 crc kubenswrapper[5004]: E1201 08:44:03.620823 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb6c0b74cd9415897eabd78a157ae825d3e68d67450773ef51a02a56788fcf60\": container with ID starting with bb6c0b74cd9415897eabd78a157ae825d3e68d67450773ef51a02a56788fcf60 not found: ID does not exist" containerID="bb6c0b74cd9415897eabd78a157ae825d3e68d67450773ef51a02a56788fcf60" Dec 01 08:44:03 crc kubenswrapper[5004]: I1201 08:44:03.620872 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb6c0b74cd9415897eabd78a157ae825d3e68d67450773ef51a02a56788fcf60"} err="failed to get container status \"bb6c0b74cd9415897eabd78a157ae825d3e68d67450773ef51a02a56788fcf60\": rpc error: code = NotFound desc = could not find container \"bb6c0b74cd9415897eabd78a157ae825d3e68d67450773ef51a02a56788fcf60\": container with ID starting with bb6c0b74cd9415897eabd78a157ae825d3e68d67450773ef51a02a56788fcf60 not found: ID does not exist" Dec 01 08:44:03 crc kubenswrapper[5004]: I1201 08:44:03.620901 5004 scope.go:117] "RemoveContainer" 
containerID="33802e98e906ac47a4ba440d97522895ac7fe8fc1a101d9b3d7fe685a13c1e95" Dec 01 08:44:03 crc kubenswrapper[5004]: E1201 08:44:03.621331 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33802e98e906ac47a4ba440d97522895ac7fe8fc1a101d9b3d7fe685a13c1e95\": container with ID starting with 33802e98e906ac47a4ba440d97522895ac7fe8fc1a101d9b3d7fe685a13c1e95 not found: ID does not exist" containerID="33802e98e906ac47a4ba440d97522895ac7fe8fc1a101d9b3d7fe685a13c1e95" Dec 01 08:44:03 crc kubenswrapper[5004]: I1201 08:44:03.621360 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33802e98e906ac47a4ba440d97522895ac7fe8fc1a101d9b3d7fe685a13c1e95"} err="failed to get container status \"33802e98e906ac47a4ba440d97522895ac7fe8fc1a101d9b3d7fe685a13c1e95\": rpc error: code = NotFound desc = could not find container \"33802e98e906ac47a4ba440d97522895ac7fe8fc1a101d9b3d7fe685a13c1e95\": container with ID starting with 33802e98e906ac47a4ba440d97522895ac7fe8fc1a101d9b3d7fe685a13c1e95 not found: ID does not exist" Dec 01 08:44:03 crc kubenswrapper[5004]: I1201 08:44:03.621389 5004 scope.go:117] "RemoveContainer" containerID="d2a87718597afef8ab679f9456c35bf14d823b9b70960ddfb480d3b23a76e94e" Dec 01 08:44:03 crc kubenswrapper[5004]: E1201 08:44:03.621711 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2a87718597afef8ab679f9456c35bf14d823b9b70960ddfb480d3b23a76e94e\": container with ID starting with d2a87718597afef8ab679f9456c35bf14d823b9b70960ddfb480d3b23a76e94e not found: ID does not exist" containerID="d2a87718597afef8ab679f9456c35bf14d823b9b70960ddfb480d3b23a76e94e" Dec 01 08:44:03 crc kubenswrapper[5004]: I1201 08:44:03.621761 5004 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d2a87718597afef8ab679f9456c35bf14d823b9b70960ddfb480d3b23a76e94e"} err="failed to get container status \"d2a87718597afef8ab679f9456c35bf14d823b9b70960ddfb480d3b23a76e94e\": rpc error: code = NotFound desc = could not find container \"d2a87718597afef8ab679f9456c35bf14d823b9b70960ddfb480d3b23a76e94e\": container with ID starting with d2a87718597afef8ab679f9456c35bf14d823b9b70960ddfb480d3b23a76e94e not found: ID does not exist" Dec 01 08:44:03 crc kubenswrapper[5004]: I1201 08:44:03.621795 5004 scope.go:117] "RemoveContainer" containerID="d4b07553eb9a29d74c32d8259f59105e2125c612dc02287aa874b3cb7f3d8197" Dec 01 08:44:03 crc kubenswrapper[5004]: I1201 08:44:03.623306 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 01 08:44:03 crc kubenswrapper[5004]: E1201 08:44:03.623502 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4b07553eb9a29d74c32d8259f59105e2125c612dc02287aa874b3cb7f3d8197\": container with ID starting with d4b07553eb9a29d74c32d8259f59105e2125c612dc02287aa874b3cb7f3d8197 not found: ID does not exist" containerID="d4b07553eb9a29d74c32d8259f59105e2125c612dc02287aa874b3cb7f3d8197" Dec 01 08:44:03 crc kubenswrapper[5004]: I1201 08:44:03.623536 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4b07553eb9a29d74c32d8259f59105e2125c612dc02287aa874b3cb7f3d8197"} err="failed to get container status \"d4b07553eb9a29d74c32d8259f59105e2125c612dc02287aa874b3cb7f3d8197\": rpc error: code = NotFound desc = could not find container \"d4b07553eb9a29d74c32d8259f59105e2125c612dc02287aa874b3cb7f3d8197\": container with ID starting with d4b07553eb9a29d74c32d8259f59105e2125c612dc02287aa874b3cb7f3d8197 not found: ID does not exist" Dec 01 08:44:03 crc kubenswrapper[5004]: E1201 08:44:03.623798 5004 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="65ef6d1f-ede0-4db1-b3a4-005921b1ce6b" containerName="ceilometer-notification-agent" Dec 01 08:44:03 crc kubenswrapper[5004]: I1201 08:44:03.623815 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="65ef6d1f-ede0-4db1-b3a4-005921b1ce6b" containerName="ceilometer-notification-agent" Dec 01 08:44:03 crc kubenswrapper[5004]: E1201 08:44:03.623849 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65ef6d1f-ede0-4db1-b3a4-005921b1ce6b" containerName="ceilometer-central-agent" Dec 01 08:44:03 crc kubenswrapper[5004]: I1201 08:44:03.623856 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="65ef6d1f-ede0-4db1-b3a4-005921b1ce6b" containerName="ceilometer-central-agent" Dec 01 08:44:03 crc kubenswrapper[5004]: E1201 08:44:03.623870 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65ef6d1f-ede0-4db1-b3a4-005921b1ce6b" containerName="proxy-httpd" Dec 01 08:44:03 crc kubenswrapper[5004]: I1201 08:44:03.623876 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="65ef6d1f-ede0-4db1-b3a4-005921b1ce6b" containerName="proxy-httpd" Dec 01 08:44:03 crc kubenswrapper[5004]: E1201 08:44:03.623897 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65ef6d1f-ede0-4db1-b3a4-005921b1ce6b" containerName="sg-core" Dec 01 08:44:03 crc kubenswrapper[5004]: I1201 08:44:03.623902 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="65ef6d1f-ede0-4db1-b3a4-005921b1ce6b" containerName="sg-core" Dec 01 08:44:03 crc kubenswrapper[5004]: I1201 08:44:03.624471 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="65ef6d1f-ede0-4db1-b3a4-005921b1ce6b" containerName="proxy-httpd" Dec 01 08:44:03 crc kubenswrapper[5004]: I1201 08:44:03.624498 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="65ef6d1f-ede0-4db1-b3a4-005921b1ce6b" containerName="ceilometer-notification-agent" Dec 01 08:44:03 crc kubenswrapper[5004]: I1201 08:44:03.624509 5004 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="65ef6d1f-ede0-4db1-b3a4-005921b1ce6b" containerName="ceilometer-central-agent" Dec 01 08:44:03 crc kubenswrapper[5004]: I1201 08:44:03.624520 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="65ef6d1f-ede0-4db1-b3a4-005921b1ce6b" containerName="sg-core" Dec 01 08:44:03 crc kubenswrapper[5004]: I1201 08:44:03.627245 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 08:44:03 crc kubenswrapper[5004]: I1201 08:44:03.629606 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 01 08:44:03 crc kubenswrapper[5004]: I1201 08:44:03.629611 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 01 08:44:03 crc kubenswrapper[5004]: I1201 08:44:03.629727 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 01 08:44:03 crc kubenswrapper[5004]: I1201 08:44:03.653199 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 08:44:03 crc kubenswrapper[5004]: I1201 08:44:03.741666 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dm7d\" (UniqueName: \"kubernetes.io/projected/63d8f417-66f9-445d-bb60-2ad3ea77ce39-kube-api-access-5dm7d\") pod \"ceilometer-0\" (UID: \"63d8f417-66f9-445d-bb60-2ad3ea77ce39\") " pod="openstack/ceilometer-0" Dec 01 08:44:03 crc kubenswrapper[5004]: I1201 08:44:03.741714 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63d8f417-66f9-445d-bb60-2ad3ea77ce39-config-data\") pod \"ceilometer-0\" (UID: \"63d8f417-66f9-445d-bb60-2ad3ea77ce39\") " pod="openstack/ceilometer-0" Dec 01 08:44:03 crc kubenswrapper[5004]: I1201 08:44:03.741738 5004 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/63d8f417-66f9-445d-bb60-2ad3ea77ce39-run-httpd\") pod \"ceilometer-0\" (UID: \"63d8f417-66f9-445d-bb60-2ad3ea77ce39\") " pod="openstack/ceilometer-0" Dec 01 08:44:03 crc kubenswrapper[5004]: I1201 08:44:03.741777 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/63d8f417-66f9-445d-bb60-2ad3ea77ce39-log-httpd\") pod \"ceilometer-0\" (UID: \"63d8f417-66f9-445d-bb60-2ad3ea77ce39\") " pod="openstack/ceilometer-0" Dec 01 08:44:03 crc kubenswrapper[5004]: I1201 08:44:03.741884 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/63d8f417-66f9-445d-bb60-2ad3ea77ce39-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"63d8f417-66f9-445d-bb60-2ad3ea77ce39\") " pod="openstack/ceilometer-0" Dec 01 08:44:03 crc kubenswrapper[5004]: I1201 08:44:03.741912 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/63d8f417-66f9-445d-bb60-2ad3ea77ce39-scripts\") pod \"ceilometer-0\" (UID: \"63d8f417-66f9-445d-bb60-2ad3ea77ce39\") " pod="openstack/ceilometer-0" Dec 01 08:44:03 crc kubenswrapper[5004]: I1201 08:44:03.741955 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/63d8f417-66f9-445d-bb60-2ad3ea77ce39-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"63d8f417-66f9-445d-bb60-2ad3ea77ce39\") " pod="openstack/ceilometer-0" Dec 01 08:44:03 crc kubenswrapper[5004]: I1201 08:44:03.742038 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/63d8f417-66f9-445d-bb60-2ad3ea77ce39-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"63d8f417-66f9-445d-bb60-2ad3ea77ce39\") " pod="openstack/ceilometer-0" Dec 01 08:44:03 crc kubenswrapper[5004]: I1201 08:44:03.844716 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/63d8f417-66f9-445d-bb60-2ad3ea77ce39-run-httpd\") pod \"ceilometer-0\" (UID: \"63d8f417-66f9-445d-bb60-2ad3ea77ce39\") " pod="openstack/ceilometer-0" Dec 01 08:44:03 crc kubenswrapper[5004]: I1201 08:44:03.844800 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/63d8f417-66f9-445d-bb60-2ad3ea77ce39-log-httpd\") pod \"ceilometer-0\" (UID: \"63d8f417-66f9-445d-bb60-2ad3ea77ce39\") " pod="openstack/ceilometer-0" Dec 01 08:44:03 crc kubenswrapper[5004]: I1201 08:44:03.844868 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/63d8f417-66f9-445d-bb60-2ad3ea77ce39-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"63d8f417-66f9-445d-bb60-2ad3ea77ce39\") " pod="openstack/ceilometer-0" Dec 01 08:44:03 crc kubenswrapper[5004]: I1201 08:44:03.844900 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/63d8f417-66f9-445d-bb60-2ad3ea77ce39-scripts\") pod \"ceilometer-0\" (UID: \"63d8f417-66f9-445d-bb60-2ad3ea77ce39\") " pod="openstack/ceilometer-0" Dec 01 08:44:03 crc kubenswrapper[5004]: I1201 08:44:03.844930 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/63d8f417-66f9-445d-bb60-2ad3ea77ce39-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"63d8f417-66f9-445d-bb60-2ad3ea77ce39\") " pod="openstack/ceilometer-0" Dec 01 08:44:03 crc kubenswrapper[5004]: I1201 
08:44:03.844983 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63d8f417-66f9-445d-bb60-2ad3ea77ce39-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"63d8f417-66f9-445d-bb60-2ad3ea77ce39\") " pod="openstack/ceilometer-0" Dec 01 08:44:03 crc kubenswrapper[5004]: I1201 08:44:03.845105 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5dm7d\" (UniqueName: \"kubernetes.io/projected/63d8f417-66f9-445d-bb60-2ad3ea77ce39-kube-api-access-5dm7d\") pod \"ceilometer-0\" (UID: \"63d8f417-66f9-445d-bb60-2ad3ea77ce39\") " pod="openstack/ceilometer-0" Dec 01 08:44:03 crc kubenswrapper[5004]: I1201 08:44:03.845144 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63d8f417-66f9-445d-bb60-2ad3ea77ce39-config-data\") pod \"ceilometer-0\" (UID: \"63d8f417-66f9-445d-bb60-2ad3ea77ce39\") " pod="openstack/ceilometer-0" Dec 01 08:44:03 crc kubenswrapper[5004]: I1201 08:44:03.845881 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/63d8f417-66f9-445d-bb60-2ad3ea77ce39-log-httpd\") pod \"ceilometer-0\" (UID: \"63d8f417-66f9-445d-bb60-2ad3ea77ce39\") " pod="openstack/ceilometer-0" Dec 01 08:44:03 crc kubenswrapper[5004]: I1201 08:44:03.845899 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/63d8f417-66f9-445d-bb60-2ad3ea77ce39-run-httpd\") pod \"ceilometer-0\" (UID: \"63d8f417-66f9-445d-bb60-2ad3ea77ce39\") " pod="openstack/ceilometer-0" Dec 01 08:44:03 crc kubenswrapper[5004]: I1201 08:44:03.850333 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/63d8f417-66f9-445d-bb60-2ad3ea77ce39-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: 
\"63d8f417-66f9-445d-bb60-2ad3ea77ce39\") " pod="openstack/ceilometer-0" Dec 01 08:44:03 crc kubenswrapper[5004]: I1201 08:44:03.850981 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63d8f417-66f9-445d-bb60-2ad3ea77ce39-config-data\") pod \"ceilometer-0\" (UID: \"63d8f417-66f9-445d-bb60-2ad3ea77ce39\") " pod="openstack/ceilometer-0" Dec 01 08:44:03 crc kubenswrapper[5004]: I1201 08:44:03.851066 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/63d8f417-66f9-445d-bb60-2ad3ea77ce39-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"63d8f417-66f9-445d-bb60-2ad3ea77ce39\") " pod="openstack/ceilometer-0" Dec 01 08:44:03 crc kubenswrapper[5004]: I1201 08:44:03.852006 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63d8f417-66f9-445d-bb60-2ad3ea77ce39-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"63d8f417-66f9-445d-bb60-2ad3ea77ce39\") " pod="openstack/ceilometer-0" Dec 01 08:44:03 crc kubenswrapper[5004]: I1201 08:44:03.862440 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/63d8f417-66f9-445d-bb60-2ad3ea77ce39-scripts\") pod \"ceilometer-0\" (UID: \"63d8f417-66f9-445d-bb60-2ad3ea77ce39\") " pod="openstack/ceilometer-0" Dec 01 08:44:03 crc kubenswrapper[5004]: I1201 08:44:03.863884 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dm7d\" (UniqueName: \"kubernetes.io/projected/63d8f417-66f9-445d-bb60-2ad3ea77ce39-kube-api-access-5dm7d\") pod \"ceilometer-0\" (UID: \"63d8f417-66f9-445d-bb60-2ad3ea77ce39\") " pod="openstack/ceilometer-0" Dec 01 08:44:03 crc kubenswrapper[5004]: I1201 08:44:03.954065 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 01 08:44:04 crc kubenswrapper[5004]: I1201 08:44:04.514511 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 08:44:04 crc kubenswrapper[5004]: I1201 08:44:04.782673 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65ef6d1f-ede0-4db1-b3a4-005921b1ce6b" path="/var/lib/kubelet/pods/65ef6d1f-ede0-4db1-b3a4-005921b1ce6b/volumes" Dec 01 08:44:04 crc kubenswrapper[5004]: I1201 08:44:04.928521 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-f9ffl" Dec 01 08:44:05 crc kubenswrapper[5004]: I1201 08:44:05.015851 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-f9ffl" Dec 01 08:44:05 crc kubenswrapper[5004]: I1201 08:44:05.171825 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-f9ffl"] Dec 01 08:44:05 crc kubenswrapper[5004]: I1201 08:44:05.520066 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"63d8f417-66f9-445d-bb60-2ad3ea77ce39","Type":"ContainerStarted","Data":"977c984ddd7bb48a7fca0179a0e4db6cfa0fbb772d854243834696604492a252"} Dec 01 08:44:05 crc kubenswrapper[5004]: I1201 08:44:05.520110 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"63d8f417-66f9-445d-bb60-2ad3ea77ce39","Type":"ContainerStarted","Data":"85ae528f813633e23744266ff1a3a163dd2f853b79ed7066dec94ca41bd9fc73"} Dec 01 08:44:06 crc kubenswrapper[5004]: I1201 08:44:06.534370 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"63d8f417-66f9-445d-bb60-2ad3ea77ce39","Type":"ContainerStarted","Data":"c0cd25e07211c8cab96d8fb395bf9f372c17fff2919519b608968da796fd2765"} Dec 01 08:44:06 crc kubenswrapper[5004]: I1201 08:44:06.535101 5004 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-f9ffl" podUID="3f80c1b8-13d7-4769-9bfd-6286aa4f7d82" containerName="registry-server" containerID="cri-o://99d2ac178af949b7d6ebcde9702db7a077adf0733567a0956e13916c4564c37d" gracePeriod=2 Dec 01 08:44:07 crc kubenswrapper[5004]: I1201 08:44:07.098907 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-f9ffl" Dec 01 08:44:07 crc kubenswrapper[5004]: I1201 08:44:07.247725 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ks8zv\" (UniqueName: \"kubernetes.io/projected/3f80c1b8-13d7-4769-9bfd-6286aa4f7d82-kube-api-access-ks8zv\") pod \"3f80c1b8-13d7-4769-9bfd-6286aa4f7d82\" (UID: \"3f80c1b8-13d7-4769-9bfd-6286aa4f7d82\") " Dec 01 08:44:07 crc kubenswrapper[5004]: I1201 08:44:07.249342 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f80c1b8-13d7-4769-9bfd-6286aa4f7d82-catalog-content\") pod \"3f80c1b8-13d7-4769-9bfd-6286aa4f7d82\" (UID: \"3f80c1b8-13d7-4769-9bfd-6286aa4f7d82\") " Dec 01 08:44:07 crc kubenswrapper[5004]: I1201 08:44:07.251988 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f80c1b8-13d7-4769-9bfd-6286aa4f7d82-utilities\") pod \"3f80c1b8-13d7-4769-9bfd-6286aa4f7d82\" (UID: \"3f80c1b8-13d7-4769-9bfd-6286aa4f7d82\") " Dec 01 08:44:07 crc kubenswrapper[5004]: I1201 08:44:07.252478 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f80c1b8-13d7-4769-9bfd-6286aa4f7d82-utilities" (OuterVolumeSpecName: "utilities") pod "3f80c1b8-13d7-4769-9bfd-6286aa4f7d82" (UID: "3f80c1b8-13d7-4769-9bfd-6286aa4f7d82"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 08:44:07 crc kubenswrapper[5004]: I1201 08:44:07.254050 5004 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f80c1b8-13d7-4769-9bfd-6286aa4f7d82-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 08:44:07 crc kubenswrapper[5004]: I1201 08:44:07.254232 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f80c1b8-13d7-4769-9bfd-6286aa4f7d82-kube-api-access-ks8zv" (OuterVolumeSpecName: "kube-api-access-ks8zv") pod "3f80c1b8-13d7-4769-9bfd-6286aa4f7d82" (UID: "3f80c1b8-13d7-4769-9bfd-6286aa4f7d82"). InnerVolumeSpecName "kube-api-access-ks8zv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:44:07 crc kubenswrapper[5004]: I1201 08:44:07.307859 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f80c1b8-13d7-4769-9bfd-6286aa4f7d82-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3f80c1b8-13d7-4769-9bfd-6286aa4f7d82" (UID: "3f80c1b8-13d7-4769-9bfd-6286aa4f7d82"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 08:44:07 crc kubenswrapper[5004]: I1201 08:44:07.355907 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ks8zv\" (UniqueName: \"kubernetes.io/projected/3f80c1b8-13d7-4769-9bfd-6286aa4f7d82-kube-api-access-ks8zv\") on node \"crc\" DevicePath \"\"" Dec 01 08:44:07 crc kubenswrapper[5004]: I1201 08:44:07.355940 5004 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f80c1b8-13d7-4769-9bfd-6286aa4f7d82-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 08:44:07 crc kubenswrapper[5004]: I1201 08:44:07.550166 5004 generic.go:334] "Generic (PLEG): container finished" podID="3f80c1b8-13d7-4769-9bfd-6286aa4f7d82" containerID="99d2ac178af949b7d6ebcde9702db7a077adf0733567a0956e13916c4564c37d" exitCode=0 Dec 01 08:44:07 crc kubenswrapper[5004]: I1201 08:44:07.550224 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-f9ffl" Dec 01 08:44:07 crc kubenswrapper[5004]: I1201 08:44:07.550225 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f9ffl" event={"ID":"3f80c1b8-13d7-4769-9bfd-6286aa4f7d82","Type":"ContainerDied","Data":"99d2ac178af949b7d6ebcde9702db7a077adf0733567a0956e13916c4564c37d"} Dec 01 08:44:07 crc kubenswrapper[5004]: I1201 08:44:07.550267 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f9ffl" event={"ID":"3f80c1b8-13d7-4769-9bfd-6286aa4f7d82","Type":"ContainerDied","Data":"42a0eb1040fbbcc2a2605f706296ae7b590f1ebda88547ab2432f11a5ffffd3e"} Dec 01 08:44:07 crc kubenswrapper[5004]: I1201 08:44:07.550284 5004 scope.go:117] "RemoveContainer" containerID="99d2ac178af949b7d6ebcde9702db7a077adf0733567a0956e13916c4564c37d" Dec 01 08:44:07 crc kubenswrapper[5004]: I1201 08:44:07.557823 5004 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/ceilometer-0" event={"ID":"63d8f417-66f9-445d-bb60-2ad3ea77ce39","Type":"ContainerStarted","Data":"661f933c44774140587c03dd76cb5c684f2e558f3d7447a56d99a9d8f85f9218"} Dec 01 08:44:07 crc kubenswrapper[5004]: I1201 08:44:07.581955 5004 scope.go:117] "RemoveContainer" containerID="a40bb9433fba39770211a3b6bb76b2db8bcd1515733778eab275db406ada5c8b" Dec 01 08:44:07 crc kubenswrapper[5004]: I1201 08:44:07.594846 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-f9ffl"] Dec 01 08:44:07 crc kubenswrapper[5004]: I1201 08:44:07.609091 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-f9ffl"] Dec 01 08:44:07 crc kubenswrapper[5004]: I1201 08:44:07.624896 5004 scope.go:117] "RemoveContainer" containerID="031a6e9bbc7850c81b31b201c2d580355757202b966aa2f15b7fd4b6910be46b" Dec 01 08:44:07 crc kubenswrapper[5004]: I1201 08:44:07.655023 5004 scope.go:117] "RemoveContainer" containerID="99d2ac178af949b7d6ebcde9702db7a077adf0733567a0956e13916c4564c37d" Dec 01 08:44:07 crc kubenswrapper[5004]: E1201 08:44:07.658038 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99d2ac178af949b7d6ebcde9702db7a077adf0733567a0956e13916c4564c37d\": container with ID starting with 99d2ac178af949b7d6ebcde9702db7a077adf0733567a0956e13916c4564c37d not found: ID does not exist" containerID="99d2ac178af949b7d6ebcde9702db7a077adf0733567a0956e13916c4564c37d" Dec 01 08:44:07 crc kubenswrapper[5004]: I1201 08:44:07.658082 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99d2ac178af949b7d6ebcde9702db7a077adf0733567a0956e13916c4564c37d"} err="failed to get container status \"99d2ac178af949b7d6ebcde9702db7a077adf0733567a0956e13916c4564c37d\": rpc error: code = NotFound desc = could not find container \"99d2ac178af949b7d6ebcde9702db7a077adf0733567a0956e13916c4564c37d\": 
container with ID starting with 99d2ac178af949b7d6ebcde9702db7a077adf0733567a0956e13916c4564c37d not found: ID does not exist" Dec 01 08:44:07 crc kubenswrapper[5004]: I1201 08:44:07.658112 5004 scope.go:117] "RemoveContainer" containerID="a40bb9433fba39770211a3b6bb76b2db8bcd1515733778eab275db406ada5c8b" Dec 01 08:44:07 crc kubenswrapper[5004]: E1201 08:44:07.658482 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a40bb9433fba39770211a3b6bb76b2db8bcd1515733778eab275db406ada5c8b\": container with ID starting with a40bb9433fba39770211a3b6bb76b2db8bcd1515733778eab275db406ada5c8b not found: ID does not exist" containerID="a40bb9433fba39770211a3b6bb76b2db8bcd1515733778eab275db406ada5c8b" Dec 01 08:44:07 crc kubenswrapper[5004]: I1201 08:44:07.658517 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a40bb9433fba39770211a3b6bb76b2db8bcd1515733778eab275db406ada5c8b"} err="failed to get container status \"a40bb9433fba39770211a3b6bb76b2db8bcd1515733778eab275db406ada5c8b\": rpc error: code = NotFound desc = could not find container \"a40bb9433fba39770211a3b6bb76b2db8bcd1515733778eab275db406ada5c8b\": container with ID starting with a40bb9433fba39770211a3b6bb76b2db8bcd1515733778eab275db406ada5c8b not found: ID does not exist" Dec 01 08:44:07 crc kubenswrapper[5004]: I1201 08:44:07.658539 5004 scope.go:117] "RemoveContainer" containerID="031a6e9bbc7850c81b31b201c2d580355757202b966aa2f15b7fd4b6910be46b" Dec 01 08:44:07 crc kubenswrapper[5004]: E1201 08:44:07.658820 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"031a6e9bbc7850c81b31b201c2d580355757202b966aa2f15b7fd4b6910be46b\": container with ID starting with 031a6e9bbc7850c81b31b201c2d580355757202b966aa2f15b7fd4b6910be46b not found: ID does not exist" 
containerID="031a6e9bbc7850c81b31b201c2d580355757202b966aa2f15b7fd4b6910be46b" Dec 01 08:44:07 crc kubenswrapper[5004]: I1201 08:44:07.658842 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"031a6e9bbc7850c81b31b201c2d580355757202b966aa2f15b7fd4b6910be46b"} err="failed to get container status \"031a6e9bbc7850c81b31b201c2d580355757202b966aa2f15b7fd4b6910be46b\": rpc error: code = NotFound desc = could not find container \"031a6e9bbc7850c81b31b201c2d580355757202b966aa2f15b7fd4b6910be46b\": container with ID starting with 031a6e9bbc7850c81b31b201c2d580355757202b966aa2f15b7fd4b6910be46b not found: ID does not exist" Dec 01 08:44:08 crc kubenswrapper[5004]: I1201 08:44:08.591104 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"63d8f417-66f9-445d-bb60-2ad3ea77ce39","Type":"ContainerStarted","Data":"15b95416c492fa8247eb121b14bd497e4d3d82f3d518443504a7baa01666325d"} Dec 01 08:44:08 crc kubenswrapper[5004]: I1201 08:44:08.591825 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 01 08:44:08 crc kubenswrapper[5004]: I1201 08:44:08.729255 5004 patch_prober.go:28] interesting pod/machine-config-daemon-fvdgt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 08:44:08 crc kubenswrapper[5004]: I1201 08:44:08.729303 5004 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 08:44:08 crc kubenswrapper[5004]: I1201 08:44:08.772089 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="3f80c1b8-13d7-4769-9bfd-6286aa4f7d82" path="/var/lib/kubelet/pods/3f80c1b8-13d7-4769-9bfd-6286aa4f7d82/volumes" Dec 01 08:44:10 crc kubenswrapper[5004]: I1201 08:44:10.861551 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 01 08:44:10 crc kubenswrapper[5004]: I1201 08:44:10.903102 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=4.234788885 podStartE2EDuration="7.903085425s" podCreationTimestamp="2025-12-01 08:44:03 +0000 UTC" firstStartedPulling="2025-12-01 08:44:04.503639484 +0000 UTC m=+1622.068631466" lastFinishedPulling="2025-12-01 08:44:08.171936024 +0000 UTC m=+1625.736928006" observedRunningTime="2025-12-01 08:44:08.616983773 +0000 UTC m=+1626.181975785" watchObservedRunningTime="2025-12-01 08:44:10.903085425 +0000 UTC m=+1628.468077407" Dec 01 08:44:33 crc kubenswrapper[5004]: I1201 08:44:33.970942 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 01 08:44:38 crc kubenswrapper[5004]: I1201 08:44:38.729699 5004 patch_prober.go:28] interesting pod/machine-config-daemon-fvdgt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 08:44:38 crc kubenswrapper[5004]: I1201 08:44:38.730139 5004 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 08:44:38 crc kubenswrapper[5004]: I1201 08:44:38.730204 5004 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" Dec 01 08:44:38 crc kubenswrapper[5004]: I1201 08:44:38.731333 5004 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"70f45b3d2a6bb4bf0d87f18bc08b282543e238e1d391a58745914af24ad7956c"} pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 08:44:38 crc kubenswrapper[5004]: I1201 08:44:38.731422 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" containerName="machine-config-daemon" containerID="cri-o://70f45b3d2a6bb4bf0d87f18bc08b282543e238e1d391a58745914af24ad7956c" gracePeriod=600 Dec 01 08:44:38 crc kubenswrapper[5004]: E1201 08:44:38.884417 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 08:44:38 crc kubenswrapper[5004]: I1201 08:44:38.993675 5004 generic.go:334] "Generic (PLEG): container finished" podID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" containerID="70f45b3d2a6bb4bf0d87f18bc08b282543e238e1d391a58745914af24ad7956c" exitCode=0 Dec 01 08:44:38 crc kubenswrapper[5004]: I1201 08:44:38.993719 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" event={"ID":"f9977ebb-82de-4e96-8763-0b5a84f8d4ce","Type":"ContainerDied","Data":"70f45b3d2a6bb4bf0d87f18bc08b282543e238e1d391a58745914af24ad7956c"} Dec 01 08:44:38 crc 
kubenswrapper[5004]: I1201 08:44:38.993796 5004 scope.go:117] "RemoveContainer" containerID="af0bd8ad09d4d665e418c6d76caa0150a18c17d3528d47d38f4681f4edce895d" Dec 01 08:44:38 crc kubenswrapper[5004]: I1201 08:44:38.994660 5004 scope.go:117] "RemoveContainer" containerID="70f45b3d2a6bb4bf0d87f18bc08b282543e238e1d391a58745914af24ad7956c" Dec 01 08:44:38 crc kubenswrapper[5004]: E1201 08:44:38.994957 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 08:44:45 crc kubenswrapper[5004]: I1201 08:44:45.607239 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-sync-cwgb2"] Dec 01 08:44:45 crc kubenswrapper[5004]: I1201 08:44:45.619174 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-sync-cwgb2"] Dec 01 08:44:45 crc kubenswrapper[5004]: I1201 08:44:45.710739 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-sync-sskn9"] Dec 01 08:44:45 crc kubenswrapper[5004]: E1201 08:44:45.711438 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f80c1b8-13d7-4769-9bfd-6286aa4f7d82" containerName="registry-server" Dec 01 08:44:45 crc kubenswrapper[5004]: I1201 08:44:45.711657 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f80c1b8-13d7-4769-9bfd-6286aa4f7d82" containerName="registry-server" Dec 01 08:44:45 crc kubenswrapper[5004]: E1201 08:44:45.711751 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f80c1b8-13d7-4769-9bfd-6286aa4f7d82" containerName="extract-content" Dec 01 08:44:45 crc kubenswrapper[5004]: I1201 08:44:45.711806 5004 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="3f80c1b8-13d7-4769-9bfd-6286aa4f7d82" containerName="extract-content" Dec 01 08:44:45 crc kubenswrapper[5004]: E1201 08:44:45.711903 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f80c1b8-13d7-4769-9bfd-6286aa4f7d82" containerName="extract-utilities" Dec 01 08:44:45 crc kubenswrapper[5004]: I1201 08:44:45.711987 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f80c1b8-13d7-4769-9bfd-6286aa4f7d82" containerName="extract-utilities" Dec 01 08:44:45 crc kubenswrapper[5004]: I1201 08:44:45.712250 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f80c1b8-13d7-4769-9bfd-6286aa4f7d82" containerName="registry-server" Dec 01 08:44:45 crc kubenswrapper[5004]: I1201 08:44:45.713160 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-sskn9" Dec 01 08:44:45 crc kubenswrapper[5004]: I1201 08:44:45.724146 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-sskn9"] Dec 01 08:44:45 crc kubenswrapper[5004]: I1201 08:44:45.865251 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fc1c621-41a4-4fcd-ab52-dd45c3d82080-config-data\") pod \"heat-db-sync-sskn9\" (UID: \"1fc1c621-41a4-4fcd-ab52-dd45c3d82080\") " pod="openstack/heat-db-sync-sskn9" Dec 01 08:44:45 crc kubenswrapper[5004]: I1201 08:44:45.865344 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fc1c621-41a4-4fcd-ab52-dd45c3d82080-combined-ca-bundle\") pod \"heat-db-sync-sskn9\" (UID: \"1fc1c621-41a4-4fcd-ab52-dd45c3d82080\") " pod="openstack/heat-db-sync-sskn9" Dec 01 08:44:45 crc kubenswrapper[5004]: I1201 08:44:45.865361 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7km7j\" (UniqueName: 
\"kubernetes.io/projected/1fc1c621-41a4-4fcd-ab52-dd45c3d82080-kube-api-access-7km7j\") pod \"heat-db-sync-sskn9\" (UID: \"1fc1c621-41a4-4fcd-ab52-dd45c3d82080\") " pod="openstack/heat-db-sync-sskn9" Dec 01 08:44:45 crc kubenswrapper[5004]: I1201 08:44:45.967810 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fc1c621-41a4-4fcd-ab52-dd45c3d82080-config-data\") pod \"heat-db-sync-sskn9\" (UID: \"1fc1c621-41a4-4fcd-ab52-dd45c3d82080\") " pod="openstack/heat-db-sync-sskn9" Dec 01 08:44:45 crc kubenswrapper[5004]: I1201 08:44:45.967912 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fc1c621-41a4-4fcd-ab52-dd45c3d82080-combined-ca-bundle\") pod \"heat-db-sync-sskn9\" (UID: \"1fc1c621-41a4-4fcd-ab52-dd45c3d82080\") " pod="openstack/heat-db-sync-sskn9" Dec 01 08:44:45 crc kubenswrapper[5004]: I1201 08:44:45.967934 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7km7j\" (UniqueName: \"kubernetes.io/projected/1fc1c621-41a4-4fcd-ab52-dd45c3d82080-kube-api-access-7km7j\") pod \"heat-db-sync-sskn9\" (UID: \"1fc1c621-41a4-4fcd-ab52-dd45c3d82080\") " pod="openstack/heat-db-sync-sskn9" Dec 01 08:44:45 crc kubenswrapper[5004]: I1201 08:44:45.976656 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fc1c621-41a4-4fcd-ab52-dd45c3d82080-combined-ca-bundle\") pod \"heat-db-sync-sskn9\" (UID: \"1fc1c621-41a4-4fcd-ab52-dd45c3d82080\") " pod="openstack/heat-db-sync-sskn9" Dec 01 08:44:45 crc kubenswrapper[5004]: I1201 08:44:45.982306 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fc1c621-41a4-4fcd-ab52-dd45c3d82080-config-data\") pod \"heat-db-sync-sskn9\" (UID: \"1fc1c621-41a4-4fcd-ab52-dd45c3d82080\") " 
pod="openstack/heat-db-sync-sskn9" Dec 01 08:44:45 crc kubenswrapper[5004]: I1201 08:44:45.986786 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7km7j\" (UniqueName: \"kubernetes.io/projected/1fc1c621-41a4-4fcd-ab52-dd45c3d82080-kube-api-access-7km7j\") pod \"heat-db-sync-sskn9\" (UID: \"1fc1c621-41a4-4fcd-ab52-dd45c3d82080\") " pod="openstack/heat-db-sync-sskn9" Dec 01 08:44:46 crc kubenswrapper[5004]: I1201 08:44:46.116072 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-sskn9" Dec 01 08:44:46 crc kubenswrapper[5004]: I1201 08:44:46.602467 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-sskn9"] Dec 01 08:44:46 crc kubenswrapper[5004]: W1201 08:44:46.606342 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1fc1c621_41a4_4fcd_ab52_dd45c3d82080.slice/crio-32080147857fdfb086b1076e8d0fa72401785c1b6a5511ba704c2f6df380eae6 WatchSource:0}: Error finding container 32080147857fdfb086b1076e8d0fa72401785c1b6a5511ba704c2f6df380eae6: Status 404 returned error can't find the container with id 32080147857fdfb086b1076e8d0fa72401785c1b6a5511ba704c2f6df380eae6 Dec 01 08:44:46 crc kubenswrapper[5004]: I1201 08:44:46.608554 5004 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 01 08:44:46 crc kubenswrapper[5004]: I1201 08:44:46.774346 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb372dfc-6007-42ba-bc16-96f7d99d8b98" path="/var/lib/kubelet/pods/fb372dfc-6007-42ba-bc16-96f7d99d8b98/volumes" Dec 01 08:44:47 crc kubenswrapper[5004]: I1201 08:44:47.139219 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-sskn9" event={"ID":"1fc1c621-41a4-4fcd-ab52-dd45c3d82080","Type":"ContainerStarted","Data":"32080147857fdfb086b1076e8d0fa72401785c1b6a5511ba704c2f6df380eae6"} 
Dec 01 08:44:47 crc kubenswrapper[5004]: I1201 08:44:47.864530 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 01 08:44:47 crc kubenswrapper[5004]: I1201 08:44:47.928606 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 08:44:47 crc kubenswrapper[5004]: I1201 08:44:47.928896 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="63d8f417-66f9-445d-bb60-2ad3ea77ce39" containerName="ceilometer-central-agent" containerID="cri-o://977c984ddd7bb48a7fca0179a0e4db6cfa0fbb772d854243834696604492a252" gracePeriod=30 Dec 01 08:44:47 crc kubenswrapper[5004]: I1201 08:44:47.928936 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="63d8f417-66f9-445d-bb60-2ad3ea77ce39" containerName="proxy-httpd" containerID="cri-o://15b95416c492fa8247eb121b14bd497e4d3d82f3d518443504a7baa01666325d" gracePeriod=30 Dec 01 08:44:47 crc kubenswrapper[5004]: I1201 08:44:47.929015 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="63d8f417-66f9-445d-bb60-2ad3ea77ce39" containerName="sg-core" containerID="cri-o://661f933c44774140587c03dd76cb5c684f2e558f3d7447a56d99a9d8f85f9218" gracePeriod=30 Dec 01 08:44:47 crc kubenswrapper[5004]: I1201 08:44:47.929056 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="63d8f417-66f9-445d-bb60-2ad3ea77ce39" containerName="ceilometer-notification-agent" containerID="cri-o://c0cd25e07211c8cab96d8fb395bf9f372c17fff2919519b608968da796fd2765" gracePeriod=30 Dec 01 08:44:48 crc kubenswrapper[5004]: I1201 08:44:48.176153 5004 generic.go:334] "Generic (PLEG): container finished" podID="63d8f417-66f9-445d-bb60-2ad3ea77ce39" containerID="661f933c44774140587c03dd76cb5c684f2e558f3d7447a56d99a9d8f85f9218" exitCode=2 Dec 01 08:44:48 crc kubenswrapper[5004]: 
I1201 08:44:48.176201 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"63d8f417-66f9-445d-bb60-2ad3ea77ce39","Type":"ContainerDied","Data":"661f933c44774140587c03dd76cb5c684f2e558f3d7447a56d99a9d8f85f9218"} Dec 01 08:44:48 crc kubenswrapper[5004]: E1201 08:44:48.248989 5004 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod63d8f417_66f9_445d_bb60_2ad3ea77ce39.slice/crio-conmon-15b95416c492fa8247eb121b14bd497e4d3d82f3d518443504a7baa01666325d.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod63d8f417_66f9_445d_bb60_2ad3ea77ce39.slice/crio-15b95416c492fa8247eb121b14bd497e4d3d82f3d518443504a7baa01666325d.scope\": RecentStats: unable to find data in memory cache]" Dec 01 08:44:48 crc kubenswrapper[5004]: E1201 08:44:48.249235 5004 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod63d8f417_66f9_445d_bb60_2ad3ea77ce39.slice/crio-conmon-15b95416c492fa8247eb121b14bd497e4d3d82f3d518443504a7baa01666325d.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod63d8f417_66f9_445d_bb60_2ad3ea77ce39.slice/crio-15b95416c492fa8247eb121b14bd497e4d3d82f3d518443504a7baa01666325d.scope\": RecentStats: unable to find data in memory cache]" Dec 01 08:44:48 crc kubenswrapper[5004]: I1201 08:44:48.808105 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 01 08:44:49 crc kubenswrapper[5004]: I1201 08:44:49.036809 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 01 08:44:49 crc kubenswrapper[5004]: I1201 08:44:49.151704 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63d8f417-66f9-445d-bb60-2ad3ea77ce39-config-data\") pod \"63d8f417-66f9-445d-bb60-2ad3ea77ce39\" (UID: \"63d8f417-66f9-445d-bb60-2ad3ea77ce39\") " Dec 01 08:44:49 crc kubenswrapper[5004]: I1201 08:44:49.151857 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/63d8f417-66f9-445d-bb60-2ad3ea77ce39-run-httpd\") pod \"63d8f417-66f9-445d-bb60-2ad3ea77ce39\" (UID: \"63d8f417-66f9-445d-bb60-2ad3ea77ce39\") " Dec 01 08:44:49 crc kubenswrapper[5004]: I1201 08:44:49.151883 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/63d8f417-66f9-445d-bb60-2ad3ea77ce39-sg-core-conf-yaml\") pod \"63d8f417-66f9-445d-bb60-2ad3ea77ce39\" (UID: \"63d8f417-66f9-445d-bb60-2ad3ea77ce39\") " Dec 01 08:44:49 crc kubenswrapper[5004]: I1201 08:44:49.151917 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/63d8f417-66f9-445d-bb60-2ad3ea77ce39-scripts\") pod \"63d8f417-66f9-445d-bb60-2ad3ea77ce39\" (UID: \"63d8f417-66f9-445d-bb60-2ad3ea77ce39\") " Dec 01 08:44:49 crc kubenswrapper[5004]: I1201 08:44:49.152006 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/63d8f417-66f9-445d-bb60-2ad3ea77ce39-log-httpd\") pod \"63d8f417-66f9-445d-bb60-2ad3ea77ce39\" (UID: \"63d8f417-66f9-445d-bb60-2ad3ea77ce39\") " Dec 01 08:44:49 crc kubenswrapper[5004]: I1201 08:44:49.152095 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/63d8f417-66f9-445d-bb60-2ad3ea77ce39-combined-ca-bundle\") pod \"63d8f417-66f9-445d-bb60-2ad3ea77ce39\" (UID: \"63d8f417-66f9-445d-bb60-2ad3ea77ce39\") " Dec 01 08:44:49 crc kubenswrapper[5004]: I1201 08:44:49.152153 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/63d8f417-66f9-445d-bb60-2ad3ea77ce39-ceilometer-tls-certs\") pod \"63d8f417-66f9-445d-bb60-2ad3ea77ce39\" (UID: \"63d8f417-66f9-445d-bb60-2ad3ea77ce39\") " Dec 01 08:44:49 crc kubenswrapper[5004]: I1201 08:44:49.152190 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5dm7d\" (UniqueName: \"kubernetes.io/projected/63d8f417-66f9-445d-bb60-2ad3ea77ce39-kube-api-access-5dm7d\") pod \"63d8f417-66f9-445d-bb60-2ad3ea77ce39\" (UID: \"63d8f417-66f9-445d-bb60-2ad3ea77ce39\") " Dec 01 08:44:49 crc kubenswrapper[5004]: I1201 08:44:49.154205 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63d8f417-66f9-445d-bb60-2ad3ea77ce39-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "63d8f417-66f9-445d-bb60-2ad3ea77ce39" (UID: "63d8f417-66f9-445d-bb60-2ad3ea77ce39"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 08:44:49 crc kubenswrapper[5004]: I1201 08:44:49.154724 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63d8f417-66f9-445d-bb60-2ad3ea77ce39-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "63d8f417-66f9-445d-bb60-2ad3ea77ce39" (UID: "63d8f417-66f9-445d-bb60-2ad3ea77ce39"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 08:44:49 crc kubenswrapper[5004]: I1201 08:44:49.162395 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63d8f417-66f9-445d-bb60-2ad3ea77ce39-scripts" (OuterVolumeSpecName: "scripts") pod "63d8f417-66f9-445d-bb60-2ad3ea77ce39" (UID: "63d8f417-66f9-445d-bb60-2ad3ea77ce39"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:44:49 crc kubenswrapper[5004]: I1201 08:44:49.195375 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63d8f417-66f9-445d-bb60-2ad3ea77ce39-kube-api-access-5dm7d" (OuterVolumeSpecName: "kube-api-access-5dm7d") pod "63d8f417-66f9-445d-bb60-2ad3ea77ce39" (UID: "63d8f417-66f9-445d-bb60-2ad3ea77ce39"). InnerVolumeSpecName "kube-api-access-5dm7d". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:44:49 crc kubenswrapper[5004]: I1201 08:44:49.243875 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63d8f417-66f9-445d-bb60-2ad3ea77ce39-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "63d8f417-66f9-445d-bb60-2ad3ea77ce39" (UID: "63d8f417-66f9-445d-bb60-2ad3ea77ce39"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:44:49 crc kubenswrapper[5004]: I1201 08:44:49.271925 5004 generic.go:334] "Generic (PLEG): container finished" podID="63d8f417-66f9-445d-bb60-2ad3ea77ce39" containerID="15b95416c492fa8247eb121b14bd497e4d3d82f3d518443504a7baa01666325d" exitCode=0 Dec 01 08:44:49 crc kubenswrapper[5004]: I1201 08:44:49.271961 5004 generic.go:334] "Generic (PLEG): container finished" podID="63d8f417-66f9-445d-bb60-2ad3ea77ce39" containerID="c0cd25e07211c8cab96d8fb395bf9f372c17fff2919519b608968da796fd2765" exitCode=0 Dec 01 08:44:49 crc kubenswrapper[5004]: I1201 08:44:49.271970 5004 generic.go:334] "Generic (PLEG): container finished" podID="63d8f417-66f9-445d-bb60-2ad3ea77ce39" containerID="977c984ddd7bb48a7fca0179a0e4db6cfa0fbb772d854243834696604492a252" exitCode=0 Dec 01 08:44:49 crc kubenswrapper[5004]: I1201 08:44:49.271996 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"63d8f417-66f9-445d-bb60-2ad3ea77ce39","Type":"ContainerDied","Data":"15b95416c492fa8247eb121b14bd497e4d3d82f3d518443504a7baa01666325d"} Dec 01 08:44:49 crc kubenswrapper[5004]: I1201 08:44:49.272046 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"63d8f417-66f9-445d-bb60-2ad3ea77ce39","Type":"ContainerDied","Data":"c0cd25e07211c8cab96d8fb395bf9f372c17fff2919519b608968da796fd2765"} Dec 01 08:44:49 crc kubenswrapper[5004]: I1201 08:44:49.272058 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"63d8f417-66f9-445d-bb60-2ad3ea77ce39","Type":"ContainerDied","Data":"977c984ddd7bb48a7fca0179a0e4db6cfa0fbb772d854243834696604492a252"} Dec 01 08:44:49 crc kubenswrapper[5004]: I1201 08:44:49.272072 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"63d8f417-66f9-445d-bb60-2ad3ea77ce39","Type":"ContainerDied","Data":"85ae528f813633e23744266ff1a3a163dd2f853b79ed7066dec94ca41bd9fc73"} Dec 01 08:44:49 crc kubenswrapper[5004]: I1201 08:44:49.272080 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 08:44:49 crc kubenswrapper[5004]: I1201 08:44:49.272091 5004 scope.go:117] "RemoveContainer" containerID="15b95416c492fa8247eb121b14bd497e4d3d82f3d518443504a7baa01666325d" Dec 01 08:44:49 crc kubenswrapper[5004]: I1201 08:44:49.275990 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5dm7d\" (UniqueName: \"kubernetes.io/projected/63d8f417-66f9-445d-bb60-2ad3ea77ce39-kube-api-access-5dm7d\") on node \"crc\" DevicePath \"\"" Dec 01 08:44:49 crc kubenswrapper[5004]: I1201 08:44:49.276453 5004 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/63d8f417-66f9-445d-bb60-2ad3ea77ce39-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 01 08:44:49 crc kubenswrapper[5004]: I1201 08:44:49.276468 5004 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/63d8f417-66f9-445d-bb60-2ad3ea77ce39-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 01 08:44:49 crc kubenswrapper[5004]: I1201 08:44:49.276483 5004 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/63d8f417-66f9-445d-bb60-2ad3ea77ce39-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 08:44:49 crc kubenswrapper[5004]: I1201 08:44:49.276492 5004 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/63d8f417-66f9-445d-bb60-2ad3ea77ce39-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 01 08:44:49 crc kubenswrapper[5004]: I1201 08:44:49.397050 5004 scope.go:117] "RemoveContainer" 
containerID="661f933c44774140587c03dd76cb5c684f2e558f3d7447a56d99a9d8f85f9218" Dec 01 08:44:49 crc kubenswrapper[5004]: I1201 08:44:49.414819 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63d8f417-66f9-445d-bb60-2ad3ea77ce39-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "63d8f417-66f9-445d-bb60-2ad3ea77ce39" (UID: "63d8f417-66f9-445d-bb60-2ad3ea77ce39"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:44:49 crc kubenswrapper[5004]: I1201 08:44:49.473700 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63d8f417-66f9-445d-bb60-2ad3ea77ce39-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "63d8f417-66f9-445d-bb60-2ad3ea77ce39" (UID: "63d8f417-66f9-445d-bb60-2ad3ea77ce39"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:44:49 crc kubenswrapper[5004]: I1201 08:44:49.481442 5004 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63d8f417-66f9-445d-bb60-2ad3ea77ce39-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 08:44:49 crc kubenswrapper[5004]: I1201 08:44:49.481473 5004 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/63d8f417-66f9-445d-bb60-2ad3ea77ce39-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 01 08:44:49 crc kubenswrapper[5004]: I1201 08:44:49.484293 5004 scope.go:117] "RemoveContainer" containerID="c0cd25e07211c8cab96d8fb395bf9f372c17fff2919519b608968da796fd2765" Dec 01 08:44:49 crc kubenswrapper[5004]: I1201 08:44:49.528690 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63d8f417-66f9-445d-bb60-2ad3ea77ce39-config-data" (OuterVolumeSpecName: "config-data") pod 
"63d8f417-66f9-445d-bb60-2ad3ea77ce39" (UID: "63d8f417-66f9-445d-bb60-2ad3ea77ce39"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:44:49 crc kubenswrapper[5004]: I1201 08:44:49.548252 5004 scope.go:117] "RemoveContainer" containerID="977c984ddd7bb48a7fca0179a0e4db6cfa0fbb772d854243834696604492a252" Dec 01 08:44:49 crc kubenswrapper[5004]: I1201 08:44:49.582941 5004 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63d8f417-66f9-445d-bb60-2ad3ea77ce39-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 08:44:49 crc kubenswrapper[5004]: I1201 08:44:49.610749 5004 scope.go:117] "RemoveContainer" containerID="15b95416c492fa8247eb121b14bd497e4d3d82f3d518443504a7baa01666325d" Dec 01 08:44:49 crc kubenswrapper[5004]: E1201 08:44:49.611247 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15b95416c492fa8247eb121b14bd497e4d3d82f3d518443504a7baa01666325d\": container with ID starting with 15b95416c492fa8247eb121b14bd497e4d3d82f3d518443504a7baa01666325d not found: ID does not exist" containerID="15b95416c492fa8247eb121b14bd497e4d3d82f3d518443504a7baa01666325d" Dec 01 08:44:49 crc kubenswrapper[5004]: I1201 08:44:49.611293 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15b95416c492fa8247eb121b14bd497e4d3d82f3d518443504a7baa01666325d"} err="failed to get container status \"15b95416c492fa8247eb121b14bd497e4d3d82f3d518443504a7baa01666325d\": rpc error: code = NotFound desc = could not find container \"15b95416c492fa8247eb121b14bd497e4d3d82f3d518443504a7baa01666325d\": container with ID starting with 15b95416c492fa8247eb121b14bd497e4d3d82f3d518443504a7baa01666325d not found: ID does not exist" Dec 01 08:44:49 crc kubenswrapper[5004]: I1201 08:44:49.611321 5004 scope.go:117] "RemoveContainer" 
containerID="661f933c44774140587c03dd76cb5c684f2e558f3d7447a56d99a9d8f85f9218" Dec 01 08:44:49 crc kubenswrapper[5004]: E1201 08:44:49.611746 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"661f933c44774140587c03dd76cb5c684f2e558f3d7447a56d99a9d8f85f9218\": container with ID starting with 661f933c44774140587c03dd76cb5c684f2e558f3d7447a56d99a9d8f85f9218 not found: ID does not exist" containerID="661f933c44774140587c03dd76cb5c684f2e558f3d7447a56d99a9d8f85f9218" Dec 01 08:44:49 crc kubenswrapper[5004]: I1201 08:44:49.611782 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"661f933c44774140587c03dd76cb5c684f2e558f3d7447a56d99a9d8f85f9218"} err="failed to get container status \"661f933c44774140587c03dd76cb5c684f2e558f3d7447a56d99a9d8f85f9218\": rpc error: code = NotFound desc = could not find container \"661f933c44774140587c03dd76cb5c684f2e558f3d7447a56d99a9d8f85f9218\": container with ID starting with 661f933c44774140587c03dd76cb5c684f2e558f3d7447a56d99a9d8f85f9218 not found: ID does not exist" Dec 01 08:44:49 crc kubenswrapper[5004]: I1201 08:44:49.611804 5004 scope.go:117] "RemoveContainer" containerID="c0cd25e07211c8cab96d8fb395bf9f372c17fff2919519b608968da796fd2765" Dec 01 08:44:49 crc kubenswrapper[5004]: E1201 08:44:49.612189 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0cd25e07211c8cab96d8fb395bf9f372c17fff2919519b608968da796fd2765\": container with ID starting with c0cd25e07211c8cab96d8fb395bf9f372c17fff2919519b608968da796fd2765 not found: ID does not exist" containerID="c0cd25e07211c8cab96d8fb395bf9f372c17fff2919519b608968da796fd2765" Dec 01 08:44:49 crc kubenswrapper[5004]: I1201 08:44:49.612218 5004 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"c0cd25e07211c8cab96d8fb395bf9f372c17fff2919519b608968da796fd2765"} err="failed to get container status \"c0cd25e07211c8cab96d8fb395bf9f372c17fff2919519b608968da796fd2765\": rpc error: code = NotFound desc = could not find container \"c0cd25e07211c8cab96d8fb395bf9f372c17fff2919519b608968da796fd2765\": container with ID starting with c0cd25e07211c8cab96d8fb395bf9f372c17fff2919519b608968da796fd2765 not found: ID does not exist" Dec 01 08:44:49 crc kubenswrapper[5004]: I1201 08:44:49.612235 5004 scope.go:117] "RemoveContainer" containerID="977c984ddd7bb48a7fca0179a0e4db6cfa0fbb772d854243834696604492a252" Dec 01 08:44:49 crc kubenswrapper[5004]: E1201 08:44:49.612552 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"977c984ddd7bb48a7fca0179a0e4db6cfa0fbb772d854243834696604492a252\": container with ID starting with 977c984ddd7bb48a7fca0179a0e4db6cfa0fbb772d854243834696604492a252 not found: ID does not exist" containerID="977c984ddd7bb48a7fca0179a0e4db6cfa0fbb772d854243834696604492a252" Dec 01 08:44:49 crc kubenswrapper[5004]: I1201 08:44:49.612603 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"977c984ddd7bb48a7fca0179a0e4db6cfa0fbb772d854243834696604492a252"} err="failed to get container status \"977c984ddd7bb48a7fca0179a0e4db6cfa0fbb772d854243834696604492a252\": rpc error: code = NotFound desc = could not find container \"977c984ddd7bb48a7fca0179a0e4db6cfa0fbb772d854243834696604492a252\": container with ID starting with 977c984ddd7bb48a7fca0179a0e4db6cfa0fbb772d854243834696604492a252 not found: ID does not exist" Dec 01 08:44:49 crc kubenswrapper[5004]: I1201 08:44:49.612616 5004 scope.go:117] "RemoveContainer" containerID="15b95416c492fa8247eb121b14bd497e4d3d82f3d518443504a7baa01666325d" Dec 01 08:44:49 crc kubenswrapper[5004]: I1201 08:44:49.613004 5004 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"15b95416c492fa8247eb121b14bd497e4d3d82f3d518443504a7baa01666325d"} err="failed to get container status \"15b95416c492fa8247eb121b14bd497e4d3d82f3d518443504a7baa01666325d\": rpc error: code = NotFound desc = could not find container \"15b95416c492fa8247eb121b14bd497e4d3d82f3d518443504a7baa01666325d\": container with ID starting with 15b95416c492fa8247eb121b14bd497e4d3d82f3d518443504a7baa01666325d not found: ID does not exist" Dec 01 08:44:49 crc kubenswrapper[5004]: I1201 08:44:49.613061 5004 scope.go:117] "RemoveContainer" containerID="661f933c44774140587c03dd76cb5c684f2e558f3d7447a56d99a9d8f85f9218" Dec 01 08:44:49 crc kubenswrapper[5004]: I1201 08:44:49.613374 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"661f933c44774140587c03dd76cb5c684f2e558f3d7447a56d99a9d8f85f9218"} err="failed to get container status \"661f933c44774140587c03dd76cb5c684f2e558f3d7447a56d99a9d8f85f9218\": rpc error: code = NotFound desc = could not find container \"661f933c44774140587c03dd76cb5c684f2e558f3d7447a56d99a9d8f85f9218\": container with ID starting with 661f933c44774140587c03dd76cb5c684f2e558f3d7447a56d99a9d8f85f9218 not found: ID does not exist" Dec 01 08:44:49 crc kubenswrapper[5004]: I1201 08:44:49.613398 5004 scope.go:117] "RemoveContainer" containerID="c0cd25e07211c8cab96d8fb395bf9f372c17fff2919519b608968da796fd2765" Dec 01 08:44:49 crc kubenswrapper[5004]: I1201 08:44:49.613606 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0cd25e07211c8cab96d8fb395bf9f372c17fff2919519b608968da796fd2765"} err="failed to get container status \"c0cd25e07211c8cab96d8fb395bf9f372c17fff2919519b608968da796fd2765\": rpc error: code = NotFound desc = could not find container \"c0cd25e07211c8cab96d8fb395bf9f372c17fff2919519b608968da796fd2765\": container with ID starting with c0cd25e07211c8cab96d8fb395bf9f372c17fff2919519b608968da796fd2765 not 
found: ID does not exist" Dec 01 08:44:49 crc kubenswrapper[5004]: I1201 08:44:49.613625 5004 scope.go:117] "RemoveContainer" containerID="977c984ddd7bb48a7fca0179a0e4db6cfa0fbb772d854243834696604492a252" Dec 01 08:44:49 crc kubenswrapper[5004]: I1201 08:44:49.615284 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"977c984ddd7bb48a7fca0179a0e4db6cfa0fbb772d854243834696604492a252"} err="failed to get container status \"977c984ddd7bb48a7fca0179a0e4db6cfa0fbb772d854243834696604492a252\": rpc error: code = NotFound desc = could not find container \"977c984ddd7bb48a7fca0179a0e4db6cfa0fbb772d854243834696604492a252\": container with ID starting with 977c984ddd7bb48a7fca0179a0e4db6cfa0fbb772d854243834696604492a252 not found: ID does not exist" Dec 01 08:44:49 crc kubenswrapper[5004]: I1201 08:44:49.615329 5004 scope.go:117] "RemoveContainer" containerID="15b95416c492fa8247eb121b14bd497e4d3d82f3d518443504a7baa01666325d" Dec 01 08:44:49 crc kubenswrapper[5004]: I1201 08:44:49.616672 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15b95416c492fa8247eb121b14bd497e4d3d82f3d518443504a7baa01666325d"} err="failed to get container status \"15b95416c492fa8247eb121b14bd497e4d3d82f3d518443504a7baa01666325d\": rpc error: code = NotFound desc = could not find container \"15b95416c492fa8247eb121b14bd497e4d3d82f3d518443504a7baa01666325d\": container with ID starting with 15b95416c492fa8247eb121b14bd497e4d3d82f3d518443504a7baa01666325d not found: ID does not exist" Dec 01 08:44:49 crc kubenswrapper[5004]: I1201 08:44:49.616697 5004 scope.go:117] "RemoveContainer" containerID="661f933c44774140587c03dd76cb5c684f2e558f3d7447a56d99a9d8f85f9218" Dec 01 08:44:49 crc kubenswrapper[5004]: I1201 08:44:49.616949 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"661f933c44774140587c03dd76cb5c684f2e558f3d7447a56d99a9d8f85f9218"} err="failed to get 
container status \"661f933c44774140587c03dd76cb5c684f2e558f3d7447a56d99a9d8f85f9218\": rpc error: code = NotFound desc = could not find container \"661f933c44774140587c03dd76cb5c684f2e558f3d7447a56d99a9d8f85f9218\": container with ID starting with 661f933c44774140587c03dd76cb5c684f2e558f3d7447a56d99a9d8f85f9218 not found: ID does not exist" Dec 01 08:44:49 crc kubenswrapper[5004]: I1201 08:44:49.616970 5004 scope.go:117] "RemoveContainer" containerID="c0cd25e07211c8cab96d8fb395bf9f372c17fff2919519b608968da796fd2765" Dec 01 08:44:49 crc kubenswrapper[5004]: I1201 08:44:49.617142 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0cd25e07211c8cab96d8fb395bf9f372c17fff2919519b608968da796fd2765"} err="failed to get container status \"c0cd25e07211c8cab96d8fb395bf9f372c17fff2919519b608968da796fd2765\": rpc error: code = NotFound desc = could not find container \"c0cd25e07211c8cab96d8fb395bf9f372c17fff2919519b608968da796fd2765\": container with ID starting with c0cd25e07211c8cab96d8fb395bf9f372c17fff2919519b608968da796fd2765 not found: ID does not exist" Dec 01 08:44:49 crc kubenswrapper[5004]: I1201 08:44:49.617160 5004 scope.go:117] "RemoveContainer" containerID="977c984ddd7bb48a7fca0179a0e4db6cfa0fbb772d854243834696604492a252" Dec 01 08:44:49 crc kubenswrapper[5004]: I1201 08:44:49.617331 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"977c984ddd7bb48a7fca0179a0e4db6cfa0fbb772d854243834696604492a252"} err="failed to get container status \"977c984ddd7bb48a7fca0179a0e4db6cfa0fbb772d854243834696604492a252\": rpc error: code = NotFound desc = could not find container \"977c984ddd7bb48a7fca0179a0e4db6cfa0fbb772d854243834696604492a252\": container with ID starting with 977c984ddd7bb48a7fca0179a0e4db6cfa0fbb772d854243834696604492a252 not found: ID does not exist" Dec 01 08:44:49 crc kubenswrapper[5004]: I1201 08:44:49.624773 5004 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/ceilometer-0"] Dec 01 08:44:49 crc kubenswrapper[5004]: I1201 08:44:49.640115 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 01 08:44:49 crc kubenswrapper[5004]: I1201 08:44:49.653995 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 01 08:44:49 crc kubenswrapper[5004]: E1201 08:44:49.654627 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63d8f417-66f9-445d-bb60-2ad3ea77ce39" containerName="ceilometer-central-agent" Dec 01 08:44:49 crc kubenswrapper[5004]: I1201 08:44:49.654648 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="63d8f417-66f9-445d-bb60-2ad3ea77ce39" containerName="ceilometer-central-agent" Dec 01 08:44:49 crc kubenswrapper[5004]: E1201 08:44:49.654679 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63d8f417-66f9-445d-bb60-2ad3ea77ce39" containerName="proxy-httpd" Dec 01 08:44:49 crc kubenswrapper[5004]: I1201 08:44:49.654687 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="63d8f417-66f9-445d-bb60-2ad3ea77ce39" containerName="proxy-httpd" Dec 01 08:44:49 crc kubenswrapper[5004]: E1201 08:44:49.654703 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63d8f417-66f9-445d-bb60-2ad3ea77ce39" containerName="ceilometer-notification-agent" Dec 01 08:44:49 crc kubenswrapper[5004]: I1201 08:44:49.654711 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="63d8f417-66f9-445d-bb60-2ad3ea77ce39" containerName="ceilometer-notification-agent" Dec 01 08:44:49 crc kubenswrapper[5004]: E1201 08:44:49.654724 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63d8f417-66f9-445d-bb60-2ad3ea77ce39" containerName="sg-core" Dec 01 08:44:49 crc kubenswrapper[5004]: I1201 08:44:49.654731 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="63d8f417-66f9-445d-bb60-2ad3ea77ce39" containerName="sg-core" Dec 01 08:44:49 crc kubenswrapper[5004]: I1201 
08:44:49.654968 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="63d8f417-66f9-445d-bb60-2ad3ea77ce39" containerName="sg-core" Dec 01 08:44:49 crc kubenswrapper[5004]: I1201 08:44:49.655003 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="63d8f417-66f9-445d-bb60-2ad3ea77ce39" containerName="ceilometer-notification-agent" Dec 01 08:44:49 crc kubenswrapper[5004]: I1201 08:44:49.655016 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="63d8f417-66f9-445d-bb60-2ad3ea77ce39" containerName="ceilometer-central-agent" Dec 01 08:44:49 crc kubenswrapper[5004]: I1201 08:44:49.655028 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="63d8f417-66f9-445d-bb60-2ad3ea77ce39" containerName="proxy-httpd" Dec 01 08:44:49 crc kubenswrapper[5004]: I1201 08:44:49.657150 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 08:44:49 crc kubenswrapper[5004]: I1201 08:44:49.664643 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 08:44:49 crc kubenswrapper[5004]: I1201 08:44:49.667875 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 01 08:44:49 crc kubenswrapper[5004]: I1201 08:44:49.668198 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 01 08:44:49 crc kubenswrapper[5004]: I1201 08:44:49.668393 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 01 08:44:49 crc kubenswrapper[5004]: I1201 08:44:49.795879 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5cdcf7de-1cb2-4071-b7e9-cdf08f6ff5c7-run-httpd\") pod \"ceilometer-0\" (UID: \"5cdcf7de-1cb2-4071-b7e9-cdf08f6ff5c7\") " pod="openstack/ceilometer-0" Dec 01 08:44:49 crc 
kubenswrapper[5004]: I1201 08:44:49.796183 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5cdcf7de-1cb2-4071-b7e9-cdf08f6ff5c7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5cdcf7de-1cb2-4071-b7e9-cdf08f6ff5c7\") " pod="openstack/ceilometer-0" Dec 01 08:44:49 crc kubenswrapper[5004]: I1201 08:44:49.796212 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5cdcf7de-1cb2-4071-b7e9-cdf08f6ff5c7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5cdcf7de-1cb2-4071-b7e9-cdf08f6ff5c7\") " pod="openstack/ceilometer-0" Dec 01 08:44:49 crc kubenswrapper[5004]: I1201 08:44:49.796233 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9kh2\" (UniqueName: \"kubernetes.io/projected/5cdcf7de-1cb2-4071-b7e9-cdf08f6ff5c7-kube-api-access-r9kh2\") pod \"ceilometer-0\" (UID: \"5cdcf7de-1cb2-4071-b7e9-cdf08f6ff5c7\") " pod="openstack/ceilometer-0" Dec 01 08:44:49 crc kubenswrapper[5004]: I1201 08:44:49.796268 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5cdcf7de-1cb2-4071-b7e9-cdf08f6ff5c7-config-data\") pod \"ceilometer-0\" (UID: \"5cdcf7de-1cb2-4071-b7e9-cdf08f6ff5c7\") " pod="openstack/ceilometer-0" Dec 01 08:44:49 crc kubenswrapper[5004]: I1201 08:44:49.796329 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5cdcf7de-1cb2-4071-b7e9-cdf08f6ff5c7-scripts\") pod \"ceilometer-0\" (UID: \"5cdcf7de-1cb2-4071-b7e9-cdf08f6ff5c7\") " pod="openstack/ceilometer-0" Dec 01 08:44:49 crc kubenswrapper[5004]: I1201 08:44:49.796354 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5cdcf7de-1cb2-4071-b7e9-cdf08f6ff5c7-log-httpd\") pod \"ceilometer-0\" (UID: \"5cdcf7de-1cb2-4071-b7e9-cdf08f6ff5c7\") " pod="openstack/ceilometer-0" Dec 01 08:44:49 crc kubenswrapper[5004]: I1201 08:44:49.796421 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5cdcf7de-1cb2-4071-b7e9-cdf08f6ff5c7-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"5cdcf7de-1cb2-4071-b7e9-cdf08f6ff5c7\") " pod="openstack/ceilometer-0" Dec 01 08:44:49 crc kubenswrapper[5004]: I1201 08:44:49.897925 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5cdcf7de-1cb2-4071-b7e9-cdf08f6ff5c7-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"5cdcf7de-1cb2-4071-b7e9-cdf08f6ff5c7\") " pod="openstack/ceilometer-0" Dec 01 08:44:49 crc kubenswrapper[5004]: I1201 08:44:49.898069 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5cdcf7de-1cb2-4071-b7e9-cdf08f6ff5c7-run-httpd\") pod \"ceilometer-0\" (UID: \"5cdcf7de-1cb2-4071-b7e9-cdf08f6ff5c7\") " pod="openstack/ceilometer-0" Dec 01 08:44:49 crc kubenswrapper[5004]: I1201 08:44:49.898113 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5cdcf7de-1cb2-4071-b7e9-cdf08f6ff5c7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5cdcf7de-1cb2-4071-b7e9-cdf08f6ff5c7\") " pod="openstack/ceilometer-0" Dec 01 08:44:49 crc kubenswrapper[5004]: I1201 08:44:49.898137 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5cdcf7de-1cb2-4071-b7e9-cdf08f6ff5c7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"5cdcf7de-1cb2-4071-b7e9-cdf08f6ff5c7\") " pod="openstack/ceilometer-0" Dec 01 08:44:49 crc kubenswrapper[5004]: I1201 08:44:49.898159 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9kh2\" (UniqueName: \"kubernetes.io/projected/5cdcf7de-1cb2-4071-b7e9-cdf08f6ff5c7-kube-api-access-r9kh2\") pod \"ceilometer-0\" (UID: \"5cdcf7de-1cb2-4071-b7e9-cdf08f6ff5c7\") " pod="openstack/ceilometer-0" Dec 01 08:44:49 crc kubenswrapper[5004]: I1201 08:44:49.898220 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5cdcf7de-1cb2-4071-b7e9-cdf08f6ff5c7-config-data\") pod \"ceilometer-0\" (UID: \"5cdcf7de-1cb2-4071-b7e9-cdf08f6ff5c7\") " pod="openstack/ceilometer-0" Dec 01 08:44:49 crc kubenswrapper[5004]: I1201 08:44:49.898292 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5cdcf7de-1cb2-4071-b7e9-cdf08f6ff5c7-scripts\") pod \"ceilometer-0\" (UID: \"5cdcf7de-1cb2-4071-b7e9-cdf08f6ff5c7\") " pod="openstack/ceilometer-0" Dec 01 08:44:49 crc kubenswrapper[5004]: I1201 08:44:49.898316 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5cdcf7de-1cb2-4071-b7e9-cdf08f6ff5c7-log-httpd\") pod \"ceilometer-0\" (UID: \"5cdcf7de-1cb2-4071-b7e9-cdf08f6ff5c7\") " pod="openstack/ceilometer-0" Dec 01 08:44:49 crc kubenswrapper[5004]: I1201 08:44:49.898882 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5cdcf7de-1cb2-4071-b7e9-cdf08f6ff5c7-log-httpd\") pod \"ceilometer-0\" (UID: \"5cdcf7de-1cb2-4071-b7e9-cdf08f6ff5c7\") " pod="openstack/ceilometer-0" Dec 01 08:44:49 crc kubenswrapper[5004]: I1201 08:44:49.899046 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/5cdcf7de-1cb2-4071-b7e9-cdf08f6ff5c7-run-httpd\") pod \"ceilometer-0\" (UID: \"5cdcf7de-1cb2-4071-b7e9-cdf08f6ff5c7\") " pod="openstack/ceilometer-0" Dec 01 08:44:49 crc kubenswrapper[5004]: I1201 08:44:49.907375 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5cdcf7de-1cb2-4071-b7e9-cdf08f6ff5c7-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"5cdcf7de-1cb2-4071-b7e9-cdf08f6ff5c7\") " pod="openstack/ceilometer-0" Dec 01 08:44:49 crc kubenswrapper[5004]: I1201 08:44:49.910294 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5cdcf7de-1cb2-4071-b7e9-cdf08f6ff5c7-scripts\") pod \"ceilometer-0\" (UID: \"5cdcf7de-1cb2-4071-b7e9-cdf08f6ff5c7\") " pod="openstack/ceilometer-0" Dec 01 08:44:49 crc kubenswrapper[5004]: I1201 08:44:49.912919 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5cdcf7de-1cb2-4071-b7e9-cdf08f6ff5c7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5cdcf7de-1cb2-4071-b7e9-cdf08f6ff5c7\") " pod="openstack/ceilometer-0" Dec 01 08:44:49 crc kubenswrapper[5004]: I1201 08:44:49.927337 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9kh2\" (UniqueName: \"kubernetes.io/projected/5cdcf7de-1cb2-4071-b7e9-cdf08f6ff5c7-kube-api-access-r9kh2\") pod \"ceilometer-0\" (UID: \"5cdcf7de-1cb2-4071-b7e9-cdf08f6ff5c7\") " pod="openstack/ceilometer-0" Dec 01 08:44:49 crc kubenswrapper[5004]: I1201 08:44:49.930210 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5cdcf7de-1cb2-4071-b7e9-cdf08f6ff5c7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5cdcf7de-1cb2-4071-b7e9-cdf08f6ff5c7\") " pod="openstack/ceilometer-0" Dec 01 08:44:49 crc kubenswrapper[5004]: I1201 08:44:49.936460 
5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5cdcf7de-1cb2-4071-b7e9-cdf08f6ff5c7-config-data\") pod \"ceilometer-0\" (UID: \"5cdcf7de-1cb2-4071-b7e9-cdf08f6ff5c7\") " pod="openstack/ceilometer-0" Dec 01 08:44:50 crc kubenswrapper[5004]: I1201 08:44:50.011028 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 08:44:50 crc kubenswrapper[5004]: I1201 08:44:50.557221 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 08:44:50 crc kubenswrapper[5004]: W1201 08:44:50.567969 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5cdcf7de_1cb2_4071_b7e9_cdf08f6ff5c7.slice/crio-10940cbab93d738be6419be6d358702ab9042acf77834cbc95e3a405fab40e75 WatchSource:0}: Error finding container 10940cbab93d738be6419be6d358702ab9042acf77834cbc95e3a405fab40e75: Status 404 returned error can't find the container with id 10940cbab93d738be6419be6d358702ab9042acf77834cbc95e3a405fab40e75 Dec 01 08:44:50 crc kubenswrapper[5004]: I1201 08:44:50.772691 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63d8f417-66f9-445d-bb60-2ad3ea77ce39" path="/var/lib/kubelet/pods/63d8f417-66f9-445d-bb60-2ad3ea77ce39/volumes" Dec 01 08:44:51 crc kubenswrapper[5004]: I1201 08:44:51.309112 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5cdcf7de-1cb2-4071-b7e9-cdf08f6ff5c7","Type":"ContainerStarted","Data":"10940cbab93d738be6419be6d358702ab9042acf77834cbc95e3a405fab40e75"} Dec 01 08:44:51 crc kubenswrapper[5004]: I1201 08:44:51.758956 5004 scope.go:117] "RemoveContainer" containerID="70f45b3d2a6bb4bf0d87f18bc08b282543e238e1d391a58745914af24ad7956c" Dec 01 08:44:51 crc kubenswrapper[5004]: E1201 08:44:51.759255 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 08:44:53 crc kubenswrapper[5004]: I1201 08:44:53.208918 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="e17a426c-0069-4c51-91ad-e5fbf6e0bb2a" containerName="rabbitmq" containerID="cri-o://5c7f10624a418d25374fc5f6d483787b7f89ecdac57f34c3dd622c1e3de143e9" gracePeriod=604795 Dec 01 08:44:53 crc kubenswrapper[5004]: I1201 08:44:53.707322 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="b571e2e5-2a78-45af-83aa-3d874b2569b3" containerName="rabbitmq" containerID="cri-o://8781fe91187bcd78fc6abd4d08c3d2787079b1137a12a60f1c7b65f90c2a6635" gracePeriod=604796 Dec 01 08:44:56 crc kubenswrapper[5004]: I1201 08:44:56.689928 5004 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="e17a426c-0069-4c51-91ad-e5fbf6e0bb2a" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.129:5671: connect: connection refused" Dec 01 08:44:57 crc kubenswrapper[5004]: I1201 08:44:57.079289 5004 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="b571e2e5-2a78-45af-83aa-3d874b2569b3" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.130:5671: connect: connection refused" Dec 01 08:45:00 crc kubenswrapper[5004]: I1201 08:45:00.157167 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409645-5722b"] Dec 01 08:45:00 crc kubenswrapper[5004]: I1201 08:45:00.160215 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409645-5722b" Dec 01 08:45:00 crc kubenswrapper[5004]: I1201 08:45:00.162582 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 01 08:45:00 crc kubenswrapper[5004]: I1201 08:45:00.163043 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 01 08:45:00 crc kubenswrapper[5004]: I1201 08:45:00.169804 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409645-5722b"] Dec 01 08:45:00 crc kubenswrapper[5004]: I1201 08:45:00.359649 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgn7s\" (UniqueName: \"kubernetes.io/projected/14e9ea25-5306-4134-8e77-dde9901fceb5-kube-api-access-bgn7s\") pod \"collect-profiles-29409645-5722b\" (UID: \"14e9ea25-5306-4134-8e77-dde9901fceb5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409645-5722b" Dec 01 08:45:00 crc kubenswrapper[5004]: I1201 08:45:00.359800 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/14e9ea25-5306-4134-8e77-dde9901fceb5-secret-volume\") pod \"collect-profiles-29409645-5722b\" (UID: \"14e9ea25-5306-4134-8e77-dde9901fceb5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409645-5722b" Dec 01 08:45:00 crc kubenswrapper[5004]: I1201 08:45:00.359846 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/14e9ea25-5306-4134-8e77-dde9901fceb5-config-volume\") pod \"collect-profiles-29409645-5722b\" (UID: \"14e9ea25-5306-4134-8e77-dde9901fceb5\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29409645-5722b" Dec 01 08:45:00 crc kubenswrapper[5004]: I1201 08:45:00.462253 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/14e9ea25-5306-4134-8e77-dde9901fceb5-secret-volume\") pod \"collect-profiles-29409645-5722b\" (UID: \"14e9ea25-5306-4134-8e77-dde9901fceb5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409645-5722b" Dec 01 08:45:00 crc kubenswrapper[5004]: I1201 08:45:00.462379 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/14e9ea25-5306-4134-8e77-dde9901fceb5-config-volume\") pod \"collect-profiles-29409645-5722b\" (UID: \"14e9ea25-5306-4134-8e77-dde9901fceb5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409645-5722b" Dec 01 08:45:00 crc kubenswrapper[5004]: I1201 08:45:00.463579 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/14e9ea25-5306-4134-8e77-dde9901fceb5-config-volume\") pod \"collect-profiles-29409645-5722b\" (UID: \"14e9ea25-5306-4134-8e77-dde9901fceb5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409645-5722b" Dec 01 08:45:00 crc kubenswrapper[5004]: I1201 08:45:00.468952 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bgn7s\" (UniqueName: \"kubernetes.io/projected/14e9ea25-5306-4134-8e77-dde9901fceb5-kube-api-access-bgn7s\") pod \"collect-profiles-29409645-5722b\" (UID: \"14e9ea25-5306-4134-8e77-dde9901fceb5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409645-5722b" Dec 01 08:45:00 crc kubenswrapper[5004]: I1201 08:45:00.477602 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/14e9ea25-5306-4134-8e77-dde9901fceb5-secret-volume\") pod \"collect-profiles-29409645-5722b\" (UID: \"14e9ea25-5306-4134-8e77-dde9901fceb5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409645-5722b" Dec 01 08:45:00 crc kubenswrapper[5004]: I1201 08:45:00.519229 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgn7s\" (UniqueName: \"kubernetes.io/projected/14e9ea25-5306-4134-8e77-dde9901fceb5-kube-api-access-bgn7s\") pod \"collect-profiles-29409645-5722b\" (UID: \"14e9ea25-5306-4134-8e77-dde9901fceb5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409645-5722b" Dec 01 08:45:00 crc kubenswrapper[5004]: I1201 08:45:00.815341 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409645-5722b" Dec 01 08:45:01 crc kubenswrapper[5004]: I1201 08:45:01.479649 5004 generic.go:334] "Generic (PLEG): container finished" podID="e17a426c-0069-4c51-91ad-e5fbf6e0bb2a" containerID="5c7f10624a418d25374fc5f6d483787b7f89ecdac57f34c3dd622c1e3de143e9" exitCode=0 Dec 01 08:45:01 crc kubenswrapper[5004]: I1201 08:45:01.479706 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e17a426c-0069-4c51-91ad-e5fbf6e0bb2a","Type":"ContainerDied","Data":"5c7f10624a418d25374fc5f6d483787b7f89ecdac57f34c3dd622c1e3de143e9"} Dec 01 08:45:01 crc kubenswrapper[5004]: I1201 08:45:01.482022 5004 generic.go:334] "Generic (PLEG): container finished" podID="b571e2e5-2a78-45af-83aa-3d874b2569b3" containerID="8781fe91187bcd78fc6abd4d08c3d2787079b1137a12a60f1c7b65f90c2a6635" exitCode=0 Dec 01 08:45:01 crc kubenswrapper[5004]: I1201 08:45:01.482067 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"b571e2e5-2a78-45af-83aa-3d874b2569b3","Type":"ContainerDied","Data":"8781fe91187bcd78fc6abd4d08c3d2787079b1137a12a60f1c7b65f90c2a6635"} Dec 01 08:45:02 crc kubenswrapper[5004]: I1201 08:45:02.951441 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-594cb89c79-lt85z"] Dec 01 08:45:02 crc kubenswrapper[5004]: I1201 08:45:02.956310 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-594cb89c79-lt85z" Dec 01 08:45:02 crc kubenswrapper[5004]: I1201 08:45:02.959132 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Dec 01 08:45:03 crc kubenswrapper[5004]: I1201 08:45:03.001606 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-594cb89c79-lt85z"] Dec 01 08:45:03 crc kubenswrapper[5004]: I1201 08:45:03.059077 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/37126008-a0f7-45ad-a2b6-ff127083b74e-ovsdbserver-nb\") pod \"dnsmasq-dns-594cb89c79-lt85z\" (UID: \"37126008-a0f7-45ad-a2b6-ff127083b74e\") " pod="openstack/dnsmasq-dns-594cb89c79-lt85z" Dec 01 08:45:03 crc kubenswrapper[5004]: I1201 08:45:03.059133 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/37126008-a0f7-45ad-a2b6-ff127083b74e-openstack-edpm-ipam\") pod \"dnsmasq-dns-594cb89c79-lt85z\" (UID: \"37126008-a0f7-45ad-a2b6-ff127083b74e\") " pod="openstack/dnsmasq-dns-594cb89c79-lt85z" Dec 01 08:45:03 crc kubenswrapper[5004]: I1201 08:45:03.059151 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/37126008-a0f7-45ad-a2b6-ff127083b74e-dns-swift-storage-0\") pod \"dnsmasq-dns-594cb89c79-lt85z\" (UID: 
\"37126008-a0f7-45ad-a2b6-ff127083b74e\") " pod="openstack/dnsmasq-dns-594cb89c79-lt85z" Dec 01 08:45:03 crc kubenswrapper[5004]: I1201 08:45:03.059208 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2l562\" (UniqueName: \"kubernetes.io/projected/37126008-a0f7-45ad-a2b6-ff127083b74e-kube-api-access-2l562\") pod \"dnsmasq-dns-594cb89c79-lt85z\" (UID: \"37126008-a0f7-45ad-a2b6-ff127083b74e\") " pod="openstack/dnsmasq-dns-594cb89c79-lt85z" Dec 01 08:45:03 crc kubenswrapper[5004]: I1201 08:45:03.059255 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37126008-a0f7-45ad-a2b6-ff127083b74e-config\") pod \"dnsmasq-dns-594cb89c79-lt85z\" (UID: \"37126008-a0f7-45ad-a2b6-ff127083b74e\") " pod="openstack/dnsmasq-dns-594cb89c79-lt85z" Dec 01 08:45:03 crc kubenswrapper[5004]: I1201 08:45:03.059283 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/37126008-a0f7-45ad-a2b6-ff127083b74e-ovsdbserver-sb\") pod \"dnsmasq-dns-594cb89c79-lt85z\" (UID: \"37126008-a0f7-45ad-a2b6-ff127083b74e\") " pod="openstack/dnsmasq-dns-594cb89c79-lt85z" Dec 01 08:45:03 crc kubenswrapper[5004]: I1201 08:45:03.059320 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/37126008-a0f7-45ad-a2b6-ff127083b74e-dns-svc\") pod \"dnsmasq-dns-594cb89c79-lt85z\" (UID: \"37126008-a0f7-45ad-a2b6-ff127083b74e\") " pod="openstack/dnsmasq-dns-594cb89c79-lt85z" Dec 01 08:45:03 crc kubenswrapper[5004]: I1201 08:45:03.161692 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/37126008-a0f7-45ad-a2b6-ff127083b74e-ovsdbserver-nb\") pod \"dnsmasq-dns-594cb89c79-lt85z\" 
(UID: \"37126008-a0f7-45ad-a2b6-ff127083b74e\") " pod="openstack/dnsmasq-dns-594cb89c79-lt85z" Dec 01 08:45:03 crc kubenswrapper[5004]: I1201 08:45:03.161768 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/37126008-a0f7-45ad-a2b6-ff127083b74e-openstack-edpm-ipam\") pod \"dnsmasq-dns-594cb89c79-lt85z\" (UID: \"37126008-a0f7-45ad-a2b6-ff127083b74e\") " pod="openstack/dnsmasq-dns-594cb89c79-lt85z" Dec 01 08:45:03 crc kubenswrapper[5004]: I1201 08:45:03.161788 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/37126008-a0f7-45ad-a2b6-ff127083b74e-dns-swift-storage-0\") pod \"dnsmasq-dns-594cb89c79-lt85z\" (UID: \"37126008-a0f7-45ad-a2b6-ff127083b74e\") " pod="openstack/dnsmasq-dns-594cb89c79-lt85z" Dec 01 08:45:03 crc kubenswrapper[5004]: I1201 08:45:03.161846 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2l562\" (UniqueName: \"kubernetes.io/projected/37126008-a0f7-45ad-a2b6-ff127083b74e-kube-api-access-2l562\") pod \"dnsmasq-dns-594cb89c79-lt85z\" (UID: \"37126008-a0f7-45ad-a2b6-ff127083b74e\") " pod="openstack/dnsmasq-dns-594cb89c79-lt85z" Dec 01 08:45:03 crc kubenswrapper[5004]: I1201 08:45:03.161891 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37126008-a0f7-45ad-a2b6-ff127083b74e-config\") pod \"dnsmasq-dns-594cb89c79-lt85z\" (UID: \"37126008-a0f7-45ad-a2b6-ff127083b74e\") " pod="openstack/dnsmasq-dns-594cb89c79-lt85z" Dec 01 08:45:03 crc kubenswrapper[5004]: I1201 08:45:03.161921 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/37126008-a0f7-45ad-a2b6-ff127083b74e-ovsdbserver-sb\") pod \"dnsmasq-dns-594cb89c79-lt85z\" (UID: 
\"37126008-a0f7-45ad-a2b6-ff127083b74e\") " pod="openstack/dnsmasq-dns-594cb89c79-lt85z" Dec 01 08:45:03 crc kubenswrapper[5004]: I1201 08:45:03.161956 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/37126008-a0f7-45ad-a2b6-ff127083b74e-dns-svc\") pod \"dnsmasq-dns-594cb89c79-lt85z\" (UID: \"37126008-a0f7-45ad-a2b6-ff127083b74e\") " pod="openstack/dnsmasq-dns-594cb89c79-lt85z" Dec 01 08:45:03 crc kubenswrapper[5004]: I1201 08:45:03.162948 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/37126008-a0f7-45ad-a2b6-ff127083b74e-dns-svc\") pod \"dnsmasq-dns-594cb89c79-lt85z\" (UID: \"37126008-a0f7-45ad-a2b6-ff127083b74e\") " pod="openstack/dnsmasq-dns-594cb89c79-lt85z" Dec 01 08:45:03 crc kubenswrapper[5004]: I1201 08:45:03.163092 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/37126008-a0f7-45ad-a2b6-ff127083b74e-dns-swift-storage-0\") pod \"dnsmasq-dns-594cb89c79-lt85z\" (UID: \"37126008-a0f7-45ad-a2b6-ff127083b74e\") " pod="openstack/dnsmasq-dns-594cb89c79-lt85z" Dec 01 08:45:03 crc kubenswrapper[5004]: I1201 08:45:03.163659 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/37126008-a0f7-45ad-a2b6-ff127083b74e-ovsdbserver-nb\") pod \"dnsmasq-dns-594cb89c79-lt85z\" (UID: \"37126008-a0f7-45ad-a2b6-ff127083b74e\") " pod="openstack/dnsmasq-dns-594cb89c79-lt85z" Dec 01 08:45:03 crc kubenswrapper[5004]: I1201 08:45:03.164178 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/37126008-a0f7-45ad-a2b6-ff127083b74e-openstack-edpm-ipam\") pod \"dnsmasq-dns-594cb89c79-lt85z\" (UID: \"37126008-a0f7-45ad-a2b6-ff127083b74e\") " pod="openstack/dnsmasq-dns-594cb89c79-lt85z" Dec 
01 08:45:03 crc kubenswrapper[5004]: I1201 08:45:03.164710 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37126008-a0f7-45ad-a2b6-ff127083b74e-config\") pod \"dnsmasq-dns-594cb89c79-lt85z\" (UID: \"37126008-a0f7-45ad-a2b6-ff127083b74e\") " pod="openstack/dnsmasq-dns-594cb89c79-lt85z" Dec 01 08:45:03 crc kubenswrapper[5004]: I1201 08:45:03.165003 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/37126008-a0f7-45ad-a2b6-ff127083b74e-ovsdbserver-sb\") pod \"dnsmasq-dns-594cb89c79-lt85z\" (UID: \"37126008-a0f7-45ad-a2b6-ff127083b74e\") " pod="openstack/dnsmasq-dns-594cb89c79-lt85z" Dec 01 08:45:03 crc kubenswrapper[5004]: I1201 08:45:03.203497 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2l562\" (UniqueName: \"kubernetes.io/projected/37126008-a0f7-45ad-a2b6-ff127083b74e-kube-api-access-2l562\") pod \"dnsmasq-dns-594cb89c79-lt85z\" (UID: \"37126008-a0f7-45ad-a2b6-ff127083b74e\") " pod="openstack/dnsmasq-dns-594cb89c79-lt85z" Dec 01 08:45:03 crc kubenswrapper[5004]: I1201 08:45:03.310069 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-594cb89c79-lt85z" Dec 01 08:45:04 crc kubenswrapper[5004]: I1201 08:45:04.759433 5004 scope.go:117] "RemoveContainer" containerID="70f45b3d2a6bb4bf0d87f18bc08b282543e238e1d391a58745914af24ad7956c" Dec 01 08:45:04 crc kubenswrapper[5004]: E1201 08:45:04.760040 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 08:45:06 crc kubenswrapper[5004]: E1201 08:45:06.984711 5004 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-central:current-tested" Dec 01 08:45:06 crc kubenswrapper[5004]: E1201 08:45:06.985129 5004 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-central:current-tested" Dec 01 08:45:06 crc kubenswrapper[5004]: E1201 08:45:06.985277 5004 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-central:current-tested,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n97h5f7h57dh58h597hf4h5b5h696h56h654h5fhbdh54bh574h587h595h576hf5hb5h577h96h68ch56ch5cbh589h9h567h598h576h55ch5d9hbfq,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-r9kh2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 
/var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(5cdcf7de-1cb2-4071-b7e9-cdf08f6ff5c7): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.072975 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.093176 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.214274 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e17a426c-0069-4c51-91ad-e5fbf6e0bb2a-rabbitmq-confd\") pod \"e17a426c-0069-4c51-91ad-e5fbf6e0bb2a\" (UID: \"e17a426c-0069-4c51-91ad-e5fbf6e0bb2a\") " Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.214346 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b571e2e5-2a78-45af-83aa-3d874b2569b3-rabbitmq-erlang-cookie\") pod \"b571e2e5-2a78-45af-83aa-3d874b2569b3\" (UID: \"b571e2e5-2a78-45af-83aa-3d874b2569b3\") " Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.214403 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e17a426c-0069-4c51-91ad-e5fbf6e0bb2a-server-conf\") pod \"e17a426c-0069-4c51-91ad-e5fbf6e0bb2a\" (UID: \"e17a426c-0069-4c51-91ad-e5fbf6e0bb2a\") " Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.214419 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b571e2e5-2a78-45af-83aa-3d874b2569b3-config-data\") pod \"b571e2e5-2a78-45af-83aa-3d874b2569b3\" (UID: \"b571e2e5-2a78-45af-83aa-3d874b2569b3\") " Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.214445 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e17a426c-0069-4c51-91ad-e5fbf6e0bb2a-pod-info\") pod \"e17a426c-0069-4c51-91ad-e5fbf6e0bb2a\" (UID: \"e17a426c-0069-4c51-91ad-e5fbf6e0bb2a\") " Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.214496 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" 
(UniqueName: \"kubernetes.io/configmap/e17a426c-0069-4c51-91ad-e5fbf6e0bb2a-plugins-conf\") pod \"e17a426c-0069-4c51-91ad-e5fbf6e0bb2a\" (UID: \"e17a426c-0069-4c51-91ad-e5fbf6e0bb2a\") " Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.214521 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e17a426c-0069-4c51-91ad-e5fbf6e0bb2a-config-data\") pod \"e17a426c-0069-4c51-91ad-e5fbf6e0bb2a\" (UID: \"e17a426c-0069-4c51-91ad-e5fbf6e0bb2a\") " Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.214549 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b571e2e5-2a78-45af-83aa-3d874b2569b3-erlang-cookie-secret\") pod \"b571e2e5-2a78-45af-83aa-3d874b2569b3\" (UID: \"b571e2e5-2a78-45af-83aa-3d874b2569b3\") " Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.214582 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b571e2e5-2a78-45af-83aa-3d874b2569b3-plugins-conf\") pod \"b571e2e5-2a78-45af-83aa-3d874b2569b3\" (UID: \"b571e2e5-2a78-45af-83aa-3d874b2569b3\") " Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.214620 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e17a426c-0069-4c51-91ad-e5fbf6e0bb2a-rabbitmq-erlang-cookie\") pod \"e17a426c-0069-4c51-91ad-e5fbf6e0bb2a\" (UID: \"e17a426c-0069-4c51-91ad-e5fbf6e0bb2a\") " Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.214636 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b571e2e5-2a78-45af-83aa-3d874b2569b3-pod-info\") pod \"b571e2e5-2a78-45af-83aa-3d874b2569b3\" (UID: \"b571e2e5-2a78-45af-83aa-3d874b2569b3\") " Dec 01 08:45:11 crc 
kubenswrapper[5004]: I1201 08:45:11.214664 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"e17a426c-0069-4c51-91ad-e5fbf6e0bb2a\" (UID: \"e17a426c-0069-4c51-91ad-e5fbf6e0bb2a\") " Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.214729 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ph2g8\" (UniqueName: \"kubernetes.io/projected/e17a426c-0069-4c51-91ad-e5fbf6e0bb2a-kube-api-access-ph2g8\") pod \"e17a426c-0069-4c51-91ad-e5fbf6e0bb2a\" (UID: \"e17a426c-0069-4c51-91ad-e5fbf6e0bb2a\") " Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.214755 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"b571e2e5-2a78-45af-83aa-3d874b2569b3\" (UID: \"b571e2e5-2a78-45af-83aa-3d874b2569b3\") " Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.214779 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b571e2e5-2a78-45af-83aa-3d874b2569b3-rabbitmq-plugins\") pod \"b571e2e5-2a78-45af-83aa-3d874b2569b3\" (UID: \"b571e2e5-2a78-45af-83aa-3d874b2569b3\") " Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.214822 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e17a426c-0069-4c51-91ad-e5fbf6e0bb2a-erlang-cookie-secret\") pod \"e17a426c-0069-4c51-91ad-e5fbf6e0bb2a\" (UID: \"e17a426c-0069-4c51-91ad-e5fbf6e0bb2a\") " Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.214879 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e17a426c-0069-4c51-91ad-e5fbf6e0bb2a-rabbitmq-tls\") pod 
\"e17a426c-0069-4c51-91ad-e5fbf6e0bb2a\" (UID: \"e17a426c-0069-4c51-91ad-e5fbf6e0bb2a\") " Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.214895 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b571e2e5-2a78-45af-83aa-3d874b2569b3-server-conf\") pod \"b571e2e5-2a78-45af-83aa-3d874b2569b3\" (UID: \"b571e2e5-2a78-45af-83aa-3d874b2569b3\") " Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.214914 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b571e2e5-2a78-45af-83aa-3d874b2569b3-rabbitmq-confd\") pod \"b571e2e5-2a78-45af-83aa-3d874b2569b3\" (UID: \"b571e2e5-2a78-45af-83aa-3d874b2569b3\") " Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.214939 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b571e2e5-2a78-45af-83aa-3d874b2569b3-rabbitmq-tls\") pod \"b571e2e5-2a78-45af-83aa-3d874b2569b3\" (UID: \"b571e2e5-2a78-45af-83aa-3d874b2569b3\") " Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.214981 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e17a426c-0069-4c51-91ad-e5fbf6e0bb2a-rabbitmq-plugins\") pod \"e17a426c-0069-4c51-91ad-e5fbf6e0bb2a\" (UID: \"e17a426c-0069-4c51-91ad-e5fbf6e0bb2a\") " Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.215000 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vsg4t\" (UniqueName: \"kubernetes.io/projected/b571e2e5-2a78-45af-83aa-3d874b2569b3-kube-api-access-vsg4t\") pod \"b571e2e5-2a78-45af-83aa-3d874b2569b3\" (UID: \"b571e2e5-2a78-45af-83aa-3d874b2569b3\") " Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.215740 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/configmap/e17a426c-0069-4c51-91ad-e5fbf6e0bb2a-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "e17a426c-0069-4c51-91ad-e5fbf6e0bb2a" (UID: "e17a426c-0069-4c51-91ad-e5fbf6e0bb2a"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.215858 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b571e2e5-2a78-45af-83aa-3d874b2569b3-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "b571e2e5-2a78-45af-83aa-3d874b2569b3" (UID: "b571e2e5-2a78-45af-83aa-3d874b2569b3"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.224369 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b571e2e5-2a78-45af-83aa-3d874b2569b3-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "b571e2e5-2a78-45af-83aa-3d874b2569b3" (UID: "b571e2e5-2a78-45af-83aa-3d874b2569b3"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.226345 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b571e2e5-2a78-45af-83aa-3d874b2569b3-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "b571e2e5-2a78-45af-83aa-3d874b2569b3" (UID: "b571e2e5-2a78-45af-83aa-3d874b2569b3"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.227722 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b571e2e5-2a78-45af-83aa-3d874b2569b3-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "b571e2e5-2a78-45af-83aa-3d874b2569b3" (UID: "b571e2e5-2a78-45af-83aa-3d874b2569b3"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.228002 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e17a426c-0069-4c51-91ad-e5fbf6e0bb2a-kube-api-access-ph2g8" (OuterVolumeSpecName: "kube-api-access-ph2g8") pod "e17a426c-0069-4c51-91ad-e5fbf6e0bb2a" (UID: "e17a426c-0069-4c51-91ad-e5fbf6e0bb2a"). InnerVolumeSpecName "kube-api-access-ph2g8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.229496 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "persistence") pod "e17a426c-0069-4c51-91ad-e5fbf6e0bb2a" (UID: "e17a426c-0069-4c51-91ad-e5fbf6e0bb2a"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.229801 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/b571e2e5-2a78-45af-83aa-3d874b2569b3-pod-info" (OuterVolumeSpecName: "pod-info") pod "b571e2e5-2a78-45af-83aa-3d874b2569b3" (UID: "b571e2e5-2a78-45af-83aa-3d874b2569b3"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.231919 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e17a426c-0069-4c51-91ad-e5fbf6e0bb2a-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "e17a426c-0069-4c51-91ad-e5fbf6e0bb2a" (UID: "e17a426c-0069-4c51-91ad-e5fbf6e0bb2a"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.233195 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b571e2e5-2a78-45af-83aa-3d874b2569b3-kube-api-access-vsg4t" (OuterVolumeSpecName: "kube-api-access-vsg4t") pod "b571e2e5-2a78-45af-83aa-3d874b2569b3" (UID: "b571e2e5-2a78-45af-83aa-3d874b2569b3"). InnerVolumeSpecName "kube-api-access-vsg4t". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.234198 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e17a426c-0069-4c51-91ad-e5fbf6e0bb2a-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "e17a426c-0069-4c51-91ad-e5fbf6e0bb2a" (UID: "e17a426c-0069-4c51-91ad-e5fbf6e0bb2a"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.246814 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e17a426c-0069-4c51-91ad-e5fbf6e0bb2a-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "e17a426c-0069-4c51-91ad-e5fbf6e0bb2a" (UID: "e17a426c-0069-4c51-91ad-e5fbf6e0bb2a"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.246921 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "persistence") pod "b571e2e5-2a78-45af-83aa-3d874b2569b3" (UID: "b571e2e5-2a78-45af-83aa-3d874b2569b3"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.246916 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/e17a426c-0069-4c51-91ad-e5fbf6e0bb2a-pod-info" (OuterVolumeSpecName: "pod-info") pod "e17a426c-0069-4c51-91ad-e5fbf6e0bb2a" (UID: "e17a426c-0069-4c51-91ad-e5fbf6e0bb2a"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.256631 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b571e2e5-2a78-45af-83aa-3d874b2569b3-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "b571e2e5-2a78-45af-83aa-3d874b2569b3" (UID: "b571e2e5-2a78-45af-83aa-3d874b2569b3"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.266828 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e17a426c-0069-4c51-91ad-e5fbf6e0bb2a-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "e17a426c-0069-4c51-91ad-e5fbf6e0bb2a" (UID: "e17a426c-0069-4c51-91ad-e5fbf6e0bb2a"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.314124 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e17a426c-0069-4c51-91ad-e5fbf6e0bb2a-config-data" (OuterVolumeSpecName: "config-data") pod "e17a426c-0069-4c51-91ad-e5fbf6e0bb2a" (UID: "e17a426c-0069-4c51-91ad-e5fbf6e0bb2a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.317393 5004 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e17a426c-0069-4c51-91ad-e5fbf6e0bb2a-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.323072 5004 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b571e2e5-2a78-45af-83aa-3d874b2569b3-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.323248 5004 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e17a426c-0069-4c51-91ad-e5fbf6e0bb2a-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.323311 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vsg4t\" (UniqueName: \"kubernetes.io/projected/b571e2e5-2a78-45af-83aa-3d874b2569b3-kube-api-access-vsg4t\") on node \"crc\" DevicePath \"\"" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.323379 5004 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b571e2e5-2a78-45af-83aa-3d874b2569b3-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.323438 5004 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/e17a426c-0069-4c51-91ad-e5fbf6e0bb2a-pod-info\") on node \"crc\" DevicePath \"\"" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.323497 5004 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e17a426c-0069-4c51-91ad-e5fbf6e0bb2a-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.323563 5004 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e17a426c-0069-4c51-91ad-e5fbf6e0bb2a-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.323649 5004 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b571e2e5-2a78-45af-83aa-3d874b2569b3-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.323708 5004 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b571e2e5-2a78-45af-83aa-3d874b2569b3-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.323765 5004 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e17a426c-0069-4c51-91ad-e5fbf6e0bb2a-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.323822 5004 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b571e2e5-2a78-45af-83aa-3d874b2569b3-pod-info\") on node \"crc\" DevicePath \"\"" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.323975 5004 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Dec 01 08:45:11 crc 
kubenswrapper[5004]: I1201 08:45:11.324074 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ph2g8\" (UniqueName: \"kubernetes.io/projected/e17a426c-0069-4c51-91ad-e5fbf6e0bb2a-kube-api-access-ph2g8\") on node \"crc\" DevicePath \"\"" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.324149 5004 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.324214 5004 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b571e2e5-2a78-45af-83aa-3d874b2569b3-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.324276 5004 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e17a426c-0069-4c51-91ad-e5fbf6e0bb2a-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.352509 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b571e2e5-2a78-45af-83aa-3d874b2569b3-config-data" (OuterVolumeSpecName: "config-data") pod "b571e2e5-2a78-45af-83aa-3d874b2569b3" (UID: "b571e2e5-2a78-45af-83aa-3d874b2569b3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.367272 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e17a426c-0069-4c51-91ad-e5fbf6e0bb2a-server-conf" (OuterVolumeSpecName: "server-conf") pod "e17a426c-0069-4c51-91ad-e5fbf6e0bb2a" (UID: "e17a426c-0069-4c51-91ad-e5fbf6e0bb2a"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.385965 5004 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.399055 5004 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.430329 5004 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e17a426c-0069-4c51-91ad-e5fbf6e0bb2a-server-conf\") on node \"crc\" DevicePath \"\"" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.430360 5004 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b571e2e5-2a78-45af-83aa-3d874b2569b3-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.430369 5004 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.430380 5004 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.477258 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b571e2e5-2a78-45af-83aa-3d874b2569b3-server-conf" (OuterVolumeSpecName: "server-conf") pod "b571e2e5-2a78-45af-83aa-3d874b2569b3" (UID: "b571e2e5-2a78-45af-83aa-3d874b2569b3"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.492597 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b571e2e5-2a78-45af-83aa-3d874b2569b3-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "b571e2e5-2a78-45af-83aa-3d874b2569b3" (UID: "b571e2e5-2a78-45af-83aa-3d874b2569b3"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.539856 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e17a426c-0069-4c51-91ad-e5fbf6e0bb2a-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "e17a426c-0069-4c51-91ad-e5fbf6e0bb2a" (UID: "e17a426c-0069-4c51-91ad-e5fbf6e0bb2a"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.540347 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e17a426c-0069-4c51-91ad-e5fbf6e0bb2a-rabbitmq-confd\") pod \"e17a426c-0069-4c51-91ad-e5fbf6e0bb2a\" (UID: \"e17a426c-0069-4c51-91ad-e5fbf6e0bb2a\") " Dec 01 08:45:11 crc kubenswrapper[5004]: W1201 08:45:11.540810 5004 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/e17a426c-0069-4c51-91ad-e5fbf6e0bb2a/volumes/kubernetes.io~projected/rabbitmq-confd Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.540917 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e17a426c-0069-4c51-91ad-e5fbf6e0bb2a-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "e17a426c-0069-4c51-91ad-e5fbf6e0bb2a" (UID: "e17a426c-0069-4c51-91ad-e5fbf6e0bb2a"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.541666 5004 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b571e2e5-2a78-45af-83aa-3d874b2569b3-server-conf\") on node \"crc\" DevicePath \"\"" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.541689 5004 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b571e2e5-2a78-45af-83aa-3d874b2569b3-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.541700 5004 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e17a426c-0069-4c51-91ad-e5fbf6e0bb2a-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.640269 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e17a426c-0069-4c51-91ad-e5fbf6e0bb2a","Type":"ContainerDied","Data":"207ade0bc554b41d9f45fe1cb0ddb6600e0af82e2ae314fd7caaf67248a79fa6"} Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.640325 5004 scope.go:117] "RemoveContainer" containerID="5c7f10624a418d25374fc5f6d483787b7f89ecdac57f34c3dd622c1e3de143e9" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.640331 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.643110 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b571e2e5-2a78-45af-83aa-3d874b2569b3","Type":"ContainerDied","Data":"09c5010b95ba396d08e791d342ba34fd773f6961f449432d7a61ef4a0f3ba58d"} Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.643187 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.681620 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.690106 5004 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="e17a426c-0069-4c51-91ad-e5fbf6e0bb2a" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.129:5671: i/o timeout" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.693822 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.704673 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.718549 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.728127 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Dec 01 08:45:11 crc kubenswrapper[5004]: E1201 08:45:11.729032 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b571e2e5-2a78-45af-83aa-3d874b2569b3" containerName="setup-container" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.729129 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="b571e2e5-2a78-45af-83aa-3d874b2569b3" containerName="setup-container" Dec 01 08:45:11 crc kubenswrapper[5004]: E1201 08:45:11.729225 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e17a426c-0069-4c51-91ad-e5fbf6e0bb2a" containerName="rabbitmq" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.729297 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="e17a426c-0069-4c51-91ad-e5fbf6e0bb2a" containerName="rabbitmq" Dec 01 08:45:11 crc kubenswrapper[5004]: E1201 08:45:11.729369 5004 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b571e2e5-2a78-45af-83aa-3d874b2569b3" containerName="rabbitmq" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.729439 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="b571e2e5-2a78-45af-83aa-3d874b2569b3" containerName="rabbitmq" Dec 01 08:45:11 crc kubenswrapper[5004]: E1201 08:45:11.729519 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e17a426c-0069-4c51-91ad-e5fbf6e0bb2a" containerName="setup-container" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.729612 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="e17a426c-0069-4c51-91ad-e5fbf6e0bb2a" containerName="setup-container" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.729988 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="e17a426c-0069-4c51-91ad-e5fbf6e0bb2a" containerName="rabbitmq" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.730092 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="b571e2e5-2a78-45af-83aa-3d874b2569b3" containerName="rabbitmq" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.731783 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.738932 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.748826 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.755414 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.755675 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.755725 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-qpv7q" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.755832 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.755936 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.755998 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.756077 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.756135 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-p9vkw" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.756340 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.756452 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.756616 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 
08:45:11.756680 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.756697 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.756819 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.781676 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.797760 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.847199 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/56831219-9428-45a8-8888-869bc645d080-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"56831219-9428-45a8-8888-869bc645d080\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.847709 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/56831219-9428-45a8-8888-869bc645d080-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"56831219-9428-45a8-8888-869bc645d080\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.847777 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"af2896c5-aa9f-47d7-ba02-ebea4bbd29ed\") " pod="openstack/rabbitmq-server-0" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.847793 5004 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/af2896c5-aa9f-47d7-ba02-ebea4bbd29ed-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"af2896c5-aa9f-47d7-ba02-ebea4bbd29ed\") " pod="openstack/rabbitmq-server-0" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.847822 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/af2896c5-aa9f-47d7-ba02-ebea4bbd29ed-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"af2896c5-aa9f-47d7-ba02-ebea4bbd29ed\") " pod="openstack/rabbitmq-server-0" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.847853 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/af2896c5-aa9f-47d7-ba02-ebea4bbd29ed-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"af2896c5-aa9f-47d7-ba02-ebea4bbd29ed\") " pod="openstack/rabbitmq-server-0" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.847873 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/56831219-9428-45a8-8888-869bc645d080-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"56831219-9428-45a8-8888-869bc645d080\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.847901 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fh28\" (UniqueName: \"kubernetes.io/projected/56831219-9428-45a8-8888-869bc645d080-kube-api-access-5fh28\") pod \"rabbitmq-cell1-server-0\" (UID: \"56831219-9428-45a8-8888-869bc645d080\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.847919 5004 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/af2896c5-aa9f-47d7-ba02-ebea4bbd29ed-config-data\") pod \"rabbitmq-server-0\" (UID: \"af2896c5-aa9f-47d7-ba02-ebea4bbd29ed\") " pod="openstack/rabbitmq-server-0" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.847944 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/56831219-9428-45a8-8888-869bc645d080-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"56831219-9428-45a8-8888-869bc645d080\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.847958 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/56831219-9428-45a8-8888-869bc645d080-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"56831219-9428-45a8-8888-869bc645d080\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.847995 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/af2896c5-aa9f-47d7-ba02-ebea4bbd29ed-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"af2896c5-aa9f-47d7-ba02-ebea4bbd29ed\") " pod="openstack/rabbitmq-server-0" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.848032 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"56831219-9428-45a8-8888-869bc645d080\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.848052 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/af2896c5-aa9f-47d7-ba02-ebea4bbd29ed-pod-info\") pod \"rabbitmq-server-0\" (UID: \"af2896c5-aa9f-47d7-ba02-ebea4bbd29ed\") " pod="openstack/rabbitmq-server-0" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.848071 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/56831219-9428-45a8-8888-869bc645d080-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"56831219-9428-45a8-8888-869bc645d080\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.848103 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/56831219-9428-45a8-8888-869bc645d080-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"56831219-9428-45a8-8888-869bc645d080\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.848144 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/af2896c5-aa9f-47d7-ba02-ebea4bbd29ed-server-conf\") pod \"rabbitmq-server-0\" (UID: \"af2896c5-aa9f-47d7-ba02-ebea4bbd29ed\") " pod="openstack/rabbitmq-server-0" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.848174 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/56831219-9428-45a8-8888-869bc645d080-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"56831219-9428-45a8-8888-869bc645d080\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.848194 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hsgnk\" 
(UniqueName: \"kubernetes.io/projected/af2896c5-aa9f-47d7-ba02-ebea4bbd29ed-kube-api-access-hsgnk\") pod \"rabbitmq-server-0\" (UID: \"af2896c5-aa9f-47d7-ba02-ebea4bbd29ed\") " pod="openstack/rabbitmq-server-0" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.848220 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/af2896c5-aa9f-47d7-ba02-ebea4bbd29ed-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"af2896c5-aa9f-47d7-ba02-ebea4bbd29ed\") " pod="openstack/rabbitmq-server-0" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.848262 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/af2896c5-aa9f-47d7-ba02-ebea4bbd29ed-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"af2896c5-aa9f-47d7-ba02-ebea4bbd29ed\") " pod="openstack/rabbitmq-server-0" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.848279 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/56831219-9428-45a8-8888-869bc645d080-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"56831219-9428-45a8-8888-869bc645d080\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 08:45:11 crc kubenswrapper[5004]: E1201 08:45:11.940489 5004 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-heat-engine:current-tested" Dec 01 08:45:11 crc kubenswrapper[5004]: E1201 08:45:11.940541 5004 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-heat-engine:current-tested" Dec 01 08:45:11 crc 
kubenswrapper[5004]: E1201 08:45:11.940750 5004 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:heat-db-sync,Image:quay.rdoproject.org/podified-master-centos10/openstack-heat-engine:current-tested,Command:[/bin/bash],Args:[-c /usr/bin/heat-manage --config-dir /etc/heat/heat.conf.d db_sync],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/heat/heat.conf.d/00-default.conf,SubPath:00-default.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/heat/heat.conf.d/01-custom.conf,SubPath:01-custom.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7km7j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL 
MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42418,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*42418,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-db-sync-sskn9_openstack(1fc1c621-41a4-4fcd-ab52-dd45c3d82080): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 08:45:11 crc kubenswrapper[5004]: E1201 08:45:11.942289 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/heat-db-sync-sskn9" podUID="1fc1c621-41a4-4fcd-ab52-dd45c3d82080" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.950522 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/56831219-9428-45a8-8888-869bc645d080-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"56831219-9428-45a8-8888-869bc645d080\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.950644 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/56831219-9428-45a8-8888-869bc645d080-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"56831219-9428-45a8-8888-869bc645d080\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.950692 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/56831219-9428-45a8-8888-869bc645d080-pod-info\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"56831219-9428-45a8-8888-869bc645d080\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.950720 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"af2896c5-aa9f-47d7-ba02-ebea4bbd29ed\") " pod="openstack/rabbitmq-server-0" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.950745 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/af2896c5-aa9f-47d7-ba02-ebea4bbd29ed-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"af2896c5-aa9f-47d7-ba02-ebea4bbd29ed\") " pod="openstack/rabbitmq-server-0" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.950785 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/af2896c5-aa9f-47d7-ba02-ebea4bbd29ed-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"af2896c5-aa9f-47d7-ba02-ebea4bbd29ed\") " pod="openstack/rabbitmq-server-0" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.950817 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/af2896c5-aa9f-47d7-ba02-ebea4bbd29ed-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"af2896c5-aa9f-47d7-ba02-ebea4bbd29ed\") " pod="openstack/rabbitmq-server-0" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.950841 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/56831219-9428-45a8-8888-869bc645d080-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"56831219-9428-45a8-8888-869bc645d080\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.950881 
5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5fh28\" (UniqueName: \"kubernetes.io/projected/56831219-9428-45a8-8888-869bc645d080-kube-api-access-5fh28\") pod \"rabbitmq-cell1-server-0\" (UID: \"56831219-9428-45a8-8888-869bc645d080\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.950904 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/af2896c5-aa9f-47d7-ba02-ebea4bbd29ed-config-data\") pod \"rabbitmq-server-0\" (UID: \"af2896c5-aa9f-47d7-ba02-ebea4bbd29ed\") " pod="openstack/rabbitmq-server-0" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.950941 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/56831219-9428-45a8-8888-869bc645d080-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"56831219-9428-45a8-8888-869bc645d080\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.950964 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/56831219-9428-45a8-8888-869bc645d080-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"56831219-9428-45a8-8888-869bc645d080\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.951049 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/af2896c5-aa9f-47d7-ba02-ebea4bbd29ed-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"af2896c5-aa9f-47d7-ba02-ebea4bbd29ed\") " pod="openstack/rabbitmq-server-0" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.951092 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"56831219-9428-45a8-8888-869bc645d080\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.951122 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/af2896c5-aa9f-47d7-ba02-ebea4bbd29ed-pod-info\") pod \"rabbitmq-server-0\" (UID: \"af2896c5-aa9f-47d7-ba02-ebea4bbd29ed\") " pod="openstack/rabbitmq-server-0" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.951138 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/56831219-9428-45a8-8888-869bc645d080-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"56831219-9428-45a8-8888-869bc645d080\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.951143 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/56831219-9428-45a8-8888-869bc645d080-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"56831219-9428-45a8-8888-869bc645d080\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.951283 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/56831219-9428-45a8-8888-869bc645d080-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"56831219-9428-45a8-8888-869bc645d080\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.951360 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/af2896c5-aa9f-47d7-ba02-ebea4bbd29ed-server-conf\") pod \"rabbitmq-server-0\" (UID: \"af2896c5-aa9f-47d7-ba02-ebea4bbd29ed\") " 
pod="openstack/rabbitmq-server-0" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.952407 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/56831219-9428-45a8-8888-869bc645d080-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"56831219-9428-45a8-8888-869bc645d080\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.953634 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/56831219-9428-45a8-8888-869bc645d080-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"56831219-9428-45a8-8888-869bc645d080\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.954095 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/af2896c5-aa9f-47d7-ba02-ebea4bbd29ed-server-conf\") pod \"rabbitmq-server-0\" (UID: \"af2896c5-aa9f-47d7-ba02-ebea4bbd29ed\") " pod="openstack/rabbitmq-server-0" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.954716 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/af2896c5-aa9f-47d7-ba02-ebea4bbd29ed-config-data\") pod \"rabbitmq-server-0\" (UID: \"af2896c5-aa9f-47d7-ba02-ebea4bbd29ed\") " pod="openstack/rabbitmq-server-0" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.955007 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/56831219-9428-45a8-8888-869bc645d080-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"56831219-9428-45a8-8888-869bc645d080\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.955215 5004 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume 
\"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"56831219-9428-45a8-8888-869bc645d080\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/rabbitmq-cell1-server-0" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.955240 5004 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"af2896c5-aa9f-47d7-ba02-ebea4bbd29ed\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/rabbitmq-server-0" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.955501 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/af2896c5-aa9f-47d7-ba02-ebea4bbd29ed-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"af2896c5-aa9f-47d7-ba02-ebea4bbd29ed\") " pod="openstack/rabbitmq-server-0" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.951416 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/56831219-9428-45a8-8888-869bc645d080-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"56831219-9428-45a8-8888-869bc645d080\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.956215 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hsgnk\" (UniqueName: \"kubernetes.io/projected/af2896c5-aa9f-47d7-ba02-ebea4bbd29ed-kube-api-access-hsgnk\") pod \"rabbitmq-server-0\" (UID: \"af2896c5-aa9f-47d7-ba02-ebea4bbd29ed\") " pod="openstack/rabbitmq-server-0" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.956240 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/56831219-9428-45a8-8888-869bc645d080-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"56831219-9428-45a8-8888-869bc645d080\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.956258 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/af2896c5-aa9f-47d7-ba02-ebea4bbd29ed-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"af2896c5-aa9f-47d7-ba02-ebea4bbd29ed\") " pod="openstack/rabbitmq-server-0" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.956267 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/af2896c5-aa9f-47d7-ba02-ebea4bbd29ed-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"af2896c5-aa9f-47d7-ba02-ebea4bbd29ed\") " pod="openstack/rabbitmq-server-0" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.956846 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/af2896c5-aa9f-47d7-ba02-ebea4bbd29ed-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"af2896c5-aa9f-47d7-ba02-ebea4bbd29ed\") " pod="openstack/rabbitmq-server-0" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.957207 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/af2896c5-aa9f-47d7-ba02-ebea4bbd29ed-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"af2896c5-aa9f-47d7-ba02-ebea4bbd29ed\") " pod="openstack/rabbitmq-server-0" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.959122 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/56831219-9428-45a8-8888-869bc645d080-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"56831219-9428-45a8-8888-869bc645d080\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.960207 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/af2896c5-aa9f-47d7-ba02-ebea4bbd29ed-pod-info\") pod \"rabbitmq-server-0\" (UID: \"af2896c5-aa9f-47d7-ba02-ebea4bbd29ed\") " pod="openstack/rabbitmq-server-0" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.961494 5004 scope.go:117] "RemoveContainer" containerID="ccde5507419ccdb6bb1307ad7276e94115097f8e8b951d4a4f702511b46356d2" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.964859 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/56831219-9428-45a8-8888-869bc645d080-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"56831219-9428-45a8-8888-869bc645d080\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.965062 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/af2896c5-aa9f-47d7-ba02-ebea4bbd29ed-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"af2896c5-aa9f-47d7-ba02-ebea4bbd29ed\") " pod="openstack/rabbitmq-server-0" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.973995 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5fh28\" (UniqueName: \"kubernetes.io/projected/56831219-9428-45a8-8888-869bc645d080-kube-api-access-5fh28\") pod \"rabbitmq-cell1-server-0\" (UID: \"56831219-9428-45a8-8888-869bc645d080\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.975775 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hsgnk\" (UniqueName: 
\"kubernetes.io/projected/af2896c5-aa9f-47d7-ba02-ebea4bbd29ed-kube-api-access-hsgnk\") pod \"rabbitmq-server-0\" (UID: \"af2896c5-aa9f-47d7-ba02-ebea4bbd29ed\") " pod="openstack/rabbitmq-server-0" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.979680 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/56831219-9428-45a8-8888-869bc645d080-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"56831219-9428-45a8-8888-869bc645d080\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.979797 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/af2896c5-aa9f-47d7-ba02-ebea4bbd29ed-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"af2896c5-aa9f-47d7-ba02-ebea4bbd29ed\") " pod="openstack/rabbitmq-server-0" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.981076 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/56831219-9428-45a8-8888-869bc645d080-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"56831219-9428-45a8-8888-869bc645d080\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 08:45:11 crc kubenswrapper[5004]: I1201 08:45:11.992562 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/af2896c5-aa9f-47d7-ba02-ebea4bbd29ed-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"af2896c5-aa9f-47d7-ba02-ebea4bbd29ed\") " pod="openstack/rabbitmq-server-0" Dec 01 08:45:12 crc kubenswrapper[5004]: I1201 08:45:12.079145 5004 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="b571e2e5-2a78-45af-83aa-3d874b2569b3" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.130:5671: i/o timeout" Dec 01 08:45:12 crc kubenswrapper[5004]: I1201 
08:45:12.100460 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"56831219-9428-45a8-8888-869bc645d080\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 08:45:12 crc kubenswrapper[5004]: I1201 08:45:12.104368 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"af2896c5-aa9f-47d7-ba02-ebea4bbd29ed\") " pod="openstack/rabbitmq-server-0" Dec 01 08:45:12 crc kubenswrapper[5004]: I1201 08:45:12.110189 5004 scope.go:117] "RemoveContainer" containerID="8781fe91187bcd78fc6abd4d08c3d2787079b1137a12a60f1c7b65f90c2a6635" Dec 01 08:45:12 crc kubenswrapper[5004]: I1201 08:45:12.188356 5004 scope.go:117] "RemoveContainer" containerID="17c1164a1a9ddf12e0f7bb16f0fda29c357c85933a430ff5b061e9d6f7746e89" Dec 01 08:45:12 crc kubenswrapper[5004]: I1201 08:45:12.362746 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 01 08:45:12 crc kubenswrapper[5004]: I1201 08:45:12.381756 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 01 08:45:12 crc kubenswrapper[5004]: I1201 08:45:12.419130 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-594cb89c79-lt85z"] Dec 01 08:45:12 crc kubenswrapper[5004]: W1201 08:45:12.434064 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod37126008_a0f7_45ad_a2b6_ff127083b74e.slice/crio-44b7e48189edb10f4d86bf497d54e4bccfc2dbb880938f25e9be54743e9a4edc WatchSource:0}: Error finding container 44b7e48189edb10f4d86bf497d54e4bccfc2dbb880938f25e9be54743e9a4edc: Status 404 returned error can't find the container with id 44b7e48189edb10f4d86bf497d54e4bccfc2dbb880938f25e9be54743e9a4edc Dec 01 08:45:12 crc kubenswrapper[5004]: I1201 08:45:12.446512 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409645-5722b"] Dec 01 08:45:12 crc kubenswrapper[5004]: I1201 08:45:12.670184 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-594cb89c79-lt85z" event={"ID":"37126008-a0f7-45ad-a2b6-ff127083b74e","Type":"ContainerStarted","Data":"44b7e48189edb10f4d86bf497d54e4bccfc2dbb880938f25e9be54743e9a4edc"} Dec 01 08:45:12 crc kubenswrapper[5004]: I1201 08:45:12.675052 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5cdcf7de-1cb2-4071-b7e9-cdf08f6ff5c7","Type":"ContainerStarted","Data":"aa17beb6184c6efab0e301b6b752a93e115163a106303bf6ca568a8688030475"} Dec 01 08:45:12 crc kubenswrapper[5004]: I1201 08:45:12.676151 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409645-5722b" event={"ID":"14e9ea25-5306-4134-8e77-dde9901fceb5","Type":"ContainerStarted","Data":"68015ad1e36092ca134adae7a69924dac296f6a6c327bad6648792c0a46f2b23"} Dec 01 08:45:12 crc kubenswrapper[5004]: E1201 08:45:12.678611 5004 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-heat-engine:current-tested\\\"\"" pod="openstack/heat-db-sync-sskn9" podUID="1fc1c621-41a4-4fcd-ab52-dd45c3d82080" Dec 01 08:45:12 crc kubenswrapper[5004]: I1201 08:45:12.779053 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b571e2e5-2a78-45af-83aa-3d874b2569b3" path="/var/lib/kubelet/pods/b571e2e5-2a78-45af-83aa-3d874b2569b3/volumes" Dec 01 08:45:12 crc kubenswrapper[5004]: I1201 08:45:12.780505 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e17a426c-0069-4c51-91ad-e5fbf6e0bb2a" path="/var/lib/kubelet/pods/e17a426c-0069-4c51-91ad-e5fbf6e0bb2a/volumes" Dec 01 08:45:12 crc kubenswrapper[5004]: I1201 08:45:12.885437 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 01 08:45:12 crc kubenswrapper[5004]: W1201 08:45:12.894838 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf2896c5_aa9f_47d7_ba02_ebea4bbd29ed.slice/crio-e4dd73bc789642c31a67af72748ae03d8f749b9eb0248dc9c25c128c51f2ef25 WatchSource:0}: Error finding container e4dd73bc789642c31a67af72748ae03d8f749b9eb0248dc9c25c128c51f2ef25: Status 404 returned error can't find the container with id e4dd73bc789642c31a67af72748ae03d8f749b9eb0248dc9c25c128c51f2ef25 Dec 01 08:45:12 crc kubenswrapper[5004]: W1201 08:45:12.974916 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56831219_9428_45a8_8888_869bc645d080.slice/crio-31fc17f84c1f2423fb6004d73687e6f714f26160af33b766676c461a0293ba52 WatchSource:0}: Error finding container 31fc17f84c1f2423fb6004d73687e6f714f26160af33b766676c461a0293ba52: Status 404 returned error can't find the container with id 
31fc17f84c1f2423fb6004d73687e6f714f26160af33b766676c461a0293ba52 Dec 01 08:45:12 crc kubenswrapper[5004]: I1201 08:45:12.988271 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 01 08:45:13 crc kubenswrapper[5004]: I1201 08:45:13.698303 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5cdcf7de-1cb2-4071-b7e9-cdf08f6ff5c7","Type":"ContainerStarted","Data":"d0279c468b07edaae723f570e187c86233b594319e9cf7135dc0fcba2c61a0cb"} Dec 01 08:45:13 crc kubenswrapper[5004]: I1201 08:45:13.701142 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"56831219-9428-45a8-8888-869bc645d080","Type":"ContainerStarted","Data":"31fc17f84c1f2423fb6004d73687e6f714f26160af33b766676c461a0293ba52"} Dec 01 08:45:13 crc kubenswrapper[5004]: I1201 08:45:13.703785 5004 generic.go:334] "Generic (PLEG): container finished" podID="14e9ea25-5306-4134-8e77-dde9901fceb5" containerID="8d2f618e7d1ef58aba1a2e2c91c469e60a14830ccc83d7b1673365c261980545" exitCode=0 Dec 01 08:45:13 crc kubenswrapper[5004]: I1201 08:45:13.703870 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409645-5722b" event={"ID":"14e9ea25-5306-4134-8e77-dde9901fceb5","Type":"ContainerDied","Data":"8d2f618e7d1ef58aba1a2e2c91c469e60a14830ccc83d7b1673365c261980545"} Dec 01 08:45:13 crc kubenswrapper[5004]: I1201 08:45:13.712415 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"af2896c5-aa9f-47d7-ba02-ebea4bbd29ed","Type":"ContainerStarted","Data":"e4dd73bc789642c31a67af72748ae03d8f749b9eb0248dc9c25c128c51f2ef25"} Dec 01 08:45:13 crc kubenswrapper[5004]: I1201 08:45:13.724299 5004 generic.go:334] "Generic (PLEG): container finished" podID="37126008-a0f7-45ad-a2b6-ff127083b74e" containerID="da3dbbd8a270e3e009bd061b9263173b7e9c79fbc742ad062a19712ac9c33067" exitCode=0 Dec 01 
08:45:13 crc kubenswrapper[5004]: I1201 08:45:13.724360 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-594cb89c79-lt85z" event={"ID":"37126008-a0f7-45ad-a2b6-ff127083b74e","Type":"ContainerDied","Data":"da3dbbd8a270e3e009bd061b9263173b7e9c79fbc742ad062a19712ac9c33067"} Dec 01 08:45:14 crc kubenswrapper[5004]: I1201 08:45:14.745700 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-594cb89c79-lt85z" event={"ID":"37126008-a0f7-45ad-a2b6-ff127083b74e","Type":"ContainerStarted","Data":"9e760ef0364d3c21ed545b09da924d56c4851c8e0b9a2d663e5d77154a10bf36"} Dec 01 08:45:14 crc kubenswrapper[5004]: I1201 08:45:14.747349 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-594cb89c79-lt85z" Dec 01 08:45:14 crc kubenswrapper[5004]: I1201 08:45:14.795703 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-594cb89c79-lt85z" podStartSLOduration=12.795671361 podStartE2EDuration="12.795671361s" podCreationTimestamp="2025-12-01 08:45:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:45:14.780020831 +0000 UTC m=+1692.345012813" watchObservedRunningTime="2025-12-01 08:45:14.795671361 +0000 UTC m=+1692.360663343" Dec 01 08:45:15 crc kubenswrapper[5004]: I1201 08:45:15.590283 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409645-5722b" Dec 01 08:45:15 crc kubenswrapper[5004]: I1201 08:45:15.668001 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/14e9ea25-5306-4134-8e77-dde9901fceb5-secret-volume\") pod \"14e9ea25-5306-4134-8e77-dde9901fceb5\" (UID: \"14e9ea25-5306-4134-8e77-dde9901fceb5\") " Dec 01 08:45:15 crc kubenswrapper[5004]: I1201 08:45:15.668216 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/14e9ea25-5306-4134-8e77-dde9901fceb5-config-volume\") pod \"14e9ea25-5306-4134-8e77-dde9901fceb5\" (UID: \"14e9ea25-5306-4134-8e77-dde9901fceb5\") " Dec 01 08:45:15 crc kubenswrapper[5004]: I1201 08:45:15.668358 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bgn7s\" (UniqueName: \"kubernetes.io/projected/14e9ea25-5306-4134-8e77-dde9901fceb5-kube-api-access-bgn7s\") pod \"14e9ea25-5306-4134-8e77-dde9901fceb5\" (UID: \"14e9ea25-5306-4134-8e77-dde9901fceb5\") " Dec 01 08:45:15 crc kubenswrapper[5004]: I1201 08:45:15.669063 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14e9ea25-5306-4134-8e77-dde9901fceb5-config-volume" (OuterVolumeSpecName: "config-volume") pod "14e9ea25-5306-4134-8e77-dde9901fceb5" (UID: "14e9ea25-5306-4134-8e77-dde9901fceb5"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:45:15 crc kubenswrapper[5004]: I1201 08:45:15.669368 5004 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/14e9ea25-5306-4134-8e77-dde9901fceb5-config-volume\") on node \"crc\" DevicePath \"\"" Dec 01 08:45:15 crc kubenswrapper[5004]: I1201 08:45:15.672278 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14e9ea25-5306-4134-8e77-dde9901fceb5-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "14e9ea25-5306-4134-8e77-dde9901fceb5" (UID: "14e9ea25-5306-4134-8e77-dde9901fceb5"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:45:15 crc kubenswrapper[5004]: E1201 08:45:15.687805 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="5cdcf7de-1cb2-4071-b7e9-cdf08f6ff5c7" Dec 01 08:45:15 crc kubenswrapper[5004]: I1201 08:45:15.688873 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14e9ea25-5306-4134-8e77-dde9901fceb5-kube-api-access-bgn7s" (OuterVolumeSpecName: "kube-api-access-bgn7s") pod "14e9ea25-5306-4134-8e77-dde9901fceb5" (UID: "14e9ea25-5306-4134-8e77-dde9901fceb5"). InnerVolumeSpecName "kube-api-access-bgn7s". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:45:15 crc kubenswrapper[5004]: I1201 08:45:15.759354 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"56831219-9428-45a8-8888-869bc645d080","Type":"ContainerStarted","Data":"e5af2dcf78f7d12950029477b6acb6ce3560f5f0060986de61cf5576be8671d4"} Dec 01 08:45:15 crc kubenswrapper[5004]: I1201 08:45:15.763642 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409645-5722b" Dec 01 08:45:15 crc kubenswrapper[5004]: I1201 08:45:15.763713 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409645-5722b" event={"ID":"14e9ea25-5306-4134-8e77-dde9901fceb5","Type":"ContainerDied","Data":"68015ad1e36092ca134adae7a69924dac296f6a6c327bad6648792c0a46f2b23"} Dec 01 08:45:15 crc kubenswrapper[5004]: I1201 08:45:15.763755 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="68015ad1e36092ca134adae7a69924dac296f6a6c327bad6648792c0a46f2b23" Dec 01 08:45:15 crc kubenswrapper[5004]: I1201 08:45:15.768311 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"af2896c5-aa9f-47d7-ba02-ebea4bbd29ed","Type":"ContainerStarted","Data":"848c231234ec861d648dae35401b2cb1f4150307feba3b366e0ddd04db606c7b"} Dec 01 08:45:15 crc kubenswrapper[5004]: I1201 08:45:15.771287 5004 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/14e9ea25-5306-4134-8e77-dde9901fceb5-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 01 08:45:15 crc kubenswrapper[5004]: I1201 08:45:15.771335 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bgn7s\" (UniqueName: \"kubernetes.io/projected/14e9ea25-5306-4134-8e77-dde9901fceb5-kube-api-access-bgn7s\") on node \"crc\" DevicePath \"\"" Dec 01 08:45:15 crc kubenswrapper[5004]: I1201 08:45:15.773989 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5cdcf7de-1cb2-4071-b7e9-cdf08f6ff5c7","Type":"ContainerStarted","Data":"4f37520fc4932fefd2fd8f841fe5dcd04a0f3b793e7edc63aa2af0caea087ee4"} Dec 01 08:45:15 crc kubenswrapper[5004]: I1201 08:45:15.774043 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 01 08:45:15 crc 
kubenswrapper[5004]: E1201 08:45:15.775924 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-central:current-tested\\\"\"" pod="openstack/ceilometer-0" podUID="5cdcf7de-1cb2-4071-b7e9-cdf08f6ff5c7" Dec 01 08:45:16 crc kubenswrapper[5004]: E1201 08:45:16.791876 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-central:current-tested\\\"\"" pod="openstack/ceilometer-0" podUID="5cdcf7de-1cb2-4071-b7e9-cdf08f6ff5c7" Dec 01 08:45:18 crc kubenswrapper[5004]: I1201 08:45:18.762466 5004 scope.go:117] "RemoveContainer" containerID="70f45b3d2a6bb4bf0d87f18bc08b282543e238e1d391a58745914af24ad7956c" Dec 01 08:45:18 crc kubenswrapper[5004]: E1201 08:45:18.764540 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 08:45:23 crc kubenswrapper[5004]: I1201 08:45:23.312843 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-594cb89c79-lt85z" Dec 01 08:45:23 crc kubenswrapper[5004]: I1201 08:45:23.399527 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d99f6bc7f-nk8fb"] Dec 01 08:45:23 crc kubenswrapper[5004]: I1201 08:45:23.400166 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6d99f6bc7f-nk8fb" 
podUID="8b89ff03-e585-46b1-8656-fad2acbeaeaf" containerName="dnsmasq-dns" containerID="cri-o://bf15c83552d127166c6c320e5f2fa50fd23d5e0335b939e809ad780efec1fcb6" gracePeriod=10 Dec 01 08:45:23 crc kubenswrapper[5004]: I1201 08:45:23.592514 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5596c69fcc-l9pb7"] Dec 01 08:45:23 crc kubenswrapper[5004]: E1201 08:45:23.593039 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14e9ea25-5306-4134-8e77-dde9901fceb5" containerName="collect-profiles" Dec 01 08:45:23 crc kubenswrapper[5004]: I1201 08:45:23.593060 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="14e9ea25-5306-4134-8e77-dde9901fceb5" containerName="collect-profiles" Dec 01 08:45:23 crc kubenswrapper[5004]: I1201 08:45:23.593329 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="14e9ea25-5306-4134-8e77-dde9901fceb5" containerName="collect-profiles" Dec 01 08:45:23 crc kubenswrapper[5004]: I1201 08:45:23.594552 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5596c69fcc-l9pb7" Dec 01 08:45:23 crc kubenswrapper[5004]: I1201 08:45:23.609479 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5596c69fcc-l9pb7"] Dec 01 08:45:23 crc kubenswrapper[5004]: I1201 08:45:23.696690 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8g6lj\" (UniqueName: \"kubernetes.io/projected/7d1249df-92dc-4f52-8db5-8088870a934c-kube-api-access-8g6lj\") pod \"dnsmasq-dns-5596c69fcc-l9pb7\" (UID: \"7d1249df-92dc-4f52-8db5-8088870a934c\") " pod="openstack/dnsmasq-dns-5596c69fcc-l9pb7" Dec 01 08:45:23 crc kubenswrapper[5004]: I1201 08:45:23.696751 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7d1249df-92dc-4f52-8db5-8088870a934c-ovsdbserver-sb\") pod \"dnsmasq-dns-5596c69fcc-l9pb7\" (UID: \"7d1249df-92dc-4f52-8db5-8088870a934c\") " pod="openstack/dnsmasq-dns-5596c69fcc-l9pb7" Dec 01 08:45:23 crc kubenswrapper[5004]: I1201 08:45:23.696806 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/7d1249df-92dc-4f52-8db5-8088870a934c-openstack-edpm-ipam\") pod \"dnsmasq-dns-5596c69fcc-l9pb7\" (UID: \"7d1249df-92dc-4f52-8db5-8088870a934c\") " pod="openstack/dnsmasq-dns-5596c69fcc-l9pb7" Dec 01 08:45:23 crc kubenswrapper[5004]: I1201 08:45:23.696828 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7d1249df-92dc-4f52-8db5-8088870a934c-dns-swift-storage-0\") pod \"dnsmasq-dns-5596c69fcc-l9pb7\" (UID: \"7d1249df-92dc-4f52-8db5-8088870a934c\") " pod="openstack/dnsmasq-dns-5596c69fcc-l9pb7" Dec 01 08:45:23 crc kubenswrapper[5004]: I1201 08:45:23.697039 5004 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d1249df-92dc-4f52-8db5-8088870a934c-config\") pod \"dnsmasq-dns-5596c69fcc-l9pb7\" (UID: \"7d1249df-92dc-4f52-8db5-8088870a934c\") " pod="openstack/dnsmasq-dns-5596c69fcc-l9pb7" Dec 01 08:45:23 crc kubenswrapper[5004]: I1201 08:45:23.697072 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7d1249df-92dc-4f52-8db5-8088870a934c-ovsdbserver-nb\") pod \"dnsmasq-dns-5596c69fcc-l9pb7\" (UID: \"7d1249df-92dc-4f52-8db5-8088870a934c\") " pod="openstack/dnsmasq-dns-5596c69fcc-l9pb7" Dec 01 08:45:23 crc kubenswrapper[5004]: I1201 08:45:23.697094 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7d1249df-92dc-4f52-8db5-8088870a934c-dns-svc\") pod \"dnsmasq-dns-5596c69fcc-l9pb7\" (UID: \"7d1249df-92dc-4f52-8db5-8088870a934c\") " pod="openstack/dnsmasq-dns-5596c69fcc-l9pb7" Dec 01 08:45:23 crc kubenswrapper[5004]: I1201 08:45:23.798914 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7d1249df-92dc-4f52-8db5-8088870a934c-ovsdbserver-nb\") pod \"dnsmasq-dns-5596c69fcc-l9pb7\" (UID: \"7d1249df-92dc-4f52-8db5-8088870a934c\") " pod="openstack/dnsmasq-dns-5596c69fcc-l9pb7" Dec 01 08:45:23 crc kubenswrapper[5004]: I1201 08:45:23.798969 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7d1249df-92dc-4f52-8db5-8088870a934c-dns-svc\") pod \"dnsmasq-dns-5596c69fcc-l9pb7\" (UID: \"7d1249df-92dc-4f52-8db5-8088870a934c\") " pod="openstack/dnsmasq-dns-5596c69fcc-l9pb7" Dec 01 08:45:23 crc kubenswrapper[5004]: I1201 08:45:23.799094 5004 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-8g6lj\" (UniqueName: \"kubernetes.io/projected/7d1249df-92dc-4f52-8db5-8088870a934c-kube-api-access-8g6lj\") pod \"dnsmasq-dns-5596c69fcc-l9pb7\" (UID: \"7d1249df-92dc-4f52-8db5-8088870a934c\") " pod="openstack/dnsmasq-dns-5596c69fcc-l9pb7" Dec 01 08:45:23 crc kubenswrapper[5004]: I1201 08:45:23.799143 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7d1249df-92dc-4f52-8db5-8088870a934c-ovsdbserver-sb\") pod \"dnsmasq-dns-5596c69fcc-l9pb7\" (UID: \"7d1249df-92dc-4f52-8db5-8088870a934c\") " pod="openstack/dnsmasq-dns-5596c69fcc-l9pb7" Dec 01 08:45:23 crc kubenswrapper[5004]: I1201 08:45:23.799202 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/7d1249df-92dc-4f52-8db5-8088870a934c-openstack-edpm-ipam\") pod \"dnsmasq-dns-5596c69fcc-l9pb7\" (UID: \"7d1249df-92dc-4f52-8db5-8088870a934c\") " pod="openstack/dnsmasq-dns-5596c69fcc-l9pb7" Dec 01 08:45:23 crc kubenswrapper[5004]: I1201 08:45:23.799226 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7d1249df-92dc-4f52-8db5-8088870a934c-dns-swift-storage-0\") pod \"dnsmasq-dns-5596c69fcc-l9pb7\" (UID: \"7d1249df-92dc-4f52-8db5-8088870a934c\") " pod="openstack/dnsmasq-dns-5596c69fcc-l9pb7" Dec 01 08:45:23 crc kubenswrapper[5004]: I1201 08:45:23.799424 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d1249df-92dc-4f52-8db5-8088870a934c-config\") pod \"dnsmasq-dns-5596c69fcc-l9pb7\" (UID: \"7d1249df-92dc-4f52-8db5-8088870a934c\") " pod="openstack/dnsmasq-dns-5596c69fcc-l9pb7" Dec 01 08:45:23 crc kubenswrapper[5004]: I1201 08:45:23.800341 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/7d1249df-92dc-4f52-8db5-8088870a934c-config\") pod \"dnsmasq-dns-5596c69fcc-l9pb7\" (UID: \"7d1249df-92dc-4f52-8db5-8088870a934c\") " pod="openstack/dnsmasq-dns-5596c69fcc-l9pb7" Dec 01 08:45:23 crc kubenswrapper[5004]: I1201 08:45:23.800891 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7d1249df-92dc-4f52-8db5-8088870a934c-dns-svc\") pod \"dnsmasq-dns-5596c69fcc-l9pb7\" (UID: \"7d1249df-92dc-4f52-8db5-8088870a934c\") " pod="openstack/dnsmasq-dns-5596c69fcc-l9pb7" Dec 01 08:45:23 crc kubenswrapper[5004]: I1201 08:45:23.801639 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7d1249df-92dc-4f52-8db5-8088870a934c-dns-swift-storage-0\") pod \"dnsmasq-dns-5596c69fcc-l9pb7\" (UID: \"7d1249df-92dc-4f52-8db5-8088870a934c\") " pod="openstack/dnsmasq-dns-5596c69fcc-l9pb7" Dec 01 08:45:23 crc kubenswrapper[5004]: I1201 08:45:23.801785 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7d1249df-92dc-4f52-8db5-8088870a934c-ovsdbserver-nb\") pod \"dnsmasq-dns-5596c69fcc-l9pb7\" (UID: \"7d1249df-92dc-4f52-8db5-8088870a934c\") " pod="openstack/dnsmasq-dns-5596c69fcc-l9pb7" Dec 01 08:45:23 crc kubenswrapper[5004]: I1201 08:45:23.801923 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7d1249df-92dc-4f52-8db5-8088870a934c-ovsdbserver-sb\") pod \"dnsmasq-dns-5596c69fcc-l9pb7\" (UID: \"7d1249df-92dc-4f52-8db5-8088870a934c\") " pod="openstack/dnsmasq-dns-5596c69fcc-l9pb7" Dec 01 08:45:23 crc kubenswrapper[5004]: I1201 08:45:23.801938 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/7d1249df-92dc-4f52-8db5-8088870a934c-openstack-edpm-ipam\") pod 
\"dnsmasq-dns-5596c69fcc-l9pb7\" (UID: \"7d1249df-92dc-4f52-8db5-8088870a934c\") " pod="openstack/dnsmasq-dns-5596c69fcc-l9pb7" Dec 01 08:45:23 crc kubenswrapper[5004]: I1201 08:45:23.855305 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8g6lj\" (UniqueName: \"kubernetes.io/projected/7d1249df-92dc-4f52-8db5-8088870a934c-kube-api-access-8g6lj\") pod \"dnsmasq-dns-5596c69fcc-l9pb7\" (UID: \"7d1249df-92dc-4f52-8db5-8088870a934c\") " pod="openstack/dnsmasq-dns-5596c69fcc-l9pb7" Dec 01 08:45:23 crc kubenswrapper[5004]: I1201 08:45:23.913134 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5596c69fcc-l9pb7" Dec 01 08:45:23 crc kubenswrapper[5004]: I1201 08:45:23.916935 5004 generic.go:334] "Generic (PLEG): container finished" podID="8b89ff03-e585-46b1-8656-fad2acbeaeaf" containerID="bf15c83552d127166c6c320e5f2fa50fd23d5e0335b939e809ad780efec1fcb6" exitCode=0 Dec 01 08:45:23 crc kubenswrapper[5004]: I1201 08:45:23.916982 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d99f6bc7f-nk8fb" event={"ID":"8b89ff03-e585-46b1-8656-fad2acbeaeaf","Type":"ContainerDied","Data":"bf15c83552d127166c6c320e5f2fa50fd23d5e0335b939e809ad780efec1fcb6"} Dec 01 08:45:24 crc kubenswrapper[5004]: I1201 08:45:24.165919 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d99f6bc7f-nk8fb" Dec 01 08:45:24 crc kubenswrapper[5004]: I1201 08:45:24.210446 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8b89ff03-e585-46b1-8656-fad2acbeaeaf-dns-svc\") pod \"8b89ff03-e585-46b1-8656-fad2acbeaeaf\" (UID: \"8b89ff03-e585-46b1-8656-fad2acbeaeaf\") " Dec 01 08:45:24 crc kubenswrapper[5004]: I1201 08:45:24.210499 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8b89ff03-e585-46b1-8656-fad2acbeaeaf-ovsdbserver-sb\") pod \"8b89ff03-e585-46b1-8656-fad2acbeaeaf\" (UID: \"8b89ff03-e585-46b1-8656-fad2acbeaeaf\") " Dec 01 08:45:24 crc kubenswrapper[5004]: I1201 08:45:24.210714 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zmdzp\" (UniqueName: \"kubernetes.io/projected/8b89ff03-e585-46b1-8656-fad2acbeaeaf-kube-api-access-zmdzp\") pod \"8b89ff03-e585-46b1-8656-fad2acbeaeaf\" (UID: \"8b89ff03-e585-46b1-8656-fad2acbeaeaf\") " Dec 01 08:45:24 crc kubenswrapper[5004]: I1201 08:45:24.210778 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8b89ff03-e585-46b1-8656-fad2acbeaeaf-ovsdbserver-nb\") pod \"8b89ff03-e585-46b1-8656-fad2acbeaeaf\" (UID: \"8b89ff03-e585-46b1-8656-fad2acbeaeaf\") " Dec 01 08:45:24 crc kubenswrapper[5004]: I1201 08:45:24.210795 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8b89ff03-e585-46b1-8656-fad2acbeaeaf-dns-swift-storage-0\") pod \"8b89ff03-e585-46b1-8656-fad2acbeaeaf\" (UID: \"8b89ff03-e585-46b1-8656-fad2acbeaeaf\") " Dec 01 08:45:24 crc kubenswrapper[5004]: I1201 08:45:24.210843 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/8b89ff03-e585-46b1-8656-fad2acbeaeaf-config\") pod \"8b89ff03-e585-46b1-8656-fad2acbeaeaf\" (UID: \"8b89ff03-e585-46b1-8656-fad2acbeaeaf\") " Dec 01 08:45:24 crc kubenswrapper[5004]: I1201 08:45:24.219067 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b89ff03-e585-46b1-8656-fad2acbeaeaf-kube-api-access-zmdzp" (OuterVolumeSpecName: "kube-api-access-zmdzp") pod "8b89ff03-e585-46b1-8656-fad2acbeaeaf" (UID: "8b89ff03-e585-46b1-8656-fad2acbeaeaf"). InnerVolumeSpecName "kube-api-access-zmdzp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:45:24 crc kubenswrapper[5004]: I1201 08:45:24.314257 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zmdzp\" (UniqueName: \"kubernetes.io/projected/8b89ff03-e585-46b1-8656-fad2acbeaeaf-kube-api-access-zmdzp\") on node \"crc\" DevicePath \"\"" Dec 01 08:45:24 crc kubenswrapper[5004]: I1201 08:45:24.316328 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b89ff03-e585-46b1-8656-fad2acbeaeaf-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8b89ff03-e585-46b1-8656-fad2acbeaeaf" (UID: "8b89ff03-e585-46b1-8656-fad2acbeaeaf"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:45:24 crc kubenswrapper[5004]: I1201 08:45:24.374495 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b89ff03-e585-46b1-8656-fad2acbeaeaf-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8b89ff03-e585-46b1-8656-fad2acbeaeaf" (UID: "8b89ff03-e585-46b1-8656-fad2acbeaeaf"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:45:24 crc kubenswrapper[5004]: I1201 08:45:24.386205 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b89ff03-e585-46b1-8656-fad2acbeaeaf-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8b89ff03-e585-46b1-8656-fad2acbeaeaf" (UID: "8b89ff03-e585-46b1-8656-fad2acbeaeaf"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:45:24 crc kubenswrapper[5004]: I1201 08:45:24.386295 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b89ff03-e585-46b1-8656-fad2acbeaeaf-config" (OuterVolumeSpecName: "config") pod "8b89ff03-e585-46b1-8656-fad2acbeaeaf" (UID: "8b89ff03-e585-46b1-8656-fad2acbeaeaf"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:45:24 crc kubenswrapper[5004]: I1201 08:45:24.390248 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5596c69fcc-l9pb7"] Dec 01 08:45:24 crc kubenswrapper[5004]: I1201 08:45:24.395525 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b89ff03-e585-46b1-8656-fad2acbeaeaf-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "8b89ff03-e585-46b1-8656-fad2acbeaeaf" (UID: "8b89ff03-e585-46b1-8656-fad2acbeaeaf"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:45:24 crc kubenswrapper[5004]: I1201 08:45:24.416129 5004 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8b89ff03-e585-46b1-8656-fad2acbeaeaf-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 01 08:45:24 crc kubenswrapper[5004]: I1201 08:45:24.416163 5004 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8b89ff03-e585-46b1-8656-fad2acbeaeaf-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 01 08:45:24 crc kubenswrapper[5004]: I1201 08:45:24.416176 5004 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b89ff03-e585-46b1-8656-fad2acbeaeaf-config\") on node \"crc\" DevicePath \"\"" Dec 01 08:45:24 crc kubenswrapper[5004]: I1201 08:45:24.416187 5004 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8b89ff03-e585-46b1-8656-fad2acbeaeaf-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 08:45:24 crc kubenswrapper[5004]: I1201 08:45:24.416195 5004 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8b89ff03-e585-46b1-8656-fad2acbeaeaf-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 01 08:45:24 crc kubenswrapper[5004]: I1201 08:45:24.939271 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5596c69fcc-l9pb7" event={"ID":"7d1249df-92dc-4f52-8db5-8088870a934c","Type":"ContainerStarted","Data":"e105e7a81f1ffdc02a92c50462e173cb4da482e5a8ae62b65c22418a7f09a721"} Dec 01 08:45:24 crc kubenswrapper[5004]: I1201 08:45:24.941493 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d99f6bc7f-nk8fb" 
event={"ID":"8b89ff03-e585-46b1-8656-fad2acbeaeaf","Type":"ContainerDied","Data":"0cb6dcc9790d156c27828ff5cab6ae630b97e92700342b613333f6c4e44f53b4"} Dec 01 08:45:24 crc kubenswrapper[5004]: I1201 08:45:24.941524 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d99f6bc7f-nk8fb" Dec 01 08:45:24 crc kubenswrapper[5004]: I1201 08:45:24.941576 5004 scope.go:117] "RemoveContainer" containerID="bf15c83552d127166c6c320e5f2fa50fd23d5e0335b939e809ad780efec1fcb6" Dec 01 08:45:24 crc kubenswrapper[5004]: I1201 08:45:24.988679 5004 scope.go:117] "RemoveContainer" containerID="7dc69bc8e5c3093480200580f2a842cdbda452efa11d1d59ca6c66592f30215e" Dec 01 08:45:24 crc kubenswrapper[5004]: I1201 08:45:24.990219 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d99f6bc7f-nk8fb"] Dec 01 08:45:25 crc kubenswrapper[5004]: I1201 08:45:25.030875 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6d99f6bc7f-nk8fb"] Dec 01 08:45:26 crc kubenswrapper[5004]: I1201 08:45:26.773504 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b89ff03-e585-46b1-8656-fad2acbeaeaf" path="/var/lib/kubelet/pods/8b89ff03-e585-46b1-8656-fad2acbeaeaf/volumes" Dec 01 08:45:26 crc kubenswrapper[5004]: I1201 08:45:26.991030 5004 generic.go:334] "Generic (PLEG): container finished" podID="7d1249df-92dc-4f52-8db5-8088870a934c" containerID="783f4c4b201a84969d01ae43b6d32fd7d39689363acbe9fcd3a39d9d1e4c42f2" exitCode=0 Dec 01 08:45:26 crc kubenswrapper[5004]: I1201 08:45:26.991279 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5596c69fcc-l9pb7" event={"ID":"7d1249df-92dc-4f52-8db5-8088870a934c","Type":"ContainerDied","Data":"783f4c4b201a84969d01ae43b6d32fd7d39689363acbe9fcd3a39d9d1e4c42f2"} Dec 01 08:45:28 crc kubenswrapper[5004]: I1201 08:45:28.008415 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-5596c69fcc-l9pb7" event={"ID":"7d1249df-92dc-4f52-8db5-8088870a934c","Type":"ContainerStarted","Data":"7ceac32bc4c43e835f5e33339055d2dd4d1c6e701ef74622b462c9b4f368ead0"} Dec 01 08:45:28 crc kubenswrapper[5004]: I1201 08:45:28.009340 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5596c69fcc-l9pb7" Dec 01 08:45:28 crc kubenswrapper[5004]: I1201 08:45:28.011901 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-sskn9" event={"ID":"1fc1c621-41a4-4fcd-ab52-dd45c3d82080","Type":"ContainerStarted","Data":"eedac86e8773a04b9d3fb1bd246de36e21e0a24e59c7b0772bb015eb9b07fe2a"} Dec 01 08:45:28 crc kubenswrapper[5004]: I1201 08:45:28.055500 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5596c69fcc-l9pb7" podStartSLOduration=5.055475159 podStartE2EDuration="5.055475159s" podCreationTimestamp="2025-12-01 08:45:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:45:28.040375602 +0000 UTC m=+1705.605367594" watchObservedRunningTime="2025-12-01 08:45:28.055475159 +0000 UTC m=+1705.620467181" Dec 01 08:45:28 crc kubenswrapper[5004]: I1201 08:45:28.076987 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-sync-sskn9" podStartSLOduration=2.72401167 podStartE2EDuration="43.076967006s" podCreationTimestamp="2025-12-01 08:44:45 +0000 UTC" firstStartedPulling="2025-12-01 08:44:46.608371236 +0000 UTC m=+1664.173363208" lastFinishedPulling="2025-12-01 08:45:26.961326562 +0000 UTC m=+1704.526318544" observedRunningTime="2025-12-01 08:45:28.059051993 +0000 UTC m=+1705.624043995" watchObservedRunningTime="2025-12-01 08:45:28.076967006 +0000 UTC m=+1705.641958998" Dec 01 08:45:29 crc kubenswrapper[5004]: I1201 08:45:29.759882 5004 scope.go:117] "RemoveContainer" 
containerID="70f45b3d2a6bb4bf0d87f18bc08b282543e238e1d391a58745914af24ad7956c" Dec 01 08:45:29 crc kubenswrapper[5004]: E1201 08:45:29.760706 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 08:45:30 crc kubenswrapper[5004]: I1201 08:45:30.042716 5004 generic.go:334] "Generic (PLEG): container finished" podID="1fc1c621-41a4-4fcd-ab52-dd45c3d82080" containerID="eedac86e8773a04b9d3fb1bd246de36e21e0a24e59c7b0772bb015eb9b07fe2a" exitCode=0 Dec 01 08:45:30 crc kubenswrapper[5004]: I1201 08:45:30.042791 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-sskn9" event={"ID":"1fc1c621-41a4-4fcd-ab52-dd45c3d82080","Type":"ContainerDied","Data":"eedac86e8773a04b9d3fb1bd246de36e21e0a24e59c7b0772bb015eb9b07fe2a"} Dec 01 08:45:30 crc kubenswrapper[5004]: I1201 08:45:30.776503 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 01 08:45:31 crc kubenswrapper[5004]: I1201 08:45:31.590261 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-sskn9" Dec 01 08:45:31 crc kubenswrapper[5004]: I1201 08:45:31.699145 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7km7j\" (UniqueName: \"kubernetes.io/projected/1fc1c621-41a4-4fcd-ab52-dd45c3d82080-kube-api-access-7km7j\") pod \"1fc1c621-41a4-4fcd-ab52-dd45c3d82080\" (UID: \"1fc1c621-41a4-4fcd-ab52-dd45c3d82080\") " Dec 01 08:45:31 crc kubenswrapper[5004]: I1201 08:45:31.699207 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fc1c621-41a4-4fcd-ab52-dd45c3d82080-combined-ca-bundle\") pod \"1fc1c621-41a4-4fcd-ab52-dd45c3d82080\" (UID: \"1fc1c621-41a4-4fcd-ab52-dd45c3d82080\") " Dec 01 08:45:31 crc kubenswrapper[5004]: I1201 08:45:31.699338 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fc1c621-41a4-4fcd-ab52-dd45c3d82080-config-data\") pod \"1fc1c621-41a4-4fcd-ab52-dd45c3d82080\" (UID: \"1fc1c621-41a4-4fcd-ab52-dd45c3d82080\") " Dec 01 08:45:31 crc kubenswrapper[5004]: I1201 08:45:31.706105 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1fc1c621-41a4-4fcd-ab52-dd45c3d82080-kube-api-access-7km7j" (OuterVolumeSpecName: "kube-api-access-7km7j") pod "1fc1c621-41a4-4fcd-ab52-dd45c3d82080" (UID: "1fc1c621-41a4-4fcd-ab52-dd45c3d82080"). InnerVolumeSpecName "kube-api-access-7km7j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:45:31 crc kubenswrapper[5004]: I1201 08:45:31.734370 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fc1c621-41a4-4fcd-ab52-dd45c3d82080-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1fc1c621-41a4-4fcd-ab52-dd45c3d82080" (UID: "1fc1c621-41a4-4fcd-ab52-dd45c3d82080"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:45:31 crc kubenswrapper[5004]: I1201 08:45:31.802677 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7km7j\" (UniqueName: \"kubernetes.io/projected/1fc1c621-41a4-4fcd-ab52-dd45c3d82080-kube-api-access-7km7j\") on node \"crc\" DevicePath \"\"" Dec 01 08:45:31 crc kubenswrapper[5004]: I1201 08:45:31.802712 5004 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fc1c621-41a4-4fcd-ab52-dd45c3d82080-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 08:45:31 crc kubenswrapper[5004]: I1201 08:45:31.818895 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fc1c621-41a4-4fcd-ab52-dd45c3d82080-config-data" (OuterVolumeSpecName: "config-data") pod "1fc1c621-41a4-4fcd-ab52-dd45c3d82080" (UID: "1fc1c621-41a4-4fcd-ab52-dd45c3d82080"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:45:31 crc kubenswrapper[5004]: I1201 08:45:31.904645 5004 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fc1c621-41a4-4fcd-ab52-dd45c3d82080-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 08:45:32 crc kubenswrapper[5004]: I1201 08:45:32.075582 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5cdcf7de-1cb2-4071-b7e9-cdf08f6ff5c7","Type":"ContainerStarted","Data":"de21f790db1138f8d95bce7680a77c79ef09b6f889cbb51f6605c0b40fa080b2"} Dec 01 08:45:32 crc kubenswrapper[5004]: I1201 08:45:32.078077 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-sskn9" event={"ID":"1fc1c621-41a4-4fcd-ab52-dd45c3d82080","Type":"ContainerDied","Data":"32080147857fdfb086b1076e8d0fa72401785c1b6a5511ba704c2f6df380eae6"} Dec 01 08:45:32 crc kubenswrapper[5004]: I1201 08:45:32.078120 5004 pod_container_deletor.go:80] 
"Container not found in pod's containers" containerID="32080147857fdfb086b1076e8d0fa72401785c1b6a5511ba704c2f6df380eae6" Dec 01 08:45:32 crc kubenswrapper[5004]: I1201 08:45:32.078156 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-sskn9" Dec 01 08:45:32 crc kubenswrapper[5004]: I1201 08:45:32.110235 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.468638263 podStartE2EDuration="43.110219766s" podCreationTimestamp="2025-12-01 08:44:49 +0000 UTC" firstStartedPulling="2025-12-01 08:44:50.571495578 +0000 UTC m=+1668.136487560" lastFinishedPulling="2025-12-01 08:45:31.213077081 +0000 UTC m=+1708.778069063" observedRunningTime="2025-12-01 08:45:32.103253071 +0000 UTC m=+1709.668245063" watchObservedRunningTime="2025-12-01 08:45:32.110219766 +0000 UTC m=+1709.675211758" Dec 01 08:45:33 crc kubenswrapper[5004]: I1201 08:45:33.753798 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-74fd44f8dd-bd4pt"] Dec 01 08:45:33 crc kubenswrapper[5004]: E1201 08:45:33.754853 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b89ff03-e585-46b1-8656-fad2acbeaeaf" containerName="init" Dec 01 08:45:33 crc kubenswrapper[5004]: I1201 08:45:33.754873 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b89ff03-e585-46b1-8656-fad2acbeaeaf" containerName="init" Dec 01 08:45:33 crc kubenswrapper[5004]: E1201 08:45:33.754898 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b89ff03-e585-46b1-8656-fad2acbeaeaf" containerName="dnsmasq-dns" Dec 01 08:45:33 crc kubenswrapper[5004]: I1201 08:45:33.754907 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b89ff03-e585-46b1-8656-fad2acbeaeaf" containerName="dnsmasq-dns" Dec 01 08:45:33 crc kubenswrapper[5004]: E1201 08:45:33.754967 5004 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="1fc1c621-41a4-4fcd-ab52-dd45c3d82080" containerName="heat-db-sync" Dec 01 08:45:33 crc kubenswrapper[5004]: I1201 08:45:33.754977 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fc1c621-41a4-4fcd-ab52-dd45c3d82080" containerName="heat-db-sync" Dec 01 08:45:33 crc kubenswrapper[5004]: I1201 08:45:33.755280 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b89ff03-e585-46b1-8656-fad2acbeaeaf" containerName="dnsmasq-dns" Dec 01 08:45:33 crc kubenswrapper[5004]: I1201 08:45:33.755328 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fc1c621-41a4-4fcd-ab52-dd45c3d82080" containerName="heat-db-sync" Dec 01 08:45:33 crc kubenswrapper[5004]: I1201 08:45:33.756497 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-74fd44f8dd-bd4pt" Dec 01 08:45:33 crc kubenswrapper[5004]: I1201 08:45:33.765782 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-74fd44f8dd-bd4pt"] Dec 01 08:45:33 crc kubenswrapper[5004]: I1201 08:45:33.779084 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-697fddcf97-nrt68"] Dec 01 08:45:33 crc kubenswrapper[5004]: I1201 08:45:33.780741 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-697fddcf97-nrt68" Dec 01 08:45:33 crc kubenswrapper[5004]: I1201 08:45:33.791179 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-697fddcf97-nrt68"] Dec 01 08:45:33 crc kubenswrapper[5004]: I1201 08:45:33.855923 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1c0efc3a-6559-4f89-8d20-c11f7b4f291b-config-data-custom\") pod \"heat-engine-74fd44f8dd-bd4pt\" (UID: \"1c0efc3a-6559-4f89-8d20-c11f7b4f291b\") " pod="openstack/heat-engine-74fd44f8dd-bd4pt" Dec 01 08:45:33 crc kubenswrapper[5004]: I1201 08:45:33.856074 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c0efc3a-6559-4f89-8d20-c11f7b4f291b-config-data\") pod \"heat-engine-74fd44f8dd-bd4pt\" (UID: \"1c0efc3a-6559-4f89-8d20-c11f7b4f291b\") " pod="openstack/heat-engine-74fd44f8dd-bd4pt" Dec 01 08:45:33 crc kubenswrapper[5004]: I1201 08:45:33.856103 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c0efc3a-6559-4f89-8d20-c11f7b4f291b-combined-ca-bundle\") pod \"heat-engine-74fd44f8dd-bd4pt\" (UID: \"1c0efc3a-6559-4f89-8d20-c11f7b4f291b\") " pod="openstack/heat-engine-74fd44f8dd-bd4pt" Dec 01 08:45:33 crc kubenswrapper[5004]: I1201 08:45:33.857305 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wfqx\" (UniqueName: \"kubernetes.io/projected/1c0efc3a-6559-4f89-8d20-c11f7b4f291b-kube-api-access-6wfqx\") pod \"heat-engine-74fd44f8dd-bd4pt\" (UID: \"1c0efc3a-6559-4f89-8d20-c11f7b4f291b\") " pod="openstack/heat-engine-74fd44f8dd-bd4pt" Dec 01 08:45:33 crc kubenswrapper[5004]: I1201 08:45:33.882128 5004 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/heat-cfnapi-54fdd54654-fhlqp"] Dec 01 08:45:33 crc kubenswrapper[5004]: I1201 08:45:33.883765 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-54fdd54654-fhlqp" Dec 01 08:45:33 crc kubenswrapper[5004]: I1201 08:45:33.896500 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-54fdd54654-fhlqp"] Dec 01 08:45:33 crc kubenswrapper[5004]: I1201 08:45:33.914761 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5596c69fcc-l9pb7" Dec 01 08:45:33 crc kubenswrapper[5004]: I1201 08:45:33.959109 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1c0efc3a-6559-4f89-8d20-c11f7b4f291b-config-data-custom\") pod \"heat-engine-74fd44f8dd-bd4pt\" (UID: \"1c0efc3a-6559-4f89-8d20-c11f7b4f291b\") " pod="openstack/heat-engine-74fd44f8dd-bd4pt" Dec 01 08:45:33 crc kubenswrapper[5004]: I1201 08:45:33.959161 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wcswj\" (UniqueName: \"kubernetes.io/projected/7a6dfa68-ad11-4ad2-8596-951823d970cc-kube-api-access-wcswj\") pod \"heat-api-697fddcf97-nrt68\" (UID: \"7a6dfa68-ad11-4ad2-8596-951823d970cc\") " pod="openstack/heat-api-697fddcf97-nrt68" Dec 01 08:45:33 crc kubenswrapper[5004]: I1201 08:45:33.959208 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a6dfa68-ad11-4ad2-8596-951823d970cc-combined-ca-bundle\") pod \"heat-api-697fddcf97-nrt68\" (UID: \"7a6dfa68-ad11-4ad2-8596-951823d970cc\") " pod="openstack/heat-api-697fddcf97-nrt68" Dec 01 08:45:33 crc kubenswrapper[5004]: I1201 08:45:33.959255 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/1c0efc3a-6559-4f89-8d20-c11f7b4f291b-config-data\") pod \"heat-engine-74fd44f8dd-bd4pt\" (UID: \"1c0efc3a-6559-4f89-8d20-c11f7b4f291b\") " pod="openstack/heat-engine-74fd44f8dd-bd4pt" Dec 01 08:45:33 crc kubenswrapper[5004]: I1201 08:45:33.959299 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c0efc3a-6559-4f89-8d20-c11f7b4f291b-combined-ca-bundle\") pod \"heat-engine-74fd44f8dd-bd4pt\" (UID: \"1c0efc3a-6559-4f89-8d20-c11f7b4f291b\") " pod="openstack/heat-engine-74fd44f8dd-bd4pt" Dec 01 08:45:33 crc kubenswrapper[5004]: I1201 08:45:33.959367 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a6dfa68-ad11-4ad2-8596-951823d970cc-config-data\") pod \"heat-api-697fddcf97-nrt68\" (UID: \"7a6dfa68-ad11-4ad2-8596-951823d970cc\") " pod="openstack/heat-api-697fddcf97-nrt68" Dec 01 08:45:33 crc kubenswrapper[5004]: I1201 08:45:33.959443 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a6dfa68-ad11-4ad2-8596-951823d970cc-public-tls-certs\") pod \"heat-api-697fddcf97-nrt68\" (UID: \"7a6dfa68-ad11-4ad2-8596-951823d970cc\") " pod="openstack/heat-api-697fddcf97-nrt68" Dec 01 08:45:33 crc kubenswrapper[5004]: I1201 08:45:33.959476 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a6dfa68-ad11-4ad2-8596-951823d970cc-internal-tls-certs\") pod \"heat-api-697fddcf97-nrt68\" (UID: \"7a6dfa68-ad11-4ad2-8596-951823d970cc\") " pod="openstack/heat-api-697fddcf97-nrt68" Dec 01 08:45:33 crc kubenswrapper[5004]: I1201 08:45:33.959499 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6wfqx\" (UniqueName: 
\"kubernetes.io/projected/1c0efc3a-6559-4f89-8d20-c11f7b4f291b-kube-api-access-6wfqx\") pod \"heat-engine-74fd44f8dd-bd4pt\" (UID: \"1c0efc3a-6559-4f89-8d20-c11f7b4f291b\") " pod="openstack/heat-engine-74fd44f8dd-bd4pt" Dec 01 08:45:33 crc kubenswrapper[5004]: I1201 08:45:33.959527 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7a6dfa68-ad11-4ad2-8596-951823d970cc-config-data-custom\") pod \"heat-api-697fddcf97-nrt68\" (UID: \"7a6dfa68-ad11-4ad2-8596-951823d970cc\") " pod="openstack/heat-api-697fddcf97-nrt68" Dec 01 08:45:33 crc kubenswrapper[5004]: I1201 08:45:33.966543 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c0efc3a-6559-4f89-8d20-c11f7b4f291b-config-data\") pod \"heat-engine-74fd44f8dd-bd4pt\" (UID: \"1c0efc3a-6559-4f89-8d20-c11f7b4f291b\") " pod="openstack/heat-engine-74fd44f8dd-bd4pt" Dec 01 08:45:33 crc kubenswrapper[5004]: I1201 08:45:33.967535 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1c0efc3a-6559-4f89-8d20-c11f7b4f291b-config-data-custom\") pod \"heat-engine-74fd44f8dd-bd4pt\" (UID: \"1c0efc3a-6559-4f89-8d20-c11f7b4f291b\") " pod="openstack/heat-engine-74fd44f8dd-bd4pt" Dec 01 08:45:33 crc kubenswrapper[5004]: I1201 08:45:33.983814 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c0efc3a-6559-4f89-8d20-c11f7b4f291b-combined-ca-bundle\") pod \"heat-engine-74fd44f8dd-bd4pt\" (UID: \"1c0efc3a-6559-4f89-8d20-c11f7b4f291b\") " pod="openstack/heat-engine-74fd44f8dd-bd4pt" Dec 01 08:45:33 crc kubenswrapper[5004]: I1201 08:45:33.996623 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wfqx\" (UniqueName: 
\"kubernetes.io/projected/1c0efc3a-6559-4f89-8d20-c11f7b4f291b-kube-api-access-6wfqx\") pod \"heat-engine-74fd44f8dd-bd4pt\" (UID: \"1c0efc3a-6559-4f89-8d20-c11f7b4f291b\") " pod="openstack/heat-engine-74fd44f8dd-bd4pt" Dec 01 08:45:34 crc kubenswrapper[5004]: I1201 08:45:34.003070 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-594cb89c79-lt85z"] Dec 01 08:45:34 crc kubenswrapper[5004]: I1201 08:45:34.003317 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-594cb89c79-lt85z" podUID="37126008-a0f7-45ad-a2b6-ff127083b74e" containerName="dnsmasq-dns" containerID="cri-o://9e760ef0364d3c21ed545b09da924d56c4851c8e0b9a2d663e5d77154a10bf36" gracePeriod=10 Dec 01 08:45:34 crc kubenswrapper[5004]: I1201 08:45:34.060976 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a6dfa68-ad11-4ad2-8596-951823d970cc-public-tls-certs\") pod \"heat-api-697fddcf97-nrt68\" (UID: \"7a6dfa68-ad11-4ad2-8596-951823d970cc\") " pod="openstack/heat-api-697fddcf97-nrt68" Dec 01 08:45:34 crc kubenswrapper[5004]: I1201 08:45:34.061523 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a6dfa68-ad11-4ad2-8596-951823d970cc-internal-tls-certs\") pod \"heat-api-697fddcf97-nrt68\" (UID: \"7a6dfa68-ad11-4ad2-8596-951823d970cc\") " pod="openstack/heat-api-697fddcf97-nrt68" Dec 01 08:45:34 crc kubenswrapper[5004]: I1201 08:45:34.061967 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a88be62-a3b8-4a6c-b226-e0e203d4c022-combined-ca-bundle\") pod \"heat-cfnapi-54fdd54654-fhlqp\" (UID: \"8a88be62-a3b8-4a6c-b226-e0e203d4c022\") " pod="openstack/heat-cfnapi-54fdd54654-fhlqp" Dec 01 08:45:34 crc kubenswrapper[5004]: I1201 08:45:34.062086 5004 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7a6dfa68-ad11-4ad2-8596-951823d970cc-config-data-custom\") pod \"heat-api-697fddcf97-nrt68\" (UID: \"7a6dfa68-ad11-4ad2-8596-951823d970cc\") " pod="openstack/heat-api-697fddcf97-nrt68" Dec 01 08:45:34 crc kubenswrapper[5004]: I1201 08:45:34.062820 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wcswj\" (UniqueName: \"kubernetes.io/projected/7a6dfa68-ad11-4ad2-8596-951823d970cc-kube-api-access-wcswj\") pod \"heat-api-697fddcf97-nrt68\" (UID: \"7a6dfa68-ad11-4ad2-8596-951823d970cc\") " pod="openstack/heat-api-697fddcf97-nrt68" Dec 01 08:45:34 crc kubenswrapper[5004]: I1201 08:45:34.064866 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a88be62-a3b8-4a6c-b226-e0e203d4c022-internal-tls-certs\") pod \"heat-cfnapi-54fdd54654-fhlqp\" (UID: \"8a88be62-a3b8-4a6c-b226-e0e203d4c022\") " pod="openstack/heat-cfnapi-54fdd54654-fhlqp" Dec 01 08:45:34 crc kubenswrapper[5004]: I1201 08:45:34.064953 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a6dfa68-ad11-4ad2-8596-951823d970cc-combined-ca-bundle\") pod \"heat-api-697fddcf97-nrt68\" (UID: \"7a6dfa68-ad11-4ad2-8596-951823d970cc\") " pod="openstack/heat-api-697fddcf97-nrt68" Dec 01 08:45:34 crc kubenswrapper[5004]: I1201 08:45:34.065030 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8a88be62-a3b8-4a6c-b226-e0e203d4c022-config-data-custom\") pod \"heat-cfnapi-54fdd54654-fhlqp\" (UID: \"8a88be62-a3b8-4a6c-b226-e0e203d4c022\") " pod="openstack/heat-cfnapi-54fdd54654-fhlqp" Dec 01 08:45:34 crc kubenswrapper[5004]: I1201 08:45:34.065124 5004 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jks27\" (UniqueName: \"kubernetes.io/projected/8a88be62-a3b8-4a6c-b226-e0e203d4c022-kube-api-access-jks27\") pod \"heat-cfnapi-54fdd54654-fhlqp\" (UID: \"8a88be62-a3b8-4a6c-b226-e0e203d4c022\") " pod="openstack/heat-cfnapi-54fdd54654-fhlqp" Dec 01 08:45:34 crc kubenswrapper[5004]: I1201 08:45:34.065174 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a88be62-a3b8-4a6c-b226-e0e203d4c022-public-tls-certs\") pod \"heat-cfnapi-54fdd54654-fhlqp\" (UID: \"8a88be62-a3b8-4a6c-b226-e0e203d4c022\") " pod="openstack/heat-cfnapi-54fdd54654-fhlqp" Dec 01 08:45:34 crc kubenswrapper[5004]: I1201 08:45:34.065209 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a6dfa68-ad11-4ad2-8596-951823d970cc-public-tls-certs\") pod \"heat-api-697fddcf97-nrt68\" (UID: \"7a6dfa68-ad11-4ad2-8596-951823d970cc\") " pod="openstack/heat-api-697fddcf97-nrt68" Dec 01 08:45:34 crc kubenswrapper[5004]: I1201 08:45:34.065305 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a88be62-a3b8-4a6c-b226-e0e203d4c022-config-data\") pod \"heat-cfnapi-54fdd54654-fhlqp\" (UID: \"8a88be62-a3b8-4a6c-b226-e0e203d4c022\") " pod="openstack/heat-cfnapi-54fdd54654-fhlqp" Dec 01 08:45:34 crc kubenswrapper[5004]: I1201 08:45:34.065369 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a6dfa68-ad11-4ad2-8596-951823d970cc-config-data\") pod \"heat-api-697fddcf97-nrt68\" (UID: \"7a6dfa68-ad11-4ad2-8596-951823d970cc\") " pod="openstack/heat-api-697fddcf97-nrt68" Dec 01 08:45:34 crc kubenswrapper[5004]: I1201 08:45:34.066163 5004 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7a6dfa68-ad11-4ad2-8596-951823d970cc-config-data-custom\") pod \"heat-api-697fddcf97-nrt68\" (UID: \"7a6dfa68-ad11-4ad2-8596-951823d970cc\") " pod="openstack/heat-api-697fddcf97-nrt68" Dec 01 08:45:34 crc kubenswrapper[5004]: I1201 08:45:34.071281 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a6dfa68-ad11-4ad2-8596-951823d970cc-config-data\") pod \"heat-api-697fddcf97-nrt68\" (UID: \"7a6dfa68-ad11-4ad2-8596-951823d970cc\") " pod="openstack/heat-api-697fddcf97-nrt68" Dec 01 08:45:34 crc kubenswrapper[5004]: I1201 08:45:34.080340 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-74fd44f8dd-bd4pt" Dec 01 08:45:34 crc kubenswrapper[5004]: I1201 08:45:34.082220 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a6dfa68-ad11-4ad2-8596-951823d970cc-combined-ca-bundle\") pod \"heat-api-697fddcf97-nrt68\" (UID: \"7a6dfa68-ad11-4ad2-8596-951823d970cc\") " pod="openstack/heat-api-697fddcf97-nrt68" Dec 01 08:45:34 crc kubenswrapper[5004]: I1201 08:45:34.082618 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a6dfa68-ad11-4ad2-8596-951823d970cc-internal-tls-certs\") pod \"heat-api-697fddcf97-nrt68\" (UID: \"7a6dfa68-ad11-4ad2-8596-951823d970cc\") " pod="openstack/heat-api-697fddcf97-nrt68" Dec 01 08:45:34 crc kubenswrapper[5004]: I1201 08:45:34.085261 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wcswj\" (UniqueName: \"kubernetes.io/projected/7a6dfa68-ad11-4ad2-8596-951823d970cc-kube-api-access-wcswj\") pod \"heat-api-697fddcf97-nrt68\" (UID: \"7a6dfa68-ad11-4ad2-8596-951823d970cc\") " 
pod="openstack/heat-api-697fddcf97-nrt68" Dec 01 08:45:34 crc kubenswrapper[5004]: I1201 08:45:34.109430 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-697fddcf97-nrt68" Dec 01 08:45:34 crc kubenswrapper[5004]: I1201 08:45:34.170099 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a88be62-a3b8-4a6c-b226-e0e203d4c022-combined-ca-bundle\") pod \"heat-cfnapi-54fdd54654-fhlqp\" (UID: \"8a88be62-a3b8-4a6c-b226-e0e203d4c022\") " pod="openstack/heat-cfnapi-54fdd54654-fhlqp" Dec 01 08:45:34 crc kubenswrapper[5004]: I1201 08:45:34.170331 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a88be62-a3b8-4a6c-b226-e0e203d4c022-internal-tls-certs\") pod \"heat-cfnapi-54fdd54654-fhlqp\" (UID: \"8a88be62-a3b8-4a6c-b226-e0e203d4c022\") " pod="openstack/heat-cfnapi-54fdd54654-fhlqp" Dec 01 08:45:34 crc kubenswrapper[5004]: I1201 08:45:34.170409 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8a88be62-a3b8-4a6c-b226-e0e203d4c022-config-data-custom\") pod \"heat-cfnapi-54fdd54654-fhlqp\" (UID: \"8a88be62-a3b8-4a6c-b226-e0e203d4c022\") " pod="openstack/heat-cfnapi-54fdd54654-fhlqp" Dec 01 08:45:34 crc kubenswrapper[5004]: I1201 08:45:34.170461 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jks27\" (UniqueName: \"kubernetes.io/projected/8a88be62-a3b8-4a6c-b226-e0e203d4c022-kube-api-access-jks27\") pod \"heat-cfnapi-54fdd54654-fhlqp\" (UID: \"8a88be62-a3b8-4a6c-b226-e0e203d4c022\") " pod="openstack/heat-cfnapi-54fdd54654-fhlqp" Dec 01 08:45:34 crc kubenswrapper[5004]: I1201 08:45:34.170497 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/8a88be62-a3b8-4a6c-b226-e0e203d4c022-public-tls-certs\") pod \"heat-cfnapi-54fdd54654-fhlqp\" (UID: \"8a88be62-a3b8-4a6c-b226-e0e203d4c022\") " pod="openstack/heat-cfnapi-54fdd54654-fhlqp" Dec 01 08:45:34 crc kubenswrapper[5004]: I1201 08:45:34.170553 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a88be62-a3b8-4a6c-b226-e0e203d4c022-config-data\") pod \"heat-cfnapi-54fdd54654-fhlqp\" (UID: \"8a88be62-a3b8-4a6c-b226-e0e203d4c022\") " pod="openstack/heat-cfnapi-54fdd54654-fhlqp" Dec 01 08:45:34 crc kubenswrapper[5004]: I1201 08:45:34.175224 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8a88be62-a3b8-4a6c-b226-e0e203d4c022-config-data-custom\") pod \"heat-cfnapi-54fdd54654-fhlqp\" (UID: \"8a88be62-a3b8-4a6c-b226-e0e203d4c022\") " pod="openstack/heat-cfnapi-54fdd54654-fhlqp" Dec 01 08:45:34 crc kubenswrapper[5004]: I1201 08:45:34.175461 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a88be62-a3b8-4a6c-b226-e0e203d4c022-combined-ca-bundle\") pod \"heat-cfnapi-54fdd54654-fhlqp\" (UID: \"8a88be62-a3b8-4a6c-b226-e0e203d4c022\") " pod="openstack/heat-cfnapi-54fdd54654-fhlqp" Dec 01 08:45:34 crc kubenswrapper[5004]: I1201 08:45:34.176952 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a88be62-a3b8-4a6c-b226-e0e203d4c022-public-tls-certs\") pod \"heat-cfnapi-54fdd54654-fhlqp\" (UID: \"8a88be62-a3b8-4a6c-b226-e0e203d4c022\") " pod="openstack/heat-cfnapi-54fdd54654-fhlqp" Dec 01 08:45:34 crc kubenswrapper[5004]: I1201 08:45:34.177700 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a88be62-a3b8-4a6c-b226-e0e203d4c022-internal-tls-certs\") 
pod \"heat-cfnapi-54fdd54654-fhlqp\" (UID: \"8a88be62-a3b8-4a6c-b226-e0e203d4c022\") " pod="openstack/heat-cfnapi-54fdd54654-fhlqp" Dec 01 08:45:34 crc kubenswrapper[5004]: I1201 08:45:34.185145 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a88be62-a3b8-4a6c-b226-e0e203d4c022-config-data\") pod \"heat-cfnapi-54fdd54654-fhlqp\" (UID: \"8a88be62-a3b8-4a6c-b226-e0e203d4c022\") " pod="openstack/heat-cfnapi-54fdd54654-fhlqp" Dec 01 08:45:34 crc kubenswrapper[5004]: I1201 08:45:34.186988 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jks27\" (UniqueName: \"kubernetes.io/projected/8a88be62-a3b8-4a6c-b226-e0e203d4c022-kube-api-access-jks27\") pod \"heat-cfnapi-54fdd54654-fhlqp\" (UID: \"8a88be62-a3b8-4a6c-b226-e0e203d4c022\") " pod="openstack/heat-cfnapi-54fdd54654-fhlqp" Dec 01 08:45:34 crc kubenswrapper[5004]: I1201 08:45:34.237594 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-54fdd54654-fhlqp" Dec 01 08:45:34 crc kubenswrapper[5004]: W1201 08:45:34.595720 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7a6dfa68_ad11_4ad2_8596_951823d970cc.slice/crio-3ebea9c9316b0604d77d63727b9393124619414e274e9508be4fb00c7111dadc WatchSource:0}: Error finding container 3ebea9c9316b0604d77d63727b9393124619414e274e9508be4fb00c7111dadc: Status 404 returned error can't find the container with id 3ebea9c9316b0604d77d63727b9393124619414e274e9508be4fb00c7111dadc Dec 01 08:45:34 crc kubenswrapper[5004]: I1201 08:45:34.602111 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-697fddcf97-nrt68"] Dec 01 08:45:34 crc kubenswrapper[5004]: I1201 08:45:34.780080 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-74fd44f8dd-bd4pt"] Dec 01 08:45:34 crc kubenswrapper[5004]: I1201 08:45:34.810924 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-594cb89c79-lt85z" Dec 01 08:45:34 crc kubenswrapper[5004]: I1201 08:45:34.893004 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/37126008-a0f7-45ad-a2b6-ff127083b74e-openstack-edpm-ipam\") pod \"37126008-a0f7-45ad-a2b6-ff127083b74e\" (UID: \"37126008-a0f7-45ad-a2b6-ff127083b74e\") " Dec 01 08:45:34 crc kubenswrapper[5004]: I1201 08:45:34.893283 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/37126008-a0f7-45ad-a2b6-ff127083b74e-dns-swift-storage-0\") pod \"37126008-a0f7-45ad-a2b6-ff127083b74e\" (UID: \"37126008-a0f7-45ad-a2b6-ff127083b74e\") " Dec 01 08:45:34 crc kubenswrapper[5004]: I1201 08:45:34.893535 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/37126008-a0f7-45ad-a2b6-ff127083b74e-ovsdbserver-sb\") pod \"37126008-a0f7-45ad-a2b6-ff127083b74e\" (UID: \"37126008-a0f7-45ad-a2b6-ff127083b74e\") " Dec 01 08:45:34 crc kubenswrapper[5004]: I1201 08:45:34.894030 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/37126008-a0f7-45ad-a2b6-ff127083b74e-ovsdbserver-nb\") pod \"37126008-a0f7-45ad-a2b6-ff127083b74e\" (UID: \"37126008-a0f7-45ad-a2b6-ff127083b74e\") " Dec 01 08:45:34 crc kubenswrapper[5004]: I1201 08:45:34.895422 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2l562\" (UniqueName: \"kubernetes.io/projected/37126008-a0f7-45ad-a2b6-ff127083b74e-kube-api-access-2l562\") pod \"37126008-a0f7-45ad-a2b6-ff127083b74e\" (UID: \"37126008-a0f7-45ad-a2b6-ff127083b74e\") " Dec 01 08:45:34 crc kubenswrapper[5004]: I1201 08:45:34.895900 5004 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/37126008-a0f7-45ad-a2b6-ff127083b74e-dns-svc\") pod \"37126008-a0f7-45ad-a2b6-ff127083b74e\" (UID: \"37126008-a0f7-45ad-a2b6-ff127083b74e\") " Dec 01 08:45:34 crc kubenswrapper[5004]: I1201 08:45:34.897305 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37126008-a0f7-45ad-a2b6-ff127083b74e-config\") pod \"37126008-a0f7-45ad-a2b6-ff127083b74e\" (UID: \"37126008-a0f7-45ad-a2b6-ff127083b74e\") " Dec 01 08:45:34 crc kubenswrapper[5004]: I1201 08:45:34.914249 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37126008-a0f7-45ad-a2b6-ff127083b74e-kube-api-access-2l562" (OuterVolumeSpecName: "kube-api-access-2l562") pod "37126008-a0f7-45ad-a2b6-ff127083b74e" (UID: "37126008-a0f7-45ad-a2b6-ff127083b74e"). InnerVolumeSpecName "kube-api-access-2l562". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:45:34 crc kubenswrapper[5004]: I1201 08:45:34.963703 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-54fdd54654-fhlqp"] Dec 01 08:45:35 crc kubenswrapper[5004]: I1201 08:45:35.003702 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2l562\" (UniqueName: \"kubernetes.io/projected/37126008-a0f7-45ad-a2b6-ff127083b74e-kube-api-access-2l562\") on node \"crc\" DevicePath \"\"" Dec 01 08:45:35 crc kubenswrapper[5004]: I1201 08:45:35.042128 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37126008-a0f7-45ad-a2b6-ff127083b74e-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "37126008-a0f7-45ad-a2b6-ff127083b74e" (UID: "37126008-a0f7-45ad-a2b6-ff127083b74e"). InnerVolumeSpecName "openstack-edpm-ipam". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:45:35 crc kubenswrapper[5004]: I1201 08:45:35.046020 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37126008-a0f7-45ad-a2b6-ff127083b74e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "37126008-a0f7-45ad-a2b6-ff127083b74e" (UID: "37126008-a0f7-45ad-a2b6-ff127083b74e"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:45:35 crc kubenswrapper[5004]: I1201 08:45:35.046342 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37126008-a0f7-45ad-a2b6-ff127083b74e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "37126008-a0f7-45ad-a2b6-ff127083b74e" (UID: "37126008-a0f7-45ad-a2b6-ff127083b74e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:45:35 crc kubenswrapper[5004]: I1201 08:45:35.069490 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37126008-a0f7-45ad-a2b6-ff127083b74e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "37126008-a0f7-45ad-a2b6-ff127083b74e" (UID: "37126008-a0f7-45ad-a2b6-ff127083b74e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:45:35 crc kubenswrapper[5004]: I1201 08:45:35.077911 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37126008-a0f7-45ad-a2b6-ff127083b74e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "37126008-a0f7-45ad-a2b6-ff127083b74e" (UID: "37126008-a0f7-45ad-a2b6-ff127083b74e"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:45:35 crc kubenswrapper[5004]: I1201 08:45:35.096512 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37126008-a0f7-45ad-a2b6-ff127083b74e-config" (OuterVolumeSpecName: "config") pod "37126008-a0f7-45ad-a2b6-ff127083b74e" (UID: "37126008-a0f7-45ad-a2b6-ff127083b74e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:45:35 crc kubenswrapper[5004]: I1201 08:45:35.106833 5004 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/37126008-a0f7-45ad-a2b6-ff127083b74e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 01 08:45:35 crc kubenswrapper[5004]: I1201 08:45:35.106864 5004 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/37126008-a0f7-45ad-a2b6-ff127083b74e-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 08:45:35 crc kubenswrapper[5004]: I1201 08:45:35.106873 5004 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37126008-a0f7-45ad-a2b6-ff127083b74e-config\") on node \"crc\" DevicePath \"\"" Dec 01 08:45:35 crc kubenswrapper[5004]: I1201 08:45:35.106884 5004 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/37126008-a0f7-45ad-a2b6-ff127083b74e-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Dec 01 08:45:35 crc kubenswrapper[5004]: I1201 08:45:35.106893 5004 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/37126008-a0f7-45ad-a2b6-ff127083b74e-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 01 08:45:35 crc kubenswrapper[5004]: I1201 08:45:35.106903 5004 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/37126008-a0f7-45ad-a2b6-ff127083b74e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 01 08:45:35 crc kubenswrapper[5004]: I1201 08:45:35.139077 5004 generic.go:334] "Generic (PLEG): container finished" podID="37126008-a0f7-45ad-a2b6-ff127083b74e" containerID="9e760ef0364d3c21ed545b09da924d56c4851c8e0b9a2d663e5d77154a10bf36" exitCode=0 Dec 01 08:45:35 crc kubenswrapper[5004]: I1201 08:45:35.139132 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-594cb89c79-lt85z" event={"ID":"37126008-a0f7-45ad-a2b6-ff127083b74e","Type":"ContainerDied","Data":"9e760ef0364d3c21ed545b09da924d56c4851c8e0b9a2d663e5d77154a10bf36"} Dec 01 08:45:35 crc kubenswrapper[5004]: I1201 08:45:35.139159 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-594cb89c79-lt85z" event={"ID":"37126008-a0f7-45ad-a2b6-ff127083b74e","Type":"ContainerDied","Data":"44b7e48189edb10f4d86bf497d54e4bccfc2dbb880938f25e9be54743e9a4edc"} Dec 01 08:45:35 crc kubenswrapper[5004]: I1201 08:45:35.139178 5004 scope.go:117] "RemoveContainer" containerID="9e760ef0364d3c21ed545b09da924d56c4851c8e0b9a2d663e5d77154a10bf36" Dec 01 08:45:35 crc kubenswrapper[5004]: I1201 08:45:35.139314 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-594cb89c79-lt85z" Dec 01 08:45:35 crc kubenswrapper[5004]: I1201 08:45:35.144705 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-54fdd54654-fhlqp" event={"ID":"8a88be62-a3b8-4a6c-b226-e0e203d4c022","Type":"ContainerStarted","Data":"32bd219414c5c961bb793a24d16b7a12e703a38eb922aeedc6de0a9f4af30172"} Dec 01 08:45:35 crc kubenswrapper[5004]: I1201 08:45:35.146995 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-74fd44f8dd-bd4pt" event={"ID":"1c0efc3a-6559-4f89-8d20-c11f7b4f291b","Type":"ContainerStarted","Data":"c64fce695061aab7393e0e3691f5b3898dad47660eba1d971ae2f8a224d58eb7"} Dec 01 08:45:35 crc kubenswrapper[5004]: I1201 08:45:35.147020 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-74fd44f8dd-bd4pt" event={"ID":"1c0efc3a-6559-4f89-8d20-c11f7b4f291b","Type":"ContainerStarted","Data":"c3fefe05a8ea618ef6b5423a08b31b5e330cd5101ba14ad751e7c38eef9a35e9"} Dec 01 08:45:35 crc kubenswrapper[5004]: I1201 08:45:35.148044 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-74fd44f8dd-bd4pt" Dec 01 08:45:35 crc kubenswrapper[5004]: I1201 08:45:35.150237 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-697fddcf97-nrt68" event={"ID":"7a6dfa68-ad11-4ad2-8596-951823d970cc","Type":"ContainerStarted","Data":"3ebea9c9316b0604d77d63727b9393124619414e274e9508be4fb00c7111dadc"} Dec 01 08:45:35 crc kubenswrapper[5004]: I1201 08:45:35.166887 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-74fd44f8dd-bd4pt" podStartSLOduration=2.166870353 podStartE2EDuration="2.166870353s" podCreationTimestamp="2025-12-01 08:45:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:45:35.163110084 +0000 UTC m=+1712.728102066" 
watchObservedRunningTime="2025-12-01 08:45:35.166870353 +0000 UTC m=+1712.731862325" Dec 01 08:45:35 crc kubenswrapper[5004]: I1201 08:45:35.178126 5004 scope.go:117] "RemoveContainer" containerID="da3dbbd8a270e3e009bd061b9263173b7e9c79fbc742ad062a19712ac9c33067" Dec 01 08:45:35 crc kubenswrapper[5004]: I1201 08:45:35.191399 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-594cb89c79-lt85z"] Dec 01 08:45:35 crc kubenswrapper[5004]: I1201 08:45:35.202464 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-594cb89c79-lt85z"] Dec 01 08:45:35 crc kubenswrapper[5004]: I1201 08:45:35.203234 5004 scope.go:117] "RemoveContainer" containerID="9e760ef0364d3c21ed545b09da924d56c4851c8e0b9a2d663e5d77154a10bf36" Dec 01 08:45:35 crc kubenswrapper[5004]: E1201 08:45:35.203736 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e760ef0364d3c21ed545b09da924d56c4851c8e0b9a2d663e5d77154a10bf36\": container with ID starting with 9e760ef0364d3c21ed545b09da924d56c4851c8e0b9a2d663e5d77154a10bf36 not found: ID does not exist" containerID="9e760ef0364d3c21ed545b09da924d56c4851c8e0b9a2d663e5d77154a10bf36" Dec 01 08:45:35 crc kubenswrapper[5004]: I1201 08:45:35.203775 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e760ef0364d3c21ed545b09da924d56c4851c8e0b9a2d663e5d77154a10bf36"} err="failed to get container status \"9e760ef0364d3c21ed545b09da924d56c4851c8e0b9a2d663e5d77154a10bf36\": rpc error: code = NotFound desc = could not find container \"9e760ef0364d3c21ed545b09da924d56c4851c8e0b9a2d663e5d77154a10bf36\": container with ID starting with 9e760ef0364d3c21ed545b09da924d56c4851c8e0b9a2d663e5d77154a10bf36 not found: ID does not exist" Dec 01 08:45:35 crc kubenswrapper[5004]: I1201 08:45:35.203801 5004 scope.go:117] "RemoveContainer" 
containerID="da3dbbd8a270e3e009bd061b9263173b7e9c79fbc742ad062a19712ac9c33067" Dec 01 08:45:35 crc kubenswrapper[5004]: E1201 08:45:35.204139 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da3dbbd8a270e3e009bd061b9263173b7e9c79fbc742ad062a19712ac9c33067\": container with ID starting with da3dbbd8a270e3e009bd061b9263173b7e9c79fbc742ad062a19712ac9c33067 not found: ID does not exist" containerID="da3dbbd8a270e3e009bd061b9263173b7e9c79fbc742ad062a19712ac9c33067" Dec 01 08:45:35 crc kubenswrapper[5004]: I1201 08:45:35.204169 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da3dbbd8a270e3e009bd061b9263173b7e9c79fbc742ad062a19712ac9c33067"} err="failed to get container status \"da3dbbd8a270e3e009bd061b9263173b7e9c79fbc742ad062a19712ac9c33067\": rpc error: code = NotFound desc = could not find container \"da3dbbd8a270e3e009bd061b9263173b7e9c79fbc742ad062a19712ac9c33067\": container with ID starting with da3dbbd8a270e3e009bd061b9263173b7e9c79fbc742ad062a19712ac9c33067 not found: ID does not exist" Dec 01 08:45:36 crc kubenswrapper[5004]: I1201 08:45:36.770198 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37126008-a0f7-45ad-a2b6-ff127083b74e" path="/var/lib/kubelet/pods/37126008-a0f7-45ad-a2b6-ff127083b74e/volumes" Dec 01 08:45:39 crc kubenswrapper[5004]: I1201 08:45:39.203078 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-697fddcf97-nrt68" event={"ID":"7a6dfa68-ad11-4ad2-8596-951823d970cc","Type":"ContainerStarted","Data":"224c3854ccbd6286c4dc437b903ebf4099c7b1d655617cc7fa6d28d5a19fbc85"} Dec 01 08:45:39 crc kubenswrapper[5004]: I1201 08:45:39.206263 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-54fdd54654-fhlqp" 
event={"ID":"8a88be62-a3b8-4a6c-b226-e0e203d4c022","Type":"ContainerStarted","Data":"f582449ad5ed1dec3860364b1f0852423d5427ab9d841aa57c5281fbf48cb5cf"} Dec 01 08:45:39 crc kubenswrapper[5004]: I1201 08:45:39.206636 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-54fdd54654-fhlqp" Dec 01 08:45:39 crc kubenswrapper[5004]: I1201 08:45:39.229952 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-697fddcf97-nrt68" podStartSLOduration=2.807312675 podStartE2EDuration="6.229928875s" podCreationTimestamp="2025-12-01 08:45:33 +0000 UTC" firstStartedPulling="2025-12-01 08:45:34.597532869 +0000 UTC m=+1712.162524851" lastFinishedPulling="2025-12-01 08:45:38.020149069 +0000 UTC m=+1715.585141051" observedRunningTime="2025-12-01 08:45:39.22588767 +0000 UTC m=+1716.790879692" watchObservedRunningTime="2025-12-01 08:45:39.229928875 +0000 UTC m=+1716.794920867" Dec 01 08:45:39 crc kubenswrapper[5004]: I1201 08:45:39.267509 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-54fdd54654-fhlqp" podStartSLOduration=3.226528655 podStartE2EDuration="6.267483902s" podCreationTimestamp="2025-12-01 08:45:33 +0000 UTC" firstStartedPulling="2025-12-01 08:45:34.976756524 +0000 UTC m=+1712.541748506" lastFinishedPulling="2025-12-01 08:45:38.017711761 +0000 UTC m=+1715.582703753" observedRunningTime="2025-12-01 08:45:39.256090943 +0000 UTC m=+1716.821082975" watchObservedRunningTime="2025-12-01 08:45:39.267483902 +0000 UTC m=+1716.832475894" Dec 01 08:45:40 crc kubenswrapper[5004]: I1201 08:45:40.222129 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-697fddcf97-nrt68" Dec 01 08:45:44 crc kubenswrapper[5004]: I1201 08:45:44.760225 5004 scope.go:117] "RemoveContainer" containerID="70f45b3d2a6bb4bf0d87f18bc08b282543e238e1d391a58745914af24ad7956c" Dec 01 08:45:44 crc kubenswrapper[5004]: E1201 08:45:44.762043 5004 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 08:45:45 crc kubenswrapper[5004]: I1201 08:45:45.454542 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-api-697fddcf97-nrt68" Dec 01 08:45:45 crc kubenswrapper[5004]: I1201 08:45:45.466114 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-cfnapi-54fdd54654-fhlqp" Dec 01 08:45:45 crc kubenswrapper[5004]: I1201 08:45:45.545429 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-77c6469896-8fmqx"] Dec 01 08:45:45 crc kubenswrapper[5004]: I1201 08:45:45.545649 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-api-77c6469896-8fmqx" podUID="6cda3888-d928-439e-9dfa-54e3535e4be9" containerName="heat-api" containerID="cri-o://08449de0d68aad31145fdc1ede407b7192336817d4528c0aa1b7e6af9203d422" gracePeriod=60 Dec 01 08:45:45 crc kubenswrapper[5004]: I1201 08:45:45.575028 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-5b78cf6765-f2dsx"] Dec 01 08:45:45 crc kubenswrapper[5004]: I1201 08:45:45.575388 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-cfnapi-5b78cf6765-f2dsx" podUID="a98e9ac8-7fc0-4778-823a-fb3d6b8e0e1a" containerName="heat-cfnapi" containerID="cri-o://a7674e31b785c47a0c5f335ab681e055f6c081ef07eb98adb40bb183d9b0987b" gracePeriod=60 Dec 01 08:45:48 crc kubenswrapper[5004]: I1201 08:45:48.342049 5004 generic.go:334] "Generic (PLEG): container finished" podID="56831219-9428-45a8-8888-869bc645d080" 
containerID="e5af2dcf78f7d12950029477b6acb6ce3560f5f0060986de61cf5576be8671d4" exitCode=0 Dec 01 08:45:48 crc kubenswrapper[5004]: I1201 08:45:48.342155 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"56831219-9428-45a8-8888-869bc645d080","Type":"ContainerDied","Data":"e5af2dcf78f7d12950029477b6acb6ce3560f5f0060986de61cf5576be8671d4"} Dec 01 08:45:48 crc kubenswrapper[5004]: I1201 08:45:48.353147 5004 generic.go:334] "Generic (PLEG): container finished" podID="af2896c5-aa9f-47d7-ba02-ebea4bbd29ed" containerID="848c231234ec861d648dae35401b2cb1f4150307feba3b366e0ddd04db606c7b" exitCode=0 Dec 01 08:45:48 crc kubenswrapper[5004]: I1201 08:45:48.353205 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"af2896c5-aa9f-47d7-ba02-ebea4bbd29ed","Type":"ContainerDied","Data":"848c231234ec861d648dae35401b2cb1f4150307feba3b366e0ddd04db606c7b"} Dec 01 08:45:48 crc kubenswrapper[5004]: I1201 08:45:48.726502 5004 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-api-77c6469896-8fmqx" podUID="6cda3888-d928-439e-9dfa-54e3535e4be9" containerName="heat-api" probeResult="failure" output="Get \"https://10.217.0.216:8004/healthcheck\": read tcp 10.217.0.2:33966->10.217.0.216:8004: read: connection reset by peer" Dec 01 08:45:48 crc kubenswrapper[5004]: I1201 08:45:48.733691 5004 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-cfnapi-5b78cf6765-f2dsx" podUID="a98e9ac8-7fc0-4778-823a-fb3d6b8e0e1a" containerName="heat-cfnapi" probeResult="failure" output="Get \"https://10.217.0.217:8000/healthcheck\": read tcp 10.217.0.2:42406->10.217.0.217:8000: read: connection reset by peer" Dec 01 08:45:49 crc kubenswrapper[5004]: I1201 08:45:49.205751 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-77c6469896-8fmqx" Dec 01 08:45:49 crc kubenswrapper[5004]: I1201 08:45:49.301474 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6cda3888-d928-439e-9dfa-54e3535e4be9-config-data\") pod \"6cda3888-d928-439e-9dfa-54e3535e4be9\" (UID: \"6cda3888-d928-439e-9dfa-54e3535e4be9\") " Dec 01 08:45:49 crc kubenswrapper[5004]: I1201 08:45:49.301646 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q7wzx\" (UniqueName: \"kubernetes.io/projected/6cda3888-d928-439e-9dfa-54e3535e4be9-kube-api-access-q7wzx\") pod \"6cda3888-d928-439e-9dfa-54e3535e4be9\" (UID: \"6cda3888-d928-439e-9dfa-54e3535e4be9\") " Dec 01 08:45:49 crc kubenswrapper[5004]: I1201 08:45:49.301742 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6cda3888-d928-439e-9dfa-54e3535e4be9-config-data-custom\") pod \"6cda3888-d928-439e-9dfa-54e3535e4be9\" (UID: \"6cda3888-d928-439e-9dfa-54e3535e4be9\") " Dec 01 08:45:49 crc kubenswrapper[5004]: I1201 08:45:49.301772 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6cda3888-d928-439e-9dfa-54e3535e4be9-public-tls-certs\") pod \"6cda3888-d928-439e-9dfa-54e3535e4be9\" (UID: \"6cda3888-d928-439e-9dfa-54e3535e4be9\") " Dec 01 08:45:49 crc kubenswrapper[5004]: I1201 08:45:49.301855 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cda3888-d928-439e-9dfa-54e3535e4be9-combined-ca-bundle\") pod \"6cda3888-d928-439e-9dfa-54e3535e4be9\" (UID: \"6cda3888-d928-439e-9dfa-54e3535e4be9\") " Dec 01 08:45:49 crc kubenswrapper[5004]: I1201 08:45:49.301908 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6cda3888-d928-439e-9dfa-54e3535e4be9-internal-tls-certs\") pod \"6cda3888-d928-439e-9dfa-54e3535e4be9\" (UID: \"6cda3888-d928-439e-9dfa-54e3535e4be9\") " Dec 01 08:45:49 crc kubenswrapper[5004]: I1201 08:45:49.319802 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cda3888-d928-439e-9dfa-54e3535e4be9-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "6cda3888-d928-439e-9dfa-54e3535e4be9" (UID: "6cda3888-d928-439e-9dfa-54e3535e4be9"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:45:49 crc kubenswrapper[5004]: I1201 08:45:49.319990 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6cda3888-d928-439e-9dfa-54e3535e4be9-kube-api-access-q7wzx" (OuterVolumeSpecName: "kube-api-access-q7wzx") pod "6cda3888-d928-439e-9dfa-54e3535e4be9" (UID: "6cda3888-d928-439e-9dfa-54e3535e4be9"). InnerVolumeSpecName "kube-api-access-q7wzx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:45:49 crc kubenswrapper[5004]: I1201 08:45:49.413133 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q7wzx\" (UniqueName: \"kubernetes.io/projected/6cda3888-d928-439e-9dfa-54e3535e4be9-kube-api-access-q7wzx\") on node \"crc\" DevicePath \"\"" Dec 01 08:45:49 crc kubenswrapper[5004]: I1201 08:45:49.414103 5004 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6cda3888-d928-439e-9dfa-54e3535e4be9-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 01 08:45:49 crc kubenswrapper[5004]: I1201 08:45:49.429252 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cda3888-d928-439e-9dfa-54e3535e4be9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6cda3888-d928-439e-9dfa-54e3535e4be9" (UID: "6cda3888-d928-439e-9dfa-54e3535e4be9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:45:49 crc kubenswrapper[5004]: I1201 08:45:49.445938 5004 generic.go:334] "Generic (PLEG): container finished" podID="6cda3888-d928-439e-9dfa-54e3535e4be9" containerID="08449de0d68aad31145fdc1ede407b7192336817d4528c0aa1b7e6af9203d422" exitCode=0 Dec 01 08:45:49 crc kubenswrapper[5004]: I1201 08:45:49.446009 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-77c6469896-8fmqx" event={"ID":"6cda3888-d928-439e-9dfa-54e3535e4be9","Type":"ContainerDied","Data":"08449de0d68aad31145fdc1ede407b7192336817d4528c0aa1b7e6af9203d422"} Dec 01 08:45:49 crc kubenswrapper[5004]: I1201 08:45:49.446036 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-77c6469896-8fmqx" event={"ID":"6cda3888-d928-439e-9dfa-54e3535e4be9","Type":"ContainerDied","Data":"1e824aafe6b218266be44356546689940a6ea769707fa852e39c7cdf6c7c5a62"} Dec 01 08:45:49 crc kubenswrapper[5004]: I1201 08:45:49.446052 5004 scope.go:117] "RemoveContainer" containerID="08449de0d68aad31145fdc1ede407b7192336817d4528c0aa1b7e6af9203d422" Dec 01 08:45:49 crc kubenswrapper[5004]: I1201 08:45:49.446184 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-77c6469896-8fmqx" Dec 01 08:45:49 crc kubenswrapper[5004]: I1201 08:45:49.476511 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cda3888-d928-439e-9dfa-54e3535e4be9-config-data" (OuterVolumeSpecName: "config-data") pod "6cda3888-d928-439e-9dfa-54e3535e4be9" (UID: "6cda3888-d928-439e-9dfa-54e3535e4be9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:45:49 crc kubenswrapper[5004]: I1201 08:45:49.481465 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"56831219-9428-45a8-8888-869bc645d080","Type":"ContainerStarted","Data":"feebad0c424497614f290ce808228d06a351629af5b88977810333778db2b702"} Dec 01 08:45:49 crc kubenswrapper[5004]: I1201 08:45:49.482181 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Dec 01 08:45:49 crc kubenswrapper[5004]: I1201 08:45:49.493824 5004 generic.go:334] "Generic (PLEG): container finished" podID="a98e9ac8-7fc0-4778-823a-fb3d6b8e0e1a" containerID="a7674e31b785c47a0c5f335ab681e055f6c081ef07eb98adb40bb183d9b0987b" exitCode=0 Dec 01 08:45:49 crc kubenswrapper[5004]: I1201 08:45:49.493899 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cda3888-d928-439e-9dfa-54e3535e4be9-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "6cda3888-d928-439e-9dfa-54e3535e4be9" (UID: "6cda3888-d928-439e-9dfa-54e3535e4be9"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:45:49 crc kubenswrapper[5004]: I1201 08:45:49.493918 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-5b78cf6765-f2dsx" event={"ID":"a98e9ac8-7fc0-4778-823a-fb3d6b8e0e1a","Type":"ContainerDied","Data":"a7674e31b785c47a0c5f335ab681e055f6c081ef07eb98adb40bb183d9b0987b"} Dec 01 08:45:49 crc kubenswrapper[5004]: I1201 08:45:49.493944 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-5b78cf6765-f2dsx" event={"ID":"a98e9ac8-7fc0-4778-823a-fb3d6b8e0e1a","Type":"ContainerDied","Data":"54a55fc0d908cde346782c051fd460c62447ed402a487adde1b1354af012fff9"} Dec 01 08:45:49 crc kubenswrapper[5004]: I1201 08:45:49.493954 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="54a55fc0d908cde346782c051fd460c62447ed402a487adde1b1354af012fff9" Dec 01 08:45:49 crc kubenswrapper[5004]: I1201 08:45:49.507650 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cda3888-d928-439e-9dfa-54e3535e4be9-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "6cda3888-d928-439e-9dfa-54e3535e4be9" (UID: "6cda3888-d928-439e-9dfa-54e3535e4be9"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:45:49 crc kubenswrapper[5004]: I1201 08:45:49.516338 5004 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6cda3888-d928-439e-9dfa-54e3535e4be9-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 01 08:45:49 crc kubenswrapper[5004]: I1201 08:45:49.516543 5004 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cda3888-d928-439e-9dfa-54e3535e4be9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 08:45:49 crc kubenswrapper[5004]: I1201 08:45:49.516663 5004 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6cda3888-d928-439e-9dfa-54e3535e4be9-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 01 08:45:49 crc kubenswrapper[5004]: I1201 08:45:49.516736 5004 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6cda3888-d928-439e-9dfa-54e3535e4be9-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 08:45:49 crc kubenswrapper[5004]: I1201 08:45:49.518142 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"af2896c5-aa9f-47d7-ba02-ebea4bbd29ed","Type":"ContainerStarted","Data":"60a4b9b9a42985a7a6b0849406c92aaed5804c3c28710ea48e50b6984377cd81"} Dec 01 08:45:49 crc kubenswrapper[5004]: I1201 08:45:49.519362 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Dec 01 08:45:49 crc kubenswrapper[5004]: I1201 08:45:49.537393 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=38.537371728 podStartE2EDuration="38.537371728s" podCreationTimestamp="2025-12-01 08:45:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-12-01 08:45:49.517986911 +0000 UTC m=+1727.082978893" watchObservedRunningTime="2025-12-01 08:45:49.537371728 +0000 UTC m=+1727.102363710" Dec 01 08:45:49 crc kubenswrapper[5004]: I1201 08:45:49.561603 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=38.561553299 podStartE2EDuration="38.561553299s" podCreationTimestamp="2025-12-01 08:45:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 08:45:49.546189376 +0000 UTC m=+1727.111181378" watchObservedRunningTime="2025-12-01 08:45:49.561553299 +0000 UTC m=+1727.126545281" Dec 01 08:45:49 crc kubenswrapper[5004]: I1201 08:45:49.612906 5004 scope.go:117] "RemoveContainer" containerID="08449de0d68aad31145fdc1ede407b7192336817d4528c0aa1b7e6af9203d422" Dec 01 08:45:49 crc kubenswrapper[5004]: E1201 08:45:49.616939 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08449de0d68aad31145fdc1ede407b7192336817d4528c0aa1b7e6af9203d422\": container with ID starting with 08449de0d68aad31145fdc1ede407b7192336817d4528c0aa1b7e6af9203d422 not found: ID does not exist" containerID="08449de0d68aad31145fdc1ede407b7192336817d4528c0aa1b7e6af9203d422" Dec 01 08:45:49 crc kubenswrapper[5004]: I1201 08:45:49.616977 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08449de0d68aad31145fdc1ede407b7192336817d4528c0aa1b7e6af9203d422"} err="failed to get container status \"08449de0d68aad31145fdc1ede407b7192336817d4528c0aa1b7e6af9203d422\": rpc error: code = NotFound desc = could not find container \"08449de0d68aad31145fdc1ede407b7192336817d4528c0aa1b7e6af9203d422\": container with ID starting with 08449de0d68aad31145fdc1ede407b7192336817d4528c0aa1b7e6af9203d422 not found: ID does not exist" Dec 01 08:45:49 crc 
kubenswrapper[5004]: I1201 08:45:49.617151 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-5b78cf6765-f2dsx" Dec 01 08:45:49 crc kubenswrapper[5004]: I1201 08:45:49.720657 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a98e9ac8-7fc0-4778-823a-fb3d6b8e0e1a-config-data-custom\") pod \"a98e9ac8-7fc0-4778-823a-fb3d6b8e0e1a\" (UID: \"a98e9ac8-7fc0-4778-823a-fb3d6b8e0e1a\") " Dec 01 08:45:49 crc kubenswrapper[5004]: I1201 08:45:49.721379 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a98e9ac8-7fc0-4778-823a-fb3d6b8e0e1a-combined-ca-bundle\") pod \"a98e9ac8-7fc0-4778-823a-fb3d6b8e0e1a\" (UID: \"a98e9ac8-7fc0-4778-823a-fb3d6b8e0e1a\") " Dec 01 08:45:49 crc kubenswrapper[5004]: I1201 08:45:49.721478 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a98e9ac8-7fc0-4778-823a-fb3d6b8e0e1a-public-tls-certs\") pod \"a98e9ac8-7fc0-4778-823a-fb3d6b8e0e1a\" (UID: \"a98e9ac8-7fc0-4778-823a-fb3d6b8e0e1a\") " Dec 01 08:45:49 crc kubenswrapper[5004]: I1201 08:45:49.721603 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a98e9ac8-7fc0-4778-823a-fb3d6b8e0e1a-config-data\") pod \"a98e9ac8-7fc0-4778-823a-fb3d6b8e0e1a\" (UID: \"a98e9ac8-7fc0-4778-823a-fb3d6b8e0e1a\") " Dec 01 08:45:49 crc kubenswrapper[5004]: I1201 08:45:49.721712 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5znx2\" (UniqueName: \"kubernetes.io/projected/a98e9ac8-7fc0-4778-823a-fb3d6b8e0e1a-kube-api-access-5znx2\") pod \"a98e9ac8-7fc0-4778-823a-fb3d6b8e0e1a\" (UID: \"a98e9ac8-7fc0-4778-823a-fb3d6b8e0e1a\") " Dec 01 08:45:49 crc kubenswrapper[5004]: 
I1201 08:45:49.721790 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a98e9ac8-7fc0-4778-823a-fb3d6b8e0e1a-internal-tls-certs\") pod \"a98e9ac8-7fc0-4778-823a-fb3d6b8e0e1a\" (UID: \"a98e9ac8-7fc0-4778-823a-fb3d6b8e0e1a\") " Dec 01 08:45:49 crc kubenswrapper[5004]: I1201 08:45:49.730822 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a98e9ac8-7fc0-4778-823a-fb3d6b8e0e1a-kube-api-access-5znx2" (OuterVolumeSpecName: "kube-api-access-5znx2") pod "a98e9ac8-7fc0-4778-823a-fb3d6b8e0e1a" (UID: "a98e9ac8-7fc0-4778-823a-fb3d6b8e0e1a"). InnerVolumeSpecName "kube-api-access-5znx2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:45:49 crc kubenswrapper[5004]: I1201 08:45:49.737965 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a98e9ac8-7fc0-4778-823a-fb3d6b8e0e1a-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "a98e9ac8-7fc0-4778-823a-fb3d6b8e0e1a" (UID: "a98e9ac8-7fc0-4778-823a-fb3d6b8e0e1a"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:45:49 crc kubenswrapper[5004]: I1201 08:45:49.760584 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a98e9ac8-7fc0-4778-823a-fb3d6b8e0e1a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a98e9ac8-7fc0-4778-823a-fb3d6b8e0e1a" (UID: "a98e9ac8-7fc0-4778-823a-fb3d6b8e0e1a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:45:49 crc kubenswrapper[5004]: I1201 08:45:49.806022 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a98e9ac8-7fc0-4778-823a-fb3d6b8e0e1a-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "a98e9ac8-7fc0-4778-823a-fb3d6b8e0e1a" (UID: "a98e9ac8-7fc0-4778-823a-fb3d6b8e0e1a"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:45:49 crc kubenswrapper[5004]: I1201 08:45:49.824615 5004 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a98e9ac8-7fc0-4778-823a-fb3d6b8e0e1a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 08:45:49 crc kubenswrapper[5004]: I1201 08:45:49.824660 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5znx2\" (UniqueName: \"kubernetes.io/projected/a98e9ac8-7fc0-4778-823a-fb3d6b8e0e1a-kube-api-access-5znx2\") on node \"crc\" DevicePath \"\"" Dec 01 08:45:49 crc kubenswrapper[5004]: I1201 08:45:49.824675 5004 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a98e9ac8-7fc0-4778-823a-fb3d6b8e0e1a-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 01 08:45:49 crc kubenswrapper[5004]: I1201 08:45:49.824688 5004 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a98e9ac8-7fc0-4778-823a-fb3d6b8e0e1a-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 01 08:45:49 crc kubenswrapper[5004]: I1201 08:45:49.825282 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-77c6469896-8fmqx"] Dec 01 08:45:49 crc kubenswrapper[5004]: I1201 08:45:49.838758 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a98e9ac8-7fc0-4778-823a-fb3d6b8e0e1a-config-data" (OuterVolumeSpecName: 
"config-data") pod "a98e9ac8-7fc0-4778-823a-fb3d6b8e0e1a" (UID: "a98e9ac8-7fc0-4778-823a-fb3d6b8e0e1a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:45:49 crc kubenswrapper[5004]: I1201 08:45:49.864521 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-77c6469896-8fmqx"] Dec 01 08:45:49 crc kubenswrapper[5004]: I1201 08:45:49.868696 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a98e9ac8-7fc0-4778-823a-fb3d6b8e0e1a-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "a98e9ac8-7fc0-4778-823a-fb3d6b8e0e1a" (UID: "a98e9ac8-7fc0-4778-823a-fb3d6b8e0e1a"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:45:49 crc kubenswrapper[5004]: I1201 08:45:49.927289 5004 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a98e9ac8-7fc0-4778-823a-fb3d6b8e0e1a-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 01 08:45:49 crc kubenswrapper[5004]: I1201 08:45:49.927326 5004 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a98e9ac8-7fc0-4778-823a-fb3d6b8e0e1a-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 08:45:50 crc kubenswrapper[5004]: I1201 08:45:50.527776 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-5b78cf6765-f2dsx" Dec 01 08:45:50 crc kubenswrapper[5004]: I1201 08:45:50.576651 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-5b78cf6765-f2dsx"] Dec 01 08:45:50 crc kubenswrapper[5004]: I1201 08:45:50.586774 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-5b78cf6765-f2dsx"] Dec 01 08:45:50 crc kubenswrapper[5004]: I1201 08:45:50.771343 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6cda3888-d928-439e-9dfa-54e3535e4be9" path="/var/lib/kubelet/pods/6cda3888-d928-439e-9dfa-54e3535e4be9/volumes" Dec 01 08:45:50 crc kubenswrapper[5004]: I1201 08:45:50.772127 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a98e9ac8-7fc0-4778-823a-fb3d6b8e0e1a" path="/var/lib/kubelet/pods/a98e9ac8-7fc0-4778-823a-fb3d6b8e0e1a/volumes" Dec 01 08:45:53 crc kubenswrapper[5004]: I1201 08:45:53.599422 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8ttnq"] Dec 01 08:45:53 crc kubenswrapper[5004]: E1201 08:45:53.600497 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cda3888-d928-439e-9dfa-54e3535e4be9" containerName="heat-api" Dec 01 08:45:53 crc kubenswrapper[5004]: I1201 08:45:53.600513 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cda3888-d928-439e-9dfa-54e3535e4be9" containerName="heat-api" Dec 01 08:45:53 crc kubenswrapper[5004]: E1201 08:45:53.600533 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a98e9ac8-7fc0-4778-823a-fb3d6b8e0e1a" containerName="heat-cfnapi" Dec 01 08:45:53 crc kubenswrapper[5004]: I1201 08:45:53.600540 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="a98e9ac8-7fc0-4778-823a-fb3d6b8e0e1a" containerName="heat-cfnapi" Dec 01 08:45:53 crc kubenswrapper[5004]: E1201 08:45:53.600591 5004 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="37126008-a0f7-45ad-a2b6-ff127083b74e" containerName="init" Dec 01 08:45:53 crc kubenswrapper[5004]: I1201 08:45:53.600601 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="37126008-a0f7-45ad-a2b6-ff127083b74e" containerName="init" Dec 01 08:45:53 crc kubenswrapper[5004]: E1201 08:45:53.600638 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37126008-a0f7-45ad-a2b6-ff127083b74e" containerName="dnsmasq-dns" Dec 01 08:45:53 crc kubenswrapper[5004]: I1201 08:45:53.600646 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="37126008-a0f7-45ad-a2b6-ff127083b74e" containerName="dnsmasq-dns" Dec 01 08:45:53 crc kubenswrapper[5004]: I1201 08:45:53.600947 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="a98e9ac8-7fc0-4778-823a-fb3d6b8e0e1a" containerName="heat-cfnapi" Dec 01 08:45:53 crc kubenswrapper[5004]: I1201 08:45:53.600971 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="6cda3888-d928-439e-9dfa-54e3535e4be9" containerName="heat-api" Dec 01 08:45:53 crc kubenswrapper[5004]: I1201 08:45:53.600990 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="37126008-a0f7-45ad-a2b6-ff127083b74e" containerName="dnsmasq-dns" Dec 01 08:45:53 crc kubenswrapper[5004]: I1201 08:45:53.602143 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8ttnq" Dec 01 08:45:53 crc kubenswrapper[5004]: I1201 08:45:53.604336 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-pdnrq" Dec 01 08:45:53 crc kubenswrapper[5004]: I1201 08:45:53.604832 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 08:45:53 crc kubenswrapper[5004]: I1201 08:45:53.609630 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 08:45:53 crc kubenswrapper[5004]: I1201 08:45:53.609654 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 08:45:53 crc kubenswrapper[5004]: I1201 08:45:53.637279 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8ttnq"] Dec 01 08:45:53 crc kubenswrapper[5004]: I1201 08:45:53.724991 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5c56494f-200d-4f94-ad70-1f53e7b5d1fe-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-8ttnq\" (UID: \"5c56494f-200d-4f94-ad70-1f53e7b5d1fe\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8ttnq" Dec 01 08:45:53 crc kubenswrapper[5004]: I1201 08:45:53.725209 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c56494f-200d-4f94-ad70-1f53e7b5d1fe-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-8ttnq\" (UID: \"5c56494f-200d-4f94-ad70-1f53e7b5d1fe\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8ttnq" Dec 01 08:45:53 crc kubenswrapper[5004]: I1201 08:45:53.725417 5004 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhxf2\" (UniqueName: \"kubernetes.io/projected/5c56494f-200d-4f94-ad70-1f53e7b5d1fe-kube-api-access-jhxf2\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-8ttnq\" (UID: \"5c56494f-200d-4f94-ad70-1f53e7b5d1fe\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8ttnq" Dec 01 08:45:53 crc kubenswrapper[5004]: I1201 08:45:53.725501 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5c56494f-200d-4f94-ad70-1f53e7b5d1fe-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-8ttnq\" (UID: \"5c56494f-200d-4f94-ad70-1f53e7b5d1fe\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8ttnq" Dec 01 08:45:53 crc kubenswrapper[5004]: I1201 08:45:53.827898 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5c56494f-200d-4f94-ad70-1f53e7b5d1fe-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-8ttnq\" (UID: \"5c56494f-200d-4f94-ad70-1f53e7b5d1fe\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8ttnq" Dec 01 08:45:53 crc kubenswrapper[5004]: I1201 08:45:53.828203 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c56494f-200d-4f94-ad70-1f53e7b5d1fe-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-8ttnq\" (UID: \"5c56494f-200d-4f94-ad70-1f53e7b5d1fe\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8ttnq" Dec 01 08:45:53 crc kubenswrapper[5004]: I1201 08:45:53.828368 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jhxf2\" (UniqueName: \"kubernetes.io/projected/5c56494f-200d-4f94-ad70-1f53e7b5d1fe-kube-api-access-jhxf2\") 
pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-8ttnq\" (UID: \"5c56494f-200d-4f94-ad70-1f53e7b5d1fe\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8ttnq" Dec 01 08:45:53 crc kubenswrapper[5004]: I1201 08:45:53.828510 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5c56494f-200d-4f94-ad70-1f53e7b5d1fe-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-8ttnq\" (UID: \"5c56494f-200d-4f94-ad70-1f53e7b5d1fe\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8ttnq" Dec 01 08:45:53 crc kubenswrapper[5004]: I1201 08:45:53.838143 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5c56494f-200d-4f94-ad70-1f53e7b5d1fe-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-8ttnq\" (UID: \"5c56494f-200d-4f94-ad70-1f53e7b5d1fe\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8ttnq" Dec 01 08:45:53 crc kubenswrapper[5004]: I1201 08:45:53.838797 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c56494f-200d-4f94-ad70-1f53e7b5d1fe-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-8ttnq\" (UID: \"5c56494f-200d-4f94-ad70-1f53e7b5d1fe\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8ttnq" Dec 01 08:45:53 crc kubenswrapper[5004]: I1201 08:45:53.846248 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5c56494f-200d-4f94-ad70-1f53e7b5d1fe-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-8ttnq\" (UID: \"5c56494f-200d-4f94-ad70-1f53e7b5d1fe\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8ttnq" Dec 01 08:45:53 crc kubenswrapper[5004]: I1201 08:45:53.856723 5004 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-jhxf2\" (UniqueName: \"kubernetes.io/projected/5c56494f-200d-4f94-ad70-1f53e7b5d1fe-kube-api-access-jhxf2\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-8ttnq\" (UID: \"5c56494f-200d-4f94-ad70-1f53e7b5d1fe\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8ttnq" Dec 01 08:45:53 crc kubenswrapper[5004]: I1201 08:45:53.959915 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8ttnq" Dec 01 08:45:54 crc kubenswrapper[5004]: I1201 08:45:54.134784 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-74fd44f8dd-bd4pt" Dec 01 08:45:54 crc kubenswrapper[5004]: I1201 08:45:54.196183 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-7cfb9cfbf9-fqxms"] Dec 01 08:45:54 crc kubenswrapper[5004]: I1201 08:45:54.196452 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-engine-7cfb9cfbf9-fqxms" podUID="66885fce-0b69-4fc5-b4cd-f7b33bee9046" containerName="heat-engine" containerID="cri-o://e46324c44b100e99dbf3734ebb47701ffa88d1c729fc38d2cb362f220ab6edf1" gracePeriod=60 Dec 01 08:45:54 crc kubenswrapper[5004]: I1201 08:45:54.686817 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8ttnq"] Dec 01 08:45:55 crc kubenswrapper[5004]: E1201 08:45:55.224067 5004 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e46324c44b100e99dbf3734ebb47701ffa88d1c729fc38d2cb362f220ab6edf1" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Dec 01 08:45:55 crc kubenswrapper[5004]: E1201 08:45:55.225694 5004 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = 
command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e46324c44b100e99dbf3734ebb47701ffa88d1c729fc38d2cb362f220ab6edf1" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Dec 01 08:45:55 crc kubenswrapper[5004]: E1201 08:45:55.226850 5004 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e46324c44b100e99dbf3734ebb47701ffa88d1c729fc38d2cb362f220ab6edf1" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Dec 01 08:45:55 crc kubenswrapper[5004]: E1201 08:45:55.226881 5004 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/heat-engine-7cfb9cfbf9-fqxms" podUID="66885fce-0b69-4fc5-b4cd-f7b33bee9046" containerName="heat-engine" Dec 01 08:45:55 crc kubenswrapper[5004]: I1201 08:45:55.594176 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8ttnq" event={"ID":"5c56494f-200d-4f94-ad70-1f53e7b5d1fe","Type":"ContainerStarted","Data":"6c139e02b335554f26d7d65e03536d4a22b888c932cbf2290865939f2ae1a189"} Dec 01 08:45:55 crc kubenswrapper[5004]: I1201 08:45:55.963000 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-sync-hxtzx"] Dec 01 08:45:55 crc kubenswrapper[5004]: I1201 08:45:55.973626 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-sync-hxtzx"] Dec 01 08:45:56 crc kubenswrapper[5004]: I1201 08:45:56.071011 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-sync-lc7hw"] Dec 01 08:45:56 crc kubenswrapper[5004]: I1201 08:45:56.072906 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-lc7hw" Dec 01 08:45:56 crc kubenswrapper[5004]: I1201 08:45:56.075876 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 01 08:45:56 crc kubenswrapper[5004]: I1201 08:45:56.109788 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-lc7hw"] Dec 01 08:45:56 crc kubenswrapper[5004]: I1201 08:45:56.185641 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/779fe233-ca28-4e4d-adb7-fbf03e3e751e-config-data\") pod \"aodh-db-sync-lc7hw\" (UID: \"779fe233-ca28-4e4d-adb7-fbf03e3e751e\") " pod="openstack/aodh-db-sync-lc7hw" Dec 01 08:45:56 crc kubenswrapper[5004]: I1201 08:45:56.186047 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/779fe233-ca28-4e4d-adb7-fbf03e3e751e-scripts\") pod \"aodh-db-sync-lc7hw\" (UID: \"779fe233-ca28-4e4d-adb7-fbf03e3e751e\") " pod="openstack/aodh-db-sync-lc7hw" Dec 01 08:45:56 crc kubenswrapper[5004]: I1201 08:45:56.186106 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/779fe233-ca28-4e4d-adb7-fbf03e3e751e-combined-ca-bundle\") pod \"aodh-db-sync-lc7hw\" (UID: \"779fe233-ca28-4e4d-adb7-fbf03e3e751e\") " pod="openstack/aodh-db-sync-lc7hw" Dec 01 08:45:56 crc kubenswrapper[5004]: I1201 08:45:56.186221 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4gl8\" (UniqueName: \"kubernetes.io/projected/779fe233-ca28-4e4d-adb7-fbf03e3e751e-kube-api-access-d4gl8\") pod \"aodh-db-sync-lc7hw\" (UID: \"779fe233-ca28-4e4d-adb7-fbf03e3e751e\") " pod="openstack/aodh-db-sync-lc7hw" Dec 01 08:45:56 crc kubenswrapper[5004]: I1201 08:45:56.288060 5004 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4gl8\" (UniqueName: \"kubernetes.io/projected/779fe233-ca28-4e4d-adb7-fbf03e3e751e-kube-api-access-d4gl8\") pod \"aodh-db-sync-lc7hw\" (UID: \"779fe233-ca28-4e4d-adb7-fbf03e3e751e\") " pod="openstack/aodh-db-sync-lc7hw" Dec 01 08:45:56 crc kubenswrapper[5004]: I1201 08:45:56.288493 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/779fe233-ca28-4e4d-adb7-fbf03e3e751e-config-data\") pod \"aodh-db-sync-lc7hw\" (UID: \"779fe233-ca28-4e4d-adb7-fbf03e3e751e\") " pod="openstack/aodh-db-sync-lc7hw" Dec 01 08:45:56 crc kubenswrapper[5004]: I1201 08:45:56.289686 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/779fe233-ca28-4e4d-adb7-fbf03e3e751e-scripts\") pod \"aodh-db-sync-lc7hw\" (UID: \"779fe233-ca28-4e4d-adb7-fbf03e3e751e\") " pod="openstack/aodh-db-sync-lc7hw" Dec 01 08:45:56 crc kubenswrapper[5004]: I1201 08:45:56.289799 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/779fe233-ca28-4e4d-adb7-fbf03e3e751e-combined-ca-bundle\") pod \"aodh-db-sync-lc7hw\" (UID: \"779fe233-ca28-4e4d-adb7-fbf03e3e751e\") " pod="openstack/aodh-db-sync-lc7hw" Dec 01 08:45:56 crc kubenswrapper[5004]: I1201 08:45:56.298206 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/779fe233-ca28-4e4d-adb7-fbf03e3e751e-scripts\") pod \"aodh-db-sync-lc7hw\" (UID: \"779fe233-ca28-4e4d-adb7-fbf03e3e751e\") " pod="openstack/aodh-db-sync-lc7hw" Dec 01 08:45:56 crc kubenswrapper[5004]: I1201 08:45:56.298595 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/779fe233-ca28-4e4d-adb7-fbf03e3e751e-config-data\") pod 
\"aodh-db-sync-lc7hw\" (UID: \"779fe233-ca28-4e4d-adb7-fbf03e3e751e\") " pod="openstack/aodh-db-sync-lc7hw" Dec 01 08:45:56 crc kubenswrapper[5004]: I1201 08:45:56.302200 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/779fe233-ca28-4e4d-adb7-fbf03e3e751e-combined-ca-bundle\") pod \"aodh-db-sync-lc7hw\" (UID: \"779fe233-ca28-4e4d-adb7-fbf03e3e751e\") " pod="openstack/aodh-db-sync-lc7hw" Dec 01 08:45:56 crc kubenswrapper[5004]: I1201 08:45:56.306055 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4gl8\" (UniqueName: \"kubernetes.io/projected/779fe233-ca28-4e4d-adb7-fbf03e3e751e-kube-api-access-d4gl8\") pod \"aodh-db-sync-lc7hw\" (UID: \"779fe233-ca28-4e4d-adb7-fbf03e3e751e\") " pod="openstack/aodh-db-sync-lc7hw" Dec 01 08:45:56 crc kubenswrapper[5004]: I1201 08:45:56.392626 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-lc7hw" Dec 01 08:45:56 crc kubenswrapper[5004]: I1201 08:45:56.778169 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34e30621-736a-4bfd-8b6d-fbbb4350e4ad" path="/var/lib/kubelet/pods/34e30621-736a-4bfd-8b6d-fbbb4350e4ad/volumes" Dec 01 08:45:56 crc kubenswrapper[5004]: I1201 08:45:56.955196 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-lc7hw"] Dec 01 08:45:56 crc kubenswrapper[5004]: W1201 08:45:56.964143 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod779fe233_ca28_4e4d_adb7_fbf03e3e751e.slice/crio-a0c996052d70271dbbfc521f40750491529d667f3085f21169a0f09a5bc478b4 WatchSource:0}: Error finding container a0c996052d70271dbbfc521f40750491529d667f3085f21169a0f09a5bc478b4: Status 404 returned error can't find the container with id a0c996052d70271dbbfc521f40750491529d667f3085f21169a0f09a5bc478b4 Dec 01 08:45:57 crc 
kubenswrapper[5004]: I1201 08:45:57.622430 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-lc7hw" event={"ID":"779fe233-ca28-4e4d-adb7-fbf03e3e751e","Type":"ContainerStarted","Data":"a0c996052d70271dbbfc521f40750491529d667f3085f21169a0f09a5bc478b4"} Dec 01 08:45:59 crc kubenswrapper[5004]: I1201 08:45:59.759839 5004 scope.go:117] "RemoveContainer" containerID="70f45b3d2a6bb4bf0d87f18bc08b282543e238e1d391a58745914af24ad7956c" Dec 01 08:45:59 crc kubenswrapper[5004]: E1201 08:45:59.760440 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 08:46:02 crc kubenswrapper[5004]: I1201 08:46:02.368915 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Dec 01 08:46:02 crc kubenswrapper[5004]: I1201 08:46:02.396001 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 01 08:46:03 crc kubenswrapper[5004]: I1201 08:46:03.716703 5004 generic.go:334] "Generic (PLEG): container finished" podID="66885fce-0b69-4fc5-b4cd-f7b33bee9046" containerID="e46324c44b100e99dbf3734ebb47701ffa88d1c729fc38d2cb362f220ab6edf1" exitCode=0 Dec 01 08:46:03 crc kubenswrapper[5004]: I1201 08:46:03.716815 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-7cfb9cfbf9-fqxms" event={"ID":"66885fce-0b69-4fc5-b4cd-f7b33bee9046","Type":"ContainerDied","Data":"e46324c44b100e99dbf3734ebb47701ffa88d1c729fc38d2cb362f220ab6edf1"} Dec 01 08:46:05 crc kubenswrapper[5004]: E1201 08:46:05.220654 5004 log.go:32] "ExecSync cmd from runtime service failed" 
err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e46324c44b100e99dbf3734ebb47701ffa88d1c729fc38d2cb362f220ab6edf1 is running failed: container process not found" containerID="e46324c44b100e99dbf3734ebb47701ffa88d1c729fc38d2cb362f220ab6edf1" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Dec 01 08:46:05 crc kubenswrapper[5004]: E1201 08:46:05.222112 5004 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e46324c44b100e99dbf3734ebb47701ffa88d1c729fc38d2cb362f220ab6edf1 is running failed: container process not found" containerID="e46324c44b100e99dbf3734ebb47701ffa88d1c729fc38d2cb362f220ab6edf1" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Dec 01 08:46:05 crc kubenswrapper[5004]: E1201 08:46:05.222735 5004 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e46324c44b100e99dbf3734ebb47701ffa88d1c729fc38d2cb362f220ab6edf1 is running failed: container process not found" containerID="e46324c44b100e99dbf3734ebb47701ffa88d1c729fc38d2cb362f220ab6edf1" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Dec 01 08:46:05 crc kubenswrapper[5004]: E1201 08:46:05.222787 5004 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e46324c44b100e99dbf3734ebb47701ffa88d1c729fc38d2cb362f220ab6edf1 is running failed: container process not found" probeType="Readiness" pod="openstack/heat-engine-7cfb9cfbf9-fqxms" podUID="66885fce-0b69-4fc5-b4cd-f7b33bee9046" containerName="heat-engine" Dec 01 08:46:09 crc kubenswrapper[5004]: I1201 08:46:09.048910 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-7cfb9cfbf9-fqxms" Dec 01 08:46:09 crc kubenswrapper[5004]: I1201 08:46:09.139043 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66885fce-0b69-4fc5-b4cd-f7b33bee9046-combined-ca-bundle\") pod \"66885fce-0b69-4fc5-b4cd-f7b33bee9046\" (UID: \"66885fce-0b69-4fc5-b4cd-f7b33bee9046\") " Dec 01 08:46:09 crc kubenswrapper[5004]: I1201 08:46:09.139220 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/66885fce-0b69-4fc5-b4cd-f7b33bee9046-config-data-custom\") pod \"66885fce-0b69-4fc5-b4cd-f7b33bee9046\" (UID: \"66885fce-0b69-4fc5-b4cd-f7b33bee9046\") " Dec 01 08:46:09 crc kubenswrapper[5004]: I1201 08:46:09.139302 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xr6vh\" (UniqueName: \"kubernetes.io/projected/66885fce-0b69-4fc5-b4cd-f7b33bee9046-kube-api-access-xr6vh\") pod \"66885fce-0b69-4fc5-b4cd-f7b33bee9046\" (UID: \"66885fce-0b69-4fc5-b4cd-f7b33bee9046\") " Dec 01 08:46:09 crc kubenswrapper[5004]: I1201 08:46:09.139425 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66885fce-0b69-4fc5-b4cd-f7b33bee9046-config-data\") pod \"66885fce-0b69-4fc5-b4cd-f7b33bee9046\" (UID: \"66885fce-0b69-4fc5-b4cd-f7b33bee9046\") " Dec 01 08:46:09 crc kubenswrapper[5004]: I1201 08:46:09.145377 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66885fce-0b69-4fc5-b4cd-f7b33bee9046-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "66885fce-0b69-4fc5-b4cd-f7b33bee9046" (UID: "66885fce-0b69-4fc5-b4cd-f7b33bee9046"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:46:09 crc kubenswrapper[5004]: I1201 08:46:09.148820 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66885fce-0b69-4fc5-b4cd-f7b33bee9046-kube-api-access-xr6vh" (OuterVolumeSpecName: "kube-api-access-xr6vh") pod "66885fce-0b69-4fc5-b4cd-f7b33bee9046" (UID: "66885fce-0b69-4fc5-b4cd-f7b33bee9046"). InnerVolumeSpecName "kube-api-access-xr6vh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:46:09 crc kubenswrapper[5004]: I1201 08:46:09.178954 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66885fce-0b69-4fc5-b4cd-f7b33bee9046-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "66885fce-0b69-4fc5-b4cd-f7b33bee9046" (UID: "66885fce-0b69-4fc5-b4cd-f7b33bee9046"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:46:09 crc kubenswrapper[5004]: I1201 08:46:09.220382 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66885fce-0b69-4fc5-b4cd-f7b33bee9046-config-data" (OuterVolumeSpecName: "config-data") pod "66885fce-0b69-4fc5-b4cd-f7b33bee9046" (UID: "66885fce-0b69-4fc5-b4cd-f7b33bee9046"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:46:09 crc kubenswrapper[5004]: I1201 08:46:09.242624 5004 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66885fce-0b69-4fc5-b4cd-f7b33bee9046-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 08:46:09 crc kubenswrapper[5004]: I1201 08:46:09.242665 5004 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66885fce-0b69-4fc5-b4cd-f7b33bee9046-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 08:46:09 crc kubenswrapper[5004]: I1201 08:46:09.242681 5004 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/66885fce-0b69-4fc5-b4cd-f7b33bee9046-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 01 08:46:09 crc kubenswrapper[5004]: I1201 08:46:09.242693 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xr6vh\" (UniqueName: \"kubernetes.io/projected/66885fce-0b69-4fc5-b4cd-f7b33bee9046-kube-api-access-xr6vh\") on node \"crc\" DevicePath \"\"" Dec 01 08:46:09 crc kubenswrapper[5004]: E1201 08:46:09.616615 5004 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-aodh-api:current-tested" Dec 01 08:46:09 crc kubenswrapper[5004]: E1201 08:46:09.616682 5004 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-aodh-api:current-tested" Dec 01 08:46:09 crc kubenswrapper[5004]: E1201 08:46:09.616838 5004 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:aodh-db-sync,Image:quay.rdoproject.org/podified-master-centos10/openstack-aodh-api:current-tested,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:AodhPassword,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:osp-secret,},Key:AodhPassword,Optional:nil,},},},EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:aodh-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-d4gl8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42402,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerRe
sizePolicy{},RestartPolicy:nil,} start failed in pod aodh-db-sync-lc7hw_openstack(779fe233-ca28-4e4d-adb7-fbf03e3e751e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 08:46:09 crc kubenswrapper[5004]: E1201 08:46:09.618211 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"aodh-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/aodh-db-sync-lc7hw" podUID="779fe233-ca28-4e4d-adb7-fbf03e3e751e" Dec 01 08:46:09 crc kubenswrapper[5004]: I1201 08:46:09.618285 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 08:46:09 crc kubenswrapper[5004]: I1201 08:46:09.831527 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-7cfb9cfbf9-fqxms" event={"ID":"66885fce-0b69-4fc5-b4cd-f7b33bee9046","Type":"ContainerDied","Data":"1fb43b408fe73f1ef3a012e55d07847dac54817931de9e8088ce2c850afcadc7"} Dec 01 08:46:09 crc kubenswrapper[5004]: I1201 08:46:09.831584 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-7cfb9cfbf9-fqxms" Dec 01 08:46:09 crc kubenswrapper[5004]: I1201 08:46:09.831946 5004 scope.go:117] "RemoveContainer" containerID="e46324c44b100e99dbf3734ebb47701ffa88d1c729fc38d2cb362f220ab6edf1" Dec 01 08:46:09 crc kubenswrapper[5004]: E1201 08:46:09.833673 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"aodh-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-aodh-api:current-tested\\\"\"" pod="openstack/aodh-db-sync-lc7hw" podUID="779fe233-ca28-4e4d-adb7-fbf03e3e751e" Dec 01 08:46:09 crc kubenswrapper[5004]: I1201 08:46:09.890586 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-7cfb9cfbf9-fqxms"] Dec 01 08:46:09 crc kubenswrapper[5004]: I1201 08:46:09.901620 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-engine-7cfb9cfbf9-fqxms"] Dec 01 08:46:10 crc kubenswrapper[5004]: I1201 08:46:10.774776 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66885fce-0b69-4fc5-b4cd-f7b33bee9046" path="/var/lib/kubelet/pods/66885fce-0b69-4fc5-b4cd-f7b33bee9046/volumes" Dec 01 08:46:10 crc kubenswrapper[5004]: I1201 08:46:10.850331 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8ttnq" event={"ID":"5c56494f-200d-4f94-ad70-1f53e7b5d1fe","Type":"ContainerStarted","Data":"4f899c4b0f38b2585def938ed719a1a87ac43f64ac85ee844aa7bd115b5c2d16"} Dec 01 08:46:10 crc kubenswrapper[5004]: I1201 08:46:10.874342 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8ttnq" podStartSLOduration=2.943523529 podStartE2EDuration="17.874326064s" podCreationTimestamp="2025-12-01 08:45:53 +0000 UTC" firstStartedPulling="2025-12-01 08:45:54.684346285 +0000 UTC m=+1732.249338267" lastFinishedPulling="2025-12-01 
08:46:09.61514882 +0000 UTC m=+1747.180140802" observedRunningTime="2025-12-01 08:46:10.87078624 +0000 UTC m=+1748.435778262" watchObservedRunningTime="2025-12-01 08:46:10.874326064 +0000 UTC m=+1748.439318046" Dec 01 08:46:11 crc kubenswrapper[5004]: I1201 08:46:11.760001 5004 scope.go:117] "RemoveContainer" containerID="70f45b3d2a6bb4bf0d87f18bc08b282543e238e1d391a58745914af24ad7956c" Dec 01 08:46:11 crc kubenswrapper[5004]: E1201 08:46:11.760510 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 08:46:11 crc kubenswrapper[5004]: I1201 08:46:11.928415 5004 scope.go:117] "RemoveContainer" containerID="b9fd84922bf36669f215326eb888558fc06f6cbf95bab1e4f4a38827a42a3a6d" Dec 01 08:46:12 crc kubenswrapper[5004]: I1201 08:46:12.390807 5004 scope.go:117] "RemoveContainer" containerID="4de3c6d2d3d88f3acc76200c5a2cb8933d7a660a9287a6c92cb0c5adb7e50d17" Dec 01 08:46:12 crc kubenswrapper[5004]: I1201 08:46:12.431374 5004 scope.go:117] "RemoveContainer" containerID="aa8a0ddeefe479aac16227176ae767c4796462f8d57e843478acde9f393a951f" Dec 01 08:46:12 crc kubenswrapper[5004]: I1201 08:46:12.522191 5004 scope.go:117] "RemoveContainer" containerID="d3bb2dbdf7036f16f9ae1e924e8377910b900e9426f42d3670895a9275c98393" Dec 01 08:46:12 crc kubenswrapper[5004]: I1201 08:46:12.563172 5004 scope.go:117] "RemoveContainer" containerID="af4e575a8ca8e74879136f8b21f258959582db3fe1df8c6bf51d969809ba44e4" Dec 01 08:46:12 crc kubenswrapper[5004]: I1201 08:46:12.586867 5004 scope.go:117] "RemoveContainer" containerID="fd6b84e7f410a429e25a4656ba7585b1a342e9cf42fc1d50125a3d8802012aa0" Dec 01 08:46:21 crc kubenswrapper[5004]: 
I1201 08:46:21.565543 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-k2xpb"] Dec 01 08:46:21 crc kubenswrapper[5004]: E1201 08:46:21.567054 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66885fce-0b69-4fc5-b4cd-f7b33bee9046" containerName="heat-engine" Dec 01 08:46:21 crc kubenswrapper[5004]: I1201 08:46:21.567077 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="66885fce-0b69-4fc5-b4cd-f7b33bee9046" containerName="heat-engine" Dec 01 08:46:21 crc kubenswrapper[5004]: I1201 08:46:21.567548 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="66885fce-0b69-4fc5-b4cd-f7b33bee9046" containerName="heat-engine" Dec 01 08:46:21 crc kubenswrapper[5004]: I1201 08:46:21.572641 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k2xpb" Dec 01 08:46:21 crc kubenswrapper[5004]: I1201 08:46:21.589473 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-k2xpb"] Dec 01 08:46:21 crc kubenswrapper[5004]: I1201 08:46:21.668973 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5796\" (UniqueName: \"kubernetes.io/projected/dcf52602-5561-42cc-839c-5de03a1c2df5-kube-api-access-f5796\") pod \"redhat-marketplace-k2xpb\" (UID: \"dcf52602-5561-42cc-839c-5de03a1c2df5\") " pod="openshift-marketplace/redhat-marketplace-k2xpb" Dec 01 08:46:21 crc kubenswrapper[5004]: I1201 08:46:21.669049 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dcf52602-5561-42cc-839c-5de03a1c2df5-utilities\") pod \"redhat-marketplace-k2xpb\" (UID: \"dcf52602-5561-42cc-839c-5de03a1c2df5\") " pod="openshift-marketplace/redhat-marketplace-k2xpb" Dec 01 08:46:21 crc kubenswrapper[5004]: I1201 08:46:21.669521 5004 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dcf52602-5561-42cc-839c-5de03a1c2df5-catalog-content\") pod \"redhat-marketplace-k2xpb\" (UID: \"dcf52602-5561-42cc-839c-5de03a1c2df5\") " pod="openshift-marketplace/redhat-marketplace-k2xpb" Dec 01 08:46:21 crc kubenswrapper[5004]: I1201 08:46:21.772938 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f5796\" (UniqueName: \"kubernetes.io/projected/dcf52602-5561-42cc-839c-5de03a1c2df5-kube-api-access-f5796\") pod \"redhat-marketplace-k2xpb\" (UID: \"dcf52602-5561-42cc-839c-5de03a1c2df5\") " pod="openshift-marketplace/redhat-marketplace-k2xpb" Dec 01 08:46:21 crc kubenswrapper[5004]: I1201 08:46:21.773059 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dcf52602-5561-42cc-839c-5de03a1c2df5-utilities\") pod \"redhat-marketplace-k2xpb\" (UID: \"dcf52602-5561-42cc-839c-5de03a1c2df5\") " pod="openshift-marketplace/redhat-marketplace-k2xpb" Dec 01 08:46:21 crc kubenswrapper[5004]: I1201 08:46:21.773273 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dcf52602-5561-42cc-839c-5de03a1c2df5-catalog-content\") pod \"redhat-marketplace-k2xpb\" (UID: \"dcf52602-5561-42cc-839c-5de03a1c2df5\") " pod="openshift-marketplace/redhat-marketplace-k2xpb" Dec 01 08:46:21 crc kubenswrapper[5004]: I1201 08:46:21.773783 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dcf52602-5561-42cc-839c-5de03a1c2df5-utilities\") pod \"redhat-marketplace-k2xpb\" (UID: \"dcf52602-5561-42cc-839c-5de03a1c2df5\") " pod="openshift-marketplace/redhat-marketplace-k2xpb" Dec 01 08:46:21 crc kubenswrapper[5004]: I1201 08:46:21.773786 5004 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dcf52602-5561-42cc-839c-5de03a1c2df5-catalog-content\") pod \"redhat-marketplace-k2xpb\" (UID: \"dcf52602-5561-42cc-839c-5de03a1c2df5\") " pod="openshift-marketplace/redhat-marketplace-k2xpb" Dec 01 08:46:21 crc kubenswrapper[5004]: I1201 08:46:21.803221 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5796\" (UniqueName: \"kubernetes.io/projected/dcf52602-5561-42cc-839c-5de03a1c2df5-kube-api-access-f5796\") pod \"redhat-marketplace-k2xpb\" (UID: \"dcf52602-5561-42cc-839c-5de03a1c2df5\") " pod="openshift-marketplace/redhat-marketplace-k2xpb" Dec 01 08:46:21 crc kubenswrapper[5004]: I1201 08:46:21.907425 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k2xpb" Dec 01 08:46:22 crc kubenswrapper[5004]: I1201 08:46:22.443920 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-k2xpb"] Dec 01 08:46:22 crc kubenswrapper[5004]: I1201 08:46:22.987267 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 01 08:46:23 crc kubenswrapper[5004]: I1201 08:46:23.015151 5004 generic.go:334] "Generic (PLEG): container finished" podID="dcf52602-5561-42cc-839c-5de03a1c2df5" containerID="1c94751d2c3bae881fc5f4bb9c78986190eb3295e7ce908698c275a6e9b90098" exitCode=0 Dec 01 08:46:23 crc kubenswrapper[5004]: I1201 08:46:23.015226 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k2xpb" event={"ID":"dcf52602-5561-42cc-839c-5de03a1c2df5","Type":"ContainerDied","Data":"1c94751d2c3bae881fc5f4bb9c78986190eb3295e7ce908698c275a6e9b90098"} Dec 01 08:46:23 crc kubenswrapper[5004]: I1201 08:46:23.015895 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k2xpb" 
event={"ID":"dcf52602-5561-42cc-839c-5de03a1c2df5","Type":"ContainerStarted","Data":"f11725e71ea0635c5b8fb1662323c0154bb385d3805506320985c37b65215237"} Dec 01 08:46:24 crc kubenswrapper[5004]: I1201 08:46:24.030258 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k2xpb" event={"ID":"dcf52602-5561-42cc-839c-5de03a1c2df5","Type":"ContainerStarted","Data":"c2bb27d3c71493c0c09270577c523c9b65b1439c9eb98361fe60e654e5c97e38"} Dec 01 08:46:24 crc kubenswrapper[5004]: I1201 08:46:24.032183 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-lc7hw" event={"ID":"779fe233-ca28-4e4d-adb7-fbf03e3e751e","Type":"ContainerStarted","Data":"638d9453ccc6e981c9a02c8db54414892192a8639d04934294eab12a4dcac589"} Dec 01 08:46:24 crc kubenswrapper[5004]: I1201 08:46:24.073753 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-sync-lc7hw" podStartSLOduration=2.063224972 podStartE2EDuration="28.073730986s" podCreationTimestamp="2025-12-01 08:45:56 +0000 UTC" firstStartedPulling="2025-12-01 08:45:56.973702725 +0000 UTC m=+1734.538694707" lastFinishedPulling="2025-12-01 08:46:22.984208739 +0000 UTC m=+1760.549200721" observedRunningTime="2025-12-01 08:46:24.068757378 +0000 UTC m=+1761.633749360" watchObservedRunningTime="2025-12-01 08:46:24.073730986 +0000 UTC m=+1761.638722968" Dec 01 08:46:25 crc kubenswrapper[5004]: I1201 08:46:25.051046 5004 generic.go:334] "Generic (PLEG): container finished" podID="dcf52602-5561-42cc-839c-5de03a1c2df5" containerID="c2bb27d3c71493c0c09270577c523c9b65b1439c9eb98361fe60e654e5c97e38" exitCode=0 Dec 01 08:46:25 crc kubenswrapper[5004]: I1201 08:46:25.051182 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k2xpb" event={"ID":"dcf52602-5561-42cc-839c-5de03a1c2df5","Type":"ContainerDied","Data":"c2bb27d3c71493c0c09270577c523c9b65b1439c9eb98361fe60e654e5c97e38"} Dec 01 08:46:25 crc 
kubenswrapper[5004]: I1201 08:46:25.054620 5004 generic.go:334] "Generic (PLEG): container finished" podID="5c56494f-200d-4f94-ad70-1f53e7b5d1fe" containerID="4f899c4b0f38b2585def938ed719a1a87ac43f64ac85ee844aa7bd115b5c2d16" exitCode=0 Dec 01 08:46:25 crc kubenswrapper[5004]: I1201 08:46:25.054688 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8ttnq" event={"ID":"5c56494f-200d-4f94-ad70-1f53e7b5d1fe","Type":"ContainerDied","Data":"4f899c4b0f38b2585def938ed719a1a87ac43f64ac85ee844aa7bd115b5c2d16"} Dec 01 08:46:25 crc kubenswrapper[5004]: I1201 08:46:25.760306 5004 scope.go:117] "RemoveContainer" containerID="70f45b3d2a6bb4bf0d87f18bc08b282543e238e1d391a58745914af24ad7956c" Dec 01 08:46:25 crc kubenswrapper[5004]: E1201 08:46:25.761062 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 08:46:26 crc kubenswrapper[5004]: I1201 08:46:26.105308 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k2xpb" event={"ID":"dcf52602-5561-42cc-839c-5de03a1c2df5","Type":"ContainerStarted","Data":"d4969f821bd1ff8d85cd2301352f4644afe17c6c0650c7a1cd754ca079f97753"} Dec 01 08:46:26 crc kubenswrapper[5004]: I1201 08:46:26.113417 5004 generic.go:334] "Generic (PLEG): container finished" podID="779fe233-ca28-4e4d-adb7-fbf03e3e751e" containerID="638d9453ccc6e981c9a02c8db54414892192a8639d04934294eab12a4dcac589" exitCode=0 Dec 01 08:46:26 crc kubenswrapper[5004]: I1201 08:46:26.113631 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-lc7hw" 
event={"ID":"779fe233-ca28-4e4d-adb7-fbf03e3e751e","Type":"ContainerDied","Data":"638d9453ccc6e981c9a02c8db54414892192a8639d04934294eab12a4dcac589"} Dec 01 08:46:26 crc kubenswrapper[5004]: I1201 08:46:26.142531 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-k2xpb" podStartSLOduration=2.6802321129999997 podStartE2EDuration="5.142514136s" podCreationTimestamp="2025-12-01 08:46:21 +0000 UTC" firstStartedPulling="2025-12-01 08:46:23.036701918 +0000 UTC m=+1760.601693900" lastFinishedPulling="2025-12-01 08:46:25.498983931 +0000 UTC m=+1763.063975923" observedRunningTime="2025-12-01 08:46:26.130085263 +0000 UTC m=+1763.695077255" watchObservedRunningTime="2025-12-01 08:46:26.142514136 +0000 UTC m=+1763.707506108" Dec 01 08:46:26 crc kubenswrapper[5004]: I1201 08:46:26.647696 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8ttnq" Dec 01 08:46:26 crc kubenswrapper[5004]: I1201 08:46:26.806438 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c56494f-200d-4f94-ad70-1f53e7b5d1fe-repo-setup-combined-ca-bundle\") pod \"5c56494f-200d-4f94-ad70-1f53e7b5d1fe\" (UID: \"5c56494f-200d-4f94-ad70-1f53e7b5d1fe\") " Dec 01 08:46:26 crc kubenswrapper[5004]: I1201 08:46:26.806503 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5c56494f-200d-4f94-ad70-1f53e7b5d1fe-inventory\") pod \"5c56494f-200d-4f94-ad70-1f53e7b5d1fe\" (UID: \"5c56494f-200d-4f94-ad70-1f53e7b5d1fe\") " Dec 01 08:46:26 crc kubenswrapper[5004]: I1201 08:46:26.806676 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhxf2\" (UniqueName: \"kubernetes.io/projected/5c56494f-200d-4f94-ad70-1f53e7b5d1fe-kube-api-access-jhxf2\") 
pod \"5c56494f-200d-4f94-ad70-1f53e7b5d1fe\" (UID: \"5c56494f-200d-4f94-ad70-1f53e7b5d1fe\") " Dec 01 08:46:26 crc kubenswrapper[5004]: I1201 08:46:26.806860 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5c56494f-200d-4f94-ad70-1f53e7b5d1fe-ssh-key\") pod \"5c56494f-200d-4f94-ad70-1f53e7b5d1fe\" (UID: \"5c56494f-200d-4f94-ad70-1f53e7b5d1fe\") " Dec 01 08:46:26 crc kubenswrapper[5004]: I1201 08:46:26.811777 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c56494f-200d-4f94-ad70-1f53e7b5d1fe-kube-api-access-jhxf2" (OuterVolumeSpecName: "kube-api-access-jhxf2") pod "5c56494f-200d-4f94-ad70-1f53e7b5d1fe" (UID: "5c56494f-200d-4f94-ad70-1f53e7b5d1fe"). InnerVolumeSpecName "kube-api-access-jhxf2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:46:26 crc kubenswrapper[5004]: I1201 08:46:26.811890 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c56494f-200d-4f94-ad70-1f53e7b5d1fe-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "5c56494f-200d-4f94-ad70-1f53e7b5d1fe" (UID: "5c56494f-200d-4f94-ad70-1f53e7b5d1fe"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:46:26 crc kubenswrapper[5004]: I1201 08:46:26.847475 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c56494f-200d-4f94-ad70-1f53e7b5d1fe-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "5c56494f-200d-4f94-ad70-1f53e7b5d1fe" (UID: "5c56494f-200d-4f94-ad70-1f53e7b5d1fe"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:46:26 crc kubenswrapper[5004]: I1201 08:46:26.847742 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c56494f-200d-4f94-ad70-1f53e7b5d1fe-inventory" (OuterVolumeSpecName: "inventory") pod "5c56494f-200d-4f94-ad70-1f53e7b5d1fe" (UID: "5c56494f-200d-4f94-ad70-1f53e7b5d1fe"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:46:26 crc kubenswrapper[5004]: I1201 08:46:26.908960 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhxf2\" (UniqueName: \"kubernetes.io/projected/5c56494f-200d-4f94-ad70-1f53e7b5d1fe-kube-api-access-jhxf2\") on node \"crc\" DevicePath \"\"" Dec 01 08:46:26 crc kubenswrapper[5004]: I1201 08:46:26.908993 5004 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5c56494f-200d-4f94-ad70-1f53e7b5d1fe-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 08:46:26 crc kubenswrapper[5004]: I1201 08:46:26.909003 5004 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c56494f-200d-4f94-ad70-1f53e7b5d1fe-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 08:46:26 crc kubenswrapper[5004]: I1201 08:46:26.909014 5004 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5c56494f-200d-4f94-ad70-1f53e7b5d1fe-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 08:46:27 crc kubenswrapper[5004]: I1201 08:46:27.127024 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8ttnq" event={"ID":"5c56494f-200d-4f94-ad70-1f53e7b5d1fe","Type":"ContainerDied","Data":"6c139e02b335554f26d7d65e03536d4a22b888c932cbf2290865939f2ae1a189"} Dec 01 08:46:27 crc kubenswrapper[5004]: I1201 08:46:27.127365 5004 pod_container_deletor.go:80] 
"Container not found in pod's containers" containerID="6c139e02b335554f26d7d65e03536d4a22b888c932cbf2290865939f2ae1a189" Dec 01 08:46:27 crc kubenswrapper[5004]: I1201 08:46:27.127430 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8ttnq" Dec 01 08:46:27 crc kubenswrapper[5004]: I1201 08:46:27.313364 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-n2xnj"] Dec 01 08:46:27 crc kubenswrapper[5004]: E1201 08:46:27.314043 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c56494f-200d-4f94-ad70-1f53e7b5d1fe" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 01 08:46:27 crc kubenswrapper[5004]: I1201 08:46:27.314084 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c56494f-200d-4f94-ad70-1f53e7b5d1fe" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 01 08:46:27 crc kubenswrapper[5004]: I1201 08:46:27.314392 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c56494f-200d-4f94-ad70-1f53e7b5d1fe" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 01 08:46:27 crc kubenswrapper[5004]: I1201 08:46:27.315400 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-n2xnj" Dec 01 08:46:27 crc kubenswrapper[5004]: I1201 08:46:27.320540 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-pdnrq" Dec 01 08:46:27 crc kubenswrapper[5004]: I1201 08:46:27.320827 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 08:46:27 crc kubenswrapper[5004]: I1201 08:46:27.322013 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 08:46:27 crc kubenswrapper[5004]: I1201 08:46:27.322086 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 08:46:27 crc kubenswrapper[5004]: I1201 08:46:27.327799 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-n2xnj"] Dec 01 08:46:27 crc kubenswrapper[5004]: I1201 08:46:27.424384 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgjzr\" (UniqueName: \"kubernetes.io/projected/a7cd8c79-5eb5-4883-b64c-f83dff955fd0-kube-api-access-jgjzr\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-n2xnj\" (UID: \"a7cd8c79-5eb5-4883-b64c-f83dff955fd0\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-n2xnj" Dec 01 08:46:27 crc kubenswrapper[5004]: I1201 08:46:27.424471 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a7cd8c79-5eb5-4883-b64c-f83dff955fd0-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-n2xnj\" (UID: \"a7cd8c79-5eb5-4883-b64c-f83dff955fd0\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-n2xnj" Dec 01 08:46:27 crc kubenswrapper[5004]: I1201 08:46:27.424882 5004 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a7cd8c79-5eb5-4883-b64c-f83dff955fd0-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-n2xnj\" (UID: \"a7cd8c79-5eb5-4883-b64c-f83dff955fd0\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-n2xnj" Dec 01 08:46:27 crc kubenswrapper[5004]: I1201 08:46:27.526698 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jgjzr\" (UniqueName: \"kubernetes.io/projected/a7cd8c79-5eb5-4883-b64c-f83dff955fd0-kube-api-access-jgjzr\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-n2xnj\" (UID: \"a7cd8c79-5eb5-4883-b64c-f83dff955fd0\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-n2xnj" Dec 01 08:46:27 crc kubenswrapper[5004]: I1201 08:46:27.527016 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a7cd8c79-5eb5-4883-b64c-f83dff955fd0-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-n2xnj\" (UID: \"a7cd8c79-5eb5-4883-b64c-f83dff955fd0\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-n2xnj" Dec 01 08:46:27 crc kubenswrapper[5004]: I1201 08:46:27.527108 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a7cd8c79-5eb5-4883-b64c-f83dff955fd0-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-n2xnj\" (UID: \"a7cd8c79-5eb5-4883-b64c-f83dff955fd0\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-n2xnj" Dec 01 08:46:27 crc kubenswrapper[5004]: I1201 08:46:27.535308 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a7cd8c79-5eb5-4883-b64c-f83dff955fd0-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-n2xnj\" (UID: \"a7cd8c79-5eb5-4883-b64c-f83dff955fd0\") " 
pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-n2xnj" Dec 01 08:46:27 crc kubenswrapper[5004]: I1201 08:46:27.535419 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a7cd8c79-5eb5-4883-b64c-f83dff955fd0-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-n2xnj\" (UID: \"a7cd8c79-5eb5-4883-b64c-f83dff955fd0\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-n2xnj" Dec 01 08:46:27 crc kubenswrapper[5004]: I1201 08:46:27.546898 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgjzr\" (UniqueName: \"kubernetes.io/projected/a7cd8c79-5eb5-4883-b64c-f83dff955fd0-kube-api-access-jgjzr\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-n2xnj\" (UID: \"a7cd8c79-5eb5-4883-b64c-f83dff955fd0\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-n2xnj" Dec 01 08:46:27 crc kubenswrapper[5004]: I1201 08:46:27.627805 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-lc7hw" Dec 01 08:46:27 crc kubenswrapper[5004]: I1201 08:46:27.644370 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-n2xnj" Dec 01 08:46:27 crc kubenswrapper[5004]: I1201 08:46:27.730598 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/779fe233-ca28-4e4d-adb7-fbf03e3e751e-combined-ca-bundle\") pod \"779fe233-ca28-4e4d-adb7-fbf03e3e751e\" (UID: \"779fe233-ca28-4e4d-adb7-fbf03e3e751e\") " Dec 01 08:46:27 crc kubenswrapper[5004]: I1201 08:46:27.730768 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/779fe233-ca28-4e4d-adb7-fbf03e3e751e-scripts\") pod \"779fe233-ca28-4e4d-adb7-fbf03e3e751e\" (UID: \"779fe233-ca28-4e4d-adb7-fbf03e3e751e\") " Dec 01 08:46:27 crc kubenswrapper[5004]: I1201 08:46:27.730982 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4gl8\" (UniqueName: \"kubernetes.io/projected/779fe233-ca28-4e4d-adb7-fbf03e3e751e-kube-api-access-d4gl8\") pod \"779fe233-ca28-4e4d-adb7-fbf03e3e751e\" (UID: \"779fe233-ca28-4e4d-adb7-fbf03e3e751e\") " Dec 01 08:46:27 crc kubenswrapper[5004]: I1201 08:46:27.731066 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/779fe233-ca28-4e4d-adb7-fbf03e3e751e-config-data\") pod \"779fe233-ca28-4e4d-adb7-fbf03e3e751e\" (UID: \"779fe233-ca28-4e4d-adb7-fbf03e3e751e\") " Dec 01 08:46:27 crc kubenswrapper[5004]: I1201 08:46:27.742467 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/779fe233-ca28-4e4d-adb7-fbf03e3e751e-scripts" (OuterVolumeSpecName: "scripts") pod "779fe233-ca28-4e4d-adb7-fbf03e3e751e" (UID: "779fe233-ca28-4e4d-adb7-fbf03e3e751e"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:46:27 crc kubenswrapper[5004]: I1201 08:46:27.753007 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/779fe233-ca28-4e4d-adb7-fbf03e3e751e-kube-api-access-d4gl8" (OuterVolumeSpecName: "kube-api-access-d4gl8") pod "779fe233-ca28-4e4d-adb7-fbf03e3e751e" (UID: "779fe233-ca28-4e4d-adb7-fbf03e3e751e"). InnerVolumeSpecName "kube-api-access-d4gl8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:46:27 crc kubenswrapper[5004]: I1201 08:46:27.766806 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/779fe233-ca28-4e4d-adb7-fbf03e3e751e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "779fe233-ca28-4e4d-adb7-fbf03e3e751e" (UID: "779fe233-ca28-4e4d-adb7-fbf03e3e751e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:46:27 crc kubenswrapper[5004]: I1201 08:46:27.777398 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/779fe233-ca28-4e4d-adb7-fbf03e3e751e-config-data" (OuterVolumeSpecName: "config-data") pod "779fe233-ca28-4e4d-adb7-fbf03e3e751e" (UID: "779fe233-ca28-4e4d-adb7-fbf03e3e751e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:46:27 crc kubenswrapper[5004]: I1201 08:46:27.834988 5004 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/779fe233-ca28-4e4d-adb7-fbf03e3e751e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 08:46:27 crc kubenswrapper[5004]: I1201 08:46:27.835028 5004 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/779fe233-ca28-4e4d-adb7-fbf03e3e751e-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 08:46:27 crc kubenswrapper[5004]: I1201 08:46:27.835041 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4gl8\" (UniqueName: \"kubernetes.io/projected/779fe233-ca28-4e4d-adb7-fbf03e3e751e-kube-api-access-d4gl8\") on node \"crc\" DevicePath \"\"" Dec 01 08:46:27 crc kubenswrapper[5004]: I1201 08:46:27.835055 5004 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/779fe233-ca28-4e4d-adb7-fbf03e3e751e-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 08:46:28 crc kubenswrapper[5004]: I1201 08:46:28.145070 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-lc7hw" event={"ID":"779fe233-ca28-4e4d-adb7-fbf03e3e751e","Type":"ContainerDied","Data":"a0c996052d70271dbbfc521f40750491529d667f3085f21169a0f09a5bc478b4"} Dec 01 08:46:28 crc kubenswrapper[5004]: I1201 08:46:28.145504 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a0c996052d70271dbbfc521f40750491529d667f3085f21169a0f09a5bc478b4" Dec 01 08:46:28 crc kubenswrapper[5004]: I1201 08:46:28.145715 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-lc7hw" Dec 01 08:46:28 crc kubenswrapper[5004]: I1201 08:46:28.316748 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-n2xnj"] Dec 01 08:46:28 crc kubenswrapper[5004]: W1201 08:46:28.319409 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda7cd8c79_5eb5_4883_b64c_f83dff955fd0.slice/crio-eb055b9b014ea7bee7940f6b917646b0fd418738d6190d8dd4607b31273d8633 WatchSource:0}: Error finding container eb055b9b014ea7bee7940f6b917646b0fd418738d6190d8dd4607b31273d8633: Status 404 returned error can't find the container with id eb055b9b014ea7bee7940f6b917646b0fd418738d6190d8dd4607b31273d8633 Dec 01 08:46:29 crc kubenswrapper[5004]: I1201 08:46:29.161512 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-n2xnj" event={"ID":"a7cd8c79-5eb5-4883-b64c-f83dff955fd0","Type":"ContainerStarted","Data":"eb055b9b014ea7bee7940f6b917646b0fd418738d6190d8dd4607b31273d8633"} Dec 01 08:46:30 crc kubenswrapper[5004]: I1201 08:46:30.176217 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-n2xnj" event={"ID":"a7cd8c79-5eb5-4883-b64c-f83dff955fd0","Type":"ContainerStarted","Data":"6eecd9f2c42da50f6df9fb5e740e4003733972312a8ef18b8cc92a79d5d0d805"} Dec 01 08:46:30 crc kubenswrapper[5004]: I1201 08:46:30.197181 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-n2xnj" podStartSLOduration=2.5737204289999998 podStartE2EDuration="3.19715824s" podCreationTimestamp="2025-12-01 08:46:27 +0000 UTC" firstStartedPulling="2025-12-01 08:46:28.321947851 +0000 UTC m=+1765.886939833" lastFinishedPulling="2025-12-01 08:46:28.945385652 +0000 UTC m=+1766.510377644" observedRunningTime="2025-12-01 08:46:30.196330971 +0000 
UTC m=+1767.761322993" watchObservedRunningTime="2025-12-01 08:46:30.19715824 +0000 UTC m=+1767.762150252" Dec 01 08:46:31 crc kubenswrapper[5004]: I1201 08:46:31.169597 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Dec 01 08:46:31 crc kubenswrapper[5004]: I1201 08:46:31.170082 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="254314c7-5d69-4a31-b624-d985125bacee" containerName="aodh-notifier" containerID="cri-o://bfa91a9ee828473fdb3b0b3758debed03550479d13e5f986d09d96bfcaece393" gracePeriod=30 Dec 01 08:46:31 crc kubenswrapper[5004]: I1201 08:46:31.170117 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="254314c7-5d69-4a31-b624-d985125bacee" containerName="aodh-listener" containerID="cri-o://c7165fc56293d9f4bc95eb11bffcafa2f0bdacd92c7af0ff7c6c6039eca8db9c" gracePeriod=30 Dec 01 08:46:31 crc kubenswrapper[5004]: I1201 08:46:31.170230 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="254314c7-5d69-4a31-b624-d985125bacee" containerName="aodh-evaluator" containerID="cri-o://eae44cb9d6865a271883cb2bc942df42834c1acbe6e2299c68d4e66399188c6a" gracePeriod=30 Dec 01 08:46:31 crc kubenswrapper[5004]: I1201 08:46:31.170385 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="254314c7-5d69-4a31-b624-d985125bacee" containerName="aodh-api" containerID="cri-o://108bc634745220ffabaaad3cea124a02ba4a300a2ab78a8e63f1c5b6c285e421" gracePeriod=30 Dec 01 08:46:31 crc kubenswrapper[5004]: I1201 08:46:31.907879 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-k2xpb" Dec 01 08:46:31 crc kubenswrapper[5004]: I1201 08:46:31.908412 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-k2xpb" Dec 01 08:46:31 crc 
kubenswrapper[5004]: I1201 08:46:31.962704 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-k2xpb" Dec 01 08:46:32 crc kubenswrapper[5004]: I1201 08:46:32.210736 5004 generic.go:334] "Generic (PLEG): container finished" podID="254314c7-5d69-4a31-b624-d985125bacee" containerID="eae44cb9d6865a271883cb2bc942df42834c1acbe6e2299c68d4e66399188c6a" exitCode=0 Dec 01 08:46:32 crc kubenswrapper[5004]: I1201 08:46:32.211226 5004 generic.go:334] "Generic (PLEG): container finished" podID="254314c7-5d69-4a31-b624-d985125bacee" containerID="108bc634745220ffabaaad3cea124a02ba4a300a2ab78a8e63f1c5b6c285e421" exitCode=0 Dec 01 08:46:32 crc kubenswrapper[5004]: I1201 08:46:32.210856 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"254314c7-5d69-4a31-b624-d985125bacee","Type":"ContainerDied","Data":"eae44cb9d6865a271883cb2bc942df42834c1acbe6e2299c68d4e66399188c6a"} Dec 01 08:46:32 crc kubenswrapper[5004]: I1201 08:46:32.211319 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"254314c7-5d69-4a31-b624-d985125bacee","Type":"ContainerDied","Data":"108bc634745220ffabaaad3cea124a02ba4a300a2ab78a8e63f1c5b6c285e421"} Dec 01 08:46:32 crc kubenswrapper[5004]: I1201 08:46:32.273422 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-k2xpb" Dec 01 08:46:32 crc kubenswrapper[5004]: I1201 08:46:32.557119 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-k2xpb"] Dec 01 08:46:33 crc kubenswrapper[5004]: I1201 08:46:33.226799 5004 generic.go:334] "Generic (PLEG): container finished" podID="a7cd8c79-5eb5-4883-b64c-f83dff955fd0" containerID="6eecd9f2c42da50f6df9fb5e740e4003733972312a8ef18b8cc92a79d5d0d805" exitCode=0 Dec 01 08:46:33 crc kubenswrapper[5004]: I1201 08:46:33.226913 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-n2xnj" event={"ID":"a7cd8c79-5eb5-4883-b64c-f83dff955fd0","Type":"ContainerDied","Data":"6eecd9f2c42da50f6df9fb5e740e4003733972312a8ef18b8cc92a79d5d0d805"} Dec 01 08:46:34 crc kubenswrapper[5004]: I1201 08:46:34.242838 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-k2xpb" podUID="dcf52602-5561-42cc-839c-5de03a1c2df5" containerName="registry-server" containerID="cri-o://d4969f821bd1ff8d85cd2301352f4644afe17c6c0650c7a1cd754ca079f97753" gracePeriod=2 Dec 01 08:46:34 crc kubenswrapper[5004]: I1201 08:46:34.812654 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-n2xnj" Dec 01 08:46:34 crc kubenswrapper[5004]: I1201 08:46:34.822250 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k2xpb" Dec 01 08:46:34 crc kubenswrapper[5004]: I1201 08:46:34.831471 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dcf52602-5561-42cc-839c-5de03a1c2df5-catalog-content\") pod \"dcf52602-5561-42cc-839c-5de03a1c2df5\" (UID: \"dcf52602-5561-42cc-839c-5de03a1c2df5\") " Dec 01 08:46:34 crc kubenswrapper[5004]: I1201 08:46:34.831947 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a7cd8c79-5eb5-4883-b64c-f83dff955fd0-inventory\") pod \"a7cd8c79-5eb5-4883-b64c-f83dff955fd0\" (UID: \"a7cd8c79-5eb5-4883-b64c-f83dff955fd0\") " Dec 01 08:46:34 crc kubenswrapper[5004]: I1201 08:46:34.832210 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jgjzr\" (UniqueName: \"kubernetes.io/projected/a7cd8c79-5eb5-4883-b64c-f83dff955fd0-kube-api-access-jgjzr\") pod 
\"a7cd8c79-5eb5-4883-b64c-f83dff955fd0\" (UID: \"a7cd8c79-5eb5-4883-b64c-f83dff955fd0\") " Dec 01 08:46:34 crc kubenswrapper[5004]: I1201 08:46:34.832418 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a7cd8c79-5eb5-4883-b64c-f83dff955fd0-ssh-key\") pod \"a7cd8c79-5eb5-4883-b64c-f83dff955fd0\" (UID: \"a7cd8c79-5eb5-4883-b64c-f83dff955fd0\") " Dec 01 08:46:34 crc kubenswrapper[5004]: I1201 08:46:34.832787 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f5796\" (UniqueName: \"kubernetes.io/projected/dcf52602-5561-42cc-839c-5de03a1c2df5-kube-api-access-f5796\") pod \"dcf52602-5561-42cc-839c-5de03a1c2df5\" (UID: \"dcf52602-5561-42cc-839c-5de03a1c2df5\") " Dec 01 08:46:34 crc kubenswrapper[5004]: I1201 08:46:34.833031 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dcf52602-5561-42cc-839c-5de03a1c2df5-utilities\") pod \"dcf52602-5561-42cc-839c-5de03a1c2df5\" (UID: \"dcf52602-5561-42cc-839c-5de03a1c2df5\") " Dec 01 08:46:34 crc kubenswrapper[5004]: I1201 08:46:34.834506 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dcf52602-5561-42cc-839c-5de03a1c2df5-utilities" (OuterVolumeSpecName: "utilities") pod "dcf52602-5561-42cc-839c-5de03a1c2df5" (UID: "dcf52602-5561-42cc-839c-5de03a1c2df5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 08:46:34 crc kubenswrapper[5004]: I1201 08:46:34.838601 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dcf52602-5561-42cc-839c-5de03a1c2df5-kube-api-access-f5796" (OuterVolumeSpecName: "kube-api-access-f5796") pod "dcf52602-5561-42cc-839c-5de03a1c2df5" (UID: "dcf52602-5561-42cc-839c-5de03a1c2df5"). InnerVolumeSpecName "kube-api-access-f5796". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:46:34 crc kubenswrapper[5004]: I1201 08:46:34.839703 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7cd8c79-5eb5-4883-b64c-f83dff955fd0-kube-api-access-jgjzr" (OuterVolumeSpecName: "kube-api-access-jgjzr") pod "a7cd8c79-5eb5-4883-b64c-f83dff955fd0" (UID: "a7cd8c79-5eb5-4883-b64c-f83dff955fd0"). InnerVolumeSpecName "kube-api-access-jgjzr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:46:34 crc kubenswrapper[5004]: I1201 08:46:34.873284 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7cd8c79-5eb5-4883-b64c-f83dff955fd0-inventory" (OuterVolumeSpecName: "inventory") pod "a7cd8c79-5eb5-4883-b64c-f83dff955fd0" (UID: "a7cd8c79-5eb5-4883-b64c-f83dff955fd0"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:46:34 crc kubenswrapper[5004]: I1201 08:46:34.884806 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dcf52602-5561-42cc-839c-5de03a1c2df5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dcf52602-5561-42cc-839c-5de03a1c2df5" (UID: "dcf52602-5561-42cc-839c-5de03a1c2df5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 08:46:34 crc kubenswrapper[5004]: I1201 08:46:34.888181 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7cd8c79-5eb5-4883-b64c-f83dff955fd0-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "a7cd8c79-5eb5-4883-b64c-f83dff955fd0" (UID: "a7cd8c79-5eb5-4883-b64c-f83dff955fd0"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:46:34 crc kubenswrapper[5004]: I1201 08:46:34.936296 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f5796\" (UniqueName: \"kubernetes.io/projected/dcf52602-5561-42cc-839c-5de03a1c2df5-kube-api-access-f5796\") on node \"crc\" DevicePath \"\"" Dec 01 08:46:34 crc kubenswrapper[5004]: I1201 08:46:34.936594 5004 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dcf52602-5561-42cc-839c-5de03a1c2df5-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 08:46:34 crc kubenswrapper[5004]: I1201 08:46:34.936692 5004 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dcf52602-5561-42cc-839c-5de03a1c2df5-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 08:46:34 crc kubenswrapper[5004]: I1201 08:46:34.936766 5004 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a7cd8c79-5eb5-4883-b64c-f83dff955fd0-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 08:46:34 crc kubenswrapper[5004]: I1201 08:46:34.936839 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jgjzr\" (UniqueName: \"kubernetes.io/projected/a7cd8c79-5eb5-4883-b64c-f83dff955fd0-kube-api-access-jgjzr\") on node \"crc\" DevicePath \"\"" Dec 01 08:46:34 crc kubenswrapper[5004]: I1201 08:46:34.936918 5004 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a7cd8c79-5eb5-4883-b64c-f83dff955fd0-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 08:46:35 crc kubenswrapper[5004]: I1201 08:46:35.257598 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-n2xnj" event={"ID":"a7cd8c79-5eb5-4883-b64c-f83dff955fd0","Type":"ContainerDied","Data":"eb055b9b014ea7bee7940f6b917646b0fd418738d6190d8dd4607b31273d8633"} Dec 01 08:46:35 
crc kubenswrapper[5004]: I1201 08:46:35.258832 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eb055b9b014ea7bee7940f6b917646b0fd418738d6190d8dd4607b31273d8633" Dec 01 08:46:35 crc kubenswrapper[5004]: I1201 08:46:35.257721 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-n2xnj" Dec 01 08:46:35 crc kubenswrapper[5004]: I1201 08:46:35.261270 5004 generic.go:334] "Generic (PLEG): container finished" podID="dcf52602-5561-42cc-839c-5de03a1c2df5" containerID="d4969f821bd1ff8d85cd2301352f4644afe17c6c0650c7a1cd754ca079f97753" exitCode=0 Dec 01 08:46:35 crc kubenswrapper[5004]: I1201 08:46:35.261344 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k2xpb" Dec 01 08:46:35 crc kubenswrapper[5004]: I1201 08:46:35.261485 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k2xpb" event={"ID":"dcf52602-5561-42cc-839c-5de03a1c2df5","Type":"ContainerDied","Data":"d4969f821bd1ff8d85cd2301352f4644afe17c6c0650c7a1cd754ca079f97753"} Dec 01 08:46:35 crc kubenswrapper[5004]: I1201 08:46:35.261648 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k2xpb" event={"ID":"dcf52602-5561-42cc-839c-5de03a1c2df5","Type":"ContainerDied","Data":"f11725e71ea0635c5b8fb1662323c0154bb385d3805506320985c37b65215237"} Dec 01 08:46:35 crc kubenswrapper[5004]: I1201 08:46:35.261793 5004 scope.go:117] "RemoveContainer" containerID="d4969f821bd1ff8d85cd2301352f4644afe17c6c0650c7a1cd754ca079f97753" Dec 01 08:46:35 crc kubenswrapper[5004]: I1201 08:46:35.324900 5004 scope.go:117] "RemoveContainer" containerID="c2bb27d3c71493c0c09270577c523c9b65b1439c9eb98361fe60e654e5c97e38" Dec 01 08:46:35 crc kubenswrapper[5004]: I1201 08:46:35.340504 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-k2xpb"] Dec 01 08:46:35 crc kubenswrapper[5004]: I1201 08:46:35.371342 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-k2xpb"] Dec 01 08:46:35 crc kubenswrapper[5004]: I1201 08:46:35.371408 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-l7cvw"] Dec 01 08:46:35 crc kubenswrapper[5004]: E1201 08:46:35.371861 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="779fe233-ca28-4e4d-adb7-fbf03e3e751e" containerName="aodh-db-sync" Dec 01 08:46:35 crc kubenswrapper[5004]: I1201 08:46:35.371878 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="779fe233-ca28-4e4d-adb7-fbf03e3e751e" containerName="aodh-db-sync" Dec 01 08:46:35 crc kubenswrapper[5004]: E1201 08:46:35.371896 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcf52602-5561-42cc-839c-5de03a1c2df5" containerName="registry-server" Dec 01 08:46:35 crc kubenswrapper[5004]: I1201 08:46:35.371903 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcf52602-5561-42cc-839c-5de03a1c2df5" containerName="registry-server" Dec 01 08:46:35 crc kubenswrapper[5004]: E1201 08:46:35.371918 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7cd8c79-5eb5-4883-b64c-f83dff955fd0" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Dec 01 08:46:35 crc kubenswrapper[5004]: I1201 08:46:35.371925 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7cd8c79-5eb5-4883-b64c-f83dff955fd0" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Dec 01 08:46:35 crc kubenswrapper[5004]: E1201 08:46:35.371937 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcf52602-5561-42cc-839c-5de03a1c2df5" containerName="extract-utilities" Dec 01 08:46:35 crc kubenswrapper[5004]: I1201 08:46:35.371943 5004 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="dcf52602-5561-42cc-839c-5de03a1c2df5" containerName="extract-utilities" Dec 01 08:46:35 crc kubenswrapper[5004]: E1201 08:46:35.371951 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcf52602-5561-42cc-839c-5de03a1c2df5" containerName="extract-content" Dec 01 08:46:35 crc kubenswrapper[5004]: I1201 08:46:35.371957 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcf52602-5561-42cc-839c-5de03a1c2df5" containerName="extract-content" Dec 01 08:46:35 crc kubenswrapper[5004]: I1201 08:46:35.372175 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7cd8c79-5eb5-4883-b64c-f83dff955fd0" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Dec 01 08:46:35 crc kubenswrapper[5004]: I1201 08:46:35.372189 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="779fe233-ca28-4e4d-adb7-fbf03e3e751e" containerName="aodh-db-sync" Dec 01 08:46:35 crc kubenswrapper[5004]: I1201 08:46:35.372221 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="dcf52602-5561-42cc-839c-5de03a1c2df5" containerName="registry-server" Dec 01 08:46:35 crc kubenswrapper[5004]: I1201 08:46:35.372936 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-l7cvw" Dec 01 08:46:35 crc kubenswrapper[5004]: I1201 08:46:35.380805 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-l7cvw"] Dec 01 08:46:35 crc kubenswrapper[5004]: I1201 08:46:35.392854 5004 scope.go:117] "RemoveContainer" containerID="1c94751d2c3bae881fc5f4bb9c78986190eb3295e7ce908698c275a6e9b90098" Dec 01 08:46:35 crc kubenswrapper[5004]: I1201 08:46:35.393504 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-pdnrq" Dec 01 08:46:35 crc kubenswrapper[5004]: I1201 08:46:35.393670 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 08:46:35 crc kubenswrapper[5004]: I1201 08:46:35.393845 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 08:46:35 crc kubenswrapper[5004]: I1201 08:46:35.393943 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 08:46:35 crc kubenswrapper[5004]: I1201 08:46:35.445232 5004 scope.go:117] "RemoveContainer" containerID="d4969f821bd1ff8d85cd2301352f4644afe17c6c0650c7a1cd754ca079f97753" Dec 01 08:46:35 crc kubenswrapper[5004]: E1201 08:46:35.445829 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4969f821bd1ff8d85cd2301352f4644afe17c6c0650c7a1cd754ca079f97753\": container with ID starting with d4969f821bd1ff8d85cd2301352f4644afe17c6c0650c7a1cd754ca079f97753 not found: ID does not exist" containerID="d4969f821bd1ff8d85cd2301352f4644afe17c6c0650c7a1cd754ca079f97753" Dec 01 08:46:35 crc kubenswrapper[5004]: I1201 08:46:35.445895 5004 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d4969f821bd1ff8d85cd2301352f4644afe17c6c0650c7a1cd754ca079f97753"} err="failed to get container status \"d4969f821bd1ff8d85cd2301352f4644afe17c6c0650c7a1cd754ca079f97753\": rpc error: code = NotFound desc = could not find container \"d4969f821bd1ff8d85cd2301352f4644afe17c6c0650c7a1cd754ca079f97753\": container with ID starting with d4969f821bd1ff8d85cd2301352f4644afe17c6c0650c7a1cd754ca079f97753 not found: ID does not exist" Dec 01 08:46:35 crc kubenswrapper[5004]: I1201 08:46:35.445931 5004 scope.go:117] "RemoveContainer" containerID="c2bb27d3c71493c0c09270577c523c9b65b1439c9eb98361fe60e654e5c97e38" Dec 01 08:46:35 crc kubenswrapper[5004]: E1201 08:46:35.446613 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c2bb27d3c71493c0c09270577c523c9b65b1439c9eb98361fe60e654e5c97e38\": container with ID starting with c2bb27d3c71493c0c09270577c523c9b65b1439c9eb98361fe60e654e5c97e38 not found: ID does not exist" containerID="c2bb27d3c71493c0c09270577c523c9b65b1439c9eb98361fe60e654e5c97e38" Dec 01 08:46:35 crc kubenswrapper[5004]: I1201 08:46:35.446664 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c2bb27d3c71493c0c09270577c523c9b65b1439c9eb98361fe60e654e5c97e38"} err="failed to get container status \"c2bb27d3c71493c0c09270577c523c9b65b1439c9eb98361fe60e654e5c97e38\": rpc error: code = NotFound desc = could not find container \"c2bb27d3c71493c0c09270577c523c9b65b1439c9eb98361fe60e654e5c97e38\": container with ID starting with c2bb27d3c71493c0c09270577c523c9b65b1439c9eb98361fe60e654e5c97e38 not found: ID does not exist" Dec 01 08:46:35 crc kubenswrapper[5004]: I1201 08:46:35.446702 5004 scope.go:117] "RemoveContainer" containerID="1c94751d2c3bae881fc5f4bb9c78986190eb3295e7ce908698c275a6e9b90098" Dec 01 08:46:35 crc kubenswrapper[5004]: E1201 08:46:35.447017 5004 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"1c94751d2c3bae881fc5f4bb9c78986190eb3295e7ce908698c275a6e9b90098\": container with ID starting with 1c94751d2c3bae881fc5f4bb9c78986190eb3295e7ce908698c275a6e9b90098 not found: ID does not exist" containerID="1c94751d2c3bae881fc5f4bb9c78986190eb3295e7ce908698c275a6e9b90098" Dec 01 08:46:35 crc kubenswrapper[5004]: I1201 08:46:35.447050 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c94751d2c3bae881fc5f4bb9c78986190eb3295e7ce908698c275a6e9b90098"} err="failed to get container status \"1c94751d2c3bae881fc5f4bb9c78986190eb3295e7ce908698c275a6e9b90098\": rpc error: code = NotFound desc = could not find container \"1c94751d2c3bae881fc5f4bb9c78986190eb3295e7ce908698c275a6e9b90098\": container with ID starting with 1c94751d2c3bae881fc5f4bb9c78986190eb3295e7ce908698c275a6e9b90098 not found: ID does not exist" Dec 01 08:46:35 crc kubenswrapper[5004]: I1201 08:46:35.495735 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76e7add3-c357-40c8-b77c-dd408f7315d2-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-l7cvw\" (UID: \"76e7add3-c357-40c8-b77c-dd408f7315d2\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-l7cvw" Dec 01 08:46:35 crc kubenswrapper[5004]: I1201 08:46:35.496083 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/76e7add3-c357-40c8-b77c-dd408f7315d2-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-l7cvw\" (UID: \"76e7add3-c357-40c8-b77c-dd408f7315d2\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-l7cvw" Dec 01 08:46:35 crc kubenswrapper[5004]: I1201 08:46:35.496153 5004 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/76e7add3-c357-40c8-b77c-dd408f7315d2-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-l7cvw\" (UID: \"76e7add3-c357-40c8-b77c-dd408f7315d2\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-l7cvw" Dec 01 08:46:35 crc kubenswrapper[5004]: I1201 08:46:35.496280 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snt76\" (UniqueName: \"kubernetes.io/projected/76e7add3-c357-40c8-b77c-dd408f7315d2-kube-api-access-snt76\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-l7cvw\" (UID: \"76e7add3-c357-40c8-b77c-dd408f7315d2\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-l7cvw" Dec 01 08:46:35 crc kubenswrapper[5004]: I1201 08:46:35.599408 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-snt76\" (UniqueName: \"kubernetes.io/projected/76e7add3-c357-40c8-b77c-dd408f7315d2-kube-api-access-snt76\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-l7cvw\" (UID: \"76e7add3-c357-40c8-b77c-dd408f7315d2\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-l7cvw" Dec 01 08:46:35 crc kubenswrapper[5004]: I1201 08:46:35.599620 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76e7add3-c357-40c8-b77c-dd408f7315d2-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-l7cvw\" (UID: \"76e7add3-c357-40c8-b77c-dd408f7315d2\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-l7cvw" Dec 01 08:46:35 crc kubenswrapper[5004]: I1201 08:46:35.599720 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/76e7add3-c357-40c8-b77c-dd408f7315d2-inventory\") pod 
\"bootstrap-edpm-deployment-openstack-edpm-ipam-l7cvw\" (UID: \"76e7add3-c357-40c8-b77c-dd408f7315d2\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-l7cvw" Dec 01 08:46:35 crc kubenswrapper[5004]: I1201 08:46:35.600949 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/76e7add3-c357-40c8-b77c-dd408f7315d2-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-l7cvw\" (UID: \"76e7add3-c357-40c8-b77c-dd408f7315d2\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-l7cvw" Dec 01 08:46:35 crc kubenswrapper[5004]: I1201 08:46:35.605783 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/76e7add3-c357-40c8-b77c-dd408f7315d2-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-l7cvw\" (UID: \"76e7add3-c357-40c8-b77c-dd408f7315d2\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-l7cvw" Dec 01 08:46:35 crc kubenswrapper[5004]: I1201 08:46:35.606063 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76e7add3-c357-40c8-b77c-dd408f7315d2-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-l7cvw\" (UID: \"76e7add3-c357-40c8-b77c-dd408f7315d2\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-l7cvw" Dec 01 08:46:35 crc kubenswrapper[5004]: I1201 08:46:35.609710 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/76e7add3-c357-40c8-b77c-dd408f7315d2-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-l7cvw\" (UID: \"76e7add3-c357-40c8-b77c-dd408f7315d2\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-l7cvw" Dec 01 08:46:35 crc kubenswrapper[5004]: I1201 08:46:35.621866 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-snt76\" (UniqueName: \"kubernetes.io/projected/76e7add3-c357-40c8-b77c-dd408f7315d2-kube-api-access-snt76\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-l7cvw\" (UID: \"76e7add3-c357-40c8-b77c-dd408f7315d2\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-l7cvw" Dec 01 08:46:35 crc kubenswrapper[5004]: I1201 08:46:35.795400 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-l7cvw" Dec 01 08:46:36 crc kubenswrapper[5004]: I1201 08:46:36.404030 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-l7cvw"] Dec 01 08:46:36 crc kubenswrapper[5004]: I1201 08:46:36.778650 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dcf52602-5561-42cc-839c-5de03a1c2df5" path="/var/lib/kubelet/pods/dcf52602-5561-42cc-839c-5de03a1c2df5/volumes" Dec 01 08:46:37 crc kubenswrapper[5004]: I1201 08:46:37.292500 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-l7cvw" event={"ID":"76e7add3-c357-40c8-b77c-dd408f7315d2","Type":"ContainerStarted","Data":"47e1756aeb3354fb8aaa80fc956939adc7fa844b73cc783e51b2d82137e756fb"} Dec 01 08:46:37 crc kubenswrapper[5004]: I1201 08:46:37.758534 5004 scope.go:117] "RemoveContainer" containerID="70f45b3d2a6bb4bf0d87f18bc08b282543e238e1d391a58745914af24ad7956c" Dec 01 08:46:37 crc kubenswrapper[5004]: E1201 08:46:37.758929 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 08:46:38 crc 
kubenswrapper[5004]: I1201 08:46:38.320210 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-l7cvw" event={"ID":"76e7add3-c357-40c8-b77c-dd408f7315d2","Type":"ContainerStarted","Data":"50fe6912d82c944f9060d57de71a4e4144e6d15a3ee36f50b48b260ad62f3669"} Dec 01 08:46:38 crc kubenswrapper[5004]: I1201 08:46:38.348940 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-l7cvw" podStartSLOduration=2.8013499 podStartE2EDuration="3.348911s" podCreationTimestamp="2025-12-01 08:46:35 +0000 UTC" firstStartedPulling="2025-12-01 08:46:36.41990578 +0000 UTC m=+1773.984897802" lastFinishedPulling="2025-12-01 08:46:36.96746691 +0000 UTC m=+1774.532458902" observedRunningTime="2025-12-01 08:46:38.337691725 +0000 UTC m=+1775.902683747" watchObservedRunningTime="2025-12-01 08:46:38.348911 +0000 UTC m=+1775.913903002" Dec 01 08:46:39 crc kubenswrapper[5004]: I1201 08:46:39.335837 5004 generic.go:334] "Generic (PLEG): container finished" podID="254314c7-5d69-4a31-b624-d985125bacee" containerID="c7165fc56293d9f4bc95eb11bffcafa2f0bdacd92c7af0ff7c6c6039eca8db9c" exitCode=0 Dec 01 08:46:39 crc kubenswrapper[5004]: I1201 08:46:39.337414 5004 generic.go:334] "Generic (PLEG): container finished" podID="254314c7-5d69-4a31-b624-d985125bacee" containerID="bfa91a9ee828473fdb3b0b3758debed03550479d13e5f986d09d96bfcaece393" exitCode=0 Dec 01 08:46:39 crc kubenswrapper[5004]: I1201 08:46:39.335914 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"254314c7-5d69-4a31-b624-d985125bacee","Type":"ContainerDied","Data":"c7165fc56293d9f4bc95eb11bffcafa2f0bdacd92c7af0ff7c6c6039eca8db9c"} Dec 01 08:46:39 crc kubenswrapper[5004]: I1201 08:46:39.337638 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" 
event={"ID":"254314c7-5d69-4a31-b624-d985125bacee","Type":"ContainerDied","Data":"bfa91a9ee828473fdb3b0b3758debed03550479d13e5f986d09d96bfcaece393"} Dec 01 08:46:39 crc kubenswrapper[5004]: I1201 08:46:39.337667 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"254314c7-5d69-4a31-b624-d985125bacee","Type":"ContainerDied","Data":"b39670f2d5f8a299b202368dbff731a29c2a6e7c91eba916211742d49d35967d"} Dec 01 08:46:39 crc kubenswrapper[5004]: I1201 08:46:39.337680 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b39670f2d5f8a299b202368dbff731a29c2a6e7c91eba916211742d49d35967d" Dec 01 08:46:39 crc kubenswrapper[5004]: I1201 08:46:39.438486 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Dec 01 08:46:39 crc kubenswrapper[5004]: I1201 08:46:39.519387 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/254314c7-5d69-4a31-b624-d985125bacee-public-tls-certs\") pod \"254314c7-5d69-4a31-b624-d985125bacee\" (UID: \"254314c7-5d69-4a31-b624-d985125bacee\") " Dec 01 08:46:39 crc kubenswrapper[5004]: I1201 08:46:39.519607 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/254314c7-5d69-4a31-b624-d985125bacee-scripts\") pod \"254314c7-5d69-4a31-b624-d985125bacee\" (UID: \"254314c7-5d69-4a31-b624-d985125bacee\") " Dec 01 08:46:39 crc kubenswrapper[5004]: I1201 08:46:39.519689 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/254314c7-5d69-4a31-b624-d985125bacee-combined-ca-bundle\") pod \"254314c7-5d69-4a31-b624-d985125bacee\" (UID: \"254314c7-5d69-4a31-b624-d985125bacee\") " Dec 01 08:46:39 crc kubenswrapper[5004]: I1201 08:46:39.519765 5004 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-dswgk\" (UniqueName: \"kubernetes.io/projected/254314c7-5d69-4a31-b624-d985125bacee-kube-api-access-dswgk\") pod \"254314c7-5d69-4a31-b624-d985125bacee\" (UID: \"254314c7-5d69-4a31-b624-d985125bacee\") " Dec 01 08:46:39 crc kubenswrapper[5004]: I1201 08:46:39.519820 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/254314c7-5d69-4a31-b624-d985125bacee-config-data\") pod \"254314c7-5d69-4a31-b624-d985125bacee\" (UID: \"254314c7-5d69-4a31-b624-d985125bacee\") " Dec 01 08:46:39 crc kubenswrapper[5004]: I1201 08:46:39.519886 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/254314c7-5d69-4a31-b624-d985125bacee-internal-tls-certs\") pod \"254314c7-5d69-4a31-b624-d985125bacee\" (UID: \"254314c7-5d69-4a31-b624-d985125bacee\") " Dec 01 08:46:39 crc kubenswrapper[5004]: I1201 08:46:39.553015 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/254314c7-5d69-4a31-b624-d985125bacee-scripts" (OuterVolumeSpecName: "scripts") pod "254314c7-5d69-4a31-b624-d985125bacee" (UID: "254314c7-5d69-4a31-b624-d985125bacee"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:46:39 crc kubenswrapper[5004]: I1201 08:46:39.561492 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/254314c7-5d69-4a31-b624-d985125bacee-kube-api-access-dswgk" (OuterVolumeSpecName: "kube-api-access-dswgk") pod "254314c7-5d69-4a31-b624-d985125bacee" (UID: "254314c7-5d69-4a31-b624-d985125bacee"). InnerVolumeSpecName "kube-api-access-dswgk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:46:39 crc kubenswrapper[5004]: I1201 08:46:39.610058 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/254314c7-5d69-4a31-b624-d985125bacee-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "254314c7-5d69-4a31-b624-d985125bacee" (UID: "254314c7-5d69-4a31-b624-d985125bacee"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:46:39 crc kubenswrapper[5004]: I1201 08:46:39.625655 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/254314c7-5d69-4a31-b624-d985125bacee-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "254314c7-5d69-4a31-b624-d985125bacee" (UID: "254314c7-5d69-4a31-b624-d985125bacee"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:46:39 crc kubenswrapper[5004]: I1201 08:46:39.628031 5004 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/254314c7-5d69-4a31-b624-d985125bacee-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 01 08:46:39 crc kubenswrapper[5004]: I1201 08:46:39.628314 5004 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/254314c7-5d69-4a31-b624-d985125bacee-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 01 08:46:39 crc kubenswrapper[5004]: I1201 08:46:39.628545 5004 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/254314c7-5d69-4a31-b624-d985125bacee-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 08:46:39 crc kubenswrapper[5004]: I1201 08:46:39.628712 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dswgk\" (UniqueName: \"kubernetes.io/projected/254314c7-5d69-4a31-b624-d985125bacee-kube-api-access-dswgk\") on node 
\"crc\" DevicePath \"\"" Dec 01 08:46:39 crc kubenswrapper[5004]: I1201 08:46:39.679100 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/254314c7-5d69-4a31-b624-d985125bacee-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "254314c7-5d69-4a31-b624-d985125bacee" (UID: "254314c7-5d69-4a31-b624-d985125bacee"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:46:39 crc kubenswrapper[5004]: I1201 08:46:39.701494 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/254314c7-5d69-4a31-b624-d985125bacee-config-data" (OuterVolumeSpecName: "config-data") pod "254314c7-5d69-4a31-b624-d985125bacee" (UID: "254314c7-5d69-4a31-b624-d985125bacee"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:46:39 crc kubenswrapper[5004]: I1201 08:46:39.730546 5004 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/254314c7-5d69-4a31-b624-d985125bacee-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 08:46:39 crc kubenswrapper[5004]: I1201 08:46:39.730592 5004 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/254314c7-5d69-4a31-b624-d985125bacee-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 08:46:40 crc kubenswrapper[5004]: I1201 08:46:40.350015 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Dec 01 08:46:40 crc kubenswrapper[5004]: I1201 08:46:40.402325 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Dec 01 08:46:40 crc kubenswrapper[5004]: I1201 08:46:40.419221 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-0"] Dec 01 08:46:40 crc kubenswrapper[5004]: I1201 08:46:40.431697 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Dec 01 08:46:40 crc kubenswrapper[5004]: E1201 08:46:40.432272 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="254314c7-5d69-4a31-b624-d985125bacee" containerName="aodh-listener" Dec 01 08:46:40 crc kubenswrapper[5004]: I1201 08:46:40.432291 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="254314c7-5d69-4a31-b624-d985125bacee" containerName="aodh-listener" Dec 01 08:46:40 crc kubenswrapper[5004]: E1201 08:46:40.432316 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="254314c7-5d69-4a31-b624-d985125bacee" containerName="aodh-notifier" Dec 01 08:46:40 crc kubenswrapper[5004]: I1201 08:46:40.432322 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="254314c7-5d69-4a31-b624-d985125bacee" containerName="aodh-notifier" Dec 01 08:46:40 crc kubenswrapper[5004]: E1201 08:46:40.432355 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="254314c7-5d69-4a31-b624-d985125bacee" containerName="aodh-api" Dec 01 08:46:40 crc kubenswrapper[5004]: I1201 08:46:40.432362 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="254314c7-5d69-4a31-b624-d985125bacee" containerName="aodh-api" Dec 01 08:46:40 crc kubenswrapper[5004]: E1201 08:46:40.432372 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="254314c7-5d69-4a31-b624-d985125bacee" containerName="aodh-evaluator" Dec 01 08:46:40 crc kubenswrapper[5004]: I1201 08:46:40.432379 5004 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="254314c7-5d69-4a31-b624-d985125bacee" containerName="aodh-evaluator" Dec 01 08:46:40 crc kubenswrapper[5004]: I1201 08:46:40.432627 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="254314c7-5d69-4a31-b624-d985125bacee" containerName="aodh-notifier" Dec 01 08:46:40 crc kubenswrapper[5004]: I1201 08:46:40.432639 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="254314c7-5d69-4a31-b624-d985125bacee" containerName="aodh-evaluator" Dec 01 08:46:40 crc kubenswrapper[5004]: I1201 08:46:40.432653 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="254314c7-5d69-4a31-b624-d985125bacee" containerName="aodh-listener" Dec 01 08:46:40 crc kubenswrapper[5004]: I1201 08:46:40.432665 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="254314c7-5d69-4a31-b624-d985125bacee" containerName="aodh-api" Dec 01 08:46:40 crc kubenswrapper[5004]: I1201 08:46:40.434799 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Dec 01 08:46:40 crc kubenswrapper[5004]: I1201 08:46:40.439696 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Dec 01 08:46:40 crc kubenswrapper[5004]: I1201 08:46:40.439858 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-internal-svc" Dec 01 08:46:40 crc kubenswrapper[5004]: I1201 08:46:40.440736 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-public-svc" Dec 01 08:46:40 crc kubenswrapper[5004]: I1201 08:46:40.441122 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Dec 01 08:46:40 crc kubenswrapper[5004]: I1201 08:46:40.453938 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-hrc7d" Dec 01 08:46:40 crc kubenswrapper[5004]: I1201 08:46:40.454483 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/aodh-0"] Dec 01 08:46:40 crc kubenswrapper[5004]: I1201 08:46:40.554013 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2920a19e-815a-4972-83c6-7c85f961f88a-public-tls-certs\") pod \"aodh-0\" (UID: \"2920a19e-815a-4972-83c6-7c85f961f88a\") " pod="openstack/aodh-0" Dec 01 08:46:40 crc kubenswrapper[5004]: I1201 08:46:40.554101 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2920a19e-815a-4972-83c6-7c85f961f88a-combined-ca-bundle\") pod \"aodh-0\" (UID: \"2920a19e-815a-4972-83c6-7c85f961f88a\") " pod="openstack/aodh-0" Dec 01 08:46:40 crc kubenswrapper[5004]: I1201 08:46:40.554172 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2920a19e-815a-4972-83c6-7c85f961f88a-internal-tls-certs\") pod \"aodh-0\" (UID: \"2920a19e-815a-4972-83c6-7c85f961f88a\") " pod="openstack/aodh-0" Dec 01 08:46:40 crc kubenswrapper[5004]: I1201 08:46:40.554238 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2920a19e-815a-4972-83c6-7c85f961f88a-config-data\") pod \"aodh-0\" (UID: \"2920a19e-815a-4972-83c6-7c85f961f88a\") " pod="openstack/aodh-0" Dec 01 08:46:40 crc kubenswrapper[5004]: I1201 08:46:40.554434 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcsld\" (UniqueName: \"kubernetes.io/projected/2920a19e-815a-4972-83c6-7c85f961f88a-kube-api-access-jcsld\") pod \"aodh-0\" (UID: \"2920a19e-815a-4972-83c6-7c85f961f88a\") " pod="openstack/aodh-0" Dec 01 08:46:40 crc kubenswrapper[5004]: I1201 08:46:40.554475 5004 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2920a19e-815a-4972-83c6-7c85f961f88a-scripts\") pod \"aodh-0\" (UID: \"2920a19e-815a-4972-83c6-7c85f961f88a\") " pod="openstack/aodh-0" Dec 01 08:46:40 crc kubenswrapper[5004]: I1201 08:46:40.658851 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2920a19e-815a-4972-83c6-7c85f961f88a-public-tls-certs\") pod \"aodh-0\" (UID: \"2920a19e-815a-4972-83c6-7c85f961f88a\") " pod="openstack/aodh-0" Dec 01 08:46:40 crc kubenswrapper[5004]: I1201 08:46:40.658941 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2920a19e-815a-4972-83c6-7c85f961f88a-combined-ca-bundle\") pod \"aodh-0\" (UID: \"2920a19e-815a-4972-83c6-7c85f961f88a\") " pod="openstack/aodh-0" Dec 01 08:46:40 crc kubenswrapper[5004]: I1201 08:46:40.658966 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2920a19e-815a-4972-83c6-7c85f961f88a-internal-tls-certs\") pod \"aodh-0\" (UID: \"2920a19e-815a-4972-83c6-7c85f961f88a\") " pod="openstack/aodh-0" Dec 01 08:46:40 crc kubenswrapper[5004]: I1201 08:46:40.659058 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2920a19e-815a-4972-83c6-7c85f961f88a-config-data\") pod \"aodh-0\" (UID: \"2920a19e-815a-4972-83c6-7c85f961f88a\") " pod="openstack/aodh-0" Dec 01 08:46:40 crc kubenswrapper[5004]: I1201 08:46:40.659218 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jcsld\" (UniqueName: \"kubernetes.io/projected/2920a19e-815a-4972-83c6-7c85f961f88a-kube-api-access-jcsld\") pod \"aodh-0\" (UID: \"2920a19e-815a-4972-83c6-7c85f961f88a\") " pod="openstack/aodh-0" Dec 
01 08:46:40 crc kubenswrapper[5004]: I1201 08:46:40.659240 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2920a19e-815a-4972-83c6-7c85f961f88a-scripts\") pod \"aodh-0\" (UID: \"2920a19e-815a-4972-83c6-7c85f961f88a\") " pod="openstack/aodh-0" Dec 01 08:46:40 crc kubenswrapper[5004]: I1201 08:46:40.665993 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2920a19e-815a-4972-83c6-7c85f961f88a-internal-tls-certs\") pod \"aodh-0\" (UID: \"2920a19e-815a-4972-83c6-7c85f961f88a\") " pod="openstack/aodh-0" Dec 01 08:46:40 crc kubenswrapper[5004]: I1201 08:46:40.667418 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2920a19e-815a-4972-83c6-7c85f961f88a-config-data\") pod \"aodh-0\" (UID: \"2920a19e-815a-4972-83c6-7c85f961f88a\") " pod="openstack/aodh-0" Dec 01 08:46:40 crc kubenswrapper[5004]: I1201 08:46:40.669053 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2920a19e-815a-4972-83c6-7c85f961f88a-public-tls-certs\") pod \"aodh-0\" (UID: \"2920a19e-815a-4972-83c6-7c85f961f88a\") " pod="openstack/aodh-0" Dec 01 08:46:40 crc kubenswrapper[5004]: I1201 08:46:40.674236 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2920a19e-815a-4972-83c6-7c85f961f88a-combined-ca-bundle\") pod \"aodh-0\" (UID: \"2920a19e-815a-4972-83c6-7c85f961f88a\") " pod="openstack/aodh-0" Dec 01 08:46:40 crc kubenswrapper[5004]: I1201 08:46:40.678253 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2920a19e-815a-4972-83c6-7c85f961f88a-scripts\") pod \"aodh-0\" (UID: \"2920a19e-815a-4972-83c6-7c85f961f88a\") " pod="openstack/aodh-0" Dec 
01 08:46:40 crc kubenswrapper[5004]: I1201 08:46:40.683187 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcsld\" (UniqueName: \"kubernetes.io/projected/2920a19e-815a-4972-83c6-7c85f961f88a-kube-api-access-jcsld\") pod \"aodh-0\" (UID: \"2920a19e-815a-4972-83c6-7c85f961f88a\") " pod="openstack/aodh-0" Dec 01 08:46:40 crc kubenswrapper[5004]: I1201 08:46:40.771604 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="254314c7-5d69-4a31-b624-d985125bacee" path="/var/lib/kubelet/pods/254314c7-5d69-4a31-b624-d985125bacee/volumes" Dec 01 08:46:40 crc kubenswrapper[5004]: I1201 08:46:40.772396 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Dec 01 08:46:41 crc kubenswrapper[5004]: I1201 08:46:41.418062 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Dec 01 08:46:41 crc kubenswrapper[5004]: W1201 08:46:41.432776 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2920a19e_815a_4972_83c6_7c85f961f88a.slice/crio-404f94f0d5afc3ad6088541b2a63d0364af51bce13ce73be778b0e926d0a6c2e WatchSource:0}: Error finding container 404f94f0d5afc3ad6088541b2a63d0364af51bce13ce73be778b0e926d0a6c2e: Status 404 returned error can't find the container with id 404f94f0d5afc3ad6088541b2a63d0364af51bce13ce73be778b0e926d0a6c2e Dec 01 08:46:42 crc kubenswrapper[5004]: I1201 08:46:42.381256 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"2920a19e-815a-4972-83c6-7c85f961f88a","Type":"ContainerStarted","Data":"47e190111ec6de2c09a4ac5493cdaefa87668fc60cda1a99ac1f5d6837365b4b"} Dec 01 08:46:42 crc kubenswrapper[5004]: I1201 08:46:42.381951 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" 
event={"ID":"2920a19e-815a-4972-83c6-7c85f961f88a","Type":"ContainerStarted","Data":"404f94f0d5afc3ad6088541b2a63d0364af51bce13ce73be778b0e926d0a6c2e"} Dec 01 08:46:43 crc kubenswrapper[5004]: I1201 08:46:43.393963 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"2920a19e-815a-4972-83c6-7c85f961f88a","Type":"ContainerStarted","Data":"836ce2b7691fc2daf9521f9e06b1b1fde986d2c2f7fb66e74b6b15e8530402f3"} Dec 01 08:46:45 crc kubenswrapper[5004]: I1201 08:46:45.447395 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"2920a19e-815a-4972-83c6-7c85f961f88a","Type":"ContainerStarted","Data":"4828c9c6a42bc248a1a5b86470f9b785b5fd96838701497fab6e1c371648e7fb"} Dec 01 08:46:46 crc kubenswrapper[5004]: I1201 08:46:46.473034 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"2920a19e-815a-4972-83c6-7c85f961f88a","Type":"ContainerStarted","Data":"d861cc4da9fb0f70001bd0a326283285bf339f029e3bc62dc0e8d031fd9a14f2"} Dec 01 08:46:46 crc kubenswrapper[5004]: I1201 08:46:46.506509 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=2.453524184 podStartE2EDuration="6.506489608s" podCreationTimestamp="2025-12-01 08:46:40 +0000 UTC" firstStartedPulling="2025-12-01 08:46:41.437151774 +0000 UTC m=+1779.002143756" lastFinishedPulling="2025-12-01 08:46:45.490117198 +0000 UTC m=+1783.055109180" observedRunningTime="2025-12-01 08:46:46.503261591 +0000 UTC m=+1784.068253573" watchObservedRunningTime="2025-12-01 08:46:46.506489608 +0000 UTC m=+1784.071481600" Dec 01 08:46:49 crc kubenswrapper[5004]: I1201 08:46:49.759256 5004 scope.go:117] "RemoveContainer" containerID="70f45b3d2a6bb4bf0d87f18bc08b282543e238e1d391a58745914af24ad7956c" Dec 01 08:46:49 crc kubenswrapper[5004]: E1201 08:46:49.760261 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 08:47:02 crc kubenswrapper[5004]: I1201 08:47:02.772760 5004 scope.go:117] "RemoveContainer" containerID="70f45b3d2a6bb4bf0d87f18bc08b282543e238e1d391a58745914af24ad7956c" Dec 01 08:47:02 crc kubenswrapper[5004]: E1201 08:47:02.773818 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 08:47:12 crc kubenswrapper[5004]: I1201 08:47:12.911343 5004 scope.go:117] "RemoveContainer" containerID="2bf02db439a0f6bf3e43b98b9fc9fd95ebabfb103c9374b5210216103eed3ec7" Dec 01 08:47:12 crc kubenswrapper[5004]: I1201 08:47:12.936917 5004 scope.go:117] "RemoveContainer" containerID="a88b56ebeae335b6b5f78e4d59b23d3201cc5bcb8179f86ff401aa2015390c91" Dec 01 08:47:12 crc kubenswrapper[5004]: I1201 08:47:12.957931 5004 scope.go:117] "RemoveContainer" containerID="84bc7259ce202f981795b79e6494280f52d4b5a4b6161309b9ecbae984c2aca9" Dec 01 08:47:14 crc kubenswrapper[5004]: I1201 08:47:14.762095 5004 scope.go:117] "RemoveContainer" containerID="70f45b3d2a6bb4bf0d87f18bc08b282543e238e1d391a58745914af24ad7956c" Dec 01 08:47:14 crc kubenswrapper[5004]: E1201 08:47:14.763529 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 08:47:29 crc kubenswrapper[5004]: I1201 08:47:29.758667 5004 scope.go:117] "RemoveContainer" containerID="70f45b3d2a6bb4bf0d87f18bc08b282543e238e1d391a58745914af24ad7956c" Dec 01 08:47:29 crc kubenswrapper[5004]: E1201 08:47:29.759396 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 08:47:43 crc kubenswrapper[5004]: I1201 08:47:43.759420 5004 scope.go:117] "RemoveContainer" containerID="70f45b3d2a6bb4bf0d87f18bc08b282543e238e1d391a58745914af24ad7956c" Dec 01 08:47:43 crc kubenswrapper[5004]: E1201 08:47:43.760401 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 08:47:58 crc kubenswrapper[5004]: I1201 08:47:58.759665 5004 scope.go:117] "RemoveContainer" containerID="70f45b3d2a6bb4bf0d87f18bc08b282543e238e1d391a58745914af24ad7956c" Dec 01 08:47:58 crc kubenswrapper[5004]: E1201 08:47:58.760381 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 08:48:12 crc kubenswrapper[5004]: I1201 08:48:12.776048 5004 scope.go:117] "RemoveContainer" containerID="70f45b3d2a6bb4bf0d87f18bc08b282543e238e1d391a58745914af24ad7956c" Dec 01 08:48:12 crc kubenswrapper[5004]: E1201 08:48:12.777731 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 08:48:13 crc kubenswrapper[5004]: I1201 08:48:13.066060 5004 scope.go:117] "RemoveContainer" containerID="a7674e31b785c47a0c5f335ab681e055f6c081ef07eb98adb40bb183d9b0987b" Dec 01 08:48:13 crc kubenswrapper[5004]: I1201 08:48:13.101094 5004 scope.go:117] "RemoveContainer" containerID="a79f9e7e64a47dd07136cf47c21fcdd10ecfc38370970bd9af1f1ff18a1ab176" Dec 01 08:48:25 crc kubenswrapper[5004]: I1201 08:48:25.759541 5004 scope.go:117] "RemoveContainer" containerID="70f45b3d2a6bb4bf0d87f18bc08b282543e238e1d391a58745914af24ad7956c" Dec 01 08:48:25 crc kubenswrapper[5004]: E1201 08:48:25.760602 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 08:48:38 crc kubenswrapper[5004]: 
I1201 08:48:38.759119 5004 scope.go:117] "RemoveContainer" containerID="70f45b3d2a6bb4bf0d87f18bc08b282543e238e1d391a58745914af24ad7956c" Dec 01 08:48:38 crc kubenswrapper[5004]: E1201 08:48:38.760305 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 08:48:50 crc kubenswrapper[5004]: I1201 08:48:50.760545 5004 scope.go:117] "RemoveContainer" containerID="70f45b3d2a6bb4bf0d87f18bc08b282543e238e1d391a58745914af24ad7956c" Dec 01 08:48:50 crc kubenswrapper[5004]: E1201 08:48:50.761896 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 08:49:03 crc kubenswrapper[5004]: I1201 08:49:03.759606 5004 scope.go:117] "RemoveContainer" containerID="70f45b3d2a6bb4bf0d87f18bc08b282543e238e1d391a58745914af24ad7956c" Dec 01 08:49:03 crc kubenswrapper[5004]: E1201 08:49:03.760719 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 08:49:04 crc 
kubenswrapper[5004]: I1201 08:49:04.059444 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-3f6b-account-create-update-2gfdj"] Dec 01 08:49:04 crc kubenswrapper[5004]: I1201 08:49:04.079775 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-d5gjb"] Dec 01 08:49:04 crc kubenswrapper[5004]: I1201 08:49:04.091681 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-3f6b-account-create-update-2gfdj"] Dec 01 08:49:04 crc kubenswrapper[5004]: I1201 08:49:04.104081 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-d5gjb"] Dec 01 08:49:04 crc kubenswrapper[5004]: I1201 08:49:04.781542 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f70e608-5f44-45d3-9c98-d22ed21cf952" path="/var/lib/kubelet/pods/9f70e608-5f44-45d3-9c98-d22ed21cf952/volumes" Dec 01 08:49:04 crc kubenswrapper[5004]: I1201 08:49:04.784217 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b61fea80-031c-4ee8-b99f-562a8bb879eb" path="/var/lib/kubelet/pods/b61fea80-031c-4ee8-b99f-562a8bb879eb/volumes" Dec 01 08:49:10 crc kubenswrapper[5004]: I1201 08:49:10.031228 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-c26nm"] Dec 01 08:49:10 crc kubenswrapper[5004]: I1201 08:49:10.050008 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-2cda-account-create-update-gzdd5"] Dec 01 08:49:10 crc kubenswrapper[5004]: I1201 08:49:10.062273 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-c26nm"] Dec 01 08:49:10 crc kubenswrapper[5004]: I1201 08:49:10.072401 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-2cda-account-create-update-gzdd5"] Dec 01 08:49:10 crc kubenswrapper[5004]: I1201 08:49:10.783201 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="803e9984-2114-48e5-8d49-4536bd3cc6ef" path="/var/lib/kubelet/pods/803e9984-2114-48e5-8d49-4536bd3cc6ef/volumes" Dec 01 08:49:10 crc kubenswrapper[5004]: I1201 08:49:10.784997 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd186cbb-dea0-4dd5-9328-935a4041a137" path="/var/lib/kubelet/pods/cd186cbb-dea0-4dd5-9328-935a4041a137/volumes" Dec 01 08:49:13 crc kubenswrapper[5004]: I1201 08:49:13.181252 5004 scope.go:117] "RemoveContainer" containerID="f23a85d1a241230784e372b2b68e777a885643c89da240dbda07441ec0857e39" Dec 01 08:49:13 crc kubenswrapper[5004]: I1201 08:49:13.243482 5004 scope.go:117] "RemoveContainer" containerID="d8a63febb8a8556f14ad5f947b65573a3ffe986bedf429596c00b97ca667ffca" Dec 01 08:49:13 crc kubenswrapper[5004]: I1201 08:49:13.295682 5004 scope.go:117] "RemoveContainer" containerID="a410c8306ba4e6c1633fed1ba85921a2fdf9711c53f980b87e3693d9b9adb70a" Dec 01 08:49:13 crc kubenswrapper[5004]: I1201 08:49:13.332731 5004 scope.go:117] "RemoveContainer" containerID="6827a7e54d43df41de9a37d9daa31f6e85299558d37f5eff5f15ec59ef888bc7" Dec 01 08:49:13 crc kubenswrapper[5004]: I1201 08:49:13.384852 5004 scope.go:117] "RemoveContainer" containerID="13338ddb68fdaed9d4d35936b6e07985146beb7009af833b95f044b50a8f906c" Dec 01 08:49:14 crc kubenswrapper[5004]: I1201 08:49:14.057201 5004 scope.go:117] "RemoveContainer" containerID="b49bfa958360807f66ead1b9a529493cd21242e17329f00d236357a6bfa29197" Dec 01 08:49:14 crc kubenswrapper[5004]: I1201 08:49:14.110016 5004 scope.go:117] "RemoveContainer" containerID="aa15d35533926d66b511361f8c464ae977f0281b05407483211edc0c7ab555d9" Dec 01 08:49:14 crc kubenswrapper[5004]: I1201 08:49:14.153503 5004 scope.go:117] "RemoveContainer" containerID="639a91cddffd8c36fa3b41966ba1065c7bd09ff518d776caad8ffe332c1198df" Dec 01 08:49:16 crc kubenswrapper[5004]: I1201 08:49:16.047760 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-d3bc-account-create-update-d2dfq"] Dec 01 
08:49:16 crc kubenswrapper[5004]: I1201 08:49:16.065321 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-d3bc-account-create-update-d2dfq"] Dec 01 08:49:16 crc kubenswrapper[5004]: I1201 08:49:16.076942 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-dwhqn"] Dec 01 08:49:16 crc kubenswrapper[5004]: I1201 08:49:16.111943 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-88qsh"] Dec 01 08:49:16 crc kubenswrapper[5004]: I1201 08:49:16.128138 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-b5b4-account-create-update-4c6cr"] Dec 01 08:49:16 crc kubenswrapper[5004]: I1201 08:49:16.135433 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-dwhqn"] Dec 01 08:49:16 crc kubenswrapper[5004]: I1201 08:49:16.147156 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-b5b4-account-create-update-4c6cr"] Dec 01 08:49:16 crc kubenswrapper[5004]: I1201 08:49:16.158517 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-88qsh"] Dec 01 08:49:16 crc kubenswrapper[5004]: I1201 08:49:16.788172 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5e4ccfc-1b99-4ee0-ab46-8e3e53669634" path="/var/lib/kubelet/pods/a5e4ccfc-1b99-4ee0-ab46-8e3e53669634/volumes" Dec 01 08:49:16 crc kubenswrapper[5004]: I1201 08:49:16.790865 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0817a2b-0491-4c64-a2d8-0c03a938dd4a" path="/var/lib/kubelet/pods/c0817a2b-0491-4c64-a2d8-0c03a938dd4a/volumes" Dec 01 08:49:16 crc kubenswrapper[5004]: I1201 08:49:16.792106 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf702a13-23f3-471e-b896-03b6e58d429d" path="/var/lib/kubelet/pods/cf702a13-23f3-471e-b896-03b6e58d429d/volumes" Dec 01 08:49:16 crc kubenswrapper[5004]: I1201 08:49:16.793258 5004 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="f716a201-3ee3-486c-ba54-785c9e603805" path="/var/lib/kubelet/pods/f716a201-3ee3-486c-ba54-785c9e603805/volumes" Dec 01 08:49:17 crc kubenswrapper[5004]: I1201 08:49:17.759056 5004 scope.go:117] "RemoveContainer" containerID="70f45b3d2a6bb4bf0d87f18bc08b282543e238e1d391a58745914af24ad7956c" Dec 01 08:49:17 crc kubenswrapper[5004]: E1201 08:49:17.759462 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 08:49:19 crc kubenswrapper[5004]: I1201 08:49:19.046649 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-dvjqn"] Dec 01 08:49:19 crc kubenswrapper[5004]: I1201 08:49:19.062532 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-bd92-account-create-update-zhnsd"] Dec 01 08:49:19 crc kubenswrapper[5004]: I1201 08:49:19.072624 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-dvjqn"] Dec 01 08:49:19 crc kubenswrapper[5004]: I1201 08:49:19.082101 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-bd92-account-create-update-zhnsd"] Dec 01 08:49:20 crc kubenswrapper[5004]: I1201 08:49:20.774679 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d4469a9-2f5c-4a4f-a6cf-aa8d58aa8da4" path="/var/lib/kubelet/pods/1d4469a9-2f5c-4a4f-a6cf-aa8d58aa8da4/volumes" Dec 01 08:49:20 crc kubenswrapper[5004]: I1201 08:49:20.776278 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5facd201-c917-4a20-86f7-1a95b6604bd1" 
path="/var/lib/kubelet/pods/5facd201-c917-4a20-86f7-1a95b6604bd1/volumes" Dec 01 08:49:29 crc kubenswrapper[5004]: I1201 08:49:29.758985 5004 scope.go:117] "RemoveContainer" containerID="70f45b3d2a6bb4bf0d87f18bc08b282543e238e1d391a58745914af24ad7956c" Dec 01 08:49:29 crc kubenswrapper[5004]: E1201 08:49:29.759713 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 08:49:40 crc kubenswrapper[5004]: I1201 08:49:40.760885 5004 scope.go:117] "RemoveContainer" containerID="70f45b3d2a6bb4bf0d87f18bc08b282543e238e1d391a58745914af24ad7956c" Dec 01 08:49:41 crc kubenswrapper[5004]: I1201 08:49:41.932399 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" event={"ID":"f9977ebb-82de-4e96-8763-0b5a84f8d4ce","Type":"ContainerStarted","Data":"c25705b2a5c3bf1cd77f5014f3a7b5fe92ef589d53ece0784883141577010bcc"} Dec 01 08:49:45 crc kubenswrapper[5004]: I1201 08:49:45.992250 5004 generic.go:334] "Generic (PLEG): container finished" podID="76e7add3-c357-40c8-b77c-dd408f7315d2" containerID="50fe6912d82c944f9060d57de71a4e4144e6d15a3ee36f50b48b260ad62f3669" exitCode=0 Dec 01 08:49:45 crc kubenswrapper[5004]: I1201 08:49:45.992386 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-l7cvw" event={"ID":"76e7add3-c357-40c8-b77c-dd408f7315d2","Type":"ContainerDied","Data":"50fe6912d82c944f9060d57de71a4e4144e6d15a3ee36f50b48b260ad62f3669"} Dec 01 08:49:47 crc kubenswrapper[5004]: I1201 08:49:47.557811 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-l7cvw" Dec 01 08:49:47 crc kubenswrapper[5004]: I1201 08:49:47.625438 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/76e7add3-c357-40c8-b77c-dd408f7315d2-inventory\") pod \"76e7add3-c357-40c8-b77c-dd408f7315d2\" (UID: \"76e7add3-c357-40c8-b77c-dd408f7315d2\") " Dec 01 08:49:47 crc kubenswrapper[5004]: I1201 08:49:47.625501 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76e7add3-c357-40c8-b77c-dd408f7315d2-bootstrap-combined-ca-bundle\") pod \"76e7add3-c357-40c8-b77c-dd408f7315d2\" (UID: \"76e7add3-c357-40c8-b77c-dd408f7315d2\") " Dec 01 08:49:47 crc kubenswrapper[5004]: I1201 08:49:47.625584 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-snt76\" (UniqueName: \"kubernetes.io/projected/76e7add3-c357-40c8-b77c-dd408f7315d2-kube-api-access-snt76\") pod \"76e7add3-c357-40c8-b77c-dd408f7315d2\" (UID: \"76e7add3-c357-40c8-b77c-dd408f7315d2\") " Dec 01 08:49:47 crc kubenswrapper[5004]: I1201 08:49:47.626044 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/76e7add3-c357-40c8-b77c-dd408f7315d2-ssh-key\") pod \"76e7add3-c357-40c8-b77c-dd408f7315d2\" (UID: \"76e7add3-c357-40c8-b77c-dd408f7315d2\") " Dec 01 08:49:47 crc kubenswrapper[5004]: I1201 08:49:47.634964 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76e7add3-c357-40c8-b77c-dd408f7315d2-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "76e7add3-c357-40c8-b77c-dd408f7315d2" (UID: "76e7add3-c357-40c8-b77c-dd408f7315d2"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:49:47 crc kubenswrapper[5004]: I1201 08:49:47.638796 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76e7add3-c357-40c8-b77c-dd408f7315d2-kube-api-access-snt76" (OuterVolumeSpecName: "kube-api-access-snt76") pod "76e7add3-c357-40c8-b77c-dd408f7315d2" (UID: "76e7add3-c357-40c8-b77c-dd408f7315d2"). InnerVolumeSpecName "kube-api-access-snt76". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:49:47 crc kubenswrapper[5004]: I1201 08:49:47.677028 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76e7add3-c357-40c8-b77c-dd408f7315d2-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "76e7add3-c357-40c8-b77c-dd408f7315d2" (UID: "76e7add3-c357-40c8-b77c-dd408f7315d2"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:49:47 crc kubenswrapper[5004]: I1201 08:49:47.679295 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76e7add3-c357-40c8-b77c-dd408f7315d2-inventory" (OuterVolumeSpecName: "inventory") pod "76e7add3-c357-40c8-b77c-dd408f7315d2" (UID: "76e7add3-c357-40c8-b77c-dd408f7315d2"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:49:47 crc kubenswrapper[5004]: I1201 08:49:47.727991 5004 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/76e7add3-c357-40c8-b77c-dd408f7315d2-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 08:49:47 crc kubenswrapper[5004]: I1201 08:49:47.728028 5004 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/76e7add3-c357-40c8-b77c-dd408f7315d2-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 08:49:47 crc kubenswrapper[5004]: I1201 08:49:47.728041 5004 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76e7add3-c357-40c8-b77c-dd408f7315d2-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 08:49:47 crc kubenswrapper[5004]: I1201 08:49:47.728054 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-snt76\" (UniqueName: \"kubernetes.io/projected/76e7add3-c357-40c8-b77c-dd408f7315d2-kube-api-access-snt76\") on node \"crc\" DevicePath \"\"" Dec 01 08:49:48 crc kubenswrapper[5004]: I1201 08:49:48.018684 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-l7cvw" event={"ID":"76e7add3-c357-40c8-b77c-dd408f7315d2","Type":"ContainerDied","Data":"47e1756aeb3354fb8aaa80fc956939adc7fa844b73cc783e51b2d82137e756fb"} Dec 01 08:49:48 crc kubenswrapper[5004]: I1201 08:49:48.019017 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="47e1756aeb3354fb8aaa80fc956939adc7fa844b73cc783e51b2d82137e756fb" Dec 01 08:49:48 crc kubenswrapper[5004]: I1201 08:49:48.018758 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-l7cvw"
Dec 01 08:49:48 crc kubenswrapper[5004]: I1201 08:49:48.178688 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6frqs"]
Dec 01 08:49:48 crc kubenswrapper[5004]: E1201 08:49:48.179311 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76e7add3-c357-40c8-b77c-dd408f7315d2" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam"
Dec 01 08:49:48 crc kubenswrapper[5004]: I1201 08:49:48.179337 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="76e7add3-c357-40c8-b77c-dd408f7315d2" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam"
Dec 01 08:49:48 crc kubenswrapper[5004]: I1201 08:49:48.179676 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="76e7add3-c357-40c8-b77c-dd408f7315d2" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam"
Dec 01 08:49:48 crc kubenswrapper[5004]: I1201 08:49:48.180692 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6frqs"
Dec 01 08:49:48 crc kubenswrapper[5004]: I1201 08:49:48.188204 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Dec 01 08:49:48 crc kubenswrapper[5004]: I1201 08:49:48.188438 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Dec 01 08:49:48 crc kubenswrapper[5004]: I1201 08:49:48.188576 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Dec 01 08:49:48 crc kubenswrapper[5004]: I1201 08:49:48.188938 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-pdnrq"
Dec 01 08:49:48 crc kubenswrapper[5004]: I1201 08:49:48.225691 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6frqs"]
Dec 01 08:49:48 crc kubenswrapper[5004]: I1201 08:49:48.250130 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbw6j\" (UniqueName: \"kubernetes.io/projected/1d1fef94-adb6-4276-8448-af6b16e5d9ff-kube-api-access-sbw6j\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-6frqs\" (UID: \"1d1fef94-adb6-4276-8448-af6b16e5d9ff\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6frqs"
Dec 01 08:49:48 crc kubenswrapper[5004]: I1201 08:49:48.250302 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1d1fef94-adb6-4276-8448-af6b16e5d9ff-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-6frqs\" (UID: \"1d1fef94-adb6-4276-8448-af6b16e5d9ff\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6frqs"
Dec 01 08:49:48 crc kubenswrapper[5004]: I1201 08:49:48.250818 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1d1fef94-adb6-4276-8448-af6b16e5d9ff-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-6frqs\" (UID: \"1d1fef94-adb6-4276-8448-af6b16e5d9ff\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6frqs"
Dec 01 08:49:48 crc kubenswrapper[5004]: I1201 08:49:48.353840 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbw6j\" (UniqueName: \"kubernetes.io/projected/1d1fef94-adb6-4276-8448-af6b16e5d9ff-kube-api-access-sbw6j\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-6frqs\" (UID: \"1d1fef94-adb6-4276-8448-af6b16e5d9ff\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6frqs"
Dec 01 08:49:48 crc kubenswrapper[5004]: I1201 08:49:48.353912 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1d1fef94-adb6-4276-8448-af6b16e5d9ff-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-6frqs\" (UID: \"1d1fef94-adb6-4276-8448-af6b16e5d9ff\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6frqs"
Dec 01 08:49:48 crc kubenswrapper[5004]: I1201 08:49:48.353998 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1d1fef94-adb6-4276-8448-af6b16e5d9ff-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-6frqs\" (UID: \"1d1fef94-adb6-4276-8448-af6b16e5d9ff\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6frqs"
Dec 01 08:49:48 crc kubenswrapper[5004]: I1201 08:49:48.359012 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1d1fef94-adb6-4276-8448-af6b16e5d9ff-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-6frqs\" (UID: \"1d1fef94-adb6-4276-8448-af6b16e5d9ff\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6frqs"
Dec 01 08:49:48 crc kubenswrapper[5004]: I1201 08:49:48.361410 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1d1fef94-adb6-4276-8448-af6b16e5d9ff-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-6frqs\" (UID: \"1d1fef94-adb6-4276-8448-af6b16e5d9ff\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6frqs"
Dec 01 08:49:48 crc kubenswrapper[5004]: I1201 08:49:48.369736 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbw6j\" (UniqueName: \"kubernetes.io/projected/1d1fef94-adb6-4276-8448-af6b16e5d9ff-kube-api-access-sbw6j\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-6frqs\" (UID: \"1d1fef94-adb6-4276-8448-af6b16e5d9ff\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6frqs"
Dec 01 08:49:48 crc kubenswrapper[5004]: I1201 08:49:48.560729 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6frqs"
Dec 01 08:49:49 crc kubenswrapper[5004]: I1201 08:49:49.108620 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6frqs"]
Dec 01 08:49:49 crc kubenswrapper[5004]: I1201 08:49:49.115579 5004 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Dec 01 08:49:50 crc kubenswrapper[5004]: I1201 08:49:50.053514 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6frqs" event={"ID":"1d1fef94-adb6-4276-8448-af6b16e5d9ff","Type":"ContainerStarted","Data":"83a6de755b11103d024764e647e0def6044c68295d317500e157c3eded73abf7"}
Dec 01 08:49:50 crc kubenswrapper[5004]: I1201 08:49:50.054035 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6frqs" event={"ID":"1d1fef94-adb6-4276-8448-af6b16e5d9ff","Type":"ContainerStarted","Data":"be7fd32743fd0432ccda5974343a90ec16f39b46053b7cc220d5d4f9e4a7da19"}
Dec 01 08:49:50 crc kubenswrapper[5004]: I1201 08:49:50.091487 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6frqs" podStartSLOduration=1.5730271120000001 podStartE2EDuration="2.091461129s" podCreationTimestamp="2025-12-01 08:49:48 +0000 UTC" firstStartedPulling="2025-12-01 08:49:49.115308341 +0000 UTC m=+1966.680300333" lastFinishedPulling="2025-12-01 08:49:49.633742368 +0000 UTC m=+1967.198734350" observedRunningTime="2025-12-01 08:49:50.073455191 +0000 UTC m=+1967.638447213" watchObservedRunningTime="2025-12-01 08:49:50.091461129 +0000 UTC m=+1967.656453111"
Dec 01 08:49:51 crc kubenswrapper[5004]: I1201 08:49:51.065725 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-dlfw2"]
Dec 01 08:49:51 crc kubenswrapper[5004]: I1201 08:49:51.077265 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-kbqp2"]
Dec 01 08:49:51 crc kubenswrapper[5004]: I1201 08:49:51.093718 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-kbqp2"]
Dec 01 08:49:51 crc kubenswrapper[5004]: I1201 08:49:51.110981 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-e274-account-create-update-jn8sw"]
Dec 01 08:49:51 crc kubenswrapper[5004]: I1201 08:49:51.121516 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-create-z4cqn"]
Dec 01 08:49:51 crc kubenswrapper[5004]: I1201 08:49:51.133350 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfc0-account-create-update-6ccd5"]
Dec 01 08:49:51 crc kubenswrapper[5004]: I1201 08:49:51.144406 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-2nz97"]
Dec 01 08:49:51 crc kubenswrapper[5004]: I1201 08:49:51.158623 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-dlfw2"]
Dec 01 08:49:51 crc kubenswrapper[5004]: I1201 08:49:51.170183 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-cba6-account-create-update-rzw6q"]
Dec 01 08:49:51 crc kubenswrapper[5004]: I1201 08:49:51.183039 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-29bc-account-create-update-sgcbq"]
Dec 01 08:49:51 crc kubenswrapper[5004]: I1201 08:49:51.196343 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-2nz97"]
Dec 01 08:49:51 crc kubenswrapper[5004]: I1201 08:49:51.208077 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-create-z4cqn"]
Dec 01 08:49:51 crc kubenswrapper[5004]: I1201 08:49:51.221806 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-cba6-account-create-update-rzw6q"]
Dec 01 08:49:51 crc kubenswrapper[5004]: I1201 08:49:51.232010 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfc0-account-create-update-6ccd5"]
Dec 01 08:49:51 crc kubenswrapper[5004]: I1201 08:49:51.242057 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-e274-account-create-update-jn8sw"]
Dec 01 08:49:51 crc kubenswrapper[5004]: I1201 08:49:51.252748 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-29bc-account-create-update-sgcbq"]
Dec 01 08:49:52 crc kubenswrapper[5004]: I1201 08:49:52.770888 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00524f9c-1b21-433f-8886-9b685c169469" path="/var/lib/kubelet/pods/00524f9c-1b21-433f-8886-9b685c169469/volumes"
Dec 01 08:49:52 crc kubenswrapper[5004]: I1201 08:49:52.772327 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38373747-4dbe-42c7-9060-ada117a776e8" path="/var/lib/kubelet/pods/38373747-4dbe-42c7-9060-ada117a776e8/volumes"
Dec 01 08:49:52 crc kubenswrapper[5004]: I1201 08:49:52.772945 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73a0817f-d450-4a46-863a-a7483f144851" path="/var/lib/kubelet/pods/73a0817f-d450-4a46-863a-a7483f144851/volumes"
Dec 01 08:49:52 crc kubenswrapper[5004]: I1201 08:49:52.773586 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f34518b-865b-4fe5-b2ff-7060cfced9eb" path="/var/lib/kubelet/pods/9f34518b-865b-4fe5-b2ff-7060cfced9eb/volumes"
Dec 01 08:49:52 crc kubenswrapper[5004]: I1201 08:49:52.774792 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac742a71-b3d4-433d-b550-12300a92941d" path="/var/lib/kubelet/pods/ac742a71-b3d4-433d-b550-12300a92941d/volumes"
Dec 01 08:49:52 crc kubenswrapper[5004]: I1201 08:49:52.775508 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c59984a3-cd16-4aa7-841e-29de227d4f70" path="/var/lib/kubelet/pods/c59984a3-cd16-4aa7-841e-29de227d4f70/volumes"
Dec 01 08:49:52 crc kubenswrapper[5004]: I1201 08:49:52.776398 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd7ffdb9-9cec-4f3a-8b55-65d8f6e42286" path="/var/lib/kubelet/pods/dd7ffdb9-9cec-4f3a-8b55-65d8f6e42286/volumes"
Dec 01 08:49:52 crc kubenswrapper[5004]: I1201 08:49:52.777644 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2a8cf8a-0ff4-4fba-a0fa-e62d564230c3" path="/var/lib/kubelet/pods/e2a8cf8a-0ff4-4fba-a0fa-e62d564230c3/volumes"
Dec 01 08:49:54 crc kubenswrapper[5004]: I1201 08:49:54.054364 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-vp28t"]
Dec 01 08:49:54 crc kubenswrapper[5004]: I1201 08:49:54.069016 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-vp28t"]
Dec 01 08:49:54 crc kubenswrapper[5004]: I1201 08:49:54.772810 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96f41ff9-6619-4262-8ecb-0a577f611f68" path="/var/lib/kubelet/pods/96f41ff9-6619-4262-8ecb-0a577f611f68/volumes"
Dec 01 08:49:58 crc kubenswrapper[5004]: I1201 08:49:58.037086 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-wlpnv"]
Dec 01 08:49:58 crc kubenswrapper[5004]: I1201 08:49:58.055369 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-wlpnv"]
Dec 01 08:49:58 crc kubenswrapper[5004]: I1201 08:49:58.775296 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc5ce303-7a1e-40b2-86f6-861898171b29" path="/var/lib/kubelet/pods/fc5ce303-7a1e-40b2-86f6-861898171b29/volumes"
Dec 01 08:50:14 crc kubenswrapper[5004]: I1201 08:50:14.376506 5004 scope.go:117] "RemoveContainer" containerID="6cdeeb25645ef9ece11e2207f448c252ccaf6e0317573978fdd1df83bec2dfa0"
Dec 01 08:50:14 crc kubenswrapper[5004]: I1201 08:50:14.416430 5004 scope.go:117] "RemoveContainer" containerID="fbef74624a42737b16db9753cb53083ffb17a94a36748a8290bcd3c4a7d868a5"
Dec 01 08:50:14 crc kubenswrapper[5004]: I1201 08:50:14.470916 5004 scope.go:117] "RemoveContainer" containerID="8967286d52c4953e254ebe1d7620e40da3f3c0b514f684c4b6e4d9ac963665ee"
Dec 01 08:50:14 crc kubenswrapper[5004]: I1201 08:50:14.541245 5004 scope.go:117] "RemoveContainer" containerID="eae44cb9d6865a271883cb2bc942df42834c1acbe6e2299c68d4e66399188c6a"
Dec 01 08:50:14 crc kubenswrapper[5004]: I1201 08:50:14.580417 5004 scope.go:117] "RemoveContainer" containerID="7d7480261e115075222fa0617873257b35057ada4729aab52996815fb4c5fe07"
Dec 01 08:50:14 crc kubenswrapper[5004]: I1201 08:50:14.621513 5004 scope.go:117] "RemoveContainer" containerID="da82e03befca2bf3ab63cb88ce3c79cca63e76d05276ed4118d1de5853eb3be5"
Dec 01 08:50:14 crc kubenswrapper[5004]: I1201 08:50:14.745881 5004 scope.go:117] "RemoveContainer" containerID="c661d564a31c9f258ed7a2bd32494a86bfb175494539a7c0df1df08e5114efe1"
Dec 01 08:50:14 crc kubenswrapper[5004]: I1201 08:50:14.777578 5004 scope.go:117] "RemoveContainer" containerID="cdb5df3a5cc301e4c812d5825d2ce15c2b16d6a8caa39f1c5fd496e98e12085f"
Dec 01 08:50:14 crc kubenswrapper[5004]: I1201 08:50:14.801775 5004 scope.go:117] "RemoveContainer" containerID="95e7525308b38af4759baae3c397c557cb54a7e90e0123a85a516e3116e0c0f5"
Dec 01 08:50:14 crc kubenswrapper[5004]: I1201 08:50:14.826398 5004 scope.go:117] "RemoveContainer" containerID="bfa91a9ee828473fdb3b0b3758debed03550479d13e5f986d09d96bfcaece393"
Dec 01 08:50:14 crc kubenswrapper[5004]: I1201 08:50:14.869421 5004 scope.go:117] "RemoveContainer" containerID="eeabc7c4ec03654e4c7de9c13196eaa169ce257ae0ad80bc728660322766fb84"
Dec 01 08:50:14 crc kubenswrapper[5004]: I1201 08:50:14.898802 5004 scope.go:117] "RemoveContainer" containerID="c7165fc56293d9f4bc95eb11bffcafa2f0bdacd92c7af0ff7c6c6039eca8db9c"
Dec 01 08:50:14 crc kubenswrapper[5004]: I1201 08:50:14.928587 5004 scope.go:117] "RemoveContainer" containerID="d0a59018b3622bb84896710d76b6dff37ddcd7c9d8cf9f4f30a8b43dc3695768"
Dec 01 08:50:14 crc kubenswrapper[5004]: I1201 08:50:14.949790 5004 scope.go:117] "RemoveContainer" containerID="98f2535ae0a25e20bcb06db1552d9613f56abf1fe5c4d14ede6957f9aacf1df2"
Dec 01 08:50:14 crc kubenswrapper[5004]: I1201 08:50:14.972233 5004 scope.go:117] "RemoveContainer" containerID="86b40f1f3075aac7be5469dc08cc364ca5d64826b94e214f37b900ceb881d552"
Dec 01 08:50:15 crc kubenswrapper[5004]: I1201 08:50:15.002599 5004 scope.go:117] "RemoveContainer" containerID="8c8510fcff8031c7a9daf6c7d56792ec25e303f0f25a8d8ac95b1d4083593225"
Dec 01 08:50:15 crc kubenswrapper[5004]: I1201 08:50:15.024276 5004 scope.go:117] "RemoveContainer" containerID="c8c8f04a8e2afb0cdfb75fd84a86f5dd282017b6649feadc92a9109ebe31898c"
Dec 01 08:50:15 crc kubenswrapper[5004]: I1201 08:50:15.045731 5004 scope.go:117] "RemoveContainer" containerID="6e2ee528ac706228aefbe21d58b251f96222ef8ade16570465449fde858fe964"
Dec 01 08:50:15 crc kubenswrapper[5004]: I1201 08:50:15.067375 5004 scope.go:117] "RemoveContainer" containerID="108bc634745220ffabaaad3cea124a02ba4a300a2ab78a8e63f1c5b6c285e421"
Dec 01 08:50:15 crc kubenswrapper[5004]: I1201 08:50:15.092839 5004 scope.go:117] "RemoveContainer" containerID="247a94e6863d939428ba1456ca3b6a08df9811c1d6d875563e57d4221da23638"
Dec 01 08:50:29 crc kubenswrapper[5004]: I1201 08:50:29.053100 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-68h6f"]
Dec 01 08:50:29 crc kubenswrapper[5004]: I1201 08:50:29.072233 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-68h6f"]
Dec 01 08:50:30 crc kubenswrapper[5004]: I1201 08:50:30.771889 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87079b8d-839c-42d2-95d1-33dee4ca61e1" path="/var/lib/kubelet/pods/87079b8d-839c-42d2-95d1-33dee4ca61e1/volumes"
Dec 01 08:50:36 crc kubenswrapper[5004]: I1201 08:50:36.059339 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-7jdwx"]
Dec 01 08:50:36 crc kubenswrapper[5004]: I1201 08:50:36.075570 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-7jdwx"]
Dec 01 08:50:36 crc kubenswrapper[5004]: I1201 08:50:36.784646 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5303a09-48ed-4287-83ba-0fb70fe199d0" path="/var/lib/kubelet/pods/e5303a09-48ed-4287-83ba-0fb70fe199d0/volumes"
Dec 01 08:50:42 crc kubenswrapper[5004]: I1201 08:50:42.264833 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-tlzk7"]
Dec 01 08:50:42 crc kubenswrapper[5004]: I1201 08:50:42.269655 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tlzk7"
Dec 01 08:50:42 crc kubenswrapper[5004]: I1201 08:50:42.288893 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tlzk7"]
Dec 01 08:50:42 crc kubenswrapper[5004]: I1201 08:50:42.421214 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqzff\" (UniqueName: \"kubernetes.io/projected/843198a4-4c62-4d4a-a4eb-45def6244871-kube-api-access-nqzff\") pod \"redhat-operators-tlzk7\" (UID: \"843198a4-4c62-4d4a-a4eb-45def6244871\") " pod="openshift-marketplace/redhat-operators-tlzk7"
Dec 01 08:50:42 crc kubenswrapper[5004]: I1201 08:50:42.421384 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/843198a4-4c62-4d4a-a4eb-45def6244871-utilities\") pod \"redhat-operators-tlzk7\" (UID: \"843198a4-4c62-4d4a-a4eb-45def6244871\") " pod="openshift-marketplace/redhat-operators-tlzk7"
Dec 01 08:50:42 crc kubenswrapper[5004]: I1201 08:50:42.421432 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/843198a4-4c62-4d4a-a4eb-45def6244871-catalog-content\") pod \"redhat-operators-tlzk7\" (UID: \"843198a4-4c62-4d4a-a4eb-45def6244871\") " pod="openshift-marketplace/redhat-operators-tlzk7"
Dec 01 08:50:42 crc kubenswrapper[5004]: I1201 08:50:42.523683 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/843198a4-4c62-4d4a-a4eb-45def6244871-utilities\") pod \"redhat-operators-tlzk7\" (UID: \"843198a4-4c62-4d4a-a4eb-45def6244871\") " pod="openshift-marketplace/redhat-operators-tlzk7"
Dec 01 08:50:42 crc kubenswrapper[5004]: I1201 08:50:42.524042 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/843198a4-4c62-4d4a-a4eb-45def6244871-catalog-content\") pod \"redhat-operators-tlzk7\" (UID: \"843198a4-4c62-4d4a-a4eb-45def6244871\") " pod="openshift-marketplace/redhat-operators-tlzk7"
Dec 01 08:50:42 crc kubenswrapper[5004]: I1201 08:50:42.524207 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqzff\" (UniqueName: \"kubernetes.io/projected/843198a4-4c62-4d4a-a4eb-45def6244871-kube-api-access-nqzff\") pod \"redhat-operators-tlzk7\" (UID: \"843198a4-4c62-4d4a-a4eb-45def6244871\") " pod="openshift-marketplace/redhat-operators-tlzk7"
Dec 01 08:50:42 crc kubenswrapper[5004]: I1201 08:50:42.525202 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/843198a4-4c62-4d4a-a4eb-45def6244871-utilities\") pod \"redhat-operators-tlzk7\" (UID: \"843198a4-4c62-4d4a-a4eb-45def6244871\") " pod="openshift-marketplace/redhat-operators-tlzk7"
Dec 01 08:50:42 crc kubenswrapper[5004]: I1201 08:50:42.525494 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/843198a4-4c62-4d4a-a4eb-45def6244871-catalog-content\") pod \"redhat-operators-tlzk7\" (UID: \"843198a4-4c62-4d4a-a4eb-45def6244871\") " pod="openshift-marketplace/redhat-operators-tlzk7"
Dec 01 08:50:42 crc kubenswrapper[5004]: I1201 08:50:42.544882 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqzff\" (UniqueName: \"kubernetes.io/projected/843198a4-4c62-4d4a-a4eb-45def6244871-kube-api-access-nqzff\") pod \"redhat-operators-tlzk7\" (UID: \"843198a4-4c62-4d4a-a4eb-45def6244871\") " pod="openshift-marketplace/redhat-operators-tlzk7"
Dec 01 08:50:42 crc kubenswrapper[5004]: I1201 08:50:42.621304 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tlzk7"
Dec 01 08:50:43 crc kubenswrapper[5004]: I1201 08:50:43.141585 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tlzk7"]
Dec 01 08:50:43 crc kubenswrapper[5004]: I1201 08:50:43.704970 5004 generic.go:334] "Generic (PLEG): container finished" podID="843198a4-4c62-4d4a-a4eb-45def6244871" containerID="f5d3f6dce85f2127fd1ab719ff7c1a1318c174f716f54325296c04581d3c2d1a" exitCode=0
Dec 01 08:50:43 crc kubenswrapper[5004]: I1201 08:50:43.705020 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tlzk7" event={"ID":"843198a4-4c62-4d4a-a4eb-45def6244871","Type":"ContainerDied","Data":"f5d3f6dce85f2127fd1ab719ff7c1a1318c174f716f54325296c04581d3c2d1a"}
Dec 01 08:50:43 crc kubenswrapper[5004]: I1201 08:50:43.705265 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tlzk7" event={"ID":"843198a4-4c62-4d4a-a4eb-45def6244871","Type":"ContainerStarted","Data":"372de5967e82f1a5fce8caced4091ee8c68cbaf510c689b8c0e26a1d6f84833b"}
Dec 01 08:50:44 crc kubenswrapper[5004]: I1201 08:50:44.051466 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-hlrzt"]
Dec 01 08:50:44 crc kubenswrapper[5004]: I1201 08:50:44.064387 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-hlrzt"]
Dec 01 08:50:44 crc kubenswrapper[5004]: I1201 08:50:44.773436 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="713ee9f1-6421-4ffa-aed2-5f762d8cba63" path="/var/lib/kubelet/pods/713ee9f1-6421-4ffa-aed2-5f762d8cba63/volumes"
Dec 01 08:50:45 crc kubenswrapper[5004]: I1201 08:50:45.737950 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tlzk7" event={"ID":"843198a4-4c62-4d4a-a4eb-45def6244871","Type":"ContainerStarted","Data":"7ca03501e4c2801e45e3b54ec8b57596b6f18c1bf5115135df2f4030d51dc0ff"}
Dec 01 08:50:47 crc kubenswrapper[5004]: I1201 08:50:47.762478 5004 generic.go:334] "Generic (PLEG): container finished" podID="843198a4-4c62-4d4a-a4eb-45def6244871" containerID="7ca03501e4c2801e45e3b54ec8b57596b6f18c1bf5115135df2f4030d51dc0ff" exitCode=0
Dec 01 08:50:47 crc kubenswrapper[5004]: I1201 08:50:47.762510 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tlzk7" event={"ID":"843198a4-4c62-4d4a-a4eb-45def6244871","Type":"ContainerDied","Data":"7ca03501e4c2801e45e3b54ec8b57596b6f18c1bf5115135df2f4030d51dc0ff"}
Dec 01 08:50:49 crc kubenswrapper[5004]: I1201 08:50:49.788781 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tlzk7" event={"ID":"843198a4-4c62-4d4a-a4eb-45def6244871","Type":"ContainerStarted","Data":"d6e456008056b297f1ef97e45be0014049c9516716ec1456f2d5a294cd0b0645"}
Dec 01 08:50:49 crc kubenswrapper[5004]: I1201 08:50:49.809631 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-tlzk7" podStartSLOduration=2.71985253 podStartE2EDuration="7.809611563s" podCreationTimestamp="2025-12-01 08:50:42 +0000 UTC" firstStartedPulling="2025-12-01 08:50:43.707144745 +0000 UTC m=+2021.272136727" lastFinishedPulling="2025-12-01 08:50:48.796903768 +0000 UTC m=+2026.361895760" observedRunningTime="2025-12-01 08:50:49.806536339 +0000 UTC m=+2027.371528321" watchObservedRunningTime="2025-12-01 08:50:49.809611563 +0000 UTC m=+2027.374603555"
Dec 01 08:50:52 crc kubenswrapper[5004]: I1201 08:50:52.622498 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-tlzk7"
Dec 01 08:50:52 crc kubenswrapper[5004]: I1201 08:50:52.623144 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-tlzk7"
Dec 01 08:50:53 crc kubenswrapper[5004]: I1201 08:50:53.038342 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-ln7f8"]
Dec 01 08:50:53 crc kubenswrapper[5004]: I1201 08:50:53.048590 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-vgmk8"]
Dec 01 08:50:53 crc kubenswrapper[5004]: I1201 08:50:53.062897 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-vgmk8"]
Dec 01 08:50:53 crc kubenswrapper[5004]: I1201 08:50:53.080461 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-ln7f8"]
Dec 01 08:50:53 crc kubenswrapper[5004]: I1201 08:50:53.676727 5004 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-tlzk7" podUID="843198a4-4c62-4d4a-a4eb-45def6244871" containerName="registry-server" probeResult="failure" output=<
Dec 01 08:50:53 crc kubenswrapper[5004]: timeout: failed to connect service ":50051" within 1s
Dec 01 08:50:53 crc kubenswrapper[5004]: >
Dec 01 08:50:54 crc kubenswrapper[5004]: I1201 08:50:54.771118 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0eccea77-d6ee-4592-ad47-1f29ca2a943b" path="/var/lib/kubelet/pods/0eccea77-d6ee-4592-ad47-1f29ca2a943b/volumes"
Dec 01 08:50:54 crc kubenswrapper[5004]: I1201 08:50:54.772167 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="165d617f-a220-49b1-af2b-65d4c509962c" path="/var/lib/kubelet/pods/165d617f-a220-49b1-af2b-65d4c509962c/volumes"
Dec 01 08:51:02 crc kubenswrapper[5004]: I1201 08:51:02.674100 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-tlzk7"
Dec 01 08:51:02 crc kubenswrapper[5004]: I1201 08:51:02.729048 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-tlzk7"
Dec 01 08:51:02 crc kubenswrapper[5004]: I1201 08:51:02.956785 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tlzk7"]
Dec 01 08:51:03 crc kubenswrapper[5004]: I1201 08:51:03.965477 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-tlzk7" podUID="843198a4-4c62-4d4a-a4eb-45def6244871" containerName="registry-server" containerID="cri-o://d6e456008056b297f1ef97e45be0014049c9516716ec1456f2d5a294cd0b0645" gracePeriod=2
Dec 01 08:51:04 crc kubenswrapper[5004]: I1201 08:51:04.984426 5004 generic.go:334] "Generic (PLEG): container finished" podID="843198a4-4c62-4d4a-a4eb-45def6244871" containerID="d6e456008056b297f1ef97e45be0014049c9516716ec1456f2d5a294cd0b0645" exitCode=0
Dec 01 08:51:04 crc kubenswrapper[5004]: I1201 08:51:04.984661 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tlzk7" event={"ID":"843198a4-4c62-4d4a-a4eb-45def6244871","Type":"ContainerDied","Data":"d6e456008056b297f1ef97e45be0014049c9516716ec1456f2d5a294cd0b0645"}
Dec 01 08:51:05 crc kubenswrapper[5004]: I1201 08:51:05.138200 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tlzk7"
Dec 01 08:51:05 crc kubenswrapper[5004]: I1201 08:51:05.320480 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nqzff\" (UniqueName: \"kubernetes.io/projected/843198a4-4c62-4d4a-a4eb-45def6244871-kube-api-access-nqzff\") pod \"843198a4-4c62-4d4a-a4eb-45def6244871\" (UID: \"843198a4-4c62-4d4a-a4eb-45def6244871\") "
Dec 01 08:51:05 crc kubenswrapper[5004]: I1201 08:51:05.320796 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/843198a4-4c62-4d4a-a4eb-45def6244871-catalog-content\") pod \"843198a4-4c62-4d4a-a4eb-45def6244871\" (UID: \"843198a4-4c62-4d4a-a4eb-45def6244871\") "
Dec 01 08:51:05 crc kubenswrapper[5004]: I1201 08:51:05.320831 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/843198a4-4c62-4d4a-a4eb-45def6244871-utilities\") pod \"843198a4-4c62-4d4a-a4eb-45def6244871\" (UID: \"843198a4-4c62-4d4a-a4eb-45def6244871\") "
Dec 01 08:51:05 crc kubenswrapper[5004]: I1201 08:51:05.321442 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/843198a4-4c62-4d4a-a4eb-45def6244871-utilities" (OuterVolumeSpecName: "utilities") pod "843198a4-4c62-4d4a-a4eb-45def6244871" (UID: "843198a4-4c62-4d4a-a4eb-45def6244871"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 08:51:05 crc kubenswrapper[5004]: I1201 08:51:05.425881 5004 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/843198a4-4c62-4d4a-a4eb-45def6244871-utilities\") on node \"crc\" DevicePath \"\""
Dec 01 08:51:05 crc kubenswrapper[5004]: I1201 08:51:05.426704 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/843198a4-4c62-4d4a-a4eb-45def6244871-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "843198a4-4c62-4d4a-a4eb-45def6244871" (UID: "843198a4-4c62-4d4a-a4eb-45def6244871"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 08:51:05 crc kubenswrapper[5004]: I1201 08:51:05.528029 5004 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/843198a4-4c62-4d4a-a4eb-45def6244871-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 01 08:51:05 crc kubenswrapper[5004]: I1201 08:51:05.881871 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/843198a4-4c62-4d4a-a4eb-45def6244871-kube-api-access-nqzff" (OuterVolumeSpecName: "kube-api-access-nqzff") pod "843198a4-4c62-4d4a-a4eb-45def6244871" (UID: "843198a4-4c62-4d4a-a4eb-45def6244871"). InnerVolumeSpecName "kube-api-access-nqzff". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 08:51:05 crc kubenswrapper[5004]: I1201 08:51:05.937218 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nqzff\" (UniqueName: \"kubernetes.io/projected/843198a4-4c62-4d4a-a4eb-45def6244871-kube-api-access-nqzff\") on node \"crc\" DevicePath \"\""
Dec 01 08:51:06 crc kubenswrapper[5004]: I1201 08:51:06.003685 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tlzk7" event={"ID":"843198a4-4c62-4d4a-a4eb-45def6244871","Type":"ContainerDied","Data":"372de5967e82f1a5fce8caced4091ee8c68cbaf510c689b8c0e26a1d6f84833b"}
Dec 01 08:51:06 crc kubenswrapper[5004]: I1201 08:51:06.003741 5004 scope.go:117] "RemoveContainer" containerID="d6e456008056b297f1ef97e45be0014049c9516716ec1456f2d5a294cd0b0645"
Dec 01 08:51:06 crc kubenswrapper[5004]: I1201 08:51:06.004647 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tlzk7"
Dec 01 08:51:06 crc kubenswrapper[5004]: I1201 08:51:06.026882 5004 scope.go:117] "RemoveContainer" containerID="7ca03501e4c2801e45e3b54ec8b57596b6f18c1bf5115135df2f4030d51dc0ff"
Dec 01 08:51:06 crc kubenswrapper[5004]: I1201 08:51:06.047464 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tlzk7"]
Dec 01 08:51:06 crc kubenswrapper[5004]: I1201 08:51:06.067525 5004 scope.go:117] "RemoveContainer" containerID="f5d3f6dce85f2127fd1ab719ff7c1a1318c174f716f54325296c04581d3c2d1a"
Dec 01 08:51:06 crc kubenswrapper[5004]: I1201 08:51:06.074620 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-tlzk7"]
Dec 01 08:51:06 crc kubenswrapper[5004]: I1201 08:51:06.772278 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="843198a4-4c62-4d4a-a4eb-45def6244871" path="/var/lib/kubelet/pods/843198a4-4c62-4d4a-a4eb-45def6244871/volumes"
Dec 01 08:51:15 crc kubenswrapper[5004]: I1201 08:51:15.443791 5004 scope.go:117] "RemoveContainer" containerID="e56b5fd8e27ba254e70b7ec601bc6d3d8f18ae99c0d1825e9b01b4807757acbf"
Dec 01 08:51:15 crc kubenswrapper[5004]: I1201 08:51:15.523191 5004 scope.go:117] "RemoveContainer" containerID="a5e210e1ece1b7cb88d92cd73ce1261bd023a742628e4cd59a7610d4e53cd7c5"
Dec 01 08:51:15 crc kubenswrapper[5004]: I1201 08:51:15.598989 5004 scope.go:117] "RemoveContainer" containerID="2e5a177f0147bb0a08e716773ead0c5f7639009e7d855103ecf90be8667d222d"
Dec 01 08:51:15 crc kubenswrapper[5004]: I1201 08:51:15.626739 5004 scope.go:117] "RemoveContainer" containerID="5dda4d99565c4be4d42864bdcdc7309d6c705f0d76d76b289b4046a8e6ef092f"
Dec 01 08:51:15 crc kubenswrapper[5004]: I1201 08:51:15.675180 5004 scope.go:117] "RemoveContainer" containerID="5d0a501941db692fc8607f877460be6c252bfb759ba05cd87257d71106f20640"
Dec 01 08:51:52 crc kubenswrapper[5004]: I1201 08:51:52.058838 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-94kdj"]
Dec 01 08:51:52 crc kubenswrapper[5004]: I1201 08:51:52.072219 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-94kdj"]
Dec 01 08:51:52 crc kubenswrapper[5004]: I1201 08:51:52.777435 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eede402f-9c2a-4dcf-9de9-33df959b5cd8" path="/var/lib/kubelet/pods/eede402f-9c2a-4dcf-9de9-33df959b5cd8/volumes"
Dec 01 08:51:53 crc kubenswrapper[5004]: I1201 08:51:53.053425 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-w9gjc"]
Dec 01 08:51:53 crc kubenswrapper[5004]: I1201 08:51:53.073699 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-xclx6"]
Dec 01 08:51:53 crc kubenswrapper[5004]: I1201 08:51:53.084290 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-w9gjc"]
Dec 01 08:51:53 crc kubenswrapper[5004]: I1201 08:51:53.093380 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-xclx6"]
Dec 01 08:51:53 crc kubenswrapper[5004]: I1201 08:51:53.103086 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-ae05-account-create-update-fx8ms"]
Dec 01 08:51:53 crc kubenswrapper[5004]: I1201 08:51:53.112591 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-ae05-account-create-update-fx8ms"]
Dec 01 08:51:54 crc kubenswrapper[5004]: I1201 08:51:54.042390 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-12e3-account-create-update-2kzjl"]
Dec 01 08:51:54 crc kubenswrapper[5004]: I1201 08:51:54.051639 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-976b-account-create-update-8qblk"]
Dec 01 08:51:54 crc kubenswrapper[5004]: I1201 08:51:54.061347 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-976b-account-create-update-8qblk"]
Dec 01 08:51:54 crc kubenswrapper[5004]: I1201 08:51:54.071494 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-12e3-account-create-update-2kzjl"]
Dec 01 08:51:54 crc kubenswrapper[5004]: I1201 08:51:54.781079 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f01fa36-7d6e-48b6-a764-8aa381b5cf7a" path="/var/lib/kubelet/pods/4f01fa36-7d6e-48b6-a764-8aa381b5cf7a/volumes"
Dec 01 08:51:54 crc kubenswrapper[5004]: I1201 08:51:54.782905 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5135af8b-1b58-4816-ace1-424059c5267a" path="/var/lib/kubelet/pods/5135af8b-1b58-4816-ace1-424059c5267a/volumes"
Dec 01 08:51:54 crc kubenswrapper[5004]: I1201 08:51:54.785990 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7924d39b-830f-4f91-ae39-6b04c31d3f61" path="/var/lib/kubelet/pods/7924d39b-830f-4f91-ae39-6b04c31d3f61/volumes"
Dec 01 08:51:54 crc kubenswrapper[5004]: I1201 08:51:54.787601 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b50edddf-3daf-4bee-83da-8c44123a382f" path="/var/lib/kubelet/pods/b50edddf-3daf-4bee-83da-8c44123a382f/volumes"
Dec 01 08:51:54 crc kubenswrapper[5004]: I1201 08:51:54.788187 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1b1e393-9cd1-4f89-8b3e-b02ee0d397bd" path="/var/lib/kubelet/pods/e1b1e393-9cd1-4f89-8b3e-b02ee0d397bd/volumes"
Dec 01 08:52:06 crc kubenswrapper[5004]: I1201 08:52:06.752083 5004 generic.go:334] "Generic (PLEG): container finished" podID="1d1fef94-adb6-4276-8448-af6b16e5d9ff" containerID="83a6de755b11103d024764e647e0def6044c68295d317500e157c3eded73abf7" exitCode=0
Dec 01 08:52:06 crc kubenswrapper[5004]: I1201 08:52:06.752182 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6frqs" event={"ID":"1d1fef94-adb6-4276-8448-af6b16e5d9ff","Type":"ContainerDied","Data":"83a6de755b11103d024764e647e0def6044c68295d317500e157c3eded73abf7"}
Dec 01 08:52:08 crc kubenswrapper[5004]: I1201 08:52:08.296807 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6frqs" Dec 01 08:52:08 crc kubenswrapper[5004]: I1201 08:52:08.388403 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sbw6j\" (UniqueName: \"kubernetes.io/projected/1d1fef94-adb6-4276-8448-af6b16e5d9ff-kube-api-access-sbw6j\") pod \"1d1fef94-adb6-4276-8448-af6b16e5d9ff\" (UID: \"1d1fef94-adb6-4276-8448-af6b16e5d9ff\") " Dec 01 08:52:08 crc kubenswrapper[5004]: I1201 08:52:08.388622 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1d1fef94-adb6-4276-8448-af6b16e5d9ff-ssh-key\") pod \"1d1fef94-adb6-4276-8448-af6b16e5d9ff\" (UID: \"1d1fef94-adb6-4276-8448-af6b16e5d9ff\") " Dec 01 08:52:08 crc kubenswrapper[5004]: I1201 08:52:08.388698 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1d1fef94-adb6-4276-8448-af6b16e5d9ff-inventory\") pod \"1d1fef94-adb6-4276-8448-af6b16e5d9ff\" (UID: \"1d1fef94-adb6-4276-8448-af6b16e5d9ff\") " Dec 01 08:52:08 crc kubenswrapper[5004]: I1201 08:52:08.394879 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d1fef94-adb6-4276-8448-af6b16e5d9ff-kube-api-access-sbw6j" (OuterVolumeSpecName: "kube-api-access-sbw6j") pod "1d1fef94-adb6-4276-8448-af6b16e5d9ff" (UID: "1d1fef94-adb6-4276-8448-af6b16e5d9ff"). InnerVolumeSpecName "kube-api-access-sbw6j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:52:08 crc kubenswrapper[5004]: I1201 08:52:08.428702 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d1fef94-adb6-4276-8448-af6b16e5d9ff-inventory" (OuterVolumeSpecName: "inventory") pod "1d1fef94-adb6-4276-8448-af6b16e5d9ff" (UID: "1d1fef94-adb6-4276-8448-af6b16e5d9ff"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:52:08 crc kubenswrapper[5004]: I1201 08:52:08.428732 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d1fef94-adb6-4276-8448-af6b16e5d9ff-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "1d1fef94-adb6-4276-8448-af6b16e5d9ff" (UID: "1d1fef94-adb6-4276-8448-af6b16e5d9ff"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:52:08 crc kubenswrapper[5004]: I1201 08:52:08.491799 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sbw6j\" (UniqueName: \"kubernetes.io/projected/1d1fef94-adb6-4276-8448-af6b16e5d9ff-kube-api-access-sbw6j\") on node \"crc\" DevicePath \"\"" Dec 01 08:52:08 crc kubenswrapper[5004]: I1201 08:52:08.491840 5004 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1d1fef94-adb6-4276-8448-af6b16e5d9ff-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 08:52:08 crc kubenswrapper[5004]: I1201 08:52:08.491853 5004 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1d1fef94-adb6-4276-8448-af6b16e5d9ff-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 08:52:08 crc kubenswrapper[5004]: I1201 08:52:08.729543 5004 patch_prober.go:28] interesting pod/machine-config-daemon-fvdgt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 08:52:08 crc kubenswrapper[5004]: I1201 08:52:08.729695 5004 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Dec 01 08:52:08 crc kubenswrapper[5004]: I1201 08:52:08.787046 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6frqs" event={"ID":"1d1fef94-adb6-4276-8448-af6b16e5d9ff","Type":"ContainerDied","Data":"be7fd32743fd0432ccda5974343a90ec16f39b46053b7cc220d5d4f9e4a7da19"} Dec 01 08:52:08 crc kubenswrapper[5004]: I1201 08:52:08.787101 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="be7fd32743fd0432ccda5974343a90ec16f39b46053b7cc220d5d4f9e4a7da19" Dec 01 08:52:08 crc kubenswrapper[5004]: I1201 08:52:08.787340 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6frqs" Dec 01 08:52:08 crc kubenswrapper[5004]: I1201 08:52:08.884149 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hfxnb"] Dec 01 08:52:08 crc kubenswrapper[5004]: E1201 08:52:08.884754 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="843198a4-4c62-4d4a-a4eb-45def6244871" containerName="extract-content" Dec 01 08:52:08 crc kubenswrapper[5004]: I1201 08:52:08.884772 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="843198a4-4c62-4d4a-a4eb-45def6244871" containerName="extract-content" Dec 01 08:52:08 crc kubenswrapper[5004]: E1201 08:52:08.884781 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d1fef94-adb6-4276-8448-af6b16e5d9ff" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Dec 01 08:52:08 crc kubenswrapper[5004]: I1201 08:52:08.884789 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d1fef94-adb6-4276-8448-af6b16e5d9ff" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Dec 01 08:52:08 crc kubenswrapper[5004]: E1201 08:52:08.884810 5004 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="843198a4-4c62-4d4a-a4eb-45def6244871" containerName="extract-utilities" Dec 01 08:52:08 crc kubenswrapper[5004]: I1201 08:52:08.884815 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="843198a4-4c62-4d4a-a4eb-45def6244871" containerName="extract-utilities" Dec 01 08:52:08 crc kubenswrapper[5004]: E1201 08:52:08.884840 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="843198a4-4c62-4d4a-a4eb-45def6244871" containerName="registry-server" Dec 01 08:52:08 crc kubenswrapper[5004]: I1201 08:52:08.884846 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="843198a4-4c62-4d4a-a4eb-45def6244871" containerName="registry-server" Dec 01 08:52:08 crc kubenswrapper[5004]: I1201 08:52:08.885065 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="843198a4-4c62-4d4a-a4eb-45def6244871" containerName="registry-server" Dec 01 08:52:08 crc kubenswrapper[5004]: I1201 08:52:08.885095 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d1fef94-adb6-4276-8448-af6b16e5d9ff" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Dec 01 08:52:08 crc kubenswrapper[5004]: I1201 08:52:08.885905 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hfxnb" Dec 01 08:52:08 crc kubenswrapper[5004]: I1201 08:52:08.890181 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 08:52:08 crc kubenswrapper[5004]: I1201 08:52:08.890331 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 08:52:08 crc kubenswrapper[5004]: I1201 08:52:08.891148 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 08:52:08 crc kubenswrapper[5004]: I1201 08:52:08.893171 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-pdnrq" Dec 01 08:52:08 crc kubenswrapper[5004]: I1201 08:52:08.897355 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hfxnb"] Dec 01 08:52:09 crc kubenswrapper[5004]: I1201 08:52:09.006662 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99ck4\" (UniqueName: \"kubernetes.io/projected/54a59a5d-6353-4420-9600-3fdfbaa42595-kube-api-access-99ck4\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-hfxnb\" (UID: \"54a59a5d-6353-4420-9600-3fdfbaa42595\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hfxnb" Dec 01 08:52:09 crc kubenswrapper[5004]: I1201 08:52:09.006745 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/54a59a5d-6353-4420-9600-3fdfbaa42595-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-hfxnb\" (UID: \"54a59a5d-6353-4420-9600-3fdfbaa42595\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hfxnb" Dec 01 08:52:09 crc kubenswrapper[5004]: I1201 
08:52:09.006801 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/54a59a5d-6353-4420-9600-3fdfbaa42595-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-hfxnb\" (UID: \"54a59a5d-6353-4420-9600-3fdfbaa42595\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hfxnb" Dec 01 08:52:09 crc kubenswrapper[5004]: I1201 08:52:09.108943 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99ck4\" (UniqueName: \"kubernetes.io/projected/54a59a5d-6353-4420-9600-3fdfbaa42595-kube-api-access-99ck4\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-hfxnb\" (UID: \"54a59a5d-6353-4420-9600-3fdfbaa42595\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hfxnb" Dec 01 08:52:09 crc kubenswrapper[5004]: I1201 08:52:09.109019 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/54a59a5d-6353-4420-9600-3fdfbaa42595-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-hfxnb\" (UID: \"54a59a5d-6353-4420-9600-3fdfbaa42595\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hfxnb" Dec 01 08:52:09 crc kubenswrapper[5004]: I1201 08:52:09.109069 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/54a59a5d-6353-4420-9600-3fdfbaa42595-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-hfxnb\" (UID: \"54a59a5d-6353-4420-9600-3fdfbaa42595\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hfxnb" Dec 01 08:52:09 crc kubenswrapper[5004]: I1201 08:52:09.114605 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/54a59a5d-6353-4420-9600-3fdfbaa42595-ssh-key\") pod 
\"configure-network-edpm-deployment-openstack-edpm-ipam-hfxnb\" (UID: \"54a59a5d-6353-4420-9600-3fdfbaa42595\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hfxnb" Dec 01 08:52:09 crc kubenswrapper[5004]: I1201 08:52:09.120032 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/54a59a5d-6353-4420-9600-3fdfbaa42595-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-hfxnb\" (UID: \"54a59a5d-6353-4420-9600-3fdfbaa42595\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hfxnb" Dec 01 08:52:09 crc kubenswrapper[5004]: I1201 08:52:09.129605 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99ck4\" (UniqueName: \"kubernetes.io/projected/54a59a5d-6353-4420-9600-3fdfbaa42595-kube-api-access-99ck4\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-hfxnb\" (UID: \"54a59a5d-6353-4420-9600-3fdfbaa42595\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hfxnb" Dec 01 08:52:09 crc kubenswrapper[5004]: I1201 08:52:09.212221 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hfxnb" Dec 01 08:52:09 crc kubenswrapper[5004]: I1201 08:52:09.784605 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hfxnb"] Dec 01 08:52:09 crc kubenswrapper[5004]: I1201 08:52:09.805704 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hfxnb" event={"ID":"54a59a5d-6353-4420-9600-3fdfbaa42595","Type":"ContainerStarted","Data":"f39f634bdddcd6a8b2c0b1c1b1bfa8f85bf7a43cbc5a1c0b108f04a138d693a4"} Dec 01 08:52:10 crc kubenswrapper[5004]: I1201 08:52:10.822135 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hfxnb" event={"ID":"54a59a5d-6353-4420-9600-3fdfbaa42595","Type":"ContainerStarted","Data":"42d4c61b977bac8265802f8f942bf744baf965a3001e355dfa81a87b827cc1fd"} Dec 01 08:52:10 crc kubenswrapper[5004]: I1201 08:52:10.859861 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hfxnb" podStartSLOduration=2.381600695 podStartE2EDuration="2.859839205s" podCreationTimestamp="2025-12-01 08:52:08 +0000 UTC" firstStartedPulling="2025-12-01 08:52:09.77880113 +0000 UTC m=+2107.343793132" lastFinishedPulling="2025-12-01 08:52:10.25703966 +0000 UTC m=+2107.822031642" observedRunningTime="2025-12-01 08:52:10.842946745 +0000 UTC m=+2108.407938807" watchObservedRunningTime="2025-12-01 08:52:10.859839205 +0000 UTC m=+2108.424831197" Dec 01 08:52:15 crc kubenswrapper[5004]: I1201 08:52:15.882295 5004 scope.go:117] "RemoveContainer" containerID="aba2e67a068c132261944ce48d830b7e582b9569e72ad46e903c39e66dac9188" Dec 01 08:52:15 crc kubenswrapper[5004]: I1201 08:52:15.921478 5004 scope.go:117] "RemoveContainer" 
containerID="b6b9ec3f129d85014f5e0cce074128a269bd4922e99b5e8fe4e00c1b8ca34694" Dec 01 08:52:15 crc kubenswrapper[5004]: I1201 08:52:15.977808 5004 scope.go:117] "RemoveContainer" containerID="4fffe034b0d93a9159bf6c2b3f4a5ff39245208968004a8dd1c2f266ff170e85" Dec 01 08:52:16 crc kubenswrapper[5004]: I1201 08:52:16.031616 5004 scope.go:117] "RemoveContainer" containerID="7d48562efbf85d7916e4e50b6ae5391b54d57c9b3e1a0991748ecbc5e1047ec7" Dec 01 08:52:16 crc kubenswrapper[5004]: I1201 08:52:16.047182 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-1bcc-account-create-update-99f9v"] Dec 01 08:52:16 crc kubenswrapper[5004]: I1201 08:52:16.070844 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-1bcc-account-create-update-99f9v"] Dec 01 08:52:16 crc kubenswrapper[5004]: I1201 08:52:16.109178 5004 scope.go:117] "RemoveContainer" containerID="e60e90156b1e6887f3e8343b122b6c71bd9eb9d3a25ce4ddb55b7de2aa02ce1c" Dec 01 08:52:16 crc kubenswrapper[5004]: I1201 08:52:16.153521 5004 scope.go:117] "RemoveContainer" containerID="0884f39c38ab64c5732becfbe9732e9cafa3905f930cb554f1f9eff46823ef94" Dec 01 08:52:16 crc kubenswrapper[5004]: I1201 08:52:16.779096 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47a280b1-dda9-451b-8791-990446098df5" path="/var/lib/kubelet/pods/47a280b1-dda9-451b-8791-990446098df5/volumes" Dec 01 08:52:17 crc kubenswrapper[5004]: I1201 08:52:17.040389 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-create-rpt6t"] Dec 01 08:52:17 crc kubenswrapper[5004]: I1201 08:52:17.050928 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-create-rpt6t"] Dec 01 08:52:18 crc kubenswrapper[5004]: I1201 08:52:18.775048 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cbde161-56a4-49c2-9e44-e86fc7c4a82f" path="/var/lib/kubelet/pods/8cbde161-56a4-49c2-9e44-e86fc7c4a82f/volumes" Dec 01 08:52:21 crc kubenswrapper[5004]: I1201 
08:52:21.034199 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-598mn"] Dec 01 08:52:21 crc kubenswrapper[5004]: I1201 08:52:21.049628 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-598mn"] Dec 01 08:52:22 crc kubenswrapper[5004]: I1201 08:52:22.773636 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="058c0853-8613-4681-b617-fb985abce304" path="/var/lib/kubelet/pods/058c0853-8613-4681-b617-fb985abce304/volumes" Dec 01 08:52:38 crc kubenswrapper[5004]: I1201 08:52:38.729069 5004 patch_prober.go:28] interesting pod/machine-config-daemon-fvdgt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 08:52:38 crc kubenswrapper[5004]: I1201 08:52:38.729984 5004 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 08:52:48 crc kubenswrapper[5004]: I1201 08:52:48.050152 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-6rnpl"] Dec 01 08:52:48 crc kubenswrapper[5004]: I1201 08:52:48.067294 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-6rnpl"] Dec 01 08:52:48 crc kubenswrapper[5004]: I1201 08:52:48.770552 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="902842db-d6a2-4ae1-8d5e-25b637f4db2c" path="/var/lib/kubelet/pods/902842db-d6a2-4ae1-8d5e-25b637f4db2c/volumes" Dec 01 08:52:50 crc kubenswrapper[5004]: I1201 08:52:50.037929 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/nova-cell1-conductor-db-sync-5gbbc"] Dec 01 08:52:50 crc kubenswrapper[5004]: I1201 08:52:50.051667 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-5gbbc"] Dec 01 08:52:50 crc kubenswrapper[5004]: I1201 08:52:50.777754 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31898e1-9571-43cb-b754-5544a7898213" path="/var/lib/kubelet/pods/a31898e1-9571-43cb-b754-5544a7898213/volumes" Dec 01 08:53:02 crc kubenswrapper[5004]: I1201 08:53:02.056497 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-cv6zt"] Dec 01 08:53:02 crc kubenswrapper[5004]: I1201 08:53:02.065761 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cv6zt" Dec 01 08:53:02 crc kubenswrapper[5004]: I1201 08:53:02.088621 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cv6zt"] Dec 01 08:53:02 crc kubenswrapper[5004]: I1201 08:53:02.205988 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ca2567c-b311-446f-ac70-98d098596207-catalog-content\") pod \"community-operators-cv6zt\" (UID: \"7ca2567c-b311-446f-ac70-98d098596207\") " pod="openshift-marketplace/community-operators-cv6zt" Dec 01 08:53:02 crc kubenswrapper[5004]: I1201 08:53:02.206035 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ca2567c-b311-446f-ac70-98d098596207-utilities\") pod \"community-operators-cv6zt\" (UID: \"7ca2567c-b311-446f-ac70-98d098596207\") " pod="openshift-marketplace/community-operators-cv6zt" Dec 01 08:53:02 crc kubenswrapper[5004]: I1201 08:53:02.206138 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-492m6\" (UniqueName: \"kubernetes.io/projected/7ca2567c-b311-446f-ac70-98d098596207-kube-api-access-492m6\") pod \"community-operators-cv6zt\" (UID: \"7ca2567c-b311-446f-ac70-98d098596207\") " pod="openshift-marketplace/community-operators-cv6zt" Dec 01 08:53:02 crc kubenswrapper[5004]: I1201 08:53:02.308707 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-492m6\" (UniqueName: \"kubernetes.io/projected/7ca2567c-b311-446f-ac70-98d098596207-kube-api-access-492m6\") pod \"community-operators-cv6zt\" (UID: \"7ca2567c-b311-446f-ac70-98d098596207\") " pod="openshift-marketplace/community-operators-cv6zt" Dec 01 08:53:02 crc kubenswrapper[5004]: I1201 08:53:02.308937 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ca2567c-b311-446f-ac70-98d098596207-catalog-content\") pod \"community-operators-cv6zt\" (UID: \"7ca2567c-b311-446f-ac70-98d098596207\") " pod="openshift-marketplace/community-operators-cv6zt" Dec 01 08:53:02 crc kubenswrapper[5004]: I1201 08:53:02.308974 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ca2567c-b311-446f-ac70-98d098596207-utilities\") pod \"community-operators-cv6zt\" (UID: \"7ca2567c-b311-446f-ac70-98d098596207\") " pod="openshift-marketplace/community-operators-cv6zt" Dec 01 08:53:02 crc kubenswrapper[5004]: I1201 08:53:02.309605 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ca2567c-b311-446f-ac70-98d098596207-utilities\") pod \"community-operators-cv6zt\" (UID: \"7ca2567c-b311-446f-ac70-98d098596207\") " pod="openshift-marketplace/community-operators-cv6zt" Dec 01 08:53:02 crc kubenswrapper[5004]: I1201 08:53:02.309685 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/7ca2567c-b311-446f-ac70-98d098596207-catalog-content\") pod \"community-operators-cv6zt\" (UID: \"7ca2567c-b311-446f-ac70-98d098596207\") " pod="openshift-marketplace/community-operators-cv6zt" Dec 01 08:53:02 crc kubenswrapper[5004]: I1201 08:53:02.337650 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-492m6\" (UniqueName: \"kubernetes.io/projected/7ca2567c-b311-446f-ac70-98d098596207-kube-api-access-492m6\") pod \"community-operators-cv6zt\" (UID: \"7ca2567c-b311-446f-ac70-98d098596207\") " pod="openshift-marketplace/community-operators-cv6zt" Dec 01 08:53:02 crc kubenswrapper[5004]: I1201 08:53:02.389040 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cv6zt" Dec 01 08:53:03 crc kubenswrapper[5004]: I1201 08:53:03.007245 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cv6zt"] Dec 01 08:53:03 crc kubenswrapper[5004]: I1201 08:53:03.539317 5004 generic.go:334] "Generic (PLEG): container finished" podID="7ca2567c-b311-446f-ac70-98d098596207" containerID="3f370979b8de4f34616762545bef4a6c3f7ec4bb61215da83239680374ea5d50" exitCode=0 Dec 01 08:53:03 crc kubenswrapper[5004]: I1201 08:53:03.539405 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cv6zt" event={"ID":"7ca2567c-b311-446f-ac70-98d098596207","Type":"ContainerDied","Data":"3f370979b8de4f34616762545bef4a6c3f7ec4bb61215da83239680374ea5d50"} Dec 01 08:53:03 crc kubenswrapper[5004]: I1201 08:53:03.539715 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cv6zt" event={"ID":"7ca2567c-b311-446f-ac70-98d098596207","Type":"ContainerStarted","Data":"a2a8b321195d56bed338bd4e4f481a8203d9782adcf3f82a7e72323ea0a4eb20"} Dec 01 08:53:05 crc kubenswrapper[5004]: I1201 08:53:05.568812 5004 generic.go:334] "Generic 
(PLEG): container finished" podID="7ca2567c-b311-446f-ac70-98d098596207" containerID="bff8206e36cba54f1dc5ee8495e659132023787ceaec155130a7dc2b5ec69452" exitCode=0 Dec 01 08:53:05 crc kubenswrapper[5004]: I1201 08:53:05.568920 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cv6zt" event={"ID":"7ca2567c-b311-446f-ac70-98d098596207","Type":"ContainerDied","Data":"bff8206e36cba54f1dc5ee8495e659132023787ceaec155130a7dc2b5ec69452"} Dec 01 08:53:06 crc kubenswrapper[5004]: I1201 08:53:06.583397 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cv6zt" event={"ID":"7ca2567c-b311-446f-ac70-98d098596207","Type":"ContainerStarted","Data":"a596ab331a544c122ffd8ca98066ecf75139d59a0ef1a972bc4fce1c4c169157"} Dec 01 08:53:06 crc kubenswrapper[5004]: I1201 08:53:06.604644 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-cv6zt" podStartSLOduration=2.165938235 podStartE2EDuration="4.604627524s" podCreationTimestamp="2025-12-01 08:53:02 +0000 UTC" firstStartedPulling="2025-12-01 08:53:03.541847393 +0000 UTC m=+2161.106839375" lastFinishedPulling="2025-12-01 08:53:05.980536672 +0000 UTC m=+2163.545528664" observedRunningTime="2025-12-01 08:53:06.603054946 +0000 UTC m=+2164.168046948" watchObservedRunningTime="2025-12-01 08:53:06.604627524 +0000 UTC m=+2164.169619516" Dec 01 08:53:08 crc kubenswrapper[5004]: I1201 08:53:08.729349 5004 patch_prober.go:28] interesting pod/machine-config-daemon-fvdgt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 08:53:08 crc kubenswrapper[5004]: I1201 08:53:08.729425 5004 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" 
podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 08:53:08 crc kubenswrapper[5004]: I1201 08:53:08.729482 5004 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" Dec 01 08:53:08 crc kubenswrapper[5004]: I1201 08:53:08.730777 5004 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c25705b2a5c3bf1cd77f5014f3a7b5fe92ef589d53ece0784883141577010bcc"} pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 08:53:08 crc kubenswrapper[5004]: I1201 08:53:08.730850 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" containerName="machine-config-daemon" containerID="cri-o://c25705b2a5c3bf1cd77f5014f3a7b5fe92ef589d53ece0784883141577010bcc" gracePeriod=600 Dec 01 08:53:09 crc kubenswrapper[5004]: I1201 08:53:09.620964 5004 generic.go:334] "Generic (PLEG): container finished" podID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" containerID="c25705b2a5c3bf1cd77f5014f3a7b5fe92ef589d53ece0784883141577010bcc" exitCode=0 Dec 01 08:53:09 crc kubenswrapper[5004]: I1201 08:53:09.621087 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" event={"ID":"f9977ebb-82de-4e96-8763-0b5a84f8d4ce","Type":"ContainerDied","Data":"c25705b2a5c3bf1cd77f5014f3a7b5fe92ef589d53ece0784883141577010bcc"} Dec 01 08:53:09 crc kubenswrapper[5004]: I1201 08:53:09.621518 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" event={"ID":"f9977ebb-82de-4e96-8763-0b5a84f8d4ce","Type":"ContainerStarted","Data":"35549cd96a84a4d57cde0021a5b3e7f3cc68411360397308b1e916325429b080"} Dec 01 08:53:09 crc kubenswrapper[5004]: I1201 08:53:09.621546 5004 scope.go:117] "RemoveContainer" containerID="70f45b3d2a6bb4bf0d87f18bc08b282543e238e1d391a58745914af24ad7956c" Dec 01 08:53:12 crc kubenswrapper[5004]: I1201 08:53:12.389541 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-cv6zt" Dec 01 08:53:12 crc kubenswrapper[5004]: I1201 08:53:12.390204 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-cv6zt" Dec 01 08:53:12 crc kubenswrapper[5004]: I1201 08:53:12.490299 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-cv6zt" Dec 01 08:53:12 crc kubenswrapper[5004]: I1201 08:53:12.745292 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-cv6zt" Dec 01 08:53:12 crc kubenswrapper[5004]: I1201 08:53:12.817910 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cv6zt"] Dec 01 08:53:14 crc kubenswrapper[5004]: I1201 08:53:14.692891 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-cv6zt" podUID="7ca2567c-b311-446f-ac70-98d098596207" containerName="registry-server" containerID="cri-o://a596ab331a544c122ffd8ca98066ecf75139d59a0ef1a972bc4fce1c4c169157" gracePeriod=2 Dec 01 08:53:15 crc kubenswrapper[5004]: I1201 08:53:15.292959 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-cv6zt" Dec 01 08:53:15 crc kubenswrapper[5004]: I1201 08:53:15.436920 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ca2567c-b311-446f-ac70-98d098596207-utilities\") pod \"7ca2567c-b311-446f-ac70-98d098596207\" (UID: \"7ca2567c-b311-446f-ac70-98d098596207\") " Dec 01 08:53:15 crc kubenswrapper[5004]: I1201 08:53:15.437109 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ca2567c-b311-446f-ac70-98d098596207-catalog-content\") pod \"7ca2567c-b311-446f-ac70-98d098596207\" (UID: \"7ca2567c-b311-446f-ac70-98d098596207\") " Dec 01 08:53:15 crc kubenswrapper[5004]: I1201 08:53:15.437312 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-492m6\" (UniqueName: \"kubernetes.io/projected/7ca2567c-b311-446f-ac70-98d098596207-kube-api-access-492m6\") pod \"7ca2567c-b311-446f-ac70-98d098596207\" (UID: \"7ca2567c-b311-446f-ac70-98d098596207\") " Dec 01 08:53:15 crc kubenswrapper[5004]: I1201 08:53:15.437915 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ca2567c-b311-446f-ac70-98d098596207-utilities" (OuterVolumeSpecName: "utilities") pod "7ca2567c-b311-446f-ac70-98d098596207" (UID: "7ca2567c-b311-446f-ac70-98d098596207"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 08:53:15 crc kubenswrapper[5004]: I1201 08:53:15.438397 5004 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ca2567c-b311-446f-ac70-98d098596207-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 08:53:15 crc kubenswrapper[5004]: I1201 08:53:15.444355 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ca2567c-b311-446f-ac70-98d098596207-kube-api-access-492m6" (OuterVolumeSpecName: "kube-api-access-492m6") pod "7ca2567c-b311-446f-ac70-98d098596207" (UID: "7ca2567c-b311-446f-ac70-98d098596207"). InnerVolumeSpecName "kube-api-access-492m6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:53:15 crc kubenswrapper[5004]: I1201 08:53:15.489057 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ca2567c-b311-446f-ac70-98d098596207-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7ca2567c-b311-446f-ac70-98d098596207" (UID: "7ca2567c-b311-446f-ac70-98d098596207"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 08:53:15 crc kubenswrapper[5004]: I1201 08:53:15.541392 5004 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ca2567c-b311-446f-ac70-98d098596207-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 08:53:15 crc kubenswrapper[5004]: I1201 08:53:15.541437 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-492m6\" (UniqueName: \"kubernetes.io/projected/7ca2567c-b311-446f-ac70-98d098596207-kube-api-access-492m6\") on node \"crc\" DevicePath \"\"" Dec 01 08:53:15 crc kubenswrapper[5004]: I1201 08:53:15.709528 5004 generic.go:334] "Generic (PLEG): container finished" podID="7ca2567c-b311-446f-ac70-98d098596207" containerID="a596ab331a544c122ffd8ca98066ecf75139d59a0ef1a972bc4fce1c4c169157" exitCode=0 Dec 01 08:53:15 crc kubenswrapper[5004]: I1201 08:53:15.709602 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cv6zt" Dec 01 08:53:15 crc kubenswrapper[5004]: I1201 08:53:15.709616 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cv6zt" event={"ID":"7ca2567c-b311-446f-ac70-98d098596207","Type":"ContainerDied","Data":"a596ab331a544c122ffd8ca98066ecf75139d59a0ef1a972bc4fce1c4c169157"} Dec 01 08:53:15 crc kubenswrapper[5004]: I1201 08:53:15.709853 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cv6zt" event={"ID":"7ca2567c-b311-446f-ac70-98d098596207","Type":"ContainerDied","Data":"a2a8b321195d56bed338bd4e4f481a8203d9782adcf3f82a7e72323ea0a4eb20"} Dec 01 08:53:15 crc kubenswrapper[5004]: I1201 08:53:15.709915 5004 scope.go:117] "RemoveContainer" containerID="a596ab331a544c122ffd8ca98066ecf75139d59a0ef1a972bc4fce1c4c169157" Dec 01 08:53:15 crc kubenswrapper[5004]: I1201 08:53:15.752765 5004 scope.go:117] "RemoveContainer" 
containerID="bff8206e36cba54f1dc5ee8495e659132023787ceaec155130a7dc2b5ec69452" Dec 01 08:53:15 crc kubenswrapper[5004]: I1201 08:53:15.764961 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cv6zt"] Dec 01 08:53:15 crc kubenswrapper[5004]: I1201 08:53:15.774236 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-cv6zt"] Dec 01 08:53:15 crc kubenswrapper[5004]: I1201 08:53:15.791656 5004 scope.go:117] "RemoveContainer" containerID="3f370979b8de4f34616762545bef4a6c3f7ec4bb61215da83239680374ea5d50" Dec 01 08:53:15 crc kubenswrapper[5004]: I1201 08:53:15.865786 5004 scope.go:117] "RemoveContainer" containerID="a596ab331a544c122ffd8ca98066ecf75139d59a0ef1a972bc4fce1c4c169157" Dec 01 08:53:15 crc kubenswrapper[5004]: E1201 08:53:15.866338 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a596ab331a544c122ffd8ca98066ecf75139d59a0ef1a972bc4fce1c4c169157\": container with ID starting with a596ab331a544c122ffd8ca98066ecf75139d59a0ef1a972bc4fce1c4c169157 not found: ID does not exist" containerID="a596ab331a544c122ffd8ca98066ecf75139d59a0ef1a972bc4fce1c4c169157" Dec 01 08:53:15 crc kubenswrapper[5004]: I1201 08:53:15.866393 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a596ab331a544c122ffd8ca98066ecf75139d59a0ef1a972bc4fce1c4c169157"} err="failed to get container status \"a596ab331a544c122ffd8ca98066ecf75139d59a0ef1a972bc4fce1c4c169157\": rpc error: code = NotFound desc = could not find container \"a596ab331a544c122ffd8ca98066ecf75139d59a0ef1a972bc4fce1c4c169157\": container with ID starting with a596ab331a544c122ffd8ca98066ecf75139d59a0ef1a972bc4fce1c4c169157 not found: ID does not exist" Dec 01 08:53:15 crc kubenswrapper[5004]: I1201 08:53:15.866424 5004 scope.go:117] "RemoveContainer" 
containerID="bff8206e36cba54f1dc5ee8495e659132023787ceaec155130a7dc2b5ec69452" Dec 01 08:53:15 crc kubenswrapper[5004]: E1201 08:53:15.866853 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bff8206e36cba54f1dc5ee8495e659132023787ceaec155130a7dc2b5ec69452\": container with ID starting with bff8206e36cba54f1dc5ee8495e659132023787ceaec155130a7dc2b5ec69452 not found: ID does not exist" containerID="bff8206e36cba54f1dc5ee8495e659132023787ceaec155130a7dc2b5ec69452" Dec 01 08:53:15 crc kubenswrapper[5004]: I1201 08:53:15.866910 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bff8206e36cba54f1dc5ee8495e659132023787ceaec155130a7dc2b5ec69452"} err="failed to get container status \"bff8206e36cba54f1dc5ee8495e659132023787ceaec155130a7dc2b5ec69452\": rpc error: code = NotFound desc = could not find container \"bff8206e36cba54f1dc5ee8495e659132023787ceaec155130a7dc2b5ec69452\": container with ID starting with bff8206e36cba54f1dc5ee8495e659132023787ceaec155130a7dc2b5ec69452 not found: ID does not exist" Dec 01 08:53:15 crc kubenswrapper[5004]: I1201 08:53:15.866946 5004 scope.go:117] "RemoveContainer" containerID="3f370979b8de4f34616762545bef4a6c3f7ec4bb61215da83239680374ea5d50" Dec 01 08:53:15 crc kubenswrapper[5004]: E1201 08:53:15.867335 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f370979b8de4f34616762545bef4a6c3f7ec4bb61215da83239680374ea5d50\": container with ID starting with 3f370979b8de4f34616762545bef4a6c3f7ec4bb61215da83239680374ea5d50 not found: ID does not exist" containerID="3f370979b8de4f34616762545bef4a6c3f7ec4bb61215da83239680374ea5d50" Dec 01 08:53:15 crc kubenswrapper[5004]: I1201 08:53:15.867380 5004 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"3f370979b8de4f34616762545bef4a6c3f7ec4bb61215da83239680374ea5d50"} err="failed to get container status \"3f370979b8de4f34616762545bef4a6c3f7ec4bb61215da83239680374ea5d50\": rpc error: code = NotFound desc = could not find container \"3f370979b8de4f34616762545bef4a6c3f7ec4bb61215da83239680374ea5d50\": container with ID starting with 3f370979b8de4f34616762545bef4a6c3f7ec4bb61215da83239680374ea5d50 not found: ID does not exist" Dec 01 08:53:16 crc kubenswrapper[5004]: I1201 08:53:16.310433 5004 scope.go:117] "RemoveContainer" containerID="75604ee40e80496e4599aae3f1fb7cc18dd761c466f5be26677f5ac186554dbd" Dec 01 08:53:16 crc kubenswrapper[5004]: I1201 08:53:16.364547 5004 scope.go:117] "RemoveContainer" containerID="4936719999b16beb7679b4dd16c55d0eafcd9b7145a203c8432b4cf062a9f454" Dec 01 08:53:16 crc kubenswrapper[5004]: I1201 08:53:16.424792 5004 scope.go:117] "RemoveContainer" containerID="e638f1b500c8ca992c0b749a358bda55f4b30342cbfb24291a3875f8aea1bdf8" Dec 01 08:53:16 crc kubenswrapper[5004]: I1201 08:53:16.482712 5004 scope.go:117] "RemoveContainer" containerID="6974c52a424b6d086dc89af8391e8f1b387a3788995ab7b740b0374b7bf52360" Dec 01 08:53:16 crc kubenswrapper[5004]: I1201 08:53:16.524982 5004 scope.go:117] "RemoveContainer" containerID="d396d1b5dd78f4ad9d79e792b9b18ece3e9a8e63b83164dbc75f56468b206893" Dec 01 08:53:16 crc kubenswrapper[5004]: I1201 08:53:16.772108 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ca2567c-b311-446f-ac70-98d098596207" path="/var/lib/kubelet/pods/7ca2567c-b311-446f-ac70-98d098596207/volumes" Dec 01 08:53:32 crc kubenswrapper[5004]: I1201 08:53:32.049069 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-tvtkb"] Dec 01 08:53:32 crc kubenswrapper[5004]: I1201 08:53:32.063356 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-tvtkb"] Dec 01 08:53:32 crc kubenswrapper[5004]: E1201 
08:53:32.192881 5004 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod54a59a5d_6353_4420_9600_3fdfbaa42595.slice/crio-conmon-42d4c61b977bac8265802f8f942bf744baf965a3001e355dfa81a87b827cc1fd.scope\": RecentStats: unable to find data in memory cache]" Dec 01 08:53:32 crc kubenswrapper[5004]: I1201 08:53:32.792097 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03552a97-36f1-421c-8dfe-359fe79b8a7f" path="/var/lib/kubelet/pods/03552a97-36f1-421c-8dfe-359fe79b8a7f/volumes" Dec 01 08:53:32 crc kubenswrapper[5004]: I1201 08:53:32.942959 5004 generic.go:334] "Generic (PLEG): container finished" podID="54a59a5d-6353-4420-9600-3fdfbaa42595" containerID="42d4c61b977bac8265802f8f942bf744baf965a3001e355dfa81a87b827cc1fd" exitCode=0 Dec 01 08:53:32 crc kubenswrapper[5004]: I1201 08:53:32.943066 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hfxnb" event={"ID":"54a59a5d-6353-4420-9600-3fdfbaa42595","Type":"ContainerDied","Data":"42d4c61b977bac8265802f8f942bf744baf965a3001e355dfa81a87b827cc1fd"} Dec 01 08:53:34 crc kubenswrapper[5004]: I1201 08:53:34.627068 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hfxnb" Dec 01 08:53:34 crc kubenswrapper[5004]: I1201 08:53:34.785009 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/54a59a5d-6353-4420-9600-3fdfbaa42595-ssh-key\") pod \"54a59a5d-6353-4420-9600-3fdfbaa42595\" (UID: \"54a59a5d-6353-4420-9600-3fdfbaa42595\") " Dec 01 08:53:34 crc kubenswrapper[5004]: I1201 08:53:34.785158 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/54a59a5d-6353-4420-9600-3fdfbaa42595-inventory\") pod \"54a59a5d-6353-4420-9600-3fdfbaa42595\" (UID: \"54a59a5d-6353-4420-9600-3fdfbaa42595\") " Dec 01 08:53:34 crc kubenswrapper[5004]: I1201 08:53:34.785417 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-99ck4\" (UniqueName: \"kubernetes.io/projected/54a59a5d-6353-4420-9600-3fdfbaa42595-kube-api-access-99ck4\") pod \"54a59a5d-6353-4420-9600-3fdfbaa42595\" (UID: \"54a59a5d-6353-4420-9600-3fdfbaa42595\") " Dec 01 08:53:34 crc kubenswrapper[5004]: I1201 08:53:34.793032 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54a59a5d-6353-4420-9600-3fdfbaa42595-kube-api-access-99ck4" (OuterVolumeSpecName: "kube-api-access-99ck4") pod "54a59a5d-6353-4420-9600-3fdfbaa42595" (UID: "54a59a5d-6353-4420-9600-3fdfbaa42595"). InnerVolumeSpecName "kube-api-access-99ck4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:53:34 crc kubenswrapper[5004]: I1201 08:53:34.830962 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54a59a5d-6353-4420-9600-3fdfbaa42595-inventory" (OuterVolumeSpecName: "inventory") pod "54a59a5d-6353-4420-9600-3fdfbaa42595" (UID: "54a59a5d-6353-4420-9600-3fdfbaa42595"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:53:34 crc kubenswrapper[5004]: I1201 08:53:34.831926 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54a59a5d-6353-4420-9600-3fdfbaa42595-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "54a59a5d-6353-4420-9600-3fdfbaa42595" (UID: "54a59a5d-6353-4420-9600-3fdfbaa42595"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:53:34 crc kubenswrapper[5004]: I1201 08:53:34.896659 5004 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/54a59a5d-6353-4420-9600-3fdfbaa42595-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 08:53:34 crc kubenswrapper[5004]: I1201 08:53:34.896695 5004 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/54a59a5d-6353-4420-9600-3fdfbaa42595-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 08:53:34 crc kubenswrapper[5004]: I1201 08:53:34.896736 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-99ck4\" (UniqueName: \"kubernetes.io/projected/54a59a5d-6353-4420-9600-3fdfbaa42595-kube-api-access-99ck4\") on node \"crc\" DevicePath \"\"" Dec 01 08:53:34 crc kubenswrapper[5004]: I1201 08:53:34.972725 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hfxnb" Dec 01 08:53:34 crc kubenswrapper[5004]: I1201 08:53:34.972773 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hfxnb" event={"ID":"54a59a5d-6353-4420-9600-3fdfbaa42595","Type":"ContainerDied","Data":"f39f634bdddcd6a8b2c0b1c1b1bfa8f85bf7a43cbc5a1c0b108f04a138d693a4"} Dec 01 08:53:34 crc kubenswrapper[5004]: I1201 08:53:34.972943 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f39f634bdddcd6a8b2c0b1c1b1bfa8f85bf7a43cbc5a1c0b108f04a138d693a4" Dec 01 08:53:35 crc kubenswrapper[5004]: I1201 08:53:35.086186 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lw9tq"] Dec 01 08:53:35 crc kubenswrapper[5004]: E1201 08:53:35.087024 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ca2567c-b311-446f-ac70-98d098596207" containerName="extract-content" Dec 01 08:53:35 crc kubenswrapper[5004]: I1201 08:53:35.087054 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ca2567c-b311-446f-ac70-98d098596207" containerName="extract-content" Dec 01 08:53:35 crc kubenswrapper[5004]: E1201 08:53:35.087096 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ca2567c-b311-446f-ac70-98d098596207" containerName="registry-server" Dec 01 08:53:35 crc kubenswrapper[5004]: I1201 08:53:35.087106 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ca2567c-b311-446f-ac70-98d098596207" containerName="registry-server" Dec 01 08:53:35 crc kubenswrapper[5004]: E1201 08:53:35.087143 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ca2567c-b311-446f-ac70-98d098596207" containerName="extract-utilities" Dec 01 08:53:35 crc kubenswrapper[5004]: I1201 08:53:35.087153 5004 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="7ca2567c-b311-446f-ac70-98d098596207" containerName="extract-utilities" Dec 01 08:53:35 crc kubenswrapper[5004]: E1201 08:53:35.087201 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54a59a5d-6353-4420-9600-3fdfbaa42595" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 01 08:53:35 crc kubenswrapper[5004]: I1201 08:53:35.087214 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="54a59a5d-6353-4420-9600-3fdfbaa42595" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 01 08:53:35 crc kubenswrapper[5004]: I1201 08:53:35.087603 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ca2567c-b311-446f-ac70-98d098596207" containerName="registry-server" Dec 01 08:53:35 crc kubenswrapper[5004]: I1201 08:53:35.087667 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="54a59a5d-6353-4420-9600-3fdfbaa42595" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 01 08:53:35 crc kubenswrapper[5004]: I1201 08:53:35.089082 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lw9tq" Dec 01 08:53:35 crc kubenswrapper[5004]: I1201 08:53:35.093228 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 08:53:35 crc kubenswrapper[5004]: I1201 08:53:35.093464 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 08:53:35 crc kubenswrapper[5004]: I1201 08:53:35.096549 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 08:53:35 crc kubenswrapper[5004]: I1201 08:53:35.096834 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-pdnrq" Dec 01 08:53:35 crc kubenswrapper[5004]: I1201 08:53:35.100883 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lw9tq"] Dec 01 08:53:35 crc kubenswrapper[5004]: I1201 08:53:35.102210 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/995d1b4f-5178-4522-989d-ba19dbc7316f-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-lw9tq\" (UID: \"995d1b4f-5178-4522-989d-ba19dbc7316f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lw9tq" Dec 01 08:53:35 crc kubenswrapper[5004]: I1201 08:53:35.102294 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqjcj\" (UniqueName: \"kubernetes.io/projected/995d1b4f-5178-4522-989d-ba19dbc7316f-kube-api-access-nqjcj\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-lw9tq\" (UID: \"995d1b4f-5178-4522-989d-ba19dbc7316f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lw9tq" Dec 01 08:53:35 crc kubenswrapper[5004]: I1201 
08:53:35.102348 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/995d1b4f-5178-4522-989d-ba19dbc7316f-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-lw9tq\" (UID: \"995d1b4f-5178-4522-989d-ba19dbc7316f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lw9tq" Dec 01 08:53:35 crc kubenswrapper[5004]: I1201 08:53:35.204108 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/995d1b4f-5178-4522-989d-ba19dbc7316f-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-lw9tq\" (UID: \"995d1b4f-5178-4522-989d-ba19dbc7316f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lw9tq" Dec 01 08:53:35 crc kubenswrapper[5004]: I1201 08:53:35.204477 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqjcj\" (UniqueName: \"kubernetes.io/projected/995d1b4f-5178-4522-989d-ba19dbc7316f-kube-api-access-nqjcj\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-lw9tq\" (UID: \"995d1b4f-5178-4522-989d-ba19dbc7316f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lw9tq" Dec 01 08:53:35 crc kubenswrapper[5004]: I1201 08:53:35.204652 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/995d1b4f-5178-4522-989d-ba19dbc7316f-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-lw9tq\" (UID: \"995d1b4f-5178-4522-989d-ba19dbc7316f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lw9tq" Dec 01 08:53:35 crc kubenswrapper[5004]: I1201 08:53:35.209551 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/995d1b4f-5178-4522-989d-ba19dbc7316f-inventory\") pod 
\"validate-network-edpm-deployment-openstack-edpm-ipam-lw9tq\" (UID: \"995d1b4f-5178-4522-989d-ba19dbc7316f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lw9tq" Dec 01 08:53:35 crc kubenswrapper[5004]: I1201 08:53:35.211204 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/995d1b4f-5178-4522-989d-ba19dbc7316f-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-lw9tq\" (UID: \"995d1b4f-5178-4522-989d-ba19dbc7316f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lw9tq" Dec 01 08:53:35 crc kubenswrapper[5004]: I1201 08:53:35.224805 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqjcj\" (UniqueName: \"kubernetes.io/projected/995d1b4f-5178-4522-989d-ba19dbc7316f-kube-api-access-nqjcj\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-lw9tq\" (UID: \"995d1b4f-5178-4522-989d-ba19dbc7316f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lw9tq" Dec 01 08:53:35 crc kubenswrapper[5004]: I1201 08:53:35.409378 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lw9tq" Dec 01 08:53:36 crc kubenswrapper[5004]: I1201 08:53:36.091256 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lw9tq"] Dec 01 08:53:36 crc kubenswrapper[5004]: I1201 08:53:36.999255 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lw9tq" event={"ID":"995d1b4f-5178-4522-989d-ba19dbc7316f","Type":"ContainerStarted","Data":"4fa85cf590c38eadf18794a0cb2ea883cf0a47674bd4375427d5682dc578164d"} Dec 01 08:53:38 crc kubenswrapper[5004]: I1201 08:53:38.023168 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lw9tq" event={"ID":"995d1b4f-5178-4522-989d-ba19dbc7316f","Type":"ContainerStarted","Data":"eedfa7510c09d98d272bb2bca9e1bad003ef9b8ccd3ab2fe949ab25b9de905f7"} Dec 01 08:53:39 crc kubenswrapper[5004]: I1201 08:53:39.053172 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lw9tq" podStartSLOduration=3.017152013 podStartE2EDuration="4.053142084s" podCreationTimestamp="2025-12-01 08:53:35 +0000 UTC" firstStartedPulling="2025-12-01 08:53:36.08584297 +0000 UTC m=+2193.650834992" lastFinishedPulling="2025-12-01 08:53:37.121833081 +0000 UTC m=+2194.686825063" observedRunningTime="2025-12-01 08:53:39.048104901 +0000 UTC m=+2196.613096913" watchObservedRunningTime="2025-12-01 08:53:39.053142084 +0000 UTC m=+2196.618134076" Dec 01 08:53:44 crc kubenswrapper[5004]: I1201 08:53:44.110133 5004 generic.go:334] "Generic (PLEG): container finished" podID="995d1b4f-5178-4522-989d-ba19dbc7316f" containerID="eedfa7510c09d98d272bb2bca9e1bad003ef9b8ccd3ab2fe949ab25b9de905f7" exitCode=0 Dec 01 08:53:44 crc kubenswrapper[5004]: I1201 08:53:44.110231 5004 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lw9tq" event={"ID":"995d1b4f-5178-4522-989d-ba19dbc7316f","Type":"ContainerDied","Data":"eedfa7510c09d98d272bb2bca9e1bad003ef9b8ccd3ab2fe949ab25b9de905f7"} Dec 01 08:53:45 crc kubenswrapper[5004]: I1201 08:53:45.721142 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lw9tq" Dec 01 08:53:45 crc kubenswrapper[5004]: I1201 08:53:45.875723 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/995d1b4f-5178-4522-989d-ba19dbc7316f-ssh-key\") pod \"995d1b4f-5178-4522-989d-ba19dbc7316f\" (UID: \"995d1b4f-5178-4522-989d-ba19dbc7316f\") " Dec 01 08:53:45 crc kubenswrapper[5004]: I1201 08:53:45.875840 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nqjcj\" (UniqueName: \"kubernetes.io/projected/995d1b4f-5178-4522-989d-ba19dbc7316f-kube-api-access-nqjcj\") pod \"995d1b4f-5178-4522-989d-ba19dbc7316f\" (UID: \"995d1b4f-5178-4522-989d-ba19dbc7316f\") " Dec 01 08:53:45 crc kubenswrapper[5004]: I1201 08:53:45.876168 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/995d1b4f-5178-4522-989d-ba19dbc7316f-inventory\") pod \"995d1b4f-5178-4522-989d-ba19dbc7316f\" (UID: \"995d1b4f-5178-4522-989d-ba19dbc7316f\") " Dec 01 08:53:45 crc kubenswrapper[5004]: I1201 08:53:45.887072 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/995d1b4f-5178-4522-989d-ba19dbc7316f-kube-api-access-nqjcj" (OuterVolumeSpecName: "kube-api-access-nqjcj") pod "995d1b4f-5178-4522-989d-ba19dbc7316f" (UID: "995d1b4f-5178-4522-989d-ba19dbc7316f"). InnerVolumeSpecName "kube-api-access-nqjcj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:53:45 crc kubenswrapper[5004]: I1201 08:53:45.920629 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/995d1b4f-5178-4522-989d-ba19dbc7316f-inventory" (OuterVolumeSpecName: "inventory") pod "995d1b4f-5178-4522-989d-ba19dbc7316f" (UID: "995d1b4f-5178-4522-989d-ba19dbc7316f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:53:45 crc kubenswrapper[5004]: I1201 08:53:45.931915 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/995d1b4f-5178-4522-989d-ba19dbc7316f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "995d1b4f-5178-4522-989d-ba19dbc7316f" (UID: "995d1b4f-5178-4522-989d-ba19dbc7316f"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:53:45 crc kubenswrapper[5004]: I1201 08:53:45.983156 5004 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/995d1b4f-5178-4522-989d-ba19dbc7316f-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 08:53:45 crc kubenswrapper[5004]: I1201 08:53:45.983218 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nqjcj\" (UniqueName: \"kubernetes.io/projected/995d1b4f-5178-4522-989d-ba19dbc7316f-kube-api-access-nqjcj\") on node \"crc\" DevicePath \"\"" Dec 01 08:53:45 crc kubenswrapper[5004]: I1201 08:53:45.983239 5004 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/995d1b4f-5178-4522-989d-ba19dbc7316f-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 08:53:46 crc kubenswrapper[5004]: I1201 08:53:46.144657 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lw9tq" 
event={"ID":"995d1b4f-5178-4522-989d-ba19dbc7316f","Type":"ContainerDied","Data":"4fa85cf590c38eadf18794a0cb2ea883cf0a47674bd4375427d5682dc578164d"} Dec 01 08:53:46 crc kubenswrapper[5004]: I1201 08:53:46.144980 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4fa85cf590c38eadf18794a0cb2ea883cf0a47674bd4375427d5682dc578164d" Dec 01 08:53:46 crc kubenswrapper[5004]: I1201 08:53:46.144766 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lw9tq" Dec 01 08:53:46 crc kubenswrapper[5004]: I1201 08:53:46.286068 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-r8g28"] Dec 01 08:53:46 crc kubenswrapper[5004]: E1201 08:53:46.302657 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="995d1b4f-5178-4522-989d-ba19dbc7316f" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 01 08:53:46 crc kubenswrapper[5004]: I1201 08:53:46.302692 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="995d1b4f-5178-4522-989d-ba19dbc7316f" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 01 08:53:46 crc kubenswrapper[5004]: I1201 08:53:46.303676 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="995d1b4f-5178-4522-989d-ba19dbc7316f" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 01 08:53:46 crc kubenswrapper[5004]: I1201 08:53:46.311085 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-r8g28" Dec 01 08:53:46 crc kubenswrapper[5004]: I1201 08:53:46.315478 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 08:53:46 crc kubenswrapper[5004]: I1201 08:53:46.317087 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-pdnrq" Dec 01 08:53:46 crc kubenswrapper[5004]: I1201 08:53:46.317364 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 08:53:46 crc kubenswrapper[5004]: I1201 08:53:46.317362 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 08:53:46 crc kubenswrapper[5004]: I1201 08:53:46.345273 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-r8g28"] Dec 01 08:53:46 crc kubenswrapper[5004]: I1201 08:53:46.409052 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/66d71651-3d44-4bef-8259-19908c843f85-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-r8g28\" (UID: \"66d71651-3d44-4bef-8259-19908c843f85\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-r8g28" Dec 01 08:53:46 crc kubenswrapper[5004]: I1201 08:53:46.409145 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/66d71651-3d44-4bef-8259-19908c843f85-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-r8g28\" (UID: \"66d71651-3d44-4bef-8259-19908c843f85\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-r8g28" Dec 01 08:53:46 crc kubenswrapper[5004]: I1201 08:53:46.409261 5004 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xq2c\" (UniqueName: \"kubernetes.io/projected/66d71651-3d44-4bef-8259-19908c843f85-kube-api-access-7xq2c\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-r8g28\" (UID: \"66d71651-3d44-4bef-8259-19908c843f85\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-r8g28" Dec 01 08:53:46 crc kubenswrapper[5004]: I1201 08:53:46.510305 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xq2c\" (UniqueName: \"kubernetes.io/projected/66d71651-3d44-4bef-8259-19908c843f85-kube-api-access-7xq2c\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-r8g28\" (UID: \"66d71651-3d44-4bef-8259-19908c843f85\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-r8g28" Dec 01 08:53:46 crc kubenswrapper[5004]: I1201 08:53:46.510441 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/66d71651-3d44-4bef-8259-19908c843f85-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-r8g28\" (UID: \"66d71651-3d44-4bef-8259-19908c843f85\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-r8g28" Dec 01 08:53:46 crc kubenswrapper[5004]: I1201 08:53:46.510584 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/66d71651-3d44-4bef-8259-19908c843f85-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-r8g28\" (UID: \"66d71651-3d44-4bef-8259-19908c843f85\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-r8g28" Dec 01 08:53:46 crc kubenswrapper[5004]: I1201 08:53:46.518048 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/66d71651-3d44-4bef-8259-19908c843f85-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-r8g28\" (UID: 
\"66d71651-3d44-4bef-8259-19908c843f85\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-r8g28" Dec 01 08:53:46 crc kubenswrapper[5004]: I1201 08:53:46.519828 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/66d71651-3d44-4bef-8259-19908c843f85-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-r8g28\" (UID: \"66d71651-3d44-4bef-8259-19908c843f85\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-r8g28" Dec 01 08:53:46 crc kubenswrapper[5004]: I1201 08:53:46.533479 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xq2c\" (UniqueName: \"kubernetes.io/projected/66d71651-3d44-4bef-8259-19908c843f85-kube-api-access-7xq2c\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-r8g28\" (UID: \"66d71651-3d44-4bef-8259-19908c843f85\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-r8g28" Dec 01 08:53:46 crc kubenswrapper[5004]: I1201 08:53:46.648992 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-r8g28" Dec 01 08:53:47 crc kubenswrapper[5004]: I1201 08:53:47.169623 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-r8g28"] Dec 01 08:53:48 crc kubenswrapper[5004]: I1201 08:53:48.170531 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-r8g28" event={"ID":"66d71651-3d44-4bef-8259-19908c843f85","Type":"ContainerStarted","Data":"406f2ca73eb781ae9a7e574ffb1cd855624378d36a9a8960375901fde3fda3b2"} Dec 01 08:53:48 crc kubenswrapper[5004]: I1201 08:53:48.236585 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-zbkmv"] Dec 01 08:53:48 crc kubenswrapper[5004]: I1201 08:53:48.240407 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zbkmv" Dec 01 08:53:48 crc kubenswrapper[5004]: I1201 08:53:48.256192 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da692b22-b0be-4d72-9347-090f436454c5-utilities\") pod \"certified-operators-zbkmv\" (UID: \"da692b22-b0be-4d72-9347-090f436454c5\") " pod="openshift-marketplace/certified-operators-zbkmv" Dec 01 08:53:48 crc kubenswrapper[5004]: I1201 08:53:48.256331 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da692b22-b0be-4d72-9347-090f436454c5-catalog-content\") pod \"certified-operators-zbkmv\" (UID: \"da692b22-b0be-4d72-9347-090f436454c5\") " pod="openshift-marketplace/certified-operators-zbkmv" Dec 01 08:53:48 crc kubenswrapper[5004]: I1201 08:53:48.256476 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8z7hq\" (UniqueName: \"kubernetes.io/projected/da692b22-b0be-4d72-9347-090f436454c5-kube-api-access-8z7hq\") pod \"certified-operators-zbkmv\" (UID: \"da692b22-b0be-4d72-9347-090f436454c5\") " pod="openshift-marketplace/certified-operators-zbkmv" Dec 01 08:53:48 crc kubenswrapper[5004]: I1201 08:53:48.260207 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zbkmv"] Dec 01 08:53:48 crc kubenswrapper[5004]: I1201 08:53:48.358076 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da692b22-b0be-4d72-9347-090f436454c5-catalog-content\") pod \"certified-operators-zbkmv\" (UID: \"da692b22-b0be-4d72-9347-090f436454c5\") " pod="openshift-marketplace/certified-operators-zbkmv" Dec 01 08:53:48 crc kubenswrapper[5004]: I1201 08:53:48.358472 5004 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-8z7hq\" (UniqueName: \"kubernetes.io/projected/da692b22-b0be-4d72-9347-090f436454c5-kube-api-access-8z7hq\") pod \"certified-operators-zbkmv\" (UID: \"da692b22-b0be-4d72-9347-090f436454c5\") " pod="openshift-marketplace/certified-operators-zbkmv" Dec 01 08:53:48 crc kubenswrapper[5004]: I1201 08:53:48.358646 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da692b22-b0be-4d72-9347-090f436454c5-utilities\") pod \"certified-operators-zbkmv\" (UID: \"da692b22-b0be-4d72-9347-090f436454c5\") " pod="openshift-marketplace/certified-operators-zbkmv" Dec 01 08:53:48 crc kubenswrapper[5004]: I1201 08:53:48.359029 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da692b22-b0be-4d72-9347-090f436454c5-utilities\") pod \"certified-operators-zbkmv\" (UID: \"da692b22-b0be-4d72-9347-090f436454c5\") " pod="openshift-marketplace/certified-operators-zbkmv" Dec 01 08:53:48 crc kubenswrapper[5004]: I1201 08:53:48.359238 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da692b22-b0be-4d72-9347-090f436454c5-catalog-content\") pod \"certified-operators-zbkmv\" (UID: \"da692b22-b0be-4d72-9347-090f436454c5\") " pod="openshift-marketplace/certified-operators-zbkmv" Dec 01 08:53:48 crc kubenswrapper[5004]: I1201 08:53:48.379585 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8z7hq\" (UniqueName: \"kubernetes.io/projected/da692b22-b0be-4d72-9347-090f436454c5-kube-api-access-8z7hq\") pod \"certified-operators-zbkmv\" (UID: \"da692b22-b0be-4d72-9347-090f436454c5\") " pod="openshift-marketplace/certified-operators-zbkmv" Dec 01 08:53:48 crc kubenswrapper[5004]: I1201 08:53:48.585850 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zbkmv" Dec 01 08:53:49 crc kubenswrapper[5004]: I1201 08:53:49.155760 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zbkmv"] Dec 01 08:53:49 crc kubenswrapper[5004]: I1201 08:53:49.182499 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zbkmv" event={"ID":"da692b22-b0be-4d72-9347-090f436454c5","Type":"ContainerStarted","Data":"92b35395d5c48690e92b1570e6e83fd1bb93205ac219010080b6a6a3478ac901"} Dec 01 08:53:49 crc kubenswrapper[5004]: I1201 08:53:49.183737 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-r8g28" event={"ID":"66d71651-3d44-4bef-8259-19908c843f85","Type":"ContainerStarted","Data":"16049db11c165dd4cadce9c9064bb3f83fe0223b68352eb06441f25f8d234769"} Dec 01 08:53:49 crc kubenswrapper[5004]: I1201 08:53:49.215871 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-r8g28" podStartSLOduration=2.474698441 podStartE2EDuration="3.215842336s" podCreationTimestamp="2025-12-01 08:53:46 +0000 UTC" firstStartedPulling="2025-12-01 08:53:47.179579084 +0000 UTC m=+2204.744571086" lastFinishedPulling="2025-12-01 08:53:47.920722999 +0000 UTC m=+2205.485714981" observedRunningTime="2025-12-01 08:53:49.20289784 +0000 UTC m=+2206.767889822" watchObservedRunningTime="2025-12-01 08:53:49.215842336 +0000 UTC m=+2206.780834358" Dec 01 08:53:50 crc kubenswrapper[5004]: I1201 08:53:50.202453 5004 generic.go:334] "Generic (PLEG): container finished" podID="da692b22-b0be-4d72-9347-090f436454c5" containerID="65ee1ffcf2a4cb1964b32a0b8fa41d3083829bfd3a0494d946dad5c47521f956" exitCode=0 Dec 01 08:53:50 crc kubenswrapper[5004]: I1201 08:53:50.202513 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zbkmv" 
event={"ID":"da692b22-b0be-4d72-9347-090f436454c5","Type":"ContainerDied","Data":"65ee1ffcf2a4cb1964b32a0b8fa41d3083829bfd3a0494d946dad5c47521f956"} Dec 01 08:53:52 crc kubenswrapper[5004]: I1201 08:53:52.228425 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zbkmv" event={"ID":"da692b22-b0be-4d72-9347-090f436454c5","Type":"ContainerStarted","Data":"852c5c481c049eedc045209c6150da68c0d886e0576715cbdcf2be3844428d9e"} Dec 01 08:53:54 crc kubenswrapper[5004]: I1201 08:53:54.261360 5004 generic.go:334] "Generic (PLEG): container finished" podID="da692b22-b0be-4d72-9347-090f436454c5" containerID="852c5c481c049eedc045209c6150da68c0d886e0576715cbdcf2be3844428d9e" exitCode=0 Dec 01 08:53:54 crc kubenswrapper[5004]: I1201 08:53:54.261712 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zbkmv" event={"ID":"da692b22-b0be-4d72-9347-090f436454c5","Type":"ContainerDied","Data":"852c5c481c049eedc045209c6150da68c0d886e0576715cbdcf2be3844428d9e"} Dec 01 08:53:56 crc kubenswrapper[5004]: I1201 08:53:56.289780 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zbkmv" event={"ID":"da692b22-b0be-4d72-9347-090f436454c5","Type":"ContainerStarted","Data":"fe9f2ef0d726487cc6d95171ab86d1e3fd12427b1c4f7db3552b9adba8c2112a"} Dec 01 08:53:56 crc kubenswrapper[5004]: I1201 08:53:56.335760 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-zbkmv" podStartSLOduration=3.209322324 podStartE2EDuration="8.335733044s" podCreationTimestamp="2025-12-01 08:53:48 +0000 UTC" firstStartedPulling="2025-12-01 08:53:50.20521217 +0000 UTC m=+2207.770204162" lastFinishedPulling="2025-12-01 08:53:55.33162291 +0000 UTC m=+2212.896614882" observedRunningTime="2025-12-01 08:53:56.316202958 +0000 UTC m=+2213.881194990" watchObservedRunningTime="2025-12-01 08:53:56.335733044 +0000 UTC 
m=+2213.900725066" Dec 01 08:53:58 crc kubenswrapper[5004]: I1201 08:53:58.587287 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-zbkmv" Dec 01 08:53:58 crc kubenswrapper[5004]: I1201 08:53:58.588174 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-zbkmv" Dec 01 08:53:58 crc kubenswrapper[5004]: I1201 08:53:58.671185 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-zbkmv" Dec 01 08:54:08 crc kubenswrapper[5004]: I1201 08:54:08.651728 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-zbkmv" Dec 01 08:54:08 crc kubenswrapper[5004]: I1201 08:54:08.721334 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zbkmv"] Dec 01 08:54:09 crc kubenswrapper[5004]: I1201 08:54:09.455057 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-zbkmv" podUID="da692b22-b0be-4d72-9347-090f436454c5" containerName="registry-server" containerID="cri-o://fe9f2ef0d726487cc6d95171ab86d1e3fd12427b1c4f7db3552b9adba8c2112a" gracePeriod=2 Dec 01 08:54:09 crc kubenswrapper[5004]: I1201 08:54:09.977462 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zbkmv" Dec 01 08:54:10 crc kubenswrapper[5004]: I1201 08:54:10.046358 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da692b22-b0be-4d72-9347-090f436454c5-utilities\") pod \"da692b22-b0be-4d72-9347-090f436454c5\" (UID: \"da692b22-b0be-4d72-9347-090f436454c5\") " Dec 01 08:54:10 crc kubenswrapper[5004]: I1201 08:54:10.046612 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da692b22-b0be-4d72-9347-090f436454c5-catalog-content\") pod \"da692b22-b0be-4d72-9347-090f436454c5\" (UID: \"da692b22-b0be-4d72-9347-090f436454c5\") " Dec 01 08:54:10 crc kubenswrapper[5004]: I1201 08:54:10.046728 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8z7hq\" (UniqueName: \"kubernetes.io/projected/da692b22-b0be-4d72-9347-090f436454c5-kube-api-access-8z7hq\") pod \"da692b22-b0be-4d72-9347-090f436454c5\" (UID: \"da692b22-b0be-4d72-9347-090f436454c5\") " Dec 01 08:54:10 crc kubenswrapper[5004]: I1201 08:54:10.048374 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da692b22-b0be-4d72-9347-090f436454c5-utilities" (OuterVolumeSpecName: "utilities") pod "da692b22-b0be-4d72-9347-090f436454c5" (UID: "da692b22-b0be-4d72-9347-090f436454c5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 08:54:10 crc kubenswrapper[5004]: I1201 08:54:10.060417 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da692b22-b0be-4d72-9347-090f436454c5-kube-api-access-8z7hq" (OuterVolumeSpecName: "kube-api-access-8z7hq") pod "da692b22-b0be-4d72-9347-090f436454c5" (UID: "da692b22-b0be-4d72-9347-090f436454c5"). InnerVolumeSpecName "kube-api-access-8z7hq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:54:10 crc kubenswrapper[5004]: I1201 08:54:10.098832 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da692b22-b0be-4d72-9347-090f436454c5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "da692b22-b0be-4d72-9347-090f436454c5" (UID: "da692b22-b0be-4d72-9347-090f436454c5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 08:54:10 crc kubenswrapper[5004]: I1201 08:54:10.149159 5004 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da692b22-b0be-4d72-9347-090f436454c5-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 08:54:10 crc kubenswrapper[5004]: I1201 08:54:10.149192 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8z7hq\" (UniqueName: \"kubernetes.io/projected/da692b22-b0be-4d72-9347-090f436454c5-kube-api-access-8z7hq\") on node \"crc\" DevicePath \"\"" Dec 01 08:54:10 crc kubenswrapper[5004]: I1201 08:54:10.149204 5004 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da692b22-b0be-4d72-9347-090f436454c5-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 08:54:10 crc kubenswrapper[5004]: I1201 08:54:10.469331 5004 generic.go:334] "Generic (PLEG): container finished" podID="da692b22-b0be-4d72-9347-090f436454c5" containerID="fe9f2ef0d726487cc6d95171ab86d1e3fd12427b1c4f7db3552b9adba8c2112a" exitCode=0 Dec 01 08:54:10 crc kubenswrapper[5004]: I1201 08:54:10.469379 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zbkmv" event={"ID":"da692b22-b0be-4d72-9347-090f436454c5","Type":"ContainerDied","Data":"fe9f2ef0d726487cc6d95171ab86d1e3fd12427b1c4f7db3552b9adba8c2112a"} Dec 01 08:54:10 crc kubenswrapper[5004]: I1201 08:54:10.469387 5004 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zbkmv" Dec 01 08:54:10 crc kubenswrapper[5004]: I1201 08:54:10.469416 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zbkmv" event={"ID":"da692b22-b0be-4d72-9347-090f436454c5","Type":"ContainerDied","Data":"92b35395d5c48690e92b1570e6e83fd1bb93205ac219010080b6a6a3478ac901"} Dec 01 08:54:10 crc kubenswrapper[5004]: I1201 08:54:10.469437 5004 scope.go:117] "RemoveContainer" containerID="fe9f2ef0d726487cc6d95171ab86d1e3fd12427b1c4f7db3552b9adba8c2112a" Dec 01 08:54:10 crc kubenswrapper[5004]: I1201 08:54:10.501078 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zbkmv"] Dec 01 08:54:10 crc kubenswrapper[5004]: I1201 08:54:10.503937 5004 scope.go:117] "RemoveContainer" containerID="852c5c481c049eedc045209c6150da68c0d886e0576715cbdcf2be3844428d9e" Dec 01 08:54:10 crc kubenswrapper[5004]: I1201 08:54:10.512494 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-zbkmv"] Dec 01 08:54:10 crc kubenswrapper[5004]: I1201 08:54:10.537895 5004 scope.go:117] "RemoveContainer" containerID="65ee1ffcf2a4cb1964b32a0b8fa41d3083829bfd3a0494d946dad5c47521f956" Dec 01 08:54:10 crc kubenswrapper[5004]: I1201 08:54:10.573927 5004 scope.go:117] "RemoveContainer" containerID="fe9f2ef0d726487cc6d95171ab86d1e3fd12427b1c4f7db3552b9adba8c2112a" Dec 01 08:54:10 crc kubenswrapper[5004]: E1201 08:54:10.574515 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe9f2ef0d726487cc6d95171ab86d1e3fd12427b1c4f7db3552b9adba8c2112a\": container with ID starting with fe9f2ef0d726487cc6d95171ab86d1e3fd12427b1c4f7db3552b9adba8c2112a not found: ID does not exist" containerID="fe9f2ef0d726487cc6d95171ab86d1e3fd12427b1c4f7db3552b9adba8c2112a" Dec 01 08:54:10 crc kubenswrapper[5004]: I1201 08:54:10.574650 
5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe9f2ef0d726487cc6d95171ab86d1e3fd12427b1c4f7db3552b9adba8c2112a"} err="failed to get container status \"fe9f2ef0d726487cc6d95171ab86d1e3fd12427b1c4f7db3552b9adba8c2112a\": rpc error: code = NotFound desc = could not find container \"fe9f2ef0d726487cc6d95171ab86d1e3fd12427b1c4f7db3552b9adba8c2112a\": container with ID starting with fe9f2ef0d726487cc6d95171ab86d1e3fd12427b1c4f7db3552b9adba8c2112a not found: ID does not exist" Dec 01 08:54:10 crc kubenswrapper[5004]: I1201 08:54:10.574745 5004 scope.go:117] "RemoveContainer" containerID="852c5c481c049eedc045209c6150da68c0d886e0576715cbdcf2be3844428d9e" Dec 01 08:54:10 crc kubenswrapper[5004]: E1201 08:54:10.575244 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"852c5c481c049eedc045209c6150da68c0d886e0576715cbdcf2be3844428d9e\": container with ID starting with 852c5c481c049eedc045209c6150da68c0d886e0576715cbdcf2be3844428d9e not found: ID does not exist" containerID="852c5c481c049eedc045209c6150da68c0d886e0576715cbdcf2be3844428d9e" Dec 01 08:54:10 crc kubenswrapper[5004]: I1201 08:54:10.575290 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"852c5c481c049eedc045209c6150da68c0d886e0576715cbdcf2be3844428d9e"} err="failed to get container status \"852c5c481c049eedc045209c6150da68c0d886e0576715cbdcf2be3844428d9e\": rpc error: code = NotFound desc = could not find container \"852c5c481c049eedc045209c6150da68c0d886e0576715cbdcf2be3844428d9e\": container with ID starting with 852c5c481c049eedc045209c6150da68c0d886e0576715cbdcf2be3844428d9e not found: ID does not exist" Dec 01 08:54:10 crc kubenswrapper[5004]: I1201 08:54:10.575320 5004 scope.go:117] "RemoveContainer" containerID="65ee1ffcf2a4cb1964b32a0b8fa41d3083829bfd3a0494d946dad5c47521f956" Dec 01 08:54:10 crc kubenswrapper[5004]: E1201 
08:54:10.575633 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65ee1ffcf2a4cb1964b32a0b8fa41d3083829bfd3a0494d946dad5c47521f956\": container with ID starting with 65ee1ffcf2a4cb1964b32a0b8fa41d3083829bfd3a0494d946dad5c47521f956 not found: ID does not exist" containerID="65ee1ffcf2a4cb1964b32a0b8fa41d3083829bfd3a0494d946dad5c47521f956" Dec 01 08:54:10 crc kubenswrapper[5004]: I1201 08:54:10.575658 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65ee1ffcf2a4cb1964b32a0b8fa41d3083829bfd3a0494d946dad5c47521f956"} err="failed to get container status \"65ee1ffcf2a4cb1964b32a0b8fa41d3083829bfd3a0494d946dad5c47521f956\": rpc error: code = NotFound desc = could not find container \"65ee1ffcf2a4cb1964b32a0b8fa41d3083829bfd3a0494d946dad5c47521f956\": container with ID starting with 65ee1ffcf2a4cb1964b32a0b8fa41d3083829bfd3a0494d946dad5c47521f956 not found: ID does not exist" Dec 01 08:54:10 crc kubenswrapper[5004]: I1201 08:54:10.782660 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da692b22-b0be-4d72-9347-090f436454c5" path="/var/lib/kubelet/pods/da692b22-b0be-4d72-9347-090f436454c5/volumes" Dec 01 08:54:16 crc kubenswrapper[5004]: I1201 08:54:16.723151 5004 scope.go:117] "RemoveContainer" containerID="eefbec68660f5a6a8770dc5ee6f71f3c7b18cef614bc4087bac9a2369ec84460" Dec 01 08:54:30 crc kubenswrapper[5004]: I1201 08:54:30.730781 5004 generic.go:334] "Generic (PLEG): container finished" podID="66d71651-3d44-4bef-8259-19908c843f85" containerID="16049db11c165dd4cadce9c9064bb3f83fe0223b68352eb06441f25f8d234769" exitCode=0 Dec 01 08:54:30 crc kubenswrapper[5004]: I1201 08:54:30.730860 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-r8g28" 
event={"ID":"66d71651-3d44-4bef-8259-19908c843f85","Type":"ContainerDied","Data":"16049db11c165dd4cadce9c9064bb3f83fe0223b68352eb06441f25f8d234769"} Dec 01 08:54:32 crc kubenswrapper[5004]: I1201 08:54:32.318019 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-r8g28" Dec 01 08:54:32 crc kubenswrapper[5004]: I1201 08:54:32.475454 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/66d71651-3d44-4bef-8259-19908c843f85-inventory\") pod \"66d71651-3d44-4bef-8259-19908c843f85\" (UID: \"66d71651-3d44-4bef-8259-19908c843f85\") " Dec 01 08:54:32 crc kubenswrapper[5004]: I1201 08:54:32.476041 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7xq2c\" (UniqueName: \"kubernetes.io/projected/66d71651-3d44-4bef-8259-19908c843f85-kube-api-access-7xq2c\") pod \"66d71651-3d44-4bef-8259-19908c843f85\" (UID: \"66d71651-3d44-4bef-8259-19908c843f85\") " Dec 01 08:54:32 crc kubenswrapper[5004]: I1201 08:54:32.476073 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/66d71651-3d44-4bef-8259-19908c843f85-ssh-key\") pod \"66d71651-3d44-4bef-8259-19908c843f85\" (UID: \"66d71651-3d44-4bef-8259-19908c843f85\") " Dec 01 08:54:32 crc kubenswrapper[5004]: I1201 08:54:32.490335 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66d71651-3d44-4bef-8259-19908c843f85-kube-api-access-7xq2c" (OuterVolumeSpecName: "kube-api-access-7xq2c") pod "66d71651-3d44-4bef-8259-19908c843f85" (UID: "66d71651-3d44-4bef-8259-19908c843f85"). InnerVolumeSpecName "kube-api-access-7xq2c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:54:32 crc kubenswrapper[5004]: I1201 08:54:32.511279 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66d71651-3d44-4bef-8259-19908c843f85-inventory" (OuterVolumeSpecName: "inventory") pod "66d71651-3d44-4bef-8259-19908c843f85" (UID: "66d71651-3d44-4bef-8259-19908c843f85"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:54:32 crc kubenswrapper[5004]: I1201 08:54:32.513870 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66d71651-3d44-4bef-8259-19908c843f85-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "66d71651-3d44-4bef-8259-19908c843f85" (UID: "66d71651-3d44-4bef-8259-19908c843f85"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:54:32 crc kubenswrapper[5004]: I1201 08:54:32.579217 5004 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/66d71651-3d44-4bef-8259-19908c843f85-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 08:54:32 crc kubenswrapper[5004]: I1201 08:54:32.579255 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7xq2c\" (UniqueName: \"kubernetes.io/projected/66d71651-3d44-4bef-8259-19908c843f85-kube-api-access-7xq2c\") on node \"crc\" DevicePath \"\"" Dec 01 08:54:32 crc kubenswrapper[5004]: I1201 08:54:32.579269 5004 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/66d71651-3d44-4bef-8259-19908c843f85-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 08:54:32 crc kubenswrapper[5004]: I1201 08:54:32.805684 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-r8g28" Dec 01 08:54:32 crc kubenswrapper[5004]: I1201 08:54:32.836740 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-r8g28" event={"ID":"66d71651-3d44-4bef-8259-19908c843f85","Type":"ContainerDied","Data":"406f2ca73eb781ae9a7e574ffb1cd855624378d36a9a8960375901fde3fda3b2"} Dec 01 08:54:32 crc kubenswrapper[5004]: I1201 08:54:32.836786 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="406f2ca73eb781ae9a7e574ffb1cd855624378d36a9a8960375901fde3fda3b2" Dec 01 08:54:32 crc kubenswrapper[5004]: I1201 08:54:32.872786 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p7x8j"] Dec 01 08:54:32 crc kubenswrapper[5004]: E1201 08:54:32.873305 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da692b22-b0be-4d72-9347-090f436454c5" containerName="extract-utilities" Dec 01 08:54:32 crc kubenswrapper[5004]: I1201 08:54:32.873323 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="da692b22-b0be-4d72-9347-090f436454c5" containerName="extract-utilities" Dec 01 08:54:32 crc kubenswrapper[5004]: E1201 08:54:32.873347 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da692b22-b0be-4d72-9347-090f436454c5" containerName="registry-server" Dec 01 08:54:32 crc kubenswrapper[5004]: I1201 08:54:32.873354 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="da692b22-b0be-4d72-9347-090f436454c5" containerName="registry-server" Dec 01 08:54:32 crc kubenswrapper[5004]: E1201 08:54:32.873380 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da692b22-b0be-4d72-9347-090f436454c5" containerName="extract-content" Dec 01 08:54:32 crc kubenswrapper[5004]: I1201 08:54:32.873386 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="da692b22-b0be-4d72-9347-090f436454c5" 
containerName="extract-content" Dec 01 08:54:32 crc kubenswrapper[5004]: E1201 08:54:32.873396 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66d71651-3d44-4bef-8259-19908c843f85" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 01 08:54:32 crc kubenswrapper[5004]: I1201 08:54:32.873404 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="66d71651-3d44-4bef-8259-19908c843f85" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 01 08:54:32 crc kubenswrapper[5004]: I1201 08:54:32.873634 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="da692b22-b0be-4d72-9347-090f436454c5" containerName="registry-server" Dec 01 08:54:32 crc kubenswrapper[5004]: I1201 08:54:32.873665 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="66d71651-3d44-4bef-8259-19908c843f85" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 01 08:54:32 crc kubenswrapper[5004]: I1201 08:54:32.874497 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p7x8j" Dec 01 08:54:32 crc kubenswrapper[5004]: I1201 08:54:32.876332 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 08:54:32 crc kubenswrapper[5004]: I1201 08:54:32.877529 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 08:54:32 crc kubenswrapper[5004]: I1201 08:54:32.877718 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 08:54:32 crc kubenswrapper[5004]: I1201 08:54:32.877829 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-pdnrq" Dec 01 08:54:32 crc kubenswrapper[5004]: I1201 08:54:32.899861 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p7x8j"] Dec 01 08:54:32 crc kubenswrapper[5004]: I1201 08:54:32.994729 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/604b379e-d0f5-469b-abcd-6c9717007b3f-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-p7x8j\" (UID: \"604b379e-d0f5-469b-abcd-6c9717007b3f\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p7x8j" Dec 01 08:54:32 crc kubenswrapper[5004]: I1201 08:54:32.994842 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hk7s\" (UniqueName: \"kubernetes.io/projected/604b379e-d0f5-469b-abcd-6c9717007b3f-kube-api-access-9hk7s\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-p7x8j\" (UID: \"604b379e-d0f5-469b-abcd-6c9717007b3f\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p7x8j" Dec 01 08:54:32 crc kubenswrapper[5004]: I1201 08:54:32.994901 5004 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/604b379e-d0f5-469b-abcd-6c9717007b3f-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-p7x8j\" (UID: \"604b379e-d0f5-469b-abcd-6c9717007b3f\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p7x8j" Dec 01 08:54:33 crc kubenswrapper[5004]: I1201 08:54:33.097240 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9hk7s\" (UniqueName: \"kubernetes.io/projected/604b379e-d0f5-469b-abcd-6c9717007b3f-kube-api-access-9hk7s\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-p7x8j\" (UID: \"604b379e-d0f5-469b-abcd-6c9717007b3f\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p7x8j" Dec 01 08:54:33 crc kubenswrapper[5004]: I1201 08:54:33.097364 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/604b379e-d0f5-469b-abcd-6c9717007b3f-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-p7x8j\" (UID: \"604b379e-d0f5-469b-abcd-6c9717007b3f\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p7x8j" Dec 01 08:54:33 crc kubenswrapper[5004]: I1201 08:54:33.097532 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/604b379e-d0f5-469b-abcd-6c9717007b3f-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-p7x8j\" (UID: \"604b379e-d0f5-469b-abcd-6c9717007b3f\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p7x8j" Dec 01 08:54:33 crc kubenswrapper[5004]: I1201 08:54:33.107469 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/604b379e-d0f5-469b-abcd-6c9717007b3f-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-p7x8j\" (UID: 
\"604b379e-d0f5-469b-abcd-6c9717007b3f\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p7x8j" Dec 01 08:54:33 crc kubenswrapper[5004]: I1201 08:54:33.108032 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/604b379e-d0f5-469b-abcd-6c9717007b3f-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-p7x8j\" (UID: \"604b379e-d0f5-469b-abcd-6c9717007b3f\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p7x8j" Dec 01 08:54:33 crc kubenswrapper[5004]: I1201 08:54:33.119144 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hk7s\" (UniqueName: \"kubernetes.io/projected/604b379e-d0f5-469b-abcd-6c9717007b3f-kube-api-access-9hk7s\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-p7x8j\" (UID: \"604b379e-d0f5-469b-abcd-6c9717007b3f\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p7x8j" Dec 01 08:54:33 crc kubenswrapper[5004]: I1201 08:54:33.235308 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p7x8j" Dec 01 08:54:33 crc kubenswrapper[5004]: I1201 08:54:33.849496 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p7x8j"] Dec 01 08:54:34 crc kubenswrapper[5004]: I1201 08:54:34.831488 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p7x8j" event={"ID":"604b379e-d0f5-469b-abcd-6c9717007b3f","Type":"ContainerStarted","Data":"860df5014b6aeea94620c6d2cc51f70a50d28fb2bf31e0d0148100c15380b0dc"} Dec 01 08:54:34 crc kubenswrapper[5004]: I1201 08:54:34.832362 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p7x8j" event={"ID":"604b379e-d0f5-469b-abcd-6c9717007b3f","Type":"ContainerStarted","Data":"434fcdffceb50ffc3c2a0f2dad8e74bb7324265e9ed6475453ac758b0d4b893a"} Dec 01 08:54:34 crc kubenswrapper[5004]: I1201 08:54:34.875106 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p7x8j" podStartSLOduration=2.276378187 podStartE2EDuration="2.87507948s" podCreationTimestamp="2025-12-01 08:54:32 +0000 UTC" firstStartedPulling="2025-12-01 08:54:33.850830635 +0000 UTC m=+2251.415822617" lastFinishedPulling="2025-12-01 08:54:34.449531888 +0000 UTC m=+2252.014523910" observedRunningTime="2025-12-01 08:54:34.852244553 +0000 UTC m=+2252.417236545" watchObservedRunningTime="2025-12-01 08:54:34.87507948 +0000 UTC m=+2252.440071482" Dec 01 08:55:32 crc kubenswrapper[5004]: I1201 08:55:32.060011 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-sync-sskn9"] Dec 01 08:55:32 crc kubenswrapper[5004]: I1201 08:55:32.072307 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-sync-sskn9"] Dec 01 08:55:32 crc kubenswrapper[5004]: I1201 08:55:32.774330 5004 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1fc1c621-41a4-4fcd-ab52-dd45c3d82080" path="/var/lib/kubelet/pods/1fc1c621-41a4-4fcd-ab52-dd45c3d82080/volumes" Dec 01 08:55:35 crc kubenswrapper[5004]: I1201 08:55:35.651195 5004 generic.go:334] "Generic (PLEG): container finished" podID="604b379e-d0f5-469b-abcd-6c9717007b3f" containerID="860df5014b6aeea94620c6d2cc51f70a50d28fb2bf31e0d0148100c15380b0dc" exitCode=0 Dec 01 08:55:35 crc kubenswrapper[5004]: I1201 08:55:35.651340 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p7x8j" event={"ID":"604b379e-d0f5-469b-abcd-6c9717007b3f","Type":"ContainerDied","Data":"860df5014b6aeea94620c6d2cc51f70a50d28fb2bf31e0d0148100c15380b0dc"} Dec 01 08:55:37 crc kubenswrapper[5004]: I1201 08:55:37.303476 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p7x8j" Dec 01 08:55:37 crc kubenswrapper[5004]: I1201 08:55:37.447139 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/604b379e-d0f5-469b-abcd-6c9717007b3f-ssh-key\") pod \"604b379e-d0f5-469b-abcd-6c9717007b3f\" (UID: \"604b379e-d0f5-469b-abcd-6c9717007b3f\") " Dec 01 08:55:37 crc kubenswrapper[5004]: I1201 08:55:37.447211 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/604b379e-d0f5-469b-abcd-6c9717007b3f-inventory\") pod \"604b379e-d0f5-469b-abcd-6c9717007b3f\" (UID: \"604b379e-d0f5-469b-abcd-6c9717007b3f\") " Dec 01 08:55:37 crc kubenswrapper[5004]: I1201 08:55:37.447501 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9hk7s\" (UniqueName: \"kubernetes.io/projected/604b379e-d0f5-469b-abcd-6c9717007b3f-kube-api-access-9hk7s\") pod \"604b379e-d0f5-469b-abcd-6c9717007b3f\" (UID: 
\"604b379e-d0f5-469b-abcd-6c9717007b3f\") " Dec 01 08:55:37 crc kubenswrapper[5004]: I1201 08:55:37.455131 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/604b379e-d0f5-469b-abcd-6c9717007b3f-kube-api-access-9hk7s" (OuterVolumeSpecName: "kube-api-access-9hk7s") pod "604b379e-d0f5-469b-abcd-6c9717007b3f" (UID: "604b379e-d0f5-469b-abcd-6c9717007b3f"). InnerVolumeSpecName "kube-api-access-9hk7s". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:55:37 crc kubenswrapper[5004]: I1201 08:55:37.486959 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/604b379e-d0f5-469b-abcd-6c9717007b3f-inventory" (OuterVolumeSpecName: "inventory") pod "604b379e-d0f5-469b-abcd-6c9717007b3f" (UID: "604b379e-d0f5-469b-abcd-6c9717007b3f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:55:37 crc kubenswrapper[5004]: I1201 08:55:37.496044 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/604b379e-d0f5-469b-abcd-6c9717007b3f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "604b379e-d0f5-469b-abcd-6c9717007b3f" (UID: "604b379e-d0f5-469b-abcd-6c9717007b3f"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:55:37 crc kubenswrapper[5004]: I1201 08:55:37.551006 5004 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/604b379e-d0f5-469b-abcd-6c9717007b3f-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 08:55:37 crc kubenswrapper[5004]: I1201 08:55:37.551050 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9hk7s\" (UniqueName: \"kubernetes.io/projected/604b379e-d0f5-469b-abcd-6c9717007b3f-kube-api-access-9hk7s\") on node \"crc\" DevicePath \"\"" Dec 01 08:55:37 crc kubenswrapper[5004]: I1201 08:55:37.551066 5004 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/604b379e-d0f5-469b-abcd-6c9717007b3f-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 08:55:37 crc kubenswrapper[5004]: I1201 08:55:37.678809 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p7x8j" event={"ID":"604b379e-d0f5-469b-abcd-6c9717007b3f","Type":"ContainerDied","Data":"434fcdffceb50ffc3c2a0f2dad8e74bb7324265e9ed6475453ac758b0d4b893a"} Dec 01 08:55:37 crc kubenswrapper[5004]: I1201 08:55:37.678859 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="434fcdffceb50ffc3c2a0f2dad8e74bb7324265e9ed6475453ac758b0d4b893a" Dec 01 08:55:37 crc kubenswrapper[5004]: I1201 08:55:37.678880 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-p7x8j" Dec 01 08:55:37 crc kubenswrapper[5004]: I1201 08:55:37.807226 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-jhdl8"] Dec 01 08:55:37 crc kubenswrapper[5004]: E1201 08:55:37.807736 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="604b379e-d0f5-469b-abcd-6c9717007b3f" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 01 08:55:37 crc kubenswrapper[5004]: I1201 08:55:37.807754 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="604b379e-d0f5-469b-abcd-6c9717007b3f" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 01 08:55:37 crc kubenswrapper[5004]: I1201 08:55:37.808006 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="604b379e-d0f5-469b-abcd-6c9717007b3f" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 01 08:55:37 crc kubenswrapper[5004]: I1201 08:55:37.809406 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-jhdl8" Dec 01 08:55:37 crc kubenswrapper[5004]: I1201 08:55:37.823363 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 08:55:37 crc kubenswrapper[5004]: I1201 08:55:37.835715 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 08:55:37 crc kubenswrapper[5004]: I1201 08:55:37.836229 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 08:55:37 crc kubenswrapper[5004]: I1201 08:55:37.836486 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-pdnrq" Dec 01 08:55:37 crc kubenswrapper[5004]: I1201 08:55:37.842292 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-jhdl8"] Dec 01 08:55:37 crc kubenswrapper[5004]: I1201 08:55:37.859072 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/47faa1f0-89dc-4cf2-a026-b425da197aaf-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-jhdl8\" (UID: \"47faa1f0-89dc-4cf2-a026-b425da197aaf\") " pod="openstack/ssh-known-hosts-edpm-deployment-jhdl8" Dec 01 08:55:37 crc kubenswrapper[5004]: I1201 08:55:37.859325 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7j5pg\" (UniqueName: \"kubernetes.io/projected/47faa1f0-89dc-4cf2-a026-b425da197aaf-kube-api-access-7j5pg\") pod \"ssh-known-hosts-edpm-deployment-jhdl8\" (UID: \"47faa1f0-89dc-4cf2-a026-b425da197aaf\") " pod="openstack/ssh-known-hosts-edpm-deployment-jhdl8" Dec 01 08:55:37 crc kubenswrapper[5004]: I1201 08:55:37.859507 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/47faa1f0-89dc-4cf2-a026-b425da197aaf-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-jhdl8\" (UID: \"47faa1f0-89dc-4cf2-a026-b425da197aaf\") " pod="openstack/ssh-known-hosts-edpm-deployment-jhdl8" Dec 01 08:55:37 crc kubenswrapper[5004]: I1201 08:55:37.961217 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/47faa1f0-89dc-4cf2-a026-b425da197aaf-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-jhdl8\" (UID: \"47faa1f0-89dc-4cf2-a026-b425da197aaf\") " pod="openstack/ssh-known-hosts-edpm-deployment-jhdl8" Dec 01 08:55:37 crc kubenswrapper[5004]: I1201 08:55:37.961691 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7j5pg\" (UniqueName: \"kubernetes.io/projected/47faa1f0-89dc-4cf2-a026-b425da197aaf-kube-api-access-7j5pg\") pod \"ssh-known-hosts-edpm-deployment-jhdl8\" (UID: \"47faa1f0-89dc-4cf2-a026-b425da197aaf\") " pod="openstack/ssh-known-hosts-edpm-deployment-jhdl8" Dec 01 08:55:37 crc kubenswrapper[5004]: I1201 08:55:37.961794 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/47faa1f0-89dc-4cf2-a026-b425da197aaf-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-jhdl8\" (UID: \"47faa1f0-89dc-4cf2-a026-b425da197aaf\") " pod="openstack/ssh-known-hosts-edpm-deployment-jhdl8" Dec 01 08:55:37 crc kubenswrapper[5004]: I1201 08:55:37.965382 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/47faa1f0-89dc-4cf2-a026-b425da197aaf-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-jhdl8\" (UID: \"47faa1f0-89dc-4cf2-a026-b425da197aaf\") " pod="openstack/ssh-known-hosts-edpm-deployment-jhdl8" Dec 01 08:55:37 crc kubenswrapper[5004]: 
I1201 08:55:37.965861 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/47faa1f0-89dc-4cf2-a026-b425da197aaf-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-jhdl8\" (UID: \"47faa1f0-89dc-4cf2-a026-b425da197aaf\") " pod="openstack/ssh-known-hosts-edpm-deployment-jhdl8" Dec 01 08:55:37 crc kubenswrapper[5004]: I1201 08:55:37.978503 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7j5pg\" (UniqueName: \"kubernetes.io/projected/47faa1f0-89dc-4cf2-a026-b425da197aaf-kube-api-access-7j5pg\") pod \"ssh-known-hosts-edpm-deployment-jhdl8\" (UID: \"47faa1f0-89dc-4cf2-a026-b425da197aaf\") " pod="openstack/ssh-known-hosts-edpm-deployment-jhdl8" Dec 01 08:55:38 crc kubenswrapper[5004]: I1201 08:55:38.145529 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-jhdl8" Dec 01 08:55:38 crc kubenswrapper[5004]: I1201 08:55:38.721603 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-jhdl8"] Dec 01 08:55:38 crc kubenswrapper[5004]: I1201 08:55:38.725671 5004 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 01 08:55:38 crc kubenswrapper[5004]: I1201 08:55:38.729120 5004 patch_prober.go:28] interesting pod/machine-config-daemon-fvdgt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 08:55:38 crc kubenswrapper[5004]: I1201 08:55:38.729193 5004 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Dec 01 08:55:39 crc kubenswrapper[5004]: I1201 08:55:39.702007 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-jhdl8" event={"ID":"47faa1f0-89dc-4cf2-a026-b425da197aaf","Type":"ContainerStarted","Data":"d7c038c7827381be4fe4907c36af82c4f819ba66a92bedb8c979fcfc2a578de3"} Dec 01 08:55:40 crc kubenswrapper[5004]: I1201 08:55:40.718045 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-jhdl8" event={"ID":"47faa1f0-89dc-4cf2-a026-b425da197aaf","Type":"ContainerStarted","Data":"37837df72df19b4db248d06397f6a7a4cc2970a857350de23bf7624bd80cf57c"} Dec 01 08:55:40 crc kubenswrapper[5004]: I1201 08:55:40.738138 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-jhdl8" podStartSLOduration=2.538810154 podStartE2EDuration="3.738114745s" podCreationTimestamp="2025-12-01 08:55:37 +0000 UTC" firstStartedPulling="2025-12-01 08:55:38.72546427 +0000 UTC m=+2316.290456252" lastFinishedPulling="2025-12-01 08:55:39.924768851 +0000 UTC m=+2317.489760843" observedRunningTime="2025-12-01 08:55:40.735119532 +0000 UTC m=+2318.300111514" watchObservedRunningTime="2025-12-01 08:55:40.738114745 +0000 UTC m=+2318.303106757" Dec 01 08:55:48 crc kubenswrapper[5004]: I1201 08:55:48.824018 5004 generic.go:334] "Generic (PLEG): container finished" podID="47faa1f0-89dc-4cf2-a026-b425da197aaf" containerID="37837df72df19b4db248d06397f6a7a4cc2970a857350de23bf7624bd80cf57c" exitCode=0 Dec 01 08:55:48 crc kubenswrapper[5004]: I1201 08:55:48.824433 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-jhdl8" event={"ID":"47faa1f0-89dc-4cf2-a026-b425da197aaf","Type":"ContainerDied","Data":"37837df72df19b4db248d06397f6a7a4cc2970a857350de23bf7624bd80cf57c"} Dec 01 08:55:50 crc kubenswrapper[5004]: I1201 08:55:50.415719 5004 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-jhdl8" Dec 01 08:55:50 crc kubenswrapper[5004]: I1201 08:55:50.513495 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/47faa1f0-89dc-4cf2-a026-b425da197aaf-inventory-0\") pod \"47faa1f0-89dc-4cf2-a026-b425da197aaf\" (UID: \"47faa1f0-89dc-4cf2-a026-b425da197aaf\") " Dec 01 08:55:50 crc kubenswrapper[5004]: I1201 08:55:50.513800 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/47faa1f0-89dc-4cf2-a026-b425da197aaf-ssh-key-openstack-edpm-ipam\") pod \"47faa1f0-89dc-4cf2-a026-b425da197aaf\" (UID: \"47faa1f0-89dc-4cf2-a026-b425da197aaf\") " Dec 01 08:55:50 crc kubenswrapper[5004]: I1201 08:55:50.513943 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7j5pg\" (UniqueName: \"kubernetes.io/projected/47faa1f0-89dc-4cf2-a026-b425da197aaf-kube-api-access-7j5pg\") pod \"47faa1f0-89dc-4cf2-a026-b425da197aaf\" (UID: \"47faa1f0-89dc-4cf2-a026-b425da197aaf\") " Dec 01 08:55:50 crc kubenswrapper[5004]: I1201 08:55:50.519480 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47faa1f0-89dc-4cf2-a026-b425da197aaf-kube-api-access-7j5pg" (OuterVolumeSpecName: "kube-api-access-7j5pg") pod "47faa1f0-89dc-4cf2-a026-b425da197aaf" (UID: "47faa1f0-89dc-4cf2-a026-b425da197aaf"). InnerVolumeSpecName "kube-api-access-7j5pg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:55:50 crc kubenswrapper[5004]: I1201 08:55:50.562998 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47faa1f0-89dc-4cf2-a026-b425da197aaf-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "47faa1f0-89dc-4cf2-a026-b425da197aaf" (UID: "47faa1f0-89dc-4cf2-a026-b425da197aaf"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:55:50 crc kubenswrapper[5004]: I1201 08:55:50.565893 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47faa1f0-89dc-4cf2-a026-b425da197aaf-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "47faa1f0-89dc-4cf2-a026-b425da197aaf" (UID: "47faa1f0-89dc-4cf2-a026-b425da197aaf"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:55:50 crc kubenswrapper[5004]: I1201 08:55:50.618126 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7j5pg\" (UniqueName: \"kubernetes.io/projected/47faa1f0-89dc-4cf2-a026-b425da197aaf-kube-api-access-7j5pg\") on node \"crc\" DevicePath \"\"" Dec 01 08:55:50 crc kubenswrapper[5004]: I1201 08:55:50.618168 5004 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/47faa1f0-89dc-4cf2-a026-b425da197aaf-inventory-0\") on node \"crc\" DevicePath \"\"" Dec 01 08:55:50 crc kubenswrapper[5004]: I1201 08:55:50.618183 5004 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/47faa1f0-89dc-4cf2-a026-b425da197aaf-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Dec 01 08:55:50 crc kubenswrapper[5004]: I1201 08:55:50.852539 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-jhdl8" 
event={"ID":"47faa1f0-89dc-4cf2-a026-b425da197aaf","Type":"ContainerDied","Data":"d7c038c7827381be4fe4907c36af82c4f819ba66a92bedb8c979fcfc2a578de3"} Dec 01 08:55:50 crc kubenswrapper[5004]: I1201 08:55:50.852610 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-jhdl8" Dec 01 08:55:50 crc kubenswrapper[5004]: I1201 08:55:50.852633 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d7c038c7827381be4fe4907c36af82c4f819ba66a92bedb8c979fcfc2a578de3" Dec 01 08:55:50 crc kubenswrapper[5004]: I1201 08:55:50.941479 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-g5zz7"] Dec 01 08:55:50 crc kubenswrapper[5004]: E1201 08:55:50.942229 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47faa1f0-89dc-4cf2-a026-b425da197aaf" containerName="ssh-known-hosts-edpm-deployment" Dec 01 08:55:50 crc kubenswrapper[5004]: I1201 08:55:50.942246 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="47faa1f0-89dc-4cf2-a026-b425da197aaf" containerName="ssh-known-hosts-edpm-deployment" Dec 01 08:55:50 crc kubenswrapper[5004]: I1201 08:55:50.942503 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="47faa1f0-89dc-4cf2-a026-b425da197aaf" containerName="ssh-known-hosts-edpm-deployment" Dec 01 08:55:50 crc kubenswrapper[5004]: I1201 08:55:50.946882 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-g5zz7" Dec 01 08:55:50 crc kubenswrapper[5004]: I1201 08:55:50.970288 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 08:55:50 crc kubenswrapper[5004]: I1201 08:55:50.970435 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 08:55:50 crc kubenswrapper[5004]: I1201 08:55:50.971542 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-pdnrq" Dec 01 08:55:50 crc kubenswrapper[5004]: I1201 08:55:50.972418 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 08:55:50 crc kubenswrapper[5004]: I1201 08:55:50.990531 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-g5zz7"] Dec 01 08:55:51 crc kubenswrapper[5004]: I1201 08:55:51.027003 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/33c18f99-b111-4d1d-bcc0-003f3a58deee-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-g5zz7\" (UID: \"33c18f99-b111-4d1d-bcc0-003f3a58deee\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-g5zz7" Dec 01 08:55:51 crc kubenswrapper[5004]: I1201 08:55:51.027304 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgtdt\" (UniqueName: \"kubernetes.io/projected/33c18f99-b111-4d1d-bcc0-003f3a58deee-kube-api-access-mgtdt\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-g5zz7\" (UID: \"33c18f99-b111-4d1d-bcc0-003f3a58deee\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-g5zz7" Dec 01 08:55:51 crc kubenswrapper[5004]: I1201 08:55:51.027517 5004 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/33c18f99-b111-4d1d-bcc0-003f3a58deee-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-g5zz7\" (UID: \"33c18f99-b111-4d1d-bcc0-003f3a58deee\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-g5zz7" Dec 01 08:55:51 crc kubenswrapper[5004]: I1201 08:55:51.129692 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mgtdt\" (UniqueName: \"kubernetes.io/projected/33c18f99-b111-4d1d-bcc0-003f3a58deee-kube-api-access-mgtdt\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-g5zz7\" (UID: \"33c18f99-b111-4d1d-bcc0-003f3a58deee\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-g5zz7" Dec 01 08:55:51 crc kubenswrapper[5004]: I1201 08:55:51.129812 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/33c18f99-b111-4d1d-bcc0-003f3a58deee-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-g5zz7\" (UID: \"33c18f99-b111-4d1d-bcc0-003f3a58deee\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-g5zz7" Dec 01 08:55:51 crc kubenswrapper[5004]: I1201 08:55:51.129932 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/33c18f99-b111-4d1d-bcc0-003f3a58deee-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-g5zz7\" (UID: \"33c18f99-b111-4d1d-bcc0-003f3a58deee\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-g5zz7" Dec 01 08:55:51 crc kubenswrapper[5004]: I1201 08:55:51.135075 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/33c18f99-b111-4d1d-bcc0-003f3a58deee-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-g5zz7\" (UID: \"33c18f99-b111-4d1d-bcc0-003f3a58deee\") " 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-g5zz7" Dec 01 08:55:51 crc kubenswrapper[5004]: I1201 08:55:51.140409 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/33c18f99-b111-4d1d-bcc0-003f3a58deee-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-g5zz7\" (UID: \"33c18f99-b111-4d1d-bcc0-003f3a58deee\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-g5zz7" Dec 01 08:55:51 crc kubenswrapper[5004]: I1201 08:55:51.145068 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgtdt\" (UniqueName: \"kubernetes.io/projected/33c18f99-b111-4d1d-bcc0-003f3a58deee-kube-api-access-mgtdt\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-g5zz7\" (UID: \"33c18f99-b111-4d1d-bcc0-003f3a58deee\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-g5zz7" Dec 01 08:55:51 crc kubenswrapper[5004]: I1201 08:55:51.290529 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-g5zz7" Dec 01 08:55:51 crc kubenswrapper[5004]: I1201 08:55:51.874729 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-g5zz7"] Dec 01 08:55:52 crc kubenswrapper[5004]: I1201 08:55:52.878542 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-g5zz7" event={"ID":"33c18f99-b111-4d1d-bcc0-003f3a58deee","Type":"ContainerStarted","Data":"f06b345ca19be599d9900c2e17ca533e7c00baafc73809d1f7806b72a050f7b0"} Dec 01 08:55:53 crc kubenswrapper[5004]: I1201 08:55:53.893242 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-g5zz7" event={"ID":"33c18f99-b111-4d1d-bcc0-003f3a58deee","Type":"ContainerStarted","Data":"d68109b18f575d6cba421463d51ce192f3f690f097640e0e5556e935c12307f5"} Dec 01 08:55:53 crc kubenswrapper[5004]: I1201 08:55:53.921548 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-g5zz7" podStartSLOduration=3.115643321 podStartE2EDuration="3.921518723s" podCreationTimestamp="2025-12-01 08:55:50 +0000 UTC" firstStartedPulling="2025-12-01 08:55:51.877151615 +0000 UTC m=+2329.442143597" lastFinishedPulling="2025-12-01 08:55:52.683027007 +0000 UTC m=+2330.248018999" observedRunningTime="2025-12-01 08:55:53.921223226 +0000 UTC m=+2331.486215238" watchObservedRunningTime="2025-12-01 08:55:53.921518723 +0000 UTC m=+2331.486510715" Dec 01 08:56:02 crc kubenswrapper[5004]: I1201 08:56:02.014260 5004 generic.go:334] "Generic (PLEG): container finished" podID="33c18f99-b111-4d1d-bcc0-003f3a58deee" containerID="d68109b18f575d6cba421463d51ce192f3f690f097640e0e5556e935c12307f5" exitCode=0 Dec 01 08:56:02 crc kubenswrapper[5004]: I1201 08:56:02.014643 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-g5zz7" event={"ID":"33c18f99-b111-4d1d-bcc0-003f3a58deee","Type":"ContainerDied","Data":"d68109b18f575d6cba421463d51ce192f3f690f097640e0e5556e935c12307f5"} Dec 01 08:56:03 crc kubenswrapper[5004]: I1201 08:56:03.523439 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-g5zz7" Dec 01 08:56:03 crc kubenswrapper[5004]: I1201 08:56:03.671747 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/33c18f99-b111-4d1d-bcc0-003f3a58deee-inventory\") pod \"33c18f99-b111-4d1d-bcc0-003f3a58deee\" (UID: \"33c18f99-b111-4d1d-bcc0-003f3a58deee\") " Dec 01 08:56:03 crc kubenswrapper[5004]: I1201 08:56:03.671961 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mgtdt\" (UniqueName: \"kubernetes.io/projected/33c18f99-b111-4d1d-bcc0-003f3a58deee-kube-api-access-mgtdt\") pod \"33c18f99-b111-4d1d-bcc0-003f3a58deee\" (UID: \"33c18f99-b111-4d1d-bcc0-003f3a58deee\") " Dec 01 08:56:03 crc kubenswrapper[5004]: I1201 08:56:03.672041 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/33c18f99-b111-4d1d-bcc0-003f3a58deee-ssh-key\") pod \"33c18f99-b111-4d1d-bcc0-003f3a58deee\" (UID: \"33c18f99-b111-4d1d-bcc0-003f3a58deee\") " Dec 01 08:56:03 crc kubenswrapper[5004]: I1201 08:56:03.677731 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33c18f99-b111-4d1d-bcc0-003f3a58deee-kube-api-access-mgtdt" (OuterVolumeSpecName: "kube-api-access-mgtdt") pod "33c18f99-b111-4d1d-bcc0-003f3a58deee" (UID: "33c18f99-b111-4d1d-bcc0-003f3a58deee"). InnerVolumeSpecName "kube-api-access-mgtdt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:56:03 crc kubenswrapper[5004]: I1201 08:56:03.719599 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33c18f99-b111-4d1d-bcc0-003f3a58deee-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "33c18f99-b111-4d1d-bcc0-003f3a58deee" (UID: "33c18f99-b111-4d1d-bcc0-003f3a58deee"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:56:03 crc kubenswrapper[5004]: I1201 08:56:03.743470 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33c18f99-b111-4d1d-bcc0-003f3a58deee-inventory" (OuterVolumeSpecName: "inventory") pod "33c18f99-b111-4d1d-bcc0-003f3a58deee" (UID: "33c18f99-b111-4d1d-bcc0-003f3a58deee"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:56:03 crc kubenswrapper[5004]: I1201 08:56:03.774176 5004 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/33c18f99-b111-4d1d-bcc0-003f3a58deee-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 08:56:03 crc kubenswrapper[5004]: I1201 08:56:03.774204 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mgtdt\" (UniqueName: \"kubernetes.io/projected/33c18f99-b111-4d1d-bcc0-003f3a58deee-kube-api-access-mgtdt\") on node \"crc\" DevicePath \"\"" Dec 01 08:56:03 crc kubenswrapper[5004]: I1201 08:56:03.774215 5004 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/33c18f99-b111-4d1d-bcc0-003f3a58deee-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 08:56:04 crc kubenswrapper[5004]: I1201 08:56:04.041361 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-g5zz7" 
event={"ID":"33c18f99-b111-4d1d-bcc0-003f3a58deee","Type":"ContainerDied","Data":"f06b345ca19be599d9900c2e17ca533e7c00baafc73809d1f7806b72a050f7b0"} Dec 01 08:56:04 crc kubenswrapper[5004]: I1201 08:56:04.041400 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-g5zz7" Dec 01 08:56:04 crc kubenswrapper[5004]: I1201 08:56:04.041418 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f06b345ca19be599d9900c2e17ca533e7c00baafc73809d1f7806b72a050f7b0" Dec 01 08:56:04 crc kubenswrapper[5004]: I1201 08:56:04.152999 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-shf6w"] Dec 01 08:56:04 crc kubenswrapper[5004]: E1201 08:56:04.153492 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33c18f99-b111-4d1d-bcc0-003f3a58deee" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 01 08:56:04 crc kubenswrapper[5004]: I1201 08:56:04.153510 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="33c18f99-b111-4d1d-bcc0-003f3a58deee" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 01 08:56:04 crc kubenswrapper[5004]: I1201 08:56:04.153820 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="33c18f99-b111-4d1d-bcc0-003f3a58deee" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 01 08:56:04 crc kubenswrapper[5004]: I1201 08:56:04.154682 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-shf6w" Dec 01 08:56:04 crc kubenswrapper[5004]: I1201 08:56:04.156656 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-pdnrq" Dec 01 08:56:04 crc kubenswrapper[5004]: I1201 08:56:04.156840 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 08:56:04 crc kubenswrapper[5004]: I1201 08:56:04.156958 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 08:56:04 crc kubenswrapper[5004]: I1201 08:56:04.157109 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 08:56:04 crc kubenswrapper[5004]: I1201 08:56:04.172584 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-shf6w"] Dec 01 08:56:04 crc kubenswrapper[5004]: I1201 08:56:04.286554 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cf82dc00-6575-4d77-9cc5-00fddb8957e0-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-shf6w\" (UID: \"cf82dc00-6575-4d77-9cc5-00fddb8957e0\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-shf6w" Dec 01 08:56:04 crc kubenswrapper[5004]: I1201 08:56:04.287071 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cf82dc00-6575-4d77-9cc5-00fddb8957e0-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-shf6w\" (UID: \"cf82dc00-6575-4d77-9cc5-00fddb8957e0\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-shf6w" Dec 01 08:56:04 crc kubenswrapper[5004]: I1201 08:56:04.287199 5004 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9s548\" (UniqueName: \"kubernetes.io/projected/cf82dc00-6575-4d77-9cc5-00fddb8957e0-kube-api-access-9s548\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-shf6w\" (UID: \"cf82dc00-6575-4d77-9cc5-00fddb8957e0\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-shf6w" Dec 01 08:56:04 crc kubenswrapper[5004]: I1201 08:56:04.389283 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cf82dc00-6575-4d77-9cc5-00fddb8957e0-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-shf6w\" (UID: \"cf82dc00-6575-4d77-9cc5-00fddb8957e0\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-shf6w" Dec 01 08:56:04 crc kubenswrapper[5004]: I1201 08:56:04.389361 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9s548\" (UniqueName: \"kubernetes.io/projected/cf82dc00-6575-4d77-9cc5-00fddb8957e0-kube-api-access-9s548\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-shf6w\" (UID: \"cf82dc00-6575-4d77-9cc5-00fddb8957e0\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-shf6w" Dec 01 08:56:04 crc kubenswrapper[5004]: I1201 08:56:04.389584 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cf82dc00-6575-4d77-9cc5-00fddb8957e0-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-shf6w\" (UID: \"cf82dc00-6575-4d77-9cc5-00fddb8957e0\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-shf6w" Dec 01 08:56:04 crc kubenswrapper[5004]: I1201 08:56:04.394321 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cf82dc00-6575-4d77-9cc5-00fddb8957e0-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-shf6w\" (UID: 
\"cf82dc00-6575-4d77-9cc5-00fddb8957e0\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-shf6w" Dec 01 08:56:04 crc kubenswrapper[5004]: I1201 08:56:04.403999 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cf82dc00-6575-4d77-9cc5-00fddb8957e0-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-shf6w\" (UID: \"cf82dc00-6575-4d77-9cc5-00fddb8957e0\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-shf6w" Dec 01 08:56:04 crc kubenswrapper[5004]: I1201 08:56:04.406114 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9s548\" (UniqueName: \"kubernetes.io/projected/cf82dc00-6575-4d77-9cc5-00fddb8957e0-kube-api-access-9s548\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-shf6w\" (UID: \"cf82dc00-6575-4d77-9cc5-00fddb8957e0\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-shf6w" Dec 01 08:56:04 crc kubenswrapper[5004]: I1201 08:56:04.485305 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-shf6w" Dec 01 08:56:05 crc kubenswrapper[5004]: I1201 08:56:05.025325 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-shf6w"] Dec 01 08:56:05 crc kubenswrapper[5004]: I1201 08:56:05.057496 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-shf6w" event={"ID":"cf82dc00-6575-4d77-9cc5-00fddb8957e0","Type":"ContainerStarted","Data":"cd815637ed8ad08aecfc93362062c54bc7f69df0a4d32be52968aea4888097b9"} Dec 01 08:56:06 crc kubenswrapper[5004]: I1201 08:56:06.068052 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-shf6w" event={"ID":"cf82dc00-6575-4d77-9cc5-00fddb8957e0","Type":"ContainerStarted","Data":"e3be35798b79b24d73555f95b56744a809845aa12f2b99dbf3cfcb3dd09b5f09"} Dec 01 08:56:06 crc kubenswrapper[5004]: I1201 08:56:06.089285 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-shf6w" podStartSLOduration=1.641732867 podStartE2EDuration="2.089267636s" podCreationTimestamp="2025-12-01 08:56:04 +0000 UTC" firstStartedPulling="2025-12-01 08:56:05.020323762 +0000 UTC m=+2342.585315744" lastFinishedPulling="2025-12-01 08:56:05.467858521 +0000 UTC m=+2343.032850513" observedRunningTime="2025-12-01 08:56:06.08283431 +0000 UTC m=+2343.647826292" watchObservedRunningTime="2025-12-01 08:56:06.089267636 +0000 UTC m=+2343.654259618" Dec 01 08:56:08 crc kubenswrapper[5004]: I1201 08:56:08.729383 5004 patch_prober.go:28] interesting pod/machine-config-daemon-fvdgt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 08:56:08 crc kubenswrapper[5004]: 
I1201 08:56:08.730242 5004 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 08:56:16 crc kubenswrapper[5004]: I1201 08:56:16.920470 5004 scope.go:117] "RemoveContainer" containerID="eedac86e8773a04b9d3fb1bd246de36e21e0a24e59c7b0772bb015eb9b07fe2a" Dec 01 08:56:17 crc kubenswrapper[5004]: I1201 08:56:17.277282 5004 generic.go:334] "Generic (PLEG): container finished" podID="cf82dc00-6575-4d77-9cc5-00fddb8957e0" containerID="e3be35798b79b24d73555f95b56744a809845aa12f2b99dbf3cfcb3dd09b5f09" exitCode=0 Dec 01 08:56:17 crc kubenswrapper[5004]: I1201 08:56:17.277334 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-shf6w" event={"ID":"cf82dc00-6575-4d77-9cc5-00fddb8957e0","Type":"ContainerDied","Data":"e3be35798b79b24d73555f95b56744a809845aa12f2b99dbf3cfcb3dd09b5f09"} Dec 01 08:56:18 crc kubenswrapper[5004]: I1201 08:56:18.775030 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-shf6w" Dec 01 08:56:18 crc kubenswrapper[5004]: I1201 08:56:18.875826 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cf82dc00-6575-4d77-9cc5-00fddb8957e0-ssh-key\") pod \"cf82dc00-6575-4d77-9cc5-00fddb8957e0\" (UID: \"cf82dc00-6575-4d77-9cc5-00fddb8957e0\") " Dec 01 08:56:18 crc kubenswrapper[5004]: I1201 08:56:18.875970 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cf82dc00-6575-4d77-9cc5-00fddb8957e0-inventory\") pod \"cf82dc00-6575-4d77-9cc5-00fddb8957e0\" (UID: \"cf82dc00-6575-4d77-9cc5-00fddb8957e0\") " Dec 01 08:56:18 crc kubenswrapper[5004]: I1201 08:56:18.876449 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9s548\" (UniqueName: \"kubernetes.io/projected/cf82dc00-6575-4d77-9cc5-00fddb8957e0-kube-api-access-9s548\") pod \"cf82dc00-6575-4d77-9cc5-00fddb8957e0\" (UID: \"cf82dc00-6575-4d77-9cc5-00fddb8957e0\") " Dec 01 08:56:18 crc kubenswrapper[5004]: I1201 08:56:18.891880 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf82dc00-6575-4d77-9cc5-00fddb8957e0-kube-api-access-9s548" (OuterVolumeSpecName: "kube-api-access-9s548") pod "cf82dc00-6575-4d77-9cc5-00fddb8957e0" (UID: "cf82dc00-6575-4d77-9cc5-00fddb8957e0"). InnerVolumeSpecName "kube-api-access-9s548". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:56:18 crc kubenswrapper[5004]: I1201 08:56:18.925811 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf82dc00-6575-4d77-9cc5-00fddb8957e0-inventory" (OuterVolumeSpecName: "inventory") pod "cf82dc00-6575-4d77-9cc5-00fddb8957e0" (UID: "cf82dc00-6575-4d77-9cc5-00fddb8957e0"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:56:18 crc kubenswrapper[5004]: I1201 08:56:18.940948 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf82dc00-6575-4d77-9cc5-00fddb8957e0-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "cf82dc00-6575-4d77-9cc5-00fddb8957e0" (UID: "cf82dc00-6575-4d77-9cc5-00fddb8957e0"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:56:18 crc kubenswrapper[5004]: I1201 08:56:18.982250 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9s548\" (UniqueName: \"kubernetes.io/projected/cf82dc00-6575-4d77-9cc5-00fddb8957e0-kube-api-access-9s548\") on node \"crc\" DevicePath \"\"" Dec 01 08:56:18 crc kubenswrapper[5004]: I1201 08:56:18.982299 5004 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cf82dc00-6575-4d77-9cc5-00fddb8957e0-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 08:56:18 crc kubenswrapper[5004]: I1201 08:56:18.982316 5004 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cf82dc00-6575-4d77-9cc5-00fddb8957e0-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 08:56:19 crc kubenswrapper[5004]: I1201 08:56:19.307876 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-shf6w" event={"ID":"cf82dc00-6575-4d77-9cc5-00fddb8957e0","Type":"ContainerDied","Data":"cd815637ed8ad08aecfc93362062c54bc7f69df0a4d32be52968aea4888097b9"} Dec 01 08:56:19 crc kubenswrapper[5004]: I1201 08:56:19.307933 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cd815637ed8ad08aecfc93362062c54bc7f69df0a4d32be52968aea4888097b9" Dec 01 08:56:19 crc kubenswrapper[5004]: I1201 08:56:19.308438 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-shf6w" Dec 01 08:56:19 crc kubenswrapper[5004]: I1201 08:56:19.409963 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-snd4j"] Dec 01 08:56:19 crc kubenswrapper[5004]: E1201 08:56:19.410601 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf82dc00-6575-4d77-9cc5-00fddb8957e0" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 01 08:56:19 crc kubenswrapper[5004]: I1201 08:56:19.410619 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf82dc00-6575-4d77-9cc5-00fddb8957e0" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 01 08:56:19 crc kubenswrapper[5004]: I1201 08:56:19.410908 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf82dc00-6575-4d77-9cc5-00fddb8957e0" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 01 08:56:19 crc kubenswrapper[5004]: I1201 08:56:19.411640 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-snd4j" Dec 01 08:56:19 crc kubenswrapper[5004]: I1201 08:56:19.415327 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0" Dec 01 08:56:19 crc kubenswrapper[5004]: I1201 08:56:19.415389 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 08:56:19 crc kubenswrapper[5004]: I1201 08:56:19.415627 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Dec 01 08:56:19 crc kubenswrapper[5004]: I1201 08:56:19.415907 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 08:56:19 crc kubenswrapper[5004]: I1201 08:56:19.416673 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Dec 01 08:56:19 crc kubenswrapper[5004]: I1201 08:56:19.417146 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Dec 01 08:56:19 crc kubenswrapper[5004]: I1201 08:56:19.419446 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Dec 01 08:56:19 crc kubenswrapper[5004]: I1201 08:56:19.419738 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-pdnrq" Dec 01 08:56:19 crc kubenswrapper[5004]: I1201 08:56:19.420681 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 08:56:19 crc kubenswrapper[5004]: I1201 08:56:19.461179 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-snd4j"] Dec 01 08:56:19 crc 
kubenswrapper[5004]: I1201 08:56:19.499904 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8edacfd3-e002-434d-bd0a-15400e279c58-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-snd4j\" (UID: \"8edacfd3-e002-434d-bd0a-15400e279c58\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-snd4j" Dec 01 08:56:19 crc kubenswrapper[5004]: I1201 08:56:19.500035 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8edacfd3-e002-434d-bd0a-15400e279c58-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-snd4j\" (UID: \"8edacfd3-e002-434d-bd0a-15400e279c58\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-snd4j" Dec 01 08:56:19 crc kubenswrapper[5004]: I1201 08:56:19.500072 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8sl68\" (UniqueName: \"kubernetes.io/projected/8edacfd3-e002-434d-bd0a-15400e279c58-kube-api-access-8sl68\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-snd4j\" (UID: \"8edacfd3-e002-434d-bd0a-15400e279c58\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-snd4j" Dec 01 08:56:19 crc kubenswrapper[5004]: I1201 08:56:19.500318 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8edacfd3-e002-434d-bd0a-15400e279c58-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-snd4j\" (UID: \"8edacfd3-e002-434d-bd0a-15400e279c58\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-snd4j" Dec 01 08:56:19 crc kubenswrapper[5004]: 
I1201 08:56:19.500433 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8edacfd3-e002-434d-bd0a-15400e279c58-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-snd4j\" (UID: \"8edacfd3-e002-434d-bd0a-15400e279c58\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-snd4j" Dec 01 08:56:19 crc kubenswrapper[5004]: I1201 08:56:19.500628 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8edacfd3-e002-434d-bd0a-15400e279c58-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-snd4j\" (UID: \"8edacfd3-e002-434d-bd0a-15400e279c58\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-snd4j" Dec 01 08:56:19 crc kubenswrapper[5004]: I1201 08:56:19.500665 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8edacfd3-e002-434d-bd0a-15400e279c58-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-snd4j\" (UID: \"8edacfd3-e002-434d-bd0a-15400e279c58\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-snd4j" Dec 01 08:56:19 crc kubenswrapper[5004]: I1201 08:56:19.500899 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8edacfd3-e002-434d-bd0a-15400e279c58-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-snd4j\" (UID: \"8edacfd3-e002-434d-bd0a-15400e279c58\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-snd4j" Dec 01 08:56:19 crc kubenswrapper[5004]: I1201 08:56:19.501030 5004 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8edacfd3-e002-434d-bd0a-15400e279c58-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-snd4j\" (UID: \"8edacfd3-e002-434d-bd0a-15400e279c58\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-snd4j" Dec 01 08:56:19 crc kubenswrapper[5004]: I1201 08:56:19.501117 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8edacfd3-e002-434d-bd0a-15400e279c58-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-snd4j\" (UID: \"8edacfd3-e002-434d-bd0a-15400e279c58\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-snd4j" Dec 01 08:56:19 crc kubenswrapper[5004]: I1201 08:56:19.501344 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8edacfd3-e002-434d-bd0a-15400e279c58-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-snd4j\" (UID: \"8edacfd3-e002-434d-bd0a-15400e279c58\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-snd4j" Dec 01 08:56:19 crc kubenswrapper[5004]: I1201 08:56:19.501427 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8edacfd3-e002-434d-bd0a-15400e279c58-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-snd4j\" (UID: \"8edacfd3-e002-434d-bd0a-15400e279c58\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-snd4j" Dec 01 08:56:19 crc kubenswrapper[5004]: I1201 08:56:19.501546 5004 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8edacfd3-e002-434d-bd0a-15400e279c58-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-snd4j\" (UID: \"8edacfd3-e002-434d-bd0a-15400e279c58\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-snd4j" Dec 01 08:56:19 crc kubenswrapper[5004]: I1201 08:56:19.501651 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8edacfd3-e002-434d-bd0a-15400e279c58-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-snd4j\" (UID: \"8edacfd3-e002-434d-bd0a-15400e279c58\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-snd4j" Dec 01 08:56:19 crc kubenswrapper[5004]: I1201 08:56:19.501735 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8edacfd3-e002-434d-bd0a-15400e279c58-telemetry-power-monitoring-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-snd4j\" (UID: \"8edacfd3-e002-434d-bd0a-15400e279c58\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-snd4j" Dec 01 08:56:19 crc kubenswrapper[5004]: I1201 08:56:19.501867 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8edacfd3-e002-434d-bd0a-15400e279c58-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-snd4j\" (UID: \"8edacfd3-e002-434d-bd0a-15400e279c58\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-snd4j" Dec 01 08:56:19 crc kubenswrapper[5004]: I1201 
08:56:19.604731 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8edacfd3-e002-434d-bd0a-15400e279c58-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-snd4j\" (UID: \"8edacfd3-e002-434d-bd0a-15400e279c58\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-snd4j" Dec 01 08:56:19 crc kubenswrapper[5004]: I1201 08:56:19.605219 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8edacfd3-e002-434d-bd0a-15400e279c58-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-snd4j\" (UID: \"8edacfd3-e002-434d-bd0a-15400e279c58\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-snd4j" Dec 01 08:56:19 crc kubenswrapper[5004]: I1201 08:56:19.605369 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8edacfd3-e002-434d-bd0a-15400e279c58-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-snd4j\" (UID: \"8edacfd3-e002-434d-bd0a-15400e279c58\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-snd4j" Dec 01 08:56:19 crc kubenswrapper[5004]: I1201 08:56:19.605520 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8edacfd3-e002-434d-bd0a-15400e279c58-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-snd4j\" (UID: \"8edacfd3-e002-434d-bd0a-15400e279c58\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-snd4j" Dec 01 08:56:19 crc kubenswrapper[5004]: I1201 08:56:19.605702 5004 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8edacfd3-e002-434d-bd0a-15400e279c58-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-snd4j\" (UID: \"8edacfd3-e002-434d-bd0a-15400e279c58\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-snd4j" Dec 01 08:56:19 crc kubenswrapper[5004]: I1201 08:56:19.605820 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8edacfd3-e002-434d-bd0a-15400e279c58-telemetry-power-monitoring-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-snd4j\" (UID: \"8edacfd3-e002-434d-bd0a-15400e279c58\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-snd4j" Dec 01 08:56:19 crc kubenswrapper[5004]: I1201 08:56:19.605935 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8edacfd3-e002-434d-bd0a-15400e279c58-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-snd4j\" (UID: \"8edacfd3-e002-434d-bd0a-15400e279c58\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-snd4j" Dec 01 08:56:19 crc kubenswrapper[5004]: I1201 08:56:19.606078 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8edacfd3-e002-434d-bd0a-15400e279c58-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-snd4j\" (UID: \"8edacfd3-e002-434d-bd0a-15400e279c58\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-snd4j" Dec 01 08:56:19 crc kubenswrapper[5004]: I1201 08:56:19.606185 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" 
(UniqueName: \"kubernetes.io/projected/8edacfd3-e002-434d-bd0a-15400e279c58-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-snd4j\" (UID: \"8edacfd3-e002-434d-bd0a-15400e279c58\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-snd4j" Dec 01 08:56:19 crc kubenswrapper[5004]: I1201 08:56:19.606270 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8sl68\" (UniqueName: \"kubernetes.io/projected/8edacfd3-e002-434d-bd0a-15400e279c58-kube-api-access-8sl68\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-snd4j\" (UID: \"8edacfd3-e002-434d-bd0a-15400e279c58\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-snd4j" Dec 01 08:56:19 crc kubenswrapper[5004]: I1201 08:56:19.606401 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8edacfd3-e002-434d-bd0a-15400e279c58-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-snd4j\" (UID: \"8edacfd3-e002-434d-bd0a-15400e279c58\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-snd4j" Dec 01 08:56:19 crc kubenswrapper[5004]: I1201 08:56:19.606513 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8edacfd3-e002-434d-bd0a-15400e279c58-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-snd4j\" (UID: \"8edacfd3-e002-434d-bd0a-15400e279c58\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-snd4j" Dec 01 08:56:19 crc kubenswrapper[5004]: I1201 08:56:19.606653 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/8edacfd3-e002-434d-bd0a-15400e279c58-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-snd4j\" (UID: \"8edacfd3-e002-434d-bd0a-15400e279c58\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-snd4j" Dec 01 08:56:19 crc kubenswrapper[5004]: I1201 08:56:19.606739 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8edacfd3-e002-434d-bd0a-15400e279c58-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-snd4j\" (UID: \"8edacfd3-e002-434d-bd0a-15400e279c58\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-snd4j" Dec 01 08:56:19 crc kubenswrapper[5004]: I1201 08:56:19.606895 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8edacfd3-e002-434d-bd0a-15400e279c58-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-snd4j\" (UID: \"8edacfd3-e002-434d-bd0a-15400e279c58\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-snd4j" Dec 01 08:56:19 crc kubenswrapper[5004]: I1201 08:56:19.606995 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8edacfd3-e002-434d-bd0a-15400e279c58-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-snd4j\" (UID: \"8edacfd3-e002-434d-bd0a-15400e279c58\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-snd4j" Dec 01 08:56:19 crc kubenswrapper[5004]: I1201 08:56:19.611578 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8edacfd3-e002-434d-bd0a-15400e279c58-nova-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-snd4j\" (UID: \"8edacfd3-e002-434d-bd0a-15400e279c58\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-snd4j" Dec 01 08:56:19 crc kubenswrapper[5004]: I1201 08:56:19.613508 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8edacfd3-e002-434d-bd0a-15400e279c58-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-snd4j\" (UID: \"8edacfd3-e002-434d-bd0a-15400e279c58\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-snd4j" Dec 01 08:56:19 crc kubenswrapper[5004]: I1201 08:56:19.613527 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8edacfd3-e002-434d-bd0a-15400e279c58-telemetry-power-monitoring-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-snd4j\" (UID: \"8edacfd3-e002-434d-bd0a-15400e279c58\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-snd4j" Dec 01 08:56:19 crc kubenswrapper[5004]: I1201 08:56:19.613601 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8edacfd3-e002-434d-bd0a-15400e279c58-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-snd4j\" (UID: \"8edacfd3-e002-434d-bd0a-15400e279c58\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-snd4j" Dec 01 08:56:19 crc kubenswrapper[5004]: I1201 08:56:19.621936 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8edacfd3-e002-434d-bd0a-15400e279c58-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-snd4j\" (UID: \"8edacfd3-e002-434d-bd0a-15400e279c58\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-snd4j" Dec 01 08:56:19 crc kubenswrapper[5004]: I1201 08:56:19.622082 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8edacfd3-e002-434d-bd0a-15400e279c58-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-snd4j\" (UID: \"8edacfd3-e002-434d-bd0a-15400e279c58\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-snd4j" Dec 01 08:56:19 crc kubenswrapper[5004]: I1201 08:56:19.622111 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8edacfd3-e002-434d-bd0a-15400e279c58-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-snd4j\" (UID: \"8edacfd3-e002-434d-bd0a-15400e279c58\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-snd4j" Dec 01 08:56:19 crc kubenswrapper[5004]: I1201 08:56:19.622408 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8edacfd3-e002-434d-bd0a-15400e279c58-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-snd4j\" (UID: \"8edacfd3-e002-434d-bd0a-15400e279c58\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-snd4j" Dec 01 08:56:19 crc kubenswrapper[5004]: I1201 08:56:19.622736 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8edacfd3-e002-434d-bd0a-15400e279c58-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-snd4j\" (UID: \"8edacfd3-e002-434d-bd0a-15400e279c58\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-snd4j" Dec 01 08:56:19 crc kubenswrapper[5004]: I1201 08:56:19.622853 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8edacfd3-e002-434d-bd0a-15400e279c58-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-snd4j\" (UID: \"8edacfd3-e002-434d-bd0a-15400e279c58\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-snd4j" Dec 01 08:56:19 crc kubenswrapper[5004]: I1201 08:56:19.622815 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8edacfd3-e002-434d-bd0a-15400e279c58-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-snd4j\" (UID: \"8edacfd3-e002-434d-bd0a-15400e279c58\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-snd4j" Dec 01 08:56:19 crc kubenswrapper[5004]: I1201 08:56:19.622986 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8edacfd3-e002-434d-bd0a-15400e279c58-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-snd4j\" (UID: \"8edacfd3-e002-434d-bd0a-15400e279c58\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-snd4j" Dec 01 08:56:19 crc kubenswrapper[5004]: I1201 08:56:19.633819 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8edacfd3-e002-434d-bd0a-15400e279c58-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-snd4j\" (UID: \"8edacfd3-e002-434d-bd0a-15400e279c58\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-snd4j" Dec 01 08:56:19 crc 
kubenswrapper[5004]: I1201 08:56:19.634404 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8sl68\" (UniqueName: \"kubernetes.io/projected/8edacfd3-e002-434d-bd0a-15400e279c58-kube-api-access-8sl68\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-snd4j\" (UID: \"8edacfd3-e002-434d-bd0a-15400e279c58\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-snd4j" Dec 01 08:56:19 crc kubenswrapper[5004]: I1201 08:56:19.634520 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8edacfd3-e002-434d-bd0a-15400e279c58-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-snd4j\" (UID: \"8edacfd3-e002-434d-bd0a-15400e279c58\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-snd4j" Dec 01 08:56:19 crc kubenswrapper[5004]: I1201 08:56:19.634810 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8edacfd3-e002-434d-bd0a-15400e279c58-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-snd4j\" (UID: \"8edacfd3-e002-434d-bd0a-15400e279c58\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-snd4j" Dec 01 08:56:19 crc kubenswrapper[5004]: I1201 08:56:19.736682 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-snd4j" Dec 01 08:56:20 crc kubenswrapper[5004]: I1201 08:56:20.369724 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-snd4j"] Dec 01 08:56:20 crc kubenswrapper[5004]: W1201 08:56:20.370122 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8edacfd3_e002_434d_bd0a_15400e279c58.slice/crio-3738491c2738faa259ece3948208956742fb9041e1ce520fade490383e4c85a2 WatchSource:0}: Error finding container 3738491c2738faa259ece3948208956742fb9041e1ce520fade490383e4c85a2: Status 404 returned error can't find the container with id 3738491c2738faa259ece3948208956742fb9041e1ce520fade490383e4c85a2 Dec 01 08:56:21 crc kubenswrapper[5004]: I1201 08:56:21.334130 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-snd4j" event={"ID":"8edacfd3-e002-434d-bd0a-15400e279c58","Type":"ContainerStarted","Data":"3e11082e2feeddc0414fb9aa36882e7314700e520563d9f730318937af7b1d35"} Dec 01 08:56:21 crc kubenswrapper[5004]: I1201 08:56:21.334836 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-snd4j" event={"ID":"8edacfd3-e002-434d-bd0a-15400e279c58","Type":"ContainerStarted","Data":"3738491c2738faa259ece3948208956742fb9041e1ce520fade490383e4c85a2"} Dec 01 08:56:21 crc kubenswrapper[5004]: I1201 08:56:21.380436 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-snd4j" podStartSLOduration=1.944503955 podStartE2EDuration="2.380406749s" podCreationTimestamp="2025-12-01 08:56:19 +0000 UTC" firstStartedPulling="2025-12-01 08:56:20.373048236 +0000 UTC m=+2357.938040228" lastFinishedPulling="2025-12-01 08:56:20.80895102 +0000 UTC m=+2358.373943022" 
observedRunningTime="2025-12-01 08:56:21.359008717 +0000 UTC m=+2358.924000739" watchObservedRunningTime="2025-12-01 08:56:21.380406749 +0000 UTC m=+2358.945398771" Dec 01 08:56:28 crc kubenswrapper[5004]: I1201 08:56:28.053805 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-sync-lc7hw"] Dec 01 08:56:28 crc kubenswrapper[5004]: I1201 08:56:28.065788 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-sync-lc7hw"] Dec 01 08:56:28 crc kubenswrapper[5004]: I1201 08:56:28.775155 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="779fe233-ca28-4e4d-adb7-fbf03e3e751e" path="/var/lib/kubelet/pods/779fe233-ca28-4e4d-adb7-fbf03e3e751e/volumes" Dec 01 08:56:38 crc kubenswrapper[5004]: I1201 08:56:38.729246 5004 patch_prober.go:28] interesting pod/machine-config-daemon-fvdgt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 08:56:38 crc kubenswrapper[5004]: I1201 08:56:38.731852 5004 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 08:56:38 crc kubenswrapper[5004]: I1201 08:56:38.731943 5004 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" Dec 01 08:56:38 crc kubenswrapper[5004]: I1201 08:56:38.732986 5004 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"35549cd96a84a4d57cde0021a5b3e7f3cc68411360397308b1e916325429b080"} 
pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 08:56:38 crc kubenswrapper[5004]: I1201 08:56:38.733047 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" containerName="machine-config-daemon" containerID="cri-o://35549cd96a84a4d57cde0021a5b3e7f3cc68411360397308b1e916325429b080" gracePeriod=600 Dec 01 08:56:38 crc kubenswrapper[5004]: E1201 08:56:38.873917 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 08:56:39 crc kubenswrapper[5004]: I1201 08:56:39.581885 5004 generic.go:334] "Generic (PLEG): container finished" podID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" containerID="35549cd96a84a4d57cde0021a5b3e7f3cc68411360397308b1e916325429b080" exitCode=0 Dec 01 08:56:39 crc kubenswrapper[5004]: I1201 08:56:39.581978 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" event={"ID":"f9977ebb-82de-4e96-8763-0b5a84f8d4ce","Type":"ContainerDied","Data":"35549cd96a84a4d57cde0021a5b3e7f3cc68411360397308b1e916325429b080"} Dec 01 08:56:39 crc kubenswrapper[5004]: I1201 08:56:39.582104 5004 scope.go:117] "RemoveContainer" containerID="c25705b2a5c3bf1cd77f5014f3a7b5fe92ef589d53ece0784883141577010bcc" Dec 01 08:56:39 crc kubenswrapper[5004]: I1201 08:56:39.583284 5004 scope.go:117] "RemoveContainer" containerID="35549cd96a84a4d57cde0021a5b3e7f3cc68411360397308b1e916325429b080" Dec 
01 08:56:39 crc kubenswrapper[5004]: E1201 08:56:39.583980 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 08:56:51 crc kubenswrapper[5004]: I1201 08:56:51.974721 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-vfpqg"] Dec 01 08:56:51 crc kubenswrapper[5004]: I1201 08:56:51.981209 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vfpqg" Dec 01 08:56:51 crc kubenswrapper[5004]: I1201 08:56:51.988289 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vfpqg"] Dec 01 08:56:52 crc kubenswrapper[5004]: I1201 08:56:52.171926 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b678fa31-cf5b-4aeb-8aa3-c314a17abc6c-utilities\") pod \"redhat-marketplace-vfpqg\" (UID: \"b678fa31-cf5b-4aeb-8aa3-c314a17abc6c\") " pod="openshift-marketplace/redhat-marketplace-vfpqg" Dec 01 08:56:52 crc kubenswrapper[5004]: I1201 08:56:52.172339 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b678fa31-cf5b-4aeb-8aa3-c314a17abc6c-catalog-content\") pod \"redhat-marketplace-vfpqg\" (UID: \"b678fa31-cf5b-4aeb-8aa3-c314a17abc6c\") " pod="openshift-marketplace/redhat-marketplace-vfpqg" Dec 01 08:56:52 crc kubenswrapper[5004]: I1201 08:56:52.172601 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-nkfqv\" (UniqueName: \"kubernetes.io/projected/b678fa31-cf5b-4aeb-8aa3-c314a17abc6c-kube-api-access-nkfqv\") pod \"redhat-marketplace-vfpqg\" (UID: \"b678fa31-cf5b-4aeb-8aa3-c314a17abc6c\") " pod="openshift-marketplace/redhat-marketplace-vfpqg" Dec 01 08:56:52 crc kubenswrapper[5004]: I1201 08:56:52.275012 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b678fa31-cf5b-4aeb-8aa3-c314a17abc6c-utilities\") pod \"redhat-marketplace-vfpqg\" (UID: \"b678fa31-cf5b-4aeb-8aa3-c314a17abc6c\") " pod="openshift-marketplace/redhat-marketplace-vfpqg" Dec 01 08:56:52 crc kubenswrapper[5004]: I1201 08:56:52.275068 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b678fa31-cf5b-4aeb-8aa3-c314a17abc6c-catalog-content\") pod \"redhat-marketplace-vfpqg\" (UID: \"b678fa31-cf5b-4aeb-8aa3-c314a17abc6c\") " pod="openshift-marketplace/redhat-marketplace-vfpqg" Dec 01 08:56:52 crc kubenswrapper[5004]: I1201 08:56:52.275178 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nkfqv\" (UniqueName: \"kubernetes.io/projected/b678fa31-cf5b-4aeb-8aa3-c314a17abc6c-kube-api-access-nkfqv\") pod \"redhat-marketplace-vfpqg\" (UID: \"b678fa31-cf5b-4aeb-8aa3-c314a17abc6c\") " pod="openshift-marketplace/redhat-marketplace-vfpqg" Dec 01 08:56:52 crc kubenswrapper[5004]: I1201 08:56:52.275597 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b678fa31-cf5b-4aeb-8aa3-c314a17abc6c-utilities\") pod \"redhat-marketplace-vfpqg\" (UID: \"b678fa31-cf5b-4aeb-8aa3-c314a17abc6c\") " pod="openshift-marketplace/redhat-marketplace-vfpqg" Dec 01 08:56:52 crc kubenswrapper[5004]: I1201 08:56:52.275626 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/b678fa31-cf5b-4aeb-8aa3-c314a17abc6c-catalog-content\") pod \"redhat-marketplace-vfpqg\" (UID: \"b678fa31-cf5b-4aeb-8aa3-c314a17abc6c\") " pod="openshift-marketplace/redhat-marketplace-vfpqg" Dec 01 08:56:52 crc kubenswrapper[5004]: I1201 08:56:52.296623 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkfqv\" (UniqueName: \"kubernetes.io/projected/b678fa31-cf5b-4aeb-8aa3-c314a17abc6c-kube-api-access-nkfqv\") pod \"redhat-marketplace-vfpqg\" (UID: \"b678fa31-cf5b-4aeb-8aa3-c314a17abc6c\") " pod="openshift-marketplace/redhat-marketplace-vfpqg" Dec 01 08:56:52 crc kubenswrapper[5004]: I1201 08:56:52.319103 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vfpqg" Dec 01 08:56:52 crc kubenswrapper[5004]: I1201 08:56:52.766431 5004 scope.go:117] "RemoveContainer" containerID="35549cd96a84a4d57cde0021a5b3e7f3cc68411360397308b1e916325429b080" Dec 01 08:56:52 crc kubenswrapper[5004]: E1201 08:56:52.767089 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 08:56:52 crc kubenswrapper[5004]: I1201 08:56:52.844370 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vfpqg"] Dec 01 08:56:53 crc kubenswrapper[5004]: I1201 08:56:53.778925 5004 generic.go:334] "Generic (PLEG): container finished" podID="b678fa31-cf5b-4aeb-8aa3-c314a17abc6c" containerID="3d68eb64bded16613c9bbf5c0671914d044a06e0d95a05dbc7dc87f21f9ab688" exitCode=0 Dec 01 08:56:53 crc kubenswrapper[5004]: I1201 08:56:53.778995 5004 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vfpqg" event={"ID":"b678fa31-cf5b-4aeb-8aa3-c314a17abc6c","Type":"ContainerDied","Data":"3d68eb64bded16613c9bbf5c0671914d044a06e0d95a05dbc7dc87f21f9ab688"} Dec 01 08:56:53 crc kubenswrapper[5004]: I1201 08:56:53.779468 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vfpqg" event={"ID":"b678fa31-cf5b-4aeb-8aa3-c314a17abc6c","Type":"ContainerStarted","Data":"4e72677403aedb7810067c7536d68d0bb2c7461410792733ee283421196bb47c"} Dec 01 08:56:55 crc kubenswrapper[5004]: I1201 08:56:55.808067 5004 generic.go:334] "Generic (PLEG): container finished" podID="b678fa31-cf5b-4aeb-8aa3-c314a17abc6c" containerID="16458aa177cabca7166b48045af32688a3421e0f2ac8a0a234f087bc220ce709" exitCode=0 Dec 01 08:56:55 crc kubenswrapper[5004]: I1201 08:56:55.808146 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vfpqg" event={"ID":"b678fa31-cf5b-4aeb-8aa3-c314a17abc6c","Type":"ContainerDied","Data":"16458aa177cabca7166b48045af32688a3421e0f2ac8a0a234f087bc220ce709"} Dec 01 08:56:56 crc kubenswrapper[5004]: I1201 08:56:56.819204 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vfpqg" event={"ID":"b678fa31-cf5b-4aeb-8aa3-c314a17abc6c","Type":"ContainerStarted","Data":"920dafd1fb13007d2ae4e2787d002e17c272f16a418b4fe96c6f1622e0ac94c1"} Dec 01 08:56:56 crc kubenswrapper[5004]: I1201 08:56:56.837319 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-vfpqg" podStartSLOduration=3.14557667 podStartE2EDuration="5.837303027s" podCreationTimestamp="2025-12-01 08:56:51 +0000 UTC" firstStartedPulling="2025-12-01 08:56:53.783254846 +0000 UTC m=+2391.348246858" lastFinishedPulling="2025-12-01 08:56:56.474981233 +0000 UTC m=+2394.039973215" observedRunningTime="2025-12-01 08:56:56.833577757 +0000 UTC 
m=+2394.398569749" watchObservedRunningTime="2025-12-01 08:56:56.837303027 +0000 UTC m=+2394.402295009" Dec 01 08:57:02 crc kubenswrapper[5004]: I1201 08:57:02.319417 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-vfpqg" Dec 01 08:57:02 crc kubenswrapper[5004]: I1201 08:57:02.322726 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-vfpqg" Dec 01 08:57:02 crc kubenswrapper[5004]: I1201 08:57:02.416470 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-vfpqg" Dec 01 08:57:02 crc kubenswrapper[5004]: I1201 08:57:02.948982 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-vfpqg" Dec 01 08:57:03 crc kubenswrapper[5004]: I1201 08:57:03.036073 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vfpqg"] Dec 01 08:57:04 crc kubenswrapper[5004]: I1201 08:57:04.912614 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-vfpqg" podUID="b678fa31-cf5b-4aeb-8aa3-c314a17abc6c" containerName="registry-server" containerID="cri-o://920dafd1fb13007d2ae4e2787d002e17c272f16a418b4fe96c6f1622e0ac94c1" gracePeriod=2 Dec 01 08:57:05 crc kubenswrapper[5004]: I1201 08:57:05.392056 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vfpqg" Dec 01 08:57:05 crc kubenswrapper[5004]: I1201 08:57:05.565633 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b678fa31-cf5b-4aeb-8aa3-c314a17abc6c-utilities\") pod \"b678fa31-cf5b-4aeb-8aa3-c314a17abc6c\" (UID: \"b678fa31-cf5b-4aeb-8aa3-c314a17abc6c\") " Dec 01 08:57:05 crc kubenswrapper[5004]: I1201 08:57:05.565834 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nkfqv\" (UniqueName: \"kubernetes.io/projected/b678fa31-cf5b-4aeb-8aa3-c314a17abc6c-kube-api-access-nkfqv\") pod \"b678fa31-cf5b-4aeb-8aa3-c314a17abc6c\" (UID: \"b678fa31-cf5b-4aeb-8aa3-c314a17abc6c\") " Dec 01 08:57:05 crc kubenswrapper[5004]: I1201 08:57:05.565943 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b678fa31-cf5b-4aeb-8aa3-c314a17abc6c-catalog-content\") pod \"b678fa31-cf5b-4aeb-8aa3-c314a17abc6c\" (UID: \"b678fa31-cf5b-4aeb-8aa3-c314a17abc6c\") " Dec 01 08:57:05 crc kubenswrapper[5004]: I1201 08:57:05.566452 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b678fa31-cf5b-4aeb-8aa3-c314a17abc6c-utilities" (OuterVolumeSpecName: "utilities") pod "b678fa31-cf5b-4aeb-8aa3-c314a17abc6c" (UID: "b678fa31-cf5b-4aeb-8aa3-c314a17abc6c"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 08:57:05 crc kubenswrapper[5004]: I1201 08:57:05.566858 5004 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b678fa31-cf5b-4aeb-8aa3-c314a17abc6c-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 08:57:05 crc kubenswrapper[5004]: I1201 08:57:05.571416 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b678fa31-cf5b-4aeb-8aa3-c314a17abc6c-kube-api-access-nkfqv" (OuterVolumeSpecName: "kube-api-access-nkfqv") pod "b678fa31-cf5b-4aeb-8aa3-c314a17abc6c" (UID: "b678fa31-cf5b-4aeb-8aa3-c314a17abc6c"). InnerVolumeSpecName "kube-api-access-nkfqv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:57:05 crc kubenswrapper[5004]: I1201 08:57:05.602684 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b678fa31-cf5b-4aeb-8aa3-c314a17abc6c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b678fa31-cf5b-4aeb-8aa3-c314a17abc6c" (UID: "b678fa31-cf5b-4aeb-8aa3-c314a17abc6c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 08:57:05 crc kubenswrapper[5004]: I1201 08:57:05.669064 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nkfqv\" (UniqueName: \"kubernetes.io/projected/b678fa31-cf5b-4aeb-8aa3-c314a17abc6c-kube-api-access-nkfqv\") on node \"crc\" DevicePath \"\"" Dec 01 08:57:05 crc kubenswrapper[5004]: I1201 08:57:05.669099 5004 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b678fa31-cf5b-4aeb-8aa3-c314a17abc6c-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 08:57:05 crc kubenswrapper[5004]: I1201 08:57:05.932346 5004 generic.go:334] "Generic (PLEG): container finished" podID="b678fa31-cf5b-4aeb-8aa3-c314a17abc6c" containerID="920dafd1fb13007d2ae4e2787d002e17c272f16a418b4fe96c6f1622e0ac94c1" exitCode=0 Dec 01 08:57:05 crc kubenswrapper[5004]: I1201 08:57:05.932408 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vfpqg" event={"ID":"b678fa31-cf5b-4aeb-8aa3-c314a17abc6c","Type":"ContainerDied","Data":"920dafd1fb13007d2ae4e2787d002e17c272f16a418b4fe96c6f1622e0ac94c1"} Dec 01 08:57:05 crc kubenswrapper[5004]: I1201 08:57:05.932450 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vfpqg" event={"ID":"b678fa31-cf5b-4aeb-8aa3-c314a17abc6c","Type":"ContainerDied","Data":"4e72677403aedb7810067c7536d68d0bb2c7461410792733ee283421196bb47c"} Dec 01 08:57:05 crc kubenswrapper[5004]: I1201 08:57:05.932489 5004 scope.go:117] "RemoveContainer" containerID="920dafd1fb13007d2ae4e2787d002e17c272f16a418b4fe96c6f1622e0ac94c1" Dec 01 08:57:05 crc kubenswrapper[5004]: I1201 08:57:05.932755 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vfpqg" Dec 01 08:57:05 crc kubenswrapper[5004]: I1201 08:57:05.974798 5004 scope.go:117] "RemoveContainer" containerID="16458aa177cabca7166b48045af32688a3421e0f2ac8a0a234f087bc220ce709" Dec 01 08:57:05 crc kubenswrapper[5004]: I1201 08:57:05.997583 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vfpqg"] Dec 01 08:57:06 crc kubenswrapper[5004]: I1201 08:57:06.014883 5004 scope.go:117] "RemoveContainer" containerID="3d68eb64bded16613c9bbf5c0671914d044a06e0d95a05dbc7dc87f21f9ab688" Dec 01 08:57:06 crc kubenswrapper[5004]: I1201 08:57:06.017856 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-vfpqg"] Dec 01 08:57:06 crc kubenswrapper[5004]: I1201 08:57:06.069657 5004 scope.go:117] "RemoveContainer" containerID="920dafd1fb13007d2ae4e2787d002e17c272f16a418b4fe96c6f1622e0ac94c1" Dec 01 08:57:06 crc kubenswrapper[5004]: E1201 08:57:06.070284 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"920dafd1fb13007d2ae4e2787d002e17c272f16a418b4fe96c6f1622e0ac94c1\": container with ID starting with 920dafd1fb13007d2ae4e2787d002e17c272f16a418b4fe96c6f1622e0ac94c1 not found: ID does not exist" containerID="920dafd1fb13007d2ae4e2787d002e17c272f16a418b4fe96c6f1622e0ac94c1" Dec 01 08:57:06 crc kubenswrapper[5004]: I1201 08:57:06.070408 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"920dafd1fb13007d2ae4e2787d002e17c272f16a418b4fe96c6f1622e0ac94c1"} err="failed to get container status \"920dafd1fb13007d2ae4e2787d002e17c272f16a418b4fe96c6f1622e0ac94c1\": rpc error: code = NotFound desc = could not find container \"920dafd1fb13007d2ae4e2787d002e17c272f16a418b4fe96c6f1622e0ac94c1\": container with ID starting with 920dafd1fb13007d2ae4e2787d002e17c272f16a418b4fe96c6f1622e0ac94c1 not found: 
ID does not exist" Dec 01 08:57:06 crc kubenswrapper[5004]: I1201 08:57:06.070511 5004 scope.go:117] "RemoveContainer" containerID="16458aa177cabca7166b48045af32688a3421e0f2ac8a0a234f087bc220ce709" Dec 01 08:57:06 crc kubenswrapper[5004]: E1201 08:57:06.070966 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16458aa177cabca7166b48045af32688a3421e0f2ac8a0a234f087bc220ce709\": container with ID starting with 16458aa177cabca7166b48045af32688a3421e0f2ac8a0a234f087bc220ce709 not found: ID does not exist" containerID="16458aa177cabca7166b48045af32688a3421e0f2ac8a0a234f087bc220ce709" Dec 01 08:57:06 crc kubenswrapper[5004]: I1201 08:57:06.070995 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16458aa177cabca7166b48045af32688a3421e0f2ac8a0a234f087bc220ce709"} err="failed to get container status \"16458aa177cabca7166b48045af32688a3421e0f2ac8a0a234f087bc220ce709\": rpc error: code = NotFound desc = could not find container \"16458aa177cabca7166b48045af32688a3421e0f2ac8a0a234f087bc220ce709\": container with ID starting with 16458aa177cabca7166b48045af32688a3421e0f2ac8a0a234f087bc220ce709 not found: ID does not exist" Dec 01 08:57:06 crc kubenswrapper[5004]: I1201 08:57:06.071011 5004 scope.go:117] "RemoveContainer" containerID="3d68eb64bded16613c9bbf5c0671914d044a06e0d95a05dbc7dc87f21f9ab688" Dec 01 08:57:06 crc kubenswrapper[5004]: E1201 08:57:06.071352 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d68eb64bded16613c9bbf5c0671914d044a06e0d95a05dbc7dc87f21f9ab688\": container with ID starting with 3d68eb64bded16613c9bbf5c0671914d044a06e0d95a05dbc7dc87f21f9ab688 not found: ID does not exist" containerID="3d68eb64bded16613c9bbf5c0671914d044a06e0d95a05dbc7dc87f21f9ab688" Dec 01 08:57:06 crc kubenswrapper[5004]: I1201 08:57:06.071456 5004 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d68eb64bded16613c9bbf5c0671914d044a06e0d95a05dbc7dc87f21f9ab688"} err="failed to get container status \"3d68eb64bded16613c9bbf5c0671914d044a06e0d95a05dbc7dc87f21f9ab688\": rpc error: code = NotFound desc = could not find container \"3d68eb64bded16613c9bbf5c0671914d044a06e0d95a05dbc7dc87f21f9ab688\": container with ID starting with 3d68eb64bded16613c9bbf5c0671914d044a06e0d95a05dbc7dc87f21f9ab688 not found: ID does not exist" Dec 01 08:57:06 crc kubenswrapper[5004]: I1201 08:57:06.775145 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b678fa31-cf5b-4aeb-8aa3-c314a17abc6c" path="/var/lib/kubelet/pods/b678fa31-cf5b-4aeb-8aa3-c314a17abc6c/volumes" Dec 01 08:57:07 crc kubenswrapper[5004]: I1201 08:57:07.759101 5004 scope.go:117] "RemoveContainer" containerID="35549cd96a84a4d57cde0021a5b3e7f3cc68411360397308b1e916325429b080" Dec 01 08:57:07 crc kubenswrapper[5004]: E1201 08:57:07.759707 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 08:57:14 crc kubenswrapper[5004]: I1201 08:57:14.038150 5004 generic.go:334] "Generic (PLEG): container finished" podID="8edacfd3-e002-434d-bd0a-15400e279c58" containerID="3e11082e2feeddc0414fb9aa36882e7314700e520563d9f730318937af7b1d35" exitCode=0 Dec 01 08:57:14 crc kubenswrapper[5004]: I1201 08:57:14.038322 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-snd4j" 
event={"ID":"8edacfd3-e002-434d-bd0a-15400e279c58","Type":"ContainerDied","Data":"3e11082e2feeddc0414fb9aa36882e7314700e520563d9f730318937af7b1d35"} Dec 01 08:57:15 crc kubenswrapper[5004]: I1201 08:57:15.518799 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-snd4j" Dec 01 08:57:15 crc kubenswrapper[5004]: I1201 08:57:15.630285 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8sl68\" (UniqueName: \"kubernetes.io/projected/8edacfd3-e002-434d-bd0a-15400e279c58-kube-api-access-8sl68\") pod \"8edacfd3-e002-434d-bd0a-15400e279c58\" (UID: \"8edacfd3-e002-434d-bd0a-15400e279c58\") " Dec 01 08:57:15 crc kubenswrapper[5004]: I1201 08:57:15.630383 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8edacfd3-e002-434d-bd0a-15400e279c58-libvirt-combined-ca-bundle\") pod \"8edacfd3-e002-434d-bd0a-15400e279c58\" (UID: \"8edacfd3-e002-434d-bd0a-15400e279c58\") " Dec 01 08:57:15 crc kubenswrapper[5004]: I1201 08:57:15.630425 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8edacfd3-e002-434d-bd0a-15400e279c58-repo-setup-combined-ca-bundle\") pod \"8edacfd3-e002-434d-bd0a-15400e279c58\" (UID: \"8edacfd3-e002-434d-bd0a-15400e279c58\") " Dec 01 08:57:15 crc kubenswrapper[5004]: I1201 08:57:15.630464 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8edacfd3-e002-434d-bd0a-15400e279c58-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") pod \"8edacfd3-e002-434d-bd0a-15400e279c58\" (UID: \"8edacfd3-e002-434d-bd0a-15400e279c58\") " Dec 01 08:57:15 crc kubenswrapper[5004]: I1201 
08:57:15.630502 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8edacfd3-e002-434d-bd0a-15400e279c58-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"8edacfd3-e002-434d-bd0a-15400e279c58\" (UID: \"8edacfd3-e002-434d-bd0a-15400e279c58\") " Dec 01 08:57:15 crc kubenswrapper[5004]: I1201 08:57:15.630529 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8edacfd3-e002-434d-bd0a-15400e279c58-nova-combined-ca-bundle\") pod \"8edacfd3-e002-434d-bd0a-15400e279c58\" (UID: \"8edacfd3-e002-434d-bd0a-15400e279c58\") " Dec 01 08:57:15 crc kubenswrapper[5004]: I1201 08:57:15.630572 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8edacfd3-e002-434d-bd0a-15400e279c58-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"8edacfd3-e002-434d-bd0a-15400e279c58\" (UID: \"8edacfd3-e002-434d-bd0a-15400e279c58\") " Dec 01 08:57:15 crc kubenswrapper[5004]: I1201 08:57:15.630603 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8edacfd3-e002-434d-bd0a-15400e279c58-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"8edacfd3-e002-434d-bd0a-15400e279c58\" (UID: \"8edacfd3-e002-434d-bd0a-15400e279c58\") " Dec 01 08:57:15 crc kubenswrapper[5004]: I1201 08:57:15.630622 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8edacfd3-e002-434d-bd0a-15400e279c58-ssh-key\") pod \"8edacfd3-e002-434d-bd0a-15400e279c58\" (UID: \"8edacfd3-e002-434d-bd0a-15400e279c58\") " Dec 01 08:57:15 crc kubenswrapper[5004]: I1201 08:57:15.630679 5004 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8edacfd3-e002-434d-bd0a-15400e279c58-ovn-combined-ca-bundle\") pod \"8edacfd3-e002-434d-bd0a-15400e279c58\" (UID: \"8edacfd3-e002-434d-bd0a-15400e279c58\") " Dec 01 08:57:15 crc kubenswrapper[5004]: I1201 08:57:15.630728 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8edacfd3-e002-434d-bd0a-15400e279c58-neutron-metadata-combined-ca-bundle\") pod \"8edacfd3-e002-434d-bd0a-15400e279c58\" (UID: \"8edacfd3-e002-434d-bd0a-15400e279c58\") " Dec 01 08:57:15 crc kubenswrapper[5004]: I1201 08:57:15.630793 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8edacfd3-e002-434d-bd0a-15400e279c58-telemetry-combined-ca-bundle\") pod \"8edacfd3-e002-434d-bd0a-15400e279c58\" (UID: \"8edacfd3-e002-434d-bd0a-15400e279c58\") " Dec 01 08:57:15 crc kubenswrapper[5004]: I1201 08:57:15.630812 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8edacfd3-e002-434d-bd0a-15400e279c58-inventory\") pod \"8edacfd3-e002-434d-bd0a-15400e279c58\" (UID: \"8edacfd3-e002-434d-bd0a-15400e279c58\") " Dec 01 08:57:15 crc kubenswrapper[5004]: I1201 08:57:15.630873 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8edacfd3-e002-434d-bd0a-15400e279c58-bootstrap-combined-ca-bundle\") pod \"8edacfd3-e002-434d-bd0a-15400e279c58\" (UID: \"8edacfd3-e002-434d-bd0a-15400e279c58\") " Dec 01 08:57:15 crc kubenswrapper[5004]: I1201 08:57:15.630926 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/8edacfd3-e002-434d-bd0a-15400e279c58-openstack-edpm-ipam-ovn-default-certs-0\") pod \"8edacfd3-e002-434d-bd0a-15400e279c58\" (UID: \"8edacfd3-e002-434d-bd0a-15400e279c58\") " Dec 01 08:57:15 crc kubenswrapper[5004]: I1201 08:57:15.630979 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8edacfd3-e002-434d-bd0a-15400e279c58-telemetry-power-monitoring-combined-ca-bundle\") pod \"8edacfd3-e002-434d-bd0a-15400e279c58\" (UID: \"8edacfd3-e002-434d-bd0a-15400e279c58\") " Dec 01 08:57:15 crc kubenswrapper[5004]: I1201 08:57:15.636604 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8edacfd3-e002-434d-bd0a-15400e279c58-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "8edacfd3-e002-434d-bd0a-15400e279c58" (UID: "8edacfd3-e002-434d-bd0a-15400e279c58"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:57:15 crc kubenswrapper[5004]: I1201 08:57:15.637077 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8edacfd3-e002-434d-bd0a-15400e279c58-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "8edacfd3-e002-434d-bd0a-15400e279c58" (UID: "8edacfd3-e002-434d-bd0a-15400e279c58"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:57:15 crc kubenswrapper[5004]: I1201 08:57:15.637096 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8edacfd3-e002-434d-bd0a-15400e279c58-telemetry-power-monitoring-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-power-monitoring-combined-ca-bundle") pod "8edacfd3-e002-434d-bd0a-15400e279c58" (UID: "8edacfd3-e002-434d-bd0a-15400e279c58"). InnerVolumeSpecName "telemetry-power-monitoring-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:57:15 crc kubenswrapper[5004]: I1201 08:57:15.637853 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8edacfd3-e002-434d-bd0a-15400e279c58-kube-api-access-8sl68" (OuterVolumeSpecName: "kube-api-access-8sl68") pod "8edacfd3-e002-434d-bd0a-15400e279c58" (UID: "8edacfd3-e002-434d-bd0a-15400e279c58"). InnerVolumeSpecName "kube-api-access-8sl68". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:57:15 crc kubenswrapper[5004]: I1201 08:57:15.638992 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8edacfd3-e002-434d-bd0a-15400e279c58-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "8edacfd3-e002-434d-bd0a-15400e279c58" (UID: "8edacfd3-e002-434d-bd0a-15400e279c58"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:57:15 crc kubenswrapper[5004]: I1201 08:57:15.640255 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8edacfd3-e002-434d-bd0a-15400e279c58-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "8edacfd3-e002-434d-bd0a-15400e279c58" (UID: "8edacfd3-e002-434d-bd0a-15400e279c58"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:57:15 crc kubenswrapper[5004]: I1201 08:57:15.640658 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8edacfd3-e002-434d-bd0a-15400e279c58-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "8edacfd3-e002-434d-bd0a-15400e279c58" (UID: "8edacfd3-e002-434d-bd0a-15400e279c58"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:57:15 crc kubenswrapper[5004]: I1201 08:57:15.640289 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8edacfd3-e002-434d-bd0a-15400e279c58-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0") pod "8edacfd3-e002-434d-bd0a-15400e279c58" (UID: "8edacfd3-e002-434d-bd0a-15400e279c58"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:57:15 crc kubenswrapper[5004]: I1201 08:57:15.641230 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8edacfd3-e002-434d-bd0a-15400e279c58-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "8edacfd3-e002-434d-bd0a-15400e279c58" (UID: "8edacfd3-e002-434d-bd0a-15400e279c58"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:57:15 crc kubenswrapper[5004]: I1201 08:57:15.641451 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8edacfd3-e002-434d-bd0a-15400e279c58-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "8edacfd3-e002-434d-bd0a-15400e279c58" (UID: "8edacfd3-e002-434d-bd0a-15400e279c58"). 
InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:57:15 crc kubenswrapper[5004]: I1201 08:57:15.641747 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8edacfd3-e002-434d-bd0a-15400e279c58-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "8edacfd3-e002-434d-bd0a-15400e279c58" (UID: "8edacfd3-e002-434d-bd0a-15400e279c58"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:57:15 crc kubenswrapper[5004]: I1201 08:57:15.643197 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8edacfd3-e002-434d-bd0a-15400e279c58-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "8edacfd3-e002-434d-bd0a-15400e279c58" (UID: "8edacfd3-e002-434d-bd0a-15400e279c58"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:57:15 crc kubenswrapper[5004]: I1201 08:57:15.646832 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8edacfd3-e002-434d-bd0a-15400e279c58-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "8edacfd3-e002-434d-bd0a-15400e279c58" (UID: "8edacfd3-e002-434d-bd0a-15400e279c58"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:57:15 crc kubenswrapper[5004]: I1201 08:57:15.648861 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8edacfd3-e002-434d-bd0a-15400e279c58-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "8edacfd3-e002-434d-bd0a-15400e279c58" (UID: "8edacfd3-e002-434d-bd0a-15400e279c58"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:57:15 crc kubenswrapper[5004]: I1201 08:57:15.691701 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8edacfd3-e002-434d-bd0a-15400e279c58-inventory" (OuterVolumeSpecName: "inventory") pod "8edacfd3-e002-434d-bd0a-15400e279c58" (UID: "8edacfd3-e002-434d-bd0a-15400e279c58"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:57:15 crc kubenswrapper[5004]: I1201 08:57:15.705670 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8edacfd3-e002-434d-bd0a-15400e279c58-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "8edacfd3-e002-434d-bd0a-15400e279c58" (UID: "8edacfd3-e002-434d-bd0a-15400e279c58"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:57:15 crc kubenswrapper[5004]: I1201 08:57:15.733106 5004 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8edacfd3-e002-434d-bd0a-15400e279c58-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 08:57:15 crc kubenswrapper[5004]: I1201 08:57:15.733143 5004 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8edacfd3-e002-434d-bd0a-15400e279c58-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 08:57:15 crc kubenswrapper[5004]: I1201 08:57:15.733154 5004 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8edacfd3-e002-434d-bd0a-15400e279c58-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 08:57:15 crc kubenswrapper[5004]: I1201 08:57:15.733163 5004 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8edacfd3-e002-434d-bd0a-15400e279c58-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 08:57:15 crc kubenswrapper[5004]: I1201 08:57:15.733173 5004 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8edacfd3-e002-434d-bd0a-15400e279c58-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 01 08:57:15 crc kubenswrapper[5004]: I1201 08:57:15.733182 5004 reconciler_common.go:293] "Volume detached for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8edacfd3-e002-434d-bd0a-15400e279c58-telemetry-power-monitoring-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 08:57:15 crc kubenswrapper[5004]: I1201 08:57:15.733195 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8sl68\" 
(UniqueName: \"kubernetes.io/projected/8edacfd3-e002-434d-bd0a-15400e279c58-kube-api-access-8sl68\") on node \"crc\" DevicePath \"\"" Dec 01 08:57:15 crc kubenswrapper[5004]: I1201 08:57:15.733204 5004 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8edacfd3-e002-434d-bd0a-15400e279c58-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 08:57:15 crc kubenswrapper[5004]: I1201 08:57:15.733212 5004 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8edacfd3-e002-434d-bd0a-15400e279c58-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 08:57:15 crc kubenswrapper[5004]: I1201 08:57:15.733221 5004 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8edacfd3-e002-434d-bd0a-15400e279c58-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 01 08:57:15 crc kubenswrapper[5004]: I1201 08:57:15.733235 5004 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8edacfd3-e002-434d-bd0a-15400e279c58-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 01 08:57:15 crc kubenswrapper[5004]: I1201 08:57:15.733245 5004 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8edacfd3-e002-434d-bd0a-15400e279c58-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 08:57:15 crc kubenswrapper[5004]: I1201 08:57:15.733254 5004 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/8edacfd3-e002-434d-bd0a-15400e279c58-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 01 08:57:15 crc kubenswrapper[5004]: I1201 08:57:15.733272 5004 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8edacfd3-e002-434d-bd0a-15400e279c58-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 01 08:57:15 crc kubenswrapper[5004]: I1201 08:57:15.733281 5004 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8edacfd3-e002-434d-bd0a-15400e279c58-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 08:57:15 crc kubenswrapper[5004]: I1201 08:57:15.733291 5004 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8edacfd3-e002-434d-bd0a-15400e279c58-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 08:57:16 crc kubenswrapper[5004]: I1201 08:57:16.061815 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-snd4j" event={"ID":"8edacfd3-e002-434d-bd0a-15400e279c58","Type":"ContainerDied","Data":"3738491c2738faa259ece3948208956742fb9041e1ce520fade490383e4c85a2"} Dec 01 08:57:16 crc kubenswrapper[5004]: I1201 08:57:16.061853 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3738491c2738faa259ece3948208956742fb9041e1ce520fade490383e4c85a2" Dec 01 08:57:16 crc kubenswrapper[5004]: I1201 08:57:16.061869 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-snd4j" Dec 01 08:57:16 crc kubenswrapper[5004]: I1201 08:57:16.273308 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-fpw2q"] Dec 01 08:57:16 crc kubenswrapper[5004]: E1201 08:57:16.274057 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8edacfd3-e002-434d-bd0a-15400e279c58" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 01 08:57:16 crc kubenswrapper[5004]: I1201 08:57:16.274090 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="8edacfd3-e002-434d-bd0a-15400e279c58" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 01 08:57:16 crc kubenswrapper[5004]: E1201 08:57:16.274151 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b678fa31-cf5b-4aeb-8aa3-c314a17abc6c" containerName="extract-content" Dec 01 08:57:16 crc kubenswrapper[5004]: I1201 08:57:16.274164 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="b678fa31-cf5b-4aeb-8aa3-c314a17abc6c" containerName="extract-content" Dec 01 08:57:16 crc kubenswrapper[5004]: E1201 08:57:16.274208 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b678fa31-cf5b-4aeb-8aa3-c314a17abc6c" containerName="registry-server" Dec 01 08:57:16 crc kubenswrapper[5004]: I1201 08:57:16.274221 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="b678fa31-cf5b-4aeb-8aa3-c314a17abc6c" containerName="registry-server" Dec 01 08:57:16 crc kubenswrapper[5004]: E1201 08:57:16.274253 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b678fa31-cf5b-4aeb-8aa3-c314a17abc6c" containerName="extract-utilities" Dec 01 08:57:16 crc kubenswrapper[5004]: I1201 08:57:16.274267 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="b678fa31-cf5b-4aeb-8aa3-c314a17abc6c" containerName="extract-utilities" Dec 01 08:57:16 crc kubenswrapper[5004]: I1201 08:57:16.274749 
5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="b678fa31-cf5b-4aeb-8aa3-c314a17abc6c" containerName="registry-server" Dec 01 08:57:16 crc kubenswrapper[5004]: I1201 08:57:16.274851 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="8edacfd3-e002-434d-bd0a-15400e279c58" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 01 08:57:16 crc kubenswrapper[5004]: I1201 08:57:16.276080 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fpw2q" Dec 01 08:57:16 crc kubenswrapper[5004]: I1201 08:57:16.278027 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 08:57:16 crc kubenswrapper[5004]: I1201 08:57:16.284982 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Dec 01 08:57:16 crc kubenswrapper[5004]: I1201 08:57:16.285108 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 08:57:16 crc kubenswrapper[5004]: I1201 08:57:16.285194 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-pdnrq" Dec 01 08:57:16 crc kubenswrapper[5004]: I1201 08:57:16.285508 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 08:57:16 crc kubenswrapper[5004]: I1201 08:57:16.303509 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-fpw2q"] Dec 01 08:57:16 crc kubenswrapper[5004]: I1201 08:57:16.349802 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b127ecf6-67a1-486c-b90a-147c4953f2d4-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fpw2q\" (UID: 
\"b127ecf6-67a1-486c-b90a-147c4953f2d4\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fpw2q" Dec 01 08:57:16 crc kubenswrapper[5004]: I1201 08:57:16.349900 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b127ecf6-67a1-486c-b90a-147c4953f2d4-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fpw2q\" (UID: \"b127ecf6-67a1-486c-b90a-147c4953f2d4\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fpw2q" Dec 01 08:57:16 crc kubenswrapper[5004]: I1201 08:57:16.349949 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/b127ecf6-67a1-486c-b90a-147c4953f2d4-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fpw2q\" (UID: \"b127ecf6-67a1-486c-b90a-147c4953f2d4\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fpw2q" Dec 01 08:57:16 crc kubenswrapper[5004]: I1201 08:57:16.349979 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwqgt\" (UniqueName: \"kubernetes.io/projected/b127ecf6-67a1-486c-b90a-147c4953f2d4-kube-api-access-nwqgt\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fpw2q\" (UID: \"b127ecf6-67a1-486c-b90a-147c4953f2d4\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fpw2q" Dec 01 08:57:16 crc kubenswrapper[5004]: I1201 08:57:16.350006 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b127ecf6-67a1-486c-b90a-147c4953f2d4-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fpw2q\" (UID: \"b127ecf6-67a1-486c-b90a-147c4953f2d4\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fpw2q" Dec 01 08:57:16 crc kubenswrapper[5004]: I1201 08:57:16.452852 5004 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b127ecf6-67a1-486c-b90a-147c4953f2d4-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fpw2q\" (UID: \"b127ecf6-67a1-486c-b90a-147c4953f2d4\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fpw2q" Dec 01 08:57:16 crc kubenswrapper[5004]: I1201 08:57:16.452967 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b127ecf6-67a1-486c-b90a-147c4953f2d4-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fpw2q\" (UID: \"b127ecf6-67a1-486c-b90a-147c4953f2d4\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fpw2q" Dec 01 08:57:16 crc kubenswrapper[5004]: I1201 08:57:16.453022 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/b127ecf6-67a1-486c-b90a-147c4953f2d4-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fpw2q\" (UID: \"b127ecf6-67a1-486c-b90a-147c4953f2d4\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fpw2q" Dec 01 08:57:16 crc kubenswrapper[5004]: I1201 08:57:16.453053 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nwqgt\" (UniqueName: \"kubernetes.io/projected/b127ecf6-67a1-486c-b90a-147c4953f2d4-kube-api-access-nwqgt\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fpw2q\" (UID: \"b127ecf6-67a1-486c-b90a-147c4953f2d4\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fpw2q" Dec 01 08:57:16 crc kubenswrapper[5004]: I1201 08:57:16.453069 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b127ecf6-67a1-486c-b90a-147c4953f2d4-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fpw2q\" (UID: \"b127ecf6-67a1-486c-b90a-147c4953f2d4\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fpw2q" Dec 01 08:57:16 crc kubenswrapper[5004]: I1201 08:57:16.455057 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/b127ecf6-67a1-486c-b90a-147c4953f2d4-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fpw2q\" (UID: \"b127ecf6-67a1-486c-b90a-147c4953f2d4\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fpw2q" Dec 01 08:57:16 crc kubenswrapper[5004]: I1201 08:57:16.459866 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b127ecf6-67a1-486c-b90a-147c4953f2d4-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fpw2q\" (UID: \"b127ecf6-67a1-486c-b90a-147c4953f2d4\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fpw2q" Dec 01 08:57:16 crc kubenswrapper[5004]: I1201 08:57:16.460541 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b127ecf6-67a1-486c-b90a-147c4953f2d4-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fpw2q\" (UID: \"b127ecf6-67a1-486c-b90a-147c4953f2d4\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fpw2q" Dec 01 08:57:16 crc kubenswrapper[5004]: I1201 08:57:16.475290 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b127ecf6-67a1-486c-b90a-147c4953f2d4-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fpw2q\" (UID: \"b127ecf6-67a1-486c-b90a-147c4953f2d4\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fpw2q" Dec 01 08:57:16 crc kubenswrapper[5004]: I1201 08:57:16.478909 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwqgt\" (UniqueName: \"kubernetes.io/projected/b127ecf6-67a1-486c-b90a-147c4953f2d4-kube-api-access-nwqgt\") pod 
\"ovn-edpm-deployment-openstack-edpm-ipam-fpw2q\" (UID: \"b127ecf6-67a1-486c-b90a-147c4953f2d4\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fpw2q" Dec 01 08:57:16 crc kubenswrapper[5004]: I1201 08:57:16.601964 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fpw2q" Dec 01 08:57:17 crc kubenswrapper[5004]: I1201 08:57:17.013274 5004 scope.go:117] "RemoveContainer" containerID="638d9453ccc6e981c9a02c8db54414892192a8639d04934294eab12a4dcac589" Dec 01 08:57:18 crc kubenswrapper[5004]: I1201 08:57:17.309057 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-fpw2q"] Dec 01 08:57:18 crc kubenswrapper[5004]: I1201 08:57:18.086600 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fpw2q" event={"ID":"b127ecf6-67a1-486c-b90a-147c4953f2d4","Type":"ContainerStarted","Data":"fec0e5ee4084ddfe91ebb91781bede194fba273bf9f95ba3d7a8516e9fbeba43"} Dec 01 08:57:19 crc kubenswrapper[5004]: I1201 08:57:19.104915 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fpw2q" event={"ID":"b127ecf6-67a1-486c-b90a-147c4953f2d4","Type":"ContainerStarted","Data":"24960c511deceaf8cfa13fe7429304ef1e8ca504d2f955f219b1807c2635f136"} Dec 01 08:57:19 crc kubenswrapper[5004]: I1201 08:57:19.137068 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fpw2q" podStartSLOduration=2.350087821 podStartE2EDuration="3.137049637s" podCreationTimestamp="2025-12-01 08:57:16 +0000 UTC" firstStartedPulling="2025-12-01 08:57:17.314508097 +0000 UTC m=+2414.879500089" lastFinishedPulling="2025-12-01 08:57:18.101469923 +0000 UTC m=+2415.666461905" observedRunningTime="2025-12-01 08:57:19.126723755 +0000 UTC m=+2416.691715817" watchObservedRunningTime="2025-12-01 
08:57:19.137049637 +0000 UTC m=+2416.702041629" Dec 01 08:57:19 crc kubenswrapper[5004]: I1201 08:57:19.764701 5004 scope.go:117] "RemoveContainer" containerID="35549cd96a84a4d57cde0021a5b3e7f3cc68411360397308b1e916325429b080" Dec 01 08:57:19 crc kubenswrapper[5004]: E1201 08:57:19.765301 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 08:57:30 crc kubenswrapper[5004]: I1201 08:57:30.759492 5004 scope.go:117] "RemoveContainer" containerID="35549cd96a84a4d57cde0021a5b3e7f3cc68411360397308b1e916325429b080" Dec 01 08:57:30 crc kubenswrapper[5004]: E1201 08:57:30.760717 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 08:57:44 crc kubenswrapper[5004]: I1201 08:57:44.759429 5004 scope.go:117] "RemoveContainer" containerID="35549cd96a84a4d57cde0021a5b3e7f3cc68411360397308b1e916325429b080" Dec 01 08:57:44 crc kubenswrapper[5004]: E1201 08:57:44.760282 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 08:57:56 crc kubenswrapper[5004]: I1201 08:57:56.759509 5004 scope.go:117] "RemoveContainer" containerID="35549cd96a84a4d57cde0021a5b3e7f3cc68411360397308b1e916325429b080" Dec 01 08:57:56 crc kubenswrapper[5004]: E1201 08:57:56.760425 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 08:58:08 crc kubenswrapper[5004]: I1201 08:58:08.759343 5004 scope.go:117] "RemoveContainer" containerID="35549cd96a84a4d57cde0021a5b3e7f3cc68411360397308b1e916325429b080" Dec 01 08:58:08 crc kubenswrapper[5004]: E1201 08:58:08.760502 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 08:58:19 crc kubenswrapper[5004]: I1201 08:58:19.758882 5004 scope.go:117] "RemoveContainer" containerID="35549cd96a84a4d57cde0021a5b3e7f3cc68411360397308b1e916325429b080" Dec 01 08:58:19 crc kubenswrapper[5004]: E1201 08:58:19.759890 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 08:58:28 crc kubenswrapper[5004]: I1201 08:58:28.955584 5004 generic.go:334] "Generic (PLEG): container finished" podID="b127ecf6-67a1-486c-b90a-147c4953f2d4" containerID="24960c511deceaf8cfa13fe7429304ef1e8ca504d2f955f219b1807c2635f136" exitCode=0 Dec 01 08:58:28 crc kubenswrapper[5004]: I1201 08:58:28.955679 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fpw2q" event={"ID":"b127ecf6-67a1-486c-b90a-147c4953f2d4","Type":"ContainerDied","Data":"24960c511deceaf8cfa13fe7429304ef1e8ca504d2f955f219b1807c2635f136"} Dec 01 08:58:30 crc kubenswrapper[5004]: I1201 08:58:30.508233 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fpw2q" Dec 01 08:58:30 crc kubenswrapper[5004]: I1201 08:58:30.666577 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nwqgt\" (UniqueName: \"kubernetes.io/projected/b127ecf6-67a1-486c-b90a-147c4953f2d4-kube-api-access-nwqgt\") pod \"b127ecf6-67a1-486c-b90a-147c4953f2d4\" (UID: \"b127ecf6-67a1-486c-b90a-147c4953f2d4\") " Dec 01 08:58:30 crc kubenswrapper[5004]: I1201 08:58:30.666879 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b127ecf6-67a1-486c-b90a-147c4953f2d4-ssh-key\") pod \"b127ecf6-67a1-486c-b90a-147c4953f2d4\" (UID: \"b127ecf6-67a1-486c-b90a-147c4953f2d4\") " Dec 01 08:58:30 crc kubenswrapper[5004]: I1201 08:58:30.666987 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b127ecf6-67a1-486c-b90a-147c4953f2d4-inventory\") pod 
\"b127ecf6-67a1-486c-b90a-147c4953f2d4\" (UID: \"b127ecf6-67a1-486c-b90a-147c4953f2d4\") " Dec 01 08:58:30 crc kubenswrapper[5004]: I1201 08:58:30.667049 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b127ecf6-67a1-486c-b90a-147c4953f2d4-ovn-combined-ca-bundle\") pod \"b127ecf6-67a1-486c-b90a-147c4953f2d4\" (UID: \"b127ecf6-67a1-486c-b90a-147c4953f2d4\") " Dec 01 08:58:30 crc kubenswrapper[5004]: I1201 08:58:30.667149 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/b127ecf6-67a1-486c-b90a-147c4953f2d4-ovncontroller-config-0\") pod \"b127ecf6-67a1-486c-b90a-147c4953f2d4\" (UID: \"b127ecf6-67a1-486c-b90a-147c4953f2d4\") " Dec 01 08:58:30 crc kubenswrapper[5004]: I1201 08:58:30.673798 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b127ecf6-67a1-486c-b90a-147c4953f2d4-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "b127ecf6-67a1-486c-b90a-147c4953f2d4" (UID: "b127ecf6-67a1-486c-b90a-147c4953f2d4"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:58:30 crc kubenswrapper[5004]: I1201 08:58:30.673849 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b127ecf6-67a1-486c-b90a-147c4953f2d4-kube-api-access-nwqgt" (OuterVolumeSpecName: "kube-api-access-nwqgt") pod "b127ecf6-67a1-486c-b90a-147c4953f2d4" (UID: "b127ecf6-67a1-486c-b90a-147c4953f2d4"). InnerVolumeSpecName "kube-api-access-nwqgt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:58:30 crc kubenswrapper[5004]: I1201 08:58:30.698961 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b127ecf6-67a1-486c-b90a-147c4953f2d4-inventory" (OuterVolumeSpecName: "inventory") pod "b127ecf6-67a1-486c-b90a-147c4953f2d4" (UID: "b127ecf6-67a1-486c-b90a-147c4953f2d4"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:58:30 crc kubenswrapper[5004]: I1201 08:58:30.700389 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b127ecf6-67a1-486c-b90a-147c4953f2d4-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "b127ecf6-67a1-486c-b90a-147c4953f2d4" (UID: "b127ecf6-67a1-486c-b90a-147c4953f2d4"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:58:30 crc kubenswrapper[5004]: I1201 08:58:30.702792 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b127ecf6-67a1-486c-b90a-147c4953f2d4-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "b127ecf6-67a1-486c-b90a-147c4953f2d4" (UID: "b127ecf6-67a1-486c-b90a-147c4953f2d4"). InnerVolumeSpecName "ovncontroller-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 08:58:30 crc kubenswrapper[5004]: I1201 08:58:30.769459 5004 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/b127ecf6-67a1-486c-b90a-147c4953f2d4-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Dec 01 08:58:30 crc kubenswrapper[5004]: I1201 08:58:30.769491 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nwqgt\" (UniqueName: \"kubernetes.io/projected/b127ecf6-67a1-486c-b90a-147c4953f2d4-kube-api-access-nwqgt\") on node \"crc\" DevicePath \"\"" Dec 01 08:58:30 crc kubenswrapper[5004]: I1201 08:58:30.769500 5004 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b127ecf6-67a1-486c-b90a-147c4953f2d4-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 08:58:30 crc kubenswrapper[5004]: I1201 08:58:30.769508 5004 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b127ecf6-67a1-486c-b90a-147c4953f2d4-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 08:58:30 crc kubenswrapper[5004]: I1201 08:58:30.769517 5004 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b127ecf6-67a1-486c-b90a-147c4953f2d4-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 08:58:31 crc kubenswrapper[5004]: I1201 08:58:30.999987 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fpw2q" event={"ID":"b127ecf6-67a1-486c-b90a-147c4953f2d4","Type":"ContainerDied","Data":"fec0e5ee4084ddfe91ebb91781bede194fba273bf9f95ba3d7a8516e9fbeba43"} Dec 01 08:58:31 crc kubenswrapper[5004]: I1201 08:58:31.000367 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fec0e5ee4084ddfe91ebb91781bede194fba273bf9f95ba3d7a8516e9fbeba43" Dec 01 08:58:31 crc kubenswrapper[5004]: 
I1201 08:58:31.000632 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fpw2q" Dec 01 08:58:31 crc kubenswrapper[5004]: I1201 08:58:31.199013 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kt6tb"] Dec 01 08:58:31 crc kubenswrapper[5004]: E1201 08:58:31.200035 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b127ecf6-67a1-486c-b90a-147c4953f2d4" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 01 08:58:31 crc kubenswrapper[5004]: I1201 08:58:31.200076 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="b127ecf6-67a1-486c-b90a-147c4953f2d4" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 01 08:58:31 crc kubenswrapper[5004]: I1201 08:58:31.200593 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="b127ecf6-67a1-486c-b90a-147c4953f2d4" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 01 08:58:31 crc kubenswrapper[5004]: I1201 08:58:31.202098 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kt6tb" Dec 01 08:58:31 crc kubenswrapper[5004]: I1201 08:58:31.207486 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 08:58:31 crc kubenswrapper[5004]: I1201 08:58:31.207593 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Dec 01 08:58:31 crc kubenswrapper[5004]: I1201 08:58:31.208258 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Dec 01 08:58:31 crc kubenswrapper[5004]: I1201 08:58:31.208727 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 08:58:31 crc kubenswrapper[5004]: I1201 08:58:31.210035 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 08:58:31 crc kubenswrapper[5004]: I1201 08:58:31.210544 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-pdnrq" Dec 01 08:58:31 crc kubenswrapper[5004]: I1201 08:58:31.221419 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kt6tb"] Dec 01 08:58:31 crc kubenswrapper[5004]: I1201 08:58:31.389772 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2b5p\" (UniqueName: \"kubernetes.io/projected/c7804cf5-e266-43a4-a6da-6e0686d4897c-kube-api-access-f2b5p\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-kt6tb\" (UID: \"c7804cf5-e266-43a4-a6da-6e0686d4897c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kt6tb" Dec 01 08:58:31 crc kubenswrapper[5004]: I1201 08:58:31.389863 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c7804cf5-e266-43a4-a6da-6e0686d4897c-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-kt6tb\" (UID: \"c7804cf5-e266-43a4-a6da-6e0686d4897c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kt6tb" Dec 01 08:58:31 crc kubenswrapper[5004]: I1201 08:58:31.389902 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c7804cf5-e266-43a4-a6da-6e0686d4897c-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-kt6tb\" (UID: \"c7804cf5-e266-43a4-a6da-6e0686d4897c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kt6tb" Dec 01 08:58:31 crc kubenswrapper[5004]: I1201 08:58:31.390141 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c7804cf5-e266-43a4-a6da-6e0686d4897c-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-kt6tb\" (UID: \"c7804cf5-e266-43a4-a6da-6e0686d4897c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kt6tb" Dec 01 08:58:31 crc kubenswrapper[5004]: I1201 08:58:31.390279 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7804cf5-e266-43a4-a6da-6e0686d4897c-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-kt6tb\" (UID: \"c7804cf5-e266-43a4-a6da-6e0686d4897c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kt6tb" Dec 01 08:58:31 crc kubenswrapper[5004]: I1201 08:58:31.390433 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: 
\"kubernetes.io/secret/c7804cf5-e266-43a4-a6da-6e0686d4897c-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-kt6tb\" (UID: \"c7804cf5-e266-43a4-a6da-6e0686d4897c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kt6tb" Dec 01 08:58:31 crc kubenswrapper[5004]: I1201 08:58:31.492853 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7804cf5-e266-43a4-a6da-6e0686d4897c-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-kt6tb\" (UID: \"c7804cf5-e266-43a4-a6da-6e0686d4897c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kt6tb" Dec 01 08:58:31 crc kubenswrapper[5004]: I1201 08:58:31.493182 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c7804cf5-e266-43a4-a6da-6e0686d4897c-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-kt6tb\" (UID: \"c7804cf5-e266-43a4-a6da-6e0686d4897c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kt6tb" Dec 01 08:58:31 crc kubenswrapper[5004]: I1201 08:58:31.493290 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f2b5p\" (UniqueName: \"kubernetes.io/projected/c7804cf5-e266-43a4-a6da-6e0686d4897c-kube-api-access-f2b5p\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-kt6tb\" (UID: \"c7804cf5-e266-43a4-a6da-6e0686d4897c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kt6tb" Dec 01 08:58:31 crc kubenswrapper[5004]: I1201 08:58:31.493329 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c7804cf5-e266-43a4-a6da-6e0686d4897c-inventory\") 
pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-kt6tb\" (UID: \"c7804cf5-e266-43a4-a6da-6e0686d4897c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kt6tb" Dec 01 08:58:31 crc kubenswrapper[5004]: I1201 08:58:31.493359 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c7804cf5-e266-43a4-a6da-6e0686d4897c-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-kt6tb\" (UID: \"c7804cf5-e266-43a4-a6da-6e0686d4897c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kt6tb" Dec 01 08:58:31 crc kubenswrapper[5004]: I1201 08:58:31.493719 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c7804cf5-e266-43a4-a6da-6e0686d4897c-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-kt6tb\" (UID: \"c7804cf5-e266-43a4-a6da-6e0686d4897c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kt6tb" Dec 01 08:58:31 crc kubenswrapper[5004]: I1201 08:58:31.498283 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c7804cf5-e266-43a4-a6da-6e0686d4897c-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-kt6tb\" (UID: \"c7804cf5-e266-43a4-a6da-6e0686d4897c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kt6tb" Dec 01 08:58:31 crc kubenswrapper[5004]: I1201 08:58:31.499295 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c7804cf5-e266-43a4-a6da-6e0686d4897c-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-kt6tb\" (UID: \"c7804cf5-e266-43a4-a6da-6e0686d4897c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kt6tb" Dec 01 08:58:31 crc kubenswrapper[5004]: 
I1201 08:58:31.499325 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c7804cf5-e266-43a4-a6da-6e0686d4897c-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-kt6tb\" (UID: \"c7804cf5-e266-43a4-a6da-6e0686d4897c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kt6tb" Dec 01 08:58:31 crc kubenswrapper[5004]: I1201 08:58:31.500107 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c7804cf5-e266-43a4-a6da-6e0686d4897c-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-kt6tb\" (UID: \"c7804cf5-e266-43a4-a6da-6e0686d4897c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kt6tb" Dec 01 08:58:31 crc kubenswrapper[5004]: I1201 08:58:31.504998 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7804cf5-e266-43a4-a6da-6e0686d4897c-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-kt6tb\" (UID: \"c7804cf5-e266-43a4-a6da-6e0686d4897c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kt6tb" Dec 01 08:58:31 crc kubenswrapper[5004]: I1201 08:58:31.515342 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2b5p\" (UniqueName: \"kubernetes.io/projected/c7804cf5-e266-43a4-a6da-6e0686d4897c-kube-api-access-f2b5p\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-kt6tb\" (UID: \"c7804cf5-e266-43a4-a6da-6e0686d4897c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kt6tb" Dec 01 08:58:31 crc kubenswrapper[5004]: I1201 08:58:31.585901 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kt6tb" Dec 01 08:58:32 crc kubenswrapper[5004]: I1201 08:58:32.167422 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kt6tb"] Dec 01 08:58:33 crc kubenswrapper[5004]: I1201 08:58:33.027662 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kt6tb" event={"ID":"c7804cf5-e266-43a4-a6da-6e0686d4897c","Type":"ContainerStarted","Data":"d621536436f5a84e668f1b86a201d3806491ca4361b5219a222d3fddeb323cba"} Dec 01 08:58:33 crc kubenswrapper[5004]: I1201 08:58:33.028035 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kt6tb" event={"ID":"c7804cf5-e266-43a4-a6da-6e0686d4897c","Type":"ContainerStarted","Data":"31f77990ac6e6b7219e33cea12b4aa1af19ecbd4d427fd0932e73515e96bc140"} Dec 01 08:58:33 crc kubenswrapper[5004]: I1201 08:58:33.054894 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kt6tb" podStartSLOduration=1.518941296 podStartE2EDuration="2.054874352s" podCreationTimestamp="2025-12-01 08:58:31 +0000 UTC" firstStartedPulling="2025-12-01 08:58:32.158211825 +0000 UTC m=+2489.723203807" lastFinishedPulling="2025-12-01 08:58:32.694144871 +0000 UTC m=+2490.259136863" observedRunningTime="2025-12-01 08:58:33.052994076 +0000 UTC m=+2490.617986078" watchObservedRunningTime="2025-12-01 08:58:33.054874352 +0000 UTC m=+2490.619866344" Dec 01 08:58:34 crc kubenswrapper[5004]: I1201 08:58:34.760302 5004 scope.go:117] "RemoveContainer" containerID="35549cd96a84a4d57cde0021a5b3e7f3cc68411360397308b1e916325429b080" Dec 01 08:58:34 crc kubenswrapper[5004]: E1201 08:58:34.761648 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 08:58:45 crc kubenswrapper[5004]: I1201 08:58:45.759291 5004 scope.go:117] "RemoveContainer" containerID="35549cd96a84a4d57cde0021a5b3e7f3cc68411360397308b1e916325429b080" Dec 01 08:58:45 crc kubenswrapper[5004]: E1201 08:58:45.760639 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 08:58:56 crc kubenswrapper[5004]: I1201 08:58:56.760354 5004 scope.go:117] "RemoveContainer" containerID="35549cd96a84a4d57cde0021a5b3e7f3cc68411360397308b1e916325429b080" Dec 01 08:58:56 crc kubenswrapper[5004]: E1201 08:58:56.780422 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 08:59:10 crc kubenswrapper[5004]: I1201 08:59:10.759807 5004 scope.go:117] "RemoveContainer" containerID="35549cd96a84a4d57cde0021a5b3e7f3cc68411360397308b1e916325429b080" Dec 01 08:59:10 crc kubenswrapper[5004]: E1201 08:59:10.760452 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 08:59:24 crc kubenswrapper[5004]: I1201 08:59:24.760537 5004 scope.go:117] "RemoveContainer" containerID="35549cd96a84a4d57cde0021a5b3e7f3cc68411360397308b1e916325429b080" Dec 01 08:59:24 crc kubenswrapper[5004]: E1201 08:59:24.763067 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 08:59:25 crc kubenswrapper[5004]: I1201 08:59:25.698914 5004 generic.go:334] "Generic (PLEG): container finished" podID="c7804cf5-e266-43a4-a6da-6e0686d4897c" containerID="d621536436f5a84e668f1b86a201d3806491ca4361b5219a222d3fddeb323cba" exitCode=0 Dec 01 08:59:25 crc kubenswrapper[5004]: I1201 08:59:25.698964 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kt6tb" event={"ID":"c7804cf5-e266-43a4-a6da-6e0686d4897c","Type":"ContainerDied","Data":"d621536436f5a84e668f1b86a201d3806491ca4361b5219a222d3fddeb323cba"} Dec 01 08:59:27 crc kubenswrapper[5004]: I1201 08:59:27.258550 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kt6tb" Dec 01 08:59:27 crc kubenswrapper[5004]: I1201 08:59:27.419495 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7804cf5-e266-43a4-a6da-6e0686d4897c-neutron-metadata-combined-ca-bundle\") pod \"c7804cf5-e266-43a4-a6da-6e0686d4897c\" (UID: \"c7804cf5-e266-43a4-a6da-6e0686d4897c\") " Dec 01 08:59:27 crc kubenswrapper[5004]: I1201 08:59:27.420038 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f2b5p\" (UniqueName: \"kubernetes.io/projected/c7804cf5-e266-43a4-a6da-6e0686d4897c-kube-api-access-f2b5p\") pod \"c7804cf5-e266-43a4-a6da-6e0686d4897c\" (UID: \"c7804cf5-e266-43a4-a6da-6e0686d4897c\") " Dec 01 08:59:27 crc kubenswrapper[5004]: I1201 08:59:27.420239 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c7804cf5-e266-43a4-a6da-6e0686d4897c-inventory\") pod \"c7804cf5-e266-43a4-a6da-6e0686d4897c\" (UID: \"c7804cf5-e266-43a4-a6da-6e0686d4897c\") " Dec 01 08:59:27 crc kubenswrapper[5004]: I1201 08:59:27.420326 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c7804cf5-e266-43a4-a6da-6e0686d4897c-nova-metadata-neutron-config-0\") pod \"c7804cf5-e266-43a4-a6da-6e0686d4897c\" (UID: \"c7804cf5-e266-43a4-a6da-6e0686d4897c\") " Dec 01 08:59:27 crc kubenswrapper[5004]: I1201 08:59:27.420385 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c7804cf5-e266-43a4-a6da-6e0686d4897c-ssh-key\") pod \"c7804cf5-e266-43a4-a6da-6e0686d4897c\" (UID: \"c7804cf5-e266-43a4-a6da-6e0686d4897c\") " Dec 01 08:59:27 crc kubenswrapper[5004]: I1201 08:59:27.420498 5004 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c7804cf5-e266-43a4-a6da-6e0686d4897c-neutron-ovn-metadata-agent-neutron-config-0\") pod \"c7804cf5-e266-43a4-a6da-6e0686d4897c\" (UID: \"c7804cf5-e266-43a4-a6da-6e0686d4897c\") " Dec 01 08:59:27 crc kubenswrapper[5004]: I1201 08:59:27.427397 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7804cf5-e266-43a4-a6da-6e0686d4897c-kube-api-access-f2b5p" (OuterVolumeSpecName: "kube-api-access-f2b5p") pod "c7804cf5-e266-43a4-a6da-6e0686d4897c" (UID: "c7804cf5-e266-43a4-a6da-6e0686d4897c"). InnerVolumeSpecName "kube-api-access-f2b5p". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 08:59:27 crc kubenswrapper[5004]: I1201 08:59:27.429542 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7804cf5-e266-43a4-a6da-6e0686d4897c-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "c7804cf5-e266-43a4-a6da-6e0686d4897c" (UID: "c7804cf5-e266-43a4-a6da-6e0686d4897c"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:59:27 crc kubenswrapper[5004]: I1201 08:59:27.452374 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7804cf5-e266-43a4-a6da-6e0686d4897c-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "c7804cf5-e266-43a4-a6da-6e0686d4897c" (UID: "c7804cf5-e266-43a4-a6da-6e0686d4897c"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:59:27 crc kubenswrapper[5004]: I1201 08:59:27.462364 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7804cf5-e266-43a4-a6da-6e0686d4897c-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "c7804cf5-e266-43a4-a6da-6e0686d4897c" (UID: "c7804cf5-e266-43a4-a6da-6e0686d4897c"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:59:27 crc kubenswrapper[5004]: I1201 08:59:27.469018 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7804cf5-e266-43a4-a6da-6e0686d4897c-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "c7804cf5-e266-43a4-a6da-6e0686d4897c" (UID: "c7804cf5-e266-43a4-a6da-6e0686d4897c"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:59:27 crc kubenswrapper[5004]: I1201 08:59:27.472667 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7804cf5-e266-43a4-a6da-6e0686d4897c-inventory" (OuterVolumeSpecName: "inventory") pod "c7804cf5-e266-43a4-a6da-6e0686d4897c" (UID: "c7804cf5-e266-43a4-a6da-6e0686d4897c"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 08:59:27 crc kubenswrapper[5004]: I1201 08:59:27.524833 5004 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c7804cf5-e266-43a4-a6da-6e0686d4897c-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 01 08:59:27 crc kubenswrapper[5004]: I1201 08:59:27.524874 5004 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c7804cf5-e266-43a4-a6da-6e0686d4897c-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 08:59:27 crc kubenswrapper[5004]: I1201 08:59:27.524891 5004 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c7804cf5-e266-43a4-a6da-6e0686d4897c-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 01 08:59:27 crc kubenswrapper[5004]: I1201 08:59:27.524906 5004 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7804cf5-e266-43a4-a6da-6e0686d4897c-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 08:59:27 crc kubenswrapper[5004]: I1201 08:59:27.524920 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f2b5p\" (UniqueName: \"kubernetes.io/projected/c7804cf5-e266-43a4-a6da-6e0686d4897c-kube-api-access-f2b5p\") on node \"crc\" DevicePath \"\"" Dec 01 08:59:27 crc kubenswrapper[5004]: I1201 08:59:27.524933 5004 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c7804cf5-e266-43a4-a6da-6e0686d4897c-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 08:59:27 crc kubenswrapper[5004]: I1201 08:59:27.728009 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kt6tb" 
event={"ID":"c7804cf5-e266-43a4-a6da-6e0686d4897c","Type":"ContainerDied","Data":"31f77990ac6e6b7219e33cea12b4aa1af19ecbd4d427fd0932e73515e96bc140"} Dec 01 08:59:27 crc kubenswrapper[5004]: I1201 08:59:27.728052 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="31f77990ac6e6b7219e33cea12b4aa1af19ecbd4d427fd0932e73515e96bc140" Dec 01 08:59:27 crc kubenswrapper[5004]: I1201 08:59:27.728081 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-kt6tb" Dec 01 08:59:27 crc kubenswrapper[5004]: I1201 08:59:27.837096 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2p8kr"] Dec 01 08:59:27 crc kubenswrapper[5004]: E1201 08:59:27.837576 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7804cf5-e266-43a4-a6da-6e0686d4897c" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 01 08:59:27 crc kubenswrapper[5004]: I1201 08:59:27.837589 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7804cf5-e266-43a4-a6da-6e0686d4897c" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 01 08:59:27 crc kubenswrapper[5004]: I1201 08:59:27.837811 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7804cf5-e266-43a4-a6da-6e0686d4897c" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 01 08:59:27 crc kubenswrapper[5004]: I1201 08:59:27.838569 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2p8kr" Dec 01 08:59:27 crc kubenswrapper[5004]: I1201 08:59:27.840904 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Dec 01 08:59:27 crc kubenswrapper[5004]: I1201 08:59:27.841154 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 08:59:27 crc kubenswrapper[5004]: I1201 08:59:27.844187 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 08:59:27 crc kubenswrapper[5004]: I1201 08:59:27.844682 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 08:59:27 crc kubenswrapper[5004]: I1201 08:59:27.859351 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-pdnrq" Dec 01 08:59:27 crc kubenswrapper[5004]: I1201 08:59:27.873553 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2p8kr"] Dec 01 08:59:27 crc kubenswrapper[5004]: I1201 08:59:27.934932 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ebe1a3e0-aa10-4437-adea-bcb08ba1e7ce-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-2p8kr\" (UID: \"ebe1a3e0-aa10-4437-adea-bcb08ba1e7ce\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2p8kr" Dec 01 08:59:27 crc kubenswrapper[5004]: I1201 08:59:27.935014 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/ebe1a3e0-aa10-4437-adea-bcb08ba1e7ce-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-2p8kr\" (UID: \"ebe1a3e0-aa10-4437-adea-bcb08ba1e7ce\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2p8kr" Dec 01 08:59:27 crc kubenswrapper[5004]: I1201 08:59:27.935056 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebe1a3e0-aa10-4437-adea-bcb08ba1e7ce-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-2p8kr\" (UID: \"ebe1a3e0-aa10-4437-adea-bcb08ba1e7ce\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2p8kr" Dec 01 08:59:27 crc kubenswrapper[5004]: I1201 08:59:27.935082 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzgxj\" (UniqueName: \"kubernetes.io/projected/ebe1a3e0-aa10-4437-adea-bcb08ba1e7ce-kube-api-access-dzgxj\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-2p8kr\" (UID: \"ebe1a3e0-aa10-4437-adea-bcb08ba1e7ce\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2p8kr" Dec 01 08:59:27 crc kubenswrapper[5004]: I1201 08:59:27.935121 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ebe1a3e0-aa10-4437-adea-bcb08ba1e7ce-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-2p8kr\" (UID: \"ebe1a3e0-aa10-4437-adea-bcb08ba1e7ce\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2p8kr" Dec 01 08:59:28 crc kubenswrapper[5004]: I1201 08:59:28.037372 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebe1a3e0-aa10-4437-adea-bcb08ba1e7ce-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-2p8kr\" (UID: \"ebe1a3e0-aa10-4437-adea-bcb08ba1e7ce\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2p8kr" Dec 01 08:59:28 crc kubenswrapper[5004]: I1201 08:59:28.037436 5004 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-dzgxj\" (UniqueName: \"kubernetes.io/projected/ebe1a3e0-aa10-4437-adea-bcb08ba1e7ce-kube-api-access-dzgxj\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-2p8kr\" (UID: \"ebe1a3e0-aa10-4437-adea-bcb08ba1e7ce\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2p8kr" Dec 01 08:59:28 crc kubenswrapper[5004]: I1201 08:59:28.037489 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ebe1a3e0-aa10-4437-adea-bcb08ba1e7ce-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-2p8kr\" (UID: \"ebe1a3e0-aa10-4437-adea-bcb08ba1e7ce\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2p8kr" Dec 01 08:59:28 crc kubenswrapper[5004]: I1201 08:59:28.037654 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ebe1a3e0-aa10-4437-adea-bcb08ba1e7ce-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-2p8kr\" (UID: \"ebe1a3e0-aa10-4437-adea-bcb08ba1e7ce\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2p8kr" Dec 01 08:59:28 crc kubenswrapper[5004]: I1201 08:59:28.037715 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/ebe1a3e0-aa10-4437-adea-bcb08ba1e7ce-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-2p8kr\" (UID: \"ebe1a3e0-aa10-4437-adea-bcb08ba1e7ce\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2p8kr" Dec 01 08:59:28 crc kubenswrapper[5004]: I1201 08:59:28.043008 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/ebe1a3e0-aa10-4437-adea-bcb08ba1e7ce-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-2p8kr\" (UID: \"ebe1a3e0-aa10-4437-adea-bcb08ba1e7ce\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2p8kr" Dec 01 08:59:28 crc kubenswrapper[5004]: I1201 08:59:28.043012 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ebe1a3e0-aa10-4437-adea-bcb08ba1e7ce-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-2p8kr\" (UID: \"ebe1a3e0-aa10-4437-adea-bcb08ba1e7ce\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2p8kr" Dec 01 08:59:28 crc kubenswrapper[5004]: I1201 08:59:28.046988 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ebe1a3e0-aa10-4437-adea-bcb08ba1e7ce-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-2p8kr\" (UID: \"ebe1a3e0-aa10-4437-adea-bcb08ba1e7ce\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2p8kr" Dec 01 08:59:28 crc kubenswrapper[5004]: I1201 08:59:28.049691 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebe1a3e0-aa10-4437-adea-bcb08ba1e7ce-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-2p8kr\" (UID: \"ebe1a3e0-aa10-4437-adea-bcb08ba1e7ce\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2p8kr" Dec 01 08:59:28 crc kubenswrapper[5004]: I1201 08:59:28.055221 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzgxj\" (UniqueName: \"kubernetes.io/projected/ebe1a3e0-aa10-4437-adea-bcb08ba1e7ce-kube-api-access-dzgxj\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-2p8kr\" (UID: \"ebe1a3e0-aa10-4437-adea-bcb08ba1e7ce\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2p8kr" Dec 01 08:59:28 crc kubenswrapper[5004]: I1201 08:59:28.161055 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2p8kr" Dec 01 08:59:28 crc kubenswrapper[5004]: I1201 08:59:28.815548 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2p8kr"] Dec 01 08:59:29 crc kubenswrapper[5004]: I1201 08:59:29.752957 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2p8kr" event={"ID":"ebe1a3e0-aa10-4437-adea-bcb08ba1e7ce","Type":"ContainerStarted","Data":"ea83bba39161f7a46776209069767b27a5c40515473306f8edcce012c77ec1f1"} Dec 01 08:59:29 crc kubenswrapper[5004]: I1201 08:59:29.753346 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2p8kr" event={"ID":"ebe1a3e0-aa10-4437-adea-bcb08ba1e7ce","Type":"ContainerStarted","Data":"9244370e4c0cfce998ee6da9f233d96b3bfe912c6daf918d13d3b109d71e28c6"} Dec 01 08:59:29 crc kubenswrapper[5004]: I1201 08:59:29.770842 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2p8kr" podStartSLOduration=2.253696409 podStartE2EDuration="2.770823055s" podCreationTimestamp="2025-12-01 08:59:27 +0000 UTC" firstStartedPulling="2025-12-01 08:59:28.819154272 +0000 UTC m=+2546.384146284" lastFinishedPulling="2025-12-01 08:59:29.336280918 +0000 UTC m=+2546.901272930" observedRunningTime="2025-12-01 08:59:29.77024158 +0000 UTC m=+2547.335233582" watchObservedRunningTime="2025-12-01 08:59:29.770823055 +0000 UTC m=+2547.335815057" Dec 01 08:59:37 crc kubenswrapper[5004]: I1201 08:59:37.760034 5004 scope.go:117] "RemoveContainer" containerID="35549cd96a84a4d57cde0021a5b3e7f3cc68411360397308b1e916325429b080" Dec 01 08:59:37 crc kubenswrapper[5004]: E1201 08:59:37.760823 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 08:59:49 crc kubenswrapper[5004]: I1201 08:59:49.758763 5004 scope.go:117] "RemoveContainer" containerID="35549cd96a84a4d57cde0021a5b3e7f3cc68411360397308b1e916325429b080" Dec 01 08:59:49 crc kubenswrapper[5004]: E1201 08:59:49.759591 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 09:00:00 crc kubenswrapper[5004]: I1201 09:00:00.168301 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409660-ggjd9"] Dec 01 09:00:00 crc kubenswrapper[5004]: I1201 09:00:00.171747 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409660-ggjd9" Dec 01 09:00:00 crc kubenswrapper[5004]: I1201 09:00:00.174942 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 01 09:00:00 crc kubenswrapper[5004]: I1201 09:00:00.175444 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 01 09:00:00 crc kubenswrapper[5004]: I1201 09:00:00.179075 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409660-ggjd9"] Dec 01 09:00:00 crc kubenswrapper[5004]: I1201 09:00:00.351424 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdxnt\" (UniqueName: \"kubernetes.io/projected/a83901fd-e069-4f8c-81d7-de04d71937f5-kube-api-access-zdxnt\") pod \"collect-profiles-29409660-ggjd9\" (UID: \"a83901fd-e069-4f8c-81d7-de04d71937f5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409660-ggjd9" Dec 01 09:00:00 crc kubenswrapper[5004]: I1201 09:00:00.352454 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a83901fd-e069-4f8c-81d7-de04d71937f5-config-volume\") pod \"collect-profiles-29409660-ggjd9\" (UID: \"a83901fd-e069-4f8c-81d7-de04d71937f5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409660-ggjd9" Dec 01 09:00:00 crc kubenswrapper[5004]: I1201 09:00:00.352602 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a83901fd-e069-4f8c-81d7-de04d71937f5-secret-volume\") pod \"collect-profiles-29409660-ggjd9\" (UID: \"a83901fd-e069-4f8c-81d7-de04d71937f5\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29409660-ggjd9" Dec 01 09:00:00 crc kubenswrapper[5004]: I1201 09:00:00.454906 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdxnt\" (UniqueName: \"kubernetes.io/projected/a83901fd-e069-4f8c-81d7-de04d71937f5-kube-api-access-zdxnt\") pod \"collect-profiles-29409660-ggjd9\" (UID: \"a83901fd-e069-4f8c-81d7-de04d71937f5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409660-ggjd9" Dec 01 09:00:00 crc kubenswrapper[5004]: I1201 09:00:00.455111 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a83901fd-e069-4f8c-81d7-de04d71937f5-config-volume\") pod \"collect-profiles-29409660-ggjd9\" (UID: \"a83901fd-e069-4f8c-81d7-de04d71937f5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409660-ggjd9" Dec 01 09:00:00 crc kubenswrapper[5004]: I1201 09:00:00.455147 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a83901fd-e069-4f8c-81d7-de04d71937f5-secret-volume\") pod \"collect-profiles-29409660-ggjd9\" (UID: \"a83901fd-e069-4f8c-81d7-de04d71937f5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409660-ggjd9" Dec 01 09:00:00 crc kubenswrapper[5004]: I1201 09:00:00.456084 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a83901fd-e069-4f8c-81d7-de04d71937f5-config-volume\") pod \"collect-profiles-29409660-ggjd9\" (UID: \"a83901fd-e069-4f8c-81d7-de04d71937f5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409660-ggjd9" Dec 01 09:00:00 crc kubenswrapper[5004]: I1201 09:00:00.461391 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/a83901fd-e069-4f8c-81d7-de04d71937f5-secret-volume\") pod \"collect-profiles-29409660-ggjd9\" (UID: \"a83901fd-e069-4f8c-81d7-de04d71937f5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409660-ggjd9" Dec 01 09:00:00 crc kubenswrapper[5004]: I1201 09:00:00.476707 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdxnt\" (UniqueName: \"kubernetes.io/projected/a83901fd-e069-4f8c-81d7-de04d71937f5-kube-api-access-zdxnt\") pod \"collect-profiles-29409660-ggjd9\" (UID: \"a83901fd-e069-4f8c-81d7-de04d71937f5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409660-ggjd9" Dec 01 09:00:00 crc kubenswrapper[5004]: I1201 09:00:00.532666 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409660-ggjd9" Dec 01 09:00:01 crc kubenswrapper[5004]: I1201 09:00:01.045259 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409660-ggjd9"] Dec 01 09:00:01 crc kubenswrapper[5004]: I1201 09:00:01.172319 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409660-ggjd9" event={"ID":"a83901fd-e069-4f8c-81d7-de04d71937f5","Type":"ContainerStarted","Data":"6dd39522fb795b3632eedaa72cdfe94d6078b6eafe7806bdcd7059623f4e2a3a"} Dec 01 09:00:02 crc kubenswrapper[5004]: I1201 09:00:02.192285 5004 generic.go:334] "Generic (PLEG): container finished" podID="a83901fd-e069-4f8c-81d7-de04d71937f5" containerID="362700c191262c4990d04a446016dfe8188328086e503082fd9a3d596f761576" exitCode=0 Dec 01 09:00:02 crc kubenswrapper[5004]: I1201 09:00:02.192367 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409660-ggjd9" 
event={"ID":"a83901fd-e069-4f8c-81d7-de04d71937f5","Type":"ContainerDied","Data":"362700c191262c4990d04a446016dfe8188328086e503082fd9a3d596f761576"} Dec 01 09:00:03 crc kubenswrapper[5004]: I1201 09:00:03.627197 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409660-ggjd9" Dec 01 09:00:03 crc kubenswrapper[5004]: I1201 09:00:03.659633 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a83901fd-e069-4f8c-81d7-de04d71937f5-secret-volume\") pod \"a83901fd-e069-4f8c-81d7-de04d71937f5\" (UID: \"a83901fd-e069-4f8c-81d7-de04d71937f5\") " Dec 01 09:00:03 crc kubenswrapper[5004]: I1201 09:00:03.660698 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zdxnt\" (UniqueName: \"kubernetes.io/projected/a83901fd-e069-4f8c-81d7-de04d71937f5-kube-api-access-zdxnt\") pod \"a83901fd-e069-4f8c-81d7-de04d71937f5\" (UID: \"a83901fd-e069-4f8c-81d7-de04d71937f5\") " Dec 01 09:00:03 crc kubenswrapper[5004]: I1201 09:00:03.660839 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a83901fd-e069-4f8c-81d7-de04d71937f5-config-volume\") pod \"a83901fd-e069-4f8c-81d7-de04d71937f5\" (UID: \"a83901fd-e069-4f8c-81d7-de04d71937f5\") " Dec 01 09:00:03 crc kubenswrapper[5004]: I1201 09:00:03.661432 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a83901fd-e069-4f8c-81d7-de04d71937f5-config-volume" (OuterVolumeSpecName: "config-volume") pod "a83901fd-e069-4f8c-81d7-de04d71937f5" (UID: "a83901fd-e069-4f8c-81d7-de04d71937f5"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:00:03 crc kubenswrapper[5004]: I1201 09:00:03.661948 5004 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a83901fd-e069-4f8c-81d7-de04d71937f5-config-volume\") on node \"crc\" DevicePath \"\"" Dec 01 09:00:03 crc kubenswrapper[5004]: I1201 09:00:03.671003 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a83901fd-e069-4f8c-81d7-de04d71937f5-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "a83901fd-e069-4f8c-81d7-de04d71937f5" (UID: "a83901fd-e069-4f8c-81d7-de04d71937f5"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:00:03 crc kubenswrapper[5004]: I1201 09:00:03.671040 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a83901fd-e069-4f8c-81d7-de04d71937f5-kube-api-access-zdxnt" (OuterVolumeSpecName: "kube-api-access-zdxnt") pod "a83901fd-e069-4f8c-81d7-de04d71937f5" (UID: "a83901fd-e069-4f8c-81d7-de04d71937f5"). InnerVolumeSpecName "kube-api-access-zdxnt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:00:03 crc kubenswrapper[5004]: I1201 09:00:03.764124 5004 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a83901fd-e069-4f8c-81d7-de04d71937f5-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 01 09:00:03 crc kubenswrapper[5004]: I1201 09:00:03.764150 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zdxnt\" (UniqueName: \"kubernetes.io/projected/a83901fd-e069-4f8c-81d7-de04d71937f5-kube-api-access-zdxnt\") on node \"crc\" DevicePath \"\"" Dec 01 09:00:04 crc kubenswrapper[5004]: I1201 09:00:04.215979 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409660-ggjd9" event={"ID":"a83901fd-e069-4f8c-81d7-de04d71937f5","Type":"ContainerDied","Data":"6dd39522fb795b3632eedaa72cdfe94d6078b6eafe7806bdcd7059623f4e2a3a"} Dec 01 09:00:04 crc kubenswrapper[5004]: I1201 09:00:04.216029 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6dd39522fb795b3632eedaa72cdfe94d6078b6eafe7806bdcd7059623f4e2a3a" Dec 01 09:00:04 crc kubenswrapper[5004]: I1201 09:00:04.216046 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409660-ggjd9" Dec 01 09:00:04 crc kubenswrapper[5004]: I1201 09:00:04.733768 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409615-5d85l"] Dec 01 09:00:04 crc kubenswrapper[5004]: I1201 09:00:04.746471 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409615-5d85l"] Dec 01 09:00:04 crc kubenswrapper[5004]: I1201 09:00:04.759456 5004 scope.go:117] "RemoveContainer" containerID="35549cd96a84a4d57cde0021a5b3e7f3cc68411360397308b1e916325429b080" Dec 01 09:00:04 crc kubenswrapper[5004]: E1201 09:00:04.759912 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 09:00:04 crc kubenswrapper[5004]: I1201 09:00:04.774820 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f397145-18ab-4b43-b133-cc42f45bc852" path="/var/lib/kubelet/pods/4f397145-18ab-4b43-b133-cc42f45bc852/volumes" Dec 01 09:00:15 crc kubenswrapper[5004]: I1201 09:00:15.759291 5004 scope.go:117] "RemoveContainer" containerID="35549cd96a84a4d57cde0021a5b3e7f3cc68411360397308b1e916325429b080" Dec 01 09:00:15 crc kubenswrapper[5004]: E1201 09:00:15.760304 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 09:00:17 crc kubenswrapper[5004]: I1201 09:00:17.165353 5004 scope.go:117] "RemoveContainer" containerID="88637e7089d85bcdeb4f8e3236cd4a87fe28fff0b21507a073034b85e41e17b7" Dec 01 09:00:26 crc kubenswrapper[5004]: I1201 09:00:26.758931 5004 scope.go:117] "RemoveContainer" containerID="35549cd96a84a4d57cde0021a5b3e7f3cc68411360397308b1e916325429b080" Dec 01 09:00:26 crc kubenswrapper[5004]: E1201 09:00:26.759818 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 09:00:37 crc kubenswrapper[5004]: I1201 09:00:37.759304 5004 scope.go:117] "RemoveContainer" containerID="35549cd96a84a4d57cde0021a5b3e7f3cc68411360397308b1e916325429b080" Dec 01 09:00:37 crc kubenswrapper[5004]: E1201 09:00:37.760211 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 09:00:45 crc kubenswrapper[5004]: I1201 09:00:45.348414 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-m77z2"] Dec 01 09:00:45 crc kubenswrapper[5004]: E1201 09:00:45.349577 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a83901fd-e069-4f8c-81d7-de04d71937f5" 
containerName="collect-profiles" Dec 01 09:00:45 crc kubenswrapper[5004]: I1201 09:00:45.349591 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="a83901fd-e069-4f8c-81d7-de04d71937f5" containerName="collect-profiles" Dec 01 09:00:45 crc kubenswrapper[5004]: I1201 09:00:45.349863 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="a83901fd-e069-4f8c-81d7-de04d71937f5" containerName="collect-profiles" Dec 01 09:00:45 crc kubenswrapper[5004]: I1201 09:00:45.351802 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-m77z2" Dec 01 09:00:45 crc kubenswrapper[5004]: I1201 09:00:45.377179 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-m77z2"] Dec 01 09:00:45 crc kubenswrapper[5004]: I1201 09:00:45.586836 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97009300-9e05-4604-a960-bb770a9e7edf-catalog-content\") pod \"redhat-operators-m77z2\" (UID: \"97009300-9e05-4604-a960-bb770a9e7edf\") " pod="openshift-marketplace/redhat-operators-m77z2" Dec 01 09:00:45 crc kubenswrapper[5004]: I1201 09:00:45.586993 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97009300-9e05-4604-a960-bb770a9e7edf-utilities\") pod \"redhat-operators-m77z2\" (UID: \"97009300-9e05-4604-a960-bb770a9e7edf\") " pod="openshift-marketplace/redhat-operators-m77z2" Dec 01 09:00:45 crc kubenswrapper[5004]: I1201 09:00:45.587043 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zt2d7\" (UniqueName: \"kubernetes.io/projected/97009300-9e05-4604-a960-bb770a9e7edf-kube-api-access-zt2d7\") pod \"redhat-operators-m77z2\" (UID: \"97009300-9e05-4604-a960-bb770a9e7edf\") " 
pod="openshift-marketplace/redhat-operators-m77z2" Dec 01 09:00:45 crc kubenswrapper[5004]: I1201 09:00:45.689424 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97009300-9e05-4604-a960-bb770a9e7edf-utilities\") pod \"redhat-operators-m77z2\" (UID: \"97009300-9e05-4604-a960-bb770a9e7edf\") " pod="openshift-marketplace/redhat-operators-m77z2" Dec 01 09:00:45 crc kubenswrapper[5004]: I1201 09:00:45.689511 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zt2d7\" (UniqueName: \"kubernetes.io/projected/97009300-9e05-4604-a960-bb770a9e7edf-kube-api-access-zt2d7\") pod \"redhat-operators-m77z2\" (UID: \"97009300-9e05-4604-a960-bb770a9e7edf\") " pod="openshift-marketplace/redhat-operators-m77z2" Dec 01 09:00:45 crc kubenswrapper[5004]: I1201 09:00:45.689633 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97009300-9e05-4604-a960-bb770a9e7edf-catalog-content\") pod \"redhat-operators-m77z2\" (UID: \"97009300-9e05-4604-a960-bb770a9e7edf\") " pod="openshift-marketplace/redhat-operators-m77z2" Dec 01 09:00:45 crc kubenswrapper[5004]: I1201 09:00:45.690078 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97009300-9e05-4604-a960-bb770a9e7edf-utilities\") pod \"redhat-operators-m77z2\" (UID: \"97009300-9e05-4604-a960-bb770a9e7edf\") " pod="openshift-marketplace/redhat-operators-m77z2" Dec 01 09:00:45 crc kubenswrapper[5004]: I1201 09:00:45.690093 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97009300-9e05-4604-a960-bb770a9e7edf-catalog-content\") pod \"redhat-operators-m77z2\" (UID: \"97009300-9e05-4604-a960-bb770a9e7edf\") " pod="openshift-marketplace/redhat-operators-m77z2" Dec 01 09:00:45 crc 
kubenswrapper[5004]: I1201 09:00:45.714970 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zt2d7\" (UniqueName: \"kubernetes.io/projected/97009300-9e05-4604-a960-bb770a9e7edf-kube-api-access-zt2d7\") pod \"redhat-operators-m77z2\" (UID: \"97009300-9e05-4604-a960-bb770a9e7edf\") " pod="openshift-marketplace/redhat-operators-m77z2" Dec 01 09:00:45 crc kubenswrapper[5004]: I1201 09:00:45.821994 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-m77z2" Dec 01 09:00:46 crc kubenswrapper[5004]: I1201 09:00:46.329956 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-m77z2"] Dec 01 09:00:46 crc kubenswrapper[5004]: I1201 09:00:46.736991 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m77z2" event={"ID":"97009300-9e05-4604-a960-bb770a9e7edf","Type":"ContainerStarted","Data":"879fe80b6af958a0e5e4b02ad778e7c37199bb80f9a675dfc1bf8a6eebedcf4d"} Dec 01 09:00:46 crc kubenswrapper[5004]: I1201 09:00:46.737047 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m77z2" event={"ID":"97009300-9e05-4604-a960-bb770a9e7edf","Type":"ContainerStarted","Data":"70969cc8aa681bf347d7c11b74f1bcbd02cc7677b559c8d2e4883ebb1cdc3f92"} Dec 01 09:00:47 crc kubenswrapper[5004]: I1201 09:00:47.749652 5004 generic.go:334] "Generic (PLEG): container finished" podID="97009300-9e05-4604-a960-bb770a9e7edf" containerID="879fe80b6af958a0e5e4b02ad778e7c37199bb80f9a675dfc1bf8a6eebedcf4d" exitCode=0 Dec 01 09:00:47 crc kubenswrapper[5004]: I1201 09:00:47.749762 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m77z2" event={"ID":"97009300-9e05-4604-a960-bb770a9e7edf","Type":"ContainerDied","Data":"879fe80b6af958a0e5e4b02ad778e7c37199bb80f9a675dfc1bf8a6eebedcf4d"} Dec 01 09:00:47 crc kubenswrapper[5004]: I1201 
09:00:47.754049 5004 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 01 09:00:52 crc kubenswrapper[5004]: I1201 09:00:52.773509 5004 scope.go:117] "RemoveContainer" containerID="35549cd96a84a4d57cde0021a5b3e7f3cc68411360397308b1e916325429b080" Dec 01 09:00:52 crc kubenswrapper[5004]: E1201 09:00:52.775015 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 09:00:52 crc kubenswrapper[5004]: I1201 09:00:52.814018 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m77z2" event={"ID":"97009300-9e05-4604-a960-bb770a9e7edf","Type":"ContainerStarted","Data":"8be8bc6cfe0cb3a8d5ffb9f28b7b7d192b673389f173df1554e76aa44615da83"} Dec 01 09:00:55 crc kubenswrapper[5004]: I1201 09:00:55.872400 5004 generic.go:334] "Generic (PLEG): container finished" podID="97009300-9e05-4604-a960-bb770a9e7edf" containerID="8be8bc6cfe0cb3a8d5ffb9f28b7b7d192b673389f173df1554e76aa44615da83" exitCode=0 Dec 01 09:00:55 crc kubenswrapper[5004]: I1201 09:00:55.872940 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m77z2" event={"ID":"97009300-9e05-4604-a960-bb770a9e7edf","Type":"ContainerDied","Data":"8be8bc6cfe0cb3a8d5ffb9f28b7b7d192b673389f173df1554e76aa44615da83"} Dec 01 09:00:57 crc kubenswrapper[5004]: I1201 09:00:57.900791 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m77z2" 
event={"ID":"97009300-9e05-4604-a960-bb770a9e7edf","Type":"ContainerStarted","Data":"d6f77387e13c8bdb5fc51f7776b06794c6b24839fb09936feabaaf3652025578"} Dec 01 09:00:57 crc kubenswrapper[5004]: I1201 09:00:57.928454 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-m77z2" podStartSLOduration=4.052340584 podStartE2EDuration="12.928434984s" podCreationTimestamp="2025-12-01 09:00:45 +0000 UTC" firstStartedPulling="2025-12-01 09:00:47.753688886 +0000 UTC m=+2625.318680878" lastFinishedPulling="2025-12-01 09:00:56.629783286 +0000 UTC m=+2634.194775278" observedRunningTime="2025-12-01 09:00:57.918776118 +0000 UTC m=+2635.483768090" watchObservedRunningTime="2025-12-01 09:00:57.928434984 +0000 UTC m=+2635.493426966" Dec 01 09:01:00 crc kubenswrapper[5004]: I1201 09:01:00.158579 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29409661-kbzbs"] Dec 01 09:01:00 crc kubenswrapper[5004]: I1201 09:01:00.160393 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29409661-kbzbs" Dec 01 09:01:00 crc kubenswrapper[5004]: I1201 09:01:00.193177 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29409661-kbzbs"] Dec 01 09:01:00 crc kubenswrapper[5004]: I1201 09:01:00.226476 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbh7d\" (UniqueName: \"kubernetes.io/projected/d92bc2f4-8519-45f5-bf8c-f825ce955687-kube-api-access-xbh7d\") pod \"keystone-cron-29409661-kbzbs\" (UID: \"d92bc2f4-8519-45f5-bf8c-f825ce955687\") " pod="openstack/keystone-cron-29409661-kbzbs" Dec 01 09:01:00 crc kubenswrapper[5004]: I1201 09:01:00.226655 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d92bc2f4-8519-45f5-bf8c-f825ce955687-config-data\") pod \"keystone-cron-29409661-kbzbs\" (UID: \"d92bc2f4-8519-45f5-bf8c-f825ce955687\") " pod="openstack/keystone-cron-29409661-kbzbs" Dec 01 09:01:00 crc kubenswrapper[5004]: I1201 09:01:00.226709 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d92bc2f4-8519-45f5-bf8c-f825ce955687-combined-ca-bundle\") pod \"keystone-cron-29409661-kbzbs\" (UID: \"d92bc2f4-8519-45f5-bf8c-f825ce955687\") " pod="openstack/keystone-cron-29409661-kbzbs" Dec 01 09:01:00 crc kubenswrapper[5004]: I1201 09:01:00.226783 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d92bc2f4-8519-45f5-bf8c-f825ce955687-fernet-keys\") pod \"keystone-cron-29409661-kbzbs\" (UID: \"d92bc2f4-8519-45f5-bf8c-f825ce955687\") " pod="openstack/keystone-cron-29409661-kbzbs" Dec 01 09:01:00 crc kubenswrapper[5004]: I1201 09:01:00.328322 5004 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-xbh7d\" (UniqueName: \"kubernetes.io/projected/d92bc2f4-8519-45f5-bf8c-f825ce955687-kube-api-access-xbh7d\") pod \"keystone-cron-29409661-kbzbs\" (UID: \"d92bc2f4-8519-45f5-bf8c-f825ce955687\") " pod="openstack/keystone-cron-29409661-kbzbs" Dec 01 09:01:00 crc kubenswrapper[5004]: I1201 09:01:00.328440 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d92bc2f4-8519-45f5-bf8c-f825ce955687-config-data\") pod \"keystone-cron-29409661-kbzbs\" (UID: \"d92bc2f4-8519-45f5-bf8c-f825ce955687\") " pod="openstack/keystone-cron-29409661-kbzbs" Dec 01 09:01:00 crc kubenswrapper[5004]: I1201 09:01:00.328472 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d92bc2f4-8519-45f5-bf8c-f825ce955687-combined-ca-bundle\") pod \"keystone-cron-29409661-kbzbs\" (UID: \"d92bc2f4-8519-45f5-bf8c-f825ce955687\") " pod="openstack/keystone-cron-29409661-kbzbs" Dec 01 09:01:00 crc kubenswrapper[5004]: I1201 09:01:00.328523 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d92bc2f4-8519-45f5-bf8c-f825ce955687-fernet-keys\") pod \"keystone-cron-29409661-kbzbs\" (UID: \"d92bc2f4-8519-45f5-bf8c-f825ce955687\") " pod="openstack/keystone-cron-29409661-kbzbs" Dec 01 09:01:00 crc kubenswrapper[5004]: I1201 09:01:00.334490 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d92bc2f4-8519-45f5-bf8c-f825ce955687-fernet-keys\") pod \"keystone-cron-29409661-kbzbs\" (UID: \"d92bc2f4-8519-45f5-bf8c-f825ce955687\") " pod="openstack/keystone-cron-29409661-kbzbs" Dec 01 09:01:00 crc kubenswrapper[5004]: I1201 09:01:00.336889 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d92bc2f4-8519-45f5-bf8c-f825ce955687-combined-ca-bundle\") pod \"keystone-cron-29409661-kbzbs\" (UID: \"d92bc2f4-8519-45f5-bf8c-f825ce955687\") " pod="openstack/keystone-cron-29409661-kbzbs" Dec 01 09:01:00 crc kubenswrapper[5004]: I1201 09:01:00.337758 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d92bc2f4-8519-45f5-bf8c-f825ce955687-config-data\") pod \"keystone-cron-29409661-kbzbs\" (UID: \"d92bc2f4-8519-45f5-bf8c-f825ce955687\") " pod="openstack/keystone-cron-29409661-kbzbs" Dec 01 09:01:00 crc kubenswrapper[5004]: I1201 09:01:00.356896 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbh7d\" (UniqueName: \"kubernetes.io/projected/d92bc2f4-8519-45f5-bf8c-f825ce955687-kube-api-access-xbh7d\") pod \"keystone-cron-29409661-kbzbs\" (UID: \"d92bc2f4-8519-45f5-bf8c-f825ce955687\") " pod="openstack/keystone-cron-29409661-kbzbs" Dec 01 09:01:00 crc kubenswrapper[5004]: I1201 09:01:00.485950 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29409661-kbzbs" Dec 01 09:01:01 crc kubenswrapper[5004]: I1201 09:01:01.035427 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29409661-kbzbs"] Dec 01 09:01:01 crc kubenswrapper[5004]: I1201 09:01:01.944976 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29409661-kbzbs" event={"ID":"d92bc2f4-8519-45f5-bf8c-f825ce955687","Type":"ContainerStarted","Data":"4b318cb33849dc37649f4c77d2499052d287e7ce36d836c1c6866d29eea136bf"} Dec 01 09:01:01 crc kubenswrapper[5004]: I1201 09:01:01.945355 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29409661-kbzbs" event={"ID":"d92bc2f4-8519-45f5-bf8c-f825ce955687","Type":"ContainerStarted","Data":"c5ed92ee24673a57556a4b7a4dca2250961efa21c2dbd8d358a1d0dc4f14380b"} Dec 01 09:01:01 crc kubenswrapper[5004]: I1201 09:01:01.965656 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29409661-kbzbs" podStartSLOduration=1.9656313509999999 podStartE2EDuration="1.965631351s" podCreationTimestamp="2025-12-01 09:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:01:01.960431715 +0000 UTC m=+2639.525423697" watchObservedRunningTime="2025-12-01 09:01:01.965631351 +0000 UTC m=+2639.530623373" Dec 01 09:01:04 crc kubenswrapper[5004]: I1201 09:01:04.002537 5004 generic.go:334] "Generic (PLEG): container finished" podID="d92bc2f4-8519-45f5-bf8c-f825ce955687" containerID="4b318cb33849dc37649f4c77d2499052d287e7ce36d836c1c6866d29eea136bf" exitCode=0 Dec 01 09:01:04 crc kubenswrapper[5004]: I1201 09:01:04.002611 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29409661-kbzbs" 
event={"ID":"d92bc2f4-8519-45f5-bf8c-f825ce955687","Type":"ContainerDied","Data":"4b318cb33849dc37649f4c77d2499052d287e7ce36d836c1c6866d29eea136bf"} Dec 01 09:01:04 crc kubenswrapper[5004]: I1201 09:01:04.759368 5004 scope.go:117] "RemoveContainer" containerID="35549cd96a84a4d57cde0021a5b3e7f3cc68411360397308b1e916325429b080" Dec 01 09:01:04 crc kubenswrapper[5004]: E1201 09:01:04.759702 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 09:01:05 crc kubenswrapper[5004]: I1201 09:01:05.410037 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29409661-kbzbs" Dec 01 09:01:05 crc kubenswrapper[5004]: I1201 09:01:05.567538 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d92bc2f4-8519-45f5-bf8c-f825ce955687-fernet-keys\") pod \"d92bc2f4-8519-45f5-bf8c-f825ce955687\" (UID: \"d92bc2f4-8519-45f5-bf8c-f825ce955687\") " Dec 01 09:01:05 crc kubenswrapper[5004]: I1201 09:01:05.567679 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d92bc2f4-8519-45f5-bf8c-f825ce955687-config-data\") pod \"d92bc2f4-8519-45f5-bf8c-f825ce955687\" (UID: \"d92bc2f4-8519-45f5-bf8c-f825ce955687\") " Dec 01 09:01:05 crc kubenswrapper[5004]: I1201 09:01:05.567737 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xbh7d\" (UniqueName: \"kubernetes.io/projected/d92bc2f4-8519-45f5-bf8c-f825ce955687-kube-api-access-xbh7d\") pod 
\"d92bc2f4-8519-45f5-bf8c-f825ce955687\" (UID: \"d92bc2f4-8519-45f5-bf8c-f825ce955687\") " Dec 01 09:01:05 crc kubenswrapper[5004]: I1201 09:01:05.567766 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d92bc2f4-8519-45f5-bf8c-f825ce955687-combined-ca-bundle\") pod \"d92bc2f4-8519-45f5-bf8c-f825ce955687\" (UID: \"d92bc2f4-8519-45f5-bf8c-f825ce955687\") " Dec 01 09:01:05 crc kubenswrapper[5004]: I1201 09:01:05.579731 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d92bc2f4-8519-45f5-bf8c-f825ce955687-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "d92bc2f4-8519-45f5-bf8c-f825ce955687" (UID: "d92bc2f4-8519-45f5-bf8c-f825ce955687"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:01:05 crc kubenswrapper[5004]: I1201 09:01:05.581721 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d92bc2f4-8519-45f5-bf8c-f825ce955687-kube-api-access-xbh7d" (OuterVolumeSpecName: "kube-api-access-xbh7d") pod "d92bc2f4-8519-45f5-bf8c-f825ce955687" (UID: "d92bc2f4-8519-45f5-bf8c-f825ce955687"). InnerVolumeSpecName "kube-api-access-xbh7d". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:01:05 crc kubenswrapper[5004]: I1201 09:01:05.601397 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d92bc2f4-8519-45f5-bf8c-f825ce955687-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d92bc2f4-8519-45f5-bf8c-f825ce955687" (UID: "d92bc2f4-8519-45f5-bf8c-f825ce955687"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:01:05 crc kubenswrapper[5004]: I1201 09:01:05.635138 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d92bc2f4-8519-45f5-bf8c-f825ce955687-config-data" (OuterVolumeSpecName: "config-data") pod "d92bc2f4-8519-45f5-bf8c-f825ce955687" (UID: "d92bc2f4-8519-45f5-bf8c-f825ce955687"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:01:05 crc kubenswrapper[5004]: I1201 09:01:05.671367 5004 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d92bc2f4-8519-45f5-bf8c-f825ce955687-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 01 09:01:05 crc kubenswrapper[5004]: I1201 09:01:05.671415 5004 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d92bc2f4-8519-45f5-bf8c-f825ce955687-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 09:01:05 crc kubenswrapper[5004]: I1201 09:01:05.671435 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xbh7d\" (UniqueName: \"kubernetes.io/projected/d92bc2f4-8519-45f5-bf8c-f825ce955687-kube-api-access-xbh7d\") on node \"crc\" DevicePath \"\"" Dec 01 09:01:05 crc kubenswrapper[5004]: I1201 09:01:05.671455 5004 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d92bc2f4-8519-45f5-bf8c-f825ce955687-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 09:01:05 crc kubenswrapper[5004]: I1201 09:01:05.822246 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-m77z2" Dec 01 09:01:05 crc kubenswrapper[5004]: I1201 09:01:05.822659 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-m77z2" Dec 01 09:01:06 crc kubenswrapper[5004]: I1201 09:01:06.026161 5004 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29409661-kbzbs" Dec 01 09:01:06 crc kubenswrapper[5004]: I1201 09:01:06.026222 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29409661-kbzbs" event={"ID":"d92bc2f4-8519-45f5-bf8c-f825ce955687","Type":"ContainerDied","Data":"c5ed92ee24673a57556a4b7a4dca2250961efa21c2dbd8d358a1d0dc4f14380b"} Dec 01 09:01:06 crc kubenswrapper[5004]: I1201 09:01:06.026255 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c5ed92ee24673a57556a4b7a4dca2250961efa21c2dbd8d358a1d0dc4f14380b" Dec 01 09:01:06 crc kubenswrapper[5004]: I1201 09:01:06.878035 5004 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-m77z2" podUID="97009300-9e05-4604-a960-bb770a9e7edf" containerName="registry-server" probeResult="failure" output=< Dec 01 09:01:06 crc kubenswrapper[5004]: timeout: failed to connect service ":50051" within 1s Dec 01 09:01:06 crc kubenswrapper[5004]: > Dec 01 09:01:15 crc kubenswrapper[5004]: I1201 09:01:15.915978 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-m77z2" Dec 01 09:01:15 crc kubenswrapper[5004]: I1201 09:01:15.974611 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-m77z2" Dec 01 09:01:16 crc kubenswrapper[5004]: I1201 09:01:16.549718 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-m77z2"] Dec 01 09:01:17 crc kubenswrapper[5004]: I1201 09:01:17.143018 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-m77z2" podUID="97009300-9e05-4604-a960-bb770a9e7edf" containerName="registry-server" containerID="cri-o://d6f77387e13c8bdb5fc51f7776b06794c6b24839fb09936feabaaf3652025578" gracePeriod=2 Dec 01 09:01:17 
crc kubenswrapper[5004]: I1201 09:01:17.669172 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-m77z2" Dec 01 09:01:17 crc kubenswrapper[5004]: I1201 09:01:17.759830 5004 scope.go:117] "RemoveContainer" containerID="35549cd96a84a4d57cde0021a5b3e7f3cc68411360397308b1e916325429b080" Dec 01 09:01:17 crc kubenswrapper[5004]: E1201 09:01:17.760217 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 09:01:17 crc kubenswrapper[5004]: I1201 09:01:17.761590 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97009300-9e05-4604-a960-bb770a9e7edf-catalog-content\") pod \"97009300-9e05-4604-a960-bb770a9e7edf\" (UID: \"97009300-9e05-4604-a960-bb770a9e7edf\") " Dec 01 09:01:17 crc kubenswrapper[5004]: I1201 09:01:17.761824 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97009300-9e05-4604-a960-bb770a9e7edf-utilities\") pod \"97009300-9e05-4604-a960-bb770a9e7edf\" (UID: \"97009300-9e05-4604-a960-bb770a9e7edf\") " Dec 01 09:01:17 crc kubenswrapper[5004]: I1201 09:01:17.761869 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zt2d7\" (UniqueName: \"kubernetes.io/projected/97009300-9e05-4604-a960-bb770a9e7edf-kube-api-access-zt2d7\") pod \"97009300-9e05-4604-a960-bb770a9e7edf\" (UID: \"97009300-9e05-4604-a960-bb770a9e7edf\") " Dec 01 09:01:17 crc kubenswrapper[5004]: I1201 09:01:17.762788 5004 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97009300-9e05-4604-a960-bb770a9e7edf-utilities" (OuterVolumeSpecName: "utilities") pod "97009300-9e05-4604-a960-bb770a9e7edf" (UID: "97009300-9e05-4604-a960-bb770a9e7edf"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:01:17 crc kubenswrapper[5004]: I1201 09:01:17.763283 5004 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97009300-9e05-4604-a960-bb770a9e7edf-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 09:01:17 crc kubenswrapper[5004]: I1201 09:01:17.778488 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97009300-9e05-4604-a960-bb770a9e7edf-kube-api-access-zt2d7" (OuterVolumeSpecName: "kube-api-access-zt2d7") pod "97009300-9e05-4604-a960-bb770a9e7edf" (UID: "97009300-9e05-4604-a960-bb770a9e7edf"). InnerVolumeSpecName "kube-api-access-zt2d7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:01:17 crc kubenswrapper[5004]: I1201 09:01:17.865871 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zt2d7\" (UniqueName: \"kubernetes.io/projected/97009300-9e05-4604-a960-bb770a9e7edf-kube-api-access-zt2d7\") on node \"crc\" DevicePath \"\"" Dec 01 09:01:17 crc kubenswrapper[5004]: I1201 09:01:17.887331 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97009300-9e05-4604-a960-bb770a9e7edf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "97009300-9e05-4604-a960-bb770a9e7edf" (UID: "97009300-9e05-4604-a960-bb770a9e7edf"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:01:17 crc kubenswrapper[5004]: I1201 09:01:17.968607 5004 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97009300-9e05-4604-a960-bb770a9e7edf-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 09:01:18 crc kubenswrapper[5004]: I1201 09:01:18.157031 5004 generic.go:334] "Generic (PLEG): container finished" podID="97009300-9e05-4604-a960-bb770a9e7edf" containerID="d6f77387e13c8bdb5fc51f7776b06794c6b24839fb09936feabaaf3652025578" exitCode=0 Dec 01 09:01:18 crc kubenswrapper[5004]: I1201 09:01:18.157089 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m77z2" event={"ID":"97009300-9e05-4604-a960-bb770a9e7edf","Type":"ContainerDied","Data":"d6f77387e13c8bdb5fc51f7776b06794c6b24839fb09936feabaaf3652025578"} Dec 01 09:01:18 crc kubenswrapper[5004]: I1201 09:01:18.157140 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m77z2" event={"ID":"97009300-9e05-4604-a960-bb770a9e7edf","Type":"ContainerDied","Data":"70969cc8aa681bf347d7c11b74f1bcbd02cc7677b559c8d2e4883ebb1cdc3f92"} Dec 01 09:01:18 crc kubenswrapper[5004]: I1201 09:01:18.157165 5004 scope.go:117] "RemoveContainer" containerID="d6f77387e13c8bdb5fc51f7776b06794c6b24839fb09936feabaaf3652025578" Dec 01 09:01:18 crc kubenswrapper[5004]: I1201 09:01:18.158163 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-m77z2" Dec 01 09:01:18 crc kubenswrapper[5004]: I1201 09:01:18.180053 5004 scope.go:117] "RemoveContainer" containerID="8be8bc6cfe0cb3a8d5ffb9f28b7b7d192b673389f173df1554e76aa44615da83" Dec 01 09:01:18 crc kubenswrapper[5004]: I1201 09:01:18.209023 5004 scope.go:117] "RemoveContainer" containerID="879fe80b6af958a0e5e4b02ad778e7c37199bb80f9a675dfc1bf8a6eebedcf4d" Dec 01 09:01:18 crc kubenswrapper[5004]: I1201 09:01:18.215900 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-m77z2"] Dec 01 09:01:18 crc kubenswrapper[5004]: I1201 09:01:18.232870 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-m77z2"] Dec 01 09:01:18 crc kubenswrapper[5004]: I1201 09:01:18.276226 5004 scope.go:117] "RemoveContainer" containerID="d6f77387e13c8bdb5fc51f7776b06794c6b24839fb09936feabaaf3652025578" Dec 01 09:01:18 crc kubenswrapper[5004]: E1201 09:01:18.276727 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6f77387e13c8bdb5fc51f7776b06794c6b24839fb09936feabaaf3652025578\": container with ID starting with d6f77387e13c8bdb5fc51f7776b06794c6b24839fb09936feabaaf3652025578 not found: ID does not exist" containerID="d6f77387e13c8bdb5fc51f7776b06794c6b24839fb09936feabaaf3652025578" Dec 01 09:01:18 crc kubenswrapper[5004]: I1201 09:01:18.276778 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6f77387e13c8bdb5fc51f7776b06794c6b24839fb09936feabaaf3652025578"} err="failed to get container status \"d6f77387e13c8bdb5fc51f7776b06794c6b24839fb09936feabaaf3652025578\": rpc error: code = NotFound desc = could not find container \"d6f77387e13c8bdb5fc51f7776b06794c6b24839fb09936feabaaf3652025578\": container with ID starting with d6f77387e13c8bdb5fc51f7776b06794c6b24839fb09936feabaaf3652025578 not found: ID does 
not exist" Dec 01 09:01:18 crc kubenswrapper[5004]: I1201 09:01:18.276809 5004 scope.go:117] "RemoveContainer" containerID="8be8bc6cfe0cb3a8d5ffb9f28b7b7d192b673389f173df1554e76aa44615da83" Dec 01 09:01:18 crc kubenswrapper[5004]: E1201 09:01:18.277211 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8be8bc6cfe0cb3a8d5ffb9f28b7b7d192b673389f173df1554e76aa44615da83\": container with ID starting with 8be8bc6cfe0cb3a8d5ffb9f28b7b7d192b673389f173df1554e76aa44615da83 not found: ID does not exist" containerID="8be8bc6cfe0cb3a8d5ffb9f28b7b7d192b673389f173df1554e76aa44615da83" Dec 01 09:01:18 crc kubenswrapper[5004]: I1201 09:01:18.277282 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8be8bc6cfe0cb3a8d5ffb9f28b7b7d192b673389f173df1554e76aa44615da83"} err="failed to get container status \"8be8bc6cfe0cb3a8d5ffb9f28b7b7d192b673389f173df1554e76aa44615da83\": rpc error: code = NotFound desc = could not find container \"8be8bc6cfe0cb3a8d5ffb9f28b7b7d192b673389f173df1554e76aa44615da83\": container with ID starting with 8be8bc6cfe0cb3a8d5ffb9f28b7b7d192b673389f173df1554e76aa44615da83 not found: ID does not exist" Dec 01 09:01:18 crc kubenswrapper[5004]: I1201 09:01:18.277310 5004 scope.go:117] "RemoveContainer" containerID="879fe80b6af958a0e5e4b02ad778e7c37199bb80f9a675dfc1bf8a6eebedcf4d" Dec 01 09:01:18 crc kubenswrapper[5004]: E1201 09:01:18.277825 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"879fe80b6af958a0e5e4b02ad778e7c37199bb80f9a675dfc1bf8a6eebedcf4d\": container with ID starting with 879fe80b6af958a0e5e4b02ad778e7c37199bb80f9a675dfc1bf8a6eebedcf4d not found: ID does not exist" containerID="879fe80b6af958a0e5e4b02ad778e7c37199bb80f9a675dfc1bf8a6eebedcf4d" Dec 01 09:01:18 crc kubenswrapper[5004]: I1201 09:01:18.277867 5004 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"879fe80b6af958a0e5e4b02ad778e7c37199bb80f9a675dfc1bf8a6eebedcf4d"} err="failed to get container status \"879fe80b6af958a0e5e4b02ad778e7c37199bb80f9a675dfc1bf8a6eebedcf4d\": rpc error: code = NotFound desc = could not find container \"879fe80b6af958a0e5e4b02ad778e7c37199bb80f9a675dfc1bf8a6eebedcf4d\": container with ID starting with 879fe80b6af958a0e5e4b02ad778e7c37199bb80f9a675dfc1bf8a6eebedcf4d not found: ID does not exist" Dec 01 09:01:18 crc kubenswrapper[5004]: I1201 09:01:18.771984 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97009300-9e05-4604-a960-bb770a9e7edf" path="/var/lib/kubelet/pods/97009300-9e05-4604-a960-bb770a9e7edf/volumes" Dec 01 09:01:30 crc kubenswrapper[5004]: I1201 09:01:30.759520 5004 scope.go:117] "RemoveContainer" containerID="35549cd96a84a4d57cde0021a5b3e7f3cc68411360397308b1e916325429b080" Dec 01 09:01:30 crc kubenswrapper[5004]: E1201 09:01:30.761343 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 09:01:45 crc kubenswrapper[5004]: I1201 09:01:45.759545 5004 scope.go:117] "RemoveContainer" containerID="35549cd96a84a4d57cde0021a5b3e7f3cc68411360397308b1e916325429b080" Dec 01 09:01:47 crc kubenswrapper[5004]: I1201 09:01:47.531819 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" event={"ID":"f9977ebb-82de-4e96-8763-0b5a84f8d4ce","Type":"ContainerStarted","Data":"a78d694ae3feedb53c9bcdd2efb1f355640963d1a6d9fa69392d3b2ab8f0f9a6"} Dec 01 09:03:43 crc kubenswrapper[5004]: I1201 09:03:43.037050 
5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-bm49b"] Dec 01 09:03:43 crc kubenswrapper[5004]: E1201 09:03:43.038037 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d92bc2f4-8519-45f5-bf8c-f825ce955687" containerName="keystone-cron" Dec 01 09:03:43 crc kubenswrapper[5004]: I1201 09:03:43.038052 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="d92bc2f4-8519-45f5-bf8c-f825ce955687" containerName="keystone-cron" Dec 01 09:03:43 crc kubenswrapper[5004]: E1201 09:03:43.038067 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97009300-9e05-4604-a960-bb770a9e7edf" containerName="extract-content" Dec 01 09:03:43 crc kubenswrapper[5004]: I1201 09:03:43.038073 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="97009300-9e05-4604-a960-bb770a9e7edf" containerName="extract-content" Dec 01 09:03:43 crc kubenswrapper[5004]: E1201 09:03:43.038093 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97009300-9e05-4604-a960-bb770a9e7edf" containerName="extract-utilities" Dec 01 09:03:43 crc kubenswrapper[5004]: I1201 09:03:43.038099 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="97009300-9e05-4604-a960-bb770a9e7edf" containerName="extract-utilities" Dec 01 09:03:43 crc kubenswrapper[5004]: E1201 09:03:43.038107 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97009300-9e05-4604-a960-bb770a9e7edf" containerName="registry-server" Dec 01 09:03:43 crc kubenswrapper[5004]: I1201 09:03:43.038112 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="97009300-9e05-4604-a960-bb770a9e7edf" containerName="registry-server" Dec 01 09:03:43 crc kubenswrapper[5004]: I1201 09:03:43.038368 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="d92bc2f4-8519-45f5-bf8c-f825ce955687" containerName="keystone-cron" Dec 01 09:03:43 crc kubenswrapper[5004]: I1201 09:03:43.038385 5004 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="97009300-9e05-4604-a960-bb770a9e7edf" containerName="registry-server" Dec 01 09:03:43 crc kubenswrapper[5004]: I1201 09:03:43.040070 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bm49b" Dec 01 09:03:43 crc kubenswrapper[5004]: I1201 09:03:43.064418 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bm49b"] Dec 01 09:03:43 crc kubenswrapper[5004]: I1201 09:03:43.139689 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/313b4cdd-a226-424f-a049-3d0fe083d307-utilities\") pod \"community-operators-bm49b\" (UID: \"313b4cdd-a226-424f-a049-3d0fe083d307\") " pod="openshift-marketplace/community-operators-bm49b" Dec 01 09:03:43 crc kubenswrapper[5004]: I1201 09:03:43.139747 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/313b4cdd-a226-424f-a049-3d0fe083d307-catalog-content\") pod \"community-operators-bm49b\" (UID: \"313b4cdd-a226-424f-a049-3d0fe083d307\") " pod="openshift-marketplace/community-operators-bm49b" Dec 01 09:03:43 crc kubenswrapper[5004]: I1201 09:03:43.139818 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vp78\" (UniqueName: \"kubernetes.io/projected/313b4cdd-a226-424f-a049-3d0fe083d307-kube-api-access-4vp78\") pod \"community-operators-bm49b\" (UID: \"313b4cdd-a226-424f-a049-3d0fe083d307\") " pod="openshift-marketplace/community-operators-bm49b" Dec 01 09:03:43 crc kubenswrapper[5004]: I1201 09:03:43.242865 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vp78\" (UniqueName: \"kubernetes.io/projected/313b4cdd-a226-424f-a049-3d0fe083d307-kube-api-access-4vp78\") 
pod \"community-operators-bm49b\" (UID: \"313b4cdd-a226-424f-a049-3d0fe083d307\") " pod="openshift-marketplace/community-operators-bm49b" Dec 01 09:03:43 crc kubenswrapper[5004]: I1201 09:03:43.243455 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/313b4cdd-a226-424f-a049-3d0fe083d307-utilities\") pod \"community-operators-bm49b\" (UID: \"313b4cdd-a226-424f-a049-3d0fe083d307\") " pod="openshift-marketplace/community-operators-bm49b" Dec 01 09:03:43 crc kubenswrapper[5004]: I1201 09:03:43.243607 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/313b4cdd-a226-424f-a049-3d0fe083d307-catalog-content\") pod \"community-operators-bm49b\" (UID: \"313b4cdd-a226-424f-a049-3d0fe083d307\") " pod="openshift-marketplace/community-operators-bm49b" Dec 01 09:03:43 crc kubenswrapper[5004]: I1201 09:03:43.243906 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/313b4cdd-a226-424f-a049-3d0fe083d307-utilities\") pod \"community-operators-bm49b\" (UID: \"313b4cdd-a226-424f-a049-3d0fe083d307\") " pod="openshift-marketplace/community-operators-bm49b" Dec 01 09:03:43 crc kubenswrapper[5004]: I1201 09:03:43.243942 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/313b4cdd-a226-424f-a049-3d0fe083d307-catalog-content\") pod \"community-operators-bm49b\" (UID: \"313b4cdd-a226-424f-a049-3d0fe083d307\") " pod="openshift-marketplace/community-operators-bm49b" Dec 01 09:03:43 crc kubenswrapper[5004]: I1201 09:03:43.264582 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vp78\" (UniqueName: \"kubernetes.io/projected/313b4cdd-a226-424f-a049-3d0fe083d307-kube-api-access-4vp78\") pod \"community-operators-bm49b\" (UID: 
\"313b4cdd-a226-424f-a049-3d0fe083d307\") " pod="openshift-marketplace/community-operators-bm49b" Dec 01 09:03:43 crc kubenswrapper[5004]: I1201 09:03:43.406828 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bm49b" Dec 01 09:03:44 crc kubenswrapper[5004]: I1201 09:03:44.041923 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bm49b"] Dec 01 09:03:44 crc kubenswrapper[5004]: I1201 09:03:44.262751 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bm49b" event={"ID":"313b4cdd-a226-424f-a049-3d0fe083d307","Type":"ContainerStarted","Data":"c0870b98086d101762b5d155ad72d1412716152b55ba5d35c1c8da6d731a51ce"} Dec 01 09:03:45 crc kubenswrapper[5004]: I1201 09:03:45.274898 5004 generic.go:334] "Generic (PLEG): container finished" podID="313b4cdd-a226-424f-a049-3d0fe083d307" containerID="0ac43adfb08abb85899b3718c720d6be71c15b4f59dc81291ea2e7fa65192f19" exitCode=0 Dec 01 09:03:45 crc kubenswrapper[5004]: I1201 09:03:45.274965 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bm49b" event={"ID":"313b4cdd-a226-424f-a049-3d0fe083d307","Type":"ContainerDied","Data":"0ac43adfb08abb85899b3718c720d6be71c15b4f59dc81291ea2e7fa65192f19"} Dec 01 09:03:47 crc kubenswrapper[5004]: I1201 09:03:47.299954 5004 generic.go:334] "Generic (PLEG): container finished" podID="313b4cdd-a226-424f-a049-3d0fe083d307" containerID="639f6ce89b9196ff9aa9e0bd3331c6a275c3839e43f912566f310243855f8d90" exitCode=0 Dec 01 09:03:47 crc kubenswrapper[5004]: I1201 09:03:47.300052 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bm49b" event={"ID":"313b4cdd-a226-424f-a049-3d0fe083d307","Type":"ContainerDied","Data":"639f6ce89b9196ff9aa9e0bd3331c6a275c3839e43f912566f310243855f8d90"} Dec 01 09:03:49 crc kubenswrapper[5004]: I1201 
09:03:49.329847 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bm49b" event={"ID":"313b4cdd-a226-424f-a049-3d0fe083d307","Type":"ContainerStarted","Data":"50d5c630cd2d6c904d318014bc8dd47c63710f03cc8e07b40b8be584b96a91f8"} Dec 01 09:03:49 crc kubenswrapper[5004]: I1201 09:03:49.365257 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-bm49b" podStartSLOduration=3.186719212 podStartE2EDuration="6.365234111s" podCreationTimestamp="2025-12-01 09:03:43 +0000 UTC" firstStartedPulling="2025-12-01 09:03:45.278141803 +0000 UTC m=+2802.843133795" lastFinishedPulling="2025-12-01 09:03:48.456656692 +0000 UTC m=+2806.021648694" observedRunningTime="2025-12-01 09:03:49.350319326 +0000 UTC m=+2806.915311378" watchObservedRunningTime="2025-12-01 09:03:49.365234111 +0000 UTC m=+2806.930226103" Dec 01 09:03:53 crc kubenswrapper[5004]: I1201 09:03:53.408446 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-bm49b" Dec 01 09:03:53 crc kubenswrapper[5004]: I1201 09:03:53.409122 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-bm49b" Dec 01 09:03:53 crc kubenswrapper[5004]: I1201 09:03:53.484951 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-bm49b" Dec 01 09:03:54 crc kubenswrapper[5004]: I1201 09:03:54.476626 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-bm49b" Dec 01 09:03:54 crc kubenswrapper[5004]: I1201 09:03:54.559813 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bm49b"] Dec 01 09:03:56 crc kubenswrapper[5004]: I1201 09:03:56.458964 5004 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/community-operators-bm49b" podUID="313b4cdd-a226-424f-a049-3d0fe083d307" containerName="registry-server" containerID="cri-o://50d5c630cd2d6c904d318014bc8dd47c63710f03cc8e07b40b8be584b96a91f8" gracePeriod=2 Dec 01 09:03:56 crc kubenswrapper[5004]: I1201 09:03:56.979223 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bm49b" Dec 01 09:03:57 crc kubenswrapper[5004]: I1201 09:03:57.097834 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4vp78\" (UniqueName: \"kubernetes.io/projected/313b4cdd-a226-424f-a049-3d0fe083d307-kube-api-access-4vp78\") pod \"313b4cdd-a226-424f-a049-3d0fe083d307\" (UID: \"313b4cdd-a226-424f-a049-3d0fe083d307\") " Dec 01 09:03:57 crc kubenswrapper[5004]: I1201 09:03:57.097886 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/313b4cdd-a226-424f-a049-3d0fe083d307-utilities\") pod \"313b4cdd-a226-424f-a049-3d0fe083d307\" (UID: \"313b4cdd-a226-424f-a049-3d0fe083d307\") " Dec 01 09:03:57 crc kubenswrapper[5004]: I1201 09:03:57.097962 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/313b4cdd-a226-424f-a049-3d0fe083d307-catalog-content\") pod \"313b4cdd-a226-424f-a049-3d0fe083d307\" (UID: \"313b4cdd-a226-424f-a049-3d0fe083d307\") " Dec 01 09:03:57 crc kubenswrapper[5004]: I1201 09:03:57.099259 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/313b4cdd-a226-424f-a049-3d0fe083d307-utilities" (OuterVolumeSpecName: "utilities") pod "313b4cdd-a226-424f-a049-3d0fe083d307" (UID: "313b4cdd-a226-424f-a049-3d0fe083d307"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:03:57 crc kubenswrapper[5004]: I1201 09:03:57.107534 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/313b4cdd-a226-424f-a049-3d0fe083d307-kube-api-access-4vp78" (OuterVolumeSpecName: "kube-api-access-4vp78") pod "313b4cdd-a226-424f-a049-3d0fe083d307" (UID: "313b4cdd-a226-424f-a049-3d0fe083d307"). InnerVolumeSpecName "kube-api-access-4vp78". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:03:57 crc kubenswrapper[5004]: I1201 09:03:57.160863 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/313b4cdd-a226-424f-a049-3d0fe083d307-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "313b4cdd-a226-424f-a049-3d0fe083d307" (UID: "313b4cdd-a226-424f-a049-3d0fe083d307"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:03:57 crc kubenswrapper[5004]: I1201 09:03:57.204652 5004 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/313b4cdd-a226-424f-a049-3d0fe083d307-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 09:03:57 crc kubenswrapper[5004]: I1201 09:03:57.204704 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4vp78\" (UniqueName: \"kubernetes.io/projected/313b4cdd-a226-424f-a049-3d0fe083d307-kube-api-access-4vp78\") on node \"crc\" DevicePath \"\"" Dec 01 09:03:57 crc kubenswrapper[5004]: I1201 09:03:57.204725 5004 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/313b4cdd-a226-424f-a049-3d0fe083d307-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 09:03:57 crc kubenswrapper[5004]: I1201 09:03:57.487990 5004 generic.go:334] "Generic (PLEG): container finished" podID="313b4cdd-a226-424f-a049-3d0fe083d307" 
containerID="50d5c630cd2d6c904d318014bc8dd47c63710f03cc8e07b40b8be584b96a91f8" exitCode=0 Dec 01 09:03:57 crc kubenswrapper[5004]: I1201 09:03:57.488208 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bm49b" event={"ID":"313b4cdd-a226-424f-a049-3d0fe083d307","Type":"ContainerDied","Data":"50d5c630cd2d6c904d318014bc8dd47c63710f03cc8e07b40b8be584b96a91f8"} Dec 01 09:03:57 crc kubenswrapper[5004]: I1201 09:03:57.489221 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bm49b" event={"ID":"313b4cdd-a226-424f-a049-3d0fe083d307","Type":"ContainerDied","Data":"c0870b98086d101762b5d155ad72d1412716152b55ba5d35c1c8da6d731a51ce"} Dec 01 09:03:57 crc kubenswrapper[5004]: I1201 09:03:57.488357 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bm49b" Dec 01 09:03:57 crc kubenswrapper[5004]: I1201 09:03:57.489389 5004 scope.go:117] "RemoveContainer" containerID="50d5c630cd2d6c904d318014bc8dd47c63710f03cc8e07b40b8be584b96a91f8" Dec 01 09:03:57 crc kubenswrapper[5004]: I1201 09:03:57.535236 5004 scope.go:117] "RemoveContainer" containerID="639f6ce89b9196ff9aa9e0bd3331c6a275c3839e43f912566f310243855f8d90" Dec 01 09:03:57 crc kubenswrapper[5004]: I1201 09:03:57.541129 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bm49b"] Dec 01 09:03:57 crc kubenswrapper[5004]: I1201 09:03:57.566370 5004 scope.go:117] "RemoveContainer" containerID="0ac43adfb08abb85899b3718c720d6be71c15b4f59dc81291ea2e7fa65192f19" Dec 01 09:03:57 crc kubenswrapper[5004]: I1201 09:03:57.570617 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-bm49b"] Dec 01 09:03:57 crc kubenswrapper[5004]: I1201 09:03:57.646545 5004 scope.go:117] "RemoveContainer" containerID="50d5c630cd2d6c904d318014bc8dd47c63710f03cc8e07b40b8be584b96a91f8" Dec 01 
09:03:57 crc kubenswrapper[5004]: E1201 09:03:57.648049 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50d5c630cd2d6c904d318014bc8dd47c63710f03cc8e07b40b8be584b96a91f8\": container with ID starting with 50d5c630cd2d6c904d318014bc8dd47c63710f03cc8e07b40b8be584b96a91f8 not found: ID does not exist" containerID="50d5c630cd2d6c904d318014bc8dd47c63710f03cc8e07b40b8be584b96a91f8" Dec 01 09:03:57 crc kubenswrapper[5004]: I1201 09:03:57.648152 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50d5c630cd2d6c904d318014bc8dd47c63710f03cc8e07b40b8be584b96a91f8"} err="failed to get container status \"50d5c630cd2d6c904d318014bc8dd47c63710f03cc8e07b40b8be584b96a91f8\": rpc error: code = NotFound desc = could not find container \"50d5c630cd2d6c904d318014bc8dd47c63710f03cc8e07b40b8be584b96a91f8\": container with ID starting with 50d5c630cd2d6c904d318014bc8dd47c63710f03cc8e07b40b8be584b96a91f8 not found: ID does not exist" Dec 01 09:03:57 crc kubenswrapper[5004]: I1201 09:03:57.648224 5004 scope.go:117] "RemoveContainer" containerID="639f6ce89b9196ff9aa9e0bd3331c6a275c3839e43f912566f310243855f8d90" Dec 01 09:03:57 crc kubenswrapper[5004]: E1201 09:03:57.649172 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"639f6ce89b9196ff9aa9e0bd3331c6a275c3839e43f912566f310243855f8d90\": container with ID starting with 639f6ce89b9196ff9aa9e0bd3331c6a275c3839e43f912566f310243855f8d90 not found: ID does not exist" containerID="639f6ce89b9196ff9aa9e0bd3331c6a275c3839e43f912566f310243855f8d90" Dec 01 09:03:57 crc kubenswrapper[5004]: I1201 09:03:57.649197 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"639f6ce89b9196ff9aa9e0bd3331c6a275c3839e43f912566f310243855f8d90"} err="failed to get container status 
\"639f6ce89b9196ff9aa9e0bd3331c6a275c3839e43f912566f310243855f8d90\": rpc error: code = NotFound desc = could not find container \"639f6ce89b9196ff9aa9e0bd3331c6a275c3839e43f912566f310243855f8d90\": container with ID starting with 639f6ce89b9196ff9aa9e0bd3331c6a275c3839e43f912566f310243855f8d90 not found: ID does not exist" Dec 01 09:03:57 crc kubenswrapper[5004]: I1201 09:03:57.649238 5004 scope.go:117] "RemoveContainer" containerID="0ac43adfb08abb85899b3718c720d6be71c15b4f59dc81291ea2e7fa65192f19" Dec 01 09:03:57 crc kubenswrapper[5004]: E1201 09:03:57.649640 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ac43adfb08abb85899b3718c720d6be71c15b4f59dc81291ea2e7fa65192f19\": container with ID starting with 0ac43adfb08abb85899b3718c720d6be71c15b4f59dc81291ea2e7fa65192f19 not found: ID does not exist" containerID="0ac43adfb08abb85899b3718c720d6be71c15b4f59dc81291ea2e7fa65192f19" Dec 01 09:03:57 crc kubenswrapper[5004]: I1201 09:03:57.649687 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ac43adfb08abb85899b3718c720d6be71c15b4f59dc81291ea2e7fa65192f19"} err="failed to get container status \"0ac43adfb08abb85899b3718c720d6be71c15b4f59dc81291ea2e7fa65192f19\": rpc error: code = NotFound desc = could not find container \"0ac43adfb08abb85899b3718c720d6be71c15b4f59dc81291ea2e7fa65192f19\": container with ID starting with 0ac43adfb08abb85899b3718c720d6be71c15b4f59dc81291ea2e7fa65192f19 not found: ID does not exist" Dec 01 09:03:58 crc kubenswrapper[5004]: I1201 09:03:58.777478 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="313b4cdd-a226-424f-a049-3d0fe083d307" path="/var/lib/kubelet/pods/313b4cdd-a226-424f-a049-3d0fe083d307/volumes" Dec 01 09:04:08 crc kubenswrapper[5004]: I1201 09:04:08.729834 5004 patch_prober.go:28] interesting pod/machine-config-daemon-fvdgt container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 09:04:08 crc kubenswrapper[5004]: I1201 09:04:08.730861 5004 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 09:04:12 crc kubenswrapper[5004]: I1201 09:04:12.679040 5004 generic.go:334] "Generic (PLEG): container finished" podID="ebe1a3e0-aa10-4437-adea-bcb08ba1e7ce" containerID="ea83bba39161f7a46776209069767b27a5c40515473306f8edcce012c77ec1f1" exitCode=0 Dec 01 09:04:12 crc kubenswrapper[5004]: I1201 09:04:12.679110 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2p8kr" event={"ID":"ebe1a3e0-aa10-4437-adea-bcb08ba1e7ce","Type":"ContainerDied","Data":"ea83bba39161f7a46776209069767b27a5c40515473306f8edcce012c77ec1f1"} Dec 01 09:04:14 crc kubenswrapper[5004]: I1201 09:04:14.273189 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2p8kr" Dec 01 09:04:14 crc kubenswrapper[5004]: I1201 09:04:14.375646 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dzgxj\" (UniqueName: \"kubernetes.io/projected/ebe1a3e0-aa10-4437-adea-bcb08ba1e7ce-kube-api-access-dzgxj\") pod \"ebe1a3e0-aa10-4437-adea-bcb08ba1e7ce\" (UID: \"ebe1a3e0-aa10-4437-adea-bcb08ba1e7ce\") " Dec 01 09:04:14 crc kubenswrapper[5004]: I1201 09:04:14.375737 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ebe1a3e0-aa10-4437-adea-bcb08ba1e7ce-inventory\") pod \"ebe1a3e0-aa10-4437-adea-bcb08ba1e7ce\" (UID: \"ebe1a3e0-aa10-4437-adea-bcb08ba1e7ce\") " Dec 01 09:04:14 crc kubenswrapper[5004]: I1201 09:04:14.375766 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/ebe1a3e0-aa10-4437-adea-bcb08ba1e7ce-libvirt-secret-0\") pod \"ebe1a3e0-aa10-4437-adea-bcb08ba1e7ce\" (UID: \"ebe1a3e0-aa10-4437-adea-bcb08ba1e7ce\") " Dec 01 09:04:14 crc kubenswrapper[5004]: I1201 09:04:14.375835 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ebe1a3e0-aa10-4437-adea-bcb08ba1e7ce-ssh-key\") pod \"ebe1a3e0-aa10-4437-adea-bcb08ba1e7ce\" (UID: \"ebe1a3e0-aa10-4437-adea-bcb08ba1e7ce\") " Dec 01 09:04:14 crc kubenswrapper[5004]: I1201 09:04:14.376037 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebe1a3e0-aa10-4437-adea-bcb08ba1e7ce-libvirt-combined-ca-bundle\") pod \"ebe1a3e0-aa10-4437-adea-bcb08ba1e7ce\" (UID: \"ebe1a3e0-aa10-4437-adea-bcb08ba1e7ce\") " Dec 01 09:04:14 crc kubenswrapper[5004]: I1201 09:04:14.382283 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/secret/ebe1a3e0-aa10-4437-adea-bcb08ba1e7ce-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "ebe1a3e0-aa10-4437-adea-bcb08ba1e7ce" (UID: "ebe1a3e0-aa10-4437-adea-bcb08ba1e7ce"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:04:14 crc kubenswrapper[5004]: I1201 09:04:14.383216 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ebe1a3e0-aa10-4437-adea-bcb08ba1e7ce-kube-api-access-dzgxj" (OuterVolumeSpecName: "kube-api-access-dzgxj") pod "ebe1a3e0-aa10-4437-adea-bcb08ba1e7ce" (UID: "ebe1a3e0-aa10-4437-adea-bcb08ba1e7ce"). InnerVolumeSpecName "kube-api-access-dzgxj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:04:14 crc kubenswrapper[5004]: I1201 09:04:14.409636 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebe1a3e0-aa10-4437-adea-bcb08ba1e7ce-inventory" (OuterVolumeSpecName: "inventory") pod "ebe1a3e0-aa10-4437-adea-bcb08ba1e7ce" (UID: "ebe1a3e0-aa10-4437-adea-bcb08ba1e7ce"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:04:14 crc kubenswrapper[5004]: I1201 09:04:14.430726 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebe1a3e0-aa10-4437-adea-bcb08ba1e7ce-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "ebe1a3e0-aa10-4437-adea-bcb08ba1e7ce" (UID: "ebe1a3e0-aa10-4437-adea-bcb08ba1e7ce"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:04:14 crc kubenswrapper[5004]: I1201 09:04:14.438370 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebe1a3e0-aa10-4437-adea-bcb08ba1e7ce-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "ebe1a3e0-aa10-4437-adea-bcb08ba1e7ce" (UID: "ebe1a3e0-aa10-4437-adea-bcb08ba1e7ce"). 
InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:04:14 crc kubenswrapper[5004]: I1201 09:04:14.478293 5004 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebe1a3e0-aa10-4437-adea-bcb08ba1e7ce-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 09:04:14 crc kubenswrapper[5004]: I1201 09:04:14.478533 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dzgxj\" (UniqueName: \"kubernetes.io/projected/ebe1a3e0-aa10-4437-adea-bcb08ba1e7ce-kube-api-access-dzgxj\") on node \"crc\" DevicePath \"\"" Dec 01 09:04:14 crc kubenswrapper[5004]: I1201 09:04:14.478650 5004 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ebe1a3e0-aa10-4437-adea-bcb08ba1e7ce-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 09:04:14 crc kubenswrapper[5004]: I1201 09:04:14.478726 5004 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/ebe1a3e0-aa10-4437-adea-bcb08ba1e7ce-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Dec 01 09:04:14 crc kubenswrapper[5004]: I1201 09:04:14.478803 5004 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ebe1a3e0-aa10-4437-adea-bcb08ba1e7ce-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 09:04:14 crc kubenswrapper[5004]: I1201 09:04:14.705302 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2p8kr" event={"ID":"ebe1a3e0-aa10-4437-adea-bcb08ba1e7ce","Type":"ContainerDied","Data":"9244370e4c0cfce998ee6da9f233d96b3bfe912c6daf918d13d3b109d71e28c6"} Dec 01 09:04:14 crc kubenswrapper[5004]: I1201 09:04:14.705371 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9244370e4c0cfce998ee6da9f233d96b3bfe912c6daf918d13d3b109d71e28c6" Dec 01 
09:04:14 crc kubenswrapper[5004]: I1201 09:04:14.705417 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2p8kr" Dec 01 09:04:14 crc kubenswrapper[5004]: I1201 09:04:14.871439 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-65pft"] Dec 01 09:04:14 crc kubenswrapper[5004]: E1201 09:04:14.872112 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="313b4cdd-a226-424f-a049-3d0fe083d307" containerName="registry-server" Dec 01 09:04:14 crc kubenswrapper[5004]: I1201 09:04:14.872130 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="313b4cdd-a226-424f-a049-3d0fe083d307" containerName="registry-server" Dec 01 09:04:14 crc kubenswrapper[5004]: E1201 09:04:14.872202 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="313b4cdd-a226-424f-a049-3d0fe083d307" containerName="extract-content" Dec 01 09:04:14 crc kubenswrapper[5004]: I1201 09:04:14.872210 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="313b4cdd-a226-424f-a049-3d0fe083d307" containerName="extract-content" Dec 01 09:04:14 crc kubenswrapper[5004]: E1201 09:04:14.872227 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebe1a3e0-aa10-4437-adea-bcb08ba1e7ce" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 01 09:04:14 crc kubenswrapper[5004]: I1201 09:04:14.872235 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebe1a3e0-aa10-4437-adea-bcb08ba1e7ce" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 01 09:04:14 crc kubenswrapper[5004]: E1201 09:04:14.872253 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="313b4cdd-a226-424f-a049-3d0fe083d307" containerName="extract-utilities" Dec 01 09:04:14 crc kubenswrapper[5004]: I1201 09:04:14.872260 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="313b4cdd-a226-424f-a049-3d0fe083d307" 
containerName="extract-utilities" Dec 01 09:04:14 crc kubenswrapper[5004]: I1201 09:04:14.872504 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="313b4cdd-a226-424f-a049-3d0fe083d307" containerName="registry-server" Dec 01 09:04:14 crc kubenswrapper[5004]: I1201 09:04:14.872535 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebe1a3e0-aa10-4437-adea-bcb08ba1e7ce" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 01 09:04:14 crc kubenswrapper[5004]: I1201 09:04:14.873599 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-65pft" Dec 01 09:04:14 crc kubenswrapper[5004]: I1201 09:04:14.875612 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Dec 01 09:04:14 crc kubenswrapper[5004]: I1201 09:04:14.876509 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 09:04:14 crc kubenswrapper[5004]: I1201 09:04:14.877200 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-pdnrq" Dec 01 09:04:14 crc kubenswrapper[5004]: I1201 09:04:14.877273 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 09:04:14 crc kubenswrapper[5004]: I1201 09:04:14.877774 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 09:04:14 crc kubenswrapper[5004]: I1201 09:04:14.877793 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Dec 01 09:04:14 crc kubenswrapper[5004]: I1201 09:04:14.878045 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Dec 01 09:04:14 crc kubenswrapper[5004]: I1201 09:04:14.883946 5004 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-65pft"] Dec 01 09:04:14 crc kubenswrapper[5004]: I1201 09:04:14.907312 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/5b434b81-6f6d-48f1-b569-4a31ef7abbec-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-65pft\" (UID: \"5b434b81-6f6d-48f1-b569-4a31ef7abbec\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-65pft" Dec 01 09:04:14 crc kubenswrapper[5004]: I1201 09:04:14.907494 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5b434b81-6f6d-48f1-b569-4a31ef7abbec-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-65pft\" (UID: \"5b434b81-6f6d-48f1-b569-4a31ef7abbec\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-65pft" Dec 01 09:04:14 crc kubenswrapper[5004]: I1201 09:04:14.907577 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/5b434b81-6f6d-48f1-b569-4a31ef7abbec-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-65pft\" (UID: \"5b434b81-6f6d-48f1-b569-4a31ef7abbec\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-65pft" Dec 01 09:04:14 crc kubenswrapper[5004]: I1201 09:04:14.907606 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b434b81-6f6d-48f1-b569-4a31ef7abbec-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-65pft\" (UID: \"5b434b81-6f6d-48f1-b569-4a31ef7abbec\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-65pft" Dec 01 09:04:14 crc kubenswrapper[5004]: I1201 09:04:14.908139 5004 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5b434b81-6f6d-48f1-b569-4a31ef7abbec-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-65pft\" (UID: \"5b434b81-6f6d-48f1-b569-4a31ef7abbec\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-65pft" Dec 01 09:04:14 crc kubenswrapper[5004]: I1201 09:04:14.908220 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/5b434b81-6f6d-48f1-b569-4a31ef7abbec-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-65pft\" (UID: \"5b434b81-6f6d-48f1-b569-4a31ef7abbec\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-65pft" Dec 01 09:04:14 crc kubenswrapper[5004]: I1201 09:04:14.908286 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xccnc\" (UniqueName: \"kubernetes.io/projected/5b434b81-6f6d-48f1-b569-4a31ef7abbec-kube-api-access-xccnc\") pod \"nova-edpm-deployment-openstack-edpm-ipam-65pft\" (UID: \"5b434b81-6f6d-48f1-b569-4a31ef7abbec\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-65pft" Dec 01 09:04:14 crc kubenswrapper[5004]: I1201 09:04:14.908394 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/5b434b81-6f6d-48f1-b569-4a31ef7abbec-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-65pft\" (UID: \"5b434b81-6f6d-48f1-b569-4a31ef7abbec\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-65pft" Dec 01 09:04:14 crc kubenswrapper[5004]: I1201 09:04:14.908488 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/5b434b81-6f6d-48f1-b569-4a31ef7abbec-nova-migration-ssh-key-0\") 
pod \"nova-edpm-deployment-openstack-edpm-ipam-65pft\" (UID: \"5b434b81-6f6d-48f1-b569-4a31ef7abbec\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-65pft" Dec 01 09:04:15 crc kubenswrapper[5004]: I1201 09:04:15.010855 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5b434b81-6f6d-48f1-b569-4a31ef7abbec-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-65pft\" (UID: \"5b434b81-6f6d-48f1-b569-4a31ef7abbec\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-65pft" Dec 01 09:04:15 crc kubenswrapper[5004]: I1201 09:04:15.010973 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/5b434b81-6f6d-48f1-b569-4a31ef7abbec-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-65pft\" (UID: \"5b434b81-6f6d-48f1-b569-4a31ef7abbec\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-65pft" Dec 01 09:04:15 crc kubenswrapper[5004]: I1201 09:04:15.011064 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xccnc\" (UniqueName: \"kubernetes.io/projected/5b434b81-6f6d-48f1-b569-4a31ef7abbec-kube-api-access-xccnc\") pod \"nova-edpm-deployment-openstack-edpm-ipam-65pft\" (UID: \"5b434b81-6f6d-48f1-b569-4a31ef7abbec\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-65pft" Dec 01 09:04:15 crc kubenswrapper[5004]: I1201 09:04:15.011202 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/5b434b81-6f6d-48f1-b569-4a31ef7abbec-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-65pft\" (UID: \"5b434b81-6f6d-48f1-b569-4a31ef7abbec\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-65pft" Dec 01 09:04:15 crc kubenswrapper[5004]: I1201 09:04:15.011325 5004 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/5b434b81-6f6d-48f1-b569-4a31ef7abbec-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-65pft\" (UID: \"5b434b81-6f6d-48f1-b569-4a31ef7abbec\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-65pft" Dec 01 09:04:15 crc kubenswrapper[5004]: I1201 09:04:15.011415 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/5b434b81-6f6d-48f1-b569-4a31ef7abbec-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-65pft\" (UID: \"5b434b81-6f6d-48f1-b569-4a31ef7abbec\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-65pft" Dec 01 09:04:15 crc kubenswrapper[5004]: I1201 09:04:15.011544 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5b434b81-6f6d-48f1-b569-4a31ef7abbec-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-65pft\" (UID: \"5b434b81-6f6d-48f1-b569-4a31ef7abbec\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-65pft" Dec 01 09:04:15 crc kubenswrapper[5004]: I1201 09:04:15.011679 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/5b434b81-6f6d-48f1-b569-4a31ef7abbec-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-65pft\" (UID: \"5b434b81-6f6d-48f1-b569-4a31ef7abbec\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-65pft" Dec 01 09:04:15 crc kubenswrapper[5004]: I1201 09:04:15.011733 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b434b81-6f6d-48f1-b569-4a31ef7abbec-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-65pft\" (UID: 
\"5b434b81-6f6d-48f1-b569-4a31ef7abbec\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-65pft" Dec 01 09:04:15 crc kubenswrapper[5004]: I1201 09:04:15.012758 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/5b434b81-6f6d-48f1-b569-4a31ef7abbec-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-65pft\" (UID: \"5b434b81-6f6d-48f1-b569-4a31ef7abbec\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-65pft" Dec 01 09:04:15 crc kubenswrapper[5004]: I1201 09:04:15.015310 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5b434b81-6f6d-48f1-b569-4a31ef7abbec-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-65pft\" (UID: \"5b434b81-6f6d-48f1-b569-4a31ef7abbec\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-65pft" Dec 01 09:04:15 crc kubenswrapper[5004]: I1201 09:04:15.022670 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/5b434b81-6f6d-48f1-b569-4a31ef7abbec-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-65pft\" (UID: \"5b434b81-6f6d-48f1-b569-4a31ef7abbec\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-65pft" Dec 01 09:04:15 crc kubenswrapper[5004]: I1201 09:04:15.022719 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/5b434b81-6f6d-48f1-b569-4a31ef7abbec-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-65pft\" (UID: \"5b434b81-6f6d-48f1-b569-4a31ef7abbec\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-65pft" Dec 01 09:04:15 crc kubenswrapper[5004]: I1201 09:04:15.023304 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: 
\"kubernetes.io/secret/5b434b81-6f6d-48f1-b569-4a31ef7abbec-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-65pft\" (UID: \"5b434b81-6f6d-48f1-b569-4a31ef7abbec\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-65pft" Dec 01 09:04:15 crc kubenswrapper[5004]: I1201 09:04:15.023623 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5b434b81-6f6d-48f1-b569-4a31ef7abbec-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-65pft\" (UID: \"5b434b81-6f6d-48f1-b569-4a31ef7abbec\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-65pft" Dec 01 09:04:15 crc kubenswrapper[5004]: I1201 09:04:15.023972 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/5b434b81-6f6d-48f1-b569-4a31ef7abbec-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-65pft\" (UID: \"5b434b81-6f6d-48f1-b569-4a31ef7abbec\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-65pft" Dec 01 09:04:15 crc kubenswrapper[5004]: I1201 09:04:15.036166 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xccnc\" (UniqueName: \"kubernetes.io/projected/5b434b81-6f6d-48f1-b569-4a31ef7abbec-kube-api-access-xccnc\") pod \"nova-edpm-deployment-openstack-edpm-ipam-65pft\" (UID: \"5b434b81-6f6d-48f1-b569-4a31ef7abbec\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-65pft" Dec 01 09:04:15 crc kubenswrapper[5004]: I1201 09:04:15.042405 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b434b81-6f6d-48f1-b569-4a31ef7abbec-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-65pft\" (UID: \"5b434b81-6f6d-48f1-b569-4a31ef7abbec\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-65pft" Dec 01 09:04:15 crc 
kubenswrapper[5004]: I1201 09:04:15.200124 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-65pft" Dec 01 09:04:15 crc kubenswrapper[5004]: I1201 09:04:15.787117 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-65pft"] Dec 01 09:04:16 crc kubenswrapper[5004]: I1201 09:04:16.731415 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-65pft" event={"ID":"5b434b81-6f6d-48f1-b569-4a31ef7abbec","Type":"ContainerStarted","Data":"cf58f62c0921bcf7a70b41da84f0c483574e57eaefbb803ab804dbc45588360f"} Dec 01 09:04:17 crc kubenswrapper[5004]: I1201 09:04:17.752340 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-65pft" event={"ID":"5b434b81-6f6d-48f1-b569-4a31ef7abbec","Type":"ContainerStarted","Data":"53fd80a45b685d98fccebed04ddce39ac1a0472e21ddad02bf2f78cd204e6d21"} Dec 01 09:04:17 crc kubenswrapper[5004]: I1201 09:04:17.774745 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-65pft" podStartSLOduration=2.947738357 podStartE2EDuration="3.774720521s" podCreationTimestamp="2025-12-01 09:04:14 +0000 UTC" firstStartedPulling="2025-12-01 09:04:15.811961252 +0000 UTC m=+2833.376953234" lastFinishedPulling="2025-12-01 09:04:16.638943416 +0000 UTC m=+2834.203935398" observedRunningTime="2025-12-01 09:04:17.770112689 +0000 UTC m=+2835.335104691" watchObservedRunningTime="2025-12-01 09:04:17.774720521 +0000 UTC m=+2835.339712503" Dec 01 09:04:38 crc kubenswrapper[5004]: I1201 09:04:38.729530 5004 patch_prober.go:28] interesting pod/machine-config-daemon-fvdgt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Dec 01 09:04:38 crc kubenswrapper[5004]: I1201 09:04:38.730329 5004 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 09:05:08 crc kubenswrapper[5004]: I1201 09:05:08.729644 5004 patch_prober.go:28] interesting pod/machine-config-daemon-fvdgt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 09:05:08 crc kubenswrapper[5004]: I1201 09:05:08.730214 5004 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 09:05:08 crc kubenswrapper[5004]: I1201 09:05:08.730265 5004 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" Dec 01 09:05:08 crc kubenswrapper[5004]: I1201 09:05:08.731292 5004 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a78d694ae3feedb53c9bcdd2efb1f355640963d1a6d9fa69392d3b2ab8f0f9a6"} pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 09:05:08 crc kubenswrapper[5004]: I1201 09:05:08.731339 5004 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" containerName="machine-config-daemon" containerID="cri-o://a78d694ae3feedb53c9bcdd2efb1f355640963d1a6d9fa69392d3b2ab8f0f9a6" gracePeriod=600 Dec 01 09:05:09 crc kubenswrapper[5004]: I1201 09:05:09.404854 5004 generic.go:334] "Generic (PLEG): container finished" podID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" containerID="a78d694ae3feedb53c9bcdd2efb1f355640963d1a6d9fa69392d3b2ab8f0f9a6" exitCode=0 Dec 01 09:05:09 crc kubenswrapper[5004]: I1201 09:05:09.404916 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" event={"ID":"f9977ebb-82de-4e96-8763-0b5a84f8d4ce","Type":"ContainerDied","Data":"a78d694ae3feedb53c9bcdd2efb1f355640963d1a6d9fa69392d3b2ab8f0f9a6"} Dec 01 09:05:09 crc kubenswrapper[5004]: I1201 09:05:09.404967 5004 scope.go:117] "RemoveContainer" containerID="35549cd96a84a4d57cde0021a5b3e7f3cc68411360397308b1e916325429b080" Dec 01 09:05:10 crc kubenswrapper[5004]: I1201 09:05:10.419653 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" event={"ID":"f9977ebb-82de-4e96-8763-0b5a84f8d4ce","Type":"ContainerStarted","Data":"959c4dd48a6a01a111184092f5278271df46b0993f7d823dbccdaf0fe68ac515"} Dec 01 09:07:10 crc kubenswrapper[5004]: I1201 09:07:10.204763 5004 generic.go:334] "Generic (PLEG): container finished" podID="5b434b81-6f6d-48f1-b569-4a31ef7abbec" containerID="53fd80a45b685d98fccebed04ddce39ac1a0472e21ddad02bf2f78cd204e6d21" exitCode=0 Dec 01 09:07:10 crc kubenswrapper[5004]: I1201 09:07:10.205317 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-65pft" event={"ID":"5b434b81-6f6d-48f1-b569-4a31ef7abbec","Type":"ContainerDied","Data":"53fd80a45b685d98fccebed04ddce39ac1a0472e21ddad02bf2f78cd204e6d21"} Dec 01 09:07:11 crc kubenswrapper[5004]: I1201 
09:07:11.687933 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-65pft" Dec 01 09:07:11 crc kubenswrapper[5004]: I1201 09:07:11.832887 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xccnc\" (UniqueName: \"kubernetes.io/projected/5b434b81-6f6d-48f1-b569-4a31ef7abbec-kube-api-access-xccnc\") pod \"5b434b81-6f6d-48f1-b569-4a31ef7abbec\" (UID: \"5b434b81-6f6d-48f1-b569-4a31ef7abbec\") " Dec 01 09:07:11 crc kubenswrapper[5004]: I1201 09:07:11.833017 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5b434b81-6f6d-48f1-b569-4a31ef7abbec-ssh-key\") pod \"5b434b81-6f6d-48f1-b569-4a31ef7abbec\" (UID: \"5b434b81-6f6d-48f1-b569-4a31ef7abbec\") " Dec 01 09:07:11 crc kubenswrapper[5004]: I1201 09:07:11.833039 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/5b434b81-6f6d-48f1-b569-4a31ef7abbec-nova-cell1-compute-config-0\") pod \"5b434b81-6f6d-48f1-b569-4a31ef7abbec\" (UID: \"5b434b81-6f6d-48f1-b569-4a31ef7abbec\") " Dec 01 09:07:11 crc kubenswrapper[5004]: I1201 09:07:11.833142 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/5b434b81-6f6d-48f1-b569-4a31ef7abbec-nova-migration-ssh-key-1\") pod \"5b434b81-6f6d-48f1-b569-4a31ef7abbec\" (UID: \"5b434b81-6f6d-48f1-b569-4a31ef7abbec\") " Dec 01 09:07:11 crc kubenswrapper[5004]: I1201 09:07:11.833164 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/5b434b81-6f6d-48f1-b569-4a31ef7abbec-nova-cell1-compute-config-1\") pod \"5b434b81-6f6d-48f1-b569-4a31ef7abbec\" (UID: \"5b434b81-6f6d-48f1-b569-4a31ef7abbec\") " Dec 01 
09:07:11 crc kubenswrapper[5004]: I1201 09:07:11.833183 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b434b81-6f6d-48f1-b569-4a31ef7abbec-nova-combined-ca-bundle\") pod \"5b434b81-6f6d-48f1-b569-4a31ef7abbec\" (UID: \"5b434b81-6f6d-48f1-b569-4a31ef7abbec\") " Dec 01 09:07:11 crc kubenswrapper[5004]: I1201 09:07:11.833215 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/5b434b81-6f6d-48f1-b569-4a31ef7abbec-nova-extra-config-0\") pod \"5b434b81-6f6d-48f1-b569-4a31ef7abbec\" (UID: \"5b434b81-6f6d-48f1-b569-4a31ef7abbec\") " Dec 01 09:07:11 crc kubenswrapper[5004]: I1201 09:07:11.833299 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/5b434b81-6f6d-48f1-b569-4a31ef7abbec-nova-migration-ssh-key-0\") pod \"5b434b81-6f6d-48f1-b569-4a31ef7abbec\" (UID: \"5b434b81-6f6d-48f1-b569-4a31ef7abbec\") " Dec 01 09:07:11 crc kubenswrapper[5004]: I1201 09:07:11.833344 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5b434b81-6f6d-48f1-b569-4a31ef7abbec-inventory\") pod \"5b434b81-6f6d-48f1-b569-4a31ef7abbec\" (UID: \"5b434b81-6f6d-48f1-b569-4a31ef7abbec\") " Dec 01 09:07:11 crc kubenswrapper[5004]: I1201 09:07:11.841765 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b434b81-6f6d-48f1-b569-4a31ef7abbec-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "5b434b81-6f6d-48f1-b569-4a31ef7abbec" (UID: "5b434b81-6f6d-48f1-b569-4a31ef7abbec"). InnerVolumeSpecName "nova-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:07:11 crc kubenswrapper[5004]: I1201 09:07:11.845043 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b434b81-6f6d-48f1-b569-4a31ef7abbec-kube-api-access-xccnc" (OuterVolumeSpecName: "kube-api-access-xccnc") pod "5b434b81-6f6d-48f1-b569-4a31ef7abbec" (UID: "5b434b81-6f6d-48f1-b569-4a31ef7abbec"). InnerVolumeSpecName "kube-api-access-xccnc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:07:11 crc kubenswrapper[5004]: I1201 09:07:11.869808 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b434b81-6f6d-48f1-b569-4a31ef7abbec-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "5b434b81-6f6d-48f1-b569-4a31ef7abbec" (UID: "5b434b81-6f6d-48f1-b569-4a31ef7abbec"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:07:11 crc kubenswrapper[5004]: I1201 09:07:11.873190 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b434b81-6f6d-48f1-b569-4a31ef7abbec-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "5b434b81-6f6d-48f1-b569-4a31ef7abbec" (UID: "5b434b81-6f6d-48f1-b569-4a31ef7abbec"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:07:11 crc kubenswrapper[5004]: I1201 09:07:11.878189 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b434b81-6f6d-48f1-b569-4a31ef7abbec-inventory" (OuterVolumeSpecName: "inventory") pod "5b434b81-6f6d-48f1-b569-4a31ef7abbec" (UID: "5b434b81-6f6d-48f1-b569-4a31ef7abbec"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:07:11 crc kubenswrapper[5004]: I1201 09:07:11.882954 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b434b81-6f6d-48f1-b569-4a31ef7abbec-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "5b434b81-6f6d-48f1-b569-4a31ef7abbec" (UID: "5b434b81-6f6d-48f1-b569-4a31ef7abbec"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:07:11 crc kubenswrapper[5004]: I1201 09:07:11.887857 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b434b81-6f6d-48f1-b569-4a31ef7abbec-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "5b434b81-6f6d-48f1-b569-4a31ef7abbec" (UID: "5b434b81-6f6d-48f1-b569-4a31ef7abbec"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:07:11 crc kubenswrapper[5004]: I1201 09:07:11.888815 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b434b81-6f6d-48f1-b569-4a31ef7abbec-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "5b434b81-6f6d-48f1-b569-4a31ef7abbec" (UID: "5b434b81-6f6d-48f1-b569-4a31ef7abbec"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:07:11 crc kubenswrapper[5004]: I1201 09:07:11.896785 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b434b81-6f6d-48f1-b569-4a31ef7abbec-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "5b434b81-6f6d-48f1-b569-4a31ef7abbec" (UID: "5b434b81-6f6d-48f1-b569-4a31ef7abbec"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:07:11 crc kubenswrapper[5004]: I1201 09:07:11.936381 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xccnc\" (UniqueName: \"kubernetes.io/projected/5b434b81-6f6d-48f1-b569-4a31ef7abbec-kube-api-access-xccnc\") on node \"crc\" DevicePath \"\"" Dec 01 09:07:11 crc kubenswrapper[5004]: I1201 09:07:11.936423 5004 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5b434b81-6f6d-48f1-b569-4a31ef7abbec-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 09:07:11 crc kubenswrapper[5004]: I1201 09:07:11.936436 5004 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/5b434b81-6f6d-48f1-b569-4a31ef7abbec-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Dec 01 09:07:11 crc kubenswrapper[5004]: I1201 09:07:11.936448 5004 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/5b434b81-6f6d-48f1-b569-4a31ef7abbec-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Dec 01 09:07:11 crc kubenswrapper[5004]: I1201 09:07:11.936464 5004 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/5b434b81-6f6d-48f1-b569-4a31ef7abbec-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Dec 01 09:07:11 crc kubenswrapper[5004]: I1201 09:07:11.936475 5004 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b434b81-6f6d-48f1-b569-4a31ef7abbec-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 09:07:11 crc kubenswrapper[5004]: I1201 09:07:11.936488 5004 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/5b434b81-6f6d-48f1-b569-4a31ef7abbec-nova-extra-config-0\") on node \"crc\" 
DevicePath \"\"" Dec 01 09:07:11 crc kubenswrapper[5004]: I1201 09:07:11.936499 5004 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/5b434b81-6f6d-48f1-b569-4a31ef7abbec-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Dec 01 09:07:11 crc kubenswrapper[5004]: I1201 09:07:11.936511 5004 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5b434b81-6f6d-48f1-b569-4a31ef7abbec-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 09:07:12 crc kubenswrapper[5004]: I1201 09:07:12.227924 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-65pft" event={"ID":"5b434b81-6f6d-48f1-b569-4a31ef7abbec","Type":"ContainerDied","Data":"cf58f62c0921bcf7a70b41da84f0c483574e57eaefbb803ab804dbc45588360f"} Dec 01 09:07:12 crc kubenswrapper[5004]: I1201 09:07:12.227966 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cf58f62c0921bcf7a70b41da84f0c483574e57eaefbb803ab804dbc45588360f" Dec 01 09:07:12 crc kubenswrapper[5004]: I1201 09:07:12.228437 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-65pft" Dec 01 09:07:12 crc kubenswrapper[5004]: I1201 09:07:12.328714 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6h4fj"] Dec 01 09:07:12 crc kubenswrapper[5004]: E1201 09:07:12.329363 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b434b81-6f6d-48f1-b569-4a31ef7abbec" containerName="nova-edpm-deployment-openstack-edpm-ipam" Dec 01 09:07:12 crc kubenswrapper[5004]: I1201 09:07:12.329388 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b434b81-6f6d-48f1-b569-4a31ef7abbec" containerName="nova-edpm-deployment-openstack-edpm-ipam" Dec 01 09:07:12 crc kubenswrapper[5004]: I1201 09:07:12.329648 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b434b81-6f6d-48f1-b569-4a31ef7abbec" containerName="nova-edpm-deployment-openstack-edpm-ipam" Dec 01 09:07:12 crc kubenswrapper[5004]: I1201 09:07:12.330541 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6h4fj" Dec 01 09:07:12 crc kubenswrapper[5004]: I1201 09:07:12.333200 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 09:07:12 crc kubenswrapper[5004]: I1201 09:07:12.333396 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 09:07:12 crc kubenswrapper[5004]: I1201 09:07:12.333521 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-pdnrq" Dec 01 09:07:12 crc kubenswrapper[5004]: I1201 09:07:12.333683 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Dec 01 09:07:12 crc kubenswrapper[5004]: I1201 09:07:12.335374 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 09:07:12 crc kubenswrapper[5004]: I1201 09:07:12.343543 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6h4fj"] Dec 01 09:07:12 crc kubenswrapper[5004]: I1201 09:07:12.447143 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/194e6eb7-a03b-4482-a871-368a20922a87-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-6h4fj\" (UID: \"194e6eb7-a03b-4482-a871-368a20922a87\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6h4fj" Dec 01 09:07:12 crc kubenswrapper[5004]: I1201 09:07:12.447190 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fw88c\" (UniqueName: \"kubernetes.io/projected/194e6eb7-a03b-4482-a871-368a20922a87-kube-api-access-fw88c\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-6h4fj\" (UID: 
\"194e6eb7-a03b-4482-a871-368a20922a87\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6h4fj" Dec 01 09:07:12 crc kubenswrapper[5004]: I1201 09:07:12.447228 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/194e6eb7-a03b-4482-a871-368a20922a87-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-6h4fj\" (UID: \"194e6eb7-a03b-4482-a871-368a20922a87\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6h4fj" Dec 01 09:07:12 crc kubenswrapper[5004]: I1201 09:07:12.447259 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/194e6eb7-a03b-4482-a871-368a20922a87-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-6h4fj\" (UID: \"194e6eb7-a03b-4482-a871-368a20922a87\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6h4fj" Dec 01 09:07:12 crc kubenswrapper[5004]: I1201 09:07:12.447280 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/194e6eb7-a03b-4482-a871-368a20922a87-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-6h4fj\" (UID: \"194e6eb7-a03b-4482-a871-368a20922a87\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6h4fj" Dec 01 09:07:12 crc kubenswrapper[5004]: I1201 09:07:12.447365 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/194e6eb7-a03b-4482-a871-368a20922a87-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-6h4fj\" (UID: \"194e6eb7-a03b-4482-a871-368a20922a87\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6h4fj" Dec 01 09:07:12 crc kubenswrapper[5004]: I1201 09:07:12.447470 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/194e6eb7-a03b-4482-a871-368a20922a87-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-6h4fj\" (UID: \"194e6eb7-a03b-4482-a871-368a20922a87\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6h4fj" Dec 01 09:07:12 crc kubenswrapper[5004]: I1201 09:07:12.549253 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/194e6eb7-a03b-4482-a871-368a20922a87-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-6h4fj\" (UID: \"194e6eb7-a03b-4482-a871-368a20922a87\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6h4fj" Dec 01 09:07:12 crc kubenswrapper[5004]: I1201 09:07:12.549309 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fw88c\" (UniqueName: \"kubernetes.io/projected/194e6eb7-a03b-4482-a871-368a20922a87-kube-api-access-fw88c\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-6h4fj\" (UID: \"194e6eb7-a03b-4482-a871-368a20922a87\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6h4fj" Dec 01 09:07:12 crc kubenswrapper[5004]: I1201 09:07:12.549355 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/194e6eb7-a03b-4482-a871-368a20922a87-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-6h4fj\" (UID: \"194e6eb7-a03b-4482-a871-368a20922a87\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6h4fj" Dec 01 09:07:12 crc kubenswrapper[5004]: I1201 09:07:12.549398 5004 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/194e6eb7-a03b-4482-a871-368a20922a87-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-6h4fj\" (UID: \"194e6eb7-a03b-4482-a871-368a20922a87\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6h4fj" Dec 01 09:07:12 crc kubenswrapper[5004]: I1201 09:07:12.549428 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/194e6eb7-a03b-4482-a871-368a20922a87-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-6h4fj\" (UID: \"194e6eb7-a03b-4482-a871-368a20922a87\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6h4fj" Dec 01 09:07:12 crc kubenswrapper[5004]: I1201 09:07:12.549539 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/194e6eb7-a03b-4482-a871-368a20922a87-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-6h4fj\" (UID: \"194e6eb7-a03b-4482-a871-368a20922a87\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6h4fj" Dec 01 09:07:12 crc kubenswrapper[5004]: I1201 09:07:12.549693 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/194e6eb7-a03b-4482-a871-368a20922a87-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-6h4fj\" (UID: \"194e6eb7-a03b-4482-a871-368a20922a87\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6h4fj" Dec 01 09:07:12 crc kubenswrapper[5004]: I1201 09:07:12.553800 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: 
\"kubernetes.io/secret/194e6eb7-a03b-4482-a871-368a20922a87-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-6h4fj\" (UID: \"194e6eb7-a03b-4482-a871-368a20922a87\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6h4fj" Dec 01 09:07:12 crc kubenswrapper[5004]: I1201 09:07:12.553844 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/194e6eb7-a03b-4482-a871-368a20922a87-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-6h4fj\" (UID: \"194e6eb7-a03b-4482-a871-368a20922a87\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6h4fj" Dec 01 09:07:12 crc kubenswrapper[5004]: I1201 09:07:12.553856 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/194e6eb7-a03b-4482-a871-368a20922a87-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-6h4fj\" (UID: \"194e6eb7-a03b-4482-a871-368a20922a87\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6h4fj" Dec 01 09:07:12 crc kubenswrapper[5004]: I1201 09:07:12.555001 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/194e6eb7-a03b-4482-a871-368a20922a87-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-6h4fj\" (UID: \"194e6eb7-a03b-4482-a871-368a20922a87\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6h4fj" Dec 01 09:07:12 crc kubenswrapper[5004]: I1201 09:07:12.555100 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/194e6eb7-a03b-4482-a871-368a20922a87-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-6h4fj\" (UID: \"194e6eb7-a03b-4482-a871-368a20922a87\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6h4fj" Dec 01 09:07:12 crc kubenswrapper[5004]: I1201 09:07:12.555859 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/194e6eb7-a03b-4482-a871-368a20922a87-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-6h4fj\" (UID: \"194e6eb7-a03b-4482-a871-368a20922a87\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6h4fj" Dec 01 09:07:12 crc kubenswrapper[5004]: I1201 09:07:12.566874 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fw88c\" (UniqueName: \"kubernetes.io/projected/194e6eb7-a03b-4482-a871-368a20922a87-kube-api-access-fw88c\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-6h4fj\" (UID: \"194e6eb7-a03b-4482-a871-368a20922a87\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6h4fj" Dec 01 09:07:12 crc kubenswrapper[5004]: I1201 09:07:12.654479 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6h4fj" Dec 01 09:07:13 crc kubenswrapper[5004]: W1201 09:07:13.232764 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod194e6eb7_a03b_4482_a871_368a20922a87.slice/crio-15edf66b7c84dbebcb9a632bc76b6e7790ea122f6135926337d6befc6d9afa08 WatchSource:0}: Error finding container 15edf66b7c84dbebcb9a632bc76b6e7790ea122f6135926337d6befc6d9afa08: Status 404 returned error can't find the container with id 15edf66b7c84dbebcb9a632bc76b6e7790ea122f6135926337d6befc6d9afa08 Dec 01 09:07:13 crc kubenswrapper[5004]: I1201 09:07:13.235646 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6h4fj"] Dec 01 09:07:13 crc kubenswrapper[5004]: I1201 09:07:13.236390 5004 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 01 09:07:14 crc kubenswrapper[5004]: I1201 09:07:14.249291 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6h4fj" event={"ID":"194e6eb7-a03b-4482-a871-368a20922a87","Type":"ContainerStarted","Data":"58e88aa15ca589e50e6088acc5f2661c83819978d47eeb0629991f4baa8e184a"} Dec 01 09:07:14 crc kubenswrapper[5004]: I1201 09:07:14.249635 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6h4fj" event={"ID":"194e6eb7-a03b-4482-a871-368a20922a87","Type":"ContainerStarted","Data":"15edf66b7c84dbebcb9a632bc76b6e7790ea122f6135926337d6befc6d9afa08"} Dec 01 09:07:14 crc kubenswrapper[5004]: I1201 09:07:14.275481 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6h4fj" podStartSLOduration=1.629281458 podStartE2EDuration="2.275456806s" podCreationTimestamp="2025-12-01 09:07:12 +0000 UTC" 
firstStartedPulling="2025-12-01 09:07:13.236133412 +0000 UTC m=+3010.801125394" lastFinishedPulling="2025-12-01 09:07:13.88230872 +0000 UTC m=+3011.447300742" observedRunningTime="2025-12-01 09:07:14.266034017 +0000 UTC m=+3011.831025999" watchObservedRunningTime="2025-12-01 09:07:14.275456806 +0000 UTC m=+3011.840448788" Dec 01 09:07:38 crc kubenswrapper[5004]: I1201 09:07:38.729725 5004 patch_prober.go:28] interesting pod/machine-config-daemon-fvdgt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 09:07:38 crc kubenswrapper[5004]: I1201 09:07:38.730870 5004 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 09:07:47 crc kubenswrapper[5004]: I1201 09:07:47.150703 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-j4jrl"] Dec 01 09:07:47 crc kubenswrapper[5004]: I1201 09:07:47.154074 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j4jrl" Dec 01 09:07:47 crc kubenswrapper[5004]: I1201 09:07:47.165188 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-j4jrl"] Dec 01 09:07:47 crc kubenswrapper[5004]: I1201 09:07:47.314875 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8fbe095-4c41-41a5-a87a-60d2f0a264f3-utilities\") pod \"redhat-marketplace-j4jrl\" (UID: \"d8fbe095-4c41-41a5-a87a-60d2f0a264f3\") " pod="openshift-marketplace/redhat-marketplace-j4jrl" Dec 01 09:07:47 crc kubenswrapper[5004]: I1201 09:07:47.315176 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8fbe095-4c41-41a5-a87a-60d2f0a264f3-catalog-content\") pod \"redhat-marketplace-j4jrl\" (UID: \"d8fbe095-4c41-41a5-a87a-60d2f0a264f3\") " pod="openshift-marketplace/redhat-marketplace-j4jrl" Dec 01 09:07:47 crc kubenswrapper[5004]: I1201 09:07:47.315415 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dt6cr\" (UniqueName: \"kubernetes.io/projected/d8fbe095-4c41-41a5-a87a-60d2f0a264f3-kube-api-access-dt6cr\") pod \"redhat-marketplace-j4jrl\" (UID: \"d8fbe095-4c41-41a5-a87a-60d2f0a264f3\") " pod="openshift-marketplace/redhat-marketplace-j4jrl" Dec 01 09:07:47 crc kubenswrapper[5004]: I1201 09:07:47.417359 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dt6cr\" (UniqueName: \"kubernetes.io/projected/d8fbe095-4c41-41a5-a87a-60d2f0a264f3-kube-api-access-dt6cr\") pod \"redhat-marketplace-j4jrl\" (UID: \"d8fbe095-4c41-41a5-a87a-60d2f0a264f3\") " pod="openshift-marketplace/redhat-marketplace-j4jrl" Dec 01 09:07:47 crc kubenswrapper[5004]: I1201 09:07:47.417543 5004 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8fbe095-4c41-41a5-a87a-60d2f0a264f3-utilities\") pod \"redhat-marketplace-j4jrl\" (UID: \"d8fbe095-4c41-41a5-a87a-60d2f0a264f3\") " pod="openshift-marketplace/redhat-marketplace-j4jrl" Dec 01 09:07:47 crc kubenswrapper[5004]: I1201 09:07:47.417682 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8fbe095-4c41-41a5-a87a-60d2f0a264f3-catalog-content\") pod \"redhat-marketplace-j4jrl\" (UID: \"d8fbe095-4c41-41a5-a87a-60d2f0a264f3\") " pod="openshift-marketplace/redhat-marketplace-j4jrl" Dec 01 09:07:47 crc kubenswrapper[5004]: I1201 09:07:47.418139 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8fbe095-4c41-41a5-a87a-60d2f0a264f3-catalog-content\") pod \"redhat-marketplace-j4jrl\" (UID: \"d8fbe095-4c41-41a5-a87a-60d2f0a264f3\") " pod="openshift-marketplace/redhat-marketplace-j4jrl" Dec 01 09:07:47 crc kubenswrapper[5004]: I1201 09:07:47.418407 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8fbe095-4c41-41a5-a87a-60d2f0a264f3-utilities\") pod \"redhat-marketplace-j4jrl\" (UID: \"d8fbe095-4c41-41a5-a87a-60d2f0a264f3\") " pod="openshift-marketplace/redhat-marketplace-j4jrl" Dec 01 09:07:47 crc kubenswrapper[5004]: I1201 09:07:47.440022 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dt6cr\" (UniqueName: \"kubernetes.io/projected/d8fbe095-4c41-41a5-a87a-60d2f0a264f3-kube-api-access-dt6cr\") pod \"redhat-marketplace-j4jrl\" (UID: \"d8fbe095-4c41-41a5-a87a-60d2f0a264f3\") " pod="openshift-marketplace/redhat-marketplace-j4jrl" Dec 01 09:07:47 crc kubenswrapper[5004]: I1201 09:07:47.488490 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j4jrl" Dec 01 09:07:48 crc kubenswrapper[5004]: I1201 09:07:48.045745 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-j4jrl"] Dec 01 09:07:48 crc kubenswrapper[5004]: I1201 09:07:48.600666 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j4jrl" event={"ID":"d8fbe095-4c41-41a5-a87a-60d2f0a264f3","Type":"ContainerStarted","Data":"979187322b938fb4dad49c7b2cc3534dd6e48d7f2683a0def08491ecbcd3af0d"} Dec 01 09:07:49 crc kubenswrapper[5004]: I1201 09:07:49.617271 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j4jrl" event={"ID":"d8fbe095-4c41-41a5-a87a-60d2f0a264f3","Type":"ContainerStarted","Data":"7b1644cc91ad2ccf54c77ff30f10771a64b8c4a79d6ee09d5cad2b5433493859"} Dec 01 09:07:50 crc kubenswrapper[5004]: I1201 09:07:50.638644 5004 generic.go:334] "Generic (PLEG): container finished" podID="d8fbe095-4c41-41a5-a87a-60d2f0a264f3" containerID="7b1644cc91ad2ccf54c77ff30f10771a64b8c4a79d6ee09d5cad2b5433493859" exitCode=0 Dec 01 09:07:50 crc kubenswrapper[5004]: I1201 09:07:50.639015 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j4jrl" event={"ID":"d8fbe095-4c41-41a5-a87a-60d2f0a264f3","Type":"ContainerDied","Data":"7b1644cc91ad2ccf54c77ff30f10771a64b8c4a79d6ee09d5cad2b5433493859"} Dec 01 09:07:52 crc kubenswrapper[5004]: I1201 09:07:52.660829 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j4jrl" event={"ID":"d8fbe095-4c41-41a5-a87a-60d2f0a264f3","Type":"ContainerStarted","Data":"3ee425bdd0aef4ec68a611be10605d1d8c3df8dae35aee8a713686d7dc405ea4"} Dec 01 09:07:53 crc kubenswrapper[5004]: I1201 09:07:53.674724 5004 generic.go:334] "Generic (PLEG): container finished" podID="d8fbe095-4c41-41a5-a87a-60d2f0a264f3" 
containerID="3ee425bdd0aef4ec68a611be10605d1d8c3df8dae35aee8a713686d7dc405ea4" exitCode=0 Dec 01 09:07:53 crc kubenswrapper[5004]: I1201 09:07:53.674886 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j4jrl" event={"ID":"d8fbe095-4c41-41a5-a87a-60d2f0a264f3","Type":"ContainerDied","Data":"3ee425bdd0aef4ec68a611be10605d1d8c3df8dae35aee8a713686d7dc405ea4"} Dec 01 09:07:55 crc kubenswrapper[5004]: I1201 09:07:55.700078 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j4jrl" event={"ID":"d8fbe095-4c41-41a5-a87a-60d2f0a264f3","Type":"ContainerStarted","Data":"262a59686b6cccd713057dd062dd0d8af9e851b56c56622f07fe17746dd383b3"} Dec 01 09:07:55 crc kubenswrapper[5004]: I1201 09:07:55.723426 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-j4jrl" podStartSLOduration=4.834646923 podStartE2EDuration="8.723410362s" podCreationTimestamp="2025-12-01 09:07:47 +0000 UTC" firstStartedPulling="2025-12-01 09:07:50.643620481 +0000 UTC m=+3048.208612473" lastFinishedPulling="2025-12-01 09:07:54.53238393 +0000 UTC m=+3052.097375912" observedRunningTime="2025-12-01 09:07:55.718318307 +0000 UTC m=+3053.283310289" watchObservedRunningTime="2025-12-01 09:07:55.723410362 +0000 UTC m=+3053.288402344" Dec 01 09:07:57 crc kubenswrapper[5004]: I1201 09:07:57.489408 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-j4jrl" Dec 01 09:07:57 crc kubenswrapper[5004]: I1201 09:07:57.490099 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-j4jrl" Dec 01 09:07:57 crc kubenswrapper[5004]: I1201 09:07:57.542264 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-j4jrl" Dec 01 09:08:07 crc kubenswrapper[5004]: I1201 09:08:07.554097 
5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-j4jrl" Dec 01 09:08:07 crc kubenswrapper[5004]: I1201 09:08:07.616534 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-j4jrl"] Dec 01 09:08:07 crc kubenswrapper[5004]: I1201 09:08:07.874816 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-j4jrl" podUID="d8fbe095-4c41-41a5-a87a-60d2f0a264f3" containerName="registry-server" containerID="cri-o://262a59686b6cccd713057dd062dd0d8af9e851b56c56622f07fe17746dd383b3" gracePeriod=2 Dec 01 09:08:08 crc kubenswrapper[5004]: I1201 09:08:08.729333 5004 patch_prober.go:28] interesting pod/machine-config-daemon-fvdgt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 09:08:08 crc kubenswrapper[5004]: I1201 09:08:08.729627 5004 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 09:08:08 crc kubenswrapper[5004]: I1201 09:08:08.890545 5004 generic.go:334] "Generic (PLEG): container finished" podID="d8fbe095-4c41-41a5-a87a-60d2f0a264f3" containerID="262a59686b6cccd713057dd062dd0d8af9e851b56c56622f07fe17746dd383b3" exitCode=0 Dec 01 09:08:08 crc kubenswrapper[5004]: I1201 09:08:08.890638 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j4jrl" event={"ID":"d8fbe095-4c41-41a5-a87a-60d2f0a264f3","Type":"ContainerDied","Data":"262a59686b6cccd713057dd062dd0d8af9e851b56c56622f07fe17746dd383b3"} Dec 01 
09:08:09 crc kubenswrapper[5004]: I1201 09:08:09.001836 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j4jrl" Dec 01 09:08:09 crc kubenswrapper[5004]: I1201 09:08:09.120029 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dt6cr\" (UniqueName: \"kubernetes.io/projected/d8fbe095-4c41-41a5-a87a-60d2f0a264f3-kube-api-access-dt6cr\") pod \"d8fbe095-4c41-41a5-a87a-60d2f0a264f3\" (UID: \"d8fbe095-4c41-41a5-a87a-60d2f0a264f3\") " Dec 01 09:08:09 crc kubenswrapper[5004]: I1201 09:08:09.120199 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8fbe095-4c41-41a5-a87a-60d2f0a264f3-utilities\") pod \"d8fbe095-4c41-41a5-a87a-60d2f0a264f3\" (UID: \"d8fbe095-4c41-41a5-a87a-60d2f0a264f3\") " Dec 01 09:08:09 crc kubenswrapper[5004]: I1201 09:08:09.120414 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8fbe095-4c41-41a5-a87a-60d2f0a264f3-catalog-content\") pod \"d8fbe095-4c41-41a5-a87a-60d2f0a264f3\" (UID: \"d8fbe095-4c41-41a5-a87a-60d2f0a264f3\") " Dec 01 09:08:09 crc kubenswrapper[5004]: I1201 09:08:09.120838 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8fbe095-4c41-41a5-a87a-60d2f0a264f3-utilities" (OuterVolumeSpecName: "utilities") pod "d8fbe095-4c41-41a5-a87a-60d2f0a264f3" (UID: "d8fbe095-4c41-41a5-a87a-60d2f0a264f3"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:08:09 crc kubenswrapper[5004]: I1201 09:08:09.128543 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8fbe095-4c41-41a5-a87a-60d2f0a264f3-kube-api-access-dt6cr" (OuterVolumeSpecName: "kube-api-access-dt6cr") pod "d8fbe095-4c41-41a5-a87a-60d2f0a264f3" (UID: "d8fbe095-4c41-41a5-a87a-60d2f0a264f3"). InnerVolumeSpecName "kube-api-access-dt6cr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:08:09 crc kubenswrapper[5004]: I1201 09:08:09.128654 5004 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8fbe095-4c41-41a5-a87a-60d2f0a264f3-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:09 crc kubenswrapper[5004]: I1201 09:08:09.140369 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8fbe095-4c41-41a5-a87a-60d2f0a264f3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d8fbe095-4c41-41a5-a87a-60d2f0a264f3" (UID: "d8fbe095-4c41-41a5-a87a-60d2f0a264f3"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:08:09 crc kubenswrapper[5004]: I1201 09:08:09.231235 5004 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8fbe095-4c41-41a5-a87a-60d2f0a264f3-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:09 crc kubenswrapper[5004]: I1201 09:08:09.231279 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dt6cr\" (UniqueName: \"kubernetes.io/projected/d8fbe095-4c41-41a5-a87a-60d2f0a264f3-kube-api-access-dt6cr\") on node \"crc\" DevicePath \"\"" Dec 01 09:08:09 crc kubenswrapper[5004]: I1201 09:08:09.909906 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j4jrl" event={"ID":"d8fbe095-4c41-41a5-a87a-60d2f0a264f3","Type":"ContainerDied","Data":"979187322b938fb4dad49c7b2cc3534dd6e48d7f2683a0def08491ecbcd3af0d"} Dec 01 09:08:09 crc kubenswrapper[5004]: I1201 09:08:09.910255 5004 scope.go:117] "RemoveContainer" containerID="262a59686b6cccd713057dd062dd0d8af9e851b56c56622f07fe17746dd383b3" Dec 01 09:08:09 crc kubenswrapper[5004]: I1201 09:08:09.909969 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j4jrl" Dec 01 09:08:09 crc kubenswrapper[5004]: I1201 09:08:09.963754 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-j4jrl"] Dec 01 09:08:09 crc kubenswrapper[5004]: I1201 09:08:09.966306 5004 scope.go:117] "RemoveContainer" containerID="3ee425bdd0aef4ec68a611be10605d1d8c3df8dae35aee8a713686d7dc405ea4" Dec 01 09:08:09 crc kubenswrapper[5004]: I1201 09:08:09.992935 5004 scope.go:117] "RemoveContainer" containerID="7b1644cc91ad2ccf54c77ff30f10771a64b8c4a79d6ee09d5cad2b5433493859" Dec 01 09:08:09 crc kubenswrapper[5004]: I1201 09:08:09.995913 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-j4jrl"] Dec 01 09:08:10 crc kubenswrapper[5004]: I1201 09:08:10.773273 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8fbe095-4c41-41a5-a87a-60d2f0a264f3" path="/var/lib/kubelet/pods/d8fbe095-4c41-41a5-a87a-60d2f0a264f3/volumes" Dec 01 09:08:38 crc kubenswrapper[5004]: I1201 09:08:38.729241 5004 patch_prober.go:28] interesting pod/machine-config-daemon-fvdgt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 09:08:38 crc kubenswrapper[5004]: I1201 09:08:38.729945 5004 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 09:08:38 crc kubenswrapper[5004]: I1201 09:08:38.730026 5004 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" Dec 01 
09:08:38 crc kubenswrapper[5004]: I1201 09:08:38.731675 5004 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"959c4dd48a6a01a111184092f5278271df46b0993f7d823dbccdaf0fe68ac515"} pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 09:08:38 crc kubenswrapper[5004]: I1201 09:08:38.731838 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" containerName="machine-config-daemon" containerID="cri-o://959c4dd48a6a01a111184092f5278271df46b0993f7d823dbccdaf0fe68ac515" gracePeriod=600 Dec 01 09:08:39 crc kubenswrapper[5004]: I1201 09:08:39.275905 5004 generic.go:334] "Generic (PLEG): container finished" podID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" containerID="959c4dd48a6a01a111184092f5278271df46b0993f7d823dbccdaf0fe68ac515" exitCode=0 Dec 01 09:08:39 crc kubenswrapper[5004]: I1201 09:08:39.276009 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" event={"ID":"f9977ebb-82de-4e96-8763-0b5a84f8d4ce","Type":"ContainerDied","Data":"959c4dd48a6a01a111184092f5278271df46b0993f7d823dbccdaf0fe68ac515"} Dec 01 09:08:39 crc kubenswrapper[5004]: I1201 09:08:39.276363 5004 scope.go:117] "RemoveContainer" containerID="a78d694ae3feedb53c9bcdd2efb1f355640963d1a6d9fa69392d3b2ab8f0f9a6" Dec 01 09:08:39 crc kubenswrapper[5004]: E1201 09:08:39.874210 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 09:08:40 crc kubenswrapper[5004]: I1201 09:08:40.288964 5004 scope.go:117] "RemoveContainer" containerID="959c4dd48a6a01a111184092f5278271df46b0993f7d823dbccdaf0fe68ac515" Dec 01 09:08:40 crc kubenswrapper[5004]: E1201 09:08:40.289274 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 09:08:54 crc kubenswrapper[5004]: I1201 09:08:54.759971 5004 scope.go:117] "RemoveContainer" containerID="959c4dd48a6a01a111184092f5278271df46b0993f7d823dbccdaf0fe68ac515" Dec 01 09:08:54 crc kubenswrapper[5004]: E1201 09:08:54.762140 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 09:09:06 crc kubenswrapper[5004]: I1201 09:09:06.760206 5004 scope.go:117] "RemoveContainer" containerID="959c4dd48a6a01a111184092f5278271df46b0993f7d823dbccdaf0fe68ac515" Dec 01 09:09:06 crc kubenswrapper[5004]: E1201 09:09:06.760914 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 09:09:17 crc kubenswrapper[5004]: I1201 09:09:17.760037 5004 scope.go:117] "RemoveContainer" containerID="959c4dd48a6a01a111184092f5278271df46b0993f7d823dbccdaf0fe68ac515" Dec 01 09:09:17 crc kubenswrapper[5004]: E1201 09:09:17.761806 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 09:09:32 crc kubenswrapper[5004]: I1201 09:09:32.776425 5004 scope.go:117] "RemoveContainer" containerID="959c4dd48a6a01a111184092f5278271df46b0993f7d823dbccdaf0fe68ac515" Dec 01 09:09:32 crc kubenswrapper[5004]: E1201 09:09:32.777984 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 09:09:46 crc kubenswrapper[5004]: I1201 09:09:46.759988 5004 scope.go:117] "RemoveContainer" containerID="959c4dd48a6a01a111184092f5278271df46b0993f7d823dbccdaf0fe68ac515" Dec 01 09:09:46 crc kubenswrapper[5004]: E1201 09:09:46.760983 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 09:09:54 crc kubenswrapper[5004]: I1201 09:09:54.252412 5004 generic.go:334] "Generic (PLEG): container finished" podID="194e6eb7-a03b-4482-a871-368a20922a87" containerID="58e88aa15ca589e50e6088acc5f2661c83819978d47eeb0629991f4baa8e184a" exitCode=0 Dec 01 09:09:54 crc kubenswrapper[5004]: I1201 09:09:54.253209 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6h4fj" event={"ID":"194e6eb7-a03b-4482-a871-368a20922a87","Type":"ContainerDied","Data":"58e88aa15ca589e50e6088acc5f2661c83819978d47eeb0629991f4baa8e184a"} Dec 01 09:09:55 crc kubenswrapper[5004]: I1201 09:09:55.853606 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6h4fj" Dec 01 09:09:56 crc kubenswrapper[5004]: I1201 09:09:56.032206 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/194e6eb7-a03b-4482-a871-368a20922a87-ceilometer-compute-config-data-1\") pod \"194e6eb7-a03b-4482-a871-368a20922a87\" (UID: \"194e6eb7-a03b-4482-a871-368a20922a87\") " Dec 01 09:09:56 crc kubenswrapper[5004]: I1201 09:09:56.032689 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/194e6eb7-a03b-4482-a871-368a20922a87-inventory\") pod \"194e6eb7-a03b-4482-a871-368a20922a87\" (UID: \"194e6eb7-a03b-4482-a871-368a20922a87\") " Dec 01 09:09:56 crc kubenswrapper[5004]: I1201 09:09:56.032742 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fw88c\" (UniqueName: 
\"kubernetes.io/projected/194e6eb7-a03b-4482-a871-368a20922a87-kube-api-access-fw88c\") pod \"194e6eb7-a03b-4482-a871-368a20922a87\" (UID: \"194e6eb7-a03b-4482-a871-368a20922a87\") "
Dec 01 09:09:56 crc kubenswrapper[5004]: I1201 09:09:56.032772 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/194e6eb7-a03b-4482-a871-368a20922a87-ssh-key\") pod \"194e6eb7-a03b-4482-a871-368a20922a87\" (UID: \"194e6eb7-a03b-4482-a871-368a20922a87\") "
Dec 01 09:09:56 crc kubenswrapper[5004]: I1201 09:09:56.032863 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/194e6eb7-a03b-4482-a871-368a20922a87-telemetry-combined-ca-bundle\") pod \"194e6eb7-a03b-4482-a871-368a20922a87\" (UID: \"194e6eb7-a03b-4482-a871-368a20922a87\") "
Dec 01 09:09:56 crc kubenswrapper[5004]: I1201 09:09:56.032952 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/194e6eb7-a03b-4482-a871-368a20922a87-ceilometer-compute-config-data-0\") pod \"194e6eb7-a03b-4482-a871-368a20922a87\" (UID: \"194e6eb7-a03b-4482-a871-368a20922a87\") "
Dec 01 09:09:56 crc kubenswrapper[5004]: I1201 09:09:56.032984 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/194e6eb7-a03b-4482-a871-368a20922a87-ceilometer-compute-config-data-2\") pod \"194e6eb7-a03b-4482-a871-368a20922a87\" (UID: \"194e6eb7-a03b-4482-a871-368a20922a87\") "
Dec 01 09:09:56 crc kubenswrapper[5004]: I1201 09:09:56.038801 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/194e6eb7-a03b-4482-a871-368a20922a87-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "194e6eb7-a03b-4482-a871-368a20922a87" (UID: "194e6eb7-a03b-4482-a871-368a20922a87"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 09:09:56 crc kubenswrapper[5004]: I1201 09:09:56.048252 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/194e6eb7-a03b-4482-a871-368a20922a87-kube-api-access-fw88c" (OuterVolumeSpecName: "kube-api-access-fw88c") pod "194e6eb7-a03b-4482-a871-368a20922a87" (UID: "194e6eb7-a03b-4482-a871-368a20922a87"). InnerVolumeSpecName "kube-api-access-fw88c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 09:09:56 crc kubenswrapper[5004]: I1201 09:09:56.068586 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/194e6eb7-a03b-4482-a871-368a20922a87-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "194e6eb7-a03b-4482-a871-368a20922a87" (UID: "194e6eb7-a03b-4482-a871-368a20922a87"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 09:09:56 crc kubenswrapper[5004]: I1201 09:09:56.069789 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/194e6eb7-a03b-4482-a871-368a20922a87-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "194e6eb7-a03b-4482-a871-368a20922a87" (UID: "194e6eb7-a03b-4482-a871-368a20922a87"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 09:09:56 crc kubenswrapper[5004]: I1201 09:09:56.070341 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/194e6eb7-a03b-4482-a871-368a20922a87-inventory" (OuterVolumeSpecName: "inventory") pod "194e6eb7-a03b-4482-a871-368a20922a87" (UID: "194e6eb7-a03b-4482-a871-368a20922a87"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 09:09:56 crc kubenswrapper[5004]: I1201 09:09:56.070942 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/194e6eb7-a03b-4482-a871-368a20922a87-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "194e6eb7-a03b-4482-a871-368a20922a87" (UID: "194e6eb7-a03b-4482-a871-368a20922a87"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 09:09:56 crc kubenswrapper[5004]: I1201 09:09:56.108283 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/194e6eb7-a03b-4482-a871-368a20922a87-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "194e6eb7-a03b-4482-a871-368a20922a87" (UID: "194e6eb7-a03b-4482-a871-368a20922a87"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 09:09:56 crc kubenswrapper[5004]: I1201 09:09:56.136197 5004 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/194e6eb7-a03b-4482-a871-368a20922a87-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\""
Dec 01 09:09:56 crc kubenswrapper[5004]: I1201 09:09:56.136425 5004 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/194e6eb7-a03b-4482-a871-368a20922a87-inventory\") on node \"crc\" DevicePath \"\""
Dec 01 09:09:56 crc kubenswrapper[5004]: I1201 09:09:56.136503 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fw88c\" (UniqueName: \"kubernetes.io/projected/194e6eb7-a03b-4482-a871-368a20922a87-kube-api-access-fw88c\") on node \"crc\" DevicePath \"\""
Dec 01 09:09:56 crc kubenswrapper[5004]: I1201 09:09:56.136603 5004 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/194e6eb7-a03b-4482-a871-368a20922a87-ssh-key\") on node \"crc\" DevicePath \"\""
Dec 01 09:09:56 crc kubenswrapper[5004]: I1201 09:09:56.136677 5004 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/194e6eb7-a03b-4482-a871-368a20922a87-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 01 09:09:56 crc kubenswrapper[5004]: I1201 09:09:56.136751 5004 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/194e6eb7-a03b-4482-a871-368a20922a87-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\""
Dec 01 09:09:56 crc kubenswrapper[5004]: I1201 09:09:56.136842 5004 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/194e6eb7-a03b-4482-a871-368a20922a87-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\""
Dec 01 09:09:56 crc kubenswrapper[5004]: I1201 09:09:56.275910 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6h4fj" event={"ID":"194e6eb7-a03b-4482-a871-368a20922a87","Type":"ContainerDied","Data":"15edf66b7c84dbebcb9a632bc76b6e7790ea122f6135926337d6befc6d9afa08"}
Dec 01 09:09:56 crc kubenswrapper[5004]: I1201 09:09:56.275946 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="15edf66b7c84dbebcb9a632bc76b6e7790ea122f6135926337d6befc6d9afa08"
Dec 01 09:09:56 crc kubenswrapper[5004]: I1201 09:09:56.275952 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6h4fj"
Dec 01 09:09:56 crc kubenswrapper[5004]: I1201 09:09:56.396934 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-rs57q"]
Dec 01 09:09:56 crc kubenswrapper[5004]: E1201 09:09:56.397859 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8fbe095-4c41-41a5-a87a-60d2f0a264f3" containerName="extract-content"
Dec 01 09:09:56 crc kubenswrapper[5004]: I1201 09:09:56.397888 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8fbe095-4c41-41a5-a87a-60d2f0a264f3" containerName="extract-content"
Dec 01 09:09:56 crc kubenswrapper[5004]: E1201 09:09:56.397914 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8fbe095-4c41-41a5-a87a-60d2f0a264f3" containerName="registry-server"
Dec 01 09:09:56 crc kubenswrapper[5004]: I1201 09:09:56.397927 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8fbe095-4c41-41a5-a87a-60d2f0a264f3" containerName="registry-server"
Dec 01 09:09:56 crc kubenswrapper[5004]: E1201 09:09:56.397973 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8fbe095-4c41-41a5-a87a-60d2f0a264f3" containerName="extract-utilities"
Dec 01 09:09:56 crc kubenswrapper[5004]: I1201 09:09:56.397988 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8fbe095-4c41-41a5-a87a-60d2f0a264f3" containerName="extract-utilities"
Dec 01 09:09:56 crc kubenswrapper[5004]: E1201 09:09:56.398011 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="194e6eb7-a03b-4482-a871-368a20922a87" containerName="telemetry-edpm-deployment-openstack-edpm-ipam"
Dec 01 09:09:56 crc kubenswrapper[5004]: I1201 09:09:56.398026 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="194e6eb7-a03b-4482-a871-368a20922a87" containerName="telemetry-edpm-deployment-openstack-edpm-ipam"
Dec 01 09:09:56 crc kubenswrapper[5004]: I1201 09:09:56.398517 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8fbe095-4c41-41a5-a87a-60d2f0a264f3" containerName="registry-server"
Dec 01 09:09:56 crc kubenswrapper[5004]: I1201 09:09:56.398614 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="194e6eb7-a03b-4482-a871-368a20922a87" containerName="telemetry-edpm-deployment-openstack-edpm-ipam"
Dec 01 09:09:56 crc kubenswrapper[5004]: I1201 09:09:56.400240 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-rs57q"
Dec 01 09:09:56 crc kubenswrapper[5004]: I1201 09:09:56.403762 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-pdnrq"
Dec 01 09:09:56 crc kubenswrapper[5004]: I1201 09:09:56.404186 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Dec 01 09:09:56 crc kubenswrapper[5004]: I1201 09:09:56.404487 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Dec 01 09:09:56 crc kubenswrapper[5004]: I1201 09:09:56.404840 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-ipmi-config-data"
Dec 01 09:09:56 crc kubenswrapper[5004]: I1201 09:09:56.410471 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Dec 01 09:09:56 crc kubenswrapper[5004]: I1201 09:09:56.415741 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-rs57q"]
Dec 01 09:09:56 crc kubenswrapper[5004]: I1201 09:09:56.443584 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/6a233136-7248-4994-a3d6-0108bbf72fef-ceilometer-ipmi-config-data-2\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-rs57q\" (UID: \"6a233136-7248-4994-a3d6-0108bbf72fef\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-rs57q"
Dec 01 09:09:56 crc kubenswrapper[5004]: I1201 09:09:56.443662 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6a233136-7248-4994-a3d6-0108bbf72fef-inventory\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-rs57q\" (UID: \"6a233136-7248-4994-a3d6-0108bbf72fef\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-rs57q"
Dec 01 09:09:56 crc kubenswrapper[5004]: I1201 09:09:56.443755 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a233136-7248-4994-a3d6-0108bbf72fef-telemetry-power-monitoring-combined-ca-bundle\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-rs57q\" (UID: \"6a233136-7248-4994-a3d6-0108bbf72fef\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-rs57q"
Dec 01 09:09:56 crc kubenswrapper[5004]: I1201 09:09:56.443816 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6a233136-7248-4994-a3d6-0108bbf72fef-ssh-key\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-rs57q\" (UID: \"6a233136-7248-4994-a3d6-0108bbf72fef\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-rs57q"
Dec 01 09:09:56 crc kubenswrapper[5004]: I1201 09:09:56.443842 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-274zt\" (UniqueName: \"kubernetes.io/projected/6a233136-7248-4994-a3d6-0108bbf72fef-kube-api-access-274zt\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-rs57q\" (UID: \"6a233136-7248-4994-a3d6-0108bbf72fef\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-rs57q"
Dec 01 09:09:56 crc kubenswrapper[5004]: I1201 09:09:56.443891 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: \"kubernetes.io/secret/6a233136-7248-4994-a3d6-0108bbf72fef-ceilometer-ipmi-config-data-0\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-rs57q\" (UID: \"6a233136-7248-4994-a3d6-0108bbf72fef\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-rs57q"
Dec 01 09:09:56 crc kubenswrapper[5004]: I1201 09:09:56.443922 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: \"kubernetes.io/secret/6a233136-7248-4994-a3d6-0108bbf72fef-ceilometer-ipmi-config-data-1\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-rs57q\" (UID: \"6a233136-7248-4994-a3d6-0108bbf72fef\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-rs57q"
Dec 01 09:09:56 crc kubenswrapper[5004]: I1201 09:09:56.546207 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6a233136-7248-4994-a3d6-0108bbf72fef-ssh-key\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-rs57q\" (UID: \"6a233136-7248-4994-a3d6-0108bbf72fef\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-rs57q"
Dec 01 09:09:56 crc kubenswrapper[5004]: I1201 09:09:56.546305 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-274zt\" (UniqueName: \"kubernetes.io/projected/6a233136-7248-4994-a3d6-0108bbf72fef-kube-api-access-274zt\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-rs57q\" (UID: \"6a233136-7248-4994-a3d6-0108bbf72fef\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-rs57q"
Dec 01 09:09:56 crc kubenswrapper[5004]: I1201 09:09:56.546359 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: \"kubernetes.io/secret/6a233136-7248-4994-a3d6-0108bbf72fef-ceilometer-ipmi-config-data-0\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-rs57q\" (UID: \"6a233136-7248-4994-a3d6-0108bbf72fef\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-rs57q"
Dec 01 09:09:56 crc kubenswrapper[5004]: I1201 09:09:56.546410 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: \"kubernetes.io/secret/6a233136-7248-4994-a3d6-0108bbf72fef-ceilometer-ipmi-config-data-1\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-rs57q\" (UID: \"6a233136-7248-4994-a3d6-0108bbf72fef\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-rs57q"
Dec 01 09:09:56 crc kubenswrapper[5004]: I1201 09:09:56.546509 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/6a233136-7248-4994-a3d6-0108bbf72fef-ceilometer-ipmi-config-data-2\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-rs57q\" (UID: \"6a233136-7248-4994-a3d6-0108bbf72fef\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-rs57q"
Dec 01 09:09:56 crc kubenswrapper[5004]: I1201 09:09:56.546568 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6a233136-7248-4994-a3d6-0108bbf72fef-inventory\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-rs57q\" (UID: \"6a233136-7248-4994-a3d6-0108bbf72fef\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-rs57q"
Dec 01 09:09:56 crc kubenswrapper[5004]: I1201 09:09:56.547305 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a233136-7248-4994-a3d6-0108bbf72fef-telemetry-power-monitoring-combined-ca-bundle\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-rs57q\" (UID: \"6a233136-7248-4994-a3d6-0108bbf72fef\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-rs57q"
Dec 01 09:09:56 crc kubenswrapper[5004]: I1201 09:09:56.551235 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6a233136-7248-4994-a3d6-0108bbf72fef-ssh-key\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-rs57q\" (UID: \"6a233136-7248-4994-a3d6-0108bbf72fef\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-rs57q"
Dec 01 09:09:56 crc kubenswrapper[5004]: I1201 09:09:56.551281 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/6a233136-7248-4994-a3d6-0108bbf72fef-ceilometer-ipmi-config-data-2\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-rs57q\" (UID: \"6a233136-7248-4994-a3d6-0108bbf72fef\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-rs57q"
Dec 01 09:09:56 crc kubenswrapper[5004]: I1201 09:09:56.551363 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6a233136-7248-4994-a3d6-0108bbf72fef-inventory\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-rs57q\" (UID: \"6a233136-7248-4994-a3d6-0108bbf72fef\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-rs57q"
Dec 01 09:09:56 crc kubenswrapper[5004]: I1201 09:09:56.551514 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: \"kubernetes.io/secret/6a233136-7248-4994-a3d6-0108bbf72fef-ceilometer-ipmi-config-data-1\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-rs57q\" (UID: \"6a233136-7248-4994-a3d6-0108bbf72fef\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-rs57q"
Dec 01 09:09:56 crc kubenswrapper[5004]: I1201 09:09:56.552161 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a233136-7248-4994-a3d6-0108bbf72fef-telemetry-power-monitoring-combined-ca-bundle\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-rs57q\" (UID: \"6a233136-7248-4994-a3d6-0108bbf72fef\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-rs57q"
Dec 01 09:09:56 crc kubenswrapper[5004]: I1201 09:09:56.552261 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: \"kubernetes.io/secret/6a233136-7248-4994-a3d6-0108bbf72fef-ceilometer-ipmi-config-data-0\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-rs57q\" (UID: \"6a233136-7248-4994-a3d6-0108bbf72fef\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-rs57q"
Dec 01 09:09:56 crc kubenswrapper[5004]: I1201 09:09:56.565526 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-274zt\" (UniqueName: \"kubernetes.io/projected/6a233136-7248-4994-a3d6-0108bbf72fef-kube-api-access-274zt\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-rs57q\" (UID: \"6a233136-7248-4994-a3d6-0108bbf72fef\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-rs57q"
Dec 01 09:09:56 crc kubenswrapper[5004]: I1201 09:09:56.734476 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-rs57q"
Dec 01 09:09:57 crc kubenswrapper[5004]: I1201 09:09:57.347733 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-rs57q"]
Dec 01 09:09:58 crc kubenswrapper[5004]: I1201 09:09:58.313744 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-rs57q" event={"ID":"6a233136-7248-4994-a3d6-0108bbf72fef","Type":"ContainerStarted","Data":"833995a0f7b26bc5757530b1801b4f4d3ff79ba57d3d8be31a528cf2c743e8c3"}
Dec 01 09:09:59 crc kubenswrapper[5004]: I1201 09:09:59.328159 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-rs57q" event={"ID":"6a233136-7248-4994-a3d6-0108bbf72fef","Type":"ContainerStarted","Data":"e435d8aebaaf7c01ecaea4634c429b468e5ace8d1b8d4bf8e19378595a25f782"}
Dec 01 09:09:59 crc kubenswrapper[5004]: I1201 09:09:59.355827 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-rs57q" podStartSLOduration=2.5649172780000002 podStartE2EDuration="3.355806823s" podCreationTimestamp="2025-12-01 09:09:56 +0000 UTC" firstStartedPulling="2025-12-01 09:09:57.356143833 +0000 UTC m=+3174.921135815" lastFinishedPulling="2025-12-01 09:09:58.147033378 +0000 UTC m=+3175.712025360" observedRunningTime="2025-12-01 09:09:59.349290493 +0000 UTC m=+3176.914282495" watchObservedRunningTime="2025-12-01 09:09:59.355806823 +0000 UTC m=+3176.920798805"
Dec 01 09:10:01 crc kubenswrapper[5004]: I1201 09:10:01.758744 5004 scope.go:117] "RemoveContainer" containerID="959c4dd48a6a01a111184092f5278271df46b0993f7d823dbccdaf0fe68ac515"
Dec 01 09:10:01 crc kubenswrapper[5004]: E1201 09:10:01.759495 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce"
Dec 01 09:10:15 crc kubenswrapper[5004]: I1201 09:10:15.760368 5004 scope.go:117] "RemoveContainer" containerID="959c4dd48a6a01a111184092f5278271df46b0993f7d823dbccdaf0fe68ac515"
Dec 01 09:10:15 crc kubenswrapper[5004]: E1201 09:10:15.761930 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce"
Dec 01 09:10:28 crc kubenswrapper[5004]: I1201 09:10:28.759943 5004 scope.go:117] "RemoveContainer" containerID="959c4dd48a6a01a111184092f5278271df46b0993f7d823dbccdaf0fe68ac515"
Dec 01 09:10:28 crc kubenswrapper[5004]: E1201 09:10:28.760749 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce"
Dec 01 09:10:33 crc kubenswrapper[5004]: I1201 09:10:33.609662 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-jd9cf"]
Dec 01 09:10:33 crc kubenswrapper[5004]: I1201 09:10:33.624605 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jd9cf"
Dec 01 09:10:33 crc kubenswrapper[5004]: I1201 09:10:33.638424 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jd9cf"]
Dec 01 09:10:33 crc kubenswrapper[5004]: I1201 09:10:33.788147 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d29e6f3e-9fc3-4357-8d96-4f9115b0c841-catalog-content\") pod \"certified-operators-jd9cf\" (UID: \"d29e6f3e-9fc3-4357-8d96-4f9115b0c841\") " pod="openshift-marketplace/certified-operators-jd9cf"
Dec 01 09:10:33 crc kubenswrapper[5004]: I1201 09:10:33.788299 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkpr7\" (UniqueName: \"kubernetes.io/projected/d29e6f3e-9fc3-4357-8d96-4f9115b0c841-kube-api-access-qkpr7\") pod \"certified-operators-jd9cf\" (UID: \"d29e6f3e-9fc3-4357-8d96-4f9115b0c841\") " pod="openshift-marketplace/certified-operators-jd9cf"
Dec 01 09:10:33 crc kubenswrapper[5004]: I1201 09:10:33.788509 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d29e6f3e-9fc3-4357-8d96-4f9115b0c841-utilities\") pod \"certified-operators-jd9cf\" (UID: \"d29e6f3e-9fc3-4357-8d96-4f9115b0c841\") " pod="openshift-marketplace/certified-operators-jd9cf"
Dec 01 09:10:33 crc kubenswrapper[5004]: I1201 09:10:33.890838 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d29e6f3e-9fc3-4357-8d96-4f9115b0c841-utilities\") pod \"certified-operators-jd9cf\" (UID: \"d29e6f3e-9fc3-4357-8d96-4f9115b0c841\") " pod="openshift-marketplace/certified-operators-jd9cf"
Dec 01 09:10:33 crc kubenswrapper[5004]: I1201 09:10:33.890970 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d29e6f3e-9fc3-4357-8d96-4f9115b0c841-catalog-content\") pod \"certified-operators-jd9cf\" (UID: \"d29e6f3e-9fc3-4357-8d96-4f9115b0c841\") " pod="openshift-marketplace/certified-operators-jd9cf"
Dec 01 09:10:33 crc kubenswrapper[5004]: I1201 09:10:33.891054 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qkpr7\" (UniqueName: \"kubernetes.io/projected/d29e6f3e-9fc3-4357-8d96-4f9115b0c841-kube-api-access-qkpr7\") pod \"certified-operators-jd9cf\" (UID: \"d29e6f3e-9fc3-4357-8d96-4f9115b0c841\") " pod="openshift-marketplace/certified-operators-jd9cf"
Dec 01 09:10:33 crc kubenswrapper[5004]: I1201 09:10:33.891490 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d29e6f3e-9fc3-4357-8d96-4f9115b0c841-catalog-content\") pod \"certified-operators-jd9cf\" (UID: \"d29e6f3e-9fc3-4357-8d96-4f9115b0c841\") " pod="openshift-marketplace/certified-operators-jd9cf"
Dec 01 09:10:33 crc kubenswrapper[5004]: I1201 09:10:33.891484 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d29e6f3e-9fc3-4357-8d96-4f9115b0c841-utilities\") pod \"certified-operators-jd9cf\" (UID: \"d29e6f3e-9fc3-4357-8d96-4f9115b0c841\") " pod="openshift-marketplace/certified-operators-jd9cf"
Dec 01 09:10:33 crc kubenswrapper[5004]: I1201 09:10:33.919943 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkpr7\" (UniqueName: \"kubernetes.io/projected/d29e6f3e-9fc3-4357-8d96-4f9115b0c841-kube-api-access-qkpr7\") pod \"certified-operators-jd9cf\" (UID: \"d29e6f3e-9fc3-4357-8d96-4f9115b0c841\") " pod="openshift-marketplace/certified-operators-jd9cf"
Dec 01 09:10:33 crc kubenswrapper[5004]: I1201 09:10:33.962086 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jd9cf"
Dec 01 09:10:34 crc kubenswrapper[5004]: I1201 09:10:34.484132 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jd9cf"]
Dec 01 09:10:34 crc kubenswrapper[5004]: I1201 09:10:34.756495 5004 generic.go:334] "Generic (PLEG): container finished" podID="d29e6f3e-9fc3-4357-8d96-4f9115b0c841" containerID="67c849fee9fa664c1c07fe49f4899653cfa76d8293066adea49fd41a10f1f0fd" exitCode=0
Dec 01 09:10:34 crc kubenswrapper[5004]: I1201 09:10:34.756535 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jd9cf" event={"ID":"d29e6f3e-9fc3-4357-8d96-4f9115b0c841","Type":"ContainerDied","Data":"67c849fee9fa664c1c07fe49f4899653cfa76d8293066adea49fd41a10f1f0fd"}
Dec 01 09:10:34 crc kubenswrapper[5004]: I1201 09:10:34.756929 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jd9cf" event={"ID":"d29e6f3e-9fc3-4357-8d96-4f9115b0c841","Type":"ContainerStarted","Data":"41b326285bd19d9d285338dcbfeb6a42e73e1ecb95faec96f787fed48245517f"}
Dec 01 09:10:34 crc kubenswrapper[5004]: E1201 09:10:34.930480 5004 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd29e6f3e_9fc3_4357_8d96_4f9115b0c841.slice/crio-67c849fee9fa664c1c07fe49f4899653cfa76d8293066adea49fd41a10f1f0fd.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd29e6f3e_9fc3_4357_8d96_4f9115b0c841.slice/crio-conmon-67c849fee9fa664c1c07fe49f4899653cfa76d8293066adea49fd41a10f1f0fd.scope\": RecentStats: unable to find data in memory cache]"
Dec 01 09:10:35 crc kubenswrapper[5004]: I1201 09:10:35.768096 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jd9cf" event={"ID":"d29e6f3e-9fc3-4357-8d96-4f9115b0c841","Type":"ContainerStarted","Data":"5d74d0a361d387ce4af55c6fadb78f633915b5be9a538b7bfeac9bede605b45d"}
Dec 01 09:10:37 crc kubenswrapper[5004]: I1201 09:10:37.798840 5004 generic.go:334] "Generic (PLEG): container finished" podID="d29e6f3e-9fc3-4357-8d96-4f9115b0c841" containerID="5d74d0a361d387ce4af55c6fadb78f633915b5be9a538b7bfeac9bede605b45d" exitCode=0
Dec 01 09:10:37 crc kubenswrapper[5004]: I1201 09:10:37.798903 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jd9cf" event={"ID":"d29e6f3e-9fc3-4357-8d96-4f9115b0c841","Type":"ContainerDied","Data":"5d74d0a361d387ce4af55c6fadb78f633915b5be9a538b7bfeac9bede605b45d"}
Dec 01 09:10:39 crc kubenswrapper[5004]: I1201 09:10:39.829049 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jd9cf" event={"ID":"d29e6f3e-9fc3-4357-8d96-4f9115b0c841","Type":"ContainerStarted","Data":"26aa2fc50d0a12abd78d46516f439ce17d3f163eee15fdea29baeeceafa29bcb"}
Dec 01 09:10:39 crc kubenswrapper[5004]: I1201 09:10:39.858513 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-jd9cf" podStartSLOduration=2.863273161 podStartE2EDuration="6.858484851s" podCreationTimestamp="2025-12-01 09:10:33 +0000 UTC" firstStartedPulling="2025-12-01 09:10:34.758973149 +0000 UTC m=+3212.323965131" lastFinishedPulling="2025-12-01 09:10:38.754184829 +0000 UTC m=+3216.319176821" observedRunningTime="2025-12-01 09:10:39.854466433 +0000 UTC m=+3217.419458435" watchObservedRunningTime="2025-12-01 09:10:39.858484851 +0000 UTC m=+3217.423476853"
Dec 01 09:10:42 crc kubenswrapper[5004]: I1201 09:10:42.769121 5004 scope.go:117] "RemoveContainer" containerID="959c4dd48a6a01a111184092f5278271df46b0993f7d823dbccdaf0fe68ac515"
Dec 01 09:10:42 crc kubenswrapper[5004]: E1201 09:10:42.769839 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce"
Dec 01 09:10:43 crc kubenswrapper[5004]: I1201 09:10:43.962579 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-jd9cf"
Dec 01 09:10:43 crc kubenswrapper[5004]: I1201 09:10:43.962955 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-jd9cf"
Dec 01 09:10:44 crc kubenswrapper[5004]: I1201 09:10:44.040809 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-jd9cf"
Dec 01 09:10:44 crc kubenswrapper[5004]: I1201 09:10:44.972392 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-jd9cf"
Dec 01 09:10:45 crc kubenswrapper[5004]: I1201 09:10:45.370996 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jd9cf"]
Dec 01 09:10:46 crc kubenswrapper[5004]: I1201 09:10:46.918157 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-jd9cf" podUID="d29e6f3e-9fc3-4357-8d96-4f9115b0c841" containerName="registry-server" containerID="cri-o://26aa2fc50d0a12abd78d46516f439ce17d3f163eee15fdea29baeeceafa29bcb" gracePeriod=2
Dec 01 09:10:47 crc kubenswrapper[5004]: I1201 09:10:47.472230 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jd9cf"
Dec 01 09:10:47 crc kubenswrapper[5004]: I1201 09:10:47.541410 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d29e6f3e-9fc3-4357-8d96-4f9115b0c841-utilities\") pod \"d29e6f3e-9fc3-4357-8d96-4f9115b0c841\" (UID: \"d29e6f3e-9fc3-4357-8d96-4f9115b0c841\") "
Dec 01 09:10:47 crc kubenswrapper[5004]: I1201 09:10:47.541686 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qkpr7\" (UniqueName: \"kubernetes.io/projected/d29e6f3e-9fc3-4357-8d96-4f9115b0c841-kube-api-access-qkpr7\") pod \"d29e6f3e-9fc3-4357-8d96-4f9115b0c841\" (UID: \"d29e6f3e-9fc3-4357-8d96-4f9115b0c841\") "
Dec 01 09:10:47 crc kubenswrapper[5004]: I1201 09:10:47.541887 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d29e6f3e-9fc3-4357-8d96-4f9115b0c841-catalog-content\") pod \"d29e6f3e-9fc3-4357-8d96-4f9115b0c841\" (UID: \"d29e6f3e-9fc3-4357-8d96-4f9115b0c841\") "
Dec 01 09:10:47 crc kubenswrapper[5004]: I1201 09:10:47.542814 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d29e6f3e-9fc3-4357-8d96-4f9115b0c841-utilities" (OuterVolumeSpecName: "utilities") pod "d29e6f3e-9fc3-4357-8d96-4f9115b0c841" (UID: "d29e6f3e-9fc3-4357-8d96-4f9115b0c841"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 09:10:47 crc kubenswrapper[5004]: I1201 09:10:47.551610 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d29e6f3e-9fc3-4357-8d96-4f9115b0c841-kube-api-access-qkpr7" (OuterVolumeSpecName: "kube-api-access-qkpr7") pod "d29e6f3e-9fc3-4357-8d96-4f9115b0c841" (UID: "d29e6f3e-9fc3-4357-8d96-4f9115b0c841"). InnerVolumeSpecName "kube-api-access-qkpr7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 09:10:47 crc kubenswrapper[5004]: I1201 09:10:47.588753 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d29e6f3e-9fc3-4357-8d96-4f9115b0c841-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d29e6f3e-9fc3-4357-8d96-4f9115b0c841" (UID: "d29e6f3e-9fc3-4357-8d96-4f9115b0c841"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 09:10:47 crc kubenswrapper[5004]: I1201 09:10:47.644150 5004 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d29e6f3e-9fc3-4357-8d96-4f9115b0c841-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 01 09:10:47 crc kubenswrapper[5004]: I1201 09:10:47.644214 5004 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d29e6f3e-9fc3-4357-8d96-4f9115b0c841-utilities\") on node \"crc\" DevicePath \"\""
Dec 01 09:10:47 crc kubenswrapper[5004]: I1201 09:10:47.644225 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qkpr7\" (UniqueName: \"kubernetes.io/projected/d29e6f3e-9fc3-4357-8d96-4f9115b0c841-kube-api-access-qkpr7\") on node \"crc\" DevicePath \"\""
Dec 01 09:10:47 crc kubenswrapper[5004]: I1201 09:10:47.945734 5004 generic.go:334] "Generic (PLEG): container finished" podID="d29e6f3e-9fc3-4357-8d96-4f9115b0c841" containerID="26aa2fc50d0a12abd78d46516f439ce17d3f163eee15fdea29baeeceafa29bcb" exitCode=0
Dec 01 09:10:47 crc kubenswrapper[5004]: I1201 09:10:47.945808 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jd9cf" event={"ID":"d29e6f3e-9fc3-4357-8d96-4f9115b0c841","Type":"ContainerDied","Data":"26aa2fc50d0a12abd78d46516f439ce17d3f163eee15fdea29baeeceafa29bcb"}
Dec 01 09:10:47 crc kubenswrapper[5004]: I1201 09:10:47.947243 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jd9cf" event={"ID":"d29e6f3e-9fc3-4357-8d96-4f9115b0c841","Type":"ContainerDied","Data":"41b326285bd19d9d285338dcbfeb6a42e73e1ecb95faec96f787fed48245517f"}
Dec 01 09:10:47 crc kubenswrapper[5004]: I1201 09:10:47.945882 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jd9cf"
Dec 01 09:10:47 crc kubenswrapper[5004]: I1201 09:10:47.947278 5004 scope.go:117] "RemoveContainer" containerID="26aa2fc50d0a12abd78d46516f439ce17d3f163eee15fdea29baeeceafa29bcb"
Dec 01 09:10:47 crc kubenswrapper[5004]: I1201 09:10:47.990914 5004 scope.go:117] "RemoveContainer" containerID="5d74d0a361d387ce4af55c6fadb78f633915b5be9a538b7bfeac9bede605b45d"
Dec 01 09:10:47 crc kubenswrapper[5004]: I1201 09:10:47.993184 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jd9cf"]
Dec 01 09:10:48 crc kubenswrapper[5004]: I1201 09:10:48.009277 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-jd9cf"]
Dec 01 09:10:48 crc kubenswrapper[5004]: I1201 09:10:48.025543 5004 scope.go:117] "RemoveContainer" containerID="67c849fee9fa664c1c07fe49f4899653cfa76d8293066adea49fd41a10f1f0fd"
Dec 01 09:10:48 crc kubenswrapper[5004]: I1201 09:10:48.079603 5004 scope.go:117] "RemoveContainer" containerID="26aa2fc50d0a12abd78d46516f439ce17d3f163eee15fdea29baeeceafa29bcb"
Dec 01 09:10:48 crc kubenswrapper[5004]: E1201 09:10:48.080152 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26aa2fc50d0a12abd78d46516f439ce17d3f163eee15fdea29baeeceafa29bcb\": container with ID starting with 26aa2fc50d0a12abd78d46516f439ce17d3f163eee15fdea29baeeceafa29bcb not found: ID does not exist" containerID="26aa2fc50d0a12abd78d46516f439ce17d3f163eee15fdea29baeeceafa29bcb"
Dec 01 09:10:48 crc kubenswrapper[5004]: I1201
09:10:48.080233 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26aa2fc50d0a12abd78d46516f439ce17d3f163eee15fdea29baeeceafa29bcb"} err="failed to get container status \"26aa2fc50d0a12abd78d46516f439ce17d3f163eee15fdea29baeeceafa29bcb\": rpc error: code = NotFound desc = could not find container \"26aa2fc50d0a12abd78d46516f439ce17d3f163eee15fdea29baeeceafa29bcb\": container with ID starting with 26aa2fc50d0a12abd78d46516f439ce17d3f163eee15fdea29baeeceafa29bcb not found: ID does not exist" Dec 01 09:10:48 crc kubenswrapper[5004]: I1201 09:10:48.080278 5004 scope.go:117] "RemoveContainer" containerID="5d74d0a361d387ce4af55c6fadb78f633915b5be9a538b7bfeac9bede605b45d" Dec 01 09:10:48 crc kubenswrapper[5004]: E1201 09:10:48.080906 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d74d0a361d387ce4af55c6fadb78f633915b5be9a538b7bfeac9bede605b45d\": container with ID starting with 5d74d0a361d387ce4af55c6fadb78f633915b5be9a538b7bfeac9bede605b45d not found: ID does not exist" containerID="5d74d0a361d387ce4af55c6fadb78f633915b5be9a538b7bfeac9bede605b45d" Dec 01 09:10:48 crc kubenswrapper[5004]: I1201 09:10:48.080962 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d74d0a361d387ce4af55c6fadb78f633915b5be9a538b7bfeac9bede605b45d"} err="failed to get container status \"5d74d0a361d387ce4af55c6fadb78f633915b5be9a538b7bfeac9bede605b45d\": rpc error: code = NotFound desc = could not find container \"5d74d0a361d387ce4af55c6fadb78f633915b5be9a538b7bfeac9bede605b45d\": container with ID starting with 5d74d0a361d387ce4af55c6fadb78f633915b5be9a538b7bfeac9bede605b45d not found: ID does not exist" Dec 01 09:10:48 crc kubenswrapper[5004]: I1201 09:10:48.080998 5004 scope.go:117] "RemoveContainer" containerID="67c849fee9fa664c1c07fe49f4899653cfa76d8293066adea49fd41a10f1f0fd" Dec 01 09:10:48 crc 
kubenswrapper[5004]: E1201 09:10:48.081302 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"67c849fee9fa664c1c07fe49f4899653cfa76d8293066adea49fd41a10f1f0fd\": container with ID starting with 67c849fee9fa664c1c07fe49f4899653cfa76d8293066adea49fd41a10f1f0fd not found: ID does not exist" containerID="67c849fee9fa664c1c07fe49f4899653cfa76d8293066adea49fd41a10f1f0fd" Dec 01 09:10:48 crc kubenswrapper[5004]: I1201 09:10:48.081379 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67c849fee9fa664c1c07fe49f4899653cfa76d8293066adea49fd41a10f1f0fd"} err="failed to get container status \"67c849fee9fa664c1c07fe49f4899653cfa76d8293066adea49fd41a10f1f0fd\": rpc error: code = NotFound desc = could not find container \"67c849fee9fa664c1c07fe49f4899653cfa76d8293066adea49fd41a10f1f0fd\": container with ID starting with 67c849fee9fa664c1c07fe49f4899653cfa76d8293066adea49fd41a10f1f0fd not found: ID does not exist" Dec 01 09:10:48 crc kubenswrapper[5004]: I1201 09:10:48.801234 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d29e6f3e-9fc3-4357-8d96-4f9115b0c841" path="/var/lib/kubelet/pods/d29e6f3e-9fc3-4357-8d96-4f9115b0c841/volumes" Dec 01 09:10:53 crc kubenswrapper[5004]: I1201 09:10:53.759916 5004 scope.go:117] "RemoveContainer" containerID="959c4dd48a6a01a111184092f5278271df46b0993f7d823dbccdaf0fe68ac515" Dec 01 09:10:53 crc kubenswrapper[5004]: E1201 09:10:53.761079 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 09:11:07 crc 
kubenswrapper[5004]: I1201 09:11:07.759001 5004 scope.go:117] "RemoveContainer" containerID="959c4dd48a6a01a111184092f5278271df46b0993f7d823dbccdaf0fe68ac515" Dec 01 09:11:07 crc kubenswrapper[5004]: E1201 09:11:07.760902 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 09:11:18 crc kubenswrapper[5004]: I1201 09:11:18.288482 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-b4rmb"] Dec 01 09:11:18 crc kubenswrapper[5004]: E1201 09:11:18.289664 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d29e6f3e-9fc3-4357-8d96-4f9115b0c841" containerName="extract-utilities" Dec 01 09:11:18 crc kubenswrapper[5004]: I1201 09:11:18.289712 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="d29e6f3e-9fc3-4357-8d96-4f9115b0c841" containerName="extract-utilities" Dec 01 09:11:18 crc kubenswrapper[5004]: E1201 09:11:18.289730 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d29e6f3e-9fc3-4357-8d96-4f9115b0c841" containerName="registry-server" Dec 01 09:11:18 crc kubenswrapper[5004]: I1201 09:11:18.289736 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="d29e6f3e-9fc3-4357-8d96-4f9115b0c841" containerName="registry-server" Dec 01 09:11:18 crc kubenswrapper[5004]: E1201 09:11:18.289751 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d29e6f3e-9fc3-4357-8d96-4f9115b0c841" containerName="extract-content" Dec 01 09:11:18 crc kubenswrapper[5004]: I1201 09:11:18.289759 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="d29e6f3e-9fc3-4357-8d96-4f9115b0c841" 
containerName="extract-content" Dec 01 09:11:18 crc kubenswrapper[5004]: I1201 09:11:18.290073 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="d29e6f3e-9fc3-4357-8d96-4f9115b0c841" containerName="registry-server" Dec 01 09:11:18 crc kubenswrapper[5004]: I1201 09:11:18.291808 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-b4rmb" Dec 01 09:11:18 crc kubenswrapper[5004]: I1201 09:11:18.304321 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-b4rmb"] Dec 01 09:11:18 crc kubenswrapper[5004]: I1201 09:11:18.426920 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01e64821-a32d-475f-b7d7-267c92533a55-catalog-content\") pod \"redhat-operators-b4rmb\" (UID: \"01e64821-a32d-475f-b7d7-267c92533a55\") " pod="openshift-marketplace/redhat-operators-b4rmb" Dec 01 09:11:18 crc kubenswrapper[5004]: I1201 09:11:18.426999 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-656zf\" (UniqueName: \"kubernetes.io/projected/01e64821-a32d-475f-b7d7-267c92533a55-kube-api-access-656zf\") pod \"redhat-operators-b4rmb\" (UID: \"01e64821-a32d-475f-b7d7-267c92533a55\") " pod="openshift-marketplace/redhat-operators-b4rmb" Dec 01 09:11:18 crc kubenswrapper[5004]: I1201 09:11:18.427110 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01e64821-a32d-475f-b7d7-267c92533a55-utilities\") pod \"redhat-operators-b4rmb\" (UID: \"01e64821-a32d-475f-b7d7-267c92533a55\") " pod="openshift-marketplace/redhat-operators-b4rmb" Dec 01 09:11:18 crc kubenswrapper[5004]: I1201 09:11:18.528597 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-656zf\" 
(UniqueName: \"kubernetes.io/projected/01e64821-a32d-475f-b7d7-267c92533a55-kube-api-access-656zf\") pod \"redhat-operators-b4rmb\" (UID: \"01e64821-a32d-475f-b7d7-267c92533a55\") " pod="openshift-marketplace/redhat-operators-b4rmb" Dec 01 09:11:18 crc kubenswrapper[5004]: I1201 09:11:18.528736 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01e64821-a32d-475f-b7d7-267c92533a55-utilities\") pod \"redhat-operators-b4rmb\" (UID: \"01e64821-a32d-475f-b7d7-267c92533a55\") " pod="openshift-marketplace/redhat-operators-b4rmb" Dec 01 09:11:18 crc kubenswrapper[5004]: I1201 09:11:18.528843 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01e64821-a32d-475f-b7d7-267c92533a55-catalog-content\") pod \"redhat-operators-b4rmb\" (UID: \"01e64821-a32d-475f-b7d7-267c92533a55\") " pod="openshift-marketplace/redhat-operators-b4rmb" Dec 01 09:11:18 crc kubenswrapper[5004]: I1201 09:11:18.529436 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01e64821-a32d-475f-b7d7-267c92533a55-catalog-content\") pod \"redhat-operators-b4rmb\" (UID: \"01e64821-a32d-475f-b7d7-267c92533a55\") " pod="openshift-marketplace/redhat-operators-b4rmb" Dec 01 09:11:18 crc kubenswrapper[5004]: I1201 09:11:18.530080 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01e64821-a32d-475f-b7d7-267c92533a55-utilities\") pod \"redhat-operators-b4rmb\" (UID: \"01e64821-a32d-475f-b7d7-267c92533a55\") " pod="openshift-marketplace/redhat-operators-b4rmb" Dec 01 09:11:18 crc kubenswrapper[5004]: I1201 09:11:18.563526 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-656zf\" (UniqueName: 
\"kubernetes.io/projected/01e64821-a32d-475f-b7d7-267c92533a55-kube-api-access-656zf\") pod \"redhat-operators-b4rmb\" (UID: \"01e64821-a32d-475f-b7d7-267c92533a55\") " pod="openshift-marketplace/redhat-operators-b4rmb" Dec 01 09:11:18 crc kubenswrapper[5004]: I1201 09:11:18.620787 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-b4rmb" Dec 01 09:11:18 crc kubenswrapper[5004]: I1201 09:11:18.758882 5004 scope.go:117] "RemoveContainer" containerID="959c4dd48a6a01a111184092f5278271df46b0993f7d823dbccdaf0fe68ac515" Dec 01 09:11:18 crc kubenswrapper[5004]: E1201 09:11:18.759244 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 09:11:19 crc kubenswrapper[5004]: I1201 09:11:19.128295 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-b4rmb"] Dec 01 09:11:19 crc kubenswrapper[5004]: I1201 09:11:19.378899 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b4rmb" event={"ID":"01e64821-a32d-475f-b7d7-267c92533a55","Type":"ContainerStarted","Data":"e820221b17e6412b12fb137fc9977fa726b82980b43d4ed4ab859b53832131f8"} Dec 01 09:11:21 crc kubenswrapper[5004]: I1201 09:11:21.401671 5004 generic.go:334] "Generic (PLEG): container finished" podID="01e64821-a32d-475f-b7d7-267c92533a55" containerID="f56d91eed845f343708cfa5accb5b0df8714aa4835a75974b14ade0861fc97f0" exitCode=0 Dec 01 09:11:21 crc kubenswrapper[5004]: I1201 09:11:21.401761 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b4rmb" 
event={"ID":"01e64821-a32d-475f-b7d7-267c92533a55","Type":"ContainerDied","Data":"f56d91eed845f343708cfa5accb5b0df8714aa4835a75974b14ade0861fc97f0"} Dec 01 09:11:22 crc kubenswrapper[5004]: I1201 09:11:22.441981 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b4rmb" event={"ID":"01e64821-a32d-475f-b7d7-267c92533a55","Type":"ContainerStarted","Data":"42f821639daf299b034f3b4cc245b7f079a0779ab9a84819a8df9b9c25fae122"} Dec 01 09:11:25 crc kubenswrapper[5004]: I1201 09:11:25.477178 5004 generic.go:334] "Generic (PLEG): container finished" podID="01e64821-a32d-475f-b7d7-267c92533a55" containerID="42f821639daf299b034f3b4cc245b7f079a0779ab9a84819a8df9b9c25fae122" exitCode=0 Dec 01 09:11:25 crc kubenswrapper[5004]: I1201 09:11:25.477250 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b4rmb" event={"ID":"01e64821-a32d-475f-b7d7-267c92533a55","Type":"ContainerDied","Data":"42f821639daf299b034f3b4cc245b7f079a0779ab9a84819a8df9b9c25fae122"} Dec 01 09:11:26 crc kubenswrapper[5004]: I1201 09:11:26.494180 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b4rmb" event={"ID":"01e64821-a32d-475f-b7d7-267c92533a55","Type":"ContainerStarted","Data":"04c634ce76b654397217b815979ab7abdeb6b50b99b7d56fd187dad0012b0e60"} Dec 01 09:11:26 crc kubenswrapper[5004]: I1201 09:11:26.515070 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-b4rmb" podStartSLOduration=3.808610435 podStartE2EDuration="8.515056644s" podCreationTimestamp="2025-12-01 09:11:18 +0000 UTC" firstStartedPulling="2025-12-01 09:11:21.404501271 +0000 UTC m=+3258.969493253" lastFinishedPulling="2025-12-01 09:11:26.11094745 +0000 UTC m=+3263.675939462" observedRunningTime="2025-12-01 09:11:26.513591708 +0000 UTC m=+3264.078583690" watchObservedRunningTime="2025-12-01 09:11:26.515056644 +0000 UTC m=+3264.080048636" Dec 
01 09:11:28 crc kubenswrapper[5004]: I1201 09:11:28.621815 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-b4rmb" Dec 01 09:11:28 crc kubenswrapper[5004]: I1201 09:11:28.622230 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-b4rmb" Dec 01 09:11:29 crc kubenswrapper[5004]: I1201 09:11:29.671054 5004 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-b4rmb" podUID="01e64821-a32d-475f-b7d7-267c92533a55" containerName="registry-server" probeResult="failure" output=< Dec 01 09:11:29 crc kubenswrapper[5004]: timeout: failed to connect service ":50051" within 1s Dec 01 09:11:29 crc kubenswrapper[5004]: > Dec 01 09:11:30 crc kubenswrapper[5004]: I1201 09:11:30.759175 5004 scope.go:117] "RemoveContainer" containerID="959c4dd48a6a01a111184092f5278271df46b0993f7d823dbccdaf0fe68ac515" Dec 01 09:11:30 crc kubenswrapper[5004]: E1201 09:11:30.759828 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 09:11:38 crc kubenswrapper[5004]: I1201 09:11:38.668235 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-b4rmb" Dec 01 09:11:38 crc kubenswrapper[5004]: I1201 09:11:38.720313 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-b4rmb" Dec 01 09:11:38 crc kubenswrapper[5004]: I1201 09:11:38.908788 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-operators-b4rmb"] Dec 01 09:11:40 crc kubenswrapper[5004]: I1201 09:11:40.651278 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-b4rmb" podUID="01e64821-a32d-475f-b7d7-267c92533a55" containerName="registry-server" containerID="cri-o://04c634ce76b654397217b815979ab7abdeb6b50b99b7d56fd187dad0012b0e60" gracePeriod=2 Dec 01 09:11:41 crc kubenswrapper[5004]: I1201 09:11:41.671277 5004 generic.go:334] "Generic (PLEG): container finished" podID="01e64821-a32d-475f-b7d7-267c92533a55" containerID="04c634ce76b654397217b815979ab7abdeb6b50b99b7d56fd187dad0012b0e60" exitCode=0 Dec 01 09:11:41 crc kubenswrapper[5004]: I1201 09:11:41.671591 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b4rmb" event={"ID":"01e64821-a32d-475f-b7d7-267c92533a55","Type":"ContainerDied","Data":"04c634ce76b654397217b815979ab7abdeb6b50b99b7d56fd187dad0012b0e60"} Dec 01 09:11:41 crc kubenswrapper[5004]: I1201 09:11:41.671624 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b4rmb" event={"ID":"01e64821-a32d-475f-b7d7-267c92533a55","Type":"ContainerDied","Data":"e820221b17e6412b12fb137fc9977fa726b82980b43d4ed4ab859b53832131f8"} Dec 01 09:11:41 crc kubenswrapper[5004]: I1201 09:11:41.671638 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e820221b17e6412b12fb137fc9977fa726b82980b43d4ed4ab859b53832131f8" Dec 01 09:11:41 crc kubenswrapper[5004]: I1201 09:11:41.698681 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-b4rmb" Dec 01 09:11:41 crc kubenswrapper[5004]: I1201 09:11:41.827304 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01e64821-a32d-475f-b7d7-267c92533a55-utilities\") pod \"01e64821-a32d-475f-b7d7-267c92533a55\" (UID: \"01e64821-a32d-475f-b7d7-267c92533a55\") " Dec 01 09:11:41 crc kubenswrapper[5004]: I1201 09:11:41.827427 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01e64821-a32d-475f-b7d7-267c92533a55-catalog-content\") pod \"01e64821-a32d-475f-b7d7-267c92533a55\" (UID: \"01e64821-a32d-475f-b7d7-267c92533a55\") " Dec 01 09:11:41 crc kubenswrapper[5004]: I1201 09:11:41.827558 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-656zf\" (UniqueName: \"kubernetes.io/projected/01e64821-a32d-475f-b7d7-267c92533a55-kube-api-access-656zf\") pod \"01e64821-a32d-475f-b7d7-267c92533a55\" (UID: \"01e64821-a32d-475f-b7d7-267c92533a55\") " Dec 01 09:11:41 crc kubenswrapper[5004]: I1201 09:11:41.828129 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01e64821-a32d-475f-b7d7-267c92533a55-utilities" (OuterVolumeSpecName: "utilities") pod "01e64821-a32d-475f-b7d7-267c92533a55" (UID: "01e64821-a32d-475f-b7d7-267c92533a55"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:11:41 crc kubenswrapper[5004]: I1201 09:11:41.828263 5004 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01e64821-a32d-475f-b7d7-267c92533a55-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 09:11:41 crc kubenswrapper[5004]: I1201 09:11:41.833384 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01e64821-a32d-475f-b7d7-267c92533a55-kube-api-access-656zf" (OuterVolumeSpecName: "kube-api-access-656zf") pod "01e64821-a32d-475f-b7d7-267c92533a55" (UID: "01e64821-a32d-475f-b7d7-267c92533a55"). InnerVolumeSpecName "kube-api-access-656zf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:11:41 crc kubenswrapper[5004]: I1201 09:11:41.930650 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-656zf\" (UniqueName: \"kubernetes.io/projected/01e64821-a32d-475f-b7d7-267c92533a55-kube-api-access-656zf\") on node \"crc\" DevicePath \"\"" Dec 01 09:11:41 crc kubenswrapper[5004]: I1201 09:11:41.939618 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01e64821-a32d-475f-b7d7-267c92533a55-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "01e64821-a32d-475f-b7d7-267c92533a55" (UID: "01e64821-a32d-475f-b7d7-267c92533a55"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:11:42 crc kubenswrapper[5004]: I1201 09:11:42.033191 5004 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01e64821-a32d-475f-b7d7-267c92533a55-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 09:11:42 crc kubenswrapper[5004]: I1201 09:11:42.682470 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-b4rmb" Dec 01 09:11:42 crc kubenswrapper[5004]: I1201 09:11:42.719789 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-b4rmb"] Dec 01 09:11:42 crc kubenswrapper[5004]: I1201 09:11:42.731449 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-b4rmb"] Dec 01 09:11:42 crc kubenswrapper[5004]: I1201 09:11:42.774192 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01e64821-a32d-475f-b7d7-267c92533a55" path="/var/lib/kubelet/pods/01e64821-a32d-475f-b7d7-267c92533a55/volumes" Dec 01 09:11:44 crc kubenswrapper[5004]: I1201 09:11:44.759001 5004 scope.go:117] "RemoveContainer" containerID="959c4dd48a6a01a111184092f5278271df46b0993f7d823dbccdaf0fe68ac515" Dec 01 09:11:44 crc kubenswrapper[5004]: E1201 09:11:44.759709 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 09:11:57 crc kubenswrapper[5004]: I1201 09:11:57.759976 5004 scope.go:117] "RemoveContainer" containerID="959c4dd48a6a01a111184092f5278271df46b0993f7d823dbccdaf0fe68ac515" Dec 01 09:11:57 crc kubenswrapper[5004]: E1201 09:11:57.761063 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" 
podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 09:12:07 crc kubenswrapper[5004]: I1201 09:12:07.976695 5004 generic.go:334] "Generic (PLEG): container finished" podID="6a233136-7248-4994-a3d6-0108bbf72fef" containerID="e435d8aebaaf7c01ecaea4634c429b468e5ace8d1b8d4bf8e19378595a25f782" exitCode=0 Dec 01 09:12:07 crc kubenswrapper[5004]: I1201 09:12:07.976805 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-rs57q" event={"ID":"6a233136-7248-4994-a3d6-0108bbf72fef","Type":"ContainerDied","Data":"e435d8aebaaf7c01ecaea4634c429b468e5ace8d1b8d4bf8e19378595a25f782"} Dec 01 09:12:08 crc kubenswrapper[5004]: I1201 09:12:08.760059 5004 scope.go:117] "RemoveContainer" containerID="959c4dd48a6a01a111184092f5278271df46b0993f7d823dbccdaf0fe68ac515" Dec 01 09:12:08 crc kubenswrapper[5004]: E1201 09:12:08.760803 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 09:12:09 crc kubenswrapper[5004]: I1201 09:12:09.448963 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-rs57q" Dec 01 09:12:09 crc kubenswrapper[5004]: I1201 09:12:09.523940 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: \"kubernetes.io/secret/6a233136-7248-4994-a3d6-0108bbf72fef-ceilometer-ipmi-config-data-0\") pod \"6a233136-7248-4994-a3d6-0108bbf72fef\" (UID: \"6a233136-7248-4994-a3d6-0108bbf72fef\") " Dec 01 09:12:09 crc kubenswrapper[5004]: I1201 09:12:09.524047 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a233136-7248-4994-a3d6-0108bbf72fef-telemetry-power-monitoring-combined-ca-bundle\") pod \"6a233136-7248-4994-a3d6-0108bbf72fef\" (UID: \"6a233136-7248-4994-a3d6-0108bbf72fef\") " Dec 01 09:12:09 crc kubenswrapper[5004]: I1201 09:12:09.524076 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/6a233136-7248-4994-a3d6-0108bbf72fef-ceilometer-ipmi-config-data-2\") pod \"6a233136-7248-4994-a3d6-0108bbf72fef\" (UID: \"6a233136-7248-4994-a3d6-0108bbf72fef\") " Dec 01 09:12:09 crc kubenswrapper[5004]: I1201 09:12:09.524245 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6a233136-7248-4994-a3d6-0108bbf72fef-ssh-key\") pod \"6a233136-7248-4994-a3d6-0108bbf72fef\" (UID: \"6a233136-7248-4994-a3d6-0108bbf72fef\") " Dec 01 09:12:09 crc kubenswrapper[5004]: I1201 09:12:09.524332 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: \"kubernetes.io/secret/6a233136-7248-4994-a3d6-0108bbf72fef-ceilometer-ipmi-config-data-1\") pod \"6a233136-7248-4994-a3d6-0108bbf72fef\" (UID: \"6a233136-7248-4994-a3d6-0108bbf72fef\") " 
Dec 01 09:12:09 crc kubenswrapper[5004]: I1201 09:12:09.524404 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6a233136-7248-4994-a3d6-0108bbf72fef-inventory\") pod \"6a233136-7248-4994-a3d6-0108bbf72fef\" (UID: \"6a233136-7248-4994-a3d6-0108bbf72fef\") " Dec 01 09:12:09 crc kubenswrapper[5004]: I1201 09:12:09.524438 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-274zt\" (UniqueName: \"kubernetes.io/projected/6a233136-7248-4994-a3d6-0108bbf72fef-kube-api-access-274zt\") pod \"6a233136-7248-4994-a3d6-0108bbf72fef\" (UID: \"6a233136-7248-4994-a3d6-0108bbf72fef\") " Dec 01 09:12:09 crc kubenswrapper[5004]: I1201 09:12:09.536142 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a233136-7248-4994-a3d6-0108bbf72fef-kube-api-access-274zt" (OuterVolumeSpecName: "kube-api-access-274zt") pod "6a233136-7248-4994-a3d6-0108bbf72fef" (UID: "6a233136-7248-4994-a3d6-0108bbf72fef"). InnerVolumeSpecName "kube-api-access-274zt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:12:09 crc kubenswrapper[5004]: I1201 09:12:09.537440 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a233136-7248-4994-a3d6-0108bbf72fef-telemetry-power-monitoring-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-power-monitoring-combined-ca-bundle") pod "6a233136-7248-4994-a3d6-0108bbf72fef" (UID: "6a233136-7248-4994-a3d6-0108bbf72fef"). InnerVolumeSpecName "telemetry-power-monitoring-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:12:09 crc kubenswrapper[5004]: I1201 09:12:09.558450 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a233136-7248-4994-a3d6-0108bbf72fef-ceilometer-ipmi-config-data-0" (OuterVolumeSpecName: "ceilometer-ipmi-config-data-0") pod "6a233136-7248-4994-a3d6-0108bbf72fef" (UID: "6a233136-7248-4994-a3d6-0108bbf72fef"). InnerVolumeSpecName "ceilometer-ipmi-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:12:09 crc kubenswrapper[5004]: I1201 09:12:09.560354 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a233136-7248-4994-a3d6-0108bbf72fef-inventory" (OuterVolumeSpecName: "inventory") pod "6a233136-7248-4994-a3d6-0108bbf72fef" (UID: "6a233136-7248-4994-a3d6-0108bbf72fef"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:12:09 crc kubenswrapper[5004]: I1201 09:12:09.565858 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a233136-7248-4994-a3d6-0108bbf72fef-ceilometer-ipmi-config-data-1" (OuterVolumeSpecName: "ceilometer-ipmi-config-data-1") pod "6a233136-7248-4994-a3d6-0108bbf72fef" (UID: "6a233136-7248-4994-a3d6-0108bbf72fef"). InnerVolumeSpecName "ceilometer-ipmi-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:12:09 crc kubenswrapper[5004]: I1201 09:12:09.570378 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a233136-7248-4994-a3d6-0108bbf72fef-ceilometer-ipmi-config-data-2" (OuterVolumeSpecName: "ceilometer-ipmi-config-data-2") pod "6a233136-7248-4994-a3d6-0108bbf72fef" (UID: "6a233136-7248-4994-a3d6-0108bbf72fef"). InnerVolumeSpecName "ceilometer-ipmi-config-data-2". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:12:09 crc kubenswrapper[5004]: I1201 09:12:09.575710 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a233136-7248-4994-a3d6-0108bbf72fef-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "6a233136-7248-4994-a3d6-0108bbf72fef" (UID: "6a233136-7248-4994-a3d6-0108bbf72fef"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:12:09 crc kubenswrapper[5004]: I1201 09:12:09.628327 5004 reconciler_common.go:293] "Volume detached for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: \"kubernetes.io/secret/6a233136-7248-4994-a3d6-0108bbf72fef-ceilometer-ipmi-config-data-0\") on node \"crc\" DevicePath \"\"" Dec 01 09:12:09 crc kubenswrapper[5004]: I1201 09:12:09.628386 5004 reconciler_common.go:293] "Volume detached for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a233136-7248-4994-a3d6-0108bbf72fef-telemetry-power-monitoring-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 09:12:09 crc kubenswrapper[5004]: I1201 09:12:09.628413 5004 reconciler_common.go:293] "Volume detached for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/6a233136-7248-4994-a3d6-0108bbf72fef-ceilometer-ipmi-config-data-2\") on node \"crc\" DevicePath \"\"" Dec 01 09:12:09 crc kubenswrapper[5004]: I1201 09:12:09.628435 5004 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6a233136-7248-4994-a3d6-0108bbf72fef-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 09:12:09 crc kubenswrapper[5004]: I1201 09:12:09.628455 5004 reconciler_common.go:293] "Volume detached for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: \"kubernetes.io/secret/6a233136-7248-4994-a3d6-0108bbf72fef-ceilometer-ipmi-config-data-1\") on node \"crc\" DevicePath \"\"" Dec 01 09:12:09 crc kubenswrapper[5004]: I1201 09:12:09.628478 5004 
reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6a233136-7248-4994-a3d6-0108bbf72fef-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 09:12:09 crc kubenswrapper[5004]: I1201 09:12:09.628496 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-274zt\" (UniqueName: \"kubernetes.io/projected/6a233136-7248-4994-a3d6-0108bbf72fef-kube-api-access-274zt\") on node \"crc\" DevicePath \"\"" Dec 01 09:12:10 crc kubenswrapper[5004]: I1201 09:12:10.007321 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-rs57q" event={"ID":"6a233136-7248-4994-a3d6-0108bbf72fef","Type":"ContainerDied","Data":"833995a0f7b26bc5757530b1801b4f4d3ff79ba57d3d8be31a528cf2c743e8c3"} Dec 01 09:12:10 crc kubenswrapper[5004]: I1201 09:12:10.007362 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="833995a0f7b26bc5757530b1801b4f4d3ff79ba57d3d8be31a528cf2c743e8c3" Dec 01 09:12:10 crc kubenswrapper[5004]: I1201 09:12:10.007395 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-rs57q" Dec 01 09:12:10 crc kubenswrapper[5004]: I1201 09:12:10.127430 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/logging-edpm-deployment-openstack-edpm-ipam-zrjqn"] Dec 01 09:12:10 crc kubenswrapper[5004]: E1201 09:12:10.127930 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01e64821-a32d-475f-b7d7-267c92533a55" containerName="extract-utilities" Dec 01 09:12:10 crc kubenswrapper[5004]: I1201 09:12:10.127950 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="01e64821-a32d-475f-b7d7-267c92533a55" containerName="extract-utilities" Dec 01 09:12:10 crc kubenswrapper[5004]: E1201 09:12:10.127995 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01e64821-a32d-475f-b7d7-267c92533a55" containerName="extract-content" Dec 01 09:12:10 crc kubenswrapper[5004]: I1201 09:12:10.128004 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="01e64821-a32d-475f-b7d7-267c92533a55" containerName="extract-content" Dec 01 09:12:10 crc kubenswrapper[5004]: E1201 09:12:10.128022 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a233136-7248-4994-a3d6-0108bbf72fef" containerName="telemetry-power-monitoring-edpm-deployment-openstack-edpm-ipam" Dec 01 09:12:10 crc kubenswrapper[5004]: I1201 09:12:10.128029 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a233136-7248-4994-a3d6-0108bbf72fef" containerName="telemetry-power-monitoring-edpm-deployment-openstack-edpm-ipam" Dec 01 09:12:10 crc kubenswrapper[5004]: E1201 09:12:10.128050 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01e64821-a32d-475f-b7d7-267c92533a55" containerName="registry-server" Dec 01 09:12:10 crc kubenswrapper[5004]: I1201 09:12:10.128056 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="01e64821-a32d-475f-b7d7-267c92533a55" containerName="registry-server" Dec 01 09:12:10 crc 
kubenswrapper[5004]: I1201 09:12:10.128283 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a233136-7248-4994-a3d6-0108bbf72fef" containerName="telemetry-power-monitoring-edpm-deployment-openstack-edpm-ipam" Dec 01 09:12:10 crc kubenswrapper[5004]: I1201 09:12:10.128307 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="01e64821-a32d-475f-b7d7-267c92533a55" containerName="registry-server" Dec 01 09:12:10 crc kubenswrapper[5004]: I1201 09:12:10.129098 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-zrjqn" Dec 01 09:12:10 crc kubenswrapper[5004]: I1201 09:12:10.131304 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"logging-compute-config-data" Dec 01 09:12:10 crc kubenswrapper[5004]: I1201 09:12:10.131626 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 09:12:10 crc kubenswrapper[5004]: I1201 09:12:10.131897 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-pdnrq" Dec 01 09:12:10 crc kubenswrapper[5004]: I1201 09:12:10.132057 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 09:12:10 crc kubenswrapper[5004]: I1201 09:12:10.132225 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 09:12:10 crc kubenswrapper[5004]: I1201 09:12:10.149978 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/logging-edpm-deployment-openstack-edpm-ipam-zrjqn"] Dec 01 09:12:10 crc kubenswrapper[5004]: I1201 09:12:10.254347 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-compute-config-data-0\" (UniqueName: 
\"kubernetes.io/secret/39f7f17f-adaa-4dd0-a59a-b40406b85353-logging-compute-config-data-0\") pod \"logging-edpm-deployment-openstack-edpm-ipam-zrjqn\" (UID: \"39f7f17f-adaa-4dd0-a59a-b40406b85353\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-zrjqn" Dec 01 09:12:10 crc kubenswrapper[5004]: I1201 09:12:10.255115 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/39f7f17f-adaa-4dd0-a59a-b40406b85353-inventory\") pod \"logging-edpm-deployment-openstack-edpm-ipam-zrjqn\" (UID: \"39f7f17f-adaa-4dd0-a59a-b40406b85353\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-zrjqn" Dec 01 09:12:10 crc kubenswrapper[5004]: I1201 09:12:10.255200 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/39f7f17f-adaa-4dd0-a59a-b40406b85353-ssh-key\") pod \"logging-edpm-deployment-openstack-edpm-ipam-zrjqn\" (UID: \"39f7f17f-adaa-4dd0-a59a-b40406b85353\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-zrjqn" Dec 01 09:12:10 crc kubenswrapper[5004]: I1201 09:12:10.255289 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/39f7f17f-adaa-4dd0-a59a-b40406b85353-logging-compute-config-data-1\") pod \"logging-edpm-deployment-openstack-edpm-ipam-zrjqn\" (UID: \"39f7f17f-adaa-4dd0-a59a-b40406b85353\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-zrjqn" Dec 01 09:12:10 crc kubenswrapper[5004]: I1201 09:12:10.255460 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mh5pl\" (UniqueName: \"kubernetes.io/projected/39f7f17f-adaa-4dd0-a59a-b40406b85353-kube-api-access-mh5pl\") pod \"logging-edpm-deployment-openstack-edpm-ipam-zrjqn\" (UID: 
\"39f7f17f-adaa-4dd0-a59a-b40406b85353\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-zrjqn" Dec 01 09:12:10 crc kubenswrapper[5004]: I1201 09:12:10.357901 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/39f7f17f-adaa-4dd0-a59a-b40406b85353-logging-compute-config-data-0\") pod \"logging-edpm-deployment-openstack-edpm-ipam-zrjqn\" (UID: \"39f7f17f-adaa-4dd0-a59a-b40406b85353\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-zrjqn" Dec 01 09:12:10 crc kubenswrapper[5004]: I1201 09:12:10.358073 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/39f7f17f-adaa-4dd0-a59a-b40406b85353-inventory\") pod \"logging-edpm-deployment-openstack-edpm-ipam-zrjqn\" (UID: \"39f7f17f-adaa-4dd0-a59a-b40406b85353\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-zrjqn" Dec 01 09:12:10 crc kubenswrapper[5004]: I1201 09:12:10.358107 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/39f7f17f-adaa-4dd0-a59a-b40406b85353-ssh-key\") pod \"logging-edpm-deployment-openstack-edpm-ipam-zrjqn\" (UID: \"39f7f17f-adaa-4dd0-a59a-b40406b85353\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-zrjqn" Dec 01 09:12:10 crc kubenswrapper[5004]: I1201 09:12:10.358150 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/39f7f17f-adaa-4dd0-a59a-b40406b85353-logging-compute-config-data-1\") pod \"logging-edpm-deployment-openstack-edpm-ipam-zrjqn\" (UID: \"39f7f17f-adaa-4dd0-a59a-b40406b85353\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-zrjqn" Dec 01 09:12:10 crc kubenswrapper[5004]: I1201 09:12:10.358198 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-mh5pl\" (UniqueName: \"kubernetes.io/projected/39f7f17f-adaa-4dd0-a59a-b40406b85353-kube-api-access-mh5pl\") pod \"logging-edpm-deployment-openstack-edpm-ipam-zrjqn\" (UID: \"39f7f17f-adaa-4dd0-a59a-b40406b85353\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-zrjqn" Dec 01 09:12:10 crc kubenswrapper[5004]: I1201 09:12:10.361937 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/39f7f17f-adaa-4dd0-a59a-b40406b85353-logging-compute-config-data-1\") pod \"logging-edpm-deployment-openstack-edpm-ipam-zrjqn\" (UID: \"39f7f17f-adaa-4dd0-a59a-b40406b85353\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-zrjqn" Dec 01 09:12:10 crc kubenswrapper[5004]: I1201 09:12:10.362333 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/39f7f17f-adaa-4dd0-a59a-b40406b85353-inventory\") pod \"logging-edpm-deployment-openstack-edpm-ipam-zrjqn\" (UID: \"39f7f17f-adaa-4dd0-a59a-b40406b85353\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-zrjqn" Dec 01 09:12:10 crc kubenswrapper[5004]: I1201 09:12:10.362459 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/39f7f17f-adaa-4dd0-a59a-b40406b85353-logging-compute-config-data-0\") pod \"logging-edpm-deployment-openstack-edpm-ipam-zrjqn\" (UID: \"39f7f17f-adaa-4dd0-a59a-b40406b85353\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-zrjqn" Dec 01 09:12:10 crc kubenswrapper[5004]: I1201 09:12:10.374196 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/39f7f17f-adaa-4dd0-a59a-b40406b85353-ssh-key\") pod \"logging-edpm-deployment-openstack-edpm-ipam-zrjqn\" (UID: \"39f7f17f-adaa-4dd0-a59a-b40406b85353\") " 
pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-zrjqn" Dec 01 09:12:10 crc kubenswrapper[5004]: I1201 09:12:10.374804 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mh5pl\" (UniqueName: \"kubernetes.io/projected/39f7f17f-adaa-4dd0-a59a-b40406b85353-kube-api-access-mh5pl\") pod \"logging-edpm-deployment-openstack-edpm-ipam-zrjqn\" (UID: \"39f7f17f-adaa-4dd0-a59a-b40406b85353\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-zrjqn" Dec 01 09:12:10 crc kubenswrapper[5004]: I1201 09:12:10.461855 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-zrjqn" Dec 01 09:12:11 crc kubenswrapper[5004]: I1201 09:12:11.034043 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/logging-edpm-deployment-openstack-edpm-ipam-zrjqn"] Dec 01 09:12:12 crc kubenswrapper[5004]: I1201 09:12:12.035451 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-zrjqn" event={"ID":"39f7f17f-adaa-4dd0-a59a-b40406b85353","Type":"ContainerStarted","Data":"b1742470acffde8e82f555a8f1921adf1e4f16ba1a682850e4c0e907abe81ae6"} Dec 01 09:12:12 crc kubenswrapper[5004]: I1201 09:12:12.035816 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-zrjqn" event={"ID":"39f7f17f-adaa-4dd0-a59a-b40406b85353","Type":"ContainerStarted","Data":"f0ba12e67a10fbd50dd75a221d7a5d42ed4e87ee58f6b3a869c074f749e12b29"} Dec 01 09:12:12 crc kubenswrapper[5004]: I1201 09:12:12.067173 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-zrjqn" podStartSLOduration=1.365165847 podStartE2EDuration="2.06715672s" podCreationTimestamp="2025-12-01 09:12:10 +0000 UTC" firstStartedPulling="2025-12-01 09:12:11.025479957 +0000 UTC m=+3308.590471939" lastFinishedPulling="2025-12-01 
09:12:11.72747083 +0000 UTC m=+3309.292462812" observedRunningTime="2025-12-01 09:12:12.052338478 +0000 UTC m=+3309.617330460" watchObservedRunningTime="2025-12-01 09:12:12.06715672 +0000 UTC m=+3309.632148702" Dec 01 09:12:22 crc kubenswrapper[5004]: I1201 09:12:22.774878 5004 scope.go:117] "RemoveContainer" containerID="959c4dd48a6a01a111184092f5278271df46b0993f7d823dbccdaf0fe68ac515" Dec 01 09:12:22 crc kubenswrapper[5004]: E1201 09:12:22.776233 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 09:12:29 crc kubenswrapper[5004]: I1201 09:12:29.251996 5004 generic.go:334] "Generic (PLEG): container finished" podID="39f7f17f-adaa-4dd0-a59a-b40406b85353" containerID="b1742470acffde8e82f555a8f1921adf1e4f16ba1a682850e4c0e907abe81ae6" exitCode=0 Dec 01 09:12:29 crc kubenswrapper[5004]: I1201 09:12:29.252090 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-zrjqn" event={"ID":"39f7f17f-adaa-4dd0-a59a-b40406b85353","Type":"ContainerDied","Data":"b1742470acffde8e82f555a8f1921adf1e4f16ba1a682850e4c0e907abe81ae6"} Dec 01 09:12:30 crc kubenswrapper[5004]: I1201 09:12:30.877980 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-zrjqn" Dec 01 09:12:31 crc kubenswrapper[5004]: I1201 09:12:31.009009 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mh5pl\" (UniqueName: \"kubernetes.io/projected/39f7f17f-adaa-4dd0-a59a-b40406b85353-kube-api-access-mh5pl\") pod \"39f7f17f-adaa-4dd0-a59a-b40406b85353\" (UID: \"39f7f17f-adaa-4dd0-a59a-b40406b85353\") " Dec 01 09:12:31 crc kubenswrapper[5004]: I1201 09:12:31.009072 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/39f7f17f-adaa-4dd0-a59a-b40406b85353-logging-compute-config-data-1\") pod \"39f7f17f-adaa-4dd0-a59a-b40406b85353\" (UID: \"39f7f17f-adaa-4dd0-a59a-b40406b85353\") " Dec 01 09:12:31 crc kubenswrapper[5004]: I1201 09:12:31.009150 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/39f7f17f-adaa-4dd0-a59a-b40406b85353-inventory\") pod \"39f7f17f-adaa-4dd0-a59a-b40406b85353\" (UID: \"39f7f17f-adaa-4dd0-a59a-b40406b85353\") " Dec 01 09:12:31 crc kubenswrapper[5004]: I1201 09:12:31.009226 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/39f7f17f-adaa-4dd0-a59a-b40406b85353-ssh-key\") pod \"39f7f17f-adaa-4dd0-a59a-b40406b85353\" (UID: \"39f7f17f-adaa-4dd0-a59a-b40406b85353\") " Dec 01 09:12:31 crc kubenswrapper[5004]: I1201 09:12:31.009377 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/39f7f17f-adaa-4dd0-a59a-b40406b85353-logging-compute-config-data-0\") pod \"39f7f17f-adaa-4dd0-a59a-b40406b85353\" (UID: \"39f7f17f-adaa-4dd0-a59a-b40406b85353\") " Dec 01 09:12:31 crc kubenswrapper[5004]: I1201 09:12:31.014627 5004 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39f7f17f-adaa-4dd0-a59a-b40406b85353-kube-api-access-mh5pl" (OuterVolumeSpecName: "kube-api-access-mh5pl") pod "39f7f17f-adaa-4dd0-a59a-b40406b85353" (UID: "39f7f17f-adaa-4dd0-a59a-b40406b85353"). InnerVolumeSpecName "kube-api-access-mh5pl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:12:31 crc kubenswrapper[5004]: I1201 09:12:31.038638 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39f7f17f-adaa-4dd0-a59a-b40406b85353-logging-compute-config-data-1" (OuterVolumeSpecName: "logging-compute-config-data-1") pod "39f7f17f-adaa-4dd0-a59a-b40406b85353" (UID: "39f7f17f-adaa-4dd0-a59a-b40406b85353"). InnerVolumeSpecName "logging-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:12:31 crc kubenswrapper[5004]: I1201 09:12:31.044197 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39f7f17f-adaa-4dd0-a59a-b40406b85353-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "39f7f17f-adaa-4dd0-a59a-b40406b85353" (UID: "39f7f17f-adaa-4dd0-a59a-b40406b85353"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:12:31 crc kubenswrapper[5004]: I1201 09:12:31.054783 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39f7f17f-adaa-4dd0-a59a-b40406b85353-logging-compute-config-data-0" (OuterVolumeSpecName: "logging-compute-config-data-0") pod "39f7f17f-adaa-4dd0-a59a-b40406b85353" (UID: "39f7f17f-adaa-4dd0-a59a-b40406b85353"). InnerVolumeSpecName "logging-compute-config-data-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:12:31 crc kubenswrapper[5004]: I1201 09:12:31.061736 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39f7f17f-adaa-4dd0-a59a-b40406b85353-inventory" (OuterVolumeSpecName: "inventory") pod "39f7f17f-adaa-4dd0-a59a-b40406b85353" (UID: "39f7f17f-adaa-4dd0-a59a-b40406b85353"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:12:31 crc kubenswrapper[5004]: I1201 09:12:31.111945 5004 reconciler_common.go:293] "Volume detached for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/39f7f17f-adaa-4dd0-a59a-b40406b85353-logging-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Dec 01 09:12:31 crc kubenswrapper[5004]: I1201 09:12:31.111976 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mh5pl\" (UniqueName: \"kubernetes.io/projected/39f7f17f-adaa-4dd0-a59a-b40406b85353-kube-api-access-mh5pl\") on node \"crc\" DevicePath \"\"" Dec 01 09:12:31 crc kubenswrapper[5004]: I1201 09:12:31.111986 5004 reconciler_common.go:293] "Volume detached for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/39f7f17f-adaa-4dd0-a59a-b40406b85353-logging-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Dec 01 09:12:31 crc kubenswrapper[5004]: I1201 09:12:31.111994 5004 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/39f7f17f-adaa-4dd0-a59a-b40406b85353-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 09:12:31 crc kubenswrapper[5004]: I1201 09:12:31.112006 5004 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/39f7f17f-adaa-4dd0-a59a-b40406b85353-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 09:12:31 crc kubenswrapper[5004]: I1201 09:12:31.281174 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-zrjqn" event={"ID":"39f7f17f-adaa-4dd0-a59a-b40406b85353","Type":"ContainerDied","Data":"f0ba12e67a10fbd50dd75a221d7a5d42ed4e87ee58f6b3a869c074f749e12b29"} Dec 01 09:12:31 crc kubenswrapper[5004]: I1201 09:12:31.281255 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f0ba12e67a10fbd50dd75a221d7a5d42ed4e87ee58f6b3a869c074f749e12b29" Dec 01 09:12:31 crc kubenswrapper[5004]: I1201 09:12:31.281368 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-zrjqn" Dec 01 09:12:34 crc kubenswrapper[5004]: I1201 09:12:34.759870 5004 scope.go:117] "RemoveContainer" containerID="959c4dd48a6a01a111184092f5278271df46b0993f7d823dbccdaf0fe68ac515" Dec 01 09:12:34 crc kubenswrapper[5004]: E1201 09:12:34.760549 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 09:12:48 crc kubenswrapper[5004]: I1201 09:12:48.758916 5004 scope.go:117] "RemoveContainer" containerID="959c4dd48a6a01a111184092f5278271df46b0993f7d823dbccdaf0fe68ac515" Dec 01 09:12:48 crc kubenswrapper[5004]: E1201 09:12:48.759653 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 
09:12:59 crc kubenswrapper[5004]: I1201 09:12:59.759176 5004 scope.go:117] "RemoveContainer" containerID="959c4dd48a6a01a111184092f5278271df46b0993f7d823dbccdaf0fe68ac515" Dec 01 09:12:59 crc kubenswrapper[5004]: E1201 09:12:59.759972 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 09:13:14 crc kubenswrapper[5004]: I1201 09:13:14.760026 5004 scope.go:117] "RemoveContainer" containerID="959c4dd48a6a01a111184092f5278271df46b0993f7d823dbccdaf0fe68ac515" Dec 01 09:13:14 crc kubenswrapper[5004]: E1201 09:13:14.760834 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 09:13:27 crc kubenswrapper[5004]: I1201 09:13:27.758303 5004 scope.go:117] "RemoveContainer" containerID="959c4dd48a6a01a111184092f5278271df46b0993f7d823dbccdaf0fe68ac515" Dec 01 09:13:27 crc kubenswrapper[5004]: E1201 09:13:27.759188 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" 
podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 09:13:41 crc kubenswrapper[5004]: I1201 09:13:41.761378 5004 scope.go:117] "RemoveContainer" containerID="959c4dd48a6a01a111184092f5278271df46b0993f7d823dbccdaf0fe68ac515" Dec 01 09:13:42 crc kubenswrapper[5004]: I1201 09:13:42.195595 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" event={"ID":"f9977ebb-82de-4e96-8763-0b5a84f8d4ce","Type":"ContainerStarted","Data":"67314238aaf836d37dc206c647fc8d6f25a8091a20d766f75fb40dfdc2efc058"} Dec 01 09:14:48 crc kubenswrapper[5004]: I1201 09:14:48.296180 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-l2lxc"] Dec 01 09:14:48 crc kubenswrapper[5004]: E1201 09:14:48.298025 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39f7f17f-adaa-4dd0-a59a-b40406b85353" containerName="logging-edpm-deployment-openstack-edpm-ipam" Dec 01 09:14:48 crc kubenswrapper[5004]: I1201 09:14:48.298045 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="39f7f17f-adaa-4dd0-a59a-b40406b85353" containerName="logging-edpm-deployment-openstack-edpm-ipam" Dec 01 09:14:48 crc kubenswrapper[5004]: I1201 09:14:48.298422 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="39f7f17f-adaa-4dd0-a59a-b40406b85353" containerName="logging-edpm-deployment-openstack-edpm-ipam" Dec 01 09:14:48 crc kubenswrapper[5004]: I1201 09:14:48.303437 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-l2lxc" Dec 01 09:14:48 crc kubenswrapper[5004]: I1201 09:14:48.330934 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-l2lxc"] Dec 01 09:14:48 crc kubenswrapper[5004]: I1201 09:14:48.440630 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/475a4ad9-7e12-4521-bd4a-83b31e2d08a4-utilities\") pod \"community-operators-l2lxc\" (UID: \"475a4ad9-7e12-4521-bd4a-83b31e2d08a4\") " pod="openshift-marketplace/community-operators-l2lxc" Dec 01 09:14:48 crc kubenswrapper[5004]: I1201 09:14:48.441011 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/475a4ad9-7e12-4521-bd4a-83b31e2d08a4-catalog-content\") pod \"community-operators-l2lxc\" (UID: \"475a4ad9-7e12-4521-bd4a-83b31e2d08a4\") " pod="openshift-marketplace/community-operators-l2lxc" Dec 01 09:14:48 crc kubenswrapper[5004]: I1201 09:14:48.441063 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdt45\" (UniqueName: \"kubernetes.io/projected/475a4ad9-7e12-4521-bd4a-83b31e2d08a4-kube-api-access-fdt45\") pod \"community-operators-l2lxc\" (UID: \"475a4ad9-7e12-4521-bd4a-83b31e2d08a4\") " pod="openshift-marketplace/community-operators-l2lxc" Dec 01 09:14:48 crc kubenswrapper[5004]: I1201 09:14:48.543056 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/475a4ad9-7e12-4521-bd4a-83b31e2d08a4-utilities\") pod \"community-operators-l2lxc\" (UID: \"475a4ad9-7e12-4521-bd4a-83b31e2d08a4\") " pod="openshift-marketplace/community-operators-l2lxc" Dec 01 09:14:48 crc kubenswrapper[5004]: I1201 09:14:48.543179 5004 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/475a4ad9-7e12-4521-bd4a-83b31e2d08a4-catalog-content\") pod \"community-operators-l2lxc\" (UID: \"475a4ad9-7e12-4521-bd4a-83b31e2d08a4\") " pod="openshift-marketplace/community-operators-l2lxc" Dec 01 09:14:48 crc kubenswrapper[5004]: I1201 09:14:48.543230 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdt45\" (UniqueName: \"kubernetes.io/projected/475a4ad9-7e12-4521-bd4a-83b31e2d08a4-kube-api-access-fdt45\") pod \"community-operators-l2lxc\" (UID: \"475a4ad9-7e12-4521-bd4a-83b31e2d08a4\") " pod="openshift-marketplace/community-operators-l2lxc" Dec 01 09:14:48 crc kubenswrapper[5004]: I1201 09:14:48.543646 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/475a4ad9-7e12-4521-bd4a-83b31e2d08a4-catalog-content\") pod \"community-operators-l2lxc\" (UID: \"475a4ad9-7e12-4521-bd4a-83b31e2d08a4\") " pod="openshift-marketplace/community-operators-l2lxc" Dec 01 09:14:48 crc kubenswrapper[5004]: I1201 09:14:48.543676 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/475a4ad9-7e12-4521-bd4a-83b31e2d08a4-utilities\") pod \"community-operators-l2lxc\" (UID: \"475a4ad9-7e12-4521-bd4a-83b31e2d08a4\") " pod="openshift-marketplace/community-operators-l2lxc" Dec 01 09:14:48 crc kubenswrapper[5004]: I1201 09:14:48.563763 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdt45\" (UniqueName: \"kubernetes.io/projected/475a4ad9-7e12-4521-bd4a-83b31e2d08a4-kube-api-access-fdt45\") pod \"community-operators-l2lxc\" (UID: \"475a4ad9-7e12-4521-bd4a-83b31e2d08a4\") " pod="openshift-marketplace/community-operators-l2lxc" Dec 01 09:14:48 crc kubenswrapper[5004]: I1201 09:14:48.650344 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-l2lxc" Dec 01 09:14:49 crc kubenswrapper[5004]: I1201 09:14:49.188182 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-l2lxc"] Dec 01 09:14:50 crc kubenswrapper[5004]: I1201 09:14:50.091309 5004 generic.go:334] "Generic (PLEG): container finished" podID="475a4ad9-7e12-4521-bd4a-83b31e2d08a4" containerID="578064903e4690f5845b6351733e274e95040fa407dc9547dd0f76b5cda4a6af" exitCode=0 Dec 01 09:14:50 crc kubenswrapper[5004]: I1201 09:14:50.091720 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l2lxc" event={"ID":"475a4ad9-7e12-4521-bd4a-83b31e2d08a4","Type":"ContainerDied","Data":"578064903e4690f5845b6351733e274e95040fa407dc9547dd0f76b5cda4a6af"} Dec 01 09:14:50 crc kubenswrapper[5004]: I1201 09:14:50.091905 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l2lxc" event={"ID":"475a4ad9-7e12-4521-bd4a-83b31e2d08a4","Type":"ContainerStarted","Data":"e1534d400bfeb5163eaa11961758d956f478265ce903fdcb1f0eb1f541e31e9f"} Dec 01 09:14:50 crc kubenswrapper[5004]: I1201 09:14:50.097264 5004 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 01 09:14:52 crc kubenswrapper[5004]: I1201 09:14:52.124039 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l2lxc" event={"ID":"475a4ad9-7e12-4521-bd4a-83b31e2d08a4","Type":"ContainerStarted","Data":"8b8929029434e0ec20c7eb8dc7775a7b13a5729645d838cc42e3daee901f4329"} Dec 01 09:14:53 crc kubenswrapper[5004]: I1201 09:14:53.142970 5004 generic.go:334] "Generic (PLEG): container finished" podID="475a4ad9-7e12-4521-bd4a-83b31e2d08a4" containerID="8b8929029434e0ec20c7eb8dc7775a7b13a5729645d838cc42e3daee901f4329" exitCode=0 Dec 01 09:14:53 crc kubenswrapper[5004]: I1201 09:14:53.143088 5004 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/community-operators-l2lxc" event={"ID":"475a4ad9-7e12-4521-bd4a-83b31e2d08a4","Type":"ContainerDied","Data":"8b8929029434e0ec20c7eb8dc7775a7b13a5729645d838cc42e3daee901f4329"} Dec 01 09:14:55 crc kubenswrapper[5004]: I1201 09:14:55.189720 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l2lxc" event={"ID":"475a4ad9-7e12-4521-bd4a-83b31e2d08a4","Type":"ContainerStarted","Data":"4c09418da6e73bdcc0ca8c5b94515b6120d371d23547f640b5b14b0063cc4ba3"} Dec 01 09:14:55 crc kubenswrapper[5004]: I1201 09:14:55.220114 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-l2lxc" podStartSLOduration=3.322485371 podStartE2EDuration="7.22008735s" podCreationTimestamp="2025-12-01 09:14:48 +0000 UTC" firstStartedPulling="2025-12-01 09:14:50.096907456 +0000 UTC m=+3467.661899458" lastFinishedPulling="2025-12-01 09:14:53.994509445 +0000 UTC m=+3471.559501437" observedRunningTime="2025-12-01 09:14:55.219768772 +0000 UTC m=+3472.784760764" watchObservedRunningTime="2025-12-01 09:14:55.22008735 +0000 UTC m=+3472.785079372" Dec 01 09:14:58 crc kubenswrapper[5004]: I1201 09:14:58.651312 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-l2lxc" Dec 01 09:14:58 crc kubenswrapper[5004]: I1201 09:14:58.651871 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-l2lxc" Dec 01 09:14:59 crc kubenswrapper[5004]: I1201 09:14:59.715886 5004 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-l2lxc" podUID="475a4ad9-7e12-4521-bd4a-83b31e2d08a4" containerName="registry-server" probeResult="failure" output=< Dec 01 09:14:59 crc kubenswrapper[5004]: timeout: failed to connect service ":50051" within 1s Dec 01 09:14:59 crc kubenswrapper[5004]: > Dec 01 09:15:00 crc 
kubenswrapper[5004]: I1201 09:15:00.161277 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409675-tldzl"] Dec 01 09:15:00 crc kubenswrapper[5004]: I1201 09:15:00.164110 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409675-tldzl" Dec 01 09:15:00 crc kubenswrapper[5004]: I1201 09:15:00.166703 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 01 09:15:00 crc kubenswrapper[5004]: I1201 09:15:00.166728 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 01 09:15:00 crc kubenswrapper[5004]: I1201 09:15:00.199760 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409675-tldzl"] Dec 01 09:15:00 crc kubenswrapper[5004]: I1201 09:15:00.247152 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5aaf7b51-fbd4-40e4-a5a1-3281a372a6cd-secret-volume\") pod \"collect-profiles-29409675-tldzl\" (UID: \"5aaf7b51-fbd4-40e4-a5a1-3281a372a6cd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409675-tldzl" Dec 01 09:15:00 crc kubenswrapper[5004]: I1201 09:15:00.247319 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tcln\" (UniqueName: \"kubernetes.io/projected/5aaf7b51-fbd4-40e4-a5a1-3281a372a6cd-kube-api-access-7tcln\") pod \"collect-profiles-29409675-tldzl\" (UID: \"5aaf7b51-fbd4-40e4-a5a1-3281a372a6cd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409675-tldzl" Dec 01 09:15:00 crc kubenswrapper[5004]: I1201 09:15:00.247872 5004 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5aaf7b51-fbd4-40e4-a5a1-3281a372a6cd-config-volume\") pod \"collect-profiles-29409675-tldzl\" (UID: \"5aaf7b51-fbd4-40e4-a5a1-3281a372a6cd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409675-tldzl" Dec 01 09:15:00 crc kubenswrapper[5004]: I1201 09:15:00.350828 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5aaf7b51-fbd4-40e4-a5a1-3281a372a6cd-secret-volume\") pod \"collect-profiles-29409675-tldzl\" (UID: \"5aaf7b51-fbd4-40e4-a5a1-3281a372a6cd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409675-tldzl" Dec 01 09:15:00 crc kubenswrapper[5004]: I1201 09:15:00.351215 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7tcln\" (UniqueName: \"kubernetes.io/projected/5aaf7b51-fbd4-40e4-a5a1-3281a372a6cd-kube-api-access-7tcln\") pod \"collect-profiles-29409675-tldzl\" (UID: \"5aaf7b51-fbd4-40e4-a5a1-3281a372a6cd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409675-tldzl" Dec 01 09:15:00 crc kubenswrapper[5004]: I1201 09:15:00.351482 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5aaf7b51-fbd4-40e4-a5a1-3281a372a6cd-config-volume\") pod \"collect-profiles-29409675-tldzl\" (UID: \"5aaf7b51-fbd4-40e4-a5a1-3281a372a6cd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409675-tldzl" Dec 01 09:15:00 crc kubenswrapper[5004]: I1201 09:15:00.352964 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5aaf7b51-fbd4-40e4-a5a1-3281a372a6cd-config-volume\") pod \"collect-profiles-29409675-tldzl\" (UID: \"5aaf7b51-fbd4-40e4-a5a1-3281a372a6cd\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29409675-tldzl" Dec 01 09:15:00 crc kubenswrapper[5004]: I1201 09:15:00.360449 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5aaf7b51-fbd4-40e4-a5a1-3281a372a6cd-secret-volume\") pod \"collect-profiles-29409675-tldzl\" (UID: \"5aaf7b51-fbd4-40e4-a5a1-3281a372a6cd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409675-tldzl" Dec 01 09:15:00 crc kubenswrapper[5004]: I1201 09:15:00.376545 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tcln\" (UniqueName: \"kubernetes.io/projected/5aaf7b51-fbd4-40e4-a5a1-3281a372a6cd-kube-api-access-7tcln\") pod \"collect-profiles-29409675-tldzl\" (UID: \"5aaf7b51-fbd4-40e4-a5a1-3281a372a6cd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409675-tldzl" Dec 01 09:15:00 crc kubenswrapper[5004]: I1201 09:15:00.508262 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409675-tldzl" Dec 01 09:15:01 crc kubenswrapper[5004]: I1201 09:15:01.066267 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409675-tldzl"] Dec 01 09:15:01 crc kubenswrapper[5004]: I1201 09:15:01.272460 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409675-tldzl" event={"ID":"5aaf7b51-fbd4-40e4-a5a1-3281a372a6cd","Type":"ContainerStarted","Data":"4eb93845d88b914f62dd435ec1b3daa3d5ab2c84bf19cbc61f368485d933518c"} Dec 01 09:15:01 crc kubenswrapper[5004]: E1201 09:15:01.637351 5004 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.75:43390->38.102.83.75:44201: read tcp 38.102.83.75:43390->38.102.83.75:44201: read: connection reset by peer Dec 01 09:15:02 crc kubenswrapper[5004]: I1201 09:15:02.283485 5004 generic.go:334] "Generic (PLEG): container finished" podID="5aaf7b51-fbd4-40e4-a5a1-3281a372a6cd" containerID="00f8e665d3ea8ff52f1d723ba72d8f24f89eb5d9061d051ff36178dbc208ad02" exitCode=0 Dec 01 09:15:02 crc kubenswrapper[5004]: I1201 09:15:02.283526 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409675-tldzl" event={"ID":"5aaf7b51-fbd4-40e4-a5a1-3281a372a6cd","Type":"ContainerDied","Data":"00f8e665d3ea8ff52f1d723ba72d8f24f89eb5d9061d051ff36178dbc208ad02"} Dec 01 09:15:03 crc kubenswrapper[5004]: I1201 09:15:03.744328 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409675-tldzl" Dec 01 09:15:03 crc kubenswrapper[5004]: I1201 09:15:03.858202 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5aaf7b51-fbd4-40e4-a5a1-3281a372a6cd-config-volume\") pod \"5aaf7b51-fbd4-40e4-a5a1-3281a372a6cd\" (UID: \"5aaf7b51-fbd4-40e4-a5a1-3281a372a6cd\") " Dec 01 09:15:03 crc kubenswrapper[5004]: I1201 09:15:03.858571 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7tcln\" (UniqueName: \"kubernetes.io/projected/5aaf7b51-fbd4-40e4-a5a1-3281a372a6cd-kube-api-access-7tcln\") pod \"5aaf7b51-fbd4-40e4-a5a1-3281a372a6cd\" (UID: \"5aaf7b51-fbd4-40e4-a5a1-3281a372a6cd\") " Dec 01 09:15:03 crc kubenswrapper[5004]: I1201 09:15:03.859278 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5aaf7b51-fbd4-40e4-a5a1-3281a372a6cd-config-volume" (OuterVolumeSpecName: "config-volume") pod "5aaf7b51-fbd4-40e4-a5a1-3281a372a6cd" (UID: "5aaf7b51-fbd4-40e4-a5a1-3281a372a6cd"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:15:03 crc kubenswrapper[5004]: I1201 09:15:03.859520 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5aaf7b51-fbd4-40e4-a5a1-3281a372a6cd-secret-volume\") pod \"5aaf7b51-fbd4-40e4-a5a1-3281a372a6cd\" (UID: \"5aaf7b51-fbd4-40e4-a5a1-3281a372a6cd\") " Dec 01 09:15:03 crc kubenswrapper[5004]: I1201 09:15:03.860430 5004 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5aaf7b51-fbd4-40e4-a5a1-3281a372a6cd-config-volume\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:03 crc kubenswrapper[5004]: I1201 09:15:03.864829 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5aaf7b51-fbd4-40e4-a5a1-3281a372a6cd-kube-api-access-7tcln" (OuterVolumeSpecName: "kube-api-access-7tcln") pod "5aaf7b51-fbd4-40e4-a5a1-3281a372a6cd" (UID: "5aaf7b51-fbd4-40e4-a5a1-3281a372a6cd"). InnerVolumeSpecName "kube-api-access-7tcln". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:15:03 crc kubenswrapper[5004]: I1201 09:15:03.865876 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5aaf7b51-fbd4-40e4-a5a1-3281a372a6cd-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "5aaf7b51-fbd4-40e4-a5a1-3281a372a6cd" (UID: "5aaf7b51-fbd4-40e4-a5a1-3281a372a6cd"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:15:03 crc kubenswrapper[5004]: I1201 09:15:03.964808 5004 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5aaf7b51-fbd4-40e4-a5a1-3281a372a6cd-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:03 crc kubenswrapper[5004]: I1201 09:15:03.964862 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7tcln\" (UniqueName: \"kubernetes.io/projected/5aaf7b51-fbd4-40e4-a5a1-3281a372a6cd-kube-api-access-7tcln\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:04 crc kubenswrapper[5004]: I1201 09:15:04.307391 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409675-tldzl" event={"ID":"5aaf7b51-fbd4-40e4-a5a1-3281a372a6cd","Type":"ContainerDied","Data":"4eb93845d88b914f62dd435ec1b3daa3d5ab2c84bf19cbc61f368485d933518c"} Dec 01 09:15:04 crc kubenswrapper[5004]: I1201 09:15:04.307433 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4eb93845d88b914f62dd435ec1b3daa3d5ab2c84bf19cbc61f368485d933518c" Dec 01 09:15:04 crc kubenswrapper[5004]: I1201 09:15:04.307519 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409675-tldzl" Dec 01 09:15:04 crc kubenswrapper[5004]: I1201 09:15:04.841105 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409630-7x9vn"] Dec 01 09:15:04 crc kubenswrapper[5004]: I1201 09:15:04.853641 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409630-7x9vn"] Dec 01 09:15:06 crc kubenswrapper[5004]: I1201 09:15:06.779029 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38460618-7ff3-4591-98f3-f20af1bfff60" path="/var/lib/kubelet/pods/38460618-7ff3-4591-98f3-f20af1bfff60/volumes" Dec 01 09:15:08 crc kubenswrapper[5004]: I1201 09:15:08.726740 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-l2lxc" Dec 01 09:15:08 crc kubenswrapper[5004]: I1201 09:15:08.825949 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-l2lxc" Dec 01 09:15:08 crc kubenswrapper[5004]: I1201 09:15:08.977457 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-l2lxc"] Dec 01 09:15:10 crc kubenswrapper[5004]: I1201 09:15:10.402932 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-l2lxc" podUID="475a4ad9-7e12-4521-bd4a-83b31e2d08a4" containerName="registry-server" containerID="cri-o://4c09418da6e73bdcc0ca8c5b94515b6120d371d23547f640b5b14b0063cc4ba3" gracePeriod=2 Dec 01 09:15:10 crc kubenswrapper[5004]: I1201 09:15:10.925866 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-l2lxc" Dec 01 09:15:10 crc kubenswrapper[5004]: I1201 09:15:10.955215 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/475a4ad9-7e12-4521-bd4a-83b31e2d08a4-utilities\") pod \"475a4ad9-7e12-4521-bd4a-83b31e2d08a4\" (UID: \"475a4ad9-7e12-4521-bd4a-83b31e2d08a4\") " Dec 01 09:15:10 crc kubenswrapper[5004]: I1201 09:15:10.955353 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fdt45\" (UniqueName: \"kubernetes.io/projected/475a4ad9-7e12-4521-bd4a-83b31e2d08a4-kube-api-access-fdt45\") pod \"475a4ad9-7e12-4521-bd4a-83b31e2d08a4\" (UID: \"475a4ad9-7e12-4521-bd4a-83b31e2d08a4\") " Dec 01 09:15:10 crc kubenswrapper[5004]: I1201 09:15:10.955418 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/475a4ad9-7e12-4521-bd4a-83b31e2d08a4-catalog-content\") pod \"475a4ad9-7e12-4521-bd4a-83b31e2d08a4\" (UID: \"475a4ad9-7e12-4521-bd4a-83b31e2d08a4\") " Dec 01 09:15:10 crc kubenswrapper[5004]: I1201 09:15:10.956625 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/475a4ad9-7e12-4521-bd4a-83b31e2d08a4-utilities" (OuterVolumeSpecName: "utilities") pod "475a4ad9-7e12-4521-bd4a-83b31e2d08a4" (UID: "475a4ad9-7e12-4521-bd4a-83b31e2d08a4"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:15:10 crc kubenswrapper[5004]: I1201 09:15:10.957273 5004 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/475a4ad9-7e12-4521-bd4a-83b31e2d08a4-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:10 crc kubenswrapper[5004]: I1201 09:15:10.969755 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/475a4ad9-7e12-4521-bd4a-83b31e2d08a4-kube-api-access-fdt45" (OuterVolumeSpecName: "kube-api-access-fdt45") pod "475a4ad9-7e12-4521-bd4a-83b31e2d08a4" (UID: "475a4ad9-7e12-4521-bd4a-83b31e2d08a4"). InnerVolumeSpecName "kube-api-access-fdt45". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:15:11 crc kubenswrapper[5004]: I1201 09:15:11.029086 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/475a4ad9-7e12-4521-bd4a-83b31e2d08a4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "475a4ad9-7e12-4521-bd4a-83b31e2d08a4" (UID: "475a4ad9-7e12-4521-bd4a-83b31e2d08a4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:15:11 crc kubenswrapper[5004]: I1201 09:15:11.063223 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fdt45\" (UniqueName: \"kubernetes.io/projected/475a4ad9-7e12-4521-bd4a-83b31e2d08a4-kube-api-access-fdt45\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:11 crc kubenswrapper[5004]: I1201 09:15:11.063261 5004 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/475a4ad9-7e12-4521-bd4a-83b31e2d08a4-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:11 crc kubenswrapper[5004]: I1201 09:15:11.419033 5004 generic.go:334] "Generic (PLEG): container finished" podID="475a4ad9-7e12-4521-bd4a-83b31e2d08a4" containerID="4c09418da6e73bdcc0ca8c5b94515b6120d371d23547f640b5b14b0063cc4ba3" exitCode=0 Dec 01 09:15:11 crc kubenswrapper[5004]: I1201 09:15:11.419143 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-l2lxc" Dec 01 09:15:11 crc kubenswrapper[5004]: I1201 09:15:11.419190 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l2lxc" event={"ID":"475a4ad9-7e12-4521-bd4a-83b31e2d08a4","Type":"ContainerDied","Data":"4c09418da6e73bdcc0ca8c5b94515b6120d371d23547f640b5b14b0063cc4ba3"} Dec 01 09:15:11 crc kubenswrapper[5004]: I1201 09:15:11.419656 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l2lxc" event={"ID":"475a4ad9-7e12-4521-bd4a-83b31e2d08a4","Type":"ContainerDied","Data":"e1534d400bfeb5163eaa11961758d956f478265ce903fdcb1f0eb1f541e31e9f"} Dec 01 09:15:11 crc kubenswrapper[5004]: I1201 09:15:11.419731 5004 scope.go:117] "RemoveContainer" containerID="4c09418da6e73bdcc0ca8c5b94515b6120d371d23547f640b5b14b0063cc4ba3" Dec 01 09:15:11 crc kubenswrapper[5004]: I1201 09:15:11.488152 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-l2lxc"] Dec 01 09:15:11 crc kubenswrapper[5004]: I1201 09:15:11.499862 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-l2lxc"] Dec 01 09:15:11 crc kubenswrapper[5004]: I1201 09:15:11.500822 5004 scope.go:117] "RemoveContainer" containerID="8b8929029434e0ec20c7eb8dc7775a7b13a5729645d838cc42e3daee901f4329" Dec 01 09:15:11 crc kubenswrapper[5004]: I1201 09:15:11.526100 5004 scope.go:117] "RemoveContainer" containerID="578064903e4690f5845b6351733e274e95040fa407dc9547dd0f76b5cda4a6af" Dec 01 09:15:11 crc kubenswrapper[5004]: I1201 09:15:11.591965 5004 scope.go:117] "RemoveContainer" containerID="4c09418da6e73bdcc0ca8c5b94515b6120d371d23547f640b5b14b0063cc4ba3" Dec 01 09:15:11 crc kubenswrapper[5004]: E1201 09:15:11.592661 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c09418da6e73bdcc0ca8c5b94515b6120d371d23547f640b5b14b0063cc4ba3\": container with ID starting with 4c09418da6e73bdcc0ca8c5b94515b6120d371d23547f640b5b14b0063cc4ba3 not found: ID does not exist" containerID="4c09418da6e73bdcc0ca8c5b94515b6120d371d23547f640b5b14b0063cc4ba3" Dec 01 09:15:11 crc kubenswrapper[5004]: I1201 09:15:11.592695 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c09418da6e73bdcc0ca8c5b94515b6120d371d23547f640b5b14b0063cc4ba3"} err="failed to get container status \"4c09418da6e73bdcc0ca8c5b94515b6120d371d23547f640b5b14b0063cc4ba3\": rpc error: code = NotFound desc = could not find container \"4c09418da6e73bdcc0ca8c5b94515b6120d371d23547f640b5b14b0063cc4ba3\": container with ID starting with 4c09418da6e73bdcc0ca8c5b94515b6120d371d23547f640b5b14b0063cc4ba3 not found: ID does not exist" Dec 01 09:15:11 crc kubenswrapper[5004]: I1201 09:15:11.592741 5004 scope.go:117] "RemoveContainer" 
containerID="8b8929029434e0ec20c7eb8dc7775a7b13a5729645d838cc42e3daee901f4329" Dec 01 09:15:11 crc kubenswrapper[5004]: E1201 09:15:11.593099 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b8929029434e0ec20c7eb8dc7775a7b13a5729645d838cc42e3daee901f4329\": container with ID starting with 8b8929029434e0ec20c7eb8dc7775a7b13a5729645d838cc42e3daee901f4329 not found: ID does not exist" containerID="8b8929029434e0ec20c7eb8dc7775a7b13a5729645d838cc42e3daee901f4329" Dec 01 09:15:11 crc kubenswrapper[5004]: I1201 09:15:11.593163 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b8929029434e0ec20c7eb8dc7775a7b13a5729645d838cc42e3daee901f4329"} err="failed to get container status \"8b8929029434e0ec20c7eb8dc7775a7b13a5729645d838cc42e3daee901f4329\": rpc error: code = NotFound desc = could not find container \"8b8929029434e0ec20c7eb8dc7775a7b13a5729645d838cc42e3daee901f4329\": container with ID starting with 8b8929029434e0ec20c7eb8dc7775a7b13a5729645d838cc42e3daee901f4329 not found: ID does not exist" Dec 01 09:15:11 crc kubenswrapper[5004]: I1201 09:15:11.593201 5004 scope.go:117] "RemoveContainer" containerID="578064903e4690f5845b6351733e274e95040fa407dc9547dd0f76b5cda4a6af" Dec 01 09:15:11 crc kubenswrapper[5004]: E1201 09:15:11.594083 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"578064903e4690f5845b6351733e274e95040fa407dc9547dd0f76b5cda4a6af\": container with ID starting with 578064903e4690f5845b6351733e274e95040fa407dc9547dd0f76b5cda4a6af not found: ID does not exist" containerID="578064903e4690f5845b6351733e274e95040fa407dc9547dd0f76b5cda4a6af" Dec 01 09:15:11 crc kubenswrapper[5004]: I1201 09:15:11.594144 5004 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"578064903e4690f5845b6351733e274e95040fa407dc9547dd0f76b5cda4a6af"} err="failed to get container status \"578064903e4690f5845b6351733e274e95040fa407dc9547dd0f76b5cda4a6af\": rpc error: code = NotFound desc = could not find container \"578064903e4690f5845b6351733e274e95040fa407dc9547dd0f76b5cda4a6af\": container with ID starting with 578064903e4690f5845b6351733e274e95040fa407dc9547dd0f76b5cda4a6af not found: ID does not exist" Dec 01 09:15:12 crc kubenswrapper[5004]: I1201 09:15:12.787754 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="475a4ad9-7e12-4521-bd4a-83b31e2d08a4" path="/var/lib/kubelet/pods/475a4ad9-7e12-4521-bd4a-83b31e2d08a4/volumes" Dec 01 09:15:18 crc kubenswrapper[5004]: I1201 09:15:18.335471 5004 scope.go:117] "RemoveContainer" containerID="61e9fd11bbe94fabce0f763ab29f3e72e66d74ee0807111eadaadd3b22e7b12d" Dec 01 09:16:08 crc kubenswrapper[5004]: I1201 09:16:08.729268 5004 patch_prober.go:28] interesting pod/machine-config-daemon-fvdgt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 09:16:08 crc kubenswrapper[5004]: I1201 09:16:08.729927 5004 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 09:16:38 crc kubenswrapper[5004]: I1201 09:16:38.729658 5004 patch_prober.go:28] interesting pod/machine-config-daemon-fvdgt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 
09:16:38 crc kubenswrapper[5004]: I1201 09:16:38.730214 5004 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 09:17:08 crc kubenswrapper[5004]: I1201 09:17:08.728902 5004 patch_prober.go:28] interesting pod/machine-config-daemon-fvdgt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 09:17:08 crc kubenswrapper[5004]: I1201 09:17:08.729551 5004 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 09:17:08 crc kubenswrapper[5004]: I1201 09:17:08.729637 5004 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" Dec 01 09:17:08 crc kubenswrapper[5004]: I1201 09:17:08.730619 5004 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"67314238aaf836d37dc206c647fc8d6f25a8091a20d766f75fb40dfdc2efc058"} pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 09:17:08 crc kubenswrapper[5004]: I1201 09:17:08.730690 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" 
podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" containerName="machine-config-daemon" containerID="cri-o://67314238aaf836d37dc206c647fc8d6f25a8091a20d766f75fb40dfdc2efc058" gracePeriod=600 Dec 01 09:17:08 crc kubenswrapper[5004]: I1201 09:17:08.921956 5004 generic.go:334] "Generic (PLEG): container finished" podID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" containerID="67314238aaf836d37dc206c647fc8d6f25a8091a20d766f75fb40dfdc2efc058" exitCode=0 Dec 01 09:17:08 crc kubenswrapper[5004]: I1201 09:17:08.922032 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" event={"ID":"f9977ebb-82de-4e96-8763-0b5a84f8d4ce","Type":"ContainerDied","Data":"67314238aaf836d37dc206c647fc8d6f25a8091a20d766f75fb40dfdc2efc058"} Dec 01 09:17:08 crc kubenswrapper[5004]: I1201 09:17:08.922267 5004 scope.go:117] "RemoveContainer" containerID="959c4dd48a6a01a111184092f5278271df46b0993f7d823dbccdaf0fe68ac515" Dec 01 09:17:09 crc kubenswrapper[5004]: I1201 09:17:09.939690 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" event={"ID":"f9977ebb-82de-4e96-8763-0b5a84f8d4ce","Type":"ContainerStarted","Data":"ddad648297bc943586134e821ef858ef63f97885bd3e0068a2a2edd240b88618"} Dec 01 09:17:52 crc kubenswrapper[5004]: I1201 09:17:52.805961 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-fhf99"] Dec 01 09:17:52 crc kubenswrapper[5004]: E1201 09:17:52.807232 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="475a4ad9-7e12-4521-bd4a-83b31e2d08a4" containerName="registry-server" Dec 01 09:17:52 crc kubenswrapper[5004]: I1201 09:17:52.807250 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="475a4ad9-7e12-4521-bd4a-83b31e2d08a4" containerName="registry-server" Dec 01 09:17:52 crc kubenswrapper[5004]: E1201 09:17:52.807274 5004 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="475a4ad9-7e12-4521-bd4a-83b31e2d08a4" containerName="extract-utilities" Dec 01 09:17:52 crc kubenswrapper[5004]: I1201 09:17:52.807282 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="475a4ad9-7e12-4521-bd4a-83b31e2d08a4" containerName="extract-utilities" Dec 01 09:17:52 crc kubenswrapper[5004]: E1201 09:17:52.807294 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="475a4ad9-7e12-4521-bd4a-83b31e2d08a4" containerName="extract-content" Dec 01 09:17:52 crc kubenswrapper[5004]: I1201 09:17:52.807301 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="475a4ad9-7e12-4521-bd4a-83b31e2d08a4" containerName="extract-content" Dec 01 09:17:52 crc kubenswrapper[5004]: E1201 09:17:52.807311 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5aaf7b51-fbd4-40e4-a5a1-3281a372a6cd" containerName="collect-profiles" Dec 01 09:17:52 crc kubenswrapper[5004]: I1201 09:17:52.807319 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="5aaf7b51-fbd4-40e4-a5a1-3281a372a6cd" containerName="collect-profiles" Dec 01 09:17:52 crc kubenswrapper[5004]: I1201 09:17:52.807798 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="475a4ad9-7e12-4521-bd4a-83b31e2d08a4" containerName="registry-server" Dec 01 09:17:52 crc kubenswrapper[5004]: I1201 09:17:52.807833 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="5aaf7b51-fbd4-40e4-a5a1-3281a372a6cd" containerName="collect-profiles" Dec 01 09:17:52 crc kubenswrapper[5004]: I1201 09:17:52.809851 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fhf99"] Dec 01 09:17:52 crc kubenswrapper[5004]: I1201 09:17:52.809955 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fhf99" Dec 01 09:17:52 crc kubenswrapper[5004]: I1201 09:17:52.930047 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjsf5\" (UniqueName: \"kubernetes.io/projected/b6157d01-bacc-4ebc-8384-b66dafd57a73-kube-api-access-rjsf5\") pod \"redhat-marketplace-fhf99\" (UID: \"b6157d01-bacc-4ebc-8384-b66dafd57a73\") " pod="openshift-marketplace/redhat-marketplace-fhf99" Dec 01 09:17:52 crc kubenswrapper[5004]: I1201 09:17:52.930429 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6157d01-bacc-4ebc-8384-b66dafd57a73-utilities\") pod \"redhat-marketplace-fhf99\" (UID: \"b6157d01-bacc-4ebc-8384-b66dafd57a73\") " pod="openshift-marketplace/redhat-marketplace-fhf99" Dec 01 09:17:52 crc kubenswrapper[5004]: I1201 09:17:52.930728 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6157d01-bacc-4ebc-8384-b66dafd57a73-catalog-content\") pod \"redhat-marketplace-fhf99\" (UID: \"b6157d01-bacc-4ebc-8384-b66dafd57a73\") " pod="openshift-marketplace/redhat-marketplace-fhf99" Dec 01 09:17:53 crc kubenswrapper[5004]: I1201 09:17:53.033220 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rjsf5\" (UniqueName: \"kubernetes.io/projected/b6157d01-bacc-4ebc-8384-b66dafd57a73-kube-api-access-rjsf5\") pod \"redhat-marketplace-fhf99\" (UID: \"b6157d01-bacc-4ebc-8384-b66dafd57a73\") " pod="openshift-marketplace/redhat-marketplace-fhf99" Dec 01 09:17:53 crc kubenswrapper[5004]: I1201 09:17:53.033298 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6157d01-bacc-4ebc-8384-b66dafd57a73-utilities\") pod 
\"redhat-marketplace-fhf99\" (UID: \"b6157d01-bacc-4ebc-8384-b66dafd57a73\") " pod="openshift-marketplace/redhat-marketplace-fhf99" Dec 01 09:17:53 crc kubenswrapper[5004]: I1201 09:17:53.033485 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6157d01-bacc-4ebc-8384-b66dafd57a73-catalog-content\") pod \"redhat-marketplace-fhf99\" (UID: \"b6157d01-bacc-4ebc-8384-b66dafd57a73\") " pod="openshift-marketplace/redhat-marketplace-fhf99" Dec 01 09:17:53 crc kubenswrapper[5004]: I1201 09:17:53.033985 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6157d01-bacc-4ebc-8384-b66dafd57a73-utilities\") pod \"redhat-marketplace-fhf99\" (UID: \"b6157d01-bacc-4ebc-8384-b66dafd57a73\") " pod="openshift-marketplace/redhat-marketplace-fhf99" Dec 01 09:17:53 crc kubenswrapper[5004]: I1201 09:17:53.056284 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjsf5\" (UniqueName: \"kubernetes.io/projected/b6157d01-bacc-4ebc-8384-b66dafd57a73-kube-api-access-rjsf5\") pod \"redhat-marketplace-fhf99\" (UID: \"b6157d01-bacc-4ebc-8384-b66dafd57a73\") " pod="openshift-marketplace/redhat-marketplace-fhf99" Dec 01 09:17:53 crc kubenswrapper[5004]: I1201 09:17:53.445781 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6157d01-bacc-4ebc-8384-b66dafd57a73-catalog-content\") pod \"redhat-marketplace-fhf99\" (UID: \"b6157d01-bacc-4ebc-8384-b66dafd57a73\") " pod="openshift-marketplace/redhat-marketplace-fhf99" Dec 01 09:17:53 crc kubenswrapper[5004]: I1201 09:17:53.733298 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fhf99" Dec 01 09:17:54 crc kubenswrapper[5004]: I1201 09:17:54.240825 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fhf99"] Dec 01 09:17:54 crc kubenswrapper[5004]: I1201 09:17:54.456043 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fhf99" event={"ID":"b6157d01-bacc-4ebc-8384-b66dafd57a73","Type":"ContainerStarted","Data":"e878753690aa9a05394508a119ce1f10b28984a9449716c05a282671f31a8afc"} Dec 01 09:17:55 crc kubenswrapper[5004]: I1201 09:17:55.469027 5004 generic.go:334] "Generic (PLEG): container finished" podID="b6157d01-bacc-4ebc-8384-b66dafd57a73" containerID="9cff73eb356111f962abc7afb548dd0622da3f79d8a53b61b4d737077e23e015" exitCode=0 Dec 01 09:17:55 crc kubenswrapper[5004]: I1201 09:17:55.469136 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fhf99" event={"ID":"b6157d01-bacc-4ebc-8384-b66dafd57a73","Type":"ContainerDied","Data":"9cff73eb356111f962abc7afb548dd0622da3f79d8a53b61b4d737077e23e015"} Dec 01 09:17:56 crc kubenswrapper[5004]: I1201 09:17:56.489395 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fhf99" event={"ID":"b6157d01-bacc-4ebc-8384-b66dafd57a73","Type":"ContainerStarted","Data":"ff3f555d3218a03d2696007ac8edc67e5378d53c25abab6e6bf9af8fcd27fd1b"} Dec 01 09:17:57 crc kubenswrapper[5004]: I1201 09:17:57.508841 5004 generic.go:334] "Generic (PLEG): container finished" podID="b6157d01-bacc-4ebc-8384-b66dafd57a73" containerID="ff3f555d3218a03d2696007ac8edc67e5378d53c25abab6e6bf9af8fcd27fd1b" exitCode=0 Dec 01 09:17:57 crc kubenswrapper[5004]: I1201 09:17:57.509109 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fhf99" 
event={"ID":"b6157d01-bacc-4ebc-8384-b66dafd57a73","Type":"ContainerDied","Data":"ff3f555d3218a03d2696007ac8edc67e5378d53c25abab6e6bf9af8fcd27fd1b"} Dec 01 09:17:59 crc kubenswrapper[5004]: I1201 09:17:59.531685 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fhf99" event={"ID":"b6157d01-bacc-4ebc-8384-b66dafd57a73","Type":"ContainerStarted","Data":"70a448596c9f917401d87ca7dca3c160b3ed4cf1787704585c778505c1c386a2"} Dec 01 09:17:59 crc kubenswrapper[5004]: I1201 09:17:59.554628 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-fhf99" podStartSLOduration=4.552078185 podStartE2EDuration="7.554607125s" podCreationTimestamp="2025-12-01 09:17:52 +0000 UTC" firstStartedPulling="2025-12-01 09:17:55.471703121 +0000 UTC m=+3653.036695123" lastFinishedPulling="2025-12-01 09:17:58.474232071 +0000 UTC m=+3656.039224063" observedRunningTime="2025-12-01 09:17:59.550988596 +0000 UTC m=+3657.115980588" watchObservedRunningTime="2025-12-01 09:17:59.554607125 +0000 UTC m=+3657.119599127" Dec 01 09:18:03 crc kubenswrapper[5004]: I1201 09:18:03.733873 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-fhf99" Dec 01 09:18:03 crc kubenswrapper[5004]: I1201 09:18:03.734348 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-fhf99" Dec 01 09:18:03 crc kubenswrapper[5004]: I1201 09:18:03.817631 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-fhf99" Dec 01 09:18:04 crc kubenswrapper[5004]: I1201 09:18:04.664503 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-fhf99" Dec 01 09:18:04 crc kubenswrapper[5004]: I1201 09:18:04.722255 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-fhf99"] Dec 01 09:18:06 crc kubenswrapper[5004]: I1201 09:18:06.604776 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-fhf99" podUID="b6157d01-bacc-4ebc-8384-b66dafd57a73" containerName="registry-server" containerID="cri-o://70a448596c9f917401d87ca7dca3c160b3ed4cf1787704585c778505c1c386a2" gracePeriod=2 Dec 01 09:18:07 crc kubenswrapper[5004]: I1201 09:18:07.216414 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fhf99" Dec 01 09:18:07 crc kubenswrapper[5004]: I1201 09:18:07.384663 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rjsf5\" (UniqueName: \"kubernetes.io/projected/b6157d01-bacc-4ebc-8384-b66dafd57a73-kube-api-access-rjsf5\") pod \"b6157d01-bacc-4ebc-8384-b66dafd57a73\" (UID: \"b6157d01-bacc-4ebc-8384-b66dafd57a73\") " Dec 01 09:18:07 crc kubenswrapper[5004]: I1201 09:18:07.384756 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6157d01-bacc-4ebc-8384-b66dafd57a73-utilities\") pod \"b6157d01-bacc-4ebc-8384-b66dafd57a73\" (UID: \"b6157d01-bacc-4ebc-8384-b66dafd57a73\") " Dec 01 09:18:07 crc kubenswrapper[5004]: I1201 09:18:07.385052 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6157d01-bacc-4ebc-8384-b66dafd57a73-catalog-content\") pod \"b6157d01-bacc-4ebc-8384-b66dafd57a73\" (UID: \"b6157d01-bacc-4ebc-8384-b66dafd57a73\") " Dec 01 09:18:07 crc kubenswrapper[5004]: I1201 09:18:07.385749 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b6157d01-bacc-4ebc-8384-b66dafd57a73-utilities" (OuterVolumeSpecName: "utilities") pod "b6157d01-bacc-4ebc-8384-b66dafd57a73" (UID: 
"b6157d01-bacc-4ebc-8384-b66dafd57a73"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:18:07 crc kubenswrapper[5004]: I1201 09:18:07.386074 5004 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6157d01-bacc-4ebc-8384-b66dafd57a73-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 09:18:07 crc kubenswrapper[5004]: I1201 09:18:07.397041 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6157d01-bacc-4ebc-8384-b66dafd57a73-kube-api-access-rjsf5" (OuterVolumeSpecName: "kube-api-access-rjsf5") pod "b6157d01-bacc-4ebc-8384-b66dafd57a73" (UID: "b6157d01-bacc-4ebc-8384-b66dafd57a73"). InnerVolumeSpecName "kube-api-access-rjsf5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:18:07 crc kubenswrapper[5004]: I1201 09:18:07.406590 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b6157d01-bacc-4ebc-8384-b66dafd57a73-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b6157d01-bacc-4ebc-8384-b66dafd57a73" (UID: "b6157d01-bacc-4ebc-8384-b66dafd57a73"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:18:07 crc kubenswrapper[5004]: I1201 09:18:07.488278 5004 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6157d01-bacc-4ebc-8384-b66dafd57a73-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 09:18:07 crc kubenswrapper[5004]: I1201 09:18:07.488527 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rjsf5\" (UniqueName: \"kubernetes.io/projected/b6157d01-bacc-4ebc-8384-b66dafd57a73-kube-api-access-rjsf5\") on node \"crc\" DevicePath \"\"" Dec 01 09:18:07 crc kubenswrapper[5004]: I1201 09:18:07.615622 5004 generic.go:334] "Generic (PLEG): container finished" podID="b6157d01-bacc-4ebc-8384-b66dafd57a73" containerID="70a448596c9f917401d87ca7dca3c160b3ed4cf1787704585c778505c1c386a2" exitCode=0 Dec 01 09:18:07 crc kubenswrapper[5004]: I1201 09:18:07.615665 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fhf99" event={"ID":"b6157d01-bacc-4ebc-8384-b66dafd57a73","Type":"ContainerDied","Data":"70a448596c9f917401d87ca7dca3c160b3ed4cf1787704585c778505c1c386a2"} Dec 01 09:18:07 crc kubenswrapper[5004]: I1201 09:18:07.615691 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fhf99" event={"ID":"b6157d01-bacc-4ebc-8384-b66dafd57a73","Type":"ContainerDied","Data":"e878753690aa9a05394508a119ce1f10b28984a9449716c05a282671f31a8afc"} Dec 01 09:18:07 crc kubenswrapper[5004]: I1201 09:18:07.615713 5004 scope.go:117] "RemoveContainer" containerID="70a448596c9f917401d87ca7dca3c160b3ed4cf1787704585c778505c1c386a2" Dec 01 09:18:07 crc kubenswrapper[5004]: I1201 09:18:07.615725 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fhf99" Dec 01 09:18:07 crc kubenswrapper[5004]: I1201 09:18:07.641298 5004 scope.go:117] "RemoveContainer" containerID="ff3f555d3218a03d2696007ac8edc67e5378d53c25abab6e6bf9af8fcd27fd1b" Dec 01 09:18:07 crc kubenswrapper[5004]: I1201 09:18:07.660219 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fhf99"] Dec 01 09:18:07 crc kubenswrapper[5004]: I1201 09:18:07.670413 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-fhf99"] Dec 01 09:18:07 crc kubenswrapper[5004]: I1201 09:18:07.675821 5004 scope.go:117] "RemoveContainer" containerID="9cff73eb356111f962abc7afb548dd0622da3f79d8a53b61b4d737077e23e015" Dec 01 09:18:07 crc kubenswrapper[5004]: I1201 09:18:07.716986 5004 scope.go:117] "RemoveContainer" containerID="70a448596c9f917401d87ca7dca3c160b3ed4cf1787704585c778505c1c386a2" Dec 01 09:18:07 crc kubenswrapper[5004]: E1201 09:18:07.717378 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70a448596c9f917401d87ca7dca3c160b3ed4cf1787704585c778505c1c386a2\": container with ID starting with 70a448596c9f917401d87ca7dca3c160b3ed4cf1787704585c778505c1c386a2 not found: ID does not exist" containerID="70a448596c9f917401d87ca7dca3c160b3ed4cf1787704585c778505c1c386a2" Dec 01 09:18:07 crc kubenswrapper[5004]: I1201 09:18:07.717421 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70a448596c9f917401d87ca7dca3c160b3ed4cf1787704585c778505c1c386a2"} err="failed to get container status \"70a448596c9f917401d87ca7dca3c160b3ed4cf1787704585c778505c1c386a2\": rpc error: code = NotFound desc = could not find container \"70a448596c9f917401d87ca7dca3c160b3ed4cf1787704585c778505c1c386a2\": container with ID starting with 70a448596c9f917401d87ca7dca3c160b3ed4cf1787704585c778505c1c386a2 not found: 
ID does not exist" Dec 01 09:18:07 crc kubenswrapper[5004]: I1201 09:18:07.717447 5004 scope.go:117] "RemoveContainer" containerID="ff3f555d3218a03d2696007ac8edc67e5378d53c25abab6e6bf9af8fcd27fd1b" Dec 01 09:18:07 crc kubenswrapper[5004]: E1201 09:18:07.717833 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff3f555d3218a03d2696007ac8edc67e5378d53c25abab6e6bf9af8fcd27fd1b\": container with ID starting with ff3f555d3218a03d2696007ac8edc67e5378d53c25abab6e6bf9af8fcd27fd1b not found: ID does not exist" containerID="ff3f555d3218a03d2696007ac8edc67e5378d53c25abab6e6bf9af8fcd27fd1b" Dec 01 09:18:07 crc kubenswrapper[5004]: I1201 09:18:07.717863 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff3f555d3218a03d2696007ac8edc67e5378d53c25abab6e6bf9af8fcd27fd1b"} err="failed to get container status \"ff3f555d3218a03d2696007ac8edc67e5378d53c25abab6e6bf9af8fcd27fd1b\": rpc error: code = NotFound desc = could not find container \"ff3f555d3218a03d2696007ac8edc67e5378d53c25abab6e6bf9af8fcd27fd1b\": container with ID starting with ff3f555d3218a03d2696007ac8edc67e5378d53c25abab6e6bf9af8fcd27fd1b not found: ID does not exist" Dec 01 09:18:07 crc kubenswrapper[5004]: I1201 09:18:07.717883 5004 scope.go:117] "RemoveContainer" containerID="9cff73eb356111f962abc7afb548dd0622da3f79d8a53b61b4d737077e23e015" Dec 01 09:18:07 crc kubenswrapper[5004]: E1201 09:18:07.718192 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9cff73eb356111f962abc7afb548dd0622da3f79d8a53b61b4d737077e23e015\": container with ID starting with 9cff73eb356111f962abc7afb548dd0622da3f79d8a53b61b4d737077e23e015 not found: ID does not exist" containerID="9cff73eb356111f962abc7afb548dd0622da3f79d8a53b61b4d737077e23e015" Dec 01 09:18:07 crc kubenswrapper[5004]: I1201 09:18:07.718248 5004 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9cff73eb356111f962abc7afb548dd0622da3f79d8a53b61b4d737077e23e015"} err="failed to get container status \"9cff73eb356111f962abc7afb548dd0622da3f79d8a53b61b4d737077e23e015\": rpc error: code = NotFound desc = could not find container \"9cff73eb356111f962abc7afb548dd0622da3f79d8a53b61b4d737077e23e015\": container with ID starting with 9cff73eb356111f962abc7afb548dd0622da3f79d8a53b61b4d737077e23e015 not found: ID does not exist" Dec 01 09:18:08 crc kubenswrapper[5004]: I1201 09:18:08.777171 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6157d01-bacc-4ebc-8384-b66dafd57a73" path="/var/lib/kubelet/pods/b6157d01-bacc-4ebc-8384-b66dafd57a73/volumes" Dec 01 09:18:18 crc kubenswrapper[5004]: I1201 09:18:18.542177 5004 scope.go:117] "RemoveContainer" containerID="f56d91eed845f343708cfa5accb5b0df8714aa4835a75974b14ade0861fc97f0" Dec 01 09:18:18 crc kubenswrapper[5004]: I1201 09:18:18.574976 5004 scope.go:117] "RemoveContainer" containerID="04c634ce76b654397217b815979ab7abdeb6b50b99b7d56fd187dad0012b0e60" Dec 01 09:18:18 crc kubenswrapper[5004]: I1201 09:18:18.640376 5004 scope.go:117] "RemoveContainer" containerID="42f821639daf299b034f3b4cc245b7f079a0779ab9a84819a8df9b9c25fae122" Dec 01 09:19:38 crc kubenswrapper[5004]: I1201 09:19:38.731048 5004 patch_prober.go:28] interesting pod/machine-config-daemon-fvdgt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 09:19:38 crc kubenswrapper[5004]: I1201 09:19:38.731581 5004 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Dec 01 09:20:08 crc kubenswrapper[5004]: I1201 09:20:08.729909 5004 patch_prober.go:28] interesting pod/machine-config-daemon-fvdgt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 09:20:08 crc kubenswrapper[5004]: I1201 09:20:08.730456 5004 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 09:20:38 crc kubenswrapper[5004]: I1201 09:20:38.730475 5004 patch_prober.go:28] interesting pod/machine-config-daemon-fvdgt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 09:20:38 crc kubenswrapper[5004]: I1201 09:20:38.731063 5004 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 09:20:38 crc kubenswrapper[5004]: I1201 09:20:38.731111 5004 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" Dec 01 09:20:38 crc kubenswrapper[5004]: I1201 09:20:38.732068 5004 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"ddad648297bc943586134e821ef858ef63f97885bd3e0068a2a2edd240b88618"} pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 09:20:38 crc kubenswrapper[5004]: I1201 09:20:38.732127 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" containerName="machine-config-daemon" containerID="cri-o://ddad648297bc943586134e821ef858ef63f97885bd3e0068a2a2edd240b88618" gracePeriod=600 Dec 01 09:20:38 crc kubenswrapper[5004]: E1201 09:20:38.873820 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 09:20:39 crc kubenswrapper[5004]: I1201 09:20:39.355231 5004 generic.go:334] "Generic (PLEG): container finished" podID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" containerID="ddad648297bc943586134e821ef858ef63f97885bd3e0068a2a2edd240b88618" exitCode=0 Dec 01 09:20:39 crc kubenswrapper[5004]: I1201 09:20:39.355642 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" event={"ID":"f9977ebb-82de-4e96-8763-0b5a84f8d4ce","Type":"ContainerDied","Data":"ddad648297bc943586134e821ef858ef63f97885bd3e0068a2a2edd240b88618"} Dec 01 09:20:39 crc kubenswrapper[5004]: I1201 09:20:39.355704 5004 scope.go:117] "RemoveContainer" containerID="67314238aaf836d37dc206c647fc8d6f25a8091a20d766f75fb40dfdc2efc058" Dec 01 09:20:39 crc kubenswrapper[5004]: I1201 09:20:39.356903 5004 
scope.go:117] "RemoveContainer" containerID="ddad648297bc943586134e821ef858ef63f97885bd3e0068a2a2edd240b88618" Dec 01 09:20:39 crc kubenswrapper[5004]: E1201 09:20:39.357532 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 09:20:51 crc kubenswrapper[5004]: I1201 09:20:51.758859 5004 scope.go:117] "RemoveContainer" containerID="ddad648297bc943586134e821ef858ef63f97885bd3e0068a2a2edd240b88618" Dec 01 09:20:51 crc kubenswrapper[5004]: E1201 09:20:51.760002 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 09:21:02 crc kubenswrapper[5004]: I1201 09:21:02.795957 5004 scope.go:117] "RemoveContainer" containerID="ddad648297bc943586134e821ef858ef63f97885bd3e0068a2a2edd240b88618" Dec 01 09:21:02 crc kubenswrapper[5004]: E1201 09:21:02.798060 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 09:21:16 crc kubenswrapper[5004]: I1201 
09:21:16.760909 5004 scope.go:117] "RemoveContainer" containerID="ddad648297bc943586134e821ef858ef63f97885bd3e0068a2a2edd240b88618" Dec 01 09:21:16 crc kubenswrapper[5004]: E1201 09:21:16.762076 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 09:21:29 crc kubenswrapper[5004]: I1201 09:21:29.758777 5004 scope.go:117] "RemoveContainer" containerID="ddad648297bc943586134e821ef858ef63f97885bd3e0068a2a2edd240b88618" Dec 01 09:21:29 crc kubenswrapper[5004]: E1201 09:21:29.759642 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 09:21:41 crc kubenswrapper[5004]: I1201 09:21:41.766819 5004 scope.go:117] "RemoveContainer" containerID="ddad648297bc943586134e821ef858ef63f97885bd3e0068a2a2edd240b88618" Dec 01 09:21:41 crc kubenswrapper[5004]: E1201 09:21:41.767453 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 09:21:54 crc 
kubenswrapper[5004]: I1201 09:21:54.760009 5004 scope.go:117] "RemoveContainer" containerID="ddad648297bc943586134e821ef858ef63f97885bd3e0068a2a2edd240b88618" Dec 01 09:21:54 crc kubenswrapper[5004]: E1201 09:21:54.761200 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 09:22:00 crc kubenswrapper[5004]: I1201 09:22:00.220457 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-mxt47"] Dec 01 09:22:00 crc kubenswrapper[5004]: E1201 09:22:00.221689 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6157d01-bacc-4ebc-8384-b66dafd57a73" containerName="registry-server" Dec 01 09:22:00 crc kubenswrapper[5004]: I1201 09:22:00.221709 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6157d01-bacc-4ebc-8384-b66dafd57a73" containerName="registry-server" Dec 01 09:22:00 crc kubenswrapper[5004]: E1201 09:22:00.221733 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6157d01-bacc-4ebc-8384-b66dafd57a73" containerName="extract-utilities" Dec 01 09:22:00 crc kubenswrapper[5004]: I1201 09:22:00.221741 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6157d01-bacc-4ebc-8384-b66dafd57a73" containerName="extract-utilities" Dec 01 09:22:00 crc kubenswrapper[5004]: E1201 09:22:00.221754 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6157d01-bacc-4ebc-8384-b66dafd57a73" containerName="extract-content" Dec 01 09:22:00 crc kubenswrapper[5004]: I1201 09:22:00.221764 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6157d01-bacc-4ebc-8384-b66dafd57a73" 
containerName="extract-content" Dec 01 09:22:00 crc kubenswrapper[5004]: I1201 09:22:00.222057 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6157d01-bacc-4ebc-8384-b66dafd57a73" containerName="registry-server" Dec 01 09:22:00 crc kubenswrapper[5004]: I1201 09:22:00.223988 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mxt47" Dec 01 09:22:00 crc kubenswrapper[5004]: I1201 09:22:00.251473 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mxt47"] Dec 01 09:22:00 crc kubenswrapper[5004]: I1201 09:22:00.308962 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b8bceea-94d5-4e7e-942a-b065baefb991-catalog-content\") pod \"redhat-operators-mxt47\" (UID: \"1b8bceea-94d5-4e7e-942a-b065baefb991\") " pod="openshift-marketplace/redhat-operators-mxt47" Dec 01 09:22:00 crc kubenswrapper[5004]: I1201 09:22:00.309041 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b8bceea-94d5-4e7e-942a-b065baefb991-utilities\") pod \"redhat-operators-mxt47\" (UID: \"1b8bceea-94d5-4e7e-942a-b065baefb991\") " pod="openshift-marketplace/redhat-operators-mxt47" Dec 01 09:22:00 crc kubenswrapper[5004]: I1201 09:22:00.309651 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xf49z\" (UniqueName: \"kubernetes.io/projected/1b8bceea-94d5-4e7e-942a-b065baefb991-kube-api-access-xf49z\") pod \"redhat-operators-mxt47\" (UID: \"1b8bceea-94d5-4e7e-942a-b065baefb991\") " pod="openshift-marketplace/redhat-operators-mxt47" Dec 01 09:22:00 crc kubenswrapper[5004]: I1201 09:22:00.412552 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xf49z\" 
(UniqueName: \"kubernetes.io/projected/1b8bceea-94d5-4e7e-942a-b065baefb991-kube-api-access-xf49z\") pod \"redhat-operators-mxt47\" (UID: \"1b8bceea-94d5-4e7e-942a-b065baefb991\") " pod="openshift-marketplace/redhat-operators-mxt47" Dec 01 09:22:00 crc kubenswrapper[5004]: I1201 09:22:00.412661 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b8bceea-94d5-4e7e-942a-b065baefb991-catalog-content\") pod \"redhat-operators-mxt47\" (UID: \"1b8bceea-94d5-4e7e-942a-b065baefb991\") " pod="openshift-marketplace/redhat-operators-mxt47" Dec 01 09:22:00 crc kubenswrapper[5004]: I1201 09:22:00.412707 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b8bceea-94d5-4e7e-942a-b065baefb991-utilities\") pod \"redhat-operators-mxt47\" (UID: \"1b8bceea-94d5-4e7e-942a-b065baefb991\") " pod="openshift-marketplace/redhat-operators-mxt47" Dec 01 09:22:00 crc kubenswrapper[5004]: I1201 09:22:00.413172 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b8bceea-94d5-4e7e-942a-b065baefb991-utilities\") pod \"redhat-operators-mxt47\" (UID: \"1b8bceea-94d5-4e7e-942a-b065baefb991\") " pod="openshift-marketplace/redhat-operators-mxt47" Dec 01 09:22:00 crc kubenswrapper[5004]: I1201 09:22:00.413208 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b8bceea-94d5-4e7e-942a-b065baefb991-catalog-content\") pod \"redhat-operators-mxt47\" (UID: \"1b8bceea-94d5-4e7e-942a-b065baefb991\") " pod="openshift-marketplace/redhat-operators-mxt47" Dec 01 09:22:00 crc kubenswrapper[5004]: I1201 09:22:00.441040 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xf49z\" (UniqueName: 
\"kubernetes.io/projected/1b8bceea-94d5-4e7e-942a-b065baefb991-kube-api-access-xf49z\") pod \"redhat-operators-mxt47\" (UID: \"1b8bceea-94d5-4e7e-942a-b065baefb991\") " pod="openshift-marketplace/redhat-operators-mxt47" Dec 01 09:22:00 crc kubenswrapper[5004]: I1201 09:22:00.558612 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mxt47" Dec 01 09:22:01 crc kubenswrapper[5004]: I1201 09:22:01.390743 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mxt47"] Dec 01 09:22:02 crc kubenswrapper[5004]: I1201 09:22:02.402865 5004 generic.go:334] "Generic (PLEG): container finished" podID="1b8bceea-94d5-4e7e-942a-b065baefb991" containerID="303c914fa1eaa81d4c76cae672e6b107b3a3681e6948cef6c98414c221397b49" exitCode=0 Dec 01 09:22:02 crc kubenswrapper[5004]: I1201 09:22:02.402980 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mxt47" event={"ID":"1b8bceea-94d5-4e7e-942a-b065baefb991","Type":"ContainerDied","Data":"303c914fa1eaa81d4c76cae672e6b107b3a3681e6948cef6c98414c221397b49"} Dec 01 09:22:02 crc kubenswrapper[5004]: I1201 09:22:02.403141 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mxt47" event={"ID":"1b8bceea-94d5-4e7e-942a-b065baefb991","Type":"ContainerStarted","Data":"eb9045c190e18a0f1f0b8a1f1b6bd966ff82b3c19efbeab45fd45731a9471a90"} Dec 01 09:22:02 crc kubenswrapper[5004]: I1201 09:22:02.407751 5004 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 01 09:22:03 crc kubenswrapper[5004]: I1201 09:22:03.418319 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mxt47" event={"ID":"1b8bceea-94d5-4e7e-942a-b065baefb991","Type":"ContainerStarted","Data":"e3f8a4145802262a2dc4ebfab26d1e7096d94a3b97ba2baf1685933e7cbbd488"} Dec 01 09:22:06 crc 
kubenswrapper[5004]: I1201 09:22:06.758899 5004 scope.go:117] "RemoveContainer" containerID="ddad648297bc943586134e821ef858ef63f97885bd3e0068a2a2edd240b88618" Dec 01 09:22:06 crc kubenswrapper[5004]: E1201 09:22:06.759676 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 09:22:07 crc kubenswrapper[5004]: I1201 09:22:07.464288 5004 generic.go:334] "Generic (PLEG): container finished" podID="1b8bceea-94d5-4e7e-942a-b065baefb991" containerID="e3f8a4145802262a2dc4ebfab26d1e7096d94a3b97ba2baf1685933e7cbbd488" exitCode=0 Dec 01 09:22:07 crc kubenswrapper[5004]: I1201 09:22:07.464336 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mxt47" event={"ID":"1b8bceea-94d5-4e7e-942a-b065baefb991","Type":"ContainerDied","Data":"e3f8a4145802262a2dc4ebfab26d1e7096d94a3b97ba2baf1685933e7cbbd488"} Dec 01 09:22:08 crc kubenswrapper[5004]: I1201 09:22:08.476635 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mxt47" event={"ID":"1b8bceea-94d5-4e7e-942a-b065baefb991","Type":"ContainerStarted","Data":"4512fa8da0fa2651f24a555c21a33d4ee20277b23156cd60a52feddf6344d439"} Dec 01 09:22:08 crc kubenswrapper[5004]: I1201 09:22:08.504070 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-mxt47" podStartSLOduration=2.883430526 podStartE2EDuration="8.504048551s" podCreationTimestamp="2025-12-01 09:22:00 +0000 UTC" firstStartedPulling="2025-12-01 09:22:02.407527763 +0000 UTC m=+3899.972519745" lastFinishedPulling="2025-12-01 09:22:08.028145778 +0000 UTC 
m=+3905.593137770" observedRunningTime="2025-12-01 09:22:08.492464567 +0000 UTC m=+3906.057456569" watchObservedRunningTime="2025-12-01 09:22:08.504048551 +0000 UTC m=+3906.069040533" Dec 01 09:22:10 crc kubenswrapper[5004]: I1201 09:22:10.559457 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-mxt47" Dec 01 09:22:10 crc kubenswrapper[5004]: I1201 09:22:10.560006 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-mxt47" Dec 01 09:22:11 crc kubenswrapper[5004]: I1201 09:22:11.610372 5004 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-mxt47" podUID="1b8bceea-94d5-4e7e-942a-b065baefb991" containerName="registry-server" probeResult="failure" output=< Dec 01 09:22:11 crc kubenswrapper[5004]: timeout: failed to connect service ":50051" within 1s Dec 01 09:22:11 crc kubenswrapper[5004]: > Dec 01 09:22:18 crc kubenswrapper[5004]: I1201 09:22:18.759852 5004 scope.go:117] "RemoveContainer" containerID="ddad648297bc943586134e821ef858ef63f97885bd3e0068a2a2edd240b88618" Dec 01 09:22:18 crc kubenswrapper[5004]: E1201 09:22:18.760795 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 09:22:20 crc kubenswrapper[5004]: I1201 09:22:20.612471 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-mxt47" Dec 01 09:22:20 crc kubenswrapper[5004]: I1201 09:22:20.669244 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-operators-mxt47" Dec 01 09:22:20 crc kubenswrapper[5004]: I1201 09:22:20.857447 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mxt47"] Dec 01 09:22:22 crc kubenswrapper[5004]: I1201 09:22:22.628387 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-mxt47" podUID="1b8bceea-94d5-4e7e-942a-b065baefb991" containerName="registry-server" containerID="cri-o://4512fa8da0fa2651f24a555c21a33d4ee20277b23156cd60a52feddf6344d439" gracePeriod=2 Dec 01 09:22:23 crc kubenswrapper[5004]: I1201 09:22:23.169637 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mxt47" Dec 01 09:22:23 crc kubenswrapper[5004]: I1201 09:22:23.265874 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b8bceea-94d5-4e7e-942a-b065baefb991-utilities\") pod \"1b8bceea-94d5-4e7e-942a-b065baefb991\" (UID: \"1b8bceea-94d5-4e7e-942a-b065baefb991\") " Dec 01 09:22:23 crc kubenswrapper[5004]: I1201 09:22:23.266029 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xf49z\" (UniqueName: \"kubernetes.io/projected/1b8bceea-94d5-4e7e-942a-b065baefb991-kube-api-access-xf49z\") pod \"1b8bceea-94d5-4e7e-942a-b065baefb991\" (UID: \"1b8bceea-94d5-4e7e-942a-b065baefb991\") " Dec 01 09:22:23 crc kubenswrapper[5004]: I1201 09:22:23.266148 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b8bceea-94d5-4e7e-942a-b065baefb991-catalog-content\") pod \"1b8bceea-94d5-4e7e-942a-b065baefb991\" (UID: \"1b8bceea-94d5-4e7e-942a-b065baefb991\") " Dec 01 09:22:23 crc kubenswrapper[5004]: I1201 09:22:23.266794 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/1b8bceea-94d5-4e7e-942a-b065baefb991-utilities" (OuterVolumeSpecName: "utilities") pod "1b8bceea-94d5-4e7e-942a-b065baefb991" (UID: "1b8bceea-94d5-4e7e-942a-b065baefb991"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:22:23 crc kubenswrapper[5004]: I1201 09:22:23.276657 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b8bceea-94d5-4e7e-942a-b065baefb991-kube-api-access-xf49z" (OuterVolumeSpecName: "kube-api-access-xf49z") pod "1b8bceea-94d5-4e7e-942a-b065baefb991" (UID: "1b8bceea-94d5-4e7e-942a-b065baefb991"). InnerVolumeSpecName "kube-api-access-xf49z". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:22:23 crc kubenswrapper[5004]: I1201 09:22:23.371469 5004 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b8bceea-94d5-4e7e-942a-b065baefb991-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 09:22:23 crc kubenswrapper[5004]: I1201 09:22:23.371520 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xf49z\" (UniqueName: \"kubernetes.io/projected/1b8bceea-94d5-4e7e-942a-b065baefb991-kube-api-access-xf49z\") on node \"crc\" DevicePath \"\"" Dec 01 09:22:23 crc kubenswrapper[5004]: I1201 09:22:23.399486 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b8bceea-94d5-4e7e-942a-b065baefb991-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1b8bceea-94d5-4e7e-942a-b065baefb991" (UID: "1b8bceea-94d5-4e7e-942a-b065baefb991"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:22:23 crc kubenswrapper[5004]: I1201 09:22:23.474313 5004 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b8bceea-94d5-4e7e-942a-b065baefb991-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 09:22:23 crc kubenswrapper[5004]: I1201 09:22:23.640223 5004 generic.go:334] "Generic (PLEG): container finished" podID="1b8bceea-94d5-4e7e-942a-b065baefb991" containerID="4512fa8da0fa2651f24a555c21a33d4ee20277b23156cd60a52feddf6344d439" exitCode=0 Dec 01 09:22:23 crc kubenswrapper[5004]: I1201 09:22:23.640264 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mxt47" event={"ID":"1b8bceea-94d5-4e7e-942a-b065baefb991","Type":"ContainerDied","Data":"4512fa8da0fa2651f24a555c21a33d4ee20277b23156cd60a52feddf6344d439"} Dec 01 09:22:23 crc kubenswrapper[5004]: I1201 09:22:23.640294 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mxt47" event={"ID":"1b8bceea-94d5-4e7e-942a-b065baefb991","Type":"ContainerDied","Data":"eb9045c190e18a0f1f0b8a1f1b6bd966ff82b3c19efbeab45fd45731a9471a90"} Dec 01 09:22:23 crc kubenswrapper[5004]: I1201 09:22:23.640315 5004 scope.go:117] "RemoveContainer" containerID="4512fa8da0fa2651f24a555c21a33d4ee20277b23156cd60a52feddf6344d439" Dec 01 09:22:23 crc kubenswrapper[5004]: I1201 09:22:23.640312 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-mxt47" Dec 01 09:22:23 crc kubenswrapper[5004]: I1201 09:22:23.661588 5004 scope.go:117] "RemoveContainer" containerID="e3f8a4145802262a2dc4ebfab26d1e7096d94a3b97ba2baf1685933e7cbbd488" Dec 01 09:22:23 crc kubenswrapper[5004]: I1201 09:22:23.678714 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mxt47"] Dec 01 09:22:23 crc kubenswrapper[5004]: I1201 09:22:23.690848 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-mxt47"] Dec 01 09:22:23 crc kubenswrapper[5004]: I1201 09:22:23.703829 5004 scope.go:117] "RemoveContainer" containerID="303c914fa1eaa81d4c76cae672e6b107b3a3681e6948cef6c98414c221397b49" Dec 01 09:22:23 crc kubenswrapper[5004]: I1201 09:22:23.745309 5004 scope.go:117] "RemoveContainer" containerID="4512fa8da0fa2651f24a555c21a33d4ee20277b23156cd60a52feddf6344d439" Dec 01 09:22:23 crc kubenswrapper[5004]: E1201 09:22:23.745822 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4512fa8da0fa2651f24a555c21a33d4ee20277b23156cd60a52feddf6344d439\": container with ID starting with 4512fa8da0fa2651f24a555c21a33d4ee20277b23156cd60a52feddf6344d439 not found: ID does not exist" containerID="4512fa8da0fa2651f24a555c21a33d4ee20277b23156cd60a52feddf6344d439" Dec 01 09:22:23 crc kubenswrapper[5004]: I1201 09:22:23.745869 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4512fa8da0fa2651f24a555c21a33d4ee20277b23156cd60a52feddf6344d439"} err="failed to get container status \"4512fa8da0fa2651f24a555c21a33d4ee20277b23156cd60a52feddf6344d439\": rpc error: code = NotFound desc = could not find container \"4512fa8da0fa2651f24a555c21a33d4ee20277b23156cd60a52feddf6344d439\": container with ID starting with 4512fa8da0fa2651f24a555c21a33d4ee20277b23156cd60a52feddf6344d439 not found: ID does 
not exist" Dec 01 09:22:23 crc kubenswrapper[5004]: I1201 09:22:23.745897 5004 scope.go:117] "RemoveContainer" containerID="e3f8a4145802262a2dc4ebfab26d1e7096d94a3b97ba2baf1685933e7cbbd488" Dec 01 09:22:23 crc kubenswrapper[5004]: E1201 09:22:23.746227 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3f8a4145802262a2dc4ebfab26d1e7096d94a3b97ba2baf1685933e7cbbd488\": container with ID starting with e3f8a4145802262a2dc4ebfab26d1e7096d94a3b97ba2baf1685933e7cbbd488 not found: ID does not exist" containerID="e3f8a4145802262a2dc4ebfab26d1e7096d94a3b97ba2baf1685933e7cbbd488" Dec 01 09:22:23 crc kubenswrapper[5004]: I1201 09:22:23.746278 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3f8a4145802262a2dc4ebfab26d1e7096d94a3b97ba2baf1685933e7cbbd488"} err="failed to get container status \"e3f8a4145802262a2dc4ebfab26d1e7096d94a3b97ba2baf1685933e7cbbd488\": rpc error: code = NotFound desc = could not find container \"e3f8a4145802262a2dc4ebfab26d1e7096d94a3b97ba2baf1685933e7cbbd488\": container with ID starting with e3f8a4145802262a2dc4ebfab26d1e7096d94a3b97ba2baf1685933e7cbbd488 not found: ID does not exist" Dec 01 09:22:23 crc kubenswrapper[5004]: I1201 09:22:23.746314 5004 scope.go:117] "RemoveContainer" containerID="303c914fa1eaa81d4c76cae672e6b107b3a3681e6948cef6c98414c221397b49" Dec 01 09:22:23 crc kubenswrapper[5004]: E1201 09:22:23.746663 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"303c914fa1eaa81d4c76cae672e6b107b3a3681e6948cef6c98414c221397b49\": container with ID starting with 303c914fa1eaa81d4c76cae672e6b107b3a3681e6948cef6c98414c221397b49 not found: ID does not exist" containerID="303c914fa1eaa81d4c76cae672e6b107b3a3681e6948cef6c98414c221397b49" Dec 01 09:22:23 crc kubenswrapper[5004]: I1201 09:22:23.746689 5004 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"303c914fa1eaa81d4c76cae672e6b107b3a3681e6948cef6c98414c221397b49"} err="failed to get container status \"303c914fa1eaa81d4c76cae672e6b107b3a3681e6948cef6c98414c221397b49\": rpc error: code = NotFound desc = could not find container \"303c914fa1eaa81d4c76cae672e6b107b3a3681e6948cef6c98414c221397b49\": container with ID starting with 303c914fa1eaa81d4c76cae672e6b107b3a3681e6948cef6c98414c221397b49 not found: ID does not exist" Dec 01 09:22:24 crc kubenswrapper[5004]: I1201 09:22:24.775072 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b8bceea-94d5-4e7e-942a-b065baefb991" path="/var/lib/kubelet/pods/1b8bceea-94d5-4e7e-942a-b065baefb991/volumes" Dec 01 09:22:29 crc kubenswrapper[5004]: I1201 09:22:29.759710 5004 scope.go:117] "RemoveContainer" containerID="ddad648297bc943586134e821ef858ef63f97885bd3e0068a2a2edd240b88618" Dec 01 09:22:29 crc kubenswrapper[5004]: E1201 09:22:29.760635 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 09:22:41 crc kubenswrapper[5004]: I1201 09:22:41.490837 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-kcbjd"] Dec 01 09:22:41 crc kubenswrapper[5004]: E1201 09:22:41.492036 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b8bceea-94d5-4e7e-942a-b065baefb991" containerName="extract-utilities" Dec 01 09:22:41 crc kubenswrapper[5004]: I1201 09:22:41.492052 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b8bceea-94d5-4e7e-942a-b065baefb991" containerName="extract-utilities" Dec 01 
09:22:41 crc kubenswrapper[5004]: E1201 09:22:41.492115 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b8bceea-94d5-4e7e-942a-b065baefb991" containerName="extract-content" Dec 01 09:22:41 crc kubenswrapper[5004]: I1201 09:22:41.492122 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b8bceea-94d5-4e7e-942a-b065baefb991" containerName="extract-content" Dec 01 09:22:41 crc kubenswrapper[5004]: E1201 09:22:41.492142 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b8bceea-94d5-4e7e-942a-b065baefb991" containerName="registry-server" Dec 01 09:22:41 crc kubenswrapper[5004]: I1201 09:22:41.492150 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b8bceea-94d5-4e7e-942a-b065baefb991" containerName="registry-server" Dec 01 09:22:41 crc kubenswrapper[5004]: I1201 09:22:41.492375 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b8bceea-94d5-4e7e-942a-b065baefb991" containerName="registry-server" Dec 01 09:22:41 crc kubenswrapper[5004]: I1201 09:22:41.494219 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-kcbjd" Dec 01 09:22:41 crc kubenswrapper[5004]: I1201 09:22:41.517330 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kcbjd"] Dec 01 09:22:41 crc kubenswrapper[5004]: I1201 09:22:41.519533 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jh2l5\" (UniqueName: \"kubernetes.io/projected/086fb2f4-8bcb-425e-b6ae-dc3e760c24fb-kube-api-access-jh2l5\") pod \"certified-operators-kcbjd\" (UID: \"086fb2f4-8bcb-425e-b6ae-dc3e760c24fb\") " pod="openshift-marketplace/certified-operators-kcbjd" Dec 01 09:22:41 crc kubenswrapper[5004]: I1201 09:22:41.519657 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/086fb2f4-8bcb-425e-b6ae-dc3e760c24fb-catalog-content\") pod \"certified-operators-kcbjd\" (UID: \"086fb2f4-8bcb-425e-b6ae-dc3e760c24fb\") " pod="openshift-marketplace/certified-operators-kcbjd" Dec 01 09:22:41 crc kubenswrapper[5004]: I1201 09:22:41.519704 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/086fb2f4-8bcb-425e-b6ae-dc3e760c24fb-utilities\") pod \"certified-operators-kcbjd\" (UID: \"086fb2f4-8bcb-425e-b6ae-dc3e760c24fb\") " pod="openshift-marketplace/certified-operators-kcbjd" Dec 01 09:22:41 crc kubenswrapper[5004]: I1201 09:22:41.620632 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jh2l5\" (UniqueName: \"kubernetes.io/projected/086fb2f4-8bcb-425e-b6ae-dc3e760c24fb-kube-api-access-jh2l5\") pod \"certified-operators-kcbjd\" (UID: \"086fb2f4-8bcb-425e-b6ae-dc3e760c24fb\") " pod="openshift-marketplace/certified-operators-kcbjd" Dec 01 09:22:41 crc kubenswrapper[5004]: I1201 09:22:41.620728 5004 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/086fb2f4-8bcb-425e-b6ae-dc3e760c24fb-catalog-content\") pod \"certified-operators-kcbjd\" (UID: \"086fb2f4-8bcb-425e-b6ae-dc3e760c24fb\") " pod="openshift-marketplace/certified-operators-kcbjd" Dec 01 09:22:41 crc kubenswrapper[5004]: I1201 09:22:41.620782 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/086fb2f4-8bcb-425e-b6ae-dc3e760c24fb-utilities\") pod \"certified-operators-kcbjd\" (UID: \"086fb2f4-8bcb-425e-b6ae-dc3e760c24fb\") " pod="openshift-marketplace/certified-operators-kcbjd" Dec 01 09:22:41 crc kubenswrapper[5004]: I1201 09:22:41.621467 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/086fb2f4-8bcb-425e-b6ae-dc3e760c24fb-catalog-content\") pod \"certified-operators-kcbjd\" (UID: \"086fb2f4-8bcb-425e-b6ae-dc3e760c24fb\") " pod="openshift-marketplace/certified-operators-kcbjd" Dec 01 09:22:41 crc kubenswrapper[5004]: I1201 09:22:41.621526 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/086fb2f4-8bcb-425e-b6ae-dc3e760c24fb-utilities\") pod \"certified-operators-kcbjd\" (UID: \"086fb2f4-8bcb-425e-b6ae-dc3e760c24fb\") " pod="openshift-marketplace/certified-operators-kcbjd" Dec 01 09:22:41 crc kubenswrapper[5004]: I1201 09:22:41.645739 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jh2l5\" (UniqueName: \"kubernetes.io/projected/086fb2f4-8bcb-425e-b6ae-dc3e760c24fb-kube-api-access-jh2l5\") pod \"certified-operators-kcbjd\" (UID: \"086fb2f4-8bcb-425e-b6ae-dc3e760c24fb\") " pod="openshift-marketplace/certified-operators-kcbjd" Dec 01 09:22:41 crc kubenswrapper[5004]: I1201 09:22:41.759352 5004 scope.go:117] "RemoveContainer" 
containerID="ddad648297bc943586134e821ef858ef63f97885bd3e0068a2a2edd240b88618" Dec 01 09:22:41 crc kubenswrapper[5004]: E1201 09:22:41.759828 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 09:22:41 crc kubenswrapper[5004]: I1201 09:22:41.819721 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kcbjd" Dec 01 09:22:42 crc kubenswrapper[5004]: I1201 09:22:42.338008 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kcbjd"] Dec 01 09:22:42 crc kubenswrapper[5004]: I1201 09:22:42.874007 5004 generic.go:334] "Generic (PLEG): container finished" podID="086fb2f4-8bcb-425e-b6ae-dc3e760c24fb" containerID="14b0d74b2e4faf14f48c1fcd64d1525a0a64c2780c2ba8fb56d33db957100619" exitCode=0 Dec 01 09:22:42 crc kubenswrapper[5004]: I1201 09:22:42.874127 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kcbjd" event={"ID":"086fb2f4-8bcb-425e-b6ae-dc3e760c24fb","Type":"ContainerDied","Data":"14b0d74b2e4faf14f48c1fcd64d1525a0a64c2780c2ba8fb56d33db957100619"} Dec 01 09:22:42 crc kubenswrapper[5004]: I1201 09:22:42.874373 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kcbjd" event={"ID":"086fb2f4-8bcb-425e-b6ae-dc3e760c24fb","Type":"ContainerStarted","Data":"e03bd557757d2dab46837a7ef2e8ab3afedb20c40f0735e2e3c76ac2cd1296fa"} Dec 01 09:22:44 crc kubenswrapper[5004]: I1201 09:22:44.919308 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-kcbjd" event={"ID":"086fb2f4-8bcb-425e-b6ae-dc3e760c24fb","Type":"ContainerStarted","Data":"18ac02a2e54cb366554417dea9eca1cc5f48685da6a515660f47ee32d7162ebc"} Dec 01 09:22:45 crc kubenswrapper[5004]: I1201 09:22:45.930109 5004 generic.go:334] "Generic (PLEG): container finished" podID="086fb2f4-8bcb-425e-b6ae-dc3e760c24fb" containerID="18ac02a2e54cb366554417dea9eca1cc5f48685da6a515660f47ee32d7162ebc" exitCode=0 Dec 01 09:22:45 crc kubenswrapper[5004]: I1201 09:22:45.930153 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kcbjd" event={"ID":"086fb2f4-8bcb-425e-b6ae-dc3e760c24fb","Type":"ContainerDied","Data":"18ac02a2e54cb366554417dea9eca1cc5f48685da6a515660f47ee32d7162ebc"} Dec 01 09:22:47 crc kubenswrapper[5004]: I1201 09:22:47.954239 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kcbjd" event={"ID":"086fb2f4-8bcb-425e-b6ae-dc3e760c24fb","Type":"ContainerStarted","Data":"2e66569ad0ca2b5b312eef1afd1dfc751b82fb2c4c17b211c6d394c95afafd42"} Dec 01 09:22:48 crc kubenswrapper[5004]: I1201 09:22:48.022578 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-kcbjd" podStartSLOduration=2.898839689 podStartE2EDuration="7.022541313s" podCreationTimestamp="2025-12-01 09:22:41 +0000 UTC" firstStartedPulling="2025-12-01 09:22:42.876660112 +0000 UTC m=+3940.441652104" lastFinishedPulling="2025-12-01 09:22:47.000361716 +0000 UTC m=+3944.565353728" observedRunningTime="2025-12-01 09:22:47.997611546 +0000 UTC m=+3945.562603538" watchObservedRunningTime="2025-12-01 09:22:48.022541313 +0000 UTC m=+3945.587533295" Dec 01 09:22:51 crc kubenswrapper[5004]: I1201 09:22:51.820029 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-kcbjd" Dec 01 09:22:51 crc kubenswrapper[5004]: I1201 09:22:51.820628 5004 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-kcbjd" Dec 01 09:22:51 crc kubenswrapper[5004]: I1201 09:22:51.871364 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-kcbjd" Dec 01 09:22:52 crc kubenswrapper[5004]: I1201 09:22:52.064694 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-kcbjd" Dec 01 09:22:52 crc kubenswrapper[5004]: I1201 09:22:52.260419 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kcbjd"] Dec 01 09:22:54 crc kubenswrapper[5004]: I1201 09:22:54.040647 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-kcbjd" podUID="086fb2f4-8bcb-425e-b6ae-dc3e760c24fb" containerName="registry-server" containerID="cri-o://2e66569ad0ca2b5b312eef1afd1dfc751b82fb2c4c17b211c6d394c95afafd42" gracePeriod=2 Dec 01 09:22:54 crc kubenswrapper[5004]: I1201 09:22:54.654022 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-kcbjd" Dec 01 09:22:54 crc kubenswrapper[5004]: I1201 09:22:54.718425 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/086fb2f4-8bcb-425e-b6ae-dc3e760c24fb-catalog-content\") pod \"086fb2f4-8bcb-425e-b6ae-dc3e760c24fb\" (UID: \"086fb2f4-8bcb-425e-b6ae-dc3e760c24fb\") " Dec 01 09:22:54 crc kubenswrapper[5004]: I1201 09:22:54.718526 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jh2l5\" (UniqueName: \"kubernetes.io/projected/086fb2f4-8bcb-425e-b6ae-dc3e760c24fb-kube-api-access-jh2l5\") pod \"086fb2f4-8bcb-425e-b6ae-dc3e760c24fb\" (UID: \"086fb2f4-8bcb-425e-b6ae-dc3e760c24fb\") " Dec 01 09:22:54 crc kubenswrapper[5004]: I1201 09:22:54.718672 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/086fb2f4-8bcb-425e-b6ae-dc3e760c24fb-utilities\") pod \"086fb2f4-8bcb-425e-b6ae-dc3e760c24fb\" (UID: \"086fb2f4-8bcb-425e-b6ae-dc3e760c24fb\") " Dec 01 09:22:54 crc kubenswrapper[5004]: I1201 09:22:54.719659 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/086fb2f4-8bcb-425e-b6ae-dc3e760c24fb-utilities" (OuterVolumeSpecName: "utilities") pod "086fb2f4-8bcb-425e-b6ae-dc3e760c24fb" (UID: "086fb2f4-8bcb-425e-b6ae-dc3e760c24fb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:22:54 crc kubenswrapper[5004]: I1201 09:22:54.727882 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/086fb2f4-8bcb-425e-b6ae-dc3e760c24fb-kube-api-access-jh2l5" (OuterVolumeSpecName: "kube-api-access-jh2l5") pod "086fb2f4-8bcb-425e-b6ae-dc3e760c24fb" (UID: "086fb2f4-8bcb-425e-b6ae-dc3e760c24fb"). InnerVolumeSpecName "kube-api-access-jh2l5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:22:54 crc kubenswrapper[5004]: I1201 09:22:54.793173 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/086fb2f4-8bcb-425e-b6ae-dc3e760c24fb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "086fb2f4-8bcb-425e-b6ae-dc3e760c24fb" (UID: "086fb2f4-8bcb-425e-b6ae-dc3e760c24fb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:22:54 crc kubenswrapper[5004]: I1201 09:22:54.822463 5004 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/086fb2f4-8bcb-425e-b6ae-dc3e760c24fb-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 09:22:54 crc kubenswrapper[5004]: I1201 09:22:54.822506 5004 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/086fb2f4-8bcb-425e-b6ae-dc3e760c24fb-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 09:22:54 crc kubenswrapper[5004]: I1201 09:22:54.822521 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jh2l5\" (UniqueName: \"kubernetes.io/projected/086fb2f4-8bcb-425e-b6ae-dc3e760c24fb-kube-api-access-jh2l5\") on node \"crc\" DevicePath \"\"" Dec 01 09:22:55 crc kubenswrapper[5004]: I1201 09:22:55.052985 5004 generic.go:334] "Generic (PLEG): container finished" podID="086fb2f4-8bcb-425e-b6ae-dc3e760c24fb" containerID="2e66569ad0ca2b5b312eef1afd1dfc751b82fb2c4c17b211c6d394c95afafd42" exitCode=0 Dec 01 09:22:55 crc kubenswrapper[5004]: I1201 09:22:55.053033 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kcbjd" event={"ID":"086fb2f4-8bcb-425e-b6ae-dc3e760c24fb","Type":"ContainerDied","Data":"2e66569ad0ca2b5b312eef1afd1dfc751b82fb2c4c17b211c6d394c95afafd42"} Dec 01 09:22:55 crc kubenswrapper[5004]: I1201 09:22:55.053112 5004 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-kcbjd" event={"ID":"086fb2f4-8bcb-425e-b6ae-dc3e760c24fb","Type":"ContainerDied","Data":"e03bd557757d2dab46837a7ef2e8ab3afedb20c40f0735e2e3c76ac2cd1296fa"} Dec 01 09:22:55 crc kubenswrapper[5004]: I1201 09:22:55.053136 5004 scope.go:117] "RemoveContainer" containerID="2e66569ad0ca2b5b312eef1afd1dfc751b82fb2c4c17b211c6d394c95afafd42" Dec 01 09:22:55 crc kubenswrapper[5004]: I1201 09:22:55.053056 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kcbjd" Dec 01 09:22:55 crc kubenswrapper[5004]: I1201 09:22:55.073353 5004 scope.go:117] "RemoveContainer" containerID="18ac02a2e54cb366554417dea9eca1cc5f48685da6a515660f47ee32d7162ebc" Dec 01 09:22:55 crc kubenswrapper[5004]: I1201 09:22:55.099436 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kcbjd"] Dec 01 09:22:55 crc kubenswrapper[5004]: I1201 09:22:55.109727 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-kcbjd"] Dec 01 09:22:55 crc kubenswrapper[5004]: I1201 09:22:55.126222 5004 scope.go:117] "RemoveContainer" containerID="14b0d74b2e4faf14f48c1fcd64d1525a0a64c2780c2ba8fb56d33db957100619" Dec 01 09:22:55 crc kubenswrapper[5004]: I1201 09:22:55.155636 5004 scope.go:117] "RemoveContainer" containerID="2e66569ad0ca2b5b312eef1afd1dfc751b82fb2c4c17b211c6d394c95afafd42" Dec 01 09:22:55 crc kubenswrapper[5004]: E1201 09:22:55.156111 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e66569ad0ca2b5b312eef1afd1dfc751b82fb2c4c17b211c6d394c95afafd42\": container with ID starting with 2e66569ad0ca2b5b312eef1afd1dfc751b82fb2c4c17b211c6d394c95afafd42 not found: ID does not exist" containerID="2e66569ad0ca2b5b312eef1afd1dfc751b82fb2c4c17b211c6d394c95afafd42" Dec 01 09:22:55 crc kubenswrapper[5004]: I1201 
09:22:55.156143 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e66569ad0ca2b5b312eef1afd1dfc751b82fb2c4c17b211c6d394c95afafd42"} err="failed to get container status \"2e66569ad0ca2b5b312eef1afd1dfc751b82fb2c4c17b211c6d394c95afafd42\": rpc error: code = NotFound desc = could not find container \"2e66569ad0ca2b5b312eef1afd1dfc751b82fb2c4c17b211c6d394c95afafd42\": container with ID starting with 2e66569ad0ca2b5b312eef1afd1dfc751b82fb2c4c17b211c6d394c95afafd42 not found: ID does not exist" Dec 01 09:22:55 crc kubenswrapper[5004]: I1201 09:22:55.156166 5004 scope.go:117] "RemoveContainer" containerID="18ac02a2e54cb366554417dea9eca1cc5f48685da6a515660f47ee32d7162ebc" Dec 01 09:22:55 crc kubenswrapper[5004]: E1201 09:22:55.156430 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18ac02a2e54cb366554417dea9eca1cc5f48685da6a515660f47ee32d7162ebc\": container with ID starting with 18ac02a2e54cb366554417dea9eca1cc5f48685da6a515660f47ee32d7162ebc not found: ID does not exist" containerID="18ac02a2e54cb366554417dea9eca1cc5f48685da6a515660f47ee32d7162ebc" Dec 01 09:22:55 crc kubenswrapper[5004]: I1201 09:22:55.156469 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18ac02a2e54cb366554417dea9eca1cc5f48685da6a515660f47ee32d7162ebc"} err="failed to get container status \"18ac02a2e54cb366554417dea9eca1cc5f48685da6a515660f47ee32d7162ebc\": rpc error: code = NotFound desc = could not find container \"18ac02a2e54cb366554417dea9eca1cc5f48685da6a515660f47ee32d7162ebc\": container with ID starting with 18ac02a2e54cb366554417dea9eca1cc5f48685da6a515660f47ee32d7162ebc not found: ID does not exist" Dec 01 09:22:55 crc kubenswrapper[5004]: I1201 09:22:55.156482 5004 scope.go:117] "RemoveContainer" containerID="14b0d74b2e4faf14f48c1fcd64d1525a0a64c2780c2ba8fb56d33db957100619" Dec 01 09:22:55 crc 
kubenswrapper[5004]: E1201 09:22:55.156810 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14b0d74b2e4faf14f48c1fcd64d1525a0a64c2780c2ba8fb56d33db957100619\": container with ID starting with 14b0d74b2e4faf14f48c1fcd64d1525a0a64c2780c2ba8fb56d33db957100619 not found: ID does not exist" containerID="14b0d74b2e4faf14f48c1fcd64d1525a0a64c2780c2ba8fb56d33db957100619" Dec 01 09:22:55 crc kubenswrapper[5004]: I1201 09:22:55.156835 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14b0d74b2e4faf14f48c1fcd64d1525a0a64c2780c2ba8fb56d33db957100619"} err="failed to get container status \"14b0d74b2e4faf14f48c1fcd64d1525a0a64c2780c2ba8fb56d33db957100619\": rpc error: code = NotFound desc = could not find container \"14b0d74b2e4faf14f48c1fcd64d1525a0a64c2780c2ba8fb56d33db957100619\": container with ID starting with 14b0d74b2e4faf14f48c1fcd64d1525a0a64c2780c2ba8fb56d33db957100619 not found: ID does not exist" Dec 01 09:22:55 crc kubenswrapper[5004]: I1201 09:22:55.758943 5004 scope.go:117] "RemoveContainer" containerID="ddad648297bc943586134e821ef858ef63f97885bd3e0068a2a2edd240b88618" Dec 01 09:22:55 crc kubenswrapper[5004]: E1201 09:22:55.759517 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 09:22:56 crc kubenswrapper[5004]: I1201 09:22:56.772281 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="086fb2f4-8bcb-425e-b6ae-dc3e760c24fb" path="/var/lib/kubelet/pods/086fb2f4-8bcb-425e-b6ae-dc3e760c24fb/volumes" Dec 01 09:23:10 crc 
kubenswrapper[5004]: I1201 09:23:10.760577 5004 scope.go:117] "RemoveContainer" containerID="ddad648297bc943586134e821ef858ef63f97885bd3e0068a2a2edd240b88618" Dec 01 09:23:10 crc kubenswrapper[5004]: E1201 09:23:10.761255 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 09:23:21 crc kubenswrapper[5004]: I1201 09:23:21.758917 5004 scope.go:117] "RemoveContainer" containerID="ddad648297bc943586134e821ef858ef63f97885bd3e0068a2a2edd240b88618" Dec 01 09:23:21 crc kubenswrapper[5004]: E1201 09:23:21.759790 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 09:23:29 crc kubenswrapper[5004]: I1201 09:23:29.704824 5004 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-galera-0" podUID="04a6dd3a-f297-40b9-b480-0239383b9460" containerName="galera" probeResult="failure" output="command timed out" Dec 01 09:23:29 crc kubenswrapper[5004]: I1201 09:23:29.705517 5004 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="04a6dd3a-f297-40b9-b480-0239383b9460" containerName="galera" probeResult="failure" output="command timed out" Dec 01 09:23:36 crc kubenswrapper[5004]: I1201 09:23:36.759473 5004 scope.go:117] "RemoveContainer" 
containerID="ddad648297bc943586134e821ef858ef63f97885bd3e0068a2a2edd240b88618" Dec 01 09:23:36 crc kubenswrapper[5004]: E1201 09:23:36.760296 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 09:23:51 crc kubenswrapper[5004]: I1201 09:23:51.759709 5004 scope.go:117] "RemoveContainer" containerID="ddad648297bc943586134e821ef858ef63f97885bd3e0068a2a2edd240b88618" Dec 01 09:23:51 crc kubenswrapper[5004]: E1201 09:23:51.760925 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 09:24:02 crc kubenswrapper[5004]: I1201 09:24:02.768479 5004 scope.go:117] "RemoveContainer" containerID="ddad648297bc943586134e821ef858ef63f97885bd3e0068a2a2edd240b88618" Dec 01 09:24:02 crc kubenswrapper[5004]: E1201 09:24:02.769482 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 09:24:15 crc kubenswrapper[5004]: I1201 09:24:15.758908 5004 scope.go:117] 
"RemoveContainer" containerID="ddad648297bc943586134e821ef858ef63f97885bd3e0068a2a2edd240b88618" Dec 01 09:24:15 crc kubenswrapper[5004]: E1201 09:24:15.759759 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 09:24:28 crc kubenswrapper[5004]: I1201 09:24:28.759786 5004 scope.go:117] "RemoveContainer" containerID="ddad648297bc943586134e821ef858ef63f97885bd3e0068a2a2edd240b88618" Dec 01 09:24:28 crc kubenswrapper[5004]: E1201 09:24:28.762437 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 09:24:43 crc kubenswrapper[5004]: I1201 09:24:43.758974 5004 scope.go:117] "RemoveContainer" containerID="ddad648297bc943586134e821ef858ef63f97885bd3e0068a2a2edd240b88618" Dec 01 09:24:43 crc kubenswrapper[5004]: E1201 09:24:43.760044 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 09:24:55 crc kubenswrapper[5004]: I1201 09:24:55.759373 
5004 scope.go:117] "RemoveContainer" containerID="ddad648297bc943586134e821ef858ef63f97885bd3e0068a2a2edd240b88618" Dec 01 09:24:55 crc kubenswrapper[5004]: E1201 09:24:55.760195 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 09:25:06 crc kubenswrapper[5004]: I1201 09:25:06.760173 5004 scope.go:117] "RemoveContainer" containerID="ddad648297bc943586134e821ef858ef63f97885bd3e0068a2a2edd240b88618" Dec 01 09:25:06 crc kubenswrapper[5004]: E1201 09:25:06.761090 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 09:25:07 crc kubenswrapper[5004]: I1201 09:25:07.770748 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-4j42r"] Dec 01 09:25:07 crc kubenswrapper[5004]: E1201 09:25:07.771683 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="086fb2f4-8bcb-425e-b6ae-dc3e760c24fb" containerName="extract-content" Dec 01 09:25:07 crc kubenswrapper[5004]: I1201 09:25:07.771701 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="086fb2f4-8bcb-425e-b6ae-dc3e760c24fb" containerName="extract-content" Dec 01 09:25:07 crc kubenswrapper[5004]: E1201 09:25:07.771749 5004 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="086fb2f4-8bcb-425e-b6ae-dc3e760c24fb" containerName="registry-server" Dec 01 09:25:07 crc kubenswrapper[5004]: I1201 09:25:07.771758 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="086fb2f4-8bcb-425e-b6ae-dc3e760c24fb" containerName="registry-server" Dec 01 09:25:07 crc kubenswrapper[5004]: E1201 09:25:07.771790 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="086fb2f4-8bcb-425e-b6ae-dc3e760c24fb" containerName="extract-utilities" Dec 01 09:25:07 crc kubenswrapper[5004]: I1201 09:25:07.771799 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="086fb2f4-8bcb-425e-b6ae-dc3e760c24fb" containerName="extract-utilities" Dec 01 09:25:07 crc kubenswrapper[5004]: I1201 09:25:07.772314 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="086fb2f4-8bcb-425e-b6ae-dc3e760c24fb" containerName="registry-server" Dec 01 09:25:07 crc kubenswrapper[5004]: I1201 09:25:07.775343 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4j42r" Dec 01 09:25:07 crc kubenswrapper[5004]: I1201 09:25:07.793250 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4j42r"] Dec 01 09:25:07 crc kubenswrapper[5004]: I1201 09:25:07.880991 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b9a10fb-aabb-45e5-b0ce-156df39ce402-utilities\") pod \"community-operators-4j42r\" (UID: \"6b9a10fb-aabb-45e5-b0ce-156df39ce402\") " pod="openshift-marketplace/community-operators-4j42r" Dec 01 09:25:07 crc kubenswrapper[5004]: I1201 09:25:07.881073 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqgc5\" (UniqueName: \"kubernetes.io/projected/6b9a10fb-aabb-45e5-b0ce-156df39ce402-kube-api-access-mqgc5\") pod \"community-operators-4j42r\" (UID: 
\"6b9a10fb-aabb-45e5-b0ce-156df39ce402\") " pod="openshift-marketplace/community-operators-4j42r" Dec 01 09:25:07 crc kubenswrapper[5004]: I1201 09:25:07.881252 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b9a10fb-aabb-45e5-b0ce-156df39ce402-catalog-content\") pod \"community-operators-4j42r\" (UID: \"6b9a10fb-aabb-45e5-b0ce-156df39ce402\") " pod="openshift-marketplace/community-operators-4j42r" Dec 01 09:25:07 crc kubenswrapper[5004]: I1201 09:25:07.983865 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b9a10fb-aabb-45e5-b0ce-156df39ce402-utilities\") pod \"community-operators-4j42r\" (UID: \"6b9a10fb-aabb-45e5-b0ce-156df39ce402\") " pod="openshift-marketplace/community-operators-4j42r" Dec 01 09:25:07 crc kubenswrapper[5004]: I1201 09:25:07.983932 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqgc5\" (UniqueName: \"kubernetes.io/projected/6b9a10fb-aabb-45e5-b0ce-156df39ce402-kube-api-access-mqgc5\") pod \"community-operators-4j42r\" (UID: \"6b9a10fb-aabb-45e5-b0ce-156df39ce402\") " pod="openshift-marketplace/community-operators-4j42r" Dec 01 09:25:07 crc kubenswrapper[5004]: I1201 09:25:07.983993 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b9a10fb-aabb-45e5-b0ce-156df39ce402-catalog-content\") pod \"community-operators-4j42r\" (UID: \"6b9a10fb-aabb-45e5-b0ce-156df39ce402\") " pod="openshift-marketplace/community-operators-4j42r" Dec 01 09:25:07 crc kubenswrapper[5004]: I1201 09:25:07.984392 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b9a10fb-aabb-45e5-b0ce-156df39ce402-utilities\") pod \"community-operators-4j42r\" (UID: 
\"6b9a10fb-aabb-45e5-b0ce-156df39ce402\") " pod="openshift-marketplace/community-operators-4j42r" Dec 01 09:25:07 crc kubenswrapper[5004]: I1201 09:25:07.984507 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b9a10fb-aabb-45e5-b0ce-156df39ce402-catalog-content\") pod \"community-operators-4j42r\" (UID: \"6b9a10fb-aabb-45e5-b0ce-156df39ce402\") " pod="openshift-marketplace/community-operators-4j42r" Dec 01 09:25:08 crc kubenswrapper[5004]: I1201 09:25:08.005259 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqgc5\" (UniqueName: \"kubernetes.io/projected/6b9a10fb-aabb-45e5-b0ce-156df39ce402-kube-api-access-mqgc5\") pod \"community-operators-4j42r\" (UID: \"6b9a10fb-aabb-45e5-b0ce-156df39ce402\") " pod="openshift-marketplace/community-operators-4j42r" Dec 01 09:25:08 crc kubenswrapper[5004]: I1201 09:25:08.102969 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4j42r" Dec 01 09:25:08 crc kubenswrapper[5004]: I1201 09:25:08.715902 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4j42r"] Dec 01 09:25:08 crc kubenswrapper[5004]: W1201 09:25:08.721039 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b9a10fb_aabb_45e5_b0ce_156df39ce402.slice/crio-cc551b7f9d232081826f35c59e439be11f1a484b0a6ef502d9ecb6c5b657d5fe WatchSource:0}: Error finding container cc551b7f9d232081826f35c59e439be11f1a484b0a6ef502d9ecb6c5b657d5fe: Status 404 returned error can't find the container with id cc551b7f9d232081826f35c59e439be11f1a484b0a6ef502d9ecb6c5b657d5fe Dec 01 09:25:09 crc kubenswrapper[5004]: I1201 09:25:09.618985 5004 generic.go:334] "Generic (PLEG): container finished" podID="6b9a10fb-aabb-45e5-b0ce-156df39ce402" containerID="fe72fb65d7413bdf9bca28ec3de5360a4885dc8f516716a3ff12c42f7ce84866" exitCode=0 Dec 01 09:25:09 crc kubenswrapper[5004]: I1201 09:25:09.619047 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4j42r" event={"ID":"6b9a10fb-aabb-45e5-b0ce-156df39ce402","Type":"ContainerDied","Data":"fe72fb65d7413bdf9bca28ec3de5360a4885dc8f516716a3ff12c42f7ce84866"} Dec 01 09:25:09 crc kubenswrapper[5004]: I1201 09:25:09.619222 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4j42r" event={"ID":"6b9a10fb-aabb-45e5-b0ce-156df39ce402","Type":"ContainerStarted","Data":"cc551b7f9d232081826f35c59e439be11f1a484b0a6ef502d9ecb6c5b657d5fe"} Dec 01 09:25:20 crc kubenswrapper[5004]: I1201 09:25:20.759333 5004 scope.go:117] "RemoveContainer" containerID="ddad648297bc943586134e821ef858ef63f97885bd3e0068a2a2edd240b88618" Dec 01 09:25:20 crc kubenswrapper[5004]: E1201 09:25:20.760944 5004 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 09:25:22 crc kubenswrapper[5004]: I1201 09:25:22.799266 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4j42r" event={"ID":"6b9a10fb-aabb-45e5-b0ce-156df39ce402","Type":"ContainerStarted","Data":"366879b7726aead9bba48ee40424f3e32435a45487d1e21eb6fd36b53c614949"} Dec 01 09:25:23 crc kubenswrapper[5004]: I1201 09:25:23.817162 5004 generic.go:334] "Generic (PLEG): container finished" podID="6b9a10fb-aabb-45e5-b0ce-156df39ce402" containerID="366879b7726aead9bba48ee40424f3e32435a45487d1e21eb6fd36b53c614949" exitCode=0 Dec 01 09:25:23 crc kubenswrapper[5004]: I1201 09:25:23.817278 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4j42r" event={"ID":"6b9a10fb-aabb-45e5-b0ce-156df39ce402","Type":"ContainerDied","Data":"366879b7726aead9bba48ee40424f3e32435a45487d1e21eb6fd36b53c614949"} Dec 01 09:25:24 crc kubenswrapper[5004]: I1201 09:25:24.841973 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4j42r" event={"ID":"6b9a10fb-aabb-45e5-b0ce-156df39ce402","Type":"ContainerStarted","Data":"b80e971735c96627c0ed3eb279d515ff1a307e7a2ea61a6e6b2b686fb9d9fc9f"} Dec 01 09:25:24 crc kubenswrapper[5004]: I1201 09:25:24.872154 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-4j42r" podStartSLOduration=3.121768474 podStartE2EDuration="17.872128878s" podCreationTimestamp="2025-12-01 09:25:07 +0000 UTC" firstStartedPulling="2025-12-01 09:25:09.621080605 +0000 UTC m=+4087.186072597" 
lastFinishedPulling="2025-12-01 09:25:24.371441009 +0000 UTC m=+4101.936433001" observedRunningTime="2025-12-01 09:25:24.863206171 +0000 UTC m=+4102.428198173" watchObservedRunningTime="2025-12-01 09:25:24.872128878 +0000 UTC m=+4102.437120870" Dec 01 09:25:28 crc kubenswrapper[5004]: I1201 09:25:28.103091 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-4j42r" Dec 01 09:25:28 crc kubenswrapper[5004]: I1201 09:25:28.103498 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-4j42r" Dec 01 09:25:29 crc kubenswrapper[5004]: I1201 09:25:29.582922 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-4j42r" Dec 01 09:25:29 crc kubenswrapper[5004]: I1201 09:25:29.788006 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-4j42r" Dec 01 09:25:29 crc kubenswrapper[5004]: I1201 09:25:29.954044 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4j42r"] Dec 01 09:25:30 crc kubenswrapper[5004]: I1201 09:25:30.005817 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lzmgh"] Dec 01 09:25:30 crc kubenswrapper[5004]: I1201 09:25:30.006058 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-lzmgh" podUID="df757840-7c38-4de3-829b-759182d9c96d" containerName="registry-server" containerID="cri-o://782332edccb146590f6c58fc9817f1c50c21ec328fdd227aeb22e7e514d4be6e" gracePeriod=2 Dec 01 09:25:30 crc kubenswrapper[5004]: E1201 09:25:30.586655 5004 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 
782332edccb146590f6c58fc9817f1c50c21ec328fdd227aeb22e7e514d4be6e is running failed: container process not found" containerID="782332edccb146590f6c58fc9817f1c50c21ec328fdd227aeb22e7e514d4be6e" cmd=["grpc_health_probe","-addr=:50051"] Dec 01 09:25:30 crc kubenswrapper[5004]: E1201 09:25:30.588780 5004 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 782332edccb146590f6c58fc9817f1c50c21ec328fdd227aeb22e7e514d4be6e is running failed: container process not found" containerID="782332edccb146590f6c58fc9817f1c50c21ec328fdd227aeb22e7e514d4be6e" cmd=["grpc_health_probe","-addr=:50051"] Dec 01 09:25:30 crc kubenswrapper[5004]: E1201 09:25:30.589152 5004 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 782332edccb146590f6c58fc9817f1c50c21ec328fdd227aeb22e7e514d4be6e is running failed: container process not found" containerID="782332edccb146590f6c58fc9817f1c50c21ec328fdd227aeb22e7e514d4be6e" cmd=["grpc_health_probe","-addr=:50051"] Dec 01 09:25:30 crc kubenswrapper[5004]: E1201 09:25:30.589180 5004 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 782332edccb146590f6c58fc9817f1c50c21ec328fdd227aeb22e7e514d4be6e is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/community-operators-lzmgh" podUID="df757840-7c38-4de3-829b-759182d9c96d" containerName="registry-server" Dec 01 09:25:30 crc kubenswrapper[5004]: I1201 09:25:30.629960 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lzmgh" Dec 01 09:25:30 crc kubenswrapper[5004]: I1201 09:25:30.710804 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9nk2k\" (UniqueName: \"kubernetes.io/projected/df757840-7c38-4de3-829b-759182d9c96d-kube-api-access-9nk2k\") pod \"df757840-7c38-4de3-829b-759182d9c96d\" (UID: \"df757840-7c38-4de3-829b-759182d9c96d\") " Dec 01 09:25:30 crc kubenswrapper[5004]: I1201 09:25:30.710995 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df757840-7c38-4de3-829b-759182d9c96d-catalog-content\") pod \"df757840-7c38-4de3-829b-759182d9c96d\" (UID: \"df757840-7c38-4de3-829b-759182d9c96d\") " Dec 01 09:25:30 crc kubenswrapper[5004]: I1201 09:25:30.711091 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df757840-7c38-4de3-829b-759182d9c96d-utilities\") pod \"df757840-7c38-4de3-829b-759182d9c96d\" (UID: \"df757840-7c38-4de3-829b-759182d9c96d\") " Dec 01 09:25:30 crc kubenswrapper[5004]: I1201 09:25:30.711968 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df757840-7c38-4de3-829b-759182d9c96d-utilities" (OuterVolumeSpecName: "utilities") pod "df757840-7c38-4de3-829b-759182d9c96d" (UID: "df757840-7c38-4de3-829b-759182d9c96d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:25:30 crc kubenswrapper[5004]: I1201 09:25:30.719752 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df757840-7c38-4de3-829b-759182d9c96d-kube-api-access-9nk2k" (OuterVolumeSpecName: "kube-api-access-9nk2k") pod "df757840-7c38-4de3-829b-759182d9c96d" (UID: "df757840-7c38-4de3-829b-759182d9c96d"). InnerVolumeSpecName "kube-api-access-9nk2k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:25:30 crc kubenswrapper[5004]: I1201 09:25:30.774662 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df757840-7c38-4de3-829b-759182d9c96d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "df757840-7c38-4de3-829b-759182d9c96d" (UID: "df757840-7c38-4de3-829b-759182d9c96d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:25:30 crc kubenswrapper[5004]: I1201 09:25:30.814032 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9nk2k\" (UniqueName: \"kubernetes.io/projected/df757840-7c38-4de3-829b-759182d9c96d-kube-api-access-9nk2k\") on node \"crc\" DevicePath \"\"" Dec 01 09:25:30 crc kubenswrapper[5004]: I1201 09:25:30.814076 5004 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df757840-7c38-4de3-829b-759182d9c96d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 09:25:30 crc kubenswrapper[5004]: I1201 09:25:30.814088 5004 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df757840-7c38-4de3-829b-759182d9c96d-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 09:25:30 crc kubenswrapper[5004]: I1201 09:25:30.902472 5004 generic.go:334] "Generic (PLEG): container finished" podID="df757840-7c38-4de3-829b-759182d9c96d" containerID="782332edccb146590f6c58fc9817f1c50c21ec328fdd227aeb22e7e514d4be6e" exitCode=0 Dec 01 09:25:30 crc kubenswrapper[5004]: I1201 09:25:30.903530 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lzmgh" Dec 01 09:25:30 crc kubenswrapper[5004]: I1201 09:25:30.904201 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lzmgh" event={"ID":"df757840-7c38-4de3-829b-759182d9c96d","Type":"ContainerDied","Data":"782332edccb146590f6c58fc9817f1c50c21ec328fdd227aeb22e7e514d4be6e"} Dec 01 09:25:30 crc kubenswrapper[5004]: I1201 09:25:30.904259 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lzmgh" event={"ID":"df757840-7c38-4de3-829b-759182d9c96d","Type":"ContainerDied","Data":"87ced3db20294ef1bfcc68fec4c0abdbddf46a0b0d06dd530137bc0c2cb75cda"} Dec 01 09:25:30 crc kubenswrapper[5004]: I1201 09:25:30.904280 5004 scope.go:117] "RemoveContainer" containerID="782332edccb146590f6c58fc9817f1c50c21ec328fdd227aeb22e7e514d4be6e" Dec 01 09:25:30 crc kubenswrapper[5004]: I1201 09:25:30.930908 5004 scope.go:117] "RemoveContainer" containerID="5001cfd699548e88f27495bae9358b98a0dacac724da0da5679228e94705f0e6" Dec 01 09:25:30 crc kubenswrapper[5004]: I1201 09:25:30.941551 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lzmgh"] Dec 01 09:25:30 crc kubenswrapper[5004]: I1201 09:25:30.952209 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-lzmgh"] Dec 01 09:25:30 crc kubenswrapper[5004]: I1201 09:25:30.959531 5004 scope.go:117] "RemoveContainer" containerID="d8107835040e63c1bba04c5b9332fed22616c141a47453aa447fa8aa99a06177" Dec 01 09:25:31 crc kubenswrapper[5004]: I1201 09:25:31.035552 5004 scope.go:117] "RemoveContainer" containerID="782332edccb146590f6c58fc9817f1c50c21ec328fdd227aeb22e7e514d4be6e" Dec 01 09:25:31 crc kubenswrapper[5004]: E1201 09:25:31.036125 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"782332edccb146590f6c58fc9817f1c50c21ec328fdd227aeb22e7e514d4be6e\": container with ID starting with 782332edccb146590f6c58fc9817f1c50c21ec328fdd227aeb22e7e514d4be6e not found: ID does not exist" containerID="782332edccb146590f6c58fc9817f1c50c21ec328fdd227aeb22e7e514d4be6e" Dec 01 09:25:31 crc kubenswrapper[5004]: I1201 09:25:31.036198 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"782332edccb146590f6c58fc9817f1c50c21ec328fdd227aeb22e7e514d4be6e"} err="failed to get container status \"782332edccb146590f6c58fc9817f1c50c21ec328fdd227aeb22e7e514d4be6e\": rpc error: code = NotFound desc = could not find container \"782332edccb146590f6c58fc9817f1c50c21ec328fdd227aeb22e7e514d4be6e\": container with ID starting with 782332edccb146590f6c58fc9817f1c50c21ec328fdd227aeb22e7e514d4be6e not found: ID does not exist" Dec 01 09:25:31 crc kubenswrapper[5004]: I1201 09:25:31.036254 5004 scope.go:117] "RemoveContainer" containerID="5001cfd699548e88f27495bae9358b98a0dacac724da0da5679228e94705f0e6" Dec 01 09:25:31 crc kubenswrapper[5004]: E1201 09:25:31.036680 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5001cfd699548e88f27495bae9358b98a0dacac724da0da5679228e94705f0e6\": container with ID starting with 5001cfd699548e88f27495bae9358b98a0dacac724da0da5679228e94705f0e6 not found: ID does not exist" containerID="5001cfd699548e88f27495bae9358b98a0dacac724da0da5679228e94705f0e6" Dec 01 09:25:31 crc kubenswrapper[5004]: I1201 09:25:31.036714 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5001cfd699548e88f27495bae9358b98a0dacac724da0da5679228e94705f0e6"} err="failed to get container status \"5001cfd699548e88f27495bae9358b98a0dacac724da0da5679228e94705f0e6\": rpc error: code = NotFound desc = could not find container \"5001cfd699548e88f27495bae9358b98a0dacac724da0da5679228e94705f0e6\": container with ID 
starting with 5001cfd699548e88f27495bae9358b98a0dacac724da0da5679228e94705f0e6 not found: ID does not exist" Dec 01 09:25:31 crc kubenswrapper[5004]: I1201 09:25:31.036734 5004 scope.go:117] "RemoveContainer" containerID="d8107835040e63c1bba04c5b9332fed22616c141a47453aa447fa8aa99a06177" Dec 01 09:25:31 crc kubenswrapper[5004]: E1201 09:25:31.037011 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d8107835040e63c1bba04c5b9332fed22616c141a47453aa447fa8aa99a06177\": container with ID starting with d8107835040e63c1bba04c5b9332fed22616c141a47453aa447fa8aa99a06177 not found: ID does not exist" containerID="d8107835040e63c1bba04c5b9332fed22616c141a47453aa447fa8aa99a06177" Dec 01 09:25:31 crc kubenswrapper[5004]: I1201 09:25:31.037046 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8107835040e63c1bba04c5b9332fed22616c141a47453aa447fa8aa99a06177"} err="failed to get container status \"d8107835040e63c1bba04c5b9332fed22616c141a47453aa447fa8aa99a06177\": rpc error: code = NotFound desc = could not find container \"d8107835040e63c1bba04c5b9332fed22616c141a47453aa447fa8aa99a06177\": container with ID starting with d8107835040e63c1bba04c5b9332fed22616c141a47453aa447fa8aa99a06177 not found: ID does not exist" Dec 01 09:25:32 crc kubenswrapper[5004]: I1201 09:25:32.772288 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df757840-7c38-4de3-829b-759182d9c96d" path="/var/lib/kubelet/pods/df757840-7c38-4de3-829b-759182d9c96d/volumes" Dec 01 09:25:34 crc kubenswrapper[5004]: I1201 09:25:34.760041 5004 scope.go:117] "RemoveContainer" containerID="ddad648297bc943586134e821ef858ef63f97885bd3e0068a2a2edd240b88618" Dec 01 09:25:34 crc kubenswrapper[5004]: E1201 09:25:34.760713 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 09:25:46 crc kubenswrapper[5004]: I1201 09:25:46.758923 5004 scope.go:117] "RemoveContainer" containerID="ddad648297bc943586134e821ef858ef63f97885bd3e0068a2a2edd240b88618" Dec 01 09:25:48 crc kubenswrapper[5004]: I1201 09:25:48.132500 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" event={"ID":"f9977ebb-82de-4e96-8763-0b5a84f8d4ce","Type":"ContainerStarted","Data":"dbac988da5fadc05b09a91df3f1891ebf32218595972382befb5aa0753662a68"} Dec 01 09:28:08 crc kubenswrapper[5004]: I1201 09:28:08.729730 5004 patch_prober.go:28] interesting pod/machine-config-daemon-fvdgt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 09:28:08 crc kubenswrapper[5004]: I1201 09:28:08.730592 5004 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 09:28:38 crc kubenswrapper[5004]: I1201 09:28:38.729754 5004 patch_prober.go:28] interesting pod/machine-config-daemon-fvdgt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 09:28:38 crc kubenswrapper[5004]: I1201 09:28:38.730384 5004 prober.go:107] "Probe failed" 
probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 09:29:08 crc kubenswrapper[5004]: I1201 09:29:08.729184 5004 patch_prober.go:28] interesting pod/machine-config-daemon-fvdgt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 09:29:08 crc kubenswrapper[5004]: I1201 09:29:08.729776 5004 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 09:29:08 crc kubenswrapper[5004]: I1201 09:29:08.729821 5004 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" Dec 01 09:29:08 crc kubenswrapper[5004]: I1201 09:29:08.730761 5004 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"dbac988da5fadc05b09a91df3f1891ebf32218595972382befb5aa0753662a68"} pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 09:29:08 crc kubenswrapper[5004]: I1201 09:29:08.730810 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" containerName="machine-config-daemon" 
containerID="cri-o://dbac988da5fadc05b09a91df3f1891ebf32218595972382befb5aa0753662a68" gracePeriod=600 Dec 01 09:29:09 crc kubenswrapper[5004]: I1201 09:29:09.648066 5004 generic.go:334] "Generic (PLEG): container finished" podID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" containerID="dbac988da5fadc05b09a91df3f1891ebf32218595972382befb5aa0753662a68" exitCode=0 Dec 01 09:29:09 crc kubenswrapper[5004]: I1201 09:29:09.648200 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" event={"ID":"f9977ebb-82de-4e96-8763-0b5a84f8d4ce","Type":"ContainerDied","Data":"dbac988da5fadc05b09a91df3f1891ebf32218595972382befb5aa0753662a68"} Dec 01 09:29:09 crc kubenswrapper[5004]: I1201 09:29:09.648864 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" event={"ID":"f9977ebb-82de-4e96-8763-0b5a84f8d4ce","Type":"ContainerStarted","Data":"ccf28834b75e460fde85109220042abb807e53b91adea0a0b179d3e7961709b8"} Dec 01 09:29:09 crc kubenswrapper[5004]: I1201 09:29:09.648899 5004 scope.go:117] "RemoveContainer" containerID="ddad648297bc943586134e821ef858ef63f97885bd3e0068a2a2edd240b88618" Dec 01 09:30:00 crc kubenswrapper[5004]: I1201 09:30:00.196706 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409690-lhs92"] Dec 01 09:30:00 crc kubenswrapper[5004]: E1201 09:30:00.197935 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df757840-7c38-4de3-829b-759182d9c96d" containerName="extract-utilities" Dec 01 09:30:00 crc kubenswrapper[5004]: I1201 09:30:00.197956 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="df757840-7c38-4de3-829b-759182d9c96d" containerName="extract-utilities" Dec 01 09:30:00 crc kubenswrapper[5004]: E1201 09:30:00.198032 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df757840-7c38-4de3-829b-759182d9c96d" 
containerName="registry-server" Dec 01 09:30:00 crc kubenswrapper[5004]: I1201 09:30:00.198041 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="df757840-7c38-4de3-829b-759182d9c96d" containerName="registry-server" Dec 01 09:30:00 crc kubenswrapper[5004]: E1201 09:30:00.198056 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df757840-7c38-4de3-829b-759182d9c96d" containerName="extract-content" Dec 01 09:30:00 crc kubenswrapper[5004]: I1201 09:30:00.198065 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="df757840-7c38-4de3-829b-759182d9c96d" containerName="extract-content" Dec 01 09:30:00 crc kubenswrapper[5004]: I1201 09:30:00.198341 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="df757840-7c38-4de3-829b-759182d9c96d" containerName="registry-server" Dec 01 09:30:00 crc kubenswrapper[5004]: I1201 09:30:00.199477 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409690-lhs92" Dec 01 09:30:00 crc kubenswrapper[5004]: I1201 09:30:00.202003 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 01 09:30:00 crc kubenswrapper[5004]: I1201 09:30:00.204695 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 01 09:30:00 crc kubenswrapper[5004]: I1201 09:30:00.209154 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409690-lhs92"] Dec 01 09:30:00 crc kubenswrapper[5004]: I1201 09:30:00.347119 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f6bb02d9-7a7e-490f-97fa-aeae3427ca39-config-volume\") pod \"collect-profiles-29409690-lhs92\" (UID: \"f6bb02d9-7a7e-490f-97fa-aeae3427ca39\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29409690-lhs92" Dec 01 09:30:00 crc kubenswrapper[5004]: I1201 09:30:00.347524 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f6bb02d9-7a7e-490f-97fa-aeae3427ca39-secret-volume\") pod \"collect-profiles-29409690-lhs92\" (UID: \"f6bb02d9-7a7e-490f-97fa-aeae3427ca39\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409690-lhs92" Dec 01 09:30:00 crc kubenswrapper[5004]: I1201 09:30:00.347636 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnbvp\" (UniqueName: \"kubernetes.io/projected/f6bb02d9-7a7e-490f-97fa-aeae3427ca39-kube-api-access-bnbvp\") pod \"collect-profiles-29409690-lhs92\" (UID: \"f6bb02d9-7a7e-490f-97fa-aeae3427ca39\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409690-lhs92" Dec 01 09:30:00 crc kubenswrapper[5004]: I1201 09:30:00.449678 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bnbvp\" (UniqueName: \"kubernetes.io/projected/f6bb02d9-7a7e-490f-97fa-aeae3427ca39-kube-api-access-bnbvp\") pod \"collect-profiles-29409690-lhs92\" (UID: \"f6bb02d9-7a7e-490f-97fa-aeae3427ca39\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409690-lhs92" Dec 01 09:30:00 crc kubenswrapper[5004]: I1201 09:30:00.449857 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f6bb02d9-7a7e-490f-97fa-aeae3427ca39-config-volume\") pod \"collect-profiles-29409690-lhs92\" (UID: \"f6bb02d9-7a7e-490f-97fa-aeae3427ca39\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409690-lhs92" Dec 01 09:30:00 crc kubenswrapper[5004]: I1201 09:30:00.449973 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/f6bb02d9-7a7e-490f-97fa-aeae3427ca39-secret-volume\") pod \"collect-profiles-29409690-lhs92\" (UID: \"f6bb02d9-7a7e-490f-97fa-aeae3427ca39\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409690-lhs92" Dec 01 09:30:00 crc kubenswrapper[5004]: I1201 09:30:00.451041 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f6bb02d9-7a7e-490f-97fa-aeae3427ca39-config-volume\") pod \"collect-profiles-29409690-lhs92\" (UID: \"f6bb02d9-7a7e-490f-97fa-aeae3427ca39\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409690-lhs92" Dec 01 09:30:00 crc kubenswrapper[5004]: I1201 09:30:00.459636 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f6bb02d9-7a7e-490f-97fa-aeae3427ca39-secret-volume\") pod \"collect-profiles-29409690-lhs92\" (UID: \"f6bb02d9-7a7e-490f-97fa-aeae3427ca39\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409690-lhs92" Dec 01 09:30:00 crc kubenswrapper[5004]: I1201 09:30:00.472411 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bnbvp\" (UniqueName: \"kubernetes.io/projected/f6bb02d9-7a7e-490f-97fa-aeae3427ca39-kube-api-access-bnbvp\") pod \"collect-profiles-29409690-lhs92\" (UID: \"f6bb02d9-7a7e-490f-97fa-aeae3427ca39\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409690-lhs92" Dec 01 09:30:00 crc kubenswrapper[5004]: I1201 09:30:00.533744 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409690-lhs92" Dec 01 09:30:01 crc kubenswrapper[5004]: I1201 09:30:01.010447 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409690-lhs92"] Dec 01 09:30:01 crc kubenswrapper[5004]: I1201 09:30:01.341463 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409690-lhs92" event={"ID":"f6bb02d9-7a7e-490f-97fa-aeae3427ca39","Type":"ContainerStarted","Data":"a54a54e20a91b5a63a79b0d6a4171a7dc1878a4917468c4d3e43a88147ea0032"} Dec 01 09:30:01 crc kubenswrapper[5004]: I1201 09:30:01.341859 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409690-lhs92" event={"ID":"f6bb02d9-7a7e-490f-97fa-aeae3427ca39","Type":"ContainerStarted","Data":"484321e9d60c6436455c3657e5dd6b2d05bc1b19762483540ca2ac80250c7434"} Dec 01 09:30:01 crc kubenswrapper[5004]: I1201 09:30:01.378863 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29409690-lhs92" podStartSLOduration=1.3788418359999999 podStartE2EDuration="1.378841836s" podCreationTimestamp="2025-12-01 09:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:30:01.367694864 +0000 UTC m=+4378.932686886" watchObservedRunningTime="2025-12-01 09:30:01.378841836 +0000 UTC m=+4378.943833828" Dec 01 09:30:02 crc kubenswrapper[5004]: I1201 09:30:02.355139 5004 generic.go:334] "Generic (PLEG): container finished" podID="f6bb02d9-7a7e-490f-97fa-aeae3427ca39" containerID="a54a54e20a91b5a63a79b0d6a4171a7dc1878a4917468c4d3e43a88147ea0032" exitCode=0 Dec 01 09:30:02 crc kubenswrapper[5004]: I1201 09:30:02.355255 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29409690-lhs92" event={"ID":"f6bb02d9-7a7e-490f-97fa-aeae3427ca39","Type":"ContainerDied","Data":"a54a54e20a91b5a63a79b0d6a4171a7dc1878a4917468c4d3e43a88147ea0032"} Dec 01 09:30:03 crc kubenswrapper[5004]: I1201 09:30:03.733069 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409690-lhs92" Dec 01 09:30:03 crc kubenswrapper[5004]: I1201 09:30:03.833614 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bnbvp\" (UniqueName: \"kubernetes.io/projected/f6bb02d9-7a7e-490f-97fa-aeae3427ca39-kube-api-access-bnbvp\") pod \"f6bb02d9-7a7e-490f-97fa-aeae3427ca39\" (UID: \"f6bb02d9-7a7e-490f-97fa-aeae3427ca39\") " Dec 01 09:30:03 crc kubenswrapper[5004]: I1201 09:30:03.833683 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f6bb02d9-7a7e-490f-97fa-aeae3427ca39-secret-volume\") pod \"f6bb02d9-7a7e-490f-97fa-aeae3427ca39\" (UID: \"f6bb02d9-7a7e-490f-97fa-aeae3427ca39\") " Dec 01 09:30:03 crc kubenswrapper[5004]: I1201 09:30:03.833749 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f6bb02d9-7a7e-490f-97fa-aeae3427ca39-config-volume\") pod \"f6bb02d9-7a7e-490f-97fa-aeae3427ca39\" (UID: \"f6bb02d9-7a7e-490f-97fa-aeae3427ca39\") " Dec 01 09:30:03 crc kubenswrapper[5004]: I1201 09:30:03.834642 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6bb02d9-7a7e-490f-97fa-aeae3427ca39-config-volume" (OuterVolumeSpecName: "config-volume") pod "f6bb02d9-7a7e-490f-97fa-aeae3427ca39" (UID: "f6bb02d9-7a7e-490f-97fa-aeae3427ca39"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:30:03 crc kubenswrapper[5004]: I1201 09:30:03.839624 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6bb02d9-7a7e-490f-97fa-aeae3427ca39-kube-api-access-bnbvp" (OuterVolumeSpecName: "kube-api-access-bnbvp") pod "f6bb02d9-7a7e-490f-97fa-aeae3427ca39" (UID: "f6bb02d9-7a7e-490f-97fa-aeae3427ca39"). InnerVolumeSpecName "kube-api-access-bnbvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:30:03 crc kubenswrapper[5004]: I1201 09:30:03.839650 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6bb02d9-7a7e-490f-97fa-aeae3427ca39-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "f6bb02d9-7a7e-490f-97fa-aeae3427ca39" (UID: "f6bb02d9-7a7e-490f-97fa-aeae3427ca39"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:30:03 crc kubenswrapper[5004]: I1201 09:30:03.938048 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bnbvp\" (UniqueName: \"kubernetes.io/projected/f6bb02d9-7a7e-490f-97fa-aeae3427ca39-kube-api-access-bnbvp\") on node \"crc\" DevicePath \"\"" Dec 01 09:30:03 crc kubenswrapper[5004]: I1201 09:30:03.938092 5004 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f6bb02d9-7a7e-490f-97fa-aeae3427ca39-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 01 09:30:03 crc kubenswrapper[5004]: I1201 09:30:03.938105 5004 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f6bb02d9-7a7e-490f-97fa-aeae3427ca39-config-volume\") on node \"crc\" DevicePath \"\"" Dec 01 09:30:04 crc kubenswrapper[5004]: I1201 09:30:04.378412 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409690-lhs92" 
event={"ID":"f6bb02d9-7a7e-490f-97fa-aeae3427ca39","Type":"ContainerDied","Data":"484321e9d60c6436455c3657e5dd6b2d05bc1b19762483540ca2ac80250c7434"} Dec 01 09:30:04 crc kubenswrapper[5004]: I1201 09:30:04.378802 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="484321e9d60c6436455c3657e5dd6b2d05bc1b19762483540ca2ac80250c7434" Dec 01 09:30:04 crc kubenswrapper[5004]: I1201 09:30:04.378548 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409690-lhs92" Dec 01 09:30:04 crc kubenswrapper[5004]: I1201 09:30:04.451148 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409645-5722b"] Dec 01 09:30:04 crc kubenswrapper[5004]: I1201 09:30:04.468578 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409645-5722b"] Dec 01 09:30:04 crc kubenswrapper[5004]: I1201 09:30:04.771952 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14e9ea25-5306-4134-8e77-dde9901fceb5" path="/var/lib/kubelet/pods/14e9ea25-5306-4134-8e77-dde9901fceb5/volumes" Dec 01 09:30:19 crc kubenswrapper[5004]: I1201 09:30:19.388385 5004 scope.go:117] "RemoveContainer" containerID="8d2f618e7d1ef58aba1a2e2c91c469e60a14830ccc83d7b1673365c261980545" Dec 01 09:31:38 crc kubenswrapper[5004]: I1201 09:31:38.729965 5004 patch_prober.go:28] interesting pod/machine-config-daemon-fvdgt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 09:31:38 crc kubenswrapper[5004]: I1201 09:31:38.730528 5004 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 09:32:08 crc kubenswrapper[5004]: I1201 09:32:08.728976 5004 patch_prober.go:28] interesting pod/machine-config-daemon-fvdgt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 09:32:08 crc kubenswrapper[5004]: I1201 09:32:08.729514 5004 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 09:32:12 crc kubenswrapper[5004]: I1201 09:32:12.591099 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-nk6vs"] Dec 01 09:32:12 crc kubenswrapper[5004]: E1201 09:32:12.592059 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6bb02d9-7a7e-490f-97fa-aeae3427ca39" containerName="collect-profiles" Dec 01 09:32:12 crc kubenswrapper[5004]: I1201 09:32:12.592073 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6bb02d9-7a7e-490f-97fa-aeae3427ca39" containerName="collect-profiles" Dec 01 09:32:12 crc kubenswrapper[5004]: I1201 09:32:12.592332 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6bb02d9-7a7e-490f-97fa-aeae3427ca39" containerName="collect-profiles" Dec 01 09:32:12 crc kubenswrapper[5004]: I1201 09:32:12.594011 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nk6vs" Dec 01 09:32:12 crc kubenswrapper[5004]: I1201 09:32:12.606618 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nk6vs"] Dec 01 09:32:12 crc kubenswrapper[5004]: I1201 09:32:12.706951 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdx6c\" (UniqueName: \"kubernetes.io/projected/ff052992-0b16-459e-afdd-bced1e60350a-kube-api-access-hdx6c\") pod \"redhat-operators-nk6vs\" (UID: \"ff052992-0b16-459e-afdd-bced1e60350a\") " pod="openshift-marketplace/redhat-operators-nk6vs" Dec 01 09:32:12 crc kubenswrapper[5004]: I1201 09:32:12.707012 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff052992-0b16-459e-afdd-bced1e60350a-catalog-content\") pod \"redhat-operators-nk6vs\" (UID: \"ff052992-0b16-459e-afdd-bced1e60350a\") " pod="openshift-marketplace/redhat-operators-nk6vs" Dec 01 09:32:12 crc kubenswrapper[5004]: I1201 09:32:12.707059 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff052992-0b16-459e-afdd-bced1e60350a-utilities\") pod \"redhat-operators-nk6vs\" (UID: \"ff052992-0b16-459e-afdd-bced1e60350a\") " pod="openshift-marketplace/redhat-operators-nk6vs" Dec 01 09:32:12 crc kubenswrapper[5004]: I1201 09:32:12.809324 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hdx6c\" (UniqueName: \"kubernetes.io/projected/ff052992-0b16-459e-afdd-bced1e60350a-kube-api-access-hdx6c\") pod \"redhat-operators-nk6vs\" (UID: \"ff052992-0b16-459e-afdd-bced1e60350a\") " pod="openshift-marketplace/redhat-operators-nk6vs" Dec 01 09:32:12 crc kubenswrapper[5004]: I1201 09:32:12.810292 5004 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff052992-0b16-459e-afdd-bced1e60350a-catalog-content\") pod \"redhat-operators-nk6vs\" (UID: \"ff052992-0b16-459e-afdd-bced1e60350a\") " pod="openshift-marketplace/redhat-operators-nk6vs" Dec 01 09:32:12 crc kubenswrapper[5004]: I1201 09:32:12.810427 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff052992-0b16-459e-afdd-bced1e60350a-utilities\") pod \"redhat-operators-nk6vs\" (UID: \"ff052992-0b16-459e-afdd-bced1e60350a\") " pod="openshift-marketplace/redhat-operators-nk6vs" Dec 01 09:32:12 crc kubenswrapper[5004]: I1201 09:32:12.810973 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff052992-0b16-459e-afdd-bced1e60350a-catalog-content\") pod \"redhat-operators-nk6vs\" (UID: \"ff052992-0b16-459e-afdd-bced1e60350a\") " pod="openshift-marketplace/redhat-operators-nk6vs" Dec 01 09:32:12 crc kubenswrapper[5004]: I1201 09:32:12.810986 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff052992-0b16-459e-afdd-bced1e60350a-utilities\") pod \"redhat-operators-nk6vs\" (UID: \"ff052992-0b16-459e-afdd-bced1e60350a\") " pod="openshift-marketplace/redhat-operators-nk6vs" Dec 01 09:32:12 crc kubenswrapper[5004]: I1201 09:32:12.842794 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdx6c\" (UniqueName: \"kubernetes.io/projected/ff052992-0b16-459e-afdd-bced1e60350a-kube-api-access-hdx6c\") pod \"redhat-operators-nk6vs\" (UID: \"ff052992-0b16-459e-afdd-bced1e60350a\") " pod="openshift-marketplace/redhat-operators-nk6vs" Dec 01 09:32:12 crc kubenswrapper[5004]: I1201 09:32:12.918684 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nk6vs" Dec 01 09:32:13 crc kubenswrapper[5004]: I1201 09:32:13.436709 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nk6vs"] Dec 01 09:32:13 crc kubenswrapper[5004]: W1201 09:32:13.445505 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff052992_0b16_459e_afdd_bced1e60350a.slice/crio-3f3dadac4181edf959c85ab474983200f3d87beae2e2514207730cb185bfae4c WatchSource:0}: Error finding container 3f3dadac4181edf959c85ab474983200f3d87beae2e2514207730cb185bfae4c: Status 404 returned error can't find the container with id 3f3dadac4181edf959c85ab474983200f3d87beae2e2514207730cb185bfae4c Dec 01 09:32:13 crc kubenswrapper[5004]: I1201 09:32:13.863053 5004 generic.go:334] "Generic (PLEG): container finished" podID="ff052992-0b16-459e-afdd-bced1e60350a" containerID="2116bed374eb52510c9d37d692cc5d5f001613579be6485e2d7e40c0f066c2e5" exitCode=0 Dec 01 09:32:13 crc kubenswrapper[5004]: I1201 09:32:13.863103 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nk6vs" event={"ID":"ff052992-0b16-459e-afdd-bced1e60350a","Type":"ContainerDied","Data":"2116bed374eb52510c9d37d692cc5d5f001613579be6485e2d7e40c0f066c2e5"} Dec 01 09:32:13 crc kubenswrapper[5004]: I1201 09:32:13.863129 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nk6vs" event={"ID":"ff052992-0b16-459e-afdd-bced1e60350a","Type":"ContainerStarted","Data":"3f3dadac4181edf959c85ab474983200f3d87beae2e2514207730cb185bfae4c"} Dec 01 09:32:13 crc kubenswrapper[5004]: I1201 09:32:13.866297 5004 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 01 09:32:17 crc kubenswrapper[5004]: I1201 09:32:17.908417 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-nk6vs" event={"ID":"ff052992-0b16-459e-afdd-bced1e60350a","Type":"ContainerStarted","Data":"083a6d77b418dc2b79f56f79bba76145a42fdaae2f51202b36a6a6f13293c209"} Dec 01 09:32:21 crc kubenswrapper[5004]: I1201 09:32:21.954824 5004 generic.go:334] "Generic (PLEG): container finished" podID="ff052992-0b16-459e-afdd-bced1e60350a" containerID="083a6d77b418dc2b79f56f79bba76145a42fdaae2f51202b36a6a6f13293c209" exitCode=0 Dec 01 09:32:21 crc kubenswrapper[5004]: I1201 09:32:21.954905 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nk6vs" event={"ID":"ff052992-0b16-459e-afdd-bced1e60350a","Type":"ContainerDied","Data":"083a6d77b418dc2b79f56f79bba76145a42fdaae2f51202b36a6a6f13293c209"} Dec 01 09:32:25 crc kubenswrapper[5004]: I1201 09:32:25.003735 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nk6vs" event={"ID":"ff052992-0b16-459e-afdd-bced1e60350a","Type":"ContainerStarted","Data":"fa935800bf0e7d3f13a484953974db10e3cd9abda0f30c920d12bf8692ef1d38"} Dec 01 09:32:25 crc kubenswrapper[5004]: I1201 09:32:25.032940 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-nk6vs" podStartSLOduration=2.753347309 podStartE2EDuration="13.032889553s" podCreationTimestamp="2025-12-01 09:32:12 +0000 UTC" firstStartedPulling="2025-12-01 09:32:13.866054391 +0000 UTC m=+4511.431046373" lastFinishedPulling="2025-12-01 09:32:24.145596635 +0000 UTC m=+4521.710588617" observedRunningTime="2025-12-01 09:32:25.025113144 +0000 UTC m=+4522.590105136" watchObservedRunningTime="2025-12-01 09:32:25.032889553 +0000 UTC m=+4522.597881555" Dec 01 09:32:32 crc kubenswrapper[5004]: I1201 09:32:32.918833 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-nk6vs" Dec 01 09:32:32 crc kubenswrapper[5004]: I1201 09:32:32.920399 5004 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-nk6vs" Dec 01 09:32:32 crc kubenswrapper[5004]: I1201 09:32:32.978587 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-nk6vs" Dec 01 09:32:33 crc kubenswrapper[5004]: I1201 09:32:33.148331 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-nk6vs" Dec 01 09:32:33 crc kubenswrapper[5004]: I1201 09:32:33.216271 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nk6vs"] Dec 01 09:32:35 crc kubenswrapper[5004]: I1201 09:32:35.121036 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-nk6vs" podUID="ff052992-0b16-459e-afdd-bced1e60350a" containerName="registry-server" containerID="cri-o://fa935800bf0e7d3f13a484953974db10e3cd9abda0f30c920d12bf8692ef1d38" gracePeriod=2 Dec 01 09:32:35 crc kubenswrapper[5004]: I1201 09:32:35.699461 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nk6vs" Dec 01 09:32:35 crc kubenswrapper[5004]: I1201 09:32:35.795965 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff052992-0b16-459e-afdd-bced1e60350a-catalog-content\") pod \"ff052992-0b16-459e-afdd-bced1e60350a\" (UID: \"ff052992-0b16-459e-afdd-bced1e60350a\") " Dec 01 09:32:35 crc kubenswrapper[5004]: I1201 09:32:35.796151 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hdx6c\" (UniqueName: \"kubernetes.io/projected/ff052992-0b16-459e-afdd-bced1e60350a-kube-api-access-hdx6c\") pod \"ff052992-0b16-459e-afdd-bced1e60350a\" (UID: \"ff052992-0b16-459e-afdd-bced1e60350a\") " Dec 01 09:32:35 crc kubenswrapper[5004]: I1201 09:32:35.796211 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff052992-0b16-459e-afdd-bced1e60350a-utilities\") pod \"ff052992-0b16-459e-afdd-bced1e60350a\" (UID: \"ff052992-0b16-459e-afdd-bced1e60350a\") " Dec 01 09:32:35 crc kubenswrapper[5004]: I1201 09:32:35.797038 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff052992-0b16-459e-afdd-bced1e60350a-utilities" (OuterVolumeSpecName: "utilities") pod "ff052992-0b16-459e-afdd-bced1e60350a" (UID: "ff052992-0b16-459e-afdd-bced1e60350a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:32:35 crc kubenswrapper[5004]: I1201 09:32:35.797487 5004 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff052992-0b16-459e-afdd-bced1e60350a-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 09:32:35 crc kubenswrapper[5004]: I1201 09:32:35.802290 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff052992-0b16-459e-afdd-bced1e60350a-kube-api-access-hdx6c" (OuterVolumeSpecName: "kube-api-access-hdx6c") pod "ff052992-0b16-459e-afdd-bced1e60350a" (UID: "ff052992-0b16-459e-afdd-bced1e60350a"). InnerVolumeSpecName "kube-api-access-hdx6c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:32:35 crc kubenswrapper[5004]: I1201 09:32:35.899955 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hdx6c\" (UniqueName: \"kubernetes.io/projected/ff052992-0b16-459e-afdd-bced1e60350a-kube-api-access-hdx6c\") on node \"crc\" DevicePath \"\"" Dec 01 09:32:35 crc kubenswrapper[5004]: I1201 09:32:35.903651 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff052992-0b16-459e-afdd-bced1e60350a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ff052992-0b16-459e-afdd-bced1e60350a" (UID: "ff052992-0b16-459e-afdd-bced1e60350a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:32:36 crc kubenswrapper[5004]: I1201 09:32:36.002759 5004 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff052992-0b16-459e-afdd-bced1e60350a-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 09:32:36 crc kubenswrapper[5004]: I1201 09:32:36.135843 5004 generic.go:334] "Generic (PLEG): container finished" podID="ff052992-0b16-459e-afdd-bced1e60350a" containerID="fa935800bf0e7d3f13a484953974db10e3cd9abda0f30c920d12bf8692ef1d38" exitCode=0 Dec 01 09:32:36 crc kubenswrapper[5004]: I1201 09:32:36.135888 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nk6vs" event={"ID":"ff052992-0b16-459e-afdd-bced1e60350a","Type":"ContainerDied","Data":"fa935800bf0e7d3f13a484953974db10e3cd9abda0f30c920d12bf8692ef1d38"} Dec 01 09:32:36 crc kubenswrapper[5004]: I1201 09:32:36.135915 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nk6vs" event={"ID":"ff052992-0b16-459e-afdd-bced1e60350a","Type":"ContainerDied","Data":"3f3dadac4181edf959c85ab474983200f3d87beae2e2514207730cb185bfae4c"} Dec 01 09:32:36 crc kubenswrapper[5004]: I1201 09:32:36.135932 5004 scope.go:117] "RemoveContainer" containerID="fa935800bf0e7d3f13a484953974db10e3cd9abda0f30c920d12bf8692ef1d38" Dec 01 09:32:36 crc kubenswrapper[5004]: I1201 09:32:36.135955 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nk6vs" Dec 01 09:32:36 crc kubenswrapper[5004]: I1201 09:32:36.158520 5004 scope.go:117] "RemoveContainer" containerID="083a6d77b418dc2b79f56f79bba76145a42fdaae2f51202b36a6a6f13293c209" Dec 01 09:32:36 crc kubenswrapper[5004]: I1201 09:32:36.180787 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nk6vs"] Dec 01 09:32:36 crc kubenswrapper[5004]: I1201 09:32:36.191479 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-nk6vs"] Dec 01 09:32:36 crc kubenswrapper[5004]: I1201 09:32:36.198590 5004 scope.go:117] "RemoveContainer" containerID="2116bed374eb52510c9d37d692cc5d5f001613579be6485e2d7e40c0f066c2e5" Dec 01 09:32:36 crc kubenswrapper[5004]: I1201 09:32:36.242805 5004 scope.go:117] "RemoveContainer" containerID="fa935800bf0e7d3f13a484953974db10e3cd9abda0f30c920d12bf8692ef1d38" Dec 01 09:32:36 crc kubenswrapper[5004]: E1201 09:32:36.243017 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa935800bf0e7d3f13a484953974db10e3cd9abda0f30c920d12bf8692ef1d38\": container with ID starting with fa935800bf0e7d3f13a484953974db10e3cd9abda0f30c920d12bf8692ef1d38 not found: ID does not exist" containerID="fa935800bf0e7d3f13a484953974db10e3cd9abda0f30c920d12bf8692ef1d38" Dec 01 09:32:36 crc kubenswrapper[5004]: I1201 09:32:36.243057 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa935800bf0e7d3f13a484953974db10e3cd9abda0f30c920d12bf8692ef1d38"} err="failed to get container status \"fa935800bf0e7d3f13a484953974db10e3cd9abda0f30c920d12bf8692ef1d38\": rpc error: code = NotFound desc = could not find container \"fa935800bf0e7d3f13a484953974db10e3cd9abda0f30c920d12bf8692ef1d38\": container with ID starting with fa935800bf0e7d3f13a484953974db10e3cd9abda0f30c920d12bf8692ef1d38 not found: ID does 
not exist" Dec 01 09:32:36 crc kubenswrapper[5004]: I1201 09:32:36.243079 5004 scope.go:117] "RemoveContainer" containerID="083a6d77b418dc2b79f56f79bba76145a42fdaae2f51202b36a6a6f13293c209" Dec 01 09:32:36 crc kubenswrapper[5004]: E1201 09:32:36.243332 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"083a6d77b418dc2b79f56f79bba76145a42fdaae2f51202b36a6a6f13293c209\": container with ID starting with 083a6d77b418dc2b79f56f79bba76145a42fdaae2f51202b36a6a6f13293c209 not found: ID does not exist" containerID="083a6d77b418dc2b79f56f79bba76145a42fdaae2f51202b36a6a6f13293c209" Dec 01 09:32:36 crc kubenswrapper[5004]: I1201 09:32:36.243358 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"083a6d77b418dc2b79f56f79bba76145a42fdaae2f51202b36a6a6f13293c209"} err="failed to get container status \"083a6d77b418dc2b79f56f79bba76145a42fdaae2f51202b36a6a6f13293c209\": rpc error: code = NotFound desc = could not find container \"083a6d77b418dc2b79f56f79bba76145a42fdaae2f51202b36a6a6f13293c209\": container with ID starting with 083a6d77b418dc2b79f56f79bba76145a42fdaae2f51202b36a6a6f13293c209 not found: ID does not exist" Dec 01 09:32:36 crc kubenswrapper[5004]: I1201 09:32:36.243374 5004 scope.go:117] "RemoveContainer" containerID="2116bed374eb52510c9d37d692cc5d5f001613579be6485e2d7e40c0f066c2e5" Dec 01 09:32:36 crc kubenswrapper[5004]: E1201 09:32:36.243611 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2116bed374eb52510c9d37d692cc5d5f001613579be6485e2d7e40c0f066c2e5\": container with ID starting with 2116bed374eb52510c9d37d692cc5d5f001613579be6485e2d7e40c0f066c2e5 not found: ID does not exist" containerID="2116bed374eb52510c9d37d692cc5d5f001613579be6485e2d7e40c0f066c2e5" Dec 01 09:32:36 crc kubenswrapper[5004]: I1201 09:32:36.243648 5004 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2116bed374eb52510c9d37d692cc5d5f001613579be6485e2d7e40c0f066c2e5"} err="failed to get container status \"2116bed374eb52510c9d37d692cc5d5f001613579be6485e2d7e40c0f066c2e5\": rpc error: code = NotFound desc = could not find container \"2116bed374eb52510c9d37d692cc5d5f001613579be6485e2d7e40c0f066c2e5\": container with ID starting with 2116bed374eb52510c9d37d692cc5d5f001613579be6485e2d7e40c0f066c2e5 not found: ID does not exist" Dec 01 09:32:36 crc kubenswrapper[5004]: I1201 09:32:36.779358 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff052992-0b16-459e-afdd-bced1e60350a" path="/var/lib/kubelet/pods/ff052992-0b16-459e-afdd-bced1e60350a/volumes" Dec 01 09:32:38 crc kubenswrapper[5004]: I1201 09:32:38.729034 5004 patch_prober.go:28] interesting pod/machine-config-daemon-fvdgt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 09:32:38 crc kubenswrapper[5004]: I1201 09:32:38.729700 5004 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 09:32:38 crc kubenswrapper[5004]: I1201 09:32:38.729787 5004 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" Dec 01 09:32:38 crc kubenswrapper[5004]: I1201 09:32:38.731157 5004 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ccf28834b75e460fde85109220042abb807e53b91adea0a0b179d3e7961709b8"} 
pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 09:32:38 crc kubenswrapper[5004]: I1201 09:32:38.731237 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" containerName="machine-config-daemon" containerID="cri-o://ccf28834b75e460fde85109220042abb807e53b91adea0a0b179d3e7961709b8" gracePeriod=600 Dec 01 09:32:39 crc kubenswrapper[5004]: E1201 09:32:39.482979 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 09:32:40 crc kubenswrapper[5004]: I1201 09:32:40.193920 5004 generic.go:334] "Generic (PLEG): container finished" podID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" containerID="ccf28834b75e460fde85109220042abb807e53b91adea0a0b179d3e7961709b8" exitCode=0 Dec 01 09:32:40 crc kubenswrapper[5004]: I1201 09:32:40.193996 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" event={"ID":"f9977ebb-82de-4e96-8763-0b5a84f8d4ce","Type":"ContainerDied","Data":"ccf28834b75e460fde85109220042abb807e53b91adea0a0b179d3e7961709b8"} Dec 01 09:32:40 crc kubenswrapper[5004]: I1201 09:32:40.194298 5004 scope.go:117] "RemoveContainer" containerID="dbac988da5fadc05b09a91df3f1891ebf32218595972382befb5aa0753662a68" Dec 01 09:32:40 crc kubenswrapper[5004]: I1201 09:32:40.195083 5004 scope.go:117] "RemoveContainer" containerID="ccf28834b75e460fde85109220042abb807e53b91adea0a0b179d3e7961709b8" Dec 
01 09:32:40 crc kubenswrapper[5004]: E1201 09:32:40.195448 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 09:32:55 crc kubenswrapper[5004]: I1201 09:32:55.759492 5004 scope.go:117] "RemoveContainer" containerID="ccf28834b75e460fde85109220042abb807e53b91adea0a0b179d3e7961709b8" Dec 01 09:32:55 crc kubenswrapper[5004]: E1201 09:32:55.760449 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 09:33:09 crc kubenswrapper[5004]: I1201 09:33:09.759633 5004 scope.go:117] "RemoveContainer" containerID="ccf28834b75e460fde85109220042abb807e53b91adea0a0b179d3e7961709b8" Dec 01 09:33:09 crc kubenswrapper[5004]: E1201 09:33:09.760888 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 09:33:24 crc kubenswrapper[5004]: I1201 09:33:24.759301 5004 scope.go:117] "RemoveContainer" 
containerID="ccf28834b75e460fde85109220042abb807e53b91adea0a0b179d3e7961709b8" Dec 01 09:33:24 crc kubenswrapper[5004]: E1201 09:33:24.760175 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 09:33:35 crc kubenswrapper[5004]: I1201 09:33:35.760271 5004 scope.go:117] "RemoveContainer" containerID="ccf28834b75e460fde85109220042abb807e53b91adea0a0b179d3e7961709b8" Dec 01 09:33:35 crc kubenswrapper[5004]: E1201 09:33:35.761863 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 09:33:49 crc kubenswrapper[5004]: I1201 09:33:49.760385 5004 scope.go:117] "RemoveContainer" containerID="ccf28834b75e460fde85109220042abb807e53b91adea0a0b179d3e7961709b8" Dec 01 09:33:49 crc kubenswrapper[5004]: E1201 09:33:49.761703 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 09:33:52 crc kubenswrapper[5004]: I1201 09:33:52.977208 5004 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-rttfr"] Dec 01 09:33:52 crc kubenswrapper[5004]: E1201 09:33:52.978433 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff052992-0b16-459e-afdd-bced1e60350a" containerName="extract-utilities" Dec 01 09:33:52 crc kubenswrapper[5004]: I1201 09:33:52.978454 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff052992-0b16-459e-afdd-bced1e60350a" containerName="extract-utilities" Dec 01 09:33:52 crc kubenswrapper[5004]: E1201 09:33:52.978474 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff052992-0b16-459e-afdd-bced1e60350a" containerName="registry-server" Dec 01 09:33:52 crc kubenswrapper[5004]: I1201 09:33:52.978482 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff052992-0b16-459e-afdd-bced1e60350a" containerName="registry-server" Dec 01 09:33:52 crc kubenswrapper[5004]: E1201 09:33:52.978498 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff052992-0b16-459e-afdd-bced1e60350a" containerName="extract-content" Dec 01 09:33:52 crc kubenswrapper[5004]: I1201 09:33:52.978506 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff052992-0b16-459e-afdd-bced1e60350a" containerName="extract-content" Dec 01 09:33:52 crc kubenswrapper[5004]: I1201 09:33:52.978844 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff052992-0b16-459e-afdd-bced1e60350a" containerName="registry-server" Dec 01 09:33:52 crc kubenswrapper[5004]: I1201 09:33:52.981371 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rttfr" Dec 01 09:33:52 crc kubenswrapper[5004]: I1201 09:33:52.998174 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rttfr"] Dec 01 09:33:53 crc kubenswrapper[5004]: I1201 09:33:53.073636 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3fc42ed7-0735-4f81-9e0f-a2eb7e27bd7b-catalog-content\") pod \"certified-operators-rttfr\" (UID: \"3fc42ed7-0735-4f81-9e0f-a2eb7e27bd7b\") " pod="openshift-marketplace/certified-operators-rttfr" Dec 01 09:33:53 crc kubenswrapper[5004]: I1201 09:33:53.073703 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wcqgm\" (UniqueName: \"kubernetes.io/projected/3fc42ed7-0735-4f81-9e0f-a2eb7e27bd7b-kube-api-access-wcqgm\") pod \"certified-operators-rttfr\" (UID: \"3fc42ed7-0735-4f81-9e0f-a2eb7e27bd7b\") " pod="openshift-marketplace/certified-operators-rttfr" Dec 01 09:33:53 crc kubenswrapper[5004]: I1201 09:33:53.073815 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3fc42ed7-0735-4f81-9e0f-a2eb7e27bd7b-utilities\") pod \"certified-operators-rttfr\" (UID: \"3fc42ed7-0735-4f81-9e0f-a2eb7e27bd7b\") " pod="openshift-marketplace/certified-operators-rttfr" Dec 01 09:33:53 crc kubenswrapper[5004]: I1201 09:33:53.176058 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3fc42ed7-0735-4f81-9e0f-a2eb7e27bd7b-catalog-content\") pod \"certified-operators-rttfr\" (UID: \"3fc42ed7-0735-4f81-9e0f-a2eb7e27bd7b\") " pod="openshift-marketplace/certified-operators-rttfr" Dec 01 09:33:53 crc kubenswrapper[5004]: I1201 09:33:53.176114 5004 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-wcqgm\" (UniqueName: \"kubernetes.io/projected/3fc42ed7-0735-4f81-9e0f-a2eb7e27bd7b-kube-api-access-wcqgm\") pod \"certified-operators-rttfr\" (UID: \"3fc42ed7-0735-4f81-9e0f-a2eb7e27bd7b\") " pod="openshift-marketplace/certified-operators-rttfr" Dec 01 09:33:53 crc kubenswrapper[5004]: I1201 09:33:53.176211 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3fc42ed7-0735-4f81-9e0f-a2eb7e27bd7b-utilities\") pod \"certified-operators-rttfr\" (UID: \"3fc42ed7-0735-4f81-9e0f-a2eb7e27bd7b\") " pod="openshift-marketplace/certified-operators-rttfr" Dec 01 09:33:53 crc kubenswrapper[5004]: I1201 09:33:53.176524 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3fc42ed7-0735-4f81-9e0f-a2eb7e27bd7b-catalog-content\") pod \"certified-operators-rttfr\" (UID: \"3fc42ed7-0735-4f81-9e0f-a2eb7e27bd7b\") " pod="openshift-marketplace/certified-operators-rttfr" Dec 01 09:33:53 crc kubenswrapper[5004]: I1201 09:33:53.176635 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3fc42ed7-0735-4f81-9e0f-a2eb7e27bd7b-utilities\") pod \"certified-operators-rttfr\" (UID: \"3fc42ed7-0735-4f81-9e0f-a2eb7e27bd7b\") " pod="openshift-marketplace/certified-operators-rttfr" Dec 01 09:33:53 crc kubenswrapper[5004]: I1201 09:33:53.198814 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wcqgm\" (UniqueName: \"kubernetes.io/projected/3fc42ed7-0735-4f81-9e0f-a2eb7e27bd7b-kube-api-access-wcqgm\") pod \"certified-operators-rttfr\" (UID: \"3fc42ed7-0735-4f81-9e0f-a2eb7e27bd7b\") " pod="openshift-marketplace/certified-operators-rttfr" Dec 01 09:33:53 crc kubenswrapper[5004]: I1201 09:33:53.318197 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rttfr" Dec 01 09:33:53 crc kubenswrapper[5004]: I1201 09:33:53.951000 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rttfr"] Dec 01 09:33:54 crc kubenswrapper[5004]: I1201 09:33:54.660263 5004 generic.go:334] "Generic (PLEG): container finished" podID="3fc42ed7-0735-4f81-9e0f-a2eb7e27bd7b" containerID="fb79174ee9345f89e9d63f7e1aaa3628b7338f2d4f7e900f2b989f74e584b663" exitCode=0 Dec 01 09:33:54 crc kubenswrapper[5004]: I1201 09:33:54.660326 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rttfr" event={"ID":"3fc42ed7-0735-4f81-9e0f-a2eb7e27bd7b","Type":"ContainerDied","Data":"fb79174ee9345f89e9d63f7e1aaa3628b7338f2d4f7e900f2b989f74e584b663"} Dec 01 09:33:54 crc kubenswrapper[5004]: I1201 09:33:54.660681 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rttfr" event={"ID":"3fc42ed7-0735-4f81-9e0f-a2eb7e27bd7b","Type":"ContainerStarted","Data":"b6b1a4a93fc8621353ca870a6f9549c26e4385d06f63ff3c14e77445ce452cab"} Dec 01 09:33:55 crc kubenswrapper[5004]: I1201 09:33:55.673375 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rttfr" event={"ID":"3fc42ed7-0735-4f81-9e0f-a2eb7e27bd7b","Type":"ContainerStarted","Data":"39de37c5316979445590333112c45064a64aadd5d6f79de08954991821a2901b"} Dec 01 09:33:56 crc kubenswrapper[5004]: I1201 09:33:56.548060 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-dwq22"] Dec 01 09:33:56 crc kubenswrapper[5004]: I1201 09:33:56.551439 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dwq22" Dec 01 09:33:56 crc kubenswrapper[5004]: I1201 09:33:56.565031 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dwq22"] Dec 01 09:33:56 crc kubenswrapper[5004]: I1201 09:33:56.566598 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/858e1967-07de-4a6a-b874-ccfb5ec07acd-utilities\") pod \"redhat-marketplace-dwq22\" (UID: \"858e1967-07de-4a6a-b874-ccfb5ec07acd\") " pod="openshift-marketplace/redhat-marketplace-dwq22" Dec 01 09:33:56 crc kubenswrapper[5004]: I1201 09:33:56.566668 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rfg7\" (UniqueName: \"kubernetes.io/projected/858e1967-07de-4a6a-b874-ccfb5ec07acd-kube-api-access-2rfg7\") pod \"redhat-marketplace-dwq22\" (UID: \"858e1967-07de-4a6a-b874-ccfb5ec07acd\") " pod="openshift-marketplace/redhat-marketplace-dwq22" Dec 01 09:33:56 crc kubenswrapper[5004]: I1201 09:33:56.566849 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/858e1967-07de-4a6a-b874-ccfb5ec07acd-catalog-content\") pod \"redhat-marketplace-dwq22\" (UID: \"858e1967-07de-4a6a-b874-ccfb5ec07acd\") " pod="openshift-marketplace/redhat-marketplace-dwq22" Dec 01 09:33:56 crc kubenswrapper[5004]: I1201 09:33:56.668930 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2rfg7\" (UniqueName: \"kubernetes.io/projected/858e1967-07de-4a6a-b874-ccfb5ec07acd-kube-api-access-2rfg7\") pod \"redhat-marketplace-dwq22\" (UID: \"858e1967-07de-4a6a-b874-ccfb5ec07acd\") " pod="openshift-marketplace/redhat-marketplace-dwq22" Dec 01 09:33:56 crc kubenswrapper[5004]: I1201 09:33:56.669107 5004 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/858e1967-07de-4a6a-b874-ccfb5ec07acd-catalog-content\") pod \"redhat-marketplace-dwq22\" (UID: \"858e1967-07de-4a6a-b874-ccfb5ec07acd\") " pod="openshift-marketplace/redhat-marketplace-dwq22" Dec 01 09:33:56 crc kubenswrapper[5004]: I1201 09:33:56.669288 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/858e1967-07de-4a6a-b874-ccfb5ec07acd-utilities\") pod \"redhat-marketplace-dwq22\" (UID: \"858e1967-07de-4a6a-b874-ccfb5ec07acd\") " pod="openshift-marketplace/redhat-marketplace-dwq22" Dec 01 09:33:56 crc kubenswrapper[5004]: I1201 09:33:56.670321 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/858e1967-07de-4a6a-b874-ccfb5ec07acd-catalog-content\") pod \"redhat-marketplace-dwq22\" (UID: \"858e1967-07de-4a6a-b874-ccfb5ec07acd\") " pod="openshift-marketplace/redhat-marketplace-dwq22" Dec 01 09:33:56 crc kubenswrapper[5004]: I1201 09:33:56.670341 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/858e1967-07de-4a6a-b874-ccfb5ec07acd-utilities\") pod \"redhat-marketplace-dwq22\" (UID: \"858e1967-07de-4a6a-b874-ccfb5ec07acd\") " pod="openshift-marketplace/redhat-marketplace-dwq22" Dec 01 09:33:56 crc kubenswrapper[5004]: I1201 09:33:56.683204 5004 generic.go:334] "Generic (PLEG): container finished" podID="3fc42ed7-0735-4f81-9e0f-a2eb7e27bd7b" containerID="39de37c5316979445590333112c45064a64aadd5d6f79de08954991821a2901b" exitCode=0 Dec 01 09:33:56 crc kubenswrapper[5004]: I1201 09:33:56.683265 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rttfr" 
event={"ID":"3fc42ed7-0735-4f81-9e0f-a2eb7e27bd7b","Type":"ContainerDied","Data":"39de37c5316979445590333112c45064a64aadd5d6f79de08954991821a2901b"} Dec 01 09:33:56 crc kubenswrapper[5004]: I1201 09:33:56.695147 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rfg7\" (UniqueName: \"kubernetes.io/projected/858e1967-07de-4a6a-b874-ccfb5ec07acd-kube-api-access-2rfg7\") pod \"redhat-marketplace-dwq22\" (UID: \"858e1967-07de-4a6a-b874-ccfb5ec07acd\") " pod="openshift-marketplace/redhat-marketplace-dwq22" Dec 01 09:33:56 crc kubenswrapper[5004]: I1201 09:33:56.874020 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dwq22" Dec 01 09:33:57 crc kubenswrapper[5004]: I1201 09:33:57.387510 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dwq22"] Dec 01 09:33:57 crc kubenswrapper[5004]: I1201 09:33:57.698363 5004 generic.go:334] "Generic (PLEG): container finished" podID="858e1967-07de-4a6a-b874-ccfb5ec07acd" containerID="8421424d6340fa754e9bf65ca0ab31559c294271574bacf13375d0b7c62e61b7" exitCode=0 Dec 01 09:33:57 crc kubenswrapper[5004]: I1201 09:33:57.698437 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dwq22" event={"ID":"858e1967-07de-4a6a-b874-ccfb5ec07acd","Type":"ContainerDied","Data":"8421424d6340fa754e9bf65ca0ab31559c294271574bacf13375d0b7c62e61b7"} Dec 01 09:33:57 crc kubenswrapper[5004]: I1201 09:33:57.698765 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dwq22" event={"ID":"858e1967-07de-4a6a-b874-ccfb5ec07acd","Type":"ContainerStarted","Data":"c54d766fa0e433990abcc8f5b975d030a3bc99a36f468adcc55ff16fe8f0f321"} Dec 01 09:33:58 crc kubenswrapper[5004]: I1201 09:33:58.715106 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rttfr" 
event={"ID":"3fc42ed7-0735-4f81-9e0f-a2eb7e27bd7b","Type":"ContainerStarted","Data":"002a64548c36718fc2edd5dc61386a2fdc25aaeb7216f8e9b06a501f629274af"} Dec 01 09:33:58 crc kubenswrapper[5004]: I1201 09:33:58.759485 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-rttfr" podStartSLOduration=3.8127107369999997 podStartE2EDuration="6.759464649s" podCreationTimestamp="2025-12-01 09:33:52 +0000 UTC" firstStartedPulling="2025-12-01 09:33:54.662751723 +0000 UTC m=+4612.227743705" lastFinishedPulling="2025-12-01 09:33:57.609505625 +0000 UTC m=+4615.174497617" observedRunningTime="2025-12-01 09:33:58.737929474 +0000 UTC m=+4616.302921466" watchObservedRunningTime="2025-12-01 09:33:58.759464649 +0000 UTC m=+4616.324456631" Dec 01 09:33:59 crc kubenswrapper[5004]: I1201 09:33:59.735006 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dwq22" event={"ID":"858e1967-07de-4a6a-b874-ccfb5ec07acd","Type":"ContainerStarted","Data":"84e569f3f24be7ca53e46f6af5a23c5c61f7af40bd92bfa6b038de914c7355f5"} Dec 01 09:34:00 crc kubenswrapper[5004]: I1201 09:34:00.759734 5004 scope.go:117] "RemoveContainer" containerID="ccf28834b75e460fde85109220042abb807e53b91adea0a0b179d3e7961709b8" Dec 01 09:34:00 crc kubenswrapper[5004]: E1201 09:34:00.760353 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 09:34:01 crc kubenswrapper[5004]: I1201 09:34:01.757900 5004 generic.go:334] "Generic (PLEG): container finished" podID="858e1967-07de-4a6a-b874-ccfb5ec07acd" 
containerID="84e569f3f24be7ca53e46f6af5a23c5c61f7af40bd92bfa6b038de914c7355f5" exitCode=0 Dec 01 09:34:01 crc kubenswrapper[5004]: I1201 09:34:01.757948 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dwq22" event={"ID":"858e1967-07de-4a6a-b874-ccfb5ec07acd","Type":"ContainerDied","Data":"84e569f3f24be7ca53e46f6af5a23c5c61f7af40bd92bfa6b038de914c7355f5"} Dec 01 09:34:02 crc kubenswrapper[5004]: I1201 09:34:02.776884 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dwq22" event={"ID":"858e1967-07de-4a6a-b874-ccfb5ec07acd","Type":"ContainerStarted","Data":"942575a64640348a7871712f933719579e69b5c24c8ee7636b11138d7ac685cc"} Dec 01 09:34:02 crc kubenswrapper[5004]: I1201 09:34:02.806844 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-dwq22" podStartSLOduration=2.215515433 podStartE2EDuration="6.806821581s" podCreationTimestamp="2025-12-01 09:33:56 +0000 UTC" firstStartedPulling="2025-12-01 09:33:57.703369294 +0000 UTC m=+4615.268361276" lastFinishedPulling="2025-12-01 09:34:02.294675442 +0000 UTC m=+4619.859667424" observedRunningTime="2025-12-01 09:34:02.796953061 +0000 UTC m=+4620.361945053" watchObservedRunningTime="2025-12-01 09:34:02.806821581 +0000 UTC m=+4620.371813563" Dec 01 09:34:03 crc kubenswrapper[5004]: I1201 09:34:03.318399 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-rttfr" Dec 01 09:34:03 crc kubenswrapper[5004]: I1201 09:34:03.318748 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-rttfr" Dec 01 09:34:03 crc kubenswrapper[5004]: I1201 09:34:03.378012 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-rttfr" Dec 01 09:34:03 crc kubenswrapper[5004]: I1201 09:34:03.835417 
5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-rttfr" Dec 01 09:34:05 crc kubenswrapper[5004]: I1201 09:34:05.740630 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rttfr"] Dec 01 09:34:05 crc kubenswrapper[5004]: I1201 09:34:05.808112 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-rttfr" podUID="3fc42ed7-0735-4f81-9e0f-a2eb7e27bd7b" containerName="registry-server" containerID="cri-o://002a64548c36718fc2edd5dc61386a2fdc25aaeb7216f8e9b06a501f629274af" gracePeriod=2 Dec 01 09:34:06 crc kubenswrapper[5004]: I1201 09:34:06.307230 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rttfr" Dec 01 09:34:06 crc kubenswrapper[5004]: I1201 09:34:06.449085 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3fc42ed7-0735-4f81-9e0f-a2eb7e27bd7b-utilities\") pod \"3fc42ed7-0735-4f81-9e0f-a2eb7e27bd7b\" (UID: \"3fc42ed7-0735-4f81-9e0f-a2eb7e27bd7b\") " Dec 01 09:34:06 crc kubenswrapper[5004]: I1201 09:34:06.449468 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3fc42ed7-0735-4f81-9e0f-a2eb7e27bd7b-catalog-content\") pod \"3fc42ed7-0735-4f81-9e0f-a2eb7e27bd7b\" (UID: \"3fc42ed7-0735-4f81-9e0f-a2eb7e27bd7b\") " Dec 01 09:34:06 crc kubenswrapper[5004]: I1201 09:34:06.449522 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wcqgm\" (UniqueName: \"kubernetes.io/projected/3fc42ed7-0735-4f81-9e0f-a2eb7e27bd7b-kube-api-access-wcqgm\") pod \"3fc42ed7-0735-4f81-9e0f-a2eb7e27bd7b\" (UID: \"3fc42ed7-0735-4f81-9e0f-a2eb7e27bd7b\") " Dec 01 09:34:06 crc kubenswrapper[5004]: I1201 
09:34:06.450294 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3fc42ed7-0735-4f81-9e0f-a2eb7e27bd7b-utilities" (OuterVolumeSpecName: "utilities") pod "3fc42ed7-0735-4f81-9e0f-a2eb7e27bd7b" (UID: "3fc42ed7-0735-4f81-9e0f-a2eb7e27bd7b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:34:06 crc kubenswrapper[5004]: I1201 09:34:06.450847 5004 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3fc42ed7-0735-4f81-9e0f-a2eb7e27bd7b-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 09:34:06 crc kubenswrapper[5004]: I1201 09:34:06.459987 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3fc42ed7-0735-4f81-9e0f-a2eb7e27bd7b-kube-api-access-wcqgm" (OuterVolumeSpecName: "kube-api-access-wcqgm") pod "3fc42ed7-0735-4f81-9e0f-a2eb7e27bd7b" (UID: "3fc42ed7-0735-4f81-9e0f-a2eb7e27bd7b"). InnerVolumeSpecName "kube-api-access-wcqgm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:34:06 crc kubenswrapper[5004]: I1201 09:34:06.502532 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3fc42ed7-0735-4f81-9e0f-a2eb7e27bd7b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3fc42ed7-0735-4f81-9e0f-a2eb7e27bd7b" (UID: "3fc42ed7-0735-4f81-9e0f-a2eb7e27bd7b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:34:06 crc kubenswrapper[5004]: I1201 09:34:06.553226 5004 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3fc42ed7-0735-4f81-9e0f-a2eb7e27bd7b-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 09:34:06 crc kubenswrapper[5004]: I1201 09:34:06.553275 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wcqgm\" (UniqueName: \"kubernetes.io/projected/3fc42ed7-0735-4f81-9e0f-a2eb7e27bd7b-kube-api-access-wcqgm\") on node \"crc\" DevicePath \"\"" Dec 01 09:34:06 crc kubenswrapper[5004]: I1201 09:34:06.830770 5004 generic.go:334] "Generic (PLEG): container finished" podID="3fc42ed7-0735-4f81-9e0f-a2eb7e27bd7b" containerID="002a64548c36718fc2edd5dc61386a2fdc25aaeb7216f8e9b06a501f629274af" exitCode=0 Dec 01 09:34:06 crc kubenswrapper[5004]: I1201 09:34:06.830891 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rttfr" event={"ID":"3fc42ed7-0735-4f81-9e0f-a2eb7e27bd7b","Type":"ContainerDied","Data":"002a64548c36718fc2edd5dc61386a2fdc25aaeb7216f8e9b06a501f629274af"} Dec 01 09:34:06 crc kubenswrapper[5004]: I1201 09:34:06.830937 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rttfr" event={"ID":"3fc42ed7-0735-4f81-9e0f-a2eb7e27bd7b","Type":"ContainerDied","Data":"b6b1a4a93fc8621353ca870a6f9549c26e4385d06f63ff3c14e77445ce452cab"} Dec 01 09:34:06 crc kubenswrapper[5004]: I1201 09:34:06.830963 5004 scope.go:117] "RemoveContainer" containerID="002a64548c36718fc2edd5dc61386a2fdc25aaeb7216f8e9b06a501f629274af" Dec 01 09:34:06 crc kubenswrapper[5004]: I1201 09:34:06.831323 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rttfr" Dec 01 09:34:06 crc kubenswrapper[5004]: I1201 09:34:06.871525 5004 scope.go:117] "RemoveContainer" containerID="39de37c5316979445590333112c45064a64aadd5d6f79de08954991821a2901b" Dec 01 09:34:06 crc kubenswrapper[5004]: I1201 09:34:06.874266 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-dwq22" Dec 01 09:34:06 crc kubenswrapper[5004]: I1201 09:34:06.874320 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-dwq22" Dec 01 09:34:06 crc kubenswrapper[5004]: I1201 09:34:06.875030 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rttfr"] Dec 01 09:34:06 crc kubenswrapper[5004]: I1201 09:34:06.888597 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-rttfr"] Dec 01 09:34:06 crc kubenswrapper[5004]: I1201 09:34:06.901764 5004 scope.go:117] "RemoveContainer" containerID="fb79174ee9345f89e9d63f7e1aaa3628b7338f2d4f7e900f2b989f74e584b663" Dec 01 09:34:06 crc kubenswrapper[5004]: I1201 09:34:06.929336 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-dwq22" Dec 01 09:34:06 crc kubenswrapper[5004]: I1201 09:34:06.964854 5004 scope.go:117] "RemoveContainer" containerID="002a64548c36718fc2edd5dc61386a2fdc25aaeb7216f8e9b06a501f629274af" Dec 01 09:34:06 crc kubenswrapper[5004]: E1201 09:34:06.965341 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"002a64548c36718fc2edd5dc61386a2fdc25aaeb7216f8e9b06a501f629274af\": container with ID starting with 002a64548c36718fc2edd5dc61386a2fdc25aaeb7216f8e9b06a501f629274af not found: ID does not exist" containerID="002a64548c36718fc2edd5dc61386a2fdc25aaeb7216f8e9b06a501f629274af" Dec 
01 09:34:06 crc kubenswrapper[5004]: I1201 09:34:06.965391 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"002a64548c36718fc2edd5dc61386a2fdc25aaeb7216f8e9b06a501f629274af"} err="failed to get container status \"002a64548c36718fc2edd5dc61386a2fdc25aaeb7216f8e9b06a501f629274af\": rpc error: code = NotFound desc = could not find container \"002a64548c36718fc2edd5dc61386a2fdc25aaeb7216f8e9b06a501f629274af\": container with ID starting with 002a64548c36718fc2edd5dc61386a2fdc25aaeb7216f8e9b06a501f629274af not found: ID does not exist" Dec 01 09:34:06 crc kubenswrapper[5004]: I1201 09:34:06.965422 5004 scope.go:117] "RemoveContainer" containerID="39de37c5316979445590333112c45064a64aadd5d6f79de08954991821a2901b" Dec 01 09:34:06 crc kubenswrapper[5004]: E1201 09:34:06.965756 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39de37c5316979445590333112c45064a64aadd5d6f79de08954991821a2901b\": container with ID starting with 39de37c5316979445590333112c45064a64aadd5d6f79de08954991821a2901b not found: ID does not exist" containerID="39de37c5316979445590333112c45064a64aadd5d6f79de08954991821a2901b" Dec 01 09:34:06 crc kubenswrapper[5004]: I1201 09:34:06.965778 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39de37c5316979445590333112c45064a64aadd5d6f79de08954991821a2901b"} err="failed to get container status \"39de37c5316979445590333112c45064a64aadd5d6f79de08954991821a2901b\": rpc error: code = NotFound desc = could not find container \"39de37c5316979445590333112c45064a64aadd5d6f79de08954991821a2901b\": container with ID starting with 39de37c5316979445590333112c45064a64aadd5d6f79de08954991821a2901b not found: ID does not exist" Dec 01 09:34:06 crc kubenswrapper[5004]: I1201 09:34:06.965792 5004 scope.go:117] "RemoveContainer" 
containerID="fb79174ee9345f89e9d63f7e1aaa3628b7338f2d4f7e900f2b989f74e584b663" Dec 01 09:34:06 crc kubenswrapper[5004]: E1201 09:34:06.965966 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb79174ee9345f89e9d63f7e1aaa3628b7338f2d4f7e900f2b989f74e584b663\": container with ID starting with fb79174ee9345f89e9d63f7e1aaa3628b7338f2d4f7e900f2b989f74e584b663 not found: ID does not exist" containerID="fb79174ee9345f89e9d63f7e1aaa3628b7338f2d4f7e900f2b989f74e584b663" Dec 01 09:34:06 crc kubenswrapper[5004]: I1201 09:34:06.965984 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb79174ee9345f89e9d63f7e1aaa3628b7338f2d4f7e900f2b989f74e584b663"} err="failed to get container status \"fb79174ee9345f89e9d63f7e1aaa3628b7338f2d4f7e900f2b989f74e584b663\": rpc error: code = NotFound desc = could not find container \"fb79174ee9345f89e9d63f7e1aaa3628b7338f2d4f7e900f2b989f74e584b663\": container with ID starting with fb79174ee9345f89e9d63f7e1aaa3628b7338f2d4f7e900f2b989f74e584b663 not found: ID does not exist" Dec 01 09:34:07 crc kubenswrapper[5004]: I1201 09:34:07.896775 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-dwq22" Dec 01 09:34:08 crc kubenswrapper[5004]: I1201 09:34:08.770543 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3fc42ed7-0735-4f81-9e0f-a2eb7e27bd7b" path="/var/lib/kubelet/pods/3fc42ed7-0735-4f81-9e0f-a2eb7e27bd7b/volumes" Dec 01 09:34:09 crc kubenswrapper[5004]: I1201 09:34:09.144151 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dwq22"] Dec 01 09:34:09 crc kubenswrapper[5004]: I1201 09:34:09.863252 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-dwq22" podUID="858e1967-07de-4a6a-b874-ccfb5ec07acd" 
containerName="registry-server" containerID="cri-o://942575a64640348a7871712f933719579e69b5c24c8ee7636b11138d7ac685cc" gracePeriod=2 Dec 01 09:34:10 crc kubenswrapper[5004]: I1201 09:34:10.880067 5004 generic.go:334] "Generic (PLEG): container finished" podID="858e1967-07de-4a6a-b874-ccfb5ec07acd" containerID="942575a64640348a7871712f933719579e69b5c24c8ee7636b11138d7ac685cc" exitCode=0 Dec 01 09:34:10 crc kubenswrapper[5004]: I1201 09:34:10.880127 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dwq22" event={"ID":"858e1967-07de-4a6a-b874-ccfb5ec07acd","Type":"ContainerDied","Data":"942575a64640348a7871712f933719579e69b5c24c8ee7636b11138d7ac685cc"} Dec 01 09:34:11 crc kubenswrapper[5004]: I1201 09:34:11.084512 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dwq22" Dec 01 09:34:11 crc kubenswrapper[5004]: I1201 09:34:11.172220 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2rfg7\" (UniqueName: \"kubernetes.io/projected/858e1967-07de-4a6a-b874-ccfb5ec07acd-kube-api-access-2rfg7\") pod \"858e1967-07de-4a6a-b874-ccfb5ec07acd\" (UID: \"858e1967-07de-4a6a-b874-ccfb5ec07acd\") " Dec 01 09:34:11 crc kubenswrapper[5004]: I1201 09:34:11.172467 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/858e1967-07de-4a6a-b874-ccfb5ec07acd-utilities\") pod \"858e1967-07de-4a6a-b874-ccfb5ec07acd\" (UID: \"858e1967-07de-4a6a-b874-ccfb5ec07acd\") " Dec 01 09:34:11 crc kubenswrapper[5004]: I1201 09:34:11.172494 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/858e1967-07de-4a6a-b874-ccfb5ec07acd-catalog-content\") pod \"858e1967-07de-4a6a-b874-ccfb5ec07acd\" (UID: \"858e1967-07de-4a6a-b874-ccfb5ec07acd\") " Dec 01 09:34:11 crc 
kubenswrapper[5004]: I1201 09:34:11.173393 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/858e1967-07de-4a6a-b874-ccfb5ec07acd-utilities" (OuterVolumeSpecName: "utilities") pod "858e1967-07de-4a6a-b874-ccfb5ec07acd" (UID: "858e1967-07de-4a6a-b874-ccfb5ec07acd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:34:11 crc kubenswrapper[5004]: I1201 09:34:11.181904 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/858e1967-07de-4a6a-b874-ccfb5ec07acd-kube-api-access-2rfg7" (OuterVolumeSpecName: "kube-api-access-2rfg7") pod "858e1967-07de-4a6a-b874-ccfb5ec07acd" (UID: "858e1967-07de-4a6a-b874-ccfb5ec07acd"). InnerVolumeSpecName "kube-api-access-2rfg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:34:11 crc kubenswrapper[5004]: I1201 09:34:11.192493 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/858e1967-07de-4a6a-b874-ccfb5ec07acd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "858e1967-07de-4a6a-b874-ccfb5ec07acd" (UID: "858e1967-07de-4a6a-b874-ccfb5ec07acd"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:34:11 crc kubenswrapper[5004]: I1201 09:34:11.275970 5004 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/858e1967-07de-4a6a-b874-ccfb5ec07acd-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 09:34:11 crc kubenswrapper[5004]: I1201 09:34:11.276013 5004 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/858e1967-07de-4a6a-b874-ccfb5ec07acd-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 09:34:11 crc kubenswrapper[5004]: I1201 09:34:11.276030 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2rfg7\" (UniqueName: \"kubernetes.io/projected/858e1967-07de-4a6a-b874-ccfb5ec07acd-kube-api-access-2rfg7\") on node \"crc\" DevicePath \"\"" Dec 01 09:34:11 crc kubenswrapper[5004]: I1201 09:34:11.901851 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dwq22" event={"ID":"858e1967-07de-4a6a-b874-ccfb5ec07acd","Type":"ContainerDied","Data":"c54d766fa0e433990abcc8f5b975d030a3bc99a36f468adcc55ff16fe8f0f321"} Dec 01 09:34:11 crc kubenswrapper[5004]: I1201 09:34:11.902223 5004 scope.go:117] "RemoveContainer" containerID="942575a64640348a7871712f933719579e69b5c24c8ee7636b11138d7ac685cc" Dec 01 09:34:11 crc kubenswrapper[5004]: I1201 09:34:11.902032 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dwq22" Dec 01 09:34:11 crc kubenswrapper[5004]: I1201 09:34:11.932722 5004 scope.go:117] "RemoveContainer" containerID="84e569f3f24be7ca53e46f6af5a23c5c61f7af40bd92bfa6b038de914c7355f5" Dec 01 09:34:11 crc kubenswrapper[5004]: I1201 09:34:11.955461 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dwq22"] Dec 01 09:34:11 crc kubenswrapper[5004]: I1201 09:34:11.966964 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-dwq22"] Dec 01 09:34:12 crc kubenswrapper[5004]: I1201 09:34:12.236798 5004 scope.go:117] "RemoveContainer" containerID="8421424d6340fa754e9bf65ca0ab31559c294271574bacf13375d0b7c62e61b7" Dec 01 09:34:12 crc kubenswrapper[5004]: I1201 09:34:12.774919 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="858e1967-07de-4a6a-b874-ccfb5ec07acd" path="/var/lib/kubelet/pods/858e1967-07de-4a6a-b874-ccfb5ec07acd/volumes" Dec 01 09:34:13 crc kubenswrapper[5004]: I1201 09:34:13.760003 5004 scope.go:117] "RemoveContainer" containerID="ccf28834b75e460fde85109220042abb807e53b91adea0a0b179d3e7961709b8" Dec 01 09:34:13 crc kubenswrapper[5004]: E1201 09:34:13.760538 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 09:34:28 crc kubenswrapper[5004]: I1201 09:34:28.762526 5004 scope.go:117] "RemoveContainer" containerID="ccf28834b75e460fde85109220042abb807e53b91adea0a0b179d3e7961709b8" Dec 01 09:34:28 crc kubenswrapper[5004]: E1201 09:34:28.763424 5004 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 09:34:40 crc kubenswrapper[5004]: I1201 09:34:40.759665 5004 scope.go:117] "RemoveContainer" containerID="ccf28834b75e460fde85109220042abb807e53b91adea0a0b179d3e7961709b8" Dec 01 09:34:40 crc kubenswrapper[5004]: E1201 09:34:40.760520 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 09:34:54 crc kubenswrapper[5004]: I1201 09:34:54.759627 5004 scope.go:117] "RemoveContainer" containerID="ccf28834b75e460fde85109220042abb807e53b91adea0a0b179d3e7961709b8" Dec 01 09:34:54 crc kubenswrapper[5004]: E1201 09:34:54.760426 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 09:35:09 crc kubenswrapper[5004]: I1201 09:35:09.759268 5004 scope.go:117] "RemoveContainer" containerID="ccf28834b75e460fde85109220042abb807e53b91adea0a0b179d3e7961709b8" Dec 01 09:35:09 crc kubenswrapper[5004]: E1201 09:35:09.760188 5004 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 09:35:23 crc kubenswrapper[5004]: I1201 09:35:23.760378 5004 scope.go:117] "RemoveContainer" containerID="ccf28834b75e460fde85109220042abb807e53b91adea0a0b179d3e7961709b8" Dec 01 09:35:23 crc kubenswrapper[5004]: E1201 09:35:23.761154 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 09:35:34 crc kubenswrapper[5004]: I1201 09:35:34.758890 5004 scope.go:117] "RemoveContainer" containerID="ccf28834b75e460fde85109220042abb807e53b91adea0a0b179d3e7961709b8" Dec 01 09:35:34 crc kubenswrapper[5004]: E1201 09:35:34.759598 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 09:35:48 crc kubenswrapper[5004]: I1201 09:35:48.182616 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-gql4w"] Dec 01 09:35:48 crc kubenswrapper[5004]: E1201 09:35:48.183907 5004 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="3fc42ed7-0735-4f81-9e0f-a2eb7e27bd7b" containerName="extract-utilities" Dec 01 09:35:48 crc kubenswrapper[5004]: I1201 09:35:48.183924 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fc42ed7-0735-4f81-9e0f-a2eb7e27bd7b" containerName="extract-utilities" Dec 01 09:35:48 crc kubenswrapper[5004]: E1201 09:35:48.183944 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fc42ed7-0735-4f81-9e0f-a2eb7e27bd7b" containerName="extract-content" Dec 01 09:35:48 crc kubenswrapper[5004]: I1201 09:35:48.183952 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fc42ed7-0735-4f81-9e0f-a2eb7e27bd7b" containerName="extract-content" Dec 01 09:35:48 crc kubenswrapper[5004]: E1201 09:35:48.183963 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="858e1967-07de-4a6a-b874-ccfb5ec07acd" containerName="extract-utilities" Dec 01 09:35:48 crc kubenswrapper[5004]: I1201 09:35:48.183971 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="858e1967-07de-4a6a-b874-ccfb5ec07acd" containerName="extract-utilities" Dec 01 09:35:48 crc kubenswrapper[5004]: E1201 09:35:48.183985 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fc42ed7-0735-4f81-9e0f-a2eb7e27bd7b" containerName="registry-server" Dec 01 09:35:48 crc kubenswrapper[5004]: I1201 09:35:48.183993 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fc42ed7-0735-4f81-9e0f-a2eb7e27bd7b" containerName="registry-server" Dec 01 09:35:48 crc kubenswrapper[5004]: E1201 09:35:48.184031 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="858e1967-07de-4a6a-b874-ccfb5ec07acd" containerName="registry-server" Dec 01 09:35:48 crc kubenswrapper[5004]: I1201 09:35:48.184039 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="858e1967-07de-4a6a-b874-ccfb5ec07acd" containerName="registry-server" Dec 01 09:35:48 crc kubenswrapper[5004]: E1201 09:35:48.184064 5004 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="858e1967-07de-4a6a-b874-ccfb5ec07acd" containerName="extract-content" Dec 01 09:35:48 crc kubenswrapper[5004]: I1201 09:35:48.184074 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="858e1967-07de-4a6a-b874-ccfb5ec07acd" containerName="extract-content" Dec 01 09:35:48 crc kubenswrapper[5004]: I1201 09:35:48.184342 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="858e1967-07de-4a6a-b874-ccfb5ec07acd" containerName="registry-server" Dec 01 09:35:48 crc kubenswrapper[5004]: I1201 09:35:48.184377 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="3fc42ed7-0735-4f81-9e0f-a2eb7e27bd7b" containerName="registry-server" Dec 01 09:35:48 crc kubenswrapper[5004]: I1201 09:35:48.187118 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gql4w" Dec 01 09:35:48 crc kubenswrapper[5004]: I1201 09:35:48.203960 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gql4w"] Dec 01 09:35:48 crc kubenswrapper[5004]: I1201 09:35:48.306050 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8f0d5dd-4555-4525-bfa4-b6dd1bf01914-utilities\") pod \"community-operators-gql4w\" (UID: \"b8f0d5dd-4555-4525-bfa4-b6dd1bf01914\") " pod="openshift-marketplace/community-operators-gql4w" Dec 01 09:35:48 crc kubenswrapper[5004]: I1201 09:35:48.306161 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8f0d5dd-4555-4525-bfa4-b6dd1bf01914-catalog-content\") pod \"community-operators-gql4w\" (UID: \"b8f0d5dd-4555-4525-bfa4-b6dd1bf01914\") " pod="openshift-marketplace/community-operators-gql4w" Dec 01 09:35:48 crc kubenswrapper[5004]: I1201 09:35:48.306303 5004 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdkwr\" (UniqueName: \"kubernetes.io/projected/b8f0d5dd-4555-4525-bfa4-b6dd1bf01914-kube-api-access-kdkwr\") pod \"community-operators-gql4w\" (UID: \"b8f0d5dd-4555-4525-bfa4-b6dd1bf01914\") " pod="openshift-marketplace/community-operators-gql4w" Dec 01 09:35:48 crc kubenswrapper[5004]: I1201 09:35:48.408479 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdkwr\" (UniqueName: \"kubernetes.io/projected/b8f0d5dd-4555-4525-bfa4-b6dd1bf01914-kube-api-access-kdkwr\") pod \"community-operators-gql4w\" (UID: \"b8f0d5dd-4555-4525-bfa4-b6dd1bf01914\") " pod="openshift-marketplace/community-operators-gql4w" Dec 01 09:35:48 crc kubenswrapper[5004]: I1201 09:35:48.408775 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8f0d5dd-4555-4525-bfa4-b6dd1bf01914-utilities\") pod \"community-operators-gql4w\" (UID: \"b8f0d5dd-4555-4525-bfa4-b6dd1bf01914\") " pod="openshift-marketplace/community-operators-gql4w" Dec 01 09:35:48 crc kubenswrapper[5004]: I1201 09:35:48.408856 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8f0d5dd-4555-4525-bfa4-b6dd1bf01914-catalog-content\") pod \"community-operators-gql4w\" (UID: \"b8f0d5dd-4555-4525-bfa4-b6dd1bf01914\") " pod="openshift-marketplace/community-operators-gql4w" Dec 01 09:35:48 crc kubenswrapper[5004]: I1201 09:35:48.409315 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8f0d5dd-4555-4525-bfa4-b6dd1bf01914-utilities\") pod \"community-operators-gql4w\" (UID: \"b8f0d5dd-4555-4525-bfa4-b6dd1bf01914\") " pod="openshift-marketplace/community-operators-gql4w" Dec 01 09:35:48 crc kubenswrapper[5004]: I1201 09:35:48.409323 5004 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8f0d5dd-4555-4525-bfa4-b6dd1bf01914-catalog-content\") pod \"community-operators-gql4w\" (UID: \"b8f0d5dd-4555-4525-bfa4-b6dd1bf01914\") " pod="openshift-marketplace/community-operators-gql4w" Dec 01 09:35:48 crc kubenswrapper[5004]: I1201 09:35:48.441988 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdkwr\" (UniqueName: \"kubernetes.io/projected/b8f0d5dd-4555-4525-bfa4-b6dd1bf01914-kube-api-access-kdkwr\") pod \"community-operators-gql4w\" (UID: \"b8f0d5dd-4555-4525-bfa4-b6dd1bf01914\") " pod="openshift-marketplace/community-operators-gql4w" Dec 01 09:35:48 crc kubenswrapper[5004]: I1201 09:35:48.525468 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gql4w" Dec 01 09:35:49 crc kubenswrapper[5004]: W1201 09:35:49.092161 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb8f0d5dd_4555_4525_bfa4_b6dd1bf01914.slice/crio-a3f7ed04078b80116b17021327f615880cd346e802985bc95c9d9ad34fdba465 WatchSource:0}: Error finding container a3f7ed04078b80116b17021327f615880cd346e802985bc95c9d9ad34fdba465: Status 404 returned error can't find the container with id a3f7ed04078b80116b17021327f615880cd346e802985bc95c9d9ad34fdba465 Dec 01 09:35:49 crc kubenswrapper[5004]: I1201 09:35:49.112744 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gql4w"] Dec 01 09:35:49 crc kubenswrapper[5004]: I1201 09:35:49.759920 5004 scope.go:117] "RemoveContainer" containerID="ccf28834b75e460fde85109220042abb807e53b91adea0a0b179d3e7961709b8" Dec 01 09:35:49 crc kubenswrapper[5004]: E1201 09:35:49.760624 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 09:35:50 crc kubenswrapper[5004]: I1201 09:35:50.090644 5004 generic.go:334] "Generic (PLEG): container finished" podID="b8f0d5dd-4555-4525-bfa4-b6dd1bf01914" containerID="cfdd5bb36186f2d9b06960d778b609d93500957581be1b0ddd53bca306bc8a5b" exitCode=0 Dec 01 09:35:50 crc kubenswrapper[5004]: I1201 09:35:50.090699 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gql4w" event={"ID":"b8f0d5dd-4555-4525-bfa4-b6dd1bf01914","Type":"ContainerDied","Data":"cfdd5bb36186f2d9b06960d778b609d93500957581be1b0ddd53bca306bc8a5b"} Dec 01 09:35:50 crc kubenswrapper[5004]: I1201 09:35:50.090730 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gql4w" event={"ID":"b8f0d5dd-4555-4525-bfa4-b6dd1bf01914","Type":"ContainerStarted","Data":"a3f7ed04078b80116b17021327f615880cd346e802985bc95c9d9ad34fdba465"} Dec 01 09:35:52 crc kubenswrapper[5004]: I1201 09:35:52.116291 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gql4w" event={"ID":"b8f0d5dd-4555-4525-bfa4-b6dd1bf01914","Type":"ContainerStarted","Data":"6ff631ea9ceb84b86e541ef75420d25335d99ac24fe4aebe6f0229bcf8a81f19"} Dec 01 09:35:53 crc kubenswrapper[5004]: I1201 09:35:53.128877 5004 generic.go:334] "Generic (PLEG): container finished" podID="b8f0d5dd-4555-4525-bfa4-b6dd1bf01914" containerID="6ff631ea9ceb84b86e541ef75420d25335d99ac24fe4aebe6f0229bcf8a81f19" exitCode=0 Dec 01 09:35:53 crc kubenswrapper[5004]: I1201 09:35:53.128968 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gql4w" 
event={"ID":"b8f0d5dd-4555-4525-bfa4-b6dd1bf01914","Type":"ContainerDied","Data":"6ff631ea9ceb84b86e541ef75420d25335d99ac24fe4aebe6f0229bcf8a81f19"} Dec 01 09:35:55 crc kubenswrapper[5004]: I1201 09:35:55.148120 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gql4w" event={"ID":"b8f0d5dd-4555-4525-bfa4-b6dd1bf01914","Type":"ContainerStarted","Data":"2014e3029e7fca5fb49712972f08f2250b94debaac2cab83ce8751bdab1c40cf"} Dec 01 09:35:55 crc kubenswrapper[5004]: I1201 09:35:55.169242 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-gql4w" podStartSLOduration=3.40410594 podStartE2EDuration="7.16922303s" podCreationTimestamp="2025-12-01 09:35:48 +0000 UTC" firstStartedPulling="2025-12-01 09:35:50.093795896 +0000 UTC m=+4727.658787878" lastFinishedPulling="2025-12-01 09:35:53.858912986 +0000 UTC m=+4731.423904968" observedRunningTime="2025-12-01 09:35:55.168966324 +0000 UTC m=+4732.733958306" watchObservedRunningTime="2025-12-01 09:35:55.16922303 +0000 UTC m=+4732.734215012" Dec 01 09:35:58 crc kubenswrapper[5004]: I1201 09:35:58.525620 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-gql4w" Dec 01 09:35:58 crc kubenswrapper[5004]: I1201 09:35:58.527252 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-gql4w" Dec 01 09:35:58 crc kubenswrapper[5004]: I1201 09:35:58.580101 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-gql4w" Dec 01 09:35:59 crc kubenswrapper[5004]: I1201 09:35:59.235554 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-gql4w" Dec 01 09:35:59 crc kubenswrapper[5004]: I1201 09:35:59.293416 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-gql4w"] Dec 01 09:36:01 crc kubenswrapper[5004]: I1201 09:36:01.219161 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-gql4w" podUID="b8f0d5dd-4555-4525-bfa4-b6dd1bf01914" containerName="registry-server" containerID="cri-o://2014e3029e7fca5fb49712972f08f2250b94debaac2cab83ce8751bdab1c40cf" gracePeriod=2 Dec 01 09:36:01 crc kubenswrapper[5004]: I1201 09:36:01.775303 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gql4w" Dec 01 09:36:01 crc kubenswrapper[5004]: I1201 09:36:01.832910 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kdkwr\" (UniqueName: \"kubernetes.io/projected/b8f0d5dd-4555-4525-bfa4-b6dd1bf01914-kube-api-access-kdkwr\") pod \"b8f0d5dd-4555-4525-bfa4-b6dd1bf01914\" (UID: \"b8f0d5dd-4555-4525-bfa4-b6dd1bf01914\") " Dec 01 09:36:01 crc kubenswrapper[5004]: I1201 09:36:01.833047 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8f0d5dd-4555-4525-bfa4-b6dd1bf01914-utilities\") pod \"b8f0d5dd-4555-4525-bfa4-b6dd1bf01914\" (UID: \"b8f0d5dd-4555-4525-bfa4-b6dd1bf01914\") " Dec 01 09:36:01 crc kubenswrapper[5004]: I1201 09:36:01.833329 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8f0d5dd-4555-4525-bfa4-b6dd1bf01914-catalog-content\") pod \"b8f0d5dd-4555-4525-bfa4-b6dd1bf01914\" (UID: \"b8f0d5dd-4555-4525-bfa4-b6dd1bf01914\") " Dec 01 09:36:01 crc kubenswrapper[5004]: I1201 09:36:01.834101 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8f0d5dd-4555-4525-bfa4-b6dd1bf01914-utilities" (OuterVolumeSpecName: "utilities") pod "b8f0d5dd-4555-4525-bfa4-b6dd1bf01914" (UID: 
"b8f0d5dd-4555-4525-bfa4-b6dd1bf01914"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:36:01 crc kubenswrapper[5004]: I1201 09:36:01.841702 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8f0d5dd-4555-4525-bfa4-b6dd1bf01914-kube-api-access-kdkwr" (OuterVolumeSpecName: "kube-api-access-kdkwr") pod "b8f0d5dd-4555-4525-bfa4-b6dd1bf01914" (UID: "b8f0d5dd-4555-4525-bfa4-b6dd1bf01914"). InnerVolumeSpecName "kube-api-access-kdkwr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:36:01 crc kubenswrapper[5004]: I1201 09:36:01.887399 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8f0d5dd-4555-4525-bfa4-b6dd1bf01914-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b8f0d5dd-4555-4525-bfa4-b6dd1bf01914" (UID: "b8f0d5dd-4555-4525-bfa4-b6dd1bf01914"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:36:01 crc kubenswrapper[5004]: I1201 09:36:01.936048 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kdkwr\" (UniqueName: \"kubernetes.io/projected/b8f0d5dd-4555-4525-bfa4-b6dd1bf01914-kube-api-access-kdkwr\") on node \"crc\" DevicePath \"\"" Dec 01 09:36:01 crc kubenswrapper[5004]: I1201 09:36:01.936092 5004 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8f0d5dd-4555-4525-bfa4-b6dd1bf01914-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 09:36:01 crc kubenswrapper[5004]: I1201 09:36:01.936101 5004 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8f0d5dd-4555-4525-bfa4-b6dd1bf01914-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 09:36:02 crc kubenswrapper[5004]: I1201 09:36:02.231964 5004 generic.go:334] "Generic (PLEG): container finished" 
podID="b8f0d5dd-4555-4525-bfa4-b6dd1bf01914" containerID="2014e3029e7fca5fb49712972f08f2250b94debaac2cab83ce8751bdab1c40cf" exitCode=0 Dec 01 09:36:02 crc kubenswrapper[5004]: I1201 09:36:02.232017 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gql4w" Dec 01 09:36:02 crc kubenswrapper[5004]: I1201 09:36:02.232037 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gql4w" event={"ID":"b8f0d5dd-4555-4525-bfa4-b6dd1bf01914","Type":"ContainerDied","Data":"2014e3029e7fca5fb49712972f08f2250b94debaac2cab83ce8751bdab1c40cf"} Dec 01 09:36:02 crc kubenswrapper[5004]: I1201 09:36:02.232985 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gql4w" event={"ID":"b8f0d5dd-4555-4525-bfa4-b6dd1bf01914","Type":"ContainerDied","Data":"a3f7ed04078b80116b17021327f615880cd346e802985bc95c9d9ad34fdba465"} Dec 01 09:36:02 crc kubenswrapper[5004]: I1201 09:36:02.233003 5004 scope.go:117] "RemoveContainer" containerID="2014e3029e7fca5fb49712972f08f2250b94debaac2cab83ce8751bdab1c40cf" Dec 01 09:36:02 crc kubenswrapper[5004]: I1201 09:36:02.255548 5004 scope.go:117] "RemoveContainer" containerID="6ff631ea9ceb84b86e541ef75420d25335d99ac24fe4aebe6f0229bcf8a81f19" Dec 01 09:36:02 crc kubenswrapper[5004]: I1201 09:36:02.272886 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gql4w"] Dec 01 09:36:02 crc kubenswrapper[5004]: I1201 09:36:02.284699 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-gql4w"] Dec 01 09:36:02 crc kubenswrapper[5004]: I1201 09:36:02.293483 5004 scope.go:117] "RemoveContainer" containerID="cfdd5bb36186f2d9b06960d778b609d93500957581be1b0ddd53bca306bc8a5b" Dec 01 09:36:02 crc kubenswrapper[5004]: I1201 09:36:02.344653 5004 scope.go:117] "RemoveContainer" 
containerID="2014e3029e7fca5fb49712972f08f2250b94debaac2cab83ce8751bdab1c40cf" Dec 01 09:36:02 crc kubenswrapper[5004]: E1201 09:36:02.345415 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2014e3029e7fca5fb49712972f08f2250b94debaac2cab83ce8751bdab1c40cf\": container with ID starting with 2014e3029e7fca5fb49712972f08f2250b94debaac2cab83ce8751bdab1c40cf not found: ID does not exist" containerID="2014e3029e7fca5fb49712972f08f2250b94debaac2cab83ce8751bdab1c40cf" Dec 01 09:36:02 crc kubenswrapper[5004]: I1201 09:36:02.345709 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2014e3029e7fca5fb49712972f08f2250b94debaac2cab83ce8751bdab1c40cf"} err="failed to get container status \"2014e3029e7fca5fb49712972f08f2250b94debaac2cab83ce8751bdab1c40cf\": rpc error: code = NotFound desc = could not find container \"2014e3029e7fca5fb49712972f08f2250b94debaac2cab83ce8751bdab1c40cf\": container with ID starting with 2014e3029e7fca5fb49712972f08f2250b94debaac2cab83ce8751bdab1c40cf not found: ID does not exist" Dec 01 09:36:02 crc kubenswrapper[5004]: I1201 09:36:02.345847 5004 scope.go:117] "RemoveContainer" containerID="6ff631ea9ceb84b86e541ef75420d25335d99ac24fe4aebe6f0229bcf8a81f19" Dec 01 09:36:02 crc kubenswrapper[5004]: E1201 09:36:02.346398 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ff631ea9ceb84b86e541ef75420d25335d99ac24fe4aebe6f0229bcf8a81f19\": container with ID starting with 6ff631ea9ceb84b86e541ef75420d25335d99ac24fe4aebe6f0229bcf8a81f19 not found: ID does not exist" containerID="6ff631ea9ceb84b86e541ef75420d25335d99ac24fe4aebe6f0229bcf8a81f19" Dec 01 09:36:02 crc kubenswrapper[5004]: I1201 09:36:02.346438 5004 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"6ff631ea9ceb84b86e541ef75420d25335d99ac24fe4aebe6f0229bcf8a81f19"} err="failed to get container status \"6ff631ea9ceb84b86e541ef75420d25335d99ac24fe4aebe6f0229bcf8a81f19\": rpc error: code = NotFound desc = could not find container \"6ff631ea9ceb84b86e541ef75420d25335d99ac24fe4aebe6f0229bcf8a81f19\": container with ID starting with 6ff631ea9ceb84b86e541ef75420d25335d99ac24fe4aebe6f0229bcf8a81f19 not found: ID does not exist" Dec 01 09:36:02 crc kubenswrapper[5004]: I1201 09:36:02.346465 5004 scope.go:117] "RemoveContainer" containerID="cfdd5bb36186f2d9b06960d778b609d93500957581be1b0ddd53bca306bc8a5b" Dec 01 09:36:02 crc kubenswrapper[5004]: E1201 09:36:02.346924 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cfdd5bb36186f2d9b06960d778b609d93500957581be1b0ddd53bca306bc8a5b\": container with ID starting with cfdd5bb36186f2d9b06960d778b609d93500957581be1b0ddd53bca306bc8a5b not found: ID does not exist" containerID="cfdd5bb36186f2d9b06960d778b609d93500957581be1b0ddd53bca306bc8a5b" Dec 01 09:36:02 crc kubenswrapper[5004]: I1201 09:36:02.346988 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cfdd5bb36186f2d9b06960d778b609d93500957581be1b0ddd53bca306bc8a5b"} err="failed to get container status \"cfdd5bb36186f2d9b06960d778b609d93500957581be1b0ddd53bca306bc8a5b\": rpc error: code = NotFound desc = could not find container \"cfdd5bb36186f2d9b06960d778b609d93500957581be1b0ddd53bca306bc8a5b\": container with ID starting with cfdd5bb36186f2d9b06960d778b609d93500957581be1b0ddd53bca306bc8a5b not found: ID does not exist" Dec 01 09:36:02 crc kubenswrapper[5004]: I1201 09:36:02.774027 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8f0d5dd-4555-4525-bfa4-b6dd1bf01914" path="/var/lib/kubelet/pods/b8f0d5dd-4555-4525-bfa4-b6dd1bf01914/volumes" Dec 01 09:36:03 crc kubenswrapper[5004]: I1201 
09:36:03.759126 5004 scope.go:117] "RemoveContainer" containerID="ccf28834b75e460fde85109220042abb807e53b91adea0a0b179d3e7961709b8" Dec 01 09:36:03 crc kubenswrapper[5004]: E1201 09:36:03.759652 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 09:36:14 crc kubenswrapper[5004]: I1201 09:36:14.759345 5004 scope.go:117] "RemoveContainer" containerID="ccf28834b75e460fde85109220042abb807e53b91adea0a0b179d3e7961709b8" Dec 01 09:36:14 crc kubenswrapper[5004]: E1201 09:36:14.760159 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 09:36:27 crc kubenswrapper[5004]: I1201 09:36:27.760278 5004 scope.go:117] "RemoveContainer" containerID="ccf28834b75e460fde85109220042abb807e53b91adea0a0b179d3e7961709b8" Dec 01 09:36:27 crc kubenswrapper[5004]: E1201 09:36:27.761767 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 09:36:39 crc 
kubenswrapper[5004]: I1201 09:36:39.759592 5004 scope.go:117] "RemoveContainer" containerID="ccf28834b75e460fde85109220042abb807e53b91adea0a0b179d3e7961709b8" Dec 01 09:36:39 crc kubenswrapper[5004]: E1201 09:36:39.760407 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 09:36:53 crc kubenswrapper[5004]: I1201 09:36:53.758966 5004 scope.go:117] "RemoveContainer" containerID="ccf28834b75e460fde85109220042abb807e53b91adea0a0b179d3e7961709b8" Dec 01 09:36:53 crc kubenswrapper[5004]: E1201 09:36:53.760833 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 09:37:07 crc kubenswrapper[5004]: I1201 09:37:07.759714 5004 scope.go:117] "RemoveContainer" containerID="ccf28834b75e460fde85109220042abb807e53b91adea0a0b179d3e7961709b8" Dec 01 09:37:07 crc kubenswrapper[5004]: E1201 09:37:07.761105 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 
01 09:37:18 crc kubenswrapper[5004]: I1201 09:37:18.759537 5004 scope.go:117] "RemoveContainer" containerID="ccf28834b75e460fde85109220042abb807e53b91adea0a0b179d3e7961709b8" Dec 01 09:37:18 crc kubenswrapper[5004]: E1201 09:37:18.760646 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 09:37:31 crc kubenswrapper[5004]: I1201 09:37:31.759584 5004 scope.go:117] "RemoveContainer" containerID="ccf28834b75e460fde85109220042abb807e53b91adea0a0b179d3e7961709b8" Dec 01 09:37:31 crc kubenswrapper[5004]: E1201 09:37:31.760353 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 09:37:32 crc kubenswrapper[5004]: E1201 09:37:32.607916 5004 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.75:51660->38.102.83.75:44201: write tcp 38.102.83.75:51660->38.102.83.75:44201: write: broken pipe Dec 01 09:37:44 crc kubenswrapper[5004]: I1201 09:37:44.761279 5004 scope.go:117] "RemoveContainer" containerID="ccf28834b75e460fde85109220042abb807e53b91adea0a0b179d3e7961709b8" Dec 01 09:37:45 crc kubenswrapper[5004]: I1201 09:37:45.833644 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" 
event={"ID":"f9977ebb-82de-4e96-8763-0b5a84f8d4ce","Type":"ContainerStarted","Data":"689fffec50469597d0fa1c55a6831d3cb9e176fbbe5744fd1cbafad2fe703d10"} Dec 01 09:40:08 crc kubenswrapper[5004]: I1201 09:40:08.729179 5004 patch_prober.go:28] interesting pod/machine-config-daemon-fvdgt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 09:40:08 crc kubenswrapper[5004]: I1201 09:40:08.729662 5004 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 09:40:29 crc kubenswrapper[5004]: I1201 09:40:29.731955 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Dec 01 09:40:29 crc kubenswrapper[5004]: E1201 09:40:29.732987 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8f0d5dd-4555-4525-bfa4-b6dd1bf01914" containerName="extract-utilities" Dec 01 09:40:29 crc kubenswrapper[5004]: I1201 09:40:29.733007 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8f0d5dd-4555-4525-bfa4-b6dd1bf01914" containerName="extract-utilities" Dec 01 09:40:29 crc kubenswrapper[5004]: E1201 09:40:29.733046 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8f0d5dd-4555-4525-bfa4-b6dd1bf01914" containerName="extract-content" Dec 01 09:40:29 crc kubenswrapper[5004]: I1201 09:40:29.733055 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8f0d5dd-4555-4525-bfa4-b6dd1bf01914" containerName="extract-content" Dec 01 09:40:29 crc kubenswrapper[5004]: E1201 09:40:29.733090 5004 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="b8f0d5dd-4555-4525-bfa4-b6dd1bf01914" containerName="registry-server" Dec 01 09:40:29 crc kubenswrapper[5004]: I1201 09:40:29.733101 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8f0d5dd-4555-4525-bfa4-b6dd1bf01914" containerName="registry-server" Dec 01 09:40:29 crc kubenswrapper[5004]: I1201 09:40:29.733379 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8f0d5dd-4555-4525-bfa4-b6dd1bf01914" containerName="registry-server" Dec 01 09:40:29 crc kubenswrapper[5004]: I1201 09:40:29.734407 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 01 09:40:29 crc kubenswrapper[5004]: I1201 09:40:29.737677 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Dec 01 09:40:29 crc kubenswrapper[5004]: I1201 09:40:29.738201 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Dec 01 09:40:29 crc kubenswrapper[5004]: I1201 09:40:29.738494 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-2srjp" Dec 01 09:40:29 crc kubenswrapper[5004]: I1201 09:40:29.742454 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Dec 01 09:40:29 crc kubenswrapper[5004]: I1201 09:40:29.753836 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Dec 01 09:40:29 crc kubenswrapper[5004]: I1201 09:40:29.829267 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b624b6f4-e294-427a-94ac-358b5be6897b-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"b624b6f4-e294-427a-94ac-358b5be6897b\") " pod="openstack/tempest-tests-tempest" Dec 01 09:40:29 crc kubenswrapper[5004]: I1201 09:40:29.829375 5004 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b624b6f4-e294-427a-94ac-358b5be6897b-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"b624b6f4-e294-427a-94ac-358b5be6897b\") " pod="openstack/tempest-tests-tempest" Dec 01 09:40:29 crc kubenswrapper[5004]: I1201 09:40:29.829442 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/b624b6f4-e294-427a-94ac-358b5be6897b-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"b624b6f4-e294-427a-94ac-358b5be6897b\") " pod="openstack/tempest-tests-tempest" Dec 01 09:40:29 crc kubenswrapper[5004]: I1201 09:40:29.829921 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b624b6f4-e294-427a-94ac-358b5be6897b-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"b624b6f4-e294-427a-94ac-358b5be6897b\") " pod="openstack/tempest-tests-tempest" Dec 01 09:40:29 crc kubenswrapper[5004]: I1201 09:40:29.830036 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/b624b6f4-e294-427a-94ac-358b5be6897b-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"b624b6f4-e294-427a-94ac-358b5be6897b\") " pod="openstack/tempest-tests-tempest" Dec 01 09:40:29 crc kubenswrapper[5004]: I1201 09:40:29.830075 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nw6tg\" (UniqueName: \"kubernetes.io/projected/b624b6f4-e294-427a-94ac-358b5be6897b-kube-api-access-nw6tg\") pod \"tempest-tests-tempest\" (UID: \"b624b6f4-e294-427a-94ac-358b5be6897b\") " pod="openstack/tempest-tests-tempest" Dec 01 09:40:29 crc kubenswrapper[5004]: I1201 09:40:29.830522 
5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b624b6f4-e294-427a-94ac-358b5be6897b-config-data\") pod \"tempest-tests-tempest\" (UID: \"b624b6f4-e294-427a-94ac-358b5be6897b\") " pod="openstack/tempest-tests-tempest" Dec 01 09:40:29 crc kubenswrapper[5004]: I1201 09:40:29.830738 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/b624b6f4-e294-427a-94ac-358b5be6897b-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"b624b6f4-e294-427a-94ac-358b5be6897b\") " pod="openstack/tempest-tests-tempest" Dec 01 09:40:29 crc kubenswrapper[5004]: I1201 09:40:29.830891 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"tempest-tests-tempest\" (UID: \"b624b6f4-e294-427a-94ac-358b5be6897b\") " pod="openstack/tempest-tests-tempest" Dec 01 09:40:29 crc kubenswrapper[5004]: I1201 09:40:29.933630 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b624b6f4-e294-427a-94ac-358b5be6897b-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"b624b6f4-e294-427a-94ac-358b5be6897b\") " pod="openstack/tempest-tests-tempest" Dec 01 09:40:29 crc kubenswrapper[5004]: I1201 09:40:29.933682 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b624b6f4-e294-427a-94ac-358b5be6897b-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"b624b6f4-e294-427a-94ac-358b5be6897b\") " pod="openstack/tempest-tests-tempest" Dec 01 09:40:29 crc kubenswrapper[5004]: I1201 09:40:29.933711 5004 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/b624b6f4-e294-427a-94ac-358b5be6897b-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"b624b6f4-e294-427a-94ac-358b5be6897b\") " pod="openstack/tempest-tests-tempest" Dec 01 09:40:29 crc kubenswrapper[5004]: I1201 09:40:29.933834 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b624b6f4-e294-427a-94ac-358b5be6897b-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"b624b6f4-e294-427a-94ac-358b5be6897b\") " pod="openstack/tempest-tests-tempest" Dec 01 09:40:29 crc kubenswrapper[5004]: I1201 09:40:29.933877 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/b624b6f4-e294-427a-94ac-358b5be6897b-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"b624b6f4-e294-427a-94ac-358b5be6897b\") " pod="openstack/tempest-tests-tempest" Dec 01 09:40:29 crc kubenswrapper[5004]: I1201 09:40:29.934747 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nw6tg\" (UniqueName: \"kubernetes.io/projected/b624b6f4-e294-427a-94ac-358b5be6897b-kube-api-access-nw6tg\") pod \"tempest-tests-tempest\" (UID: \"b624b6f4-e294-427a-94ac-358b5be6897b\") " pod="openstack/tempest-tests-tempest" Dec 01 09:40:29 crc kubenswrapper[5004]: I1201 09:40:29.934838 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b624b6f4-e294-427a-94ac-358b5be6897b-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"b624b6f4-e294-427a-94ac-358b5be6897b\") " pod="openstack/tempest-tests-tempest" Dec 01 09:40:29 crc kubenswrapper[5004]: I1201 09:40:29.934864 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: 
\"kubernetes.io/empty-dir/b624b6f4-e294-427a-94ac-358b5be6897b-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"b624b6f4-e294-427a-94ac-358b5be6897b\") " pod="openstack/tempest-tests-tempest" Dec 01 09:40:29 crc kubenswrapper[5004]: I1201 09:40:29.935016 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b624b6f4-e294-427a-94ac-358b5be6897b-config-data\") pod \"tempest-tests-tempest\" (UID: \"b624b6f4-e294-427a-94ac-358b5be6897b\") " pod="openstack/tempest-tests-tempest" Dec 01 09:40:29 crc kubenswrapper[5004]: I1201 09:40:29.935119 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/b624b6f4-e294-427a-94ac-358b5be6897b-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"b624b6f4-e294-427a-94ac-358b5be6897b\") " pod="openstack/tempest-tests-tempest" Dec 01 09:40:29 crc kubenswrapper[5004]: I1201 09:40:29.935182 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"tempest-tests-tempest\" (UID: \"b624b6f4-e294-427a-94ac-358b5be6897b\") " pod="openstack/tempest-tests-tempest" Dec 01 09:40:29 crc kubenswrapper[5004]: I1201 09:40:29.935488 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/b624b6f4-e294-427a-94ac-358b5be6897b-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"b624b6f4-e294-427a-94ac-358b5be6897b\") " pod="openstack/tempest-tests-tempest" Dec 01 09:40:29 crc kubenswrapper[5004]: I1201 09:40:29.936235 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b624b6f4-e294-427a-94ac-358b5be6897b-config-data\") 
pod \"tempest-tests-tempest\" (UID: \"b624b6f4-e294-427a-94ac-358b5be6897b\") " pod="openstack/tempest-tests-tempest" Dec 01 09:40:29 crc kubenswrapper[5004]: I1201 09:40:29.936396 5004 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"tempest-tests-tempest\" (UID: \"b624b6f4-e294-427a-94ac-358b5be6897b\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/tempest-tests-tempest" Dec 01 09:40:29 crc kubenswrapper[5004]: I1201 09:40:29.940321 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b624b6f4-e294-427a-94ac-358b5be6897b-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"b624b6f4-e294-427a-94ac-358b5be6897b\") " pod="openstack/tempest-tests-tempest" Dec 01 09:40:29 crc kubenswrapper[5004]: I1201 09:40:29.940884 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/b624b6f4-e294-427a-94ac-358b5be6897b-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"b624b6f4-e294-427a-94ac-358b5be6897b\") " pod="openstack/tempest-tests-tempest" Dec 01 09:40:29 crc kubenswrapper[5004]: I1201 09:40:29.945133 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b624b6f4-e294-427a-94ac-358b5be6897b-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"b624b6f4-e294-427a-94ac-358b5be6897b\") " pod="openstack/tempest-tests-tempest" Dec 01 09:40:29 crc kubenswrapper[5004]: I1201 09:40:29.963609 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nw6tg\" (UniqueName: \"kubernetes.io/projected/b624b6f4-e294-427a-94ac-358b5be6897b-kube-api-access-nw6tg\") pod \"tempest-tests-tempest\" (UID: \"b624b6f4-e294-427a-94ac-358b5be6897b\") " pod="openstack/tempest-tests-tempest" Dec 01 09:40:30 crc 
kubenswrapper[5004]: I1201 09:40:30.002276 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"tempest-tests-tempest\" (UID: \"b624b6f4-e294-427a-94ac-358b5be6897b\") " pod="openstack/tempest-tests-tempest" Dec 01 09:40:30 crc kubenswrapper[5004]: I1201 09:40:30.066434 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 01 09:40:30 crc kubenswrapper[5004]: I1201 09:40:30.547452 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Dec 01 09:40:31 crc kubenswrapper[5004]: I1201 09:40:31.121451 5004 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 01 09:40:31 crc kubenswrapper[5004]: I1201 09:40:31.695939 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"b624b6f4-e294-427a-94ac-358b5be6897b","Type":"ContainerStarted","Data":"05795bb73619fbb81f76e41f982cd7e652b7c3c06352a27d591186a800d3fc55"} Dec 01 09:40:38 crc kubenswrapper[5004]: I1201 09:40:38.729981 5004 patch_prober.go:28] interesting pod/machine-config-daemon-fvdgt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 09:40:38 crc kubenswrapper[5004]: I1201 09:40:38.730528 5004 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 09:41:06 crc kubenswrapper[5004]: E1201 09:41:06.252328 5004 log.go:32] "PullImage from image service failed" 
err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Dec 01 09:41:06 crc kubenswrapper[5004]: E1201 09:41:06.253898 5004 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,Su
bPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nw6tg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(b624b6f4-e294-427a-94ac-358b5be6897b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 09:41:06 crc kubenswrapper[5004]: E1201 09:41:06.255099 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="b624b6f4-e294-427a-94ac-358b5be6897b" Dec 01 09:41:07 crc kubenswrapper[5004]: E1201 09:41:07.110373 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="b624b6f4-e294-427a-94ac-358b5be6897b" Dec 01 09:41:08 crc kubenswrapper[5004]: I1201 09:41:08.729947 5004 patch_prober.go:28] interesting pod/machine-config-daemon-fvdgt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 09:41:08 crc kubenswrapper[5004]: I1201 09:41:08.730347 5004 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 09:41:08 crc kubenswrapper[5004]: I1201 09:41:08.730458 5004 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" Dec 01 09:41:08 crc kubenswrapper[5004]: I1201 09:41:08.731969 5004 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"689fffec50469597d0fa1c55a6831d3cb9e176fbbe5744fd1cbafad2fe703d10"} pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 09:41:08 crc kubenswrapper[5004]: I1201 09:41:08.732100 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" containerName="machine-config-daemon" 
containerID="cri-o://689fffec50469597d0fa1c55a6831d3cb9e176fbbe5744fd1cbafad2fe703d10" gracePeriod=600 Dec 01 09:41:09 crc kubenswrapper[5004]: I1201 09:41:09.133681 5004 generic.go:334] "Generic (PLEG): container finished" podID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" containerID="689fffec50469597d0fa1c55a6831d3cb9e176fbbe5744fd1cbafad2fe703d10" exitCode=0 Dec 01 09:41:09 crc kubenswrapper[5004]: I1201 09:41:09.133740 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" event={"ID":"f9977ebb-82de-4e96-8763-0b5a84f8d4ce","Type":"ContainerDied","Data":"689fffec50469597d0fa1c55a6831d3cb9e176fbbe5744fd1cbafad2fe703d10"} Dec 01 09:41:09 crc kubenswrapper[5004]: I1201 09:41:09.133777 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" event={"ID":"f9977ebb-82de-4e96-8763-0b5a84f8d4ce","Type":"ContainerStarted","Data":"844931be84dd9ed1d62e412f6ab6e17f8e3efdf77d93563aaf4b60bee8477332"} Dec 01 09:41:09 crc kubenswrapper[5004]: I1201 09:41:09.133794 5004 scope.go:117] "RemoveContainer" containerID="ccf28834b75e460fde85109220042abb807e53b91adea0a0b179d3e7961709b8" Dec 01 09:41:19 crc kubenswrapper[5004]: I1201 09:41:19.359305 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Dec 01 09:41:21 crc kubenswrapper[5004]: I1201 09:41:21.289928 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"b624b6f4-e294-427a-94ac-358b5be6897b","Type":"ContainerStarted","Data":"429362d7683522181f61d793ac96ab2053aab7a3e7528b36f6c8459ffa813d6c"} Dec 01 09:41:21 crc kubenswrapper[5004]: I1201 09:41:21.331809 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=5.097136352 podStartE2EDuration="53.331784772s" podCreationTimestamp="2025-12-01 09:40:28 +0000 UTC" 
firstStartedPulling="2025-12-01 09:40:31.121200399 +0000 UTC m=+5008.686192381" lastFinishedPulling="2025-12-01 09:41:19.355848819 +0000 UTC m=+5056.920840801" observedRunningTime="2025-12-01 09:41:21.326536995 +0000 UTC m=+5058.891528977" watchObservedRunningTime="2025-12-01 09:41:21.331784772 +0000 UTC m=+5058.896776754" Dec 01 09:42:38 crc kubenswrapper[5004]: I1201 09:42:38.699334 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-t9rhs"] Dec 01 09:42:38 crc kubenswrapper[5004]: I1201 09:42:38.707721 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-t9rhs" Dec 01 09:42:38 crc kubenswrapper[5004]: I1201 09:42:38.805386 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-t9rhs"] Dec 01 09:42:38 crc kubenswrapper[5004]: I1201 09:42:38.836551 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8djkr\" (UniqueName: \"kubernetes.io/projected/1636321f-2475-4b45-9551-396e0a65cc23-kube-api-access-8djkr\") pod \"redhat-operators-t9rhs\" (UID: \"1636321f-2475-4b45-9551-396e0a65cc23\") " pod="openshift-marketplace/redhat-operators-t9rhs" Dec 01 09:42:38 crc kubenswrapper[5004]: I1201 09:42:38.836750 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1636321f-2475-4b45-9551-396e0a65cc23-utilities\") pod \"redhat-operators-t9rhs\" (UID: \"1636321f-2475-4b45-9551-396e0a65cc23\") " pod="openshift-marketplace/redhat-operators-t9rhs" Dec 01 09:42:38 crc kubenswrapper[5004]: I1201 09:42:38.836845 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1636321f-2475-4b45-9551-396e0a65cc23-catalog-content\") pod \"redhat-operators-t9rhs\" (UID: 
\"1636321f-2475-4b45-9551-396e0a65cc23\") " pod="openshift-marketplace/redhat-operators-t9rhs" Dec 01 09:42:38 crc kubenswrapper[5004]: I1201 09:42:38.939319 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8djkr\" (UniqueName: \"kubernetes.io/projected/1636321f-2475-4b45-9551-396e0a65cc23-kube-api-access-8djkr\") pod \"redhat-operators-t9rhs\" (UID: \"1636321f-2475-4b45-9551-396e0a65cc23\") " pod="openshift-marketplace/redhat-operators-t9rhs" Dec 01 09:42:38 crc kubenswrapper[5004]: I1201 09:42:38.939493 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1636321f-2475-4b45-9551-396e0a65cc23-utilities\") pod \"redhat-operators-t9rhs\" (UID: \"1636321f-2475-4b45-9551-396e0a65cc23\") " pod="openshift-marketplace/redhat-operators-t9rhs" Dec 01 09:42:38 crc kubenswrapper[5004]: I1201 09:42:38.939665 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1636321f-2475-4b45-9551-396e0a65cc23-catalog-content\") pod \"redhat-operators-t9rhs\" (UID: \"1636321f-2475-4b45-9551-396e0a65cc23\") " pod="openshift-marketplace/redhat-operators-t9rhs" Dec 01 09:42:38 crc kubenswrapper[5004]: I1201 09:42:38.940619 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1636321f-2475-4b45-9551-396e0a65cc23-catalog-content\") pod \"redhat-operators-t9rhs\" (UID: \"1636321f-2475-4b45-9551-396e0a65cc23\") " pod="openshift-marketplace/redhat-operators-t9rhs" Dec 01 09:42:38 crc kubenswrapper[5004]: I1201 09:42:38.941420 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1636321f-2475-4b45-9551-396e0a65cc23-utilities\") pod \"redhat-operators-t9rhs\" (UID: \"1636321f-2475-4b45-9551-396e0a65cc23\") " 
pod="openshift-marketplace/redhat-operators-t9rhs" Dec 01 09:42:38 crc kubenswrapper[5004]: I1201 09:42:38.969677 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8djkr\" (UniqueName: \"kubernetes.io/projected/1636321f-2475-4b45-9551-396e0a65cc23-kube-api-access-8djkr\") pod \"redhat-operators-t9rhs\" (UID: \"1636321f-2475-4b45-9551-396e0a65cc23\") " pod="openshift-marketplace/redhat-operators-t9rhs" Dec 01 09:42:39 crc kubenswrapper[5004]: I1201 09:42:39.035602 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-t9rhs" Dec 01 09:42:39 crc kubenswrapper[5004]: I1201 09:42:39.865319 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-t9rhs"] Dec 01 09:42:40 crc kubenswrapper[5004]: I1201 09:42:40.224926 5004 generic.go:334] "Generic (PLEG): container finished" podID="1636321f-2475-4b45-9551-396e0a65cc23" containerID="ce9e32ce1275293d08399828b2efe1ab6714d5afddd63ddee19b13bc3dbee61f" exitCode=0 Dec 01 09:42:40 crc kubenswrapper[5004]: I1201 09:42:40.225042 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t9rhs" event={"ID":"1636321f-2475-4b45-9551-396e0a65cc23","Type":"ContainerDied","Data":"ce9e32ce1275293d08399828b2efe1ab6714d5afddd63ddee19b13bc3dbee61f"} Dec 01 09:42:40 crc kubenswrapper[5004]: I1201 09:42:40.225259 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t9rhs" event={"ID":"1636321f-2475-4b45-9551-396e0a65cc23","Type":"ContainerStarted","Data":"6c4de60dce5ac11cb359ecaf8926294085227ec3b014d99130240144c5388abf"} Dec 01 09:42:42 crc kubenswrapper[5004]: I1201 09:42:42.247402 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t9rhs" 
event={"ID":"1636321f-2475-4b45-9551-396e0a65cc23","Type":"ContainerStarted","Data":"a9159a43a2be443da9dbcd7784ec7ceaf1b1ac30ef7ebc80061a6ec890818806"} Dec 01 09:42:44 crc kubenswrapper[5004]: I1201 09:42:44.271141 5004 generic.go:334] "Generic (PLEG): container finished" podID="1636321f-2475-4b45-9551-396e0a65cc23" containerID="a9159a43a2be443da9dbcd7784ec7ceaf1b1ac30ef7ebc80061a6ec890818806" exitCode=0 Dec 01 09:42:44 crc kubenswrapper[5004]: I1201 09:42:44.271241 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t9rhs" event={"ID":"1636321f-2475-4b45-9551-396e0a65cc23","Type":"ContainerDied","Data":"a9159a43a2be443da9dbcd7784ec7ceaf1b1ac30ef7ebc80061a6ec890818806"} Dec 01 09:42:45 crc kubenswrapper[5004]: I1201 09:42:45.284605 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t9rhs" event={"ID":"1636321f-2475-4b45-9551-396e0a65cc23","Type":"ContainerStarted","Data":"61c51dd4a5714567e3ee96ff8700054cb4d26ffc91e37943bae9c471fb411ddc"} Dec 01 09:42:45 crc kubenswrapper[5004]: I1201 09:42:45.305120 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-t9rhs" podStartSLOduration=2.84262467 podStartE2EDuration="7.304687523s" podCreationTimestamp="2025-12-01 09:42:38 +0000 UTC" firstStartedPulling="2025-12-01 09:42:40.226719879 +0000 UTC m=+5137.791711861" lastFinishedPulling="2025-12-01 09:42:44.688782732 +0000 UTC m=+5142.253774714" observedRunningTime="2025-12-01 09:42:45.30288855 +0000 UTC m=+5142.867880542" watchObservedRunningTime="2025-12-01 09:42:45.304687523 +0000 UTC m=+5142.869679505" Dec 01 09:42:49 crc kubenswrapper[5004]: I1201 09:42:49.036366 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-t9rhs" Dec 01 09:42:49 crc kubenswrapper[5004]: I1201 09:42:49.037005 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-operators-t9rhs" Dec 01 09:42:50 crc kubenswrapper[5004]: I1201 09:42:50.102859 5004 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-t9rhs" podUID="1636321f-2475-4b45-9551-396e0a65cc23" containerName="registry-server" probeResult="failure" output=< Dec 01 09:42:50 crc kubenswrapper[5004]: timeout: failed to connect service ":50051" within 1s Dec 01 09:42:50 crc kubenswrapper[5004]: > Dec 01 09:42:59 crc kubenswrapper[5004]: I1201 09:42:59.470711 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-t9rhs" Dec 01 09:42:59 crc kubenswrapper[5004]: I1201 09:42:59.522958 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-t9rhs" Dec 01 09:42:59 crc kubenswrapper[5004]: I1201 09:42:59.711729 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-t9rhs"] Dec 01 09:43:01 crc kubenswrapper[5004]: I1201 09:43:01.479326 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-t9rhs" podUID="1636321f-2475-4b45-9551-396e0a65cc23" containerName="registry-server" containerID="cri-o://61c51dd4a5714567e3ee96ff8700054cb4d26ffc91e37943bae9c471fb411ddc" gracePeriod=2 Dec 01 09:43:02 crc kubenswrapper[5004]: I1201 09:43:02.338819 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-t9rhs" Dec 01 09:43:02 crc kubenswrapper[5004]: I1201 09:43:02.461068 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1636321f-2475-4b45-9551-396e0a65cc23-utilities\") pod \"1636321f-2475-4b45-9551-396e0a65cc23\" (UID: \"1636321f-2475-4b45-9551-396e0a65cc23\") " Dec 01 09:43:02 crc kubenswrapper[5004]: I1201 09:43:02.461495 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8djkr\" (UniqueName: \"kubernetes.io/projected/1636321f-2475-4b45-9551-396e0a65cc23-kube-api-access-8djkr\") pod \"1636321f-2475-4b45-9551-396e0a65cc23\" (UID: \"1636321f-2475-4b45-9551-396e0a65cc23\") " Dec 01 09:43:02 crc kubenswrapper[5004]: I1201 09:43:02.461694 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1636321f-2475-4b45-9551-396e0a65cc23-catalog-content\") pod \"1636321f-2475-4b45-9551-396e0a65cc23\" (UID: \"1636321f-2475-4b45-9551-396e0a65cc23\") " Dec 01 09:43:02 crc kubenswrapper[5004]: I1201 09:43:02.462983 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1636321f-2475-4b45-9551-396e0a65cc23-utilities" (OuterVolumeSpecName: "utilities") pod "1636321f-2475-4b45-9551-396e0a65cc23" (UID: "1636321f-2475-4b45-9551-396e0a65cc23"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:43:02 crc kubenswrapper[5004]: I1201 09:43:02.478111 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1636321f-2475-4b45-9551-396e0a65cc23-kube-api-access-8djkr" (OuterVolumeSpecName: "kube-api-access-8djkr") pod "1636321f-2475-4b45-9551-396e0a65cc23" (UID: "1636321f-2475-4b45-9551-396e0a65cc23"). InnerVolumeSpecName "kube-api-access-8djkr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:43:02 crc kubenswrapper[5004]: I1201 09:43:02.496666 5004 generic.go:334] "Generic (PLEG): container finished" podID="1636321f-2475-4b45-9551-396e0a65cc23" containerID="61c51dd4a5714567e3ee96ff8700054cb4d26ffc91e37943bae9c471fb411ddc" exitCode=0 Dec 01 09:43:02 crc kubenswrapper[5004]: I1201 09:43:02.496710 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t9rhs" event={"ID":"1636321f-2475-4b45-9551-396e0a65cc23","Type":"ContainerDied","Data":"61c51dd4a5714567e3ee96ff8700054cb4d26ffc91e37943bae9c471fb411ddc"} Dec 01 09:43:02 crc kubenswrapper[5004]: I1201 09:43:02.496736 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t9rhs" event={"ID":"1636321f-2475-4b45-9551-396e0a65cc23","Type":"ContainerDied","Data":"6c4de60dce5ac11cb359ecaf8926294085227ec3b014d99130240144c5388abf"} Dec 01 09:43:02 crc kubenswrapper[5004]: I1201 09:43:02.497158 5004 scope.go:117] "RemoveContainer" containerID="61c51dd4a5714567e3ee96ff8700054cb4d26ffc91e37943bae9c471fb411ddc" Dec 01 09:43:02 crc kubenswrapper[5004]: I1201 09:43:02.497320 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-t9rhs" Dec 01 09:43:02 crc kubenswrapper[5004]: I1201 09:43:02.563958 5004 scope.go:117] "RemoveContainer" containerID="a9159a43a2be443da9dbcd7784ec7ceaf1b1ac30ef7ebc80061a6ec890818806" Dec 01 09:43:02 crc kubenswrapper[5004]: I1201 09:43:02.565090 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8djkr\" (UniqueName: \"kubernetes.io/projected/1636321f-2475-4b45-9551-396e0a65cc23-kube-api-access-8djkr\") on node \"crc\" DevicePath \"\"" Dec 01 09:43:02 crc kubenswrapper[5004]: I1201 09:43:02.565126 5004 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1636321f-2475-4b45-9551-396e0a65cc23-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 09:43:02 crc kubenswrapper[5004]: I1201 09:43:02.596116 5004 scope.go:117] "RemoveContainer" containerID="ce9e32ce1275293d08399828b2efe1ab6714d5afddd63ddee19b13bc3dbee61f" Dec 01 09:43:02 crc kubenswrapper[5004]: I1201 09:43:02.605722 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1636321f-2475-4b45-9551-396e0a65cc23-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1636321f-2475-4b45-9551-396e0a65cc23" (UID: "1636321f-2475-4b45-9551-396e0a65cc23"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:43:02 crc kubenswrapper[5004]: I1201 09:43:02.667554 5004 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1636321f-2475-4b45-9551-396e0a65cc23-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 09:43:02 crc kubenswrapper[5004]: I1201 09:43:02.674126 5004 scope.go:117] "RemoveContainer" containerID="61c51dd4a5714567e3ee96ff8700054cb4d26ffc91e37943bae9c471fb411ddc" Dec 01 09:43:02 crc kubenswrapper[5004]: E1201 09:43:02.681831 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61c51dd4a5714567e3ee96ff8700054cb4d26ffc91e37943bae9c471fb411ddc\": container with ID starting with 61c51dd4a5714567e3ee96ff8700054cb4d26ffc91e37943bae9c471fb411ddc not found: ID does not exist" containerID="61c51dd4a5714567e3ee96ff8700054cb4d26ffc91e37943bae9c471fb411ddc" Dec 01 09:43:02 crc kubenswrapper[5004]: I1201 09:43:02.682322 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61c51dd4a5714567e3ee96ff8700054cb4d26ffc91e37943bae9c471fb411ddc"} err="failed to get container status \"61c51dd4a5714567e3ee96ff8700054cb4d26ffc91e37943bae9c471fb411ddc\": rpc error: code = NotFound desc = could not find container \"61c51dd4a5714567e3ee96ff8700054cb4d26ffc91e37943bae9c471fb411ddc\": container with ID starting with 61c51dd4a5714567e3ee96ff8700054cb4d26ffc91e37943bae9c471fb411ddc not found: ID does not exist" Dec 01 09:43:02 crc kubenswrapper[5004]: I1201 09:43:02.682450 5004 scope.go:117] "RemoveContainer" containerID="a9159a43a2be443da9dbcd7784ec7ceaf1b1ac30ef7ebc80061a6ec890818806" Dec 01 09:43:02 crc kubenswrapper[5004]: E1201 09:43:02.683198 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"a9159a43a2be443da9dbcd7784ec7ceaf1b1ac30ef7ebc80061a6ec890818806\": container with ID starting with a9159a43a2be443da9dbcd7784ec7ceaf1b1ac30ef7ebc80061a6ec890818806 not found: ID does not exist" containerID="a9159a43a2be443da9dbcd7784ec7ceaf1b1ac30ef7ebc80061a6ec890818806" Dec 01 09:43:02 crc kubenswrapper[5004]: I1201 09:43:02.683244 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9159a43a2be443da9dbcd7784ec7ceaf1b1ac30ef7ebc80061a6ec890818806"} err="failed to get container status \"a9159a43a2be443da9dbcd7784ec7ceaf1b1ac30ef7ebc80061a6ec890818806\": rpc error: code = NotFound desc = could not find container \"a9159a43a2be443da9dbcd7784ec7ceaf1b1ac30ef7ebc80061a6ec890818806\": container with ID starting with a9159a43a2be443da9dbcd7784ec7ceaf1b1ac30ef7ebc80061a6ec890818806 not found: ID does not exist" Dec 01 09:43:02 crc kubenswrapper[5004]: I1201 09:43:02.683270 5004 scope.go:117] "RemoveContainer" containerID="ce9e32ce1275293d08399828b2efe1ab6714d5afddd63ddee19b13bc3dbee61f" Dec 01 09:43:02 crc kubenswrapper[5004]: E1201 09:43:02.683685 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce9e32ce1275293d08399828b2efe1ab6714d5afddd63ddee19b13bc3dbee61f\": container with ID starting with ce9e32ce1275293d08399828b2efe1ab6714d5afddd63ddee19b13bc3dbee61f not found: ID does not exist" containerID="ce9e32ce1275293d08399828b2efe1ab6714d5afddd63ddee19b13bc3dbee61f" Dec 01 09:43:02 crc kubenswrapper[5004]: I1201 09:43:02.683713 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce9e32ce1275293d08399828b2efe1ab6714d5afddd63ddee19b13bc3dbee61f"} err="failed to get container status \"ce9e32ce1275293d08399828b2efe1ab6714d5afddd63ddee19b13bc3dbee61f\": rpc error: code = NotFound desc = could not find container \"ce9e32ce1275293d08399828b2efe1ab6714d5afddd63ddee19b13bc3dbee61f\": container with ID 
starting with ce9e32ce1275293d08399828b2efe1ab6714d5afddd63ddee19b13bc3dbee61f not found: ID does not exist" Dec 01 09:43:02 crc kubenswrapper[5004]: I1201 09:43:02.848147 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-t9rhs"] Dec 01 09:43:02 crc kubenswrapper[5004]: I1201 09:43:02.860956 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-t9rhs"] Dec 01 09:43:04 crc kubenswrapper[5004]: I1201 09:43:04.772173 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1636321f-2475-4b45-9551-396e0a65cc23" path="/var/lib/kubelet/pods/1636321f-2475-4b45-9551-396e0a65cc23/volumes" Dec 01 09:43:05 crc kubenswrapper[5004]: E1201 09:43:05.883580 5004 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1636321f_2475_4b45_9551_396e0a65cc23.slice/crio-6c4de60dce5ac11cb359ecaf8926294085227ec3b014d99130240144c5388abf\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1636321f_2475_4b45_9551_396e0a65cc23.slice\": RecentStats: unable to find data in memory cache]" Dec 01 09:43:08 crc kubenswrapper[5004]: E1201 09:43:08.875895 5004 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1636321f_2475_4b45_9551_396e0a65cc23.slice/crio-6c4de60dce5ac11cb359ecaf8926294085227ec3b014d99130240144c5388abf\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1636321f_2475_4b45_9551_396e0a65cc23.slice\": RecentStats: unable to find data in memory cache]" Dec 01 09:43:19 crc kubenswrapper[5004]: E1201 09:43:19.177817 5004 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" 
err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1636321f_2475_4b45_9551_396e0a65cc23.slice/crio-6c4de60dce5ac11cb359ecaf8926294085227ec3b014d99130240144c5388abf\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1636321f_2475_4b45_9551_396e0a65cc23.slice\": RecentStats: unable to find data in memory cache]" Dec 01 09:43:20 crc kubenswrapper[5004]: E1201 09:43:20.879779 5004 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1636321f_2475_4b45_9551_396e0a65cc23.slice/crio-6c4de60dce5ac11cb359ecaf8926294085227ec3b014d99130240144c5388abf\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1636321f_2475_4b45_9551_396e0a65cc23.slice\": RecentStats: unable to find data in memory cache]" Dec 01 09:43:29 crc kubenswrapper[5004]: E1201 09:43:29.593039 5004 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1636321f_2475_4b45_9551_396e0a65cc23.slice/crio-6c4de60dce5ac11cb359ecaf8926294085227ec3b014d99130240144c5388abf\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1636321f_2475_4b45_9551_396e0a65cc23.slice\": RecentStats: unable to find data in memory cache]" Dec 01 09:43:36 crc kubenswrapper[5004]: E1201 09:43:36.151314 5004 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1636321f_2475_4b45_9551_396e0a65cc23.slice/crio-6c4de60dce5ac11cb359ecaf8926294085227ec3b014d99130240144c5388abf\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1636321f_2475_4b45_9551_396e0a65cc23.slice\": RecentStats: unable to find data in memory cache]" Dec 01 09:43:38 crc kubenswrapper[5004]: I1201 09:43:38.729205 5004 patch_prober.go:28] interesting pod/machine-config-daemon-fvdgt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 09:43:38 crc kubenswrapper[5004]: I1201 09:43:38.729663 5004 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 09:43:39 crc kubenswrapper[5004]: E1201 09:43:39.687307 5004 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1636321f_2475_4b45_9551_396e0a65cc23.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1636321f_2475_4b45_9551_396e0a65cc23.slice/crio-6c4de60dce5ac11cb359ecaf8926294085227ec3b014d99130240144c5388abf\": RecentStats: unable to find data in memory cache]" Dec 01 09:43:48 crc kubenswrapper[5004]: E1201 09:43:48.255718 5004 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1636321f_2475_4b45_9551_396e0a65cc23.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1636321f_2475_4b45_9551_396e0a65cc23.slice/crio-6c4de60dce5ac11cb359ecaf8926294085227ec3b014d99130240144c5388abf\": 
RecentStats: unable to find data in memory cache]" Dec 01 09:43:48 crc kubenswrapper[5004]: E1201 09:43:48.255903 5004 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1636321f_2475_4b45_9551_396e0a65cc23.slice/crio-6c4de60dce5ac11cb359ecaf8926294085227ec3b014d99130240144c5388abf\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1636321f_2475_4b45_9551_396e0a65cc23.slice\": RecentStats: unable to find data in memory cache]" Dec 01 09:43:49 crc kubenswrapper[5004]: E1201 09:43:49.744129 5004 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1636321f_2475_4b45_9551_396e0a65cc23.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1636321f_2475_4b45_9551_396e0a65cc23.slice/crio-6c4de60dce5ac11cb359ecaf8926294085227ec3b014d99130240144c5388abf\": RecentStats: unable to find data in memory cache]" Dec 01 09:43:50 crc kubenswrapper[5004]: E1201 09:43:50.880051 5004 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1636321f_2475_4b45_9551_396e0a65cc23.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1636321f_2475_4b45_9551_396e0a65cc23.slice/crio-6c4de60dce5ac11cb359ecaf8926294085227ec3b014d99130240144c5388abf\": RecentStats: unable to find data in memory cache]" Dec 01 09:44:00 crc kubenswrapper[5004]: E1201 09:44:00.101721 5004 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1636321f_2475_4b45_9551_396e0a65cc23.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1636321f_2475_4b45_9551_396e0a65cc23.slice/crio-6c4de60dce5ac11cb359ecaf8926294085227ec3b014d99130240144c5388abf\": RecentStats: unable to find data in memory cache]" Dec 01 09:44:02 crc kubenswrapper[5004]: E1201 09:44:02.799049 5004 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: could not stat "/var/lib/containers/storage/overlay/cb5b6345f343771d1eb35cae2bc4d4791f461fdc523851aa54454de07e4dc3c2/diff" to get inode usage: stat /var/lib/containers/storage/overlay/cb5b6345f343771d1eb35cae2bc4d4791f461fdc523851aa54454de07e4dc3c2/diff: no such file or directory, extraDiskErr: Dec 01 09:44:05 crc kubenswrapper[5004]: I1201 09:44:05.587844 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-g9cl4"] Dec 01 09:44:05 crc kubenswrapper[5004]: E1201 09:44:05.590188 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1636321f-2475-4b45-9551-396e0a65cc23" containerName="extract-content" Dec 01 09:44:05 crc kubenswrapper[5004]: I1201 09:44:05.590226 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="1636321f-2475-4b45-9551-396e0a65cc23" containerName="extract-content" Dec 01 09:44:05 crc kubenswrapper[5004]: E1201 09:44:05.590260 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1636321f-2475-4b45-9551-396e0a65cc23" containerName="registry-server" Dec 01 09:44:05 crc kubenswrapper[5004]: I1201 09:44:05.590268 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="1636321f-2475-4b45-9551-396e0a65cc23" containerName="registry-server" Dec 01 09:44:05 crc kubenswrapper[5004]: E1201 09:44:05.590287 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1636321f-2475-4b45-9551-396e0a65cc23" containerName="extract-utilities" 
Dec 01 09:44:05 crc kubenswrapper[5004]: I1201 09:44:05.590295 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="1636321f-2475-4b45-9551-396e0a65cc23" containerName="extract-utilities" Dec 01 09:44:05 crc kubenswrapper[5004]: I1201 09:44:05.590792 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="1636321f-2475-4b45-9551-396e0a65cc23" containerName="registry-server" Dec 01 09:44:05 crc kubenswrapper[5004]: I1201 09:44:05.596423 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-g9cl4" Dec 01 09:44:05 crc kubenswrapper[5004]: I1201 09:44:05.610798 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-g9cl4"] Dec 01 09:44:05 crc kubenswrapper[5004]: I1201 09:44:05.642461 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48f3efc7-0b2c-4baf-9362-6fc57a7c3abd-utilities\") pod \"redhat-marketplace-g9cl4\" (UID: \"48f3efc7-0b2c-4baf-9362-6fc57a7c3abd\") " pod="openshift-marketplace/redhat-marketplace-g9cl4" Dec 01 09:44:05 crc kubenswrapper[5004]: I1201 09:44:05.642704 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48f3efc7-0b2c-4baf-9362-6fc57a7c3abd-catalog-content\") pod \"redhat-marketplace-g9cl4\" (UID: \"48f3efc7-0b2c-4baf-9362-6fc57a7c3abd\") " pod="openshift-marketplace/redhat-marketplace-g9cl4" Dec 01 09:44:05 crc kubenswrapper[5004]: I1201 09:44:05.642773 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6xtn\" (UniqueName: \"kubernetes.io/projected/48f3efc7-0b2c-4baf-9362-6fc57a7c3abd-kube-api-access-l6xtn\") pod \"redhat-marketplace-g9cl4\" (UID: \"48f3efc7-0b2c-4baf-9362-6fc57a7c3abd\") " 
pod="openshift-marketplace/redhat-marketplace-g9cl4" Dec 01 09:44:05 crc kubenswrapper[5004]: I1201 09:44:05.744884 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48f3efc7-0b2c-4baf-9362-6fc57a7c3abd-utilities\") pod \"redhat-marketplace-g9cl4\" (UID: \"48f3efc7-0b2c-4baf-9362-6fc57a7c3abd\") " pod="openshift-marketplace/redhat-marketplace-g9cl4" Dec 01 09:44:05 crc kubenswrapper[5004]: I1201 09:44:05.745037 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48f3efc7-0b2c-4baf-9362-6fc57a7c3abd-catalog-content\") pod \"redhat-marketplace-g9cl4\" (UID: \"48f3efc7-0b2c-4baf-9362-6fc57a7c3abd\") " pod="openshift-marketplace/redhat-marketplace-g9cl4" Dec 01 09:44:05 crc kubenswrapper[5004]: I1201 09:44:05.745086 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6xtn\" (UniqueName: \"kubernetes.io/projected/48f3efc7-0b2c-4baf-9362-6fc57a7c3abd-kube-api-access-l6xtn\") pod \"redhat-marketplace-g9cl4\" (UID: \"48f3efc7-0b2c-4baf-9362-6fc57a7c3abd\") " pod="openshift-marketplace/redhat-marketplace-g9cl4" Dec 01 09:44:05 crc kubenswrapper[5004]: I1201 09:44:05.746744 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48f3efc7-0b2c-4baf-9362-6fc57a7c3abd-utilities\") pod \"redhat-marketplace-g9cl4\" (UID: \"48f3efc7-0b2c-4baf-9362-6fc57a7c3abd\") " pod="openshift-marketplace/redhat-marketplace-g9cl4" Dec 01 09:44:05 crc kubenswrapper[5004]: I1201 09:44:05.746767 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48f3efc7-0b2c-4baf-9362-6fc57a7c3abd-catalog-content\") pod \"redhat-marketplace-g9cl4\" (UID: \"48f3efc7-0b2c-4baf-9362-6fc57a7c3abd\") " pod="openshift-marketplace/redhat-marketplace-g9cl4" 
Dec 01 09:44:05 crc kubenswrapper[5004]: I1201 09:44:05.805394 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6xtn\" (UniqueName: \"kubernetes.io/projected/48f3efc7-0b2c-4baf-9362-6fc57a7c3abd-kube-api-access-l6xtn\") pod \"redhat-marketplace-g9cl4\" (UID: \"48f3efc7-0b2c-4baf-9362-6fc57a7c3abd\") " pod="openshift-marketplace/redhat-marketplace-g9cl4" Dec 01 09:44:05 crc kubenswrapper[5004]: I1201 09:44:05.946211 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-g9cl4" Dec 01 09:44:06 crc kubenswrapper[5004]: I1201 09:44:06.597116 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-g9cl4"] Dec 01 09:44:06 crc kubenswrapper[5004]: W1201 09:44:06.603308 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod48f3efc7_0b2c_4baf_9362_6fc57a7c3abd.slice/crio-f2b7f81734c5bae1a7b9750f7ad2bbf36942aff045b66cf0ff0ec6e13c21a7bf WatchSource:0}: Error finding container f2b7f81734c5bae1a7b9750f7ad2bbf36942aff045b66cf0ff0ec6e13c21a7bf: Status 404 returned error can't find the container with id f2b7f81734c5bae1a7b9750f7ad2bbf36942aff045b66cf0ff0ec6e13c21a7bf Dec 01 09:44:07 crc kubenswrapper[5004]: I1201 09:44:07.219518 5004 generic.go:334] "Generic (PLEG): container finished" podID="48f3efc7-0b2c-4baf-9362-6fc57a7c3abd" containerID="9ba15009f33bc97fd78359a62f4a3a86ff83aa223d6bc197b1da645aca995084" exitCode=0 Dec 01 09:44:07 crc kubenswrapper[5004]: I1201 09:44:07.219716 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g9cl4" event={"ID":"48f3efc7-0b2c-4baf-9362-6fc57a7c3abd","Type":"ContainerDied","Data":"9ba15009f33bc97fd78359a62f4a3a86ff83aa223d6bc197b1da645aca995084"} Dec 01 09:44:07 crc kubenswrapper[5004]: I1201 09:44:07.219862 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-g9cl4" event={"ID":"48f3efc7-0b2c-4baf-9362-6fc57a7c3abd","Type":"ContainerStarted","Data":"f2b7f81734c5bae1a7b9750f7ad2bbf36942aff045b66cf0ff0ec6e13c21a7bf"} Dec 01 09:44:08 crc kubenswrapper[5004]: I1201 09:44:08.729672 5004 patch_prober.go:28] interesting pod/machine-config-daemon-fvdgt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 09:44:08 crc kubenswrapper[5004]: I1201 09:44:08.730532 5004 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 09:44:09 crc kubenswrapper[5004]: I1201 09:44:09.250070 5004 generic.go:334] "Generic (PLEG): container finished" podID="48f3efc7-0b2c-4baf-9362-6fc57a7c3abd" containerID="ca00f0a899af879f15357f224c7cd11d781e7e1199842d965b4841cd3520af38" exitCode=0 Dec 01 09:44:09 crc kubenswrapper[5004]: I1201 09:44:09.250117 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g9cl4" event={"ID":"48f3efc7-0b2c-4baf-9362-6fc57a7c3abd","Type":"ContainerDied","Data":"ca00f0a899af879f15357f224c7cd11d781e7e1199842d965b4841cd3520af38"} Dec 01 09:44:11 crc kubenswrapper[5004]: I1201 09:44:11.270848 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g9cl4" event={"ID":"48f3efc7-0b2c-4baf-9362-6fc57a7c3abd","Type":"ContainerStarted","Data":"61c1d77d8d0e57a8a1820918e02f933b7ee0a20cf656b44781ac6ed9c6c13d33"} Dec 01 09:44:11 crc kubenswrapper[5004]: I1201 09:44:11.291813 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/redhat-marketplace-g9cl4" podStartSLOduration=3.400953768 podStartE2EDuration="6.29159317s" podCreationTimestamp="2025-12-01 09:44:05 +0000 UTC" firstStartedPulling="2025-12-01 09:44:07.223824593 +0000 UTC m=+5224.788816575" lastFinishedPulling="2025-12-01 09:44:10.114463995 +0000 UTC m=+5227.679455977" observedRunningTime="2025-12-01 09:44:11.288609758 +0000 UTC m=+5228.853601750" watchObservedRunningTime="2025-12-01 09:44:11.29159317 +0000 UTC m=+5228.856585163" Dec 01 09:44:15 crc kubenswrapper[5004]: I1201 09:44:15.946505 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-g9cl4" Dec 01 09:44:15 crc kubenswrapper[5004]: I1201 09:44:15.947185 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-g9cl4" Dec 01 09:44:16 crc kubenswrapper[5004]: I1201 09:44:16.012278 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-g9cl4" Dec 01 09:44:16 crc kubenswrapper[5004]: I1201 09:44:16.389065 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-g9cl4" Dec 01 09:44:16 crc kubenswrapper[5004]: I1201 09:44:16.451473 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-g9cl4"] Dec 01 09:44:18 crc kubenswrapper[5004]: I1201 09:44:18.344514 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-g9cl4" podUID="48f3efc7-0b2c-4baf-9362-6fc57a7c3abd" containerName="registry-server" containerID="cri-o://61c1d77d8d0e57a8a1820918e02f933b7ee0a20cf656b44781ac6ed9c6c13d33" gracePeriod=2 Dec 01 09:44:19 crc kubenswrapper[5004]: I1201 09:44:19.360526 5004 generic.go:334] "Generic (PLEG): container finished" podID="48f3efc7-0b2c-4baf-9362-6fc57a7c3abd" 
containerID="61c1d77d8d0e57a8a1820918e02f933b7ee0a20cf656b44781ac6ed9c6c13d33" exitCode=0 Dec 01 09:44:19 crc kubenswrapper[5004]: I1201 09:44:19.360633 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g9cl4" event={"ID":"48f3efc7-0b2c-4baf-9362-6fc57a7c3abd","Type":"ContainerDied","Data":"61c1d77d8d0e57a8a1820918e02f933b7ee0a20cf656b44781ac6ed9c6c13d33"} Dec 01 09:44:19 crc kubenswrapper[5004]: I1201 09:44:19.582144 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-g9cl4" Dec 01 09:44:19 crc kubenswrapper[5004]: I1201 09:44:19.712569 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48f3efc7-0b2c-4baf-9362-6fc57a7c3abd-catalog-content\") pod \"48f3efc7-0b2c-4baf-9362-6fc57a7c3abd\" (UID: \"48f3efc7-0b2c-4baf-9362-6fc57a7c3abd\") " Dec 01 09:44:19 crc kubenswrapper[5004]: I1201 09:44:19.712660 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l6xtn\" (UniqueName: \"kubernetes.io/projected/48f3efc7-0b2c-4baf-9362-6fc57a7c3abd-kube-api-access-l6xtn\") pod \"48f3efc7-0b2c-4baf-9362-6fc57a7c3abd\" (UID: \"48f3efc7-0b2c-4baf-9362-6fc57a7c3abd\") " Dec 01 09:44:19 crc kubenswrapper[5004]: I1201 09:44:19.712933 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48f3efc7-0b2c-4baf-9362-6fc57a7c3abd-utilities\") pod \"48f3efc7-0b2c-4baf-9362-6fc57a7c3abd\" (UID: \"48f3efc7-0b2c-4baf-9362-6fc57a7c3abd\") " Dec 01 09:44:19 crc kubenswrapper[5004]: I1201 09:44:19.713663 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48f3efc7-0b2c-4baf-9362-6fc57a7c3abd-utilities" (OuterVolumeSpecName: "utilities") pod "48f3efc7-0b2c-4baf-9362-6fc57a7c3abd" (UID: 
"48f3efc7-0b2c-4baf-9362-6fc57a7c3abd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:44:19 crc kubenswrapper[5004]: I1201 09:44:19.722349 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48f3efc7-0b2c-4baf-9362-6fc57a7c3abd-kube-api-access-l6xtn" (OuterVolumeSpecName: "kube-api-access-l6xtn") pod "48f3efc7-0b2c-4baf-9362-6fc57a7c3abd" (UID: "48f3efc7-0b2c-4baf-9362-6fc57a7c3abd"). InnerVolumeSpecName "kube-api-access-l6xtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:44:19 crc kubenswrapper[5004]: I1201 09:44:19.730315 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48f3efc7-0b2c-4baf-9362-6fc57a7c3abd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "48f3efc7-0b2c-4baf-9362-6fc57a7c3abd" (UID: "48f3efc7-0b2c-4baf-9362-6fc57a7c3abd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:44:19 crc kubenswrapper[5004]: I1201 09:44:19.817910 5004 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48f3efc7-0b2c-4baf-9362-6fc57a7c3abd-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 09:44:19 crc kubenswrapper[5004]: I1201 09:44:19.818140 5004 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48f3efc7-0b2c-4baf-9362-6fc57a7c3abd-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 09:44:19 crc kubenswrapper[5004]: I1201 09:44:19.818154 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l6xtn\" (UniqueName: \"kubernetes.io/projected/48f3efc7-0b2c-4baf-9362-6fc57a7c3abd-kube-api-access-l6xtn\") on node \"crc\" DevicePath \"\"" Dec 01 09:44:20 crc kubenswrapper[5004]: I1201 09:44:20.373892 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-g9cl4" event={"ID":"48f3efc7-0b2c-4baf-9362-6fc57a7c3abd","Type":"ContainerDied","Data":"f2b7f81734c5bae1a7b9750f7ad2bbf36942aff045b66cf0ff0ec6e13c21a7bf"} Dec 01 09:44:20 crc kubenswrapper[5004]: I1201 09:44:20.373965 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-g9cl4" Dec 01 09:44:20 crc kubenswrapper[5004]: I1201 09:44:20.374220 5004 scope.go:117] "RemoveContainer" containerID="61c1d77d8d0e57a8a1820918e02f933b7ee0a20cf656b44781ac6ed9c6c13d33" Dec 01 09:44:20 crc kubenswrapper[5004]: I1201 09:44:20.430448 5004 scope.go:117] "RemoveContainer" containerID="ca00f0a899af879f15357f224c7cd11d781e7e1199842d965b4841cd3520af38" Dec 01 09:44:20 crc kubenswrapper[5004]: I1201 09:44:20.431125 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-g9cl4"] Dec 01 09:44:20 crc kubenswrapper[5004]: I1201 09:44:20.444897 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-g9cl4"] Dec 01 09:44:20 crc kubenswrapper[5004]: E1201 09:44:20.451812 5004 kuberuntime_gc.go:389] "Failed to remove container log dead symlink" err="remove /var/log/containers/redhat-marketplace-g9cl4_openshift-marketplace_extract-content-ca00f0a899af879f15357f224c7cd11d781e7e1199842d965b4841cd3520af38.log: no such file or directory" path="/var/log/containers/redhat-marketplace-g9cl4_openshift-marketplace_extract-content-ca00f0a899af879f15357f224c7cd11d781e7e1199842d965b4841cd3520af38.log" Dec 01 09:44:20 crc kubenswrapper[5004]: I1201 09:44:20.464889 5004 scope.go:117] "RemoveContainer" containerID="9ba15009f33bc97fd78359a62f4a3a86ff83aa223d6bc197b1da645aca995084" Dec 01 09:44:20 crc kubenswrapper[5004]: I1201 09:44:20.772834 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48f3efc7-0b2c-4baf-9362-6fc57a7c3abd" 
path="/var/lib/kubelet/pods/48f3efc7-0b2c-4baf-9362-6fc57a7c3abd/volumes" Dec 01 09:44:38 crc kubenswrapper[5004]: I1201 09:44:38.729693 5004 patch_prober.go:28] interesting pod/machine-config-daemon-fvdgt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 09:44:38 crc kubenswrapper[5004]: I1201 09:44:38.730397 5004 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 09:44:38 crc kubenswrapper[5004]: I1201 09:44:38.730470 5004 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" Dec 01 09:44:38 crc kubenswrapper[5004]: I1201 09:44:38.731735 5004 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"844931be84dd9ed1d62e412f6ab6e17f8e3efdf77d93563aaf4b60bee8477332"} pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 09:44:38 crc kubenswrapper[5004]: I1201 09:44:38.731823 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" containerName="machine-config-daemon" containerID="cri-o://844931be84dd9ed1d62e412f6ab6e17f8e3efdf77d93563aaf4b60bee8477332" gracePeriod=600 Dec 01 09:44:39 crc kubenswrapper[5004]: E1201 09:44:39.361449 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 09:44:39 crc kubenswrapper[5004]: I1201 09:44:39.594699 5004 generic.go:334] "Generic (PLEG): container finished" podID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" containerID="844931be84dd9ed1d62e412f6ab6e17f8e3efdf77d93563aaf4b60bee8477332" exitCode=0 Dec 01 09:44:39 crc kubenswrapper[5004]: I1201 09:44:39.594779 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" event={"ID":"f9977ebb-82de-4e96-8763-0b5a84f8d4ce","Type":"ContainerDied","Data":"844931be84dd9ed1d62e412f6ab6e17f8e3efdf77d93563aaf4b60bee8477332"} Dec 01 09:44:39 crc kubenswrapper[5004]: I1201 09:44:39.594978 5004 scope.go:117] "RemoveContainer" containerID="689fffec50469597d0fa1c55a6831d3cb9e176fbbe5744fd1cbafad2fe703d10" Dec 01 09:44:39 crc kubenswrapper[5004]: I1201 09:44:39.595788 5004 scope.go:117] "RemoveContainer" containerID="844931be84dd9ed1d62e412f6ab6e17f8e3efdf77d93563aaf4b60bee8477332" Dec 01 09:44:39 crc kubenswrapper[5004]: E1201 09:44:39.596197 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 09:44:41 crc kubenswrapper[5004]: I1201 09:44:41.714432 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-47kxb"] Dec 01 09:44:41 crc 
kubenswrapper[5004]: E1201 09:44:41.715600 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48f3efc7-0b2c-4baf-9362-6fc57a7c3abd" containerName="extract-content" Dec 01 09:44:41 crc kubenswrapper[5004]: I1201 09:44:41.715617 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="48f3efc7-0b2c-4baf-9362-6fc57a7c3abd" containerName="extract-content" Dec 01 09:44:41 crc kubenswrapper[5004]: E1201 09:44:41.715639 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48f3efc7-0b2c-4baf-9362-6fc57a7c3abd" containerName="extract-utilities" Dec 01 09:44:41 crc kubenswrapper[5004]: I1201 09:44:41.715647 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="48f3efc7-0b2c-4baf-9362-6fc57a7c3abd" containerName="extract-utilities" Dec 01 09:44:41 crc kubenswrapper[5004]: E1201 09:44:41.715690 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48f3efc7-0b2c-4baf-9362-6fc57a7c3abd" containerName="registry-server" Dec 01 09:44:41 crc kubenswrapper[5004]: I1201 09:44:41.715698 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="48f3efc7-0b2c-4baf-9362-6fc57a7c3abd" containerName="registry-server" Dec 01 09:44:41 crc kubenswrapper[5004]: I1201 09:44:41.716006 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="48f3efc7-0b2c-4baf-9362-6fc57a7c3abd" containerName="registry-server" Dec 01 09:44:41 crc kubenswrapper[5004]: I1201 09:44:41.718527 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-47kxb" Dec 01 09:44:41 crc kubenswrapper[5004]: I1201 09:44:41.732220 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-47kxb"] Dec 01 09:44:41 crc kubenswrapper[5004]: I1201 09:44:41.743285 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2g7hv\" (UniqueName: \"kubernetes.io/projected/c9534f7b-b546-4e25-89ad-9b4b40221b7a-kube-api-access-2g7hv\") pod \"certified-operators-47kxb\" (UID: \"c9534f7b-b546-4e25-89ad-9b4b40221b7a\") " pod="openshift-marketplace/certified-operators-47kxb" Dec 01 09:44:41 crc kubenswrapper[5004]: I1201 09:44:41.743444 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9534f7b-b546-4e25-89ad-9b4b40221b7a-catalog-content\") pod \"certified-operators-47kxb\" (UID: \"c9534f7b-b546-4e25-89ad-9b4b40221b7a\") " pod="openshift-marketplace/certified-operators-47kxb" Dec 01 09:44:41 crc kubenswrapper[5004]: I1201 09:44:41.743583 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9534f7b-b546-4e25-89ad-9b4b40221b7a-utilities\") pod \"certified-operators-47kxb\" (UID: \"c9534f7b-b546-4e25-89ad-9b4b40221b7a\") " pod="openshift-marketplace/certified-operators-47kxb" Dec 01 09:44:41 crc kubenswrapper[5004]: I1201 09:44:41.845638 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9534f7b-b546-4e25-89ad-9b4b40221b7a-catalog-content\") pod \"certified-operators-47kxb\" (UID: \"c9534f7b-b546-4e25-89ad-9b4b40221b7a\") " pod="openshift-marketplace/certified-operators-47kxb" Dec 01 09:44:41 crc kubenswrapper[5004]: I1201 09:44:41.845792 5004 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9534f7b-b546-4e25-89ad-9b4b40221b7a-utilities\") pod \"certified-operators-47kxb\" (UID: \"c9534f7b-b546-4e25-89ad-9b4b40221b7a\") " pod="openshift-marketplace/certified-operators-47kxb" Dec 01 09:44:41 crc kubenswrapper[5004]: I1201 09:44:41.846067 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2g7hv\" (UniqueName: \"kubernetes.io/projected/c9534f7b-b546-4e25-89ad-9b4b40221b7a-kube-api-access-2g7hv\") pod \"certified-operators-47kxb\" (UID: \"c9534f7b-b546-4e25-89ad-9b4b40221b7a\") " pod="openshift-marketplace/certified-operators-47kxb" Dec 01 09:44:41 crc kubenswrapper[5004]: I1201 09:44:41.846747 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9534f7b-b546-4e25-89ad-9b4b40221b7a-catalog-content\") pod \"certified-operators-47kxb\" (UID: \"c9534f7b-b546-4e25-89ad-9b4b40221b7a\") " pod="openshift-marketplace/certified-operators-47kxb" Dec 01 09:44:41 crc kubenswrapper[5004]: I1201 09:44:41.847254 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9534f7b-b546-4e25-89ad-9b4b40221b7a-utilities\") pod \"certified-operators-47kxb\" (UID: \"c9534f7b-b546-4e25-89ad-9b4b40221b7a\") " pod="openshift-marketplace/certified-operators-47kxb" Dec 01 09:44:41 crc kubenswrapper[5004]: I1201 09:44:41.867283 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2g7hv\" (UniqueName: \"kubernetes.io/projected/c9534f7b-b546-4e25-89ad-9b4b40221b7a-kube-api-access-2g7hv\") pod \"certified-operators-47kxb\" (UID: \"c9534f7b-b546-4e25-89ad-9b4b40221b7a\") " pod="openshift-marketplace/certified-operators-47kxb" Dec 01 09:44:42 crc kubenswrapper[5004]: I1201 09:44:42.053170 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-47kxb" Dec 01 09:44:42 crc kubenswrapper[5004]: I1201 09:44:42.878959 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-47kxb"] Dec 01 09:44:43 crc kubenswrapper[5004]: I1201 09:44:43.643695 5004 generic.go:334] "Generic (PLEG): container finished" podID="c9534f7b-b546-4e25-89ad-9b4b40221b7a" containerID="6c09ff6ef3138f504d54e9e6f740c48dcf6838075b8f911c75a3cf998ca3eb01" exitCode=0 Dec 01 09:44:43 crc kubenswrapper[5004]: I1201 09:44:43.643785 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-47kxb" event={"ID":"c9534f7b-b546-4e25-89ad-9b4b40221b7a","Type":"ContainerDied","Data":"6c09ff6ef3138f504d54e9e6f740c48dcf6838075b8f911c75a3cf998ca3eb01"} Dec 01 09:44:43 crc kubenswrapper[5004]: I1201 09:44:43.643993 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-47kxb" event={"ID":"c9534f7b-b546-4e25-89ad-9b4b40221b7a","Type":"ContainerStarted","Data":"e8618dfd62a21847f1c6a4d5d8f78ead37d3558452c848d06b07313a2be91308"} Dec 01 09:44:44 crc kubenswrapper[5004]: I1201 09:44:44.656365 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-47kxb" event={"ID":"c9534f7b-b546-4e25-89ad-9b4b40221b7a","Type":"ContainerStarted","Data":"0f10804463fd54e18387046cbfbbe7922c9eb801491686242192295a5dc0cda0"} Dec 01 09:44:46 crc kubenswrapper[5004]: I1201 09:44:46.677661 5004 generic.go:334] "Generic (PLEG): container finished" podID="c9534f7b-b546-4e25-89ad-9b4b40221b7a" containerID="0f10804463fd54e18387046cbfbbe7922c9eb801491686242192295a5dc0cda0" exitCode=0 Dec 01 09:44:46 crc kubenswrapper[5004]: I1201 09:44:46.677736 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-47kxb" 
event={"ID":"c9534f7b-b546-4e25-89ad-9b4b40221b7a","Type":"ContainerDied","Data":"0f10804463fd54e18387046cbfbbe7922c9eb801491686242192295a5dc0cda0"} Dec 01 09:44:48 crc kubenswrapper[5004]: I1201 09:44:48.700143 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-47kxb" event={"ID":"c9534f7b-b546-4e25-89ad-9b4b40221b7a","Type":"ContainerStarted","Data":"372bd1f6b2f33520aeafa595db5209a696be621f08eae8cab882297c4648fd35"} Dec 01 09:44:48 crc kubenswrapper[5004]: I1201 09:44:48.736335 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-47kxb" podStartSLOduration=3.84441064 podStartE2EDuration="7.736317812s" podCreationTimestamp="2025-12-01 09:44:41 +0000 UTC" firstStartedPulling="2025-12-01 09:44:43.64601636 +0000 UTC m=+5261.211008342" lastFinishedPulling="2025-12-01 09:44:47.537923532 +0000 UTC m=+5265.102915514" observedRunningTime="2025-12-01 09:44:48.72175928 +0000 UTC m=+5266.286751262" watchObservedRunningTime="2025-12-01 09:44:48.736317812 +0000 UTC m=+5266.301309794" Dec 01 09:44:52 crc kubenswrapper[5004]: I1201 09:44:52.053962 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-47kxb" Dec 01 09:44:52 crc kubenswrapper[5004]: I1201 09:44:52.054599 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-47kxb" Dec 01 09:44:53 crc kubenswrapper[5004]: I1201 09:44:53.115207 5004 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-47kxb" podUID="c9534f7b-b546-4e25-89ad-9b4b40221b7a" containerName="registry-server" probeResult="failure" output=< Dec 01 09:44:53 crc kubenswrapper[5004]: timeout: failed to connect service ":50051" within 1s Dec 01 09:44:53 crc kubenswrapper[5004]: > Dec 01 09:44:53 crc kubenswrapper[5004]: I1201 09:44:53.759000 5004 scope.go:117] 
"RemoveContainer" containerID="844931be84dd9ed1d62e412f6ab6e17f8e3efdf77d93563aaf4b60bee8477332" Dec 01 09:44:53 crc kubenswrapper[5004]: E1201 09:44:53.759279 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 09:45:00 crc kubenswrapper[5004]: I1201 09:45:00.208519 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409705-szzf4"] Dec 01 09:45:00 crc kubenswrapper[5004]: I1201 09:45:00.211920 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409705-szzf4" Dec 01 09:45:00 crc kubenswrapper[5004]: I1201 09:45:00.220068 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 01 09:45:00 crc kubenswrapper[5004]: I1201 09:45:00.220093 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 01 09:45:00 crc kubenswrapper[5004]: I1201 09:45:00.223419 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409705-szzf4"] Dec 01 09:45:00 crc kubenswrapper[5004]: I1201 09:45:00.281199 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dc07d34c-6231-4d1d-bd64-79d325db1298-config-volume\") pod \"collect-profiles-29409705-szzf4\" (UID: \"dc07d34c-6231-4d1d-bd64-79d325db1298\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29409705-szzf4"
Dec 01 09:45:00 crc kubenswrapper[5004]: I1201 09:45:00.281258 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dc07d34c-6231-4d1d-bd64-79d325db1298-secret-volume\") pod \"collect-profiles-29409705-szzf4\" (UID: \"dc07d34c-6231-4d1d-bd64-79d325db1298\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409705-szzf4"
Dec 01 09:45:00 crc kubenswrapper[5004]: I1201 09:45:00.281348 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5jdf\" (UniqueName: \"kubernetes.io/projected/dc07d34c-6231-4d1d-bd64-79d325db1298-kube-api-access-w5jdf\") pod \"collect-profiles-29409705-szzf4\" (UID: \"dc07d34c-6231-4d1d-bd64-79d325db1298\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409705-szzf4"
Dec 01 09:45:00 crc kubenswrapper[5004]: I1201 09:45:00.383286 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dc07d34c-6231-4d1d-bd64-79d325db1298-config-volume\") pod \"collect-profiles-29409705-szzf4\" (UID: \"dc07d34c-6231-4d1d-bd64-79d325db1298\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409705-szzf4"
Dec 01 09:45:00 crc kubenswrapper[5004]: I1201 09:45:00.383349 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dc07d34c-6231-4d1d-bd64-79d325db1298-secret-volume\") pod \"collect-profiles-29409705-szzf4\" (UID: \"dc07d34c-6231-4d1d-bd64-79d325db1298\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409705-szzf4"
Dec 01 09:45:00 crc kubenswrapper[5004]: I1201 09:45:00.383423 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5jdf\" (UniqueName: \"kubernetes.io/projected/dc07d34c-6231-4d1d-bd64-79d325db1298-kube-api-access-w5jdf\") pod \"collect-profiles-29409705-szzf4\" (UID: \"dc07d34c-6231-4d1d-bd64-79d325db1298\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409705-szzf4"
Dec 01 09:45:00 crc kubenswrapper[5004]: I1201 09:45:00.384469 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dc07d34c-6231-4d1d-bd64-79d325db1298-config-volume\") pod \"collect-profiles-29409705-szzf4\" (UID: \"dc07d34c-6231-4d1d-bd64-79d325db1298\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409705-szzf4"
Dec 01 09:45:01 crc kubenswrapper[5004]: I1201 09:45:01.005163 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dc07d34c-6231-4d1d-bd64-79d325db1298-secret-volume\") pod \"collect-profiles-29409705-szzf4\" (UID: \"dc07d34c-6231-4d1d-bd64-79d325db1298\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409705-szzf4"
Dec 01 09:45:01 crc kubenswrapper[5004]: I1201 09:45:01.005176 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5jdf\" (UniqueName: \"kubernetes.io/projected/dc07d34c-6231-4d1d-bd64-79d325db1298-kube-api-access-w5jdf\") pod \"collect-profiles-29409705-szzf4\" (UID: \"dc07d34c-6231-4d1d-bd64-79d325db1298\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409705-szzf4"
Dec 01 09:45:01 crc kubenswrapper[5004]: I1201 09:45:01.146187 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409705-szzf4"
Dec 01 09:45:01 crc kubenswrapper[5004]: I1201 09:45:01.736936 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409705-szzf4"]
Dec 01 09:45:01 crc kubenswrapper[5004]: I1201 09:45:01.853794 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409705-szzf4" event={"ID":"dc07d34c-6231-4d1d-bd64-79d325db1298","Type":"ContainerStarted","Data":"6cc6badac7dfd790d0b70ba5dbc40e65af1f0e313a392a155928c9a1b58f2984"}
Dec 01 09:45:02 crc kubenswrapper[5004]: I1201 09:45:02.109625 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-47kxb"
Dec 01 09:45:02 crc kubenswrapper[5004]: I1201 09:45:02.170621 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-47kxb"
Dec 01 09:45:02 crc kubenswrapper[5004]: I1201 09:45:02.354196 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-47kxb"]
Dec 01 09:45:02 crc kubenswrapper[5004]: I1201 09:45:02.865786 5004 generic.go:334] "Generic (PLEG): container finished" podID="dc07d34c-6231-4d1d-bd64-79d325db1298" containerID="f3f4ca555f2f07e1ba2a915ab30f82196e8131385b5bf677baa3d53df166ae0a" exitCode=0
Dec 01 09:45:02 crc kubenswrapper[5004]: I1201 09:45:02.865864 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409705-szzf4" event={"ID":"dc07d34c-6231-4d1d-bd64-79d325db1298","Type":"ContainerDied","Data":"f3f4ca555f2f07e1ba2a915ab30f82196e8131385b5bf677baa3d53df166ae0a"}
Dec 01 09:45:03 crc kubenswrapper[5004]: I1201 09:45:03.879311 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-47kxb" podUID="c9534f7b-b546-4e25-89ad-9b4b40221b7a" containerName="registry-server" containerID="cri-o://372bd1f6b2f33520aeafa595db5209a696be621f08eae8cab882297c4648fd35" gracePeriod=2
Dec 01 09:45:04 crc kubenswrapper[5004]: I1201 09:45:04.626148 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409705-szzf4"
Dec 01 09:45:04 crc kubenswrapper[5004]: I1201 09:45:04.786441 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-47kxb"
Dec 01 09:45:04 crc kubenswrapper[5004]: I1201 09:45:04.794881 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dc07d34c-6231-4d1d-bd64-79d325db1298-config-volume\") pod \"dc07d34c-6231-4d1d-bd64-79d325db1298\" (UID: \"dc07d34c-6231-4d1d-bd64-79d325db1298\") "
Dec 01 09:45:04 crc kubenswrapper[5004]: I1201 09:45:04.794979 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w5jdf\" (UniqueName: \"kubernetes.io/projected/dc07d34c-6231-4d1d-bd64-79d325db1298-kube-api-access-w5jdf\") pod \"dc07d34c-6231-4d1d-bd64-79d325db1298\" (UID: \"dc07d34c-6231-4d1d-bd64-79d325db1298\") "
Dec 01 09:45:04 crc kubenswrapper[5004]: I1201 09:45:04.795604 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dc07d34c-6231-4d1d-bd64-79d325db1298-secret-volume\") pod \"dc07d34c-6231-4d1d-bd64-79d325db1298\" (UID: \"dc07d34c-6231-4d1d-bd64-79d325db1298\") "
Dec 01 09:45:04 crc kubenswrapper[5004]: I1201 09:45:04.796214 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc07d34c-6231-4d1d-bd64-79d325db1298-config-volume" (OuterVolumeSpecName: "config-volume") pod "dc07d34c-6231-4d1d-bd64-79d325db1298" (UID: "dc07d34c-6231-4d1d-bd64-79d325db1298"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 09:45:04 crc kubenswrapper[5004]: I1201 09:45:04.796690 5004 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dc07d34c-6231-4d1d-bd64-79d325db1298-config-volume\") on node \"crc\" DevicePath \"\""
Dec 01 09:45:04 crc kubenswrapper[5004]: I1201 09:45:04.802585 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc07d34c-6231-4d1d-bd64-79d325db1298-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "dc07d34c-6231-4d1d-bd64-79d325db1298" (UID: "dc07d34c-6231-4d1d-bd64-79d325db1298"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 09:45:04 crc kubenswrapper[5004]: I1201 09:45:04.803145 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc07d34c-6231-4d1d-bd64-79d325db1298-kube-api-access-w5jdf" (OuterVolumeSpecName: "kube-api-access-w5jdf") pod "dc07d34c-6231-4d1d-bd64-79d325db1298" (UID: "dc07d34c-6231-4d1d-bd64-79d325db1298"). InnerVolumeSpecName "kube-api-access-w5jdf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 09:45:04 crc kubenswrapper[5004]: I1201 09:45:04.891200 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409705-szzf4" event={"ID":"dc07d34c-6231-4d1d-bd64-79d325db1298","Type":"ContainerDied","Data":"6cc6badac7dfd790d0b70ba5dbc40e65af1f0e313a392a155928c9a1b58f2984"}
Dec 01 09:45:04 crc kubenswrapper[5004]: I1201 09:45:04.891310 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409705-szzf4"
Dec 01 09:45:04 crc kubenswrapper[5004]: I1201 09:45:04.891614 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6cc6badac7dfd790d0b70ba5dbc40e65af1f0e313a392a155928c9a1b58f2984"
Dec 01 09:45:04 crc kubenswrapper[5004]: I1201 09:45:04.893389 5004 generic.go:334] "Generic (PLEG): container finished" podID="c9534f7b-b546-4e25-89ad-9b4b40221b7a" containerID="372bd1f6b2f33520aeafa595db5209a696be621f08eae8cab882297c4648fd35" exitCode=0
Dec 01 09:45:04 crc kubenswrapper[5004]: I1201 09:45:04.893434 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-47kxb" event={"ID":"c9534f7b-b546-4e25-89ad-9b4b40221b7a","Type":"ContainerDied","Data":"372bd1f6b2f33520aeafa595db5209a696be621f08eae8cab882297c4648fd35"}
Dec 01 09:45:04 crc kubenswrapper[5004]: I1201 09:45:04.893462 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-47kxb" event={"ID":"c9534f7b-b546-4e25-89ad-9b4b40221b7a","Type":"ContainerDied","Data":"e8618dfd62a21847f1c6a4d5d8f78ead37d3558452c848d06b07313a2be91308"}
Dec 01 09:45:04 crc kubenswrapper[5004]: I1201 09:45:04.893479 5004 scope.go:117] "RemoveContainer" containerID="372bd1f6b2f33520aeafa595db5209a696be621f08eae8cab882297c4648fd35"
Dec 01 09:45:04 crc kubenswrapper[5004]: I1201 09:45:04.893644 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-47kxb"
Dec 01 09:45:04 crc kubenswrapper[5004]: I1201 09:45:04.900954 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9534f7b-b546-4e25-89ad-9b4b40221b7a-utilities\") pod \"c9534f7b-b546-4e25-89ad-9b4b40221b7a\" (UID: \"c9534f7b-b546-4e25-89ad-9b4b40221b7a\") "
Dec 01 09:45:04 crc kubenswrapper[5004]: I1201 09:45:04.901013 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9534f7b-b546-4e25-89ad-9b4b40221b7a-catalog-content\") pod \"c9534f7b-b546-4e25-89ad-9b4b40221b7a\" (UID: \"c9534f7b-b546-4e25-89ad-9b4b40221b7a\") "
Dec 01 09:45:04 crc kubenswrapper[5004]: I1201 09:45:04.901052 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2g7hv\" (UniqueName: \"kubernetes.io/projected/c9534f7b-b546-4e25-89ad-9b4b40221b7a-kube-api-access-2g7hv\") pod \"c9534f7b-b546-4e25-89ad-9b4b40221b7a\" (UID: \"c9534f7b-b546-4e25-89ad-9b4b40221b7a\") "
Dec 01 09:45:04 crc kubenswrapper[5004]: I1201 09:45:04.902358 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9534f7b-b546-4e25-89ad-9b4b40221b7a-utilities" (OuterVolumeSpecName: "utilities") pod "c9534f7b-b546-4e25-89ad-9b4b40221b7a" (UID: "c9534f7b-b546-4e25-89ad-9b4b40221b7a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 09:45:04 crc kubenswrapper[5004]: I1201 09:45:04.904098 5004 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dc07d34c-6231-4d1d-bd64-79d325db1298-secret-volume\") on node \"crc\" DevicePath \"\""
Dec 01 09:45:04 crc kubenswrapper[5004]: I1201 09:45:04.904125 5004 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9534f7b-b546-4e25-89ad-9b4b40221b7a-utilities\") on node \"crc\" DevicePath \"\""
Dec 01 09:45:04 crc kubenswrapper[5004]: I1201 09:45:04.904136 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w5jdf\" (UniqueName: \"kubernetes.io/projected/dc07d34c-6231-4d1d-bd64-79d325db1298-kube-api-access-w5jdf\") on node \"crc\" DevicePath \"\""
Dec 01 09:45:04 crc kubenswrapper[5004]: I1201 09:45:04.906201 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9534f7b-b546-4e25-89ad-9b4b40221b7a-kube-api-access-2g7hv" (OuterVolumeSpecName: "kube-api-access-2g7hv") pod "c9534f7b-b546-4e25-89ad-9b4b40221b7a" (UID: "c9534f7b-b546-4e25-89ad-9b4b40221b7a"). InnerVolumeSpecName "kube-api-access-2g7hv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 09:45:04 crc kubenswrapper[5004]: I1201 09:45:04.926515 5004 scope.go:117] "RemoveContainer" containerID="0f10804463fd54e18387046cbfbbe7922c9eb801491686242192295a5dc0cda0"
Dec 01 09:45:04 crc kubenswrapper[5004]: I1201 09:45:04.949588 5004 scope.go:117] "RemoveContainer" containerID="6c09ff6ef3138f504d54e9e6f740c48dcf6838075b8f911c75a3cf998ca3eb01"
Dec 01 09:45:04 crc kubenswrapper[5004]: I1201 09:45:04.966837 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9534f7b-b546-4e25-89ad-9b4b40221b7a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c9534f7b-b546-4e25-89ad-9b4b40221b7a" (UID: "c9534f7b-b546-4e25-89ad-9b4b40221b7a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 09:45:04 crc kubenswrapper[5004]: I1201 09:45:04.976018 5004 scope.go:117] "RemoveContainer" containerID="372bd1f6b2f33520aeafa595db5209a696be621f08eae8cab882297c4648fd35"
Dec 01 09:45:04 crc kubenswrapper[5004]: E1201 09:45:04.980533 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"372bd1f6b2f33520aeafa595db5209a696be621f08eae8cab882297c4648fd35\": container with ID starting with 372bd1f6b2f33520aeafa595db5209a696be621f08eae8cab882297c4648fd35 not found: ID does not exist" containerID="372bd1f6b2f33520aeafa595db5209a696be621f08eae8cab882297c4648fd35"
Dec 01 09:45:04 crc kubenswrapper[5004]: I1201 09:45:04.980620 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"372bd1f6b2f33520aeafa595db5209a696be621f08eae8cab882297c4648fd35"} err="failed to get container status \"372bd1f6b2f33520aeafa595db5209a696be621f08eae8cab882297c4648fd35\": rpc error: code = NotFound desc = could not find container \"372bd1f6b2f33520aeafa595db5209a696be621f08eae8cab882297c4648fd35\": container with ID starting with 372bd1f6b2f33520aeafa595db5209a696be621f08eae8cab882297c4648fd35 not found: ID does not exist"
Dec 01 09:45:04 crc kubenswrapper[5004]: I1201 09:45:04.980678 5004 scope.go:117] "RemoveContainer" containerID="0f10804463fd54e18387046cbfbbe7922c9eb801491686242192295a5dc0cda0"
Dec 01 09:45:04 crc kubenswrapper[5004]: E1201 09:45:04.981484 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f10804463fd54e18387046cbfbbe7922c9eb801491686242192295a5dc0cda0\": container with ID starting with 0f10804463fd54e18387046cbfbbe7922c9eb801491686242192295a5dc0cda0 not found: ID does not exist" containerID="0f10804463fd54e18387046cbfbbe7922c9eb801491686242192295a5dc0cda0"
Dec 01 09:45:04 crc kubenswrapper[5004]: I1201 09:45:04.981511 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f10804463fd54e18387046cbfbbe7922c9eb801491686242192295a5dc0cda0"} err="failed to get container status \"0f10804463fd54e18387046cbfbbe7922c9eb801491686242192295a5dc0cda0\": rpc error: code = NotFound desc = could not find container \"0f10804463fd54e18387046cbfbbe7922c9eb801491686242192295a5dc0cda0\": container with ID starting with 0f10804463fd54e18387046cbfbbe7922c9eb801491686242192295a5dc0cda0 not found: ID does not exist"
Dec 01 09:45:04 crc kubenswrapper[5004]: I1201 09:45:04.981573 5004 scope.go:117] "RemoveContainer" containerID="6c09ff6ef3138f504d54e9e6f740c48dcf6838075b8f911c75a3cf998ca3eb01"
Dec 01 09:45:04 crc kubenswrapper[5004]: E1201 09:45:04.982239 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c09ff6ef3138f504d54e9e6f740c48dcf6838075b8f911c75a3cf998ca3eb01\": container with ID starting with 6c09ff6ef3138f504d54e9e6f740c48dcf6838075b8f911c75a3cf998ca3eb01 not found: ID does not exist" containerID="6c09ff6ef3138f504d54e9e6f740c48dcf6838075b8f911c75a3cf998ca3eb01"
Dec 01 09:45:04 crc kubenswrapper[5004]: I1201 09:45:04.982340 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c09ff6ef3138f504d54e9e6f740c48dcf6838075b8f911c75a3cf998ca3eb01"} err="failed to get container status \"6c09ff6ef3138f504d54e9e6f740c48dcf6838075b8f911c75a3cf998ca3eb01\": rpc error: code = NotFound desc = could not find container \"6c09ff6ef3138f504d54e9e6f740c48dcf6838075b8f911c75a3cf998ca3eb01\": container with ID starting with 6c09ff6ef3138f504d54e9e6f740c48dcf6838075b8f911c75a3cf998ca3eb01 not found: ID does not exist"
Dec 01 09:45:05 crc kubenswrapper[5004]: I1201 09:45:05.007076 5004 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9534f7b-b546-4e25-89ad-9b4b40221b7a-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 01 09:45:05 crc kubenswrapper[5004]: I1201 09:45:05.007113 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2g7hv\" (UniqueName: \"kubernetes.io/projected/c9534f7b-b546-4e25-89ad-9b4b40221b7a-kube-api-access-2g7hv\") on node \"crc\" DevicePath \"\""
Dec 01 09:45:05 crc kubenswrapper[5004]: I1201 09:45:05.231839 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-47kxb"]
Dec 01 09:45:05 crc kubenswrapper[5004]: I1201 09:45:05.243865 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-47kxb"]
Dec 01 09:45:05 crc kubenswrapper[5004]: I1201 09:45:05.719424 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409660-ggjd9"]
Dec 01 09:45:05 crc kubenswrapper[5004]: I1201 09:45:05.731389 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409660-ggjd9"]
Dec 01 09:45:06 crc kubenswrapper[5004]: I1201 09:45:06.779140 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a83901fd-e069-4f8c-81d7-de04d71937f5" path="/var/lib/kubelet/pods/a83901fd-e069-4f8c-81d7-de04d71937f5/volumes"
Dec 01 09:45:06 crc kubenswrapper[5004]: I1201 09:45:06.782514 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9534f7b-b546-4e25-89ad-9b4b40221b7a" path="/var/lib/kubelet/pods/c9534f7b-b546-4e25-89ad-9b4b40221b7a/volumes"
Dec 01 09:45:07 crc kubenswrapper[5004]: I1201 09:45:07.759142 5004 scope.go:117] "RemoveContainer" containerID="844931be84dd9ed1d62e412f6ab6e17f8e3efdf77d93563aaf4b60bee8477332"
Dec 01 09:45:07 crc kubenswrapper[5004]: E1201 09:45:07.759972 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce"
Dec 01 09:45:20 crc kubenswrapper[5004]: I1201 09:45:20.462121 5004 scope.go:117] "RemoveContainer" containerID="362700c191262c4990d04a446016dfe8188328086e503082fd9a3d596f761576"
Dec 01 09:45:21 crc kubenswrapper[5004]: I1201 09:45:21.758829 5004 scope.go:117] "RemoveContainer" containerID="844931be84dd9ed1d62e412f6ab6e17f8e3efdf77d93563aaf4b60bee8477332"
Dec 01 09:45:21 crc kubenswrapper[5004]: E1201 09:45:21.759617 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce"
Dec 01 09:45:33 crc kubenswrapper[5004]: I1201 09:45:33.759890 5004 scope.go:117] "RemoveContainer" containerID="844931be84dd9ed1d62e412f6ab6e17f8e3efdf77d93563aaf4b60bee8477332"
Dec 01 09:45:33 crc kubenswrapper[5004]: E1201 09:45:33.760841 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce"
Dec 01 09:45:46 crc kubenswrapper[5004]: I1201 09:45:46.758698 5004 scope.go:117] "RemoveContainer" containerID="844931be84dd9ed1d62e412f6ab6e17f8e3efdf77d93563aaf4b60bee8477332"
Dec 01 09:45:46 crc kubenswrapper[5004]: E1201 09:45:46.759615 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce"
Dec 01 09:46:00 crc kubenswrapper[5004]: I1201 09:46:00.696494 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-fntzv"]
Dec 01 09:46:00 crc kubenswrapper[5004]: E1201 09:46:00.697438 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9534f7b-b546-4e25-89ad-9b4b40221b7a" containerName="registry-server"
Dec 01 09:46:00 crc kubenswrapper[5004]: I1201 09:46:00.697450 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9534f7b-b546-4e25-89ad-9b4b40221b7a" containerName="registry-server"
Dec 01 09:46:00 crc kubenswrapper[5004]: E1201 09:46:00.697485 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc07d34c-6231-4d1d-bd64-79d325db1298" containerName="collect-profiles"
Dec 01 09:46:00 crc kubenswrapper[5004]: I1201 09:46:00.697491 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc07d34c-6231-4d1d-bd64-79d325db1298" containerName="collect-profiles"
Dec 01 09:46:00 crc kubenswrapper[5004]: E1201 09:46:00.697517 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9534f7b-b546-4e25-89ad-9b4b40221b7a" containerName="extract-utilities"
Dec 01 09:46:00 crc kubenswrapper[5004]: I1201 09:46:00.697524 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9534f7b-b546-4e25-89ad-9b4b40221b7a" containerName="extract-utilities"
Dec 01 09:46:00 crc kubenswrapper[5004]: E1201 09:46:00.697534 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9534f7b-b546-4e25-89ad-9b4b40221b7a" containerName="extract-content"
Dec 01 09:46:00 crc kubenswrapper[5004]: I1201 09:46:00.697540 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9534f7b-b546-4e25-89ad-9b4b40221b7a" containerName="extract-content"
Dec 01 09:46:00 crc kubenswrapper[5004]: I1201 09:46:00.697787 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9534f7b-b546-4e25-89ad-9b4b40221b7a" containerName="registry-server"
Dec 01 09:46:00 crc kubenswrapper[5004]: I1201 09:46:00.697817 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc07d34c-6231-4d1d-bd64-79d325db1298" containerName="collect-profiles"
Dec 01 09:46:00 crc kubenswrapper[5004]: I1201 09:46:00.699496 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fntzv"
Dec 01 09:46:00 crc kubenswrapper[5004]: I1201 09:46:00.719435 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fntzv"]
Dec 01 09:46:00 crc kubenswrapper[5004]: I1201 09:46:00.759450 5004 scope.go:117] "RemoveContainer" containerID="844931be84dd9ed1d62e412f6ab6e17f8e3efdf77d93563aaf4b60bee8477332"
Dec 01 09:46:00 crc kubenswrapper[5004]: E1201 09:46:00.759893 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce"
Dec 01 09:46:00 crc kubenswrapper[5004]: I1201 09:46:00.766966 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zkjv\" (UniqueName: \"kubernetes.io/projected/7f849ff7-2248-4c69-9ed3-4a3f9babbc5e-kube-api-access-7zkjv\") pod \"community-operators-fntzv\" (UID: \"7f849ff7-2248-4c69-9ed3-4a3f9babbc5e\") " pod="openshift-marketplace/community-operators-fntzv"
Dec 01 09:46:00 crc kubenswrapper[5004]: I1201 09:46:00.768791 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f849ff7-2248-4c69-9ed3-4a3f9babbc5e-catalog-content\") pod \"community-operators-fntzv\" (UID: \"7f849ff7-2248-4c69-9ed3-4a3f9babbc5e\") " pod="openshift-marketplace/community-operators-fntzv"
Dec 01 09:46:00 crc kubenswrapper[5004]: I1201 09:46:00.768878 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f849ff7-2248-4c69-9ed3-4a3f9babbc5e-utilities\") pod \"community-operators-fntzv\" (UID: \"7f849ff7-2248-4c69-9ed3-4a3f9babbc5e\") " pod="openshift-marketplace/community-operators-fntzv"
Dec 01 09:46:00 crc kubenswrapper[5004]: I1201 09:46:00.871509 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7zkjv\" (UniqueName: \"kubernetes.io/projected/7f849ff7-2248-4c69-9ed3-4a3f9babbc5e-kube-api-access-7zkjv\") pod \"community-operators-fntzv\" (UID: \"7f849ff7-2248-4c69-9ed3-4a3f9babbc5e\") " pod="openshift-marketplace/community-operators-fntzv"
Dec 01 09:46:00 crc kubenswrapper[5004]: I1201 09:46:00.871867 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f849ff7-2248-4c69-9ed3-4a3f9babbc5e-catalog-content\") pod \"community-operators-fntzv\" (UID: \"7f849ff7-2248-4c69-9ed3-4a3f9babbc5e\") " pod="openshift-marketplace/community-operators-fntzv"
Dec 01 09:46:00 crc kubenswrapper[5004]: I1201 09:46:00.871938 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f849ff7-2248-4c69-9ed3-4a3f9babbc5e-utilities\") pod \"community-operators-fntzv\" (UID: \"7f849ff7-2248-4c69-9ed3-4a3f9babbc5e\") " pod="openshift-marketplace/community-operators-fntzv"
Dec 01 09:46:00 crc kubenswrapper[5004]: I1201 09:46:00.872388 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f849ff7-2248-4c69-9ed3-4a3f9babbc5e-catalog-content\") pod \"community-operators-fntzv\" (UID: \"7f849ff7-2248-4c69-9ed3-4a3f9babbc5e\") " pod="openshift-marketplace/community-operators-fntzv"
Dec 01 09:46:00 crc kubenswrapper[5004]: I1201 09:46:00.872408 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f849ff7-2248-4c69-9ed3-4a3f9babbc5e-utilities\") pod \"community-operators-fntzv\" (UID: \"7f849ff7-2248-4c69-9ed3-4a3f9babbc5e\") " pod="openshift-marketplace/community-operators-fntzv"
Dec 01 09:46:00 crc kubenswrapper[5004]: I1201 09:46:00.891348 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zkjv\" (UniqueName: \"kubernetes.io/projected/7f849ff7-2248-4c69-9ed3-4a3f9babbc5e-kube-api-access-7zkjv\") pod \"community-operators-fntzv\" (UID: \"7f849ff7-2248-4c69-9ed3-4a3f9babbc5e\") " pod="openshift-marketplace/community-operators-fntzv"
Dec 01 09:46:01 crc kubenswrapper[5004]: I1201 09:46:01.040351 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fntzv"
Dec 01 09:46:01 crc kubenswrapper[5004]: I1201 09:46:01.558284 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fntzv"]
Dec 01 09:46:01 crc kubenswrapper[5004]: I1201 09:46:01.613513 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fntzv" event={"ID":"7f849ff7-2248-4c69-9ed3-4a3f9babbc5e","Type":"ContainerStarted","Data":"c37de56c859f2e75f399f5bef524a972286be7cd297447ff558340e63289b20b"}
Dec 01 09:46:02 crc kubenswrapper[5004]: I1201 09:46:02.625419 5004 generic.go:334] "Generic (PLEG): container finished" podID="7f849ff7-2248-4c69-9ed3-4a3f9babbc5e" containerID="7c492032d0eecd02ca3f50b50b07dc38ba84734e8fbc2f8bbdbc16547aff5a89" exitCode=0
Dec 01 09:46:02 crc kubenswrapper[5004]: I1201 09:46:02.625516 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fntzv" event={"ID":"7f849ff7-2248-4c69-9ed3-4a3f9babbc5e","Type":"ContainerDied","Data":"7c492032d0eecd02ca3f50b50b07dc38ba84734e8fbc2f8bbdbc16547aff5a89"}
Dec 01 09:46:02 crc kubenswrapper[5004]: I1201 09:46:02.628599 5004 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Dec 01 09:46:04 crc kubenswrapper[5004]: I1201 09:46:04.648116 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fntzv" event={"ID":"7f849ff7-2248-4c69-9ed3-4a3f9babbc5e","Type":"ContainerStarted","Data":"3055576d4ca2504ad093c589eab6c76fceb7318578f15fd6f07e8a6370d26db9"}
Dec 01 09:46:05 crc kubenswrapper[5004]: I1201 09:46:05.659092 5004 generic.go:334] "Generic (PLEG): container finished" podID="7f849ff7-2248-4c69-9ed3-4a3f9babbc5e" containerID="3055576d4ca2504ad093c589eab6c76fceb7318578f15fd6f07e8a6370d26db9" exitCode=0
Dec 01 09:46:05 crc kubenswrapper[5004]: I1201 09:46:05.659170 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fntzv" event={"ID":"7f849ff7-2248-4c69-9ed3-4a3f9babbc5e","Type":"ContainerDied","Data":"3055576d4ca2504ad093c589eab6c76fceb7318578f15fd6f07e8a6370d26db9"}
Dec 01 09:46:06 crc kubenswrapper[5004]: I1201 09:46:06.673264 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fntzv" event={"ID":"7f849ff7-2248-4c69-9ed3-4a3f9babbc5e","Type":"ContainerStarted","Data":"1bce77bf792e3f1899478fb1485806d441acbbd21b6f509ce0d8c338b0e95bfe"}
Dec 01 09:46:06 crc kubenswrapper[5004]: I1201 09:46:06.693934 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-fntzv" podStartSLOduration=3.218940041 podStartE2EDuration="6.693914429s" podCreationTimestamp="2025-12-01 09:46:00 +0000 UTC" firstStartedPulling="2025-12-01 09:46:02.627791301 +0000 UTC m=+5340.192783283" lastFinishedPulling="2025-12-01 09:46:06.102765689 +0000 UTC m=+5343.667757671" observedRunningTime="2025-12-01 09:46:06.692885114 +0000 UTC m=+5344.257877106" watchObservedRunningTime="2025-12-01 09:46:06.693914429 +0000 UTC m=+5344.258906411"
Dec 01 09:46:11 crc kubenswrapper[5004]: I1201 09:46:11.041152 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-fntzv"
Dec 01 09:46:11 crc kubenswrapper[5004]: I1201 09:46:11.041796 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-fntzv"
Dec 01 09:46:11 crc kubenswrapper[5004]: I1201 09:46:11.088985 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-fntzv"
Dec 01 09:46:11 crc kubenswrapper[5004]: I1201 09:46:11.779199 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-fntzv"
Dec 01 09:46:11 crc kubenswrapper[5004]: I1201 09:46:11.835144 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fntzv"]
Dec 01 09:46:13 crc kubenswrapper[5004]: I1201 09:46:13.744640 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-fntzv" podUID="7f849ff7-2248-4c69-9ed3-4a3f9babbc5e" containerName="registry-server" containerID="cri-o://1bce77bf792e3f1899478fb1485806d441acbbd21b6f509ce0d8c338b0e95bfe" gracePeriod=2
Dec 01 09:46:14 crc kubenswrapper[5004]: I1201 09:46:14.410025 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fntzv"
Dec 01 09:46:14 crc kubenswrapper[5004]: I1201 09:46:14.515499 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f849ff7-2248-4c69-9ed3-4a3f9babbc5e-utilities\") pod \"7f849ff7-2248-4c69-9ed3-4a3f9babbc5e\" (UID: \"7f849ff7-2248-4c69-9ed3-4a3f9babbc5e\") "
Dec 01 09:46:14 crc kubenswrapper[5004]: I1201 09:46:14.515635 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f849ff7-2248-4c69-9ed3-4a3f9babbc5e-catalog-content\") pod \"7f849ff7-2248-4c69-9ed3-4a3f9babbc5e\" (UID: \"7f849ff7-2248-4c69-9ed3-4a3f9babbc5e\") "
Dec 01 09:46:14 crc kubenswrapper[5004]: I1201 09:46:14.515709 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7zkjv\" (UniqueName: \"kubernetes.io/projected/7f849ff7-2248-4c69-9ed3-4a3f9babbc5e-kube-api-access-7zkjv\") pod \"7f849ff7-2248-4c69-9ed3-4a3f9babbc5e\" (UID: \"7f849ff7-2248-4c69-9ed3-4a3f9babbc5e\") "
Dec 01 09:46:14 crc kubenswrapper[5004]: I1201 09:46:14.516635 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f849ff7-2248-4c69-9ed3-4a3f9babbc5e-utilities" (OuterVolumeSpecName: "utilities") pod "7f849ff7-2248-4c69-9ed3-4a3f9babbc5e" (UID: "7f849ff7-2248-4c69-9ed3-4a3f9babbc5e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 09:46:14 crc kubenswrapper[5004]: I1201 09:46:14.533859 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f849ff7-2248-4c69-9ed3-4a3f9babbc5e-kube-api-access-7zkjv" (OuterVolumeSpecName: "kube-api-access-7zkjv") pod "7f849ff7-2248-4c69-9ed3-4a3f9babbc5e" (UID: "7f849ff7-2248-4c69-9ed3-4a3f9babbc5e"). InnerVolumeSpecName "kube-api-access-7zkjv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 09:46:14 crc kubenswrapper[5004]: I1201 09:46:14.593654 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f849ff7-2248-4c69-9ed3-4a3f9babbc5e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7f849ff7-2248-4c69-9ed3-4a3f9babbc5e" (UID: "7f849ff7-2248-4c69-9ed3-4a3f9babbc5e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 09:46:14 crc kubenswrapper[5004]: I1201 09:46:14.619453 5004 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f849ff7-2248-4c69-9ed3-4a3f9babbc5e-utilities\") on node \"crc\" DevicePath \"\""
Dec 01 09:46:14 crc kubenswrapper[5004]: I1201 09:46:14.619755 5004 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f849ff7-2248-4c69-9ed3-4a3f9babbc5e-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 01 09:46:14 crc kubenswrapper[5004]: I1201 09:46:14.619876 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7zkjv\" (UniqueName: \"kubernetes.io/projected/7f849ff7-2248-4c69-9ed3-4a3f9babbc5e-kube-api-access-7zkjv\") on node \"crc\" DevicePath \"\""
Dec 01 09:46:14 crc kubenswrapper[5004]: I1201 09:46:14.759052 5004 generic.go:334] "Generic (PLEG): container finished" podID="7f849ff7-2248-4c69-9ed3-4a3f9babbc5e" containerID="1bce77bf792e3f1899478fb1485806d441acbbd21b6f509ce0d8c338b0e95bfe" exitCode=0
Dec 01 09:46:14 crc kubenswrapper[5004]: I1201 09:46:14.759156 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fntzv"
Dec 01 09:46:14 crc kubenswrapper[5004]: I1201 09:46:14.779977 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fntzv" event={"ID":"7f849ff7-2248-4c69-9ed3-4a3f9babbc5e","Type":"ContainerDied","Data":"1bce77bf792e3f1899478fb1485806d441acbbd21b6f509ce0d8c338b0e95bfe"}
Dec 01 09:46:14 crc kubenswrapper[5004]: I1201 09:46:14.780035 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fntzv" event={"ID":"7f849ff7-2248-4c69-9ed3-4a3f9babbc5e","Type":"ContainerDied","Data":"c37de56c859f2e75f399f5bef524a972286be7cd297447ff558340e63289b20b"}
Dec 01 09:46:14 crc kubenswrapper[5004]: I1201 09:46:14.780063 5004 scope.go:117] "RemoveContainer" containerID="1bce77bf792e3f1899478fb1485806d441acbbd21b6f509ce0d8c338b0e95bfe"
Dec 01 09:46:14 crc kubenswrapper[5004]: I1201 09:46:14.805207 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fntzv"]
Dec 01 09:46:14 crc kubenswrapper[5004]: I1201 09:46:14.809994 5004 scope.go:117] "RemoveContainer" containerID="3055576d4ca2504ad093c589eab6c76fceb7318578f15fd6f07e8a6370d26db9"
Dec 01 09:46:14 crc kubenswrapper[5004]: I1201 09:46:14.820054 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-fntzv"]
Dec 01 09:46:14 crc kubenswrapper[5004]: I1201 09:46:14.833460 5004 scope.go:117] "RemoveContainer" containerID="7c492032d0eecd02ca3f50b50b07dc38ba84734e8fbc2f8bbdbc16547aff5a89"
Dec 01 09:46:14 crc kubenswrapper[5004]: I1201 09:46:14.885081 5004 scope.go:117] "RemoveContainer" containerID="1bce77bf792e3f1899478fb1485806d441acbbd21b6f509ce0d8c338b0e95bfe"
Dec 01 09:46:14 crc kubenswrapper[5004]: E1201 09:46:14.885701 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container
\"1bce77bf792e3f1899478fb1485806d441acbbd21b6f509ce0d8c338b0e95bfe\": container with ID starting with 1bce77bf792e3f1899478fb1485806d441acbbd21b6f509ce0d8c338b0e95bfe not found: ID does not exist" containerID="1bce77bf792e3f1899478fb1485806d441acbbd21b6f509ce0d8c338b0e95bfe" Dec 01 09:46:14 crc kubenswrapper[5004]: I1201 09:46:14.885778 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1bce77bf792e3f1899478fb1485806d441acbbd21b6f509ce0d8c338b0e95bfe"} err="failed to get container status \"1bce77bf792e3f1899478fb1485806d441acbbd21b6f509ce0d8c338b0e95bfe\": rpc error: code = NotFound desc = could not find container \"1bce77bf792e3f1899478fb1485806d441acbbd21b6f509ce0d8c338b0e95bfe\": container with ID starting with 1bce77bf792e3f1899478fb1485806d441acbbd21b6f509ce0d8c338b0e95bfe not found: ID does not exist" Dec 01 09:46:14 crc kubenswrapper[5004]: I1201 09:46:14.885819 5004 scope.go:117] "RemoveContainer" containerID="3055576d4ca2504ad093c589eab6c76fceb7318578f15fd6f07e8a6370d26db9" Dec 01 09:46:14 crc kubenswrapper[5004]: E1201 09:46:14.886160 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3055576d4ca2504ad093c589eab6c76fceb7318578f15fd6f07e8a6370d26db9\": container with ID starting with 3055576d4ca2504ad093c589eab6c76fceb7318578f15fd6f07e8a6370d26db9 not found: ID does not exist" containerID="3055576d4ca2504ad093c589eab6c76fceb7318578f15fd6f07e8a6370d26db9" Dec 01 09:46:14 crc kubenswrapper[5004]: I1201 09:46:14.886202 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3055576d4ca2504ad093c589eab6c76fceb7318578f15fd6f07e8a6370d26db9"} err="failed to get container status \"3055576d4ca2504ad093c589eab6c76fceb7318578f15fd6f07e8a6370d26db9\": rpc error: code = NotFound desc = could not find container \"3055576d4ca2504ad093c589eab6c76fceb7318578f15fd6f07e8a6370d26db9\": container with ID 
starting with 3055576d4ca2504ad093c589eab6c76fceb7318578f15fd6f07e8a6370d26db9 not found: ID does not exist" Dec 01 09:46:14 crc kubenswrapper[5004]: I1201 09:46:14.886231 5004 scope.go:117] "RemoveContainer" containerID="7c492032d0eecd02ca3f50b50b07dc38ba84734e8fbc2f8bbdbc16547aff5a89" Dec 01 09:46:14 crc kubenswrapper[5004]: E1201 09:46:14.886656 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c492032d0eecd02ca3f50b50b07dc38ba84734e8fbc2f8bbdbc16547aff5a89\": container with ID starting with 7c492032d0eecd02ca3f50b50b07dc38ba84734e8fbc2f8bbdbc16547aff5a89 not found: ID does not exist" containerID="7c492032d0eecd02ca3f50b50b07dc38ba84734e8fbc2f8bbdbc16547aff5a89" Dec 01 09:46:14 crc kubenswrapper[5004]: I1201 09:46:14.886687 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c492032d0eecd02ca3f50b50b07dc38ba84734e8fbc2f8bbdbc16547aff5a89"} err="failed to get container status \"7c492032d0eecd02ca3f50b50b07dc38ba84734e8fbc2f8bbdbc16547aff5a89\": rpc error: code = NotFound desc = could not find container \"7c492032d0eecd02ca3f50b50b07dc38ba84734e8fbc2f8bbdbc16547aff5a89\": container with ID starting with 7c492032d0eecd02ca3f50b50b07dc38ba84734e8fbc2f8bbdbc16547aff5a89 not found: ID does not exist" Dec 01 09:46:15 crc kubenswrapper[5004]: I1201 09:46:15.759542 5004 scope.go:117] "RemoveContainer" containerID="844931be84dd9ed1d62e412f6ab6e17f8e3efdf77d93563aaf4b60bee8477332" Dec 01 09:46:15 crc kubenswrapper[5004]: E1201 09:46:15.760278 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" 
podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 09:46:16 crc kubenswrapper[5004]: I1201 09:46:16.778926 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f849ff7-2248-4c69-9ed3-4a3f9babbc5e" path="/var/lib/kubelet/pods/7f849ff7-2248-4c69-9ed3-4a3f9babbc5e/volumes" Dec 01 09:46:27 crc kubenswrapper[5004]: I1201 09:46:27.759378 5004 scope.go:117] "RemoveContainer" containerID="844931be84dd9ed1d62e412f6ab6e17f8e3efdf77d93563aaf4b60bee8477332" Dec 01 09:46:27 crc kubenswrapper[5004]: E1201 09:46:27.760098 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 09:46:42 crc kubenswrapper[5004]: I1201 09:46:42.770714 5004 scope.go:117] "RemoveContainer" containerID="844931be84dd9ed1d62e412f6ab6e17f8e3efdf77d93563aaf4b60bee8477332" Dec 01 09:46:42 crc kubenswrapper[5004]: E1201 09:46:42.771878 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 09:46:55 crc kubenswrapper[5004]: I1201 09:46:55.759252 5004 scope.go:117] "RemoveContainer" containerID="844931be84dd9ed1d62e412f6ab6e17f8e3efdf77d93563aaf4b60bee8477332" Dec 01 09:46:55 crc kubenswrapper[5004]: E1201 09:46:55.760200 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 09:47:10 crc kubenswrapper[5004]: I1201 09:47:10.759909 5004 scope.go:117] "RemoveContainer" containerID="844931be84dd9ed1d62e412f6ab6e17f8e3efdf77d93563aaf4b60bee8477332" Dec 01 09:47:10 crc kubenswrapper[5004]: E1201 09:47:10.761171 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 09:47:25 crc kubenswrapper[5004]: I1201 09:47:25.759344 5004 scope.go:117] "RemoveContainer" containerID="844931be84dd9ed1d62e412f6ab6e17f8e3efdf77d93563aaf4b60bee8477332" Dec 01 09:47:25 crc kubenswrapper[5004]: E1201 09:47:25.760230 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 09:47:38 crc kubenswrapper[5004]: I1201 09:47:38.759003 5004 scope.go:117] "RemoveContainer" containerID="844931be84dd9ed1d62e412f6ab6e17f8e3efdf77d93563aaf4b60bee8477332" Dec 01 09:47:38 crc kubenswrapper[5004]: E1201 09:47:38.760197 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 09:47:53 crc kubenswrapper[5004]: I1201 09:47:53.759141 5004 scope.go:117] "RemoveContainer" containerID="844931be84dd9ed1d62e412f6ab6e17f8e3efdf77d93563aaf4b60bee8477332" Dec 01 09:47:53 crc kubenswrapper[5004]: E1201 09:47:53.760440 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 09:48:04 crc kubenswrapper[5004]: I1201 09:48:04.759990 5004 scope.go:117] "RemoveContainer" containerID="844931be84dd9ed1d62e412f6ab6e17f8e3efdf77d93563aaf4b60bee8477332" Dec 01 09:48:04 crc kubenswrapper[5004]: E1201 09:48:04.761039 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 09:48:19 crc kubenswrapper[5004]: I1201 09:48:19.759520 5004 scope.go:117] "RemoveContainer" containerID="844931be84dd9ed1d62e412f6ab6e17f8e3efdf77d93563aaf4b60bee8477332" Dec 01 09:48:19 crc kubenswrapper[5004]: E1201 09:48:19.760466 5004 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 09:48:31 crc kubenswrapper[5004]: I1201 09:48:31.758346 5004 scope.go:117] "RemoveContainer" containerID="844931be84dd9ed1d62e412f6ab6e17f8e3efdf77d93563aaf4b60bee8477332" Dec 01 09:48:31 crc kubenswrapper[5004]: E1201 09:48:31.759195 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 09:48:46 crc kubenswrapper[5004]: I1201 09:48:46.759019 5004 scope.go:117] "RemoveContainer" containerID="844931be84dd9ed1d62e412f6ab6e17f8e3efdf77d93563aaf4b60bee8477332" Dec 01 09:48:46 crc kubenswrapper[5004]: E1201 09:48:46.759880 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 09:48:59 crc kubenswrapper[5004]: I1201 09:48:59.759635 5004 scope.go:117] "RemoveContainer" containerID="844931be84dd9ed1d62e412f6ab6e17f8e3efdf77d93563aaf4b60bee8477332" Dec 01 09:48:59 crc kubenswrapper[5004]: E1201 09:48:59.760444 5004 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 09:49:10 crc kubenswrapper[5004]: I1201 09:49:10.759355 5004 scope.go:117] "RemoveContainer" containerID="844931be84dd9ed1d62e412f6ab6e17f8e3efdf77d93563aaf4b60bee8477332" Dec 01 09:49:10 crc kubenswrapper[5004]: E1201 09:49:10.760059 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 09:49:23 crc kubenswrapper[5004]: I1201 09:49:23.759311 5004 scope.go:117] "RemoveContainer" containerID="844931be84dd9ed1d62e412f6ab6e17f8e3efdf77d93563aaf4b60bee8477332" Dec 01 09:49:23 crc kubenswrapper[5004]: E1201 09:49:23.760213 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 09:49:34 crc kubenswrapper[5004]: I1201 09:49:34.759464 5004 scope.go:117] "RemoveContainer" containerID="844931be84dd9ed1d62e412f6ab6e17f8e3efdf77d93563aaf4b60bee8477332" Dec 01 09:49:34 crc kubenswrapper[5004]: E1201 09:49:34.760271 5004 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 09:49:47 crc kubenswrapper[5004]: I1201 09:49:47.759921 5004 scope.go:117] "RemoveContainer" containerID="844931be84dd9ed1d62e412f6ab6e17f8e3efdf77d93563aaf4b60bee8477332" Dec 01 09:49:48 crc kubenswrapper[5004]: I1201 09:49:48.156238 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" event={"ID":"f9977ebb-82de-4e96-8763-0b5a84f8d4ce","Type":"ContainerStarted","Data":"927d6dc47244ce32c9ccee70e9761b31e1106016f419ef6a20e9de07d977672d"} Dec 01 09:52:08 crc kubenswrapper[5004]: I1201 09:52:08.729818 5004 patch_prober.go:28] interesting pod/machine-config-daemon-fvdgt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 09:52:08 crc kubenswrapper[5004]: I1201 09:52:08.730439 5004 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 09:52:38 crc kubenswrapper[5004]: I1201 09:52:38.729039 5004 patch_prober.go:28] interesting pod/machine-config-daemon-fvdgt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: 
connect: connection refused" start-of-body= Dec 01 09:52:38 crc kubenswrapper[5004]: I1201 09:52:38.729648 5004 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 09:52:59 crc kubenswrapper[5004]: I1201 09:52:59.705961 5004 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="04a6dd3a-f297-40b9-b480-0239383b9460" containerName="galera" probeResult="failure" output="command timed out" Dec 01 09:53:08 crc kubenswrapper[5004]: I1201 09:53:08.729190 5004 patch_prober.go:28] interesting pod/machine-config-daemon-fvdgt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 09:53:08 crc kubenswrapper[5004]: I1201 09:53:08.729853 5004 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 09:53:08 crc kubenswrapper[5004]: I1201 09:53:08.729912 5004 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" Dec 01 09:53:08 crc kubenswrapper[5004]: I1201 09:53:08.731020 5004 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"927d6dc47244ce32c9ccee70e9761b31e1106016f419ef6a20e9de07d977672d"} 
pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 09:53:08 crc kubenswrapper[5004]: I1201 09:53:08.731088 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" containerName="machine-config-daemon" containerID="cri-o://927d6dc47244ce32c9ccee70e9761b31e1106016f419ef6a20e9de07d977672d" gracePeriod=600 Dec 01 09:53:09 crc kubenswrapper[5004]: I1201 09:53:09.832090 5004 generic.go:334] "Generic (PLEG): container finished" podID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" containerID="927d6dc47244ce32c9ccee70e9761b31e1106016f419ef6a20e9de07d977672d" exitCode=0 Dec 01 09:53:09 crc kubenswrapper[5004]: I1201 09:53:09.832152 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" event={"ID":"f9977ebb-82de-4e96-8763-0b5a84f8d4ce","Type":"ContainerDied","Data":"927d6dc47244ce32c9ccee70e9761b31e1106016f419ef6a20e9de07d977672d"} Dec 01 09:53:09 crc kubenswrapper[5004]: I1201 09:53:09.832503 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" event={"ID":"f9977ebb-82de-4e96-8763-0b5a84f8d4ce","Type":"ContainerStarted","Data":"1888096bc7eea8f4ed362bde431fc15761a89c0c03d561f41b82ce51165e9213"} Dec 01 09:53:09 crc kubenswrapper[5004]: I1201 09:53:09.832530 5004 scope.go:117] "RemoveContainer" containerID="844931be84dd9ed1d62e412f6ab6e17f8e3efdf77d93563aaf4b60bee8477332" Dec 01 09:54:00 crc kubenswrapper[5004]: I1201 09:54:00.650209 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-4s98v"] Dec 01 09:54:00 crc kubenswrapper[5004]: E1201 09:54:00.651298 5004 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="7f849ff7-2248-4c69-9ed3-4a3f9babbc5e" containerName="extract-utilities" Dec 01 09:54:00 crc kubenswrapper[5004]: I1201 09:54:00.651312 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f849ff7-2248-4c69-9ed3-4a3f9babbc5e" containerName="extract-utilities" Dec 01 09:54:00 crc kubenswrapper[5004]: E1201 09:54:00.651332 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f849ff7-2248-4c69-9ed3-4a3f9babbc5e" containerName="registry-server" Dec 01 09:54:00 crc kubenswrapper[5004]: I1201 09:54:00.651338 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f849ff7-2248-4c69-9ed3-4a3f9babbc5e" containerName="registry-server" Dec 01 09:54:00 crc kubenswrapper[5004]: E1201 09:54:00.651365 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f849ff7-2248-4c69-9ed3-4a3f9babbc5e" containerName="extract-content" Dec 01 09:54:00 crc kubenswrapper[5004]: I1201 09:54:00.651372 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f849ff7-2248-4c69-9ed3-4a3f9babbc5e" containerName="extract-content" Dec 01 09:54:00 crc kubenswrapper[5004]: I1201 09:54:00.651615 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f849ff7-2248-4c69-9ed3-4a3f9babbc5e" containerName="registry-server" Dec 01 09:54:00 crc kubenswrapper[5004]: I1201 09:54:00.653480 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4s98v" Dec 01 09:54:00 crc kubenswrapper[5004]: I1201 09:54:00.664663 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4s98v"] Dec 01 09:54:00 crc kubenswrapper[5004]: I1201 09:54:00.759902 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6af8218-c7bd-4b24-85c9-aa9e84b137f0-catalog-content\") pod \"redhat-operators-4s98v\" (UID: \"b6af8218-c7bd-4b24-85c9-aa9e84b137f0\") " pod="openshift-marketplace/redhat-operators-4s98v" Dec 01 09:54:00 crc kubenswrapper[5004]: I1201 09:54:00.760144 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6af8218-c7bd-4b24-85c9-aa9e84b137f0-utilities\") pod \"redhat-operators-4s98v\" (UID: \"b6af8218-c7bd-4b24-85c9-aa9e84b137f0\") " pod="openshift-marketplace/redhat-operators-4s98v" Dec 01 09:54:00 crc kubenswrapper[5004]: I1201 09:54:00.760203 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdxhk\" (UniqueName: \"kubernetes.io/projected/b6af8218-c7bd-4b24-85c9-aa9e84b137f0-kube-api-access-rdxhk\") pod \"redhat-operators-4s98v\" (UID: \"b6af8218-c7bd-4b24-85c9-aa9e84b137f0\") " pod="openshift-marketplace/redhat-operators-4s98v" Dec 01 09:54:00 crc kubenswrapper[5004]: I1201 09:54:00.862487 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6af8218-c7bd-4b24-85c9-aa9e84b137f0-utilities\") pod \"redhat-operators-4s98v\" (UID: \"b6af8218-c7bd-4b24-85c9-aa9e84b137f0\") " pod="openshift-marketplace/redhat-operators-4s98v" Dec 01 09:54:00 crc kubenswrapper[5004]: I1201 09:54:00.862594 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-rdxhk\" (UniqueName: \"kubernetes.io/projected/b6af8218-c7bd-4b24-85c9-aa9e84b137f0-kube-api-access-rdxhk\") pod \"redhat-operators-4s98v\" (UID: \"b6af8218-c7bd-4b24-85c9-aa9e84b137f0\") " pod="openshift-marketplace/redhat-operators-4s98v" Dec 01 09:54:00 crc kubenswrapper[5004]: I1201 09:54:00.862821 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6af8218-c7bd-4b24-85c9-aa9e84b137f0-catalog-content\") pod \"redhat-operators-4s98v\" (UID: \"b6af8218-c7bd-4b24-85c9-aa9e84b137f0\") " pod="openshift-marketplace/redhat-operators-4s98v" Dec 01 09:54:00 crc kubenswrapper[5004]: I1201 09:54:00.863623 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6af8218-c7bd-4b24-85c9-aa9e84b137f0-utilities\") pod \"redhat-operators-4s98v\" (UID: \"b6af8218-c7bd-4b24-85c9-aa9e84b137f0\") " pod="openshift-marketplace/redhat-operators-4s98v" Dec 01 09:54:00 crc kubenswrapper[5004]: I1201 09:54:00.863632 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6af8218-c7bd-4b24-85c9-aa9e84b137f0-catalog-content\") pod \"redhat-operators-4s98v\" (UID: \"b6af8218-c7bd-4b24-85c9-aa9e84b137f0\") " pod="openshift-marketplace/redhat-operators-4s98v" Dec 01 09:54:00 crc kubenswrapper[5004]: I1201 09:54:00.884880 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdxhk\" (UniqueName: \"kubernetes.io/projected/b6af8218-c7bd-4b24-85c9-aa9e84b137f0-kube-api-access-rdxhk\") pod \"redhat-operators-4s98v\" (UID: \"b6af8218-c7bd-4b24-85c9-aa9e84b137f0\") " pod="openshift-marketplace/redhat-operators-4s98v" Dec 01 09:54:00 crc kubenswrapper[5004]: I1201 09:54:00.976126 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4s98v" Dec 01 09:54:01 crc kubenswrapper[5004]: I1201 09:54:01.562381 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4s98v"] Dec 01 09:54:02 crc kubenswrapper[5004]: I1201 09:54:02.408380 5004 generic.go:334] "Generic (PLEG): container finished" podID="b6af8218-c7bd-4b24-85c9-aa9e84b137f0" containerID="005ada188d63e931d23c080852151c05d091a0d54a88b460a20b8a2734c8c7b7" exitCode=0 Dec 01 09:54:02 crc kubenswrapper[5004]: I1201 09:54:02.408488 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4s98v" event={"ID":"b6af8218-c7bd-4b24-85c9-aa9e84b137f0","Type":"ContainerDied","Data":"005ada188d63e931d23c080852151c05d091a0d54a88b460a20b8a2734c8c7b7"} Dec 01 09:54:02 crc kubenswrapper[5004]: I1201 09:54:02.408669 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4s98v" event={"ID":"b6af8218-c7bd-4b24-85c9-aa9e84b137f0","Type":"ContainerStarted","Data":"a617ad746e32a53e0e12cda49646bd236c62330e835b35283710930b9e9876fe"} Dec 01 09:54:02 crc kubenswrapper[5004]: I1201 09:54:02.410938 5004 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 01 09:54:04 crc kubenswrapper[5004]: I1201 09:54:04.431507 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4s98v" event={"ID":"b6af8218-c7bd-4b24-85c9-aa9e84b137f0","Type":"ContainerStarted","Data":"169e0fd5d7431d8a6d77d82daabe9d821d1a53c1b32493ce2fcd3e358af84dec"} Dec 01 09:54:08 crc kubenswrapper[5004]: I1201 09:54:08.478992 5004 generic.go:334] "Generic (PLEG): container finished" podID="b6af8218-c7bd-4b24-85c9-aa9e84b137f0" containerID="169e0fd5d7431d8a6d77d82daabe9d821d1a53c1b32493ce2fcd3e358af84dec" exitCode=0 Dec 01 09:54:08 crc kubenswrapper[5004]: I1201 09:54:08.479104 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-4s98v" event={"ID":"b6af8218-c7bd-4b24-85c9-aa9e84b137f0","Type":"ContainerDied","Data":"169e0fd5d7431d8a6d77d82daabe9d821d1a53c1b32493ce2fcd3e358af84dec"} Dec 01 09:54:09 crc kubenswrapper[5004]: I1201 09:54:09.495514 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4s98v" event={"ID":"b6af8218-c7bd-4b24-85c9-aa9e84b137f0","Type":"ContainerStarted","Data":"36cf82c4197615f91897fd56ae88be9bad650af7326b16f122d09d273e6e9ea2"} Dec 01 09:54:09 crc kubenswrapper[5004]: I1201 09:54:09.523033 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-4s98v" podStartSLOduration=2.85688039 podStartE2EDuration="9.523014031s" podCreationTimestamp="2025-12-01 09:54:00 +0000 UTC" firstStartedPulling="2025-12-01 09:54:02.410582804 +0000 UTC m=+5819.975574796" lastFinishedPulling="2025-12-01 09:54:09.076716455 +0000 UTC m=+5826.641708437" observedRunningTime="2025-12-01 09:54:09.520335466 +0000 UTC m=+5827.085327458" watchObservedRunningTime="2025-12-01 09:54:09.523014031 +0000 UTC m=+5827.088006013" Dec 01 09:54:10 crc kubenswrapper[5004]: I1201 09:54:10.977094 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-4s98v" Dec 01 09:54:10 crc kubenswrapper[5004]: I1201 09:54:10.977459 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-4s98v" Dec 01 09:54:12 crc kubenswrapper[5004]: I1201 09:54:12.023395 5004 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-4s98v" podUID="b6af8218-c7bd-4b24-85c9-aa9e84b137f0" containerName="registry-server" probeResult="failure" output=< Dec 01 09:54:12 crc kubenswrapper[5004]: timeout: failed to connect service ":50051" within 1s Dec 01 09:54:12 crc kubenswrapper[5004]: > Dec 01 09:54:22 crc kubenswrapper[5004]: I1201 
09:54:22.029670 5004 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-4s98v" podUID="b6af8218-c7bd-4b24-85c9-aa9e84b137f0" containerName="registry-server" probeResult="failure" output=< Dec 01 09:54:22 crc kubenswrapper[5004]: timeout: failed to connect service ":50051" within 1s Dec 01 09:54:22 crc kubenswrapper[5004]: > Dec 01 09:54:31 crc kubenswrapper[5004]: I1201 09:54:31.026801 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-4s98v" Dec 01 09:54:31 crc kubenswrapper[5004]: I1201 09:54:31.086019 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-4s98v" Dec 01 09:54:31 crc kubenswrapper[5004]: I1201 09:54:31.853617 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4s98v"] Dec 01 09:54:32 crc kubenswrapper[5004]: I1201 09:54:32.088553 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-4s98v" podUID="b6af8218-c7bd-4b24-85c9-aa9e84b137f0" containerName="registry-server" containerID="cri-o://36cf82c4197615f91897fd56ae88be9bad650af7326b16f122d09d273e6e9ea2" gracePeriod=2 Dec 01 09:54:32 crc kubenswrapper[5004]: I1201 09:54:32.654586 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4s98v" Dec 01 09:54:32 crc kubenswrapper[5004]: I1201 09:54:32.766161 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rdxhk\" (UniqueName: \"kubernetes.io/projected/b6af8218-c7bd-4b24-85c9-aa9e84b137f0-kube-api-access-rdxhk\") pod \"b6af8218-c7bd-4b24-85c9-aa9e84b137f0\" (UID: \"b6af8218-c7bd-4b24-85c9-aa9e84b137f0\") " Dec 01 09:54:32 crc kubenswrapper[5004]: I1201 09:54:32.766369 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6af8218-c7bd-4b24-85c9-aa9e84b137f0-catalog-content\") pod \"b6af8218-c7bd-4b24-85c9-aa9e84b137f0\" (UID: \"b6af8218-c7bd-4b24-85c9-aa9e84b137f0\") " Dec 01 09:54:32 crc kubenswrapper[5004]: I1201 09:54:32.766624 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6af8218-c7bd-4b24-85c9-aa9e84b137f0-utilities\") pod \"b6af8218-c7bd-4b24-85c9-aa9e84b137f0\" (UID: \"b6af8218-c7bd-4b24-85c9-aa9e84b137f0\") " Dec 01 09:54:32 crc kubenswrapper[5004]: I1201 09:54:32.767486 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b6af8218-c7bd-4b24-85c9-aa9e84b137f0-utilities" (OuterVolumeSpecName: "utilities") pod "b6af8218-c7bd-4b24-85c9-aa9e84b137f0" (UID: "b6af8218-c7bd-4b24-85c9-aa9e84b137f0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:54:32 crc kubenswrapper[5004]: I1201 09:54:32.773011 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6af8218-c7bd-4b24-85c9-aa9e84b137f0-kube-api-access-rdxhk" (OuterVolumeSpecName: "kube-api-access-rdxhk") pod "b6af8218-c7bd-4b24-85c9-aa9e84b137f0" (UID: "b6af8218-c7bd-4b24-85c9-aa9e84b137f0"). InnerVolumeSpecName "kube-api-access-rdxhk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:54:32 crc kubenswrapper[5004]: I1201 09:54:32.869547 5004 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6af8218-c7bd-4b24-85c9-aa9e84b137f0-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 09:54:32 crc kubenswrapper[5004]: I1201 09:54:32.869600 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rdxhk\" (UniqueName: \"kubernetes.io/projected/b6af8218-c7bd-4b24-85c9-aa9e84b137f0-kube-api-access-rdxhk\") on node \"crc\" DevicePath \"\"" Dec 01 09:54:32 crc kubenswrapper[5004]: I1201 09:54:32.876869 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b6af8218-c7bd-4b24-85c9-aa9e84b137f0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b6af8218-c7bd-4b24-85c9-aa9e84b137f0" (UID: "b6af8218-c7bd-4b24-85c9-aa9e84b137f0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:54:32 crc kubenswrapper[5004]: I1201 09:54:32.971945 5004 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6af8218-c7bd-4b24-85c9-aa9e84b137f0-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 09:54:33 crc kubenswrapper[5004]: I1201 09:54:33.100651 5004 generic.go:334] "Generic (PLEG): container finished" podID="b6af8218-c7bd-4b24-85c9-aa9e84b137f0" containerID="36cf82c4197615f91897fd56ae88be9bad650af7326b16f122d09d273e6e9ea2" exitCode=0 Dec 01 09:54:33 crc kubenswrapper[5004]: I1201 09:54:33.100711 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4s98v" event={"ID":"b6af8218-c7bd-4b24-85c9-aa9e84b137f0","Type":"ContainerDied","Data":"36cf82c4197615f91897fd56ae88be9bad650af7326b16f122d09d273e6e9ea2"} Dec 01 09:54:33 crc kubenswrapper[5004]: I1201 09:54:33.100722 5004 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4s98v" Dec 01 09:54:33 crc kubenswrapper[5004]: I1201 09:54:33.100744 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4s98v" event={"ID":"b6af8218-c7bd-4b24-85c9-aa9e84b137f0","Type":"ContainerDied","Data":"a617ad746e32a53e0e12cda49646bd236c62330e835b35283710930b9e9876fe"} Dec 01 09:54:33 crc kubenswrapper[5004]: I1201 09:54:33.100762 5004 scope.go:117] "RemoveContainer" containerID="36cf82c4197615f91897fd56ae88be9bad650af7326b16f122d09d273e6e9ea2" Dec 01 09:54:33 crc kubenswrapper[5004]: I1201 09:54:33.149613 5004 scope.go:117] "RemoveContainer" containerID="169e0fd5d7431d8a6d77d82daabe9d821d1a53c1b32493ce2fcd3e358af84dec" Dec 01 09:54:33 crc kubenswrapper[5004]: I1201 09:54:33.154895 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4s98v"] Dec 01 09:54:33 crc kubenswrapper[5004]: I1201 09:54:33.172931 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-4s98v"] Dec 01 09:54:33 crc kubenswrapper[5004]: I1201 09:54:33.179104 5004 scope.go:117] "RemoveContainer" containerID="005ada188d63e931d23c080852151c05d091a0d54a88b460a20b8a2734c8c7b7" Dec 01 09:54:33 crc kubenswrapper[5004]: I1201 09:54:33.235419 5004 scope.go:117] "RemoveContainer" containerID="36cf82c4197615f91897fd56ae88be9bad650af7326b16f122d09d273e6e9ea2" Dec 01 09:54:33 crc kubenswrapper[5004]: E1201 09:54:33.236025 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36cf82c4197615f91897fd56ae88be9bad650af7326b16f122d09d273e6e9ea2\": container with ID starting with 36cf82c4197615f91897fd56ae88be9bad650af7326b16f122d09d273e6e9ea2 not found: ID does not exist" containerID="36cf82c4197615f91897fd56ae88be9bad650af7326b16f122d09d273e6e9ea2" Dec 01 09:54:33 crc kubenswrapper[5004]: I1201 09:54:33.236066 5004 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36cf82c4197615f91897fd56ae88be9bad650af7326b16f122d09d273e6e9ea2"} err="failed to get container status \"36cf82c4197615f91897fd56ae88be9bad650af7326b16f122d09d273e6e9ea2\": rpc error: code = NotFound desc = could not find container \"36cf82c4197615f91897fd56ae88be9bad650af7326b16f122d09d273e6e9ea2\": container with ID starting with 36cf82c4197615f91897fd56ae88be9bad650af7326b16f122d09d273e6e9ea2 not found: ID does not exist" Dec 01 09:54:33 crc kubenswrapper[5004]: I1201 09:54:33.236094 5004 scope.go:117] "RemoveContainer" containerID="169e0fd5d7431d8a6d77d82daabe9d821d1a53c1b32493ce2fcd3e358af84dec" Dec 01 09:54:33 crc kubenswrapper[5004]: E1201 09:54:33.236417 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"169e0fd5d7431d8a6d77d82daabe9d821d1a53c1b32493ce2fcd3e358af84dec\": container with ID starting with 169e0fd5d7431d8a6d77d82daabe9d821d1a53c1b32493ce2fcd3e358af84dec not found: ID does not exist" containerID="169e0fd5d7431d8a6d77d82daabe9d821d1a53c1b32493ce2fcd3e358af84dec" Dec 01 09:54:33 crc kubenswrapper[5004]: I1201 09:54:33.236473 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"169e0fd5d7431d8a6d77d82daabe9d821d1a53c1b32493ce2fcd3e358af84dec"} err="failed to get container status \"169e0fd5d7431d8a6d77d82daabe9d821d1a53c1b32493ce2fcd3e358af84dec\": rpc error: code = NotFound desc = could not find container \"169e0fd5d7431d8a6d77d82daabe9d821d1a53c1b32493ce2fcd3e358af84dec\": container with ID starting with 169e0fd5d7431d8a6d77d82daabe9d821d1a53c1b32493ce2fcd3e358af84dec not found: ID does not exist" Dec 01 09:54:33 crc kubenswrapper[5004]: I1201 09:54:33.236506 5004 scope.go:117] "RemoveContainer" containerID="005ada188d63e931d23c080852151c05d091a0d54a88b460a20b8a2734c8c7b7" Dec 01 09:54:33 crc kubenswrapper[5004]: E1201 
09:54:33.236912 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"005ada188d63e931d23c080852151c05d091a0d54a88b460a20b8a2734c8c7b7\": container with ID starting with 005ada188d63e931d23c080852151c05d091a0d54a88b460a20b8a2734c8c7b7 not found: ID does not exist" containerID="005ada188d63e931d23c080852151c05d091a0d54a88b460a20b8a2734c8c7b7" Dec 01 09:54:33 crc kubenswrapper[5004]: I1201 09:54:33.236939 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"005ada188d63e931d23c080852151c05d091a0d54a88b460a20b8a2734c8c7b7"} err="failed to get container status \"005ada188d63e931d23c080852151c05d091a0d54a88b460a20b8a2734c8c7b7\": rpc error: code = NotFound desc = could not find container \"005ada188d63e931d23c080852151c05d091a0d54a88b460a20b8a2734c8c7b7\": container with ID starting with 005ada188d63e931d23c080852151c05d091a0d54a88b460a20b8a2734c8c7b7 not found: ID does not exist" Dec 01 09:54:34 crc kubenswrapper[5004]: I1201 09:54:34.775076 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6af8218-c7bd-4b24-85c9-aa9e84b137f0" path="/var/lib/kubelet/pods/b6af8218-c7bd-4b24-85c9-aa9e84b137f0/volumes" Dec 01 09:55:38 crc kubenswrapper[5004]: I1201 09:55:38.729470 5004 patch_prober.go:28] interesting pod/machine-config-daemon-fvdgt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 09:55:38 crc kubenswrapper[5004]: I1201 09:55:38.730190 5004 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Dec 01 09:55:54 crc kubenswrapper[5004]: I1201 09:55:54.434220 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-kgblg"] Dec 01 09:55:54 crc kubenswrapper[5004]: E1201 09:55:54.435463 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6af8218-c7bd-4b24-85c9-aa9e84b137f0" containerName="extract-utilities" Dec 01 09:55:54 crc kubenswrapper[5004]: I1201 09:55:54.435478 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6af8218-c7bd-4b24-85c9-aa9e84b137f0" containerName="extract-utilities" Dec 01 09:55:54 crc kubenswrapper[5004]: E1201 09:55:54.435502 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6af8218-c7bd-4b24-85c9-aa9e84b137f0" containerName="registry-server" Dec 01 09:55:54 crc kubenswrapper[5004]: I1201 09:55:54.435508 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6af8218-c7bd-4b24-85c9-aa9e84b137f0" containerName="registry-server" Dec 01 09:55:54 crc kubenswrapper[5004]: E1201 09:55:54.435519 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6af8218-c7bd-4b24-85c9-aa9e84b137f0" containerName="extract-content" Dec 01 09:55:54 crc kubenswrapper[5004]: I1201 09:55:54.435526 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6af8218-c7bd-4b24-85c9-aa9e84b137f0" containerName="extract-content" Dec 01 09:55:54 crc kubenswrapper[5004]: I1201 09:55:54.435877 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6af8218-c7bd-4b24-85c9-aa9e84b137f0" containerName="registry-server" Dec 01 09:55:54 crc kubenswrapper[5004]: I1201 09:55:54.438752 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-kgblg" Dec 01 09:55:54 crc kubenswrapper[5004]: I1201 09:55:54.467416 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kgblg"] Dec 01 09:55:54 crc kubenswrapper[5004]: I1201 09:55:54.480947 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5l6d\" (UniqueName: \"kubernetes.io/projected/559c3ffa-1643-458b-8ad7-8a8edff6a89b-kube-api-access-n5l6d\") pod \"certified-operators-kgblg\" (UID: \"559c3ffa-1643-458b-8ad7-8a8edff6a89b\") " pod="openshift-marketplace/certified-operators-kgblg" Dec 01 09:55:54 crc kubenswrapper[5004]: I1201 09:55:54.481024 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/559c3ffa-1643-458b-8ad7-8a8edff6a89b-utilities\") pod \"certified-operators-kgblg\" (UID: \"559c3ffa-1643-458b-8ad7-8a8edff6a89b\") " pod="openshift-marketplace/certified-operators-kgblg" Dec 01 09:55:54 crc kubenswrapper[5004]: I1201 09:55:54.481227 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/559c3ffa-1643-458b-8ad7-8a8edff6a89b-catalog-content\") pod \"certified-operators-kgblg\" (UID: \"559c3ffa-1643-458b-8ad7-8a8edff6a89b\") " pod="openshift-marketplace/certified-operators-kgblg" Dec 01 09:55:54 crc kubenswrapper[5004]: I1201 09:55:54.584964 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5l6d\" (UniqueName: \"kubernetes.io/projected/559c3ffa-1643-458b-8ad7-8a8edff6a89b-kube-api-access-n5l6d\") pod \"certified-operators-kgblg\" (UID: \"559c3ffa-1643-458b-8ad7-8a8edff6a89b\") " pod="openshift-marketplace/certified-operators-kgblg" Dec 01 09:55:54 crc kubenswrapper[5004]: I1201 09:55:54.585061 5004 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/559c3ffa-1643-458b-8ad7-8a8edff6a89b-utilities\") pod \"certified-operators-kgblg\" (UID: \"559c3ffa-1643-458b-8ad7-8a8edff6a89b\") " pod="openshift-marketplace/certified-operators-kgblg" Dec 01 09:55:54 crc kubenswrapper[5004]: I1201 09:55:54.585438 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/559c3ffa-1643-458b-8ad7-8a8edff6a89b-catalog-content\") pod \"certified-operators-kgblg\" (UID: \"559c3ffa-1643-458b-8ad7-8a8edff6a89b\") " pod="openshift-marketplace/certified-operators-kgblg" Dec 01 09:55:54 crc kubenswrapper[5004]: I1201 09:55:54.586104 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/559c3ffa-1643-458b-8ad7-8a8edff6a89b-utilities\") pod \"certified-operators-kgblg\" (UID: \"559c3ffa-1643-458b-8ad7-8a8edff6a89b\") " pod="openshift-marketplace/certified-operators-kgblg" Dec 01 09:55:54 crc kubenswrapper[5004]: I1201 09:55:54.586225 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/559c3ffa-1643-458b-8ad7-8a8edff6a89b-catalog-content\") pod \"certified-operators-kgblg\" (UID: \"559c3ffa-1643-458b-8ad7-8a8edff6a89b\") " pod="openshift-marketplace/certified-operators-kgblg" Dec 01 09:55:54 crc kubenswrapper[5004]: I1201 09:55:54.606452 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5l6d\" (UniqueName: \"kubernetes.io/projected/559c3ffa-1643-458b-8ad7-8a8edff6a89b-kube-api-access-n5l6d\") pod \"certified-operators-kgblg\" (UID: \"559c3ffa-1643-458b-8ad7-8a8edff6a89b\") " pod="openshift-marketplace/certified-operators-kgblg" Dec 01 09:55:54 crc kubenswrapper[5004]: I1201 09:55:54.772595 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-kgblg" Dec 01 09:55:55 crc kubenswrapper[5004]: I1201 09:55:55.306098 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kgblg"] Dec 01 09:55:56 crc kubenswrapper[5004]: I1201 09:55:56.037553 5004 generic.go:334] "Generic (PLEG): container finished" podID="559c3ffa-1643-458b-8ad7-8a8edff6a89b" containerID="031a8d8adefd6e8bd42c084e406d44855109d79bb9fbec29a3c74aee10dfd6bc" exitCode=0 Dec 01 09:55:56 crc kubenswrapper[5004]: I1201 09:55:56.037955 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kgblg" event={"ID":"559c3ffa-1643-458b-8ad7-8a8edff6a89b","Type":"ContainerDied","Data":"031a8d8adefd6e8bd42c084e406d44855109d79bb9fbec29a3c74aee10dfd6bc"} Dec 01 09:55:56 crc kubenswrapper[5004]: I1201 09:55:56.037988 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kgblg" event={"ID":"559c3ffa-1643-458b-8ad7-8a8edff6a89b","Type":"ContainerStarted","Data":"97f2509414d1214d55eff5d1fa9765ff4da42d226ecd9fed83f9ca5327f177f9"} Dec 01 09:55:57 crc kubenswrapper[5004]: I1201 09:55:57.051093 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kgblg" event={"ID":"559c3ffa-1643-458b-8ad7-8a8edff6a89b","Type":"ContainerStarted","Data":"205989fb2b17ce3c25f3fd268fc57dae7e87878503c5ea7598dfcac7fdb3b3ed"} Dec 01 09:55:59 crc kubenswrapper[5004]: I1201 09:55:59.082051 5004 generic.go:334] "Generic (PLEG): container finished" podID="559c3ffa-1643-458b-8ad7-8a8edff6a89b" containerID="205989fb2b17ce3c25f3fd268fc57dae7e87878503c5ea7598dfcac7fdb3b3ed" exitCode=0 Dec 01 09:55:59 crc kubenswrapper[5004]: I1201 09:55:59.082127 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kgblg" 
event={"ID":"559c3ffa-1643-458b-8ad7-8a8edff6a89b","Type":"ContainerDied","Data":"205989fb2b17ce3c25f3fd268fc57dae7e87878503c5ea7598dfcac7fdb3b3ed"} Dec 01 09:56:00 crc kubenswrapper[5004]: I1201 09:56:00.096316 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kgblg" event={"ID":"559c3ffa-1643-458b-8ad7-8a8edff6a89b","Type":"ContainerStarted","Data":"2185d65e0aa6f83ea0b5b04acf7dc11d59fc68b28e474719c183a4b508ce36e5"} Dec 01 09:56:00 crc kubenswrapper[5004]: I1201 09:56:00.126783 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-kgblg" podStartSLOduration=2.63283896 podStartE2EDuration="6.126762797s" podCreationTimestamp="2025-12-01 09:55:54 +0000 UTC" firstStartedPulling="2025-12-01 09:55:56.041366701 +0000 UTC m=+5933.606358683" lastFinishedPulling="2025-12-01 09:55:59.535290548 +0000 UTC m=+5937.100282520" observedRunningTime="2025-12-01 09:56:00.117523732 +0000 UTC m=+5937.682515714" watchObservedRunningTime="2025-12-01 09:56:00.126762797 +0000 UTC m=+5937.691754779" Dec 01 09:56:04 crc kubenswrapper[5004]: I1201 09:56:04.774869 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-kgblg" Dec 01 09:56:04 crc kubenswrapper[5004]: I1201 09:56:04.775479 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-kgblg" Dec 01 09:56:04 crc kubenswrapper[5004]: I1201 09:56:04.820300 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-kgblg" Dec 01 09:56:05 crc kubenswrapper[5004]: I1201 09:56:05.192866 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-kgblg" Dec 01 09:56:05 crc kubenswrapper[5004]: I1201 09:56:05.242812 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-kgblg"] Dec 01 09:56:07 crc kubenswrapper[5004]: I1201 09:56:07.166620 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-kgblg" podUID="559c3ffa-1643-458b-8ad7-8a8edff6a89b" containerName="registry-server" containerID="cri-o://2185d65e0aa6f83ea0b5b04acf7dc11d59fc68b28e474719c183a4b508ce36e5" gracePeriod=2 Dec 01 09:56:07 crc kubenswrapper[5004]: I1201 09:56:07.691385 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kgblg" Dec 01 09:56:07 crc kubenswrapper[5004]: I1201 09:56:07.839503 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/559c3ffa-1643-458b-8ad7-8a8edff6a89b-utilities\") pod \"559c3ffa-1643-458b-8ad7-8a8edff6a89b\" (UID: \"559c3ffa-1643-458b-8ad7-8a8edff6a89b\") " Dec 01 09:56:07 crc kubenswrapper[5004]: I1201 09:56:07.840137 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/559c3ffa-1643-458b-8ad7-8a8edff6a89b-catalog-content\") pod \"559c3ffa-1643-458b-8ad7-8a8edff6a89b\" (UID: \"559c3ffa-1643-458b-8ad7-8a8edff6a89b\") " Dec 01 09:56:07 crc kubenswrapper[5004]: I1201 09:56:07.840314 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n5l6d\" (UniqueName: \"kubernetes.io/projected/559c3ffa-1643-458b-8ad7-8a8edff6a89b-kube-api-access-n5l6d\") pod \"559c3ffa-1643-458b-8ad7-8a8edff6a89b\" (UID: \"559c3ffa-1643-458b-8ad7-8a8edff6a89b\") " Dec 01 09:56:07 crc kubenswrapper[5004]: I1201 09:56:07.840476 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/559c3ffa-1643-458b-8ad7-8a8edff6a89b-utilities" (OuterVolumeSpecName: "utilities") pod "559c3ffa-1643-458b-8ad7-8a8edff6a89b" (UID: 
"559c3ffa-1643-458b-8ad7-8a8edff6a89b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:56:07 crc kubenswrapper[5004]: I1201 09:56:07.842992 5004 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/559c3ffa-1643-458b-8ad7-8a8edff6a89b-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 09:56:07 crc kubenswrapper[5004]: I1201 09:56:07.846044 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/559c3ffa-1643-458b-8ad7-8a8edff6a89b-kube-api-access-n5l6d" (OuterVolumeSpecName: "kube-api-access-n5l6d") pod "559c3ffa-1643-458b-8ad7-8a8edff6a89b" (UID: "559c3ffa-1643-458b-8ad7-8a8edff6a89b"). InnerVolumeSpecName "kube-api-access-n5l6d". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:56:07 crc kubenswrapper[5004]: I1201 09:56:07.884628 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/559c3ffa-1643-458b-8ad7-8a8edff6a89b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "559c3ffa-1643-458b-8ad7-8a8edff6a89b" (UID: "559c3ffa-1643-458b-8ad7-8a8edff6a89b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:56:07 crc kubenswrapper[5004]: I1201 09:56:07.944675 5004 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/559c3ffa-1643-458b-8ad7-8a8edff6a89b-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 09:56:07 crc kubenswrapper[5004]: I1201 09:56:07.944705 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n5l6d\" (UniqueName: \"kubernetes.io/projected/559c3ffa-1643-458b-8ad7-8a8edff6a89b-kube-api-access-n5l6d\") on node \"crc\" DevicePath \"\"" Dec 01 09:56:08 crc kubenswrapper[5004]: I1201 09:56:08.181412 5004 generic.go:334] "Generic (PLEG): container finished" podID="559c3ffa-1643-458b-8ad7-8a8edff6a89b" containerID="2185d65e0aa6f83ea0b5b04acf7dc11d59fc68b28e474719c183a4b508ce36e5" exitCode=0 Dec 01 09:56:08 crc kubenswrapper[5004]: I1201 09:56:08.181481 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kgblg" event={"ID":"559c3ffa-1643-458b-8ad7-8a8edff6a89b","Type":"ContainerDied","Data":"2185d65e0aa6f83ea0b5b04acf7dc11d59fc68b28e474719c183a4b508ce36e5"} Dec 01 09:56:08 crc kubenswrapper[5004]: I1201 09:56:08.181843 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kgblg" event={"ID":"559c3ffa-1643-458b-8ad7-8a8edff6a89b","Type":"ContainerDied","Data":"97f2509414d1214d55eff5d1fa9765ff4da42d226ecd9fed83f9ca5327f177f9"} Dec 01 09:56:08 crc kubenswrapper[5004]: I1201 09:56:08.181869 5004 scope.go:117] "RemoveContainer" containerID="2185d65e0aa6f83ea0b5b04acf7dc11d59fc68b28e474719c183a4b508ce36e5" Dec 01 09:56:08 crc kubenswrapper[5004]: I1201 09:56:08.181503 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-kgblg" Dec 01 09:56:08 crc kubenswrapper[5004]: I1201 09:56:08.210459 5004 scope.go:117] "RemoveContainer" containerID="205989fb2b17ce3c25f3fd268fc57dae7e87878503c5ea7598dfcac7fdb3b3ed" Dec 01 09:56:08 crc kubenswrapper[5004]: I1201 09:56:08.220925 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kgblg"] Dec 01 09:56:08 crc kubenswrapper[5004]: I1201 09:56:08.231875 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-kgblg"] Dec 01 09:56:08 crc kubenswrapper[5004]: I1201 09:56:08.247751 5004 scope.go:117] "RemoveContainer" containerID="031a8d8adefd6e8bd42c084e406d44855109d79bb9fbec29a3c74aee10dfd6bc" Dec 01 09:56:08 crc kubenswrapper[5004]: I1201 09:56:08.306349 5004 scope.go:117] "RemoveContainer" containerID="2185d65e0aa6f83ea0b5b04acf7dc11d59fc68b28e474719c183a4b508ce36e5" Dec 01 09:56:08 crc kubenswrapper[5004]: E1201 09:56:08.306928 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2185d65e0aa6f83ea0b5b04acf7dc11d59fc68b28e474719c183a4b508ce36e5\": container with ID starting with 2185d65e0aa6f83ea0b5b04acf7dc11d59fc68b28e474719c183a4b508ce36e5 not found: ID does not exist" containerID="2185d65e0aa6f83ea0b5b04acf7dc11d59fc68b28e474719c183a4b508ce36e5" Dec 01 09:56:08 crc kubenswrapper[5004]: I1201 09:56:08.306982 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2185d65e0aa6f83ea0b5b04acf7dc11d59fc68b28e474719c183a4b508ce36e5"} err="failed to get container status \"2185d65e0aa6f83ea0b5b04acf7dc11d59fc68b28e474719c183a4b508ce36e5\": rpc error: code = NotFound desc = could not find container \"2185d65e0aa6f83ea0b5b04acf7dc11d59fc68b28e474719c183a4b508ce36e5\": container with ID starting with 2185d65e0aa6f83ea0b5b04acf7dc11d59fc68b28e474719c183a4b508ce36e5 not 
found: ID does not exist" Dec 01 09:56:08 crc kubenswrapper[5004]: I1201 09:56:08.307066 5004 scope.go:117] "RemoveContainer" containerID="205989fb2b17ce3c25f3fd268fc57dae7e87878503c5ea7598dfcac7fdb3b3ed" Dec 01 09:56:08 crc kubenswrapper[5004]: E1201 09:56:08.307417 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"205989fb2b17ce3c25f3fd268fc57dae7e87878503c5ea7598dfcac7fdb3b3ed\": container with ID starting with 205989fb2b17ce3c25f3fd268fc57dae7e87878503c5ea7598dfcac7fdb3b3ed not found: ID does not exist" containerID="205989fb2b17ce3c25f3fd268fc57dae7e87878503c5ea7598dfcac7fdb3b3ed" Dec 01 09:56:08 crc kubenswrapper[5004]: I1201 09:56:08.307452 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"205989fb2b17ce3c25f3fd268fc57dae7e87878503c5ea7598dfcac7fdb3b3ed"} err="failed to get container status \"205989fb2b17ce3c25f3fd268fc57dae7e87878503c5ea7598dfcac7fdb3b3ed\": rpc error: code = NotFound desc = could not find container \"205989fb2b17ce3c25f3fd268fc57dae7e87878503c5ea7598dfcac7fdb3b3ed\": container with ID starting with 205989fb2b17ce3c25f3fd268fc57dae7e87878503c5ea7598dfcac7fdb3b3ed not found: ID does not exist" Dec 01 09:56:08 crc kubenswrapper[5004]: I1201 09:56:08.307472 5004 scope.go:117] "RemoveContainer" containerID="031a8d8adefd6e8bd42c084e406d44855109d79bb9fbec29a3c74aee10dfd6bc" Dec 01 09:56:08 crc kubenswrapper[5004]: E1201 09:56:08.307966 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"031a8d8adefd6e8bd42c084e406d44855109d79bb9fbec29a3c74aee10dfd6bc\": container with ID starting with 031a8d8adefd6e8bd42c084e406d44855109d79bb9fbec29a3c74aee10dfd6bc not found: ID does not exist" containerID="031a8d8adefd6e8bd42c084e406d44855109d79bb9fbec29a3c74aee10dfd6bc" Dec 01 09:56:08 crc kubenswrapper[5004]: I1201 09:56:08.308014 5004 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"031a8d8adefd6e8bd42c084e406d44855109d79bb9fbec29a3c74aee10dfd6bc"} err="failed to get container status \"031a8d8adefd6e8bd42c084e406d44855109d79bb9fbec29a3c74aee10dfd6bc\": rpc error: code = NotFound desc = could not find container \"031a8d8adefd6e8bd42c084e406d44855109d79bb9fbec29a3c74aee10dfd6bc\": container with ID starting with 031a8d8adefd6e8bd42c084e406d44855109d79bb9fbec29a3c74aee10dfd6bc not found: ID does not exist" Dec 01 09:56:08 crc kubenswrapper[5004]: I1201 09:56:08.730129 5004 patch_prober.go:28] interesting pod/machine-config-daemon-fvdgt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 09:56:08 crc kubenswrapper[5004]: I1201 09:56:08.730222 5004 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 09:56:08 crc kubenswrapper[5004]: I1201 09:56:08.773997 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="559c3ffa-1643-458b-8ad7-8a8edff6a89b" path="/var/lib/kubelet/pods/559c3ffa-1643-458b-8ad7-8a8edff6a89b/volumes" Dec 01 09:56:38 crc kubenswrapper[5004]: I1201 09:56:38.729160 5004 patch_prober.go:28] interesting pod/machine-config-daemon-fvdgt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 09:56:38 crc kubenswrapper[5004]: I1201 09:56:38.729759 5004 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 09:56:38 crc kubenswrapper[5004]: I1201 09:56:38.729811 5004 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" Dec 01 09:56:38 crc kubenswrapper[5004]: I1201 09:56:38.730722 5004 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1888096bc7eea8f4ed362bde431fc15761a89c0c03d561f41b82ce51165e9213"} pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 09:56:38 crc kubenswrapper[5004]: I1201 09:56:38.730777 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" containerName="machine-config-daemon" containerID="cri-o://1888096bc7eea8f4ed362bde431fc15761a89c0c03d561f41b82ce51165e9213" gracePeriod=600 Dec 01 09:56:38 crc kubenswrapper[5004]: E1201 09:56:38.857393 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 09:56:39 crc kubenswrapper[5004]: I1201 09:56:39.554817 5004 generic.go:334] "Generic (PLEG): container finished" podID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" 
containerID="1888096bc7eea8f4ed362bde431fc15761a89c0c03d561f41b82ce51165e9213" exitCode=0 Dec 01 09:56:39 crc kubenswrapper[5004]: I1201 09:56:39.554891 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" event={"ID":"f9977ebb-82de-4e96-8763-0b5a84f8d4ce","Type":"ContainerDied","Data":"1888096bc7eea8f4ed362bde431fc15761a89c0c03d561f41b82ce51165e9213"} Dec 01 09:56:39 crc kubenswrapper[5004]: I1201 09:56:39.555174 5004 scope.go:117] "RemoveContainer" containerID="927d6dc47244ce32c9ccee70e9761b31e1106016f419ef6a20e9de07d977672d" Dec 01 09:56:39 crc kubenswrapper[5004]: I1201 09:56:39.555893 5004 scope.go:117] "RemoveContainer" containerID="1888096bc7eea8f4ed362bde431fc15761a89c0c03d561f41b82ce51165e9213" Dec 01 09:56:39 crc kubenswrapper[5004]: E1201 09:56:39.556180 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 09:56:53 crc kubenswrapper[5004]: I1201 09:56:53.758684 5004 scope.go:117] "RemoveContainer" containerID="1888096bc7eea8f4ed362bde431fc15761a89c0c03d561f41b82ce51165e9213" Dec 01 09:56:53 crc kubenswrapper[5004]: E1201 09:56:53.759603 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 09:57:04 crc kubenswrapper[5004]: I1201 
09:57:04.759297 5004 scope.go:117] "RemoveContainer" containerID="1888096bc7eea8f4ed362bde431fc15761a89c0c03d561f41b82ce51165e9213" Dec 01 09:57:04 crc kubenswrapper[5004]: E1201 09:57:04.760091 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 09:57:07 crc kubenswrapper[5004]: I1201 09:57:07.556433 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-qd4xw"] Dec 01 09:57:07 crc kubenswrapper[5004]: E1201 09:57:07.557899 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="559c3ffa-1643-458b-8ad7-8a8edff6a89b" containerName="extract-utilities" Dec 01 09:57:07 crc kubenswrapper[5004]: I1201 09:57:07.557951 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="559c3ffa-1643-458b-8ad7-8a8edff6a89b" containerName="extract-utilities" Dec 01 09:57:07 crc kubenswrapper[5004]: E1201 09:57:07.557994 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="559c3ffa-1643-458b-8ad7-8a8edff6a89b" containerName="extract-content" Dec 01 09:57:07 crc kubenswrapper[5004]: I1201 09:57:07.558032 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="559c3ffa-1643-458b-8ad7-8a8edff6a89b" containerName="extract-content" Dec 01 09:57:07 crc kubenswrapper[5004]: E1201 09:57:07.558055 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="559c3ffa-1643-458b-8ad7-8a8edff6a89b" containerName="registry-server" Dec 01 09:57:07 crc kubenswrapper[5004]: I1201 09:57:07.558063 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="559c3ffa-1643-458b-8ad7-8a8edff6a89b" containerName="registry-server" Dec 01 
09:57:07 crc kubenswrapper[5004]: I1201 09:57:07.558613 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="559c3ffa-1643-458b-8ad7-8a8edff6a89b" containerName="registry-server" Dec 01 09:57:07 crc kubenswrapper[5004]: I1201 09:57:07.560863 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qd4xw" Dec 01 09:57:07 crc kubenswrapper[5004]: I1201 09:57:07.595911 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qd4xw"] Dec 01 09:57:07 crc kubenswrapper[5004]: I1201 09:57:07.664218 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/59dbb649-9136-4173-9384-519af6598038-utilities\") pod \"community-operators-qd4xw\" (UID: \"59dbb649-9136-4173-9384-519af6598038\") " pod="openshift-marketplace/community-operators-qd4xw" Dec 01 09:57:07 crc kubenswrapper[5004]: I1201 09:57:07.664380 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/59dbb649-9136-4173-9384-519af6598038-catalog-content\") pod \"community-operators-qd4xw\" (UID: \"59dbb649-9136-4173-9384-519af6598038\") " pod="openshift-marketplace/community-operators-qd4xw" Dec 01 09:57:07 crc kubenswrapper[5004]: I1201 09:57:07.664417 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26gpw\" (UniqueName: \"kubernetes.io/projected/59dbb649-9136-4173-9384-519af6598038-kube-api-access-26gpw\") pod \"community-operators-qd4xw\" (UID: \"59dbb649-9136-4173-9384-519af6598038\") " pod="openshift-marketplace/community-operators-qd4xw" Dec 01 09:57:07 crc kubenswrapper[5004]: I1201 09:57:07.767013 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/59dbb649-9136-4173-9384-519af6598038-utilities\") pod \"community-operators-qd4xw\" (UID: \"59dbb649-9136-4173-9384-519af6598038\") " pod="openshift-marketplace/community-operators-qd4xw" Dec 01 09:57:07 crc kubenswrapper[5004]: I1201 09:57:07.767160 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/59dbb649-9136-4173-9384-519af6598038-catalog-content\") pod \"community-operators-qd4xw\" (UID: \"59dbb649-9136-4173-9384-519af6598038\") " pod="openshift-marketplace/community-operators-qd4xw" Dec 01 09:57:07 crc kubenswrapper[5004]: I1201 09:57:07.767192 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26gpw\" (UniqueName: \"kubernetes.io/projected/59dbb649-9136-4173-9384-519af6598038-kube-api-access-26gpw\") pod \"community-operators-qd4xw\" (UID: \"59dbb649-9136-4173-9384-519af6598038\") " pod="openshift-marketplace/community-operators-qd4xw" Dec 01 09:57:07 crc kubenswrapper[5004]: I1201 09:57:07.767941 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/59dbb649-9136-4173-9384-519af6598038-utilities\") pod \"community-operators-qd4xw\" (UID: \"59dbb649-9136-4173-9384-519af6598038\") " pod="openshift-marketplace/community-operators-qd4xw" Dec 01 09:57:07 crc kubenswrapper[5004]: I1201 09:57:07.768061 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/59dbb649-9136-4173-9384-519af6598038-catalog-content\") pod \"community-operators-qd4xw\" (UID: \"59dbb649-9136-4173-9384-519af6598038\") " pod="openshift-marketplace/community-operators-qd4xw" Dec 01 09:57:07 crc kubenswrapper[5004]: I1201 09:57:07.785649 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26gpw\" (UniqueName: 
\"kubernetes.io/projected/59dbb649-9136-4173-9384-519af6598038-kube-api-access-26gpw\") pod \"community-operators-qd4xw\" (UID: \"59dbb649-9136-4173-9384-519af6598038\") " pod="openshift-marketplace/community-operators-qd4xw" Dec 01 09:57:07 crc kubenswrapper[5004]: I1201 09:57:07.925267 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qd4xw" Dec 01 09:57:08 crc kubenswrapper[5004]: I1201 09:57:08.444392 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qd4xw"] Dec 01 09:57:08 crc kubenswrapper[5004]: I1201 09:57:08.878862 5004 generic.go:334] "Generic (PLEG): container finished" podID="59dbb649-9136-4173-9384-519af6598038" containerID="bb1173179f3ed930b14f023e529de9e170cbbbb8dcec992da0b8a840cc147555" exitCode=0 Dec 01 09:57:08 crc kubenswrapper[5004]: I1201 09:57:08.879251 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qd4xw" event={"ID":"59dbb649-9136-4173-9384-519af6598038","Type":"ContainerDied","Data":"bb1173179f3ed930b14f023e529de9e170cbbbb8dcec992da0b8a840cc147555"} Dec 01 09:57:08 crc kubenswrapper[5004]: I1201 09:57:08.879300 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qd4xw" event={"ID":"59dbb649-9136-4173-9384-519af6598038","Type":"ContainerStarted","Data":"a3ba79940fa9b05a6d429d2a1c4d4c9ae2f66b35f6004f0e21d36a4d25526dbb"} Dec 01 09:57:10 crc kubenswrapper[5004]: I1201 09:57:10.900657 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qd4xw" event={"ID":"59dbb649-9136-4173-9384-519af6598038","Type":"ContainerStarted","Data":"2dd418081f348e3482ac36172056036d676d78a0bcb0ab3053008e2859603e01"} Dec 01 09:57:11 crc kubenswrapper[5004]: I1201 09:57:11.922044 5004 generic.go:334] "Generic (PLEG): container finished" podID="59dbb649-9136-4173-9384-519af6598038" 
containerID="2dd418081f348e3482ac36172056036d676d78a0bcb0ab3053008e2859603e01" exitCode=0 Dec 01 09:57:11 crc kubenswrapper[5004]: I1201 09:57:11.922365 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qd4xw" event={"ID":"59dbb649-9136-4173-9384-519af6598038","Type":"ContainerDied","Data":"2dd418081f348e3482ac36172056036d676d78a0bcb0ab3053008e2859603e01"} Dec 01 09:57:12 crc kubenswrapper[5004]: I1201 09:57:12.934248 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qd4xw" event={"ID":"59dbb649-9136-4173-9384-519af6598038","Type":"ContainerStarted","Data":"5ac1300747cbc2d623fec603c5b36b087bf8b3cd19d628af5865314b606b8063"} Dec 01 09:57:12 crc kubenswrapper[5004]: I1201 09:57:12.959873 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-qd4xw" podStartSLOduration=2.472178943 podStartE2EDuration="5.959854749s" podCreationTimestamp="2025-12-01 09:57:07 +0000 UTC" firstStartedPulling="2025-12-01 09:57:08.882402796 +0000 UTC m=+6006.447394818" lastFinishedPulling="2025-12-01 09:57:12.370078642 +0000 UTC m=+6009.935070624" observedRunningTime="2025-12-01 09:57:12.949243952 +0000 UTC m=+6010.514235944" watchObservedRunningTime="2025-12-01 09:57:12.959854749 +0000 UTC m=+6010.524846731" Dec 01 09:57:16 crc kubenswrapper[5004]: I1201 09:57:16.759550 5004 scope.go:117] "RemoveContainer" containerID="1888096bc7eea8f4ed362bde431fc15761a89c0c03d561f41b82ce51165e9213" Dec 01 09:57:16 crc kubenswrapper[5004]: E1201 09:57:16.760854 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" 
podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 09:57:17 crc kubenswrapper[5004]: I1201 09:57:17.925593 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-qd4xw" Dec 01 09:57:17 crc kubenswrapper[5004]: I1201 09:57:17.925925 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-qd4xw" Dec 01 09:57:17 crc kubenswrapper[5004]: I1201 09:57:17.977605 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-qd4xw" Dec 01 09:57:18 crc kubenswrapper[5004]: I1201 09:57:18.042137 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-qd4xw" Dec 01 09:57:18 crc kubenswrapper[5004]: I1201 09:57:18.220388 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qd4xw"] Dec 01 09:57:20 crc kubenswrapper[5004]: I1201 09:57:20.014895 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-qd4xw" podUID="59dbb649-9136-4173-9384-519af6598038" containerName="registry-server" containerID="cri-o://5ac1300747cbc2d623fec603c5b36b087bf8b3cd19d628af5865314b606b8063" gracePeriod=2 Dec 01 09:57:20 crc kubenswrapper[5004]: I1201 09:57:20.555074 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qd4xw" Dec 01 09:57:20 crc kubenswrapper[5004]: I1201 09:57:20.702676 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/59dbb649-9136-4173-9384-519af6598038-utilities\") pod \"59dbb649-9136-4173-9384-519af6598038\" (UID: \"59dbb649-9136-4173-9384-519af6598038\") " Dec 01 09:57:20 crc kubenswrapper[5004]: I1201 09:57:20.702816 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-26gpw\" (UniqueName: \"kubernetes.io/projected/59dbb649-9136-4173-9384-519af6598038-kube-api-access-26gpw\") pod \"59dbb649-9136-4173-9384-519af6598038\" (UID: \"59dbb649-9136-4173-9384-519af6598038\") " Dec 01 09:57:20 crc kubenswrapper[5004]: I1201 09:57:20.702967 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/59dbb649-9136-4173-9384-519af6598038-catalog-content\") pod \"59dbb649-9136-4173-9384-519af6598038\" (UID: \"59dbb649-9136-4173-9384-519af6598038\") " Dec 01 09:57:20 crc kubenswrapper[5004]: I1201 09:57:20.704241 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/59dbb649-9136-4173-9384-519af6598038-utilities" (OuterVolumeSpecName: "utilities") pod "59dbb649-9136-4173-9384-519af6598038" (UID: "59dbb649-9136-4173-9384-519af6598038"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:57:20 crc kubenswrapper[5004]: I1201 09:57:20.723823 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59dbb649-9136-4173-9384-519af6598038-kube-api-access-26gpw" (OuterVolumeSpecName: "kube-api-access-26gpw") pod "59dbb649-9136-4173-9384-519af6598038" (UID: "59dbb649-9136-4173-9384-519af6598038"). InnerVolumeSpecName "kube-api-access-26gpw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:57:20 crc kubenswrapper[5004]: I1201 09:57:20.785527 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/59dbb649-9136-4173-9384-519af6598038-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "59dbb649-9136-4173-9384-519af6598038" (UID: "59dbb649-9136-4173-9384-519af6598038"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:57:20 crc kubenswrapper[5004]: I1201 09:57:20.805985 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-26gpw\" (UniqueName: \"kubernetes.io/projected/59dbb649-9136-4173-9384-519af6598038-kube-api-access-26gpw\") on node \"crc\" DevicePath \"\"" Dec 01 09:57:20 crc kubenswrapper[5004]: I1201 09:57:20.806023 5004 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/59dbb649-9136-4173-9384-519af6598038-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 09:57:20 crc kubenswrapper[5004]: I1201 09:57:20.806035 5004 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/59dbb649-9136-4173-9384-519af6598038-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 09:57:21 crc kubenswrapper[5004]: I1201 09:57:21.028576 5004 generic.go:334] "Generic (PLEG): container finished" podID="59dbb649-9136-4173-9384-519af6598038" containerID="5ac1300747cbc2d623fec603c5b36b087bf8b3cd19d628af5865314b606b8063" exitCode=0 Dec 01 09:57:21 crc kubenswrapper[5004]: I1201 09:57:21.028653 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qd4xw" Dec 01 09:57:21 crc kubenswrapper[5004]: I1201 09:57:21.028681 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qd4xw" event={"ID":"59dbb649-9136-4173-9384-519af6598038","Type":"ContainerDied","Data":"5ac1300747cbc2d623fec603c5b36b087bf8b3cd19d628af5865314b606b8063"} Dec 01 09:57:21 crc kubenswrapper[5004]: I1201 09:57:21.029638 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qd4xw" event={"ID":"59dbb649-9136-4173-9384-519af6598038","Type":"ContainerDied","Data":"a3ba79940fa9b05a6d429d2a1c4d4c9ae2f66b35f6004f0e21d36a4d25526dbb"} Dec 01 09:57:21 crc kubenswrapper[5004]: I1201 09:57:21.029661 5004 scope.go:117] "RemoveContainer" containerID="5ac1300747cbc2d623fec603c5b36b087bf8b3cd19d628af5865314b606b8063" Dec 01 09:57:21 crc kubenswrapper[5004]: I1201 09:57:21.051301 5004 scope.go:117] "RemoveContainer" containerID="2dd418081f348e3482ac36172056036d676d78a0bcb0ab3053008e2859603e01" Dec 01 09:57:21 crc kubenswrapper[5004]: I1201 09:57:21.077270 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qd4xw"] Dec 01 09:57:21 crc kubenswrapper[5004]: I1201 09:57:21.090880 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-qd4xw"] Dec 01 09:57:21 crc kubenswrapper[5004]: I1201 09:57:21.095550 5004 scope.go:117] "RemoveContainer" containerID="bb1173179f3ed930b14f023e529de9e170cbbbb8dcec992da0b8a840cc147555" Dec 01 09:57:21 crc kubenswrapper[5004]: I1201 09:57:21.147427 5004 scope.go:117] "RemoveContainer" containerID="5ac1300747cbc2d623fec603c5b36b087bf8b3cd19d628af5865314b606b8063" Dec 01 09:57:21 crc kubenswrapper[5004]: E1201 09:57:21.147904 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"5ac1300747cbc2d623fec603c5b36b087bf8b3cd19d628af5865314b606b8063\": container with ID starting with 5ac1300747cbc2d623fec603c5b36b087bf8b3cd19d628af5865314b606b8063 not found: ID does not exist" containerID="5ac1300747cbc2d623fec603c5b36b087bf8b3cd19d628af5865314b606b8063" Dec 01 09:57:21 crc kubenswrapper[5004]: I1201 09:57:21.147938 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ac1300747cbc2d623fec603c5b36b087bf8b3cd19d628af5865314b606b8063"} err="failed to get container status \"5ac1300747cbc2d623fec603c5b36b087bf8b3cd19d628af5865314b606b8063\": rpc error: code = NotFound desc = could not find container \"5ac1300747cbc2d623fec603c5b36b087bf8b3cd19d628af5865314b606b8063\": container with ID starting with 5ac1300747cbc2d623fec603c5b36b087bf8b3cd19d628af5865314b606b8063 not found: ID does not exist" Dec 01 09:57:21 crc kubenswrapper[5004]: I1201 09:57:21.147962 5004 scope.go:117] "RemoveContainer" containerID="2dd418081f348e3482ac36172056036d676d78a0bcb0ab3053008e2859603e01" Dec 01 09:57:21 crc kubenswrapper[5004]: E1201 09:57:21.148519 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2dd418081f348e3482ac36172056036d676d78a0bcb0ab3053008e2859603e01\": container with ID starting with 2dd418081f348e3482ac36172056036d676d78a0bcb0ab3053008e2859603e01 not found: ID does not exist" containerID="2dd418081f348e3482ac36172056036d676d78a0bcb0ab3053008e2859603e01" Dec 01 09:57:21 crc kubenswrapper[5004]: I1201 09:57:21.148591 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2dd418081f348e3482ac36172056036d676d78a0bcb0ab3053008e2859603e01"} err="failed to get container status \"2dd418081f348e3482ac36172056036d676d78a0bcb0ab3053008e2859603e01\": rpc error: code = NotFound desc = could not find container \"2dd418081f348e3482ac36172056036d676d78a0bcb0ab3053008e2859603e01\": container with ID 
starting with 2dd418081f348e3482ac36172056036d676d78a0bcb0ab3053008e2859603e01 not found: ID does not exist" Dec 01 09:57:21 crc kubenswrapper[5004]: I1201 09:57:21.148626 5004 scope.go:117] "RemoveContainer" containerID="bb1173179f3ed930b14f023e529de9e170cbbbb8dcec992da0b8a840cc147555" Dec 01 09:57:21 crc kubenswrapper[5004]: E1201 09:57:21.149036 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb1173179f3ed930b14f023e529de9e170cbbbb8dcec992da0b8a840cc147555\": container with ID starting with bb1173179f3ed930b14f023e529de9e170cbbbb8dcec992da0b8a840cc147555 not found: ID does not exist" containerID="bb1173179f3ed930b14f023e529de9e170cbbbb8dcec992da0b8a840cc147555" Dec 01 09:57:21 crc kubenswrapper[5004]: I1201 09:57:21.149143 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb1173179f3ed930b14f023e529de9e170cbbbb8dcec992da0b8a840cc147555"} err="failed to get container status \"bb1173179f3ed930b14f023e529de9e170cbbbb8dcec992da0b8a840cc147555\": rpc error: code = NotFound desc = could not find container \"bb1173179f3ed930b14f023e529de9e170cbbbb8dcec992da0b8a840cc147555\": container with ID starting with bb1173179f3ed930b14f023e529de9e170cbbbb8dcec992da0b8a840cc147555 not found: ID does not exist" Dec 01 09:57:22 crc kubenswrapper[5004]: I1201 09:57:22.773540 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59dbb649-9136-4173-9384-519af6598038" path="/var/lib/kubelet/pods/59dbb649-9136-4173-9384-519af6598038/volumes" Dec 01 09:57:27 crc kubenswrapper[5004]: I1201 09:57:27.759153 5004 scope.go:117] "RemoveContainer" containerID="1888096bc7eea8f4ed362bde431fc15761a89c0c03d561f41b82ce51165e9213" Dec 01 09:57:27 crc kubenswrapper[5004]: E1201 09:57:27.759814 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 09:57:41 crc kubenswrapper[5004]: I1201 09:57:41.759500 5004 scope.go:117] "RemoveContainer" containerID="1888096bc7eea8f4ed362bde431fc15761a89c0c03d561f41b82ce51165e9213" Dec 01 09:57:41 crc kubenswrapper[5004]: E1201 09:57:41.760528 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 09:57:56 crc kubenswrapper[5004]: I1201 09:57:56.759352 5004 scope.go:117] "RemoveContainer" containerID="1888096bc7eea8f4ed362bde431fc15761a89c0c03d561f41b82ce51165e9213" Dec 01 09:57:56 crc kubenswrapper[5004]: E1201 09:57:56.760496 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 09:58:11 crc kubenswrapper[5004]: I1201 09:58:11.759026 5004 scope.go:117] "RemoveContainer" containerID="1888096bc7eea8f4ed362bde431fc15761a89c0c03d561f41b82ce51165e9213" Dec 01 09:58:11 crc kubenswrapper[5004]: E1201 09:58:11.760081 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 09:58:25 crc kubenswrapper[5004]: I1201 09:58:25.759023 5004 scope.go:117] "RemoveContainer" containerID="1888096bc7eea8f4ed362bde431fc15761a89c0c03d561f41b82ce51165e9213" Dec 01 09:58:25 crc kubenswrapper[5004]: E1201 09:58:25.759834 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 09:58:38 crc kubenswrapper[5004]: I1201 09:58:38.760317 5004 scope.go:117] "RemoveContainer" containerID="1888096bc7eea8f4ed362bde431fc15761a89c0c03d561f41b82ce51165e9213" Dec 01 09:58:38 crc kubenswrapper[5004]: E1201 09:58:38.761172 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 09:58:49 crc kubenswrapper[5004]: I1201 09:58:49.763812 5004 scope.go:117] "RemoveContainer" containerID="1888096bc7eea8f4ed362bde431fc15761a89c0c03d561f41b82ce51165e9213" Dec 01 09:58:49 crc kubenswrapper[5004]: E1201 09:58:49.764615 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 09:58:57 crc kubenswrapper[5004]: I1201 09:58:57.788790 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-26jb6"] Dec 01 09:58:57 crc kubenswrapper[5004]: E1201 09:58:57.790385 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59dbb649-9136-4173-9384-519af6598038" containerName="extract-content" Dec 01 09:58:57 crc kubenswrapper[5004]: I1201 09:58:57.790402 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="59dbb649-9136-4173-9384-519af6598038" containerName="extract-content" Dec 01 09:58:57 crc kubenswrapper[5004]: E1201 09:58:57.790441 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59dbb649-9136-4173-9384-519af6598038" containerName="extract-utilities" Dec 01 09:58:57 crc kubenswrapper[5004]: I1201 09:58:57.790448 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="59dbb649-9136-4173-9384-519af6598038" containerName="extract-utilities" Dec 01 09:58:57 crc kubenswrapper[5004]: E1201 09:58:57.790460 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59dbb649-9136-4173-9384-519af6598038" containerName="registry-server" Dec 01 09:58:57 crc kubenswrapper[5004]: I1201 09:58:57.790467 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="59dbb649-9136-4173-9384-519af6598038" containerName="registry-server" Dec 01 09:58:57 crc kubenswrapper[5004]: I1201 09:58:57.790692 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="59dbb649-9136-4173-9384-519af6598038" containerName="registry-server" Dec 01 09:58:57 crc kubenswrapper[5004]: I1201 09:58:57.792535 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-26jb6" Dec 01 09:58:57 crc kubenswrapper[5004]: I1201 09:58:57.811860 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-26jb6"] Dec 01 09:58:57 crc kubenswrapper[5004]: I1201 09:58:57.926850 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4bbd9c08-0606-4fb0-a823-25389a69ef07-catalog-content\") pod \"redhat-marketplace-26jb6\" (UID: \"4bbd9c08-0606-4fb0-a823-25389a69ef07\") " pod="openshift-marketplace/redhat-marketplace-26jb6" Dec 01 09:58:57 crc kubenswrapper[5004]: I1201 09:58:57.927375 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fx84k\" (UniqueName: \"kubernetes.io/projected/4bbd9c08-0606-4fb0-a823-25389a69ef07-kube-api-access-fx84k\") pod \"redhat-marketplace-26jb6\" (UID: \"4bbd9c08-0606-4fb0-a823-25389a69ef07\") " pod="openshift-marketplace/redhat-marketplace-26jb6" Dec 01 09:58:57 crc kubenswrapper[5004]: I1201 09:58:57.927495 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4bbd9c08-0606-4fb0-a823-25389a69ef07-utilities\") pod \"redhat-marketplace-26jb6\" (UID: \"4bbd9c08-0606-4fb0-a823-25389a69ef07\") " pod="openshift-marketplace/redhat-marketplace-26jb6" Dec 01 09:58:58 crc kubenswrapper[5004]: I1201 09:58:58.030005 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fx84k\" (UniqueName: \"kubernetes.io/projected/4bbd9c08-0606-4fb0-a823-25389a69ef07-kube-api-access-fx84k\") pod \"redhat-marketplace-26jb6\" (UID: \"4bbd9c08-0606-4fb0-a823-25389a69ef07\") " pod="openshift-marketplace/redhat-marketplace-26jb6" Dec 01 09:58:58 crc kubenswrapper[5004]: I1201 09:58:58.030092 5004 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4bbd9c08-0606-4fb0-a823-25389a69ef07-utilities\") pod \"redhat-marketplace-26jb6\" (UID: \"4bbd9c08-0606-4fb0-a823-25389a69ef07\") " pod="openshift-marketplace/redhat-marketplace-26jb6" Dec 01 09:58:58 crc kubenswrapper[5004]: I1201 09:58:58.030128 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4bbd9c08-0606-4fb0-a823-25389a69ef07-catalog-content\") pod \"redhat-marketplace-26jb6\" (UID: \"4bbd9c08-0606-4fb0-a823-25389a69ef07\") " pod="openshift-marketplace/redhat-marketplace-26jb6" Dec 01 09:58:58 crc kubenswrapper[5004]: I1201 09:58:58.030692 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4bbd9c08-0606-4fb0-a823-25389a69ef07-catalog-content\") pod \"redhat-marketplace-26jb6\" (UID: \"4bbd9c08-0606-4fb0-a823-25389a69ef07\") " pod="openshift-marketplace/redhat-marketplace-26jb6" Dec 01 09:58:58 crc kubenswrapper[5004]: I1201 09:58:58.030965 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4bbd9c08-0606-4fb0-a823-25389a69ef07-utilities\") pod \"redhat-marketplace-26jb6\" (UID: \"4bbd9c08-0606-4fb0-a823-25389a69ef07\") " pod="openshift-marketplace/redhat-marketplace-26jb6" Dec 01 09:58:58 crc kubenswrapper[5004]: I1201 09:58:58.051902 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fx84k\" (UniqueName: \"kubernetes.io/projected/4bbd9c08-0606-4fb0-a823-25389a69ef07-kube-api-access-fx84k\") pod \"redhat-marketplace-26jb6\" (UID: \"4bbd9c08-0606-4fb0-a823-25389a69ef07\") " pod="openshift-marketplace/redhat-marketplace-26jb6" Dec 01 09:58:58 crc kubenswrapper[5004]: I1201 09:58:58.116532 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-26jb6" Dec 01 09:58:58 crc kubenswrapper[5004]: I1201 09:58:58.648821 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-26jb6"] Dec 01 09:58:59 crc kubenswrapper[5004]: I1201 09:58:59.170267 5004 generic.go:334] "Generic (PLEG): container finished" podID="4bbd9c08-0606-4fb0-a823-25389a69ef07" containerID="5cc91952c2fd488df200efaacde9ebacc1bd16cd7f1c7d68f79a649e5115d6eb" exitCode=0 Dec 01 09:58:59 crc kubenswrapper[5004]: I1201 09:58:59.170343 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-26jb6" event={"ID":"4bbd9c08-0606-4fb0-a823-25389a69ef07","Type":"ContainerDied","Data":"5cc91952c2fd488df200efaacde9ebacc1bd16cd7f1c7d68f79a649e5115d6eb"} Dec 01 09:58:59 crc kubenswrapper[5004]: I1201 09:58:59.170616 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-26jb6" event={"ID":"4bbd9c08-0606-4fb0-a823-25389a69ef07","Type":"ContainerStarted","Data":"e1e9bf40ebf6344085ace6848e54f7711c360ea8b39fc5d621e8d09828c6ac79"} Dec 01 09:59:00 crc kubenswrapper[5004]: I1201 09:59:00.182507 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-26jb6" event={"ID":"4bbd9c08-0606-4fb0-a823-25389a69ef07","Type":"ContainerStarted","Data":"8fcedaf28184956ced22995a96e1d99caa084a23e5157b52dec644671401c776"} Dec 01 09:59:00 crc kubenswrapper[5004]: I1201 09:59:00.761746 5004 scope.go:117] "RemoveContainer" containerID="1888096bc7eea8f4ed362bde431fc15761a89c0c03d561f41b82ce51165e9213" Dec 01 09:59:00 crc kubenswrapper[5004]: E1201 09:59:00.766169 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 09:59:01 crc kubenswrapper[5004]: I1201 09:59:01.195143 5004 generic.go:334] "Generic (PLEG): container finished" podID="4bbd9c08-0606-4fb0-a823-25389a69ef07" containerID="8fcedaf28184956ced22995a96e1d99caa084a23e5157b52dec644671401c776" exitCode=0 Dec 01 09:59:01 crc kubenswrapper[5004]: I1201 09:59:01.195195 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-26jb6" event={"ID":"4bbd9c08-0606-4fb0-a823-25389a69ef07","Type":"ContainerDied","Data":"8fcedaf28184956ced22995a96e1d99caa084a23e5157b52dec644671401c776"} Dec 01 09:59:02 crc kubenswrapper[5004]: I1201 09:59:02.206246 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-26jb6" event={"ID":"4bbd9c08-0606-4fb0-a823-25389a69ef07","Type":"ContainerStarted","Data":"4c1627398a8cbe7739fd147f49566bcd13fcfacd75ad8cbe9a6ec3e5bcfeef96"} Dec 01 09:59:02 crc kubenswrapper[5004]: I1201 09:59:02.233884 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-26jb6" podStartSLOduration=2.653601858 podStartE2EDuration="5.233864617s" podCreationTimestamp="2025-12-01 09:58:57 +0000 UTC" firstStartedPulling="2025-12-01 09:58:59.173260687 +0000 UTC m=+6116.738252669" lastFinishedPulling="2025-12-01 09:59:01.753523436 +0000 UTC m=+6119.318515428" observedRunningTime="2025-12-01 09:59:02.224948771 +0000 UTC m=+6119.789940753" watchObservedRunningTime="2025-12-01 09:59:02.233864617 +0000 UTC m=+6119.798856599" Dec 01 09:59:08 crc kubenswrapper[5004]: I1201 09:59:08.117306 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-26jb6" Dec 01 09:59:08 crc kubenswrapper[5004]: I1201 
09:59:08.117955 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-26jb6" Dec 01 09:59:08 crc kubenswrapper[5004]: I1201 09:59:08.170639 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-26jb6" Dec 01 09:59:08 crc kubenswrapper[5004]: I1201 09:59:08.325755 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-26jb6" Dec 01 09:59:08 crc kubenswrapper[5004]: I1201 09:59:08.975537 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-26jb6"] Dec 01 09:59:10 crc kubenswrapper[5004]: I1201 09:59:10.312639 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-26jb6" podUID="4bbd9c08-0606-4fb0-a823-25389a69ef07" containerName="registry-server" containerID="cri-o://4c1627398a8cbe7739fd147f49566bcd13fcfacd75ad8cbe9a6ec3e5bcfeef96" gracePeriod=2 Dec 01 09:59:10 crc kubenswrapper[5004]: I1201 09:59:10.831025 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-26jb6" Dec 01 09:59:10 crc kubenswrapper[5004]: I1201 09:59:10.944206 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4bbd9c08-0606-4fb0-a823-25389a69ef07-utilities\") pod \"4bbd9c08-0606-4fb0-a823-25389a69ef07\" (UID: \"4bbd9c08-0606-4fb0-a823-25389a69ef07\") " Dec 01 09:59:10 crc kubenswrapper[5004]: I1201 09:59:10.944650 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fx84k\" (UniqueName: \"kubernetes.io/projected/4bbd9c08-0606-4fb0-a823-25389a69ef07-kube-api-access-fx84k\") pod \"4bbd9c08-0606-4fb0-a823-25389a69ef07\" (UID: \"4bbd9c08-0606-4fb0-a823-25389a69ef07\") " Dec 01 09:59:10 crc kubenswrapper[5004]: I1201 09:59:10.944711 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4bbd9c08-0606-4fb0-a823-25389a69ef07-catalog-content\") pod \"4bbd9c08-0606-4fb0-a823-25389a69ef07\" (UID: \"4bbd9c08-0606-4fb0-a823-25389a69ef07\") " Dec 01 09:59:10 crc kubenswrapper[5004]: I1201 09:59:10.945245 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4bbd9c08-0606-4fb0-a823-25389a69ef07-utilities" (OuterVolumeSpecName: "utilities") pod "4bbd9c08-0606-4fb0-a823-25389a69ef07" (UID: "4bbd9c08-0606-4fb0-a823-25389a69ef07"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:59:10 crc kubenswrapper[5004]: I1201 09:59:10.945643 5004 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4bbd9c08-0606-4fb0-a823-25389a69ef07-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:10 crc kubenswrapper[5004]: I1201 09:59:10.953599 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bbd9c08-0606-4fb0-a823-25389a69ef07-kube-api-access-fx84k" (OuterVolumeSpecName: "kube-api-access-fx84k") pod "4bbd9c08-0606-4fb0-a823-25389a69ef07" (UID: "4bbd9c08-0606-4fb0-a823-25389a69ef07"). InnerVolumeSpecName "kube-api-access-fx84k". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:59:10 crc kubenswrapper[5004]: I1201 09:59:10.973310 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4bbd9c08-0606-4fb0-a823-25389a69ef07-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4bbd9c08-0606-4fb0-a823-25389a69ef07" (UID: "4bbd9c08-0606-4fb0-a823-25389a69ef07"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:59:11 crc kubenswrapper[5004]: I1201 09:59:11.047801 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fx84k\" (UniqueName: \"kubernetes.io/projected/4bbd9c08-0606-4fb0-a823-25389a69ef07-kube-api-access-fx84k\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:11 crc kubenswrapper[5004]: I1201 09:59:11.047853 5004 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4bbd9c08-0606-4fb0-a823-25389a69ef07-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:11 crc kubenswrapper[5004]: I1201 09:59:11.324898 5004 generic.go:334] "Generic (PLEG): container finished" podID="4bbd9c08-0606-4fb0-a823-25389a69ef07" containerID="4c1627398a8cbe7739fd147f49566bcd13fcfacd75ad8cbe9a6ec3e5bcfeef96" exitCode=0 Dec 01 09:59:11 crc kubenswrapper[5004]: I1201 09:59:11.324948 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-26jb6" Dec 01 09:59:11 crc kubenswrapper[5004]: I1201 09:59:11.324958 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-26jb6" event={"ID":"4bbd9c08-0606-4fb0-a823-25389a69ef07","Type":"ContainerDied","Data":"4c1627398a8cbe7739fd147f49566bcd13fcfacd75ad8cbe9a6ec3e5bcfeef96"} Dec 01 09:59:11 crc kubenswrapper[5004]: I1201 09:59:11.324985 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-26jb6" event={"ID":"4bbd9c08-0606-4fb0-a823-25389a69ef07","Type":"ContainerDied","Data":"e1e9bf40ebf6344085ace6848e54f7711c360ea8b39fc5d621e8d09828c6ac79"} Dec 01 09:59:11 crc kubenswrapper[5004]: I1201 09:59:11.325004 5004 scope.go:117] "RemoveContainer" containerID="4c1627398a8cbe7739fd147f49566bcd13fcfacd75ad8cbe9a6ec3e5bcfeef96" Dec 01 09:59:11 crc kubenswrapper[5004]: I1201 09:59:11.357601 5004 scope.go:117] "RemoveContainer" 
containerID="8fcedaf28184956ced22995a96e1d99caa084a23e5157b52dec644671401c776" Dec 01 09:59:11 crc kubenswrapper[5004]: I1201 09:59:11.360641 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-26jb6"] Dec 01 09:59:11 crc kubenswrapper[5004]: I1201 09:59:11.372092 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-26jb6"] Dec 01 09:59:11 crc kubenswrapper[5004]: I1201 09:59:11.386159 5004 scope.go:117] "RemoveContainer" containerID="5cc91952c2fd488df200efaacde9ebacc1bd16cd7f1c7d68f79a649e5115d6eb" Dec 01 09:59:11 crc kubenswrapper[5004]: I1201 09:59:11.431823 5004 scope.go:117] "RemoveContainer" containerID="4c1627398a8cbe7739fd147f49566bcd13fcfacd75ad8cbe9a6ec3e5bcfeef96" Dec 01 09:59:11 crc kubenswrapper[5004]: E1201 09:59:11.432340 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c1627398a8cbe7739fd147f49566bcd13fcfacd75ad8cbe9a6ec3e5bcfeef96\": container with ID starting with 4c1627398a8cbe7739fd147f49566bcd13fcfacd75ad8cbe9a6ec3e5bcfeef96 not found: ID does not exist" containerID="4c1627398a8cbe7739fd147f49566bcd13fcfacd75ad8cbe9a6ec3e5bcfeef96" Dec 01 09:59:11 crc kubenswrapper[5004]: I1201 09:59:11.432386 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c1627398a8cbe7739fd147f49566bcd13fcfacd75ad8cbe9a6ec3e5bcfeef96"} err="failed to get container status \"4c1627398a8cbe7739fd147f49566bcd13fcfacd75ad8cbe9a6ec3e5bcfeef96\": rpc error: code = NotFound desc = could not find container \"4c1627398a8cbe7739fd147f49566bcd13fcfacd75ad8cbe9a6ec3e5bcfeef96\": container with ID starting with 4c1627398a8cbe7739fd147f49566bcd13fcfacd75ad8cbe9a6ec3e5bcfeef96 not found: ID does not exist" Dec 01 09:59:11 crc kubenswrapper[5004]: I1201 09:59:11.432411 5004 scope.go:117] "RemoveContainer" 
containerID="8fcedaf28184956ced22995a96e1d99caa084a23e5157b52dec644671401c776" Dec 01 09:59:11 crc kubenswrapper[5004]: E1201 09:59:11.432824 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8fcedaf28184956ced22995a96e1d99caa084a23e5157b52dec644671401c776\": container with ID starting with 8fcedaf28184956ced22995a96e1d99caa084a23e5157b52dec644671401c776 not found: ID does not exist" containerID="8fcedaf28184956ced22995a96e1d99caa084a23e5157b52dec644671401c776" Dec 01 09:59:11 crc kubenswrapper[5004]: I1201 09:59:11.432884 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8fcedaf28184956ced22995a96e1d99caa084a23e5157b52dec644671401c776"} err="failed to get container status \"8fcedaf28184956ced22995a96e1d99caa084a23e5157b52dec644671401c776\": rpc error: code = NotFound desc = could not find container \"8fcedaf28184956ced22995a96e1d99caa084a23e5157b52dec644671401c776\": container with ID starting with 8fcedaf28184956ced22995a96e1d99caa084a23e5157b52dec644671401c776 not found: ID does not exist" Dec 01 09:59:11 crc kubenswrapper[5004]: I1201 09:59:11.432921 5004 scope.go:117] "RemoveContainer" containerID="5cc91952c2fd488df200efaacde9ebacc1bd16cd7f1c7d68f79a649e5115d6eb" Dec 01 09:59:11 crc kubenswrapper[5004]: E1201 09:59:11.433282 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5cc91952c2fd488df200efaacde9ebacc1bd16cd7f1c7d68f79a649e5115d6eb\": container with ID starting with 5cc91952c2fd488df200efaacde9ebacc1bd16cd7f1c7d68f79a649e5115d6eb not found: ID does not exist" containerID="5cc91952c2fd488df200efaacde9ebacc1bd16cd7f1c7d68f79a649e5115d6eb" Dec 01 09:59:11 crc kubenswrapper[5004]: I1201 09:59:11.433358 5004 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"5cc91952c2fd488df200efaacde9ebacc1bd16cd7f1c7d68f79a649e5115d6eb"} err="failed to get container status \"5cc91952c2fd488df200efaacde9ebacc1bd16cd7f1c7d68f79a649e5115d6eb\": rpc error: code = NotFound desc = could not find container \"5cc91952c2fd488df200efaacde9ebacc1bd16cd7f1c7d68f79a649e5115d6eb\": container with ID starting with 5cc91952c2fd488df200efaacde9ebacc1bd16cd7f1c7d68f79a649e5115d6eb not found: ID does not exist" Dec 01 09:59:12 crc kubenswrapper[5004]: I1201 09:59:12.774200 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bbd9c08-0606-4fb0-a823-25389a69ef07" path="/var/lib/kubelet/pods/4bbd9c08-0606-4fb0-a823-25389a69ef07/volumes" Dec 01 09:59:13 crc kubenswrapper[5004]: I1201 09:59:13.758760 5004 scope.go:117] "RemoveContainer" containerID="1888096bc7eea8f4ed362bde431fc15761a89c0c03d561f41b82ce51165e9213" Dec 01 09:59:13 crc kubenswrapper[5004]: E1201 09:59:13.759407 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 09:59:27 crc kubenswrapper[5004]: I1201 09:59:27.758893 5004 scope.go:117] "RemoveContainer" containerID="1888096bc7eea8f4ed362bde431fc15761a89c0c03d561f41b82ce51165e9213" Dec 01 09:59:27 crc kubenswrapper[5004]: E1201 09:59:27.759724 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 09:59:40 crc kubenswrapper[5004]: I1201 09:59:40.759437 5004 scope.go:117] "RemoveContainer" containerID="1888096bc7eea8f4ed362bde431fc15761a89c0c03d561f41b82ce51165e9213" Dec 01 09:59:40 crc kubenswrapper[5004]: E1201 09:59:40.760304 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 09:59:52 crc kubenswrapper[5004]: I1201 09:59:52.767083 5004 scope.go:117] "RemoveContainer" containerID="1888096bc7eea8f4ed362bde431fc15761a89c0c03d561f41b82ce51165e9213" Dec 01 09:59:52 crc kubenswrapper[5004]: E1201 09:59:52.767963 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 10:00:00 crc kubenswrapper[5004]: I1201 10:00:00.176488 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409720-4n5tp"] Dec 01 10:00:00 crc kubenswrapper[5004]: E1201 10:00:00.177780 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bbd9c08-0606-4fb0-a823-25389a69ef07" containerName="extract-content" Dec 01 10:00:00 crc kubenswrapper[5004]: I1201 10:00:00.177799 5004 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="4bbd9c08-0606-4fb0-a823-25389a69ef07" containerName="extract-content" Dec 01 10:00:00 crc kubenswrapper[5004]: E1201 10:00:00.177817 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bbd9c08-0606-4fb0-a823-25389a69ef07" containerName="extract-utilities" Dec 01 10:00:00 crc kubenswrapper[5004]: I1201 10:00:00.177826 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bbd9c08-0606-4fb0-a823-25389a69ef07" containerName="extract-utilities" Dec 01 10:00:00 crc kubenswrapper[5004]: E1201 10:00:00.177857 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bbd9c08-0606-4fb0-a823-25389a69ef07" containerName="registry-server" Dec 01 10:00:00 crc kubenswrapper[5004]: I1201 10:00:00.177863 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bbd9c08-0606-4fb0-a823-25389a69ef07" containerName="registry-server" Dec 01 10:00:00 crc kubenswrapper[5004]: I1201 10:00:00.178126 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="4bbd9c08-0606-4fb0-a823-25389a69ef07" containerName="registry-server" Dec 01 10:00:00 crc kubenswrapper[5004]: I1201 10:00:00.179012 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409720-4n5tp" Dec 01 10:00:00 crc kubenswrapper[5004]: I1201 10:00:00.186629 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 01 10:00:00 crc kubenswrapper[5004]: I1201 10:00:00.186766 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 01 10:00:00 crc kubenswrapper[5004]: I1201 10:00:00.191289 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409720-4n5tp"] Dec 01 10:00:00 crc kubenswrapper[5004]: I1201 10:00:00.271347 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d7c9e05f-94e0-47da-a57e-9d439949ae6e-secret-volume\") pod \"collect-profiles-29409720-4n5tp\" (UID: \"d7c9e05f-94e0-47da-a57e-9d439949ae6e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409720-4n5tp" Dec 01 10:00:00 crc kubenswrapper[5004]: I1201 10:00:00.271878 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d7c9e05f-94e0-47da-a57e-9d439949ae6e-config-volume\") pod \"collect-profiles-29409720-4n5tp\" (UID: \"d7c9e05f-94e0-47da-a57e-9d439949ae6e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409720-4n5tp" Dec 01 10:00:00 crc kubenswrapper[5004]: I1201 10:00:00.271925 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6d26b\" (UniqueName: \"kubernetes.io/projected/d7c9e05f-94e0-47da-a57e-9d439949ae6e-kube-api-access-6d26b\") pod \"collect-profiles-29409720-4n5tp\" (UID: \"d7c9e05f-94e0-47da-a57e-9d439949ae6e\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29409720-4n5tp" Dec 01 10:00:00 crc kubenswrapper[5004]: I1201 10:00:00.373741 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d7c9e05f-94e0-47da-a57e-9d439949ae6e-config-volume\") pod \"collect-profiles-29409720-4n5tp\" (UID: \"d7c9e05f-94e0-47da-a57e-9d439949ae6e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409720-4n5tp" Dec 01 10:00:00 crc kubenswrapper[5004]: I1201 10:00:00.373828 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6d26b\" (UniqueName: \"kubernetes.io/projected/d7c9e05f-94e0-47da-a57e-9d439949ae6e-kube-api-access-6d26b\") pod \"collect-profiles-29409720-4n5tp\" (UID: \"d7c9e05f-94e0-47da-a57e-9d439949ae6e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409720-4n5tp" Dec 01 10:00:00 crc kubenswrapper[5004]: I1201 10:00:00.373889 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d7c9e05f-94e0-47da-a57e-9d439949ae6e-secret-volume\") pod \"collect-profiles-29409720-4n5tp\" (UID: \"d7c9e05f-94e0-47da-a57e-9d439949ae6e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409720-4n5tp" Dec 01 10:00:00 crc kubenswrapper[5004]: I1201 10:00:00.374625 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d7c9e05f-94e0-47da-a57e-9d439949ae6e-config-volume\") pod \"collect-profiles-29409720-4n5tp\" (UID: \"d7c9e05f-94e0-47da-a57e-9d439949ae6e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409720-4n5tp" Dec 01 10:00:00 crc kubenswrapper[5004]: I1201 10:00:00.379636 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/d7c9e05f-94e0-47da-a57e-9d439949ae6e-secret-volume\") pod \"collect-profiles-29409720-4n5tp\" (UID: \"d7c9e05f-94e0-47da-a57e-9d439949ae6e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409720-4n5tp" Dec 01 10:00:00 crc kubenswrapper[5004]: I1201 10:00:00.391162 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6d26b\" (UniqueName: \"kubernetes.io/projected/d7c9e05f-94e0-47da-a57e-9d439949ae6e-kube-api-access-6d26b\") pod \"collect-profiles-29409720-4n5tp\" (UID: \"d7c9e05f-94e0-47da-a57e-9d439949ae6e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409720-4n5tp" Dec 01 10:00:00 crc kubenswrapper[5004]: I1201 10:00:00.513828 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409720-4n5tp" Dec 01 10:00:01 crc kubenswrapper[5004]: I1201 10:00:01.040148 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409720-4n5tp"] Dec 01 10:00:01 crc kubenswrapper[5004]: I1201 10:00:01.906753 5004 generic.go:334] "Generic (PLEG): container finished" podID="d7c9e05f-94e0-47da-a57e-9d439949ae6e" containerID="86d2ab2009b70bc0c06b8cba4419c75e6d0f4fda5f95fdf106ff8f4020723555" exitCode=0 Dec 01 10:00:01 crc kubenswrapper[5004]: I1201 10:00:01.906820 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409720-4n5tp" event={"ID":"d7c9e05f-94e0-47da-a57e-9d439949ae6e","Type":"ContainerDied","Data":"86d2ab2009b70bc0c06b8cba4419c75e6d0f4fda5f95fdf106ff8f4020723555"} Dec 01 10:00:01 crc kubenswrapper[5004]: I1201 10:00:01.907046 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409720-4n5tp" 
event={"ID":"d7c9e05f-94e0-47da-a57e-9d439949ae6e","Type":"ContainerStarted","Data":"e2cb30408409042fd93dd09c7a14310e48bb5b64aadb18bb30eb61cebf9504dc"} Dec 01 10:00:03 crc kubenswrapper[5004]: I1201 10:00:03.460213 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409720-4n5tp" Dec 01 10:00:03 crc kubenswrapper[5004]: I1201 10:00:03.584246 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d7c9e05f-94e0-47da-a57e-9d439949ae6e-config-volume\") pod \"d7c9e05f-94e0-47da-a57e-9d439949ae6e\" (UID: \"d7c9e05f-94e0-47da-a57e-9d439949ae6e\") " Dec 01 10:00:03 crc kubenswrapper[5004]: I1201 10:00:03.584582 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6d26b\" (UniqueName: \"kubernetes.io/projected/d7c9e05f-94e0-47da-a57e-9d439949ae6e-kube-api-access-6d26b\") pod \"d7c9e05f-94e0-47da-a57e-9d439949ae6e\" (UID: \"d7c9e05f-94e0-47da-a57e-9d439949ae6e\") " Dec 01 10:00:03 crc kubenswrapper[5004]: I1201 10:00:03.584646 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d7c9e05f-94e0-47da-a57e-9d439949ae6e-secret-volume\") pod \"d7c9e05f-94e0-47da-a57e-9d439949ae6e\" (UID: \"d7c9e05f-94e0-47da-a57e-9d439949ae6e\") " Dec 01 10:00:03 crc kubenswrapper[5004]: I1201 10:00:03.584776 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7c9e05f-94e0-47da-a57e-9d439949ae6e-config-volume" (OuterVolumeSpecName: "config-volume") pod "d7c9e05f-94e0-47da-a57e-9d439949ae6e" (UID: "d7c9e05f-94e0-47da-a57e-9d439949ae6e"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:00:03 crc kubenswrapper[5004]: I1201 10:00:03.585166 5004 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d7c9e05f-94e0-47da-a57e-9d439949ae6e-config-volume\") on node \"crc\" DevicePath \"\"" Dec 01 10:00:03 crc kubenswrapper[5004]: I1201 10:00:03.591061 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7c9e05f-94e0-47da-a57e-9d439949ae6e-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "d7c9e05f-94e0-47da-a57e-9d439949ae6e" (UID: "d7c9e05f-94e0-47da-a57e-9d439949ae6e"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:00:03 crc kubenswrapper[5004]: I1201 10:00:03.591218 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7c9e05f-94e0-47da-a57e-9d439949ae6e-kube-api-access-6d26b" (OuterVolumeSpecName: "kube-api-access-6d26b") pod "d7c9e05f-94e0-47da-a57e-9d439949ae6e" (UID: "d7c9e05f-94e0-47da-a57e-9d439949ae6e"). InnerVolumeSpecName "kube-api-access-6d26b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:00:03 crc kubenswrapper[5004]: I1201 10:00:03.686790 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6d26b\" (UniqueName: \"kubernetes.io/projected/d7c9e05f-94e0-47da-a57e-9d439949ae6e-kube-api-access-6d26b\") on node \"crc\" DevicePath \"\"" Dec 01 10:00:03 crc kubenswrapper[5004]: I1201 10:00:03.686829 5004 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d7c9e05f-94e0-47da-a57e-9d439949ae6e-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 01 10:00:03 crc kubenswrapper[5004]: I1201 10:00:03.934281 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409720-4n5tp" event={"ID":"d7c9e05f-94e0-47da-a57e-9d439949ae6e","Type":"ContainerDied","Data":"e2cb30408409042fd93dd09c7a14310e48bb5b64aadb18bb30eb61cebf9504dc"} Dec 01 10:00:03 crc kubenswrapper[5004]: I1201 10:00:03.934326 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409720-4n5tp"
Dec 01 10:00:03 crc kubenswrapper[5004]: I1201 10:00:03.934332 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e2cb30408409042fd93dd09c7a14310e48bb5b64aadb18bb30eb61cebf9504dc"
Dec 01 10:00:04 crc kubenswrapper[5004]: I1201 10:00:04.545277 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409675-tldzl"]
Dec 01 10:00:04 crc kubenswrapper[5004]: I1201 10:00:04.556380 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409675-tldzl"]
Dec 01 10:00:04 crc kubenswrapper[5004]: I1201 10:00:04.776401 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5aaf7b51-fbd4-40e4-a5a1-3281a372a6cd" path="/var/lib/kubelet/pods/5aaf7b51-fbd4-40e4-a5a1-3281a372a6cd/volumes"
Dec 01 10:00:05 crc kubenswrapper[5004]: I1201 10:00:05.759676 5004 scope.go:117] "RemoveContainer" containerID="1888096bc7eea8f4ed362bde431fc15761a89c0c03d561f41b82ce51165e9213"
Dec 01 10:00:05 crc kubenswrapper[5004]: E1201 10:00:05.760455 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce"
Dec 01 10:00:17 crc kubenswrapper[5004]: I1201 10:00:17.759206 5004 scope.go:117] "RemoveContainer" containerID="1888096bc7eea8f4ed362bde431fc15761a89c0c03d561f41b82ce51165e9213"
Dec 01 10:00:17 crc kubenswrapper[5004]: E1201 10:00:17.760007 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce"
Dec 01 10:00:20 crc kubenswrapper[5004]: I1201 10:00:20.980408 5004 scope.go:117] "RemoveContainer" containerID="00f8e665d3ea8ff52f1d723ba72d8f24f89eb5d9061d051ff36178dbc208ad02"
Dec 01 10:00:29 crc kubenswrapper[5004]: I1201 10:00:29.759469 5004 scope.go:117] "RemoveContainer" containerID="1888096bc7eea8f4ed362bde431fc15761a89c0c03d561f41b82ce51165e9213"
Dec 01 10:00:29 crc kubenswrapper[5004]: E1201 10:00:29.760506 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce"
Dec 01 10:00:40 crc kubenswrapper[5004]: I1201 10:00:40.763412 5004 scope.go:117] "RemoveContainer" containerID="1888096bc7eea8f4ed362bde431fc15761a89c0c03d561f41b82ce51165e9213"
Dec 01 10:00:40 crc kubenswrapper[5004]: E1201 10:00:40.764206 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce"
Dec 01 10:00:49 crc kubenswrapper[5004]: I1201 10:00:49.420756 5004 generic.go:334] "Generic (PLEG): container finished" podID="b624b6f4-e294-427a-94ac-358b5be6897b" containerID="429362d7683522181f61d793ac96ab2053aab7a3e7528b36f6c8459ffa813d6c" exitCode=0
Dec 01 10:00:49 crc kubenswrapper[5004]: I1201 10:00:49.420843 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"b624b6f4-e294-427a-94ac-358b5be6897b","Type":"ContainerDied","Data":"429362d7683522181f61d793ac96ab2053aab7a3e7528b36f6c8459ffa813d6c"}
Dec 01 10:00:50 crc kubenswrapper[5004]: I1201 10:00:50.844674 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest"
Dec 01 10:00:50 crc kubenswrapper[5004]: I1201 10:00:50.989452 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b624b6f4-e294-427a-94ac-358b5be6897b-ssh-key\") pod \"b624b6f4-e294-427a-94ac-358b5be6897b\" (UID: \"b624b6f4-e294-427a-94ac-358b5be6897b\") "
Dec 01 10:00:50 crc kubenswrapper[5004]: I1201 10:00:50.989609 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b624b6f4-e294-427a-94ac-358b5be6897b-config-data\") pod \"b624b6f4-e294-427a-94ac-358b5be6897b\" (UID: \"b624b6f4-e294-427a-94ac-358b5be6897b\") "
Dec 01 10:00:50 crc kubenswrapper[5004]: I1201 10:00:50.989689 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b624b6f4-e294-427a-94ac-358b5be6897b-openstack-config\") pod \"b624b6f4-e294-427a-94ac-358b5be6897b\" (UID: \"b624b6f4-e294-427a-94ac-358b5be6897b\") "
Dec 01 10:00:50 crc kubenswrapper[5004]: I1201 10:00:50.989721 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/b624b6f4-e294-427a-94ac-358b5be6897b-test-operator-ephemeral-temporary\") pod \"b624b6f4-e294-427a-94ac-358b5be6897b\" (UID: \"b624b6f4-e294-427a-94ac-358b5be6897b\") "
Dec 01 10:00:50 crc kubenswrapper[5004]: I1201 10:00:50.989831 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/b624b6f4-e294-427a-94ac-358b5be6897b-test-operator-ephemeral-workdir\") pod \"b624b6f4-e294-427a-94ac-358b5be6897b\" (UID: \"b624b6f4-e294-427a-94ac-358b5be6897b\") "
Dec 01 10:00:50 crc kubenswrapper[5004]: I1201 10:00:50.989980 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/b624b6f4-e294-427a-94ac-358b5be6897b-ca-certs\") pod \"b624b6f4-e294-427a-94ac-358b5be6897b\" (UID: \"b624b6f4-e294-427a-94ac-358b5be6897b\") "
Dec 01 10:00:50 crc kubenswrapper[5004]: I1201 10:00:50.990014 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b624b6f4-e294-427a-94ac-358b5be6897b-openstack-config-secret\") pod \"b624b6f4-e294-427a-94ac-358b5be6897b\" (UID: \"b624b6f4-e294-427a-94ac-358b5be6897b\") "
Dec 01 10:00:50 crc kubenswrapper[5004]: I1201 10:00:50.990035 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"b624b6f4-e294-427a-94ac-358b5be6897b\" (UID: \"b624b6f4-e294-427a-94ac-358b5be6897b\") "
Dec 01 10:00:50 crc kubenswrapper[5004]: I1201 10:00:50.990061 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nw6tg\" (UniqueName: \"kubernetes.io/projected/b624b6f4-e294-427a-94ac-358b5be6897b-kube-api-access-nw6tg\") pod \"b624b6f4-e294-427a-94ac-358b5be6897b\" (UID: \"b624b6f4-e294-427a-94ac-358b5be6897b\") "
Dec 01 10:00:50 crc kubenswrapper[5004]: I1201 10:00:50.990705 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b624b6f4-e294-427a-94ac-358b5be6897b-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "b624b6f4-e294-427a-94ac-358b5be6897b" (UID: "b624b6f4-e294-427a-94ac-358b5be6897b"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 10:00:50 crc kubenswrapper[5004]: I1201 10:00:50.990850 5004 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/b624b6f4-e294-427a-94ac-358b5be6897b-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\""
Dec 01 10:00:50 crc kubenswrapper[5004]: I1201 10:00:50.991520 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b624b6f4-e294-427a-94ac-358b5be6897b-config-data" (OuterVolumeSpecName: "config-data") pod "b624b6f4-e294-427a-94ac-358b5be6897b" (UID: "b624b6f4-e294-427a-94ac-358b5be6897b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 10:00:50 crc kubenswrapper[5004]: I1201 10:00:50.996947 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b624b6f4-e294-427a-94ac-358b5be6897b-kube-api-access-nw6tg" (OuterVolumeSpecName: "kube-api-access-nw6tg") pod "b624b6f4-e294-427a-94ac-358b5be6897b" (UID: "b624b6f4-e294-427a-94ac-358b5be6897b"). InnerVolumeSpecName "kube-api-access-nw6tg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 10:00:50 crc kubenswrapper[5004]: I1201 10:00:50.997253 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "test-operator-logs") pod "b624b6f4-e294-427a-94ac-358b5be6897b" (UID: "b624b6f4-e294-427a-94ac-358b5be6897b"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Dec 01 10:00:50 crc kubenswrapper[5004]: I1201 10:00:50.999706 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b624b6f4-e294-427a-94ac-358b5be6897b-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "b624b6f4-e294-427a-94ac-358b5be6897b" (UID: "b624b6f4-e294-427a-94ac-358b5be6897b"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 10:00:51 crc kubenswrapper[5004]: I1201 10:00:51.029299 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b624b6f4-e294-427a-94ac-358b5be6897b-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "b624b6f4-e294-427a-94ac-358b5be6897b" (UID: "b624b6f4-e294-427a-94ac-358b5be6897b"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 10:00:51 crc kubenswrapper[5004]: I1201 10:00:51.029431 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b624b6f4-e294-427a-94ac-358b5be6897b-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "b624b6f4-e294-427a-94ac-358b5be6897b" (UID: "b624b6f4-e294-427a-94ac-358b5be6897b"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 10:00:51 crc kubenswrapper[5004]: I1201 10:00:51.029925 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b624b6f4-e294-427a-94ac-358b5be6897b-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "b624b6f4-e294-427a-94ac-358b5be6897b" (UID: "b624b6f4-e294-427a-94ac-358b5be6897b"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 10:00:51 crc kubenswrapper[5004]: I1201 10:00:51.069714 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b624b6f4-e294-427a-94ac-358b5be6897b-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "b624b6f4-e294-427a-94ac-358b5be6897b" (UID: "b624b6f4-e294-427a-94ac-358b5be6897b"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 10:00:51 crc kubenswrapper[5004]: I1201 10:00:51.093030 5004 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b624b6f4-e294-427a-94ac-358b5be6897b-openstack-config\") on node \"crc\" DevicePath \"\""
Dec 01 10:00:51 crc kubenswrapper[5004]: I1201 10:00:51.093076 5004 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/b624b6f4-e294-427a-94ac-358b5be6897b-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\""
Dec 01 10:00:51 crc kubenswrapper[5004]: I1201 10:00:51.093095 5004 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/b624b6f4-e294-427a-94ac-358b5be6897b-ca-certs\") on node \"crc\" DevicePath \"\""
Dec 01 10:00:51 crc kubenswrapper[5004]: I1201 10:00:51.093108 5004 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b624b6f4-e294-427a-94ac-358b5be6897b-openstack-config-secret\") on node \"crc\" DevicePath \"\""
Dec 01 10:00:51 crc kubenswrapper[5004]: I1201 10:00:51.093632 5004 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" "
Dec 01 10:00:51 crc kubenswrapper[5004]: I1201 10:00:51.093657 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nw6tg\" (UniqueName: \"kubernetes.io/projected/b624b6f4-e294-427a-94ac-358b5be6897b-kube-api-access-nw6tg\") on node \"crc\" DevicePath \"\""
Dec 01 10:00:51 crc kubenswrapper[5004]: I1201 10:00:51.093670 5004 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b624b6f4-e294-427a-94ac-358b5be6897b-ssh-key\") on node \"crc\" DevicePath \"\""
Dec 01 10:00:51 crc kubenswrapper[5004]: I1201 10:00:51.093682 5004 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b624b6f4-e294-427a-94ac-358b5be6897b-config-data\") on node \"crc\" DevicePath \"\""
Dec 01 10:00:51 crc kubenswrapper[5004]: I1201 10:00:51.121835 5004 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc"
Dec 01 10:00:51 crc kubenswrapper[5004]: I1201 10:00:51.196099 5004 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\""
Dec 01 10:00:51 crc kubenswrapper[5004]: I1201 10:00:51.443788 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"b624b6f4-e294-427a-94ac-358b5be6897b","Type":"ContainerDied","Data":"05795bb73619fbb81f76e41f982cd7e652b7c3c06352a27d591186a800d3fc55"}
Dec 01 10:00:51 crc kubenswrapper[5004]: I1201 10:00:51.444130 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="05795bb73619fbb81f76e41f982cd7e652b7c3c06352a27d591186a800d3fc55"
Dec 01 10:00:51 crc kubenswrapper[5004]: I1201 10:00:51.443844 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest"
Dec 01 10:00:51 crc kubenswrapper[5004]: I1201 10:00:51.759050 5004 scope.go:117] "RemoveContainer" containerID="1888096bc7eea8f4ed362bde431fc15761a89c0c03d561f41b82ce51165e9213"
Dec 01 10:00:51 crc kubenswrapper[5004]: E1201 10:00:51.759517 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce"
Dec 01 10:00:56 crc kubenswrapper[5004]: I1201 10:00:56.701475 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"]
Dec 01 10:00:56 crc kubenswrapper[5004]: E1201 10:00:56.705105 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7c9e05f-94e0-47da-a57e-9d439949ae6e" containerName="collect-profiles"
Dec 01 10:00:56 crc kubenswrapper[5004]: I1201 10:00:56.705148 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7c9e05f-94e0-47da-a57e-9d439949ae6e" containerName="collect-profiles"
Dec 01 10:00:56 crc kubenswrapper[5004]: E1201 10:00:56.705212 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b624b6f4-e294-427a-94ac-358b5be6897b" containerName="tempest-tests-tempest-tests-runner"
Dec 01 10:00:56 crc kubenswrapper[5004]: I1201 10:00:56.705223 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="b624b6f4-e294-427a-94ac-358b5be6897b" containerName="tempest-tests-tempest-tests-runner"
Dec 01 10:00:56 crc kubenswrapper[5004]: I1201 10:00:56.705875 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7c9e05f-94e0-47da-a57e-9d439949ae6e" containerName="collect-profiles"
Dec 01 10:00:56 crc kubenswrapper[5004]: I1201 10:00:56.705907 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="b624b6f4-e294-427a-94ac-358b5be6897b" containerName="tempest-tests-tempest-tests-runner"
Dec 01 10:00:56 crc kubenswrapper[5004]: I1201 10:00:56.707145 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Dec 01 10:00:56 crc kubenswrapper[5004]: I1201 10:00:56.709258 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-2srjp"
Dec 01 10:00:56 crc kubenswrapper[5004]: I1201 10:00:56.716805 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"]
Dec 01 10:00:56 crc kubenswrapper[5004]: I1201 10:00:56.830863 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nd8sp\" (UniqueName: \"kubernetes.io/projected/04b163a0-721a-44ce-b030-1aad4f09cbcb-kube-api-access-nd8sp\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"04b163a0-721a-44ce-b030-1aad4f09cbcb\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Dec 01 10:00:56 crc kubenswrapper[5004]: I1201 10:00:56.831210 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"04b163a0-721a-44ce-b030-1aad4f09cbcb\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Dec 01 10:00:56 crc kubenswrapper[5004]: I1201 10:00:56.933553 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nd8sp\" (UniqueName: \"kubernetes.io/projected/04b163a0-721a-44ce-b030-1aad4f09cbcb-kube-api-access-nd8sp\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"04b163a0-721a-44ce-b030-1aad4f09cbcb\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Dec 01 10:00:56 crc kubenswrapper[5004]: I1201 10:00:56.933727 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"04b163a0-721a-44ce-b030-1aad4f09cbcb\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Dec 01 10:00:56 crc kubenswrapper[5004]: I1201 10:00:56.934598 5004 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"04b163a0-721a-44ce-b030-1aad4f09cbcb\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Dec 01 10:00:56 crc kubenswrapper[5004]: I1201 10:00:56.954658 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nd8sp\" (UniqueName: \"kubernetes.io/projected/04b163a0-721a-44ce-b030-1aad4f09cbcb-kube-api-access-nd8sp\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"04b163a0-721a-44ce-b030-1aad4f09cbcb\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Dec 01 10:00:56 crc kubenswrapper[5004]: I1201 10:00:56.966011 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"04b163a0-721a-44ce-b030-1aad4f09cbcb\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Dec 01 10:00:57 crc kubenswrapper[5004]: I1201 10:00:57.036166 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Dec 01 10:00:57 crc kubenswrapper[5004]: I1201 10:00:57.432854 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"]
Dec 01 10:00:57 crc kubenswrapper[5004]: I1201 10:00:57.446979 5004 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Dec 01 10:00:57 crc kubenswrapper[5004]: I1201 10:00:57.532341 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"04b163a0-721a-44ce-b030-1aad4f09cbcb","Type":"ContainerStarted","Data":"5d3dcaf662e0c20ce145ad5ebd4899fd744a6339c3f3669e00650d3a8853a113"}
Dec 01 10:00:59 crc kubenswrapper[5004]: I1201 10:00:59.555162 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"04b163a0-721a-44ce-b030-1aad4f09cbcb","Type":"ContainerStarted","Data":"001e1fb1bad75e639a0bef787ae8940540484db7a9ebfde727f344e9b81d6d3f"}
Dec 01 10:00:59 crc kubenswrapper[5004]: I1201 10:00:59.578206 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=1.712903967 podStartE2EDuration="3.578184829s" podCreationTimestamp="2025-12-01 10:00:56 +0000 UTC" firstStartedPulling="2025-12-01 10:00:57.446755546 +0000 UTC m=+6235.011747528" lastFinishedPulling="2025-12-01 10:00:59.312036408 +0000 UTC m=+6236.877028390" observedRunningTime="2025-12-01 10:00:59.568065424 +0000 UTC m=+6237.133057406" watchObservedRunningTime="2025-12-01 10:00:59.578184829 +0000 UTC m=+6237.143176811"
Dec 01 10:01:00 crc kubenswrapper[5004]: I1201 10:01:00.144062 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29409721-qqnhs"]
Dec 01 10:01:00 crc kubenswrapper[5004]: I1201 10:01:00.145928 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29409721-qqnhs"
Dec 01 10:01:00 crc kubenswrapper[5004]: I1201 10:01:00.155910 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29409721-qqnhs"]
Dec 01 10:01:00 crc kubenswrapper[5004]: I1201 10:01:00.313982 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7765b30a-ec51-4fb4-982a-b47fe84e37a7-fernet-keys\") pod \"keystone-cron-29409721-qqnhs\" (UID: \"7765b30a-ec51-4fb4-982a-b47fe84e37a7\") " pod="openstack/keystone-cron-29409721-qqnhs"
Dec 01 10:01:00 crc kubenswrapper[5004]: I1201 10:01:00.314153 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7765b30a-ec51-4fb4-982a-b47fe84e37a7-config-data\") pod \"keystone-cron-29409721-qqnhs\" (UID: \"7765b30a-ec51-4fb4-982a-b47fe84e37a7\") " pod="openstack/keystone-cron-29409721-qqnhs"
Dec 01 10:01:00 crc kubenswrapper[5004]: I1201 10:01:00.314192 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qchpt\" (UniqueName: \"kubernetes.io/projected/7765b30a-ec51-4fb4-982a-b47fe84e37a7-kube-api-access-qchpt\") pod \"keystone-cron-29409721-qqnhs\" (UID: \"7765b30a-ec51-4fb4-982a-b47fe84e37a7\") " pod="openstack/keystone-cron-29409721-qqnhs"
Dec 01 10:01:00 crc kubenswrapper[5004]: I1201 10:01:00.314276 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7765b30a-ec51-4fb4-982a-b47fe84e37a7-combined-ca-bundle\") pod \"keystone-cron-29409721-qqnhs\" (UID: \"7765b30a-ec51-4fb4-982a-b47fe84e37a7\") " pod="openstack/keystone-cron-29409721-qqnhs"
Dec 01 10:01:00 crc kubenswrapper[5004]: I1201 10:01:00.416498 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7765b30a-ec51-4fb4-982a-b47fe84e37a7-fernet-keys\") pod \"keystone-cron-29409721-qqnhs\" (UID: \"7765b30a-ec51-4fb4-982a-b47fe84e37a7\") " pod="openstack/keystone-cron-29409721-qqnhs"
Dec 01 10:01:00 crc kubenswrapper[5004]: I1201 10:01:00.416616 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7765b30a-ec51-4fb4-982a-b47fe84e37a7-config-data\") pod \"keystone-cron-29409721-qqnhs\" (UID: \"7765b30a-ec51-4fb4-982a-b47fe84e37a7\") " pod="openstack/keystone-cron-29409721-qqnhs"
Dec 01 10:01:00 crc kubenswrapper[5004]: I1201 10:01:00.416664 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qchpt\" (UniqueName: \"kubernetes.io/projected/7765b30a-ec51-4fb4-982a-b47fe84e37a7-kube-api-access-qchpt\") pod \"keystone-cron-29409721-qqnhs\" (UID: \"7765b30a-ec51-4fb4-982a-b47fe84e37a7\") " pod="openstack/keystone-cron-29409721-qqnhs"
Dec 01 10:01:00 crc kubenswrapper[5004]: I1201 10:01:00.416769 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7765b30a-ec51-4fb4-982a-b47fe84e37a7-combined-ca-bundle\") pod \"keystone-cron-29409721-qqnhs\" (UID: \"7765b30a-ec51-4fb4-982a-b47fe84e37a7\") " pod="openstack/keystone-cron-29409721-qqnhs"
Dec 01 10:01:00 crc kubenswrapper[5004]: I1201 10:01:00.422766 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7765b30a-ec51-4fb4-982a-b47fe84e37a7-fernet-keys\") pod \"keystone-cron-29409721-qqnhs\" (UID: \"7765b30a-ec51-4fb4-982a-b47fe84e37a7\") " pod="openstack/keystone-cron-29409721-qqnhs"
Dec 01 10:01:00 crc kubenswrapper[5004]: I1201 10:01:00.423315 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7765b30a-ec51-4fb4-982a-b47fe84e37a7-combined-ca-bundle\") pod \"keystone-cron-29409721-qqnhs\" (UID: \"7765b30a-ec51-4fb4-982a-b47fe84e37a7\") " pod="openstack/keystone-cron-29409721-qqnhs"
Dec 01 10:01:00 crc kubenswrapper[5004]: I1201 10:01:00.424454 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7765b30a-ec51-4fb4-982a-b47fe84e37a7-config-data\") pod \"keystone-cron-29409721-qqnhs\" (UID: \"7765b30a-ec51-4fb4-982a-b47fe84e37a7\") " pod="openstack/keystone-cron-29409721-qqnhs"
Dec 01 10:01:00 crc kubenswrapper[5004]: I1201 10:01:00.435985 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qchpt\" (UniqueName: \"kubernetes.io/projected/7765b30a-ec51-4fb4-982a-b47fe84e37a7-kube-api-access-qchpt\") pod \"keystone-cron-29409721-qqnhs\" (UID: \"7765b30a-ec51-4fb4-982a-b47fe84e37a7\") " pod="openstack/keystone-cron-29409721-qqnhs"
Dec 01 10:01:00 crc kubenswrapper[5004]: I1201 10:01:00.463955 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29409721-qqnhs"
Dec 01 10:01:00 crc kubenswrapper[5004]: W1201 10:01:00.920275 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7765b30a_ec51_4fb4_982a_b47fe84e37a7.slice/crio-1e3bac1532369c894d6b5a0f3e41907d32ad2898a8472f1f0481fa6cd6442fc7 WatchSource:0}: Error finding container 1e3bac1532369c894d6b5a0f3e41907d32ad2898a8472f1f0481fa6cd6442fc7: Status 404 returned error can't find the container with id 1e3bac1532369c894d6b5a0f3e41907d32ad2898a8472f1f0481fa6cd6442fc7
Dec 01 10:01:00 crc kubenswrapper[5004]: I1201 10:01:00.924338 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29409721-qqnhs"]
Dec 01 10:01:01 crc kubenswrapper[5004]: I1201 10:01:01.580218 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29409721-qqnhs" event={"ID":"7765b30a-ec51-4fb4-982a-b47fe84e37a7","Type":"ContainerStarted","Data":"1e3bac1532369c894d6b5a0f3e41907d32ad2898a8472f1f0481fa6cd6442fc7"}
Dec 01 10:01:02 crc kubenswrapper[5004]: I1201 10:01:02.637258 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29409721-qqnhs" event={"ID":"7765b30a-ec51-4fb4-982a-b47fe84e37a7","Type":"ContainerStarted","Data":"66675eaffb013682ae427371d348f2ae25ae6ad6707318a1f83dd076036646c5"}
Dec 01 10:01:02 crc kubenswrapper[5004]: I1201 10:01:02.705044 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29409721-qqnhs" podStartSLOduration=2.7050170639999997 podStartE2EDuration="2.705017064s" podCreationTimestamp="2025-12-01 10:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:01:02.687745595 +0000 UTC m=+6240.252737577" watchObservedRunningTime="2025-12-01 10:01:02.705017064 +0000 UTC m=+6240.270009046"
Dec 01 10:01:02 crc kubenswrapper[5004]: I1201 10:01:02.768410 5004 scope.go:117] "RemoveContainer" containerID="1888096bc7eea8f4ed362bde431fc15761a89c0c03d561f41b82ce51165e9213"
Dec 01 10:01:02 crc kubenswrapper[5004]: E1201 10:01:02.768931 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce"
Dec 01 10:01:05 crc kubenswrapper[5004]: I1201 10:01:05.683087 5004 generic.go:334] "Generic (PLEG): container finished" podID="7765b30a-ec51-4fb4-982a-b47fe84e37a7" containerID="66675eaffb013682ae427371d348f2ae25ae6ad6707318a1f83dd076036646c5" exitCode=0
Dec 01 10:01:05 crc kubenswrapper[5004]: I1201 10:01:05.683146 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29409721-qqnhs" event={"ID":"7765b30a-ec51-4fb4-982a-b47fe84e37a7","Type":"ContainerDied","Data":"66675eaffb013682ae427371d348f2ae25ae6ad6707318a1f83dd076036646c5"}
Dec 01 10:01:07 crc kubenswrapper[5004]: I1201 10:01:07.140005 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29409721-qqnhs"
Dec 01 10:01:07 crc kubenswrapper[5004]: I1201 10:01:07.322353 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qchpt\" (UniqueName: \"kubernetes.io/projected/7765b30a-ec51-4fb4-982a-b47fe84e37a7-kube-api-access-qchpt\") pod \"7765b30a-ec51-4fb4-982a-b47fe84e37a7\" (UID: \"7765b30a-ec51-4fb4-982a-b47fe84e37a7\") "
Dec 01 10:01:07 crc kubenswrapper[5004]: I1201 10:01:07.322476 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7765b30a-ec51-4fb4-982a-b47fe84e37a7-fernet-keys\") pod \"7765b30a-ec51-4fb4-982a-b47fe84e37a7\" (UID: \"7765b30a-ec51-4fb4-982a-b47fe84e37a7\") "
Dec 01 10:01:07 crc kubenswrapper[5004]: I1201 10:01:07.322533 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7765b30a-ec51-4fb4-982a-b47fe84e37a7-config-data\") pod \"7765b30a-ec51-4fb4-982a-b47fe84e37a7\" (UID: \"7765b30a-ec51-4fb4-982a-b47fe84e37a7\") "
Dec 01 10:01:07 crc kubenswrapper[5004]: I1201 10:01:07.322688 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7765b30a-ec51-4fb4-982a-b47fe84e37a7-combined-ca-bundle\") pod \"7765b30a-ec51-4fb4-982a-b47fe84e37a7\" (UID: \"7765b30a-ec51-4fb4-982a-b47fe84e37a7\") "
Dec 01 10:01:07 crc kubenswrapper[5004]: I1201 10:01:07.340423 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7765b30a-ec51-4fb4-982a-b47fe84e37a7-kube-api-access-qchpt" (OuterVolumeSpecName: "kube-api-access-qchpt") pod "7765b30a-ec51-4fb4-982a-b47fe84e37a7" (UID: "7765b30a-ec51-4fb4-982a-b47fe84e37a7"). InnerVolumeSpecName "kube-api-access-qchpt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 10:01:07 crc kubenswrapper[5004]: I1201 10:01:07.345746 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7765b30a-ec51-4fb4-982a-b47fe84e37a7-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "7765b30a-ec51-4fb4-982a-b47fe84e37a7" (UID: "7765b30a-ec51-4fb4-982a-b47fe84e37a7"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 10:01:07 crc kubenswrapper[5004]: I1201 10:01:07.426782 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7765b30a-ec51-4fb4-982a-b47fe84e37a7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7765b30a-ec51-4fb4-982a-b47fe84e37a7" (UID: "7765b30a-ec51-4fb4-982a-b47fe84e37a7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 10:01:07 crc kubenswrapper[5004]: I1201 10:01:07.428974 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qchpt\" (UniqueName: \"kubernetes.io/projected/7765b30a-ec51-4fb4-982a-b47fe84e37a7-kube-api-access-qchpt\") on node \"crc\" DevicePath \"\""
Dec 01 10:01:07 crc kubenswrapper[5004]: I1201 10:01:07.429013 5004 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7765b30a-ec51-4fb4-982a-b47fe84e37a7-fernet-keys\") on node \"crc\" DevicePath \"\""
Dec 01 10:01:07 crc kubenswrapper[5004]: I1201 10:01:07.429046 5004 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7765b30a-ec51-4fb4-982a-b47fe84e37a7-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 01 10:01:07 crc kubenswrapper[5004]: I1201 10:01:07.506596 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7765b30a-ec51-4fb4-982a-b47fe84e37a7-config-data" (OuterVolumeSpecName: "config-data") pod "7765b30a-ec51-4fb4-982a-b47fe84e37a7" (UID: "7765b30a-ec51-4fb4-982a-b47fe84e37a7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 10:01:07 crc kubenswrapper[5004]: I1201 10:01:07.531520 5004 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7765b30a-ec51-4fb4-982a-b47fe84e37a7-config-data\") on node \"crc\" DevicePath \"\""
Dec 01 10:01:07 crc kubenswrapper[5004]: I1201 10:01:07.712628 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29409721-qqnhs" event={"ID":"7765b30a-ec51-4fb4-982a-b47fe84e37a7","Type":"ContainerDied","Data":"1e3bac1532369c894d6b5a0f3e41907d32ad2898a8472f1f0481fa6cd6442fc7"}
Dec 01 10:01:07 crc kubenswrapper[5004]: I1201 10:01:07.712679 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1e3bac1532369c894d6b5a0f3e41907d32ad2898a8472f1f0481fa6cd6442fc7"
Dec 01 10:01:07 crc kubenswrapper[5004]: I1201 10:01:07.712690 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29409721-qqnhs"
Dec 01 10:01:17 crc kubenswrapper[5004]: I1201 10:01:17.760715 5004 scope.go:117] "RemoveContainer" containerID="1888096bc7eea8f4ed362bde431fc15761a89c0c03d561f41b82ce51165e9213"
Dec 01 10:01:17 crc kubenswrapper[5004]: E1201 10:01:17.762342 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce"
Dec 01 10:01:28 crc kubenswrapper[5004]: I1201 10:01:28.759003 5004 scope.go:117] "RemoveContainer" containerID="1888096bc7eea8f4ed362bde431fc15761a89c0c03d561f41b82ce51165e9213"
Dec 01 10:01:28 crc kubenswrapper[5004]: E1201 10:01:28.759829 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce"
Dec 01 10:01:35 crc kubenswrapper[5004]: I1201 10:01:35.040220 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-xmsbt/must-gather-7cnt7"]
Dec 01 10:01:35 crc kubenswrapper[5004]: E1201 10:01:35.041412 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7765b30a-ec51-4fb4-982a-b47fe84e37a7" containerName="keystone-cron"
Dec 01 10:01:35 crc kubenswrapper[5004]: I1201 10:01:35.041432 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="7765b30a-ec51-4fb4-982a-b47fe84e37a7" containerName="keystone-cron"
Dec 01 10:01:35 crc kubenswrapper[5004]: I1201 10:01:35.041825 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="7765b30a-ec51-4fb4-982a-b47fe84e37a7" containerName="keystone-cron"
Dec 01 10:01:35 crc kubenswrapper[5004]: I1201 10:01:35.044442 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xmsbt/must-gather-7cnt7"
Dec 01 10:01:35 crc kubenswrapper[5004]: I1201 10:01:35.046502 5004 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-xmsbt"/"default-dockercfg-msbnt"
Dec 01 10:01:35 crc kubenswrapper[5004]: I1201 10:01:35.046679 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-xmsbt"/"openshift-service-ca.crt"
Dec 01 10:01:35 crc kubenswrapper[5004]: I1201 10:01:35.050368 5004 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-xmsbt"/"kube-root-ca.crt"
Dec 01 10:01:35 crc kubenswrapper[5004]: I1201 10:01:35.081656 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-xmsbt/must-gather-7cnt7"]
Dec 01 10:01:35 crc kubenswrapper[5004]: I1201 10:01:35.137193 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/481141e8-7f09-4b09-b15d-4621f4ff7bab-must-gather-output\") pod \"must-gather-7cnt7\" (UID: \"481141e8-7f09-4b09-b15d-4621f4ff7bab\") " pod="openshift-must-gather-xmsbt/must-gather-7cnt7"
Dec 01 10:01:35 crc kubenswrapper[5004]: I1201 10:01:35.138445 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28ql9\" (UniqueName: \"kubernetes.io/projected/481141e8-7f09-4b09-b15d-4621f4ff7bab-kube-api-access-28ql9\") pod \"must-gather-7cnt7\" (UID: \"481141e8-7f09-4b09-b15d-4621f4ff7bab\") " pod="openshift-must-gather-xmsbt/must-gather-7cnt7"
Dec 01 10:01:35 crc kubenswrapper[5004]: I1201 10:01:35.240465 5004
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/481141e8-7f09-4b09-b15d-4621f4ff7bab-must-gather-output\") pod \"must-gather-7cnt7\" (UID: \"481141e8-7f09-4b09-b15d-4621f4ff7bab\") " pod="openshift-must-gather-xmsbt/must-gather-7cnt7" Dec 01 10:01:35 crc kubenswrapper[5004]: I1201 10:01:35.240707 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28ql9\" (UniqueName: \"kubernetes.io/projected/481141e8-7f09-4b09-b15d-4621f4ff7bab-kube-api-access-28ql9\") pod \"must-gather-7cnt7\" (UID: \"481141e8-7f09-4b09-b15d-4621f4ff7bab\") " pod="openshift-must-gather-xmsbt/must-gather-7cnt7" Dec 01 10:01:35 crc kubenswrapper[5004]: I1201 10:01:35.241086 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/481141e8-7f09-4b09-b15d-4621f4ff7bab-must-gather-output\") pod \"must-gather-7cnt7\" (UID: \"481141e8-7f09-4b09-b15d-4621f4ff7bab\") " pod="openshift-must-gather-xmsbt/must-gather-7cnt7" Dec 01 10:01:35 crc kubenswrapper[5004]: I1201 10:01:35.262880 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28ql9\" (UniqueName: \"kubernetes.io/projected/481141e8-7f09-4b09-b15d-4621f4ff7bab-kube-api-access-28ql9\") pod \"must-gather-7cnt7\" (UID: \"481141e8-7f09-4b09-b15d-4621f4ff7bab\") " pod="openshift-must-gather-xmsbt/must-gather-7cnt7" Dec 01 10:01:35 crc kubenswrapper[5004]: I1201 10:01:35.374341 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-xmsbt/must-gather-7cnt7" Dec 01 10:01:36 crc kubenswrapper[5004]: I1201 10:01:36.133425 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-xmsbt/must-gather-7cnt7"] Dec 01 10:01:37 crc kubenswrapper[5004]: I1201 10:01:37.100447 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xmsbt/must-gather-7cnt7" event={"ID":"481141e8-7f09-4b09-b15d-4621f4ff7bab","Type":"ContainerStarted","Data":"e784fbb7412e9c16fe0239b73228580a456c0985481a50631d82e18f816625ec"} Dec 01 10:01:40 crc kubenswrapper[5004]: I1201 10:01:40.760920 5004 scope.go:117] "RemoveContainer" containerID="1888096bc7eea8f4ed362bde431fc15761a89c0c03d561f41b82ce51165e9213" Dec 01 10:01:42 crc kubenswrapper[5004]: I1201 10:01:42.167048 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" event={"ID":"f9977ebb-82de-4e96-8763-0b5a84f8d4ce","Type":"ContainerStarted","Data":"497710a2817802c21762db587a6989ef9d9b667ad5011a6d0d79313f386386ad"} Dec 01 10:01:42 crc kubenswrapper[5004]: I1201 10:01:42.175654 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xmsbt/must-gather-7cnt7" event={"ID":"481141e8-7f09-4b09-b15d-4621f4ff7bab","Type":"ContainerStarted","Data":"62c34251faf762eef045feb5e0a6c341c1273373da979163d027c1f4821cee50"} Dec 01 10:01:42 crc kubenswrapper[5004]: I1201 10:01:42.175703 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xmsbt/must-gather-7cnt7" event={"ID":"481141e8-7f09-4b09-b15d-4621f4ff7bab","Type":"ContainerStarted","Data":"ce164add57bfc59dcddfda167ea69e4adfcbb581de265b86a2460d41e4dcc79b"} Dec 01 10:01:42 crc kubenswrapper[5004]: I1201 10:01:42.211609 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-xmsbt/must-gather-7cnt7" podStartSLOduration=2.019442969 podStartE2EDuration="7.211580315s" 
podCreationTimestamp="2025-12-01 10:01:35 +0000 UTC" firstStartedPulling="2025-12-01 10:01:36.135520501 +0000 UTC m=+6273.700512483" lastFinishedPulling="2025-12-01 10:01:41.327657847 +0000 UTC m=+6278.892649829" observedRunningTime="2025-12-01 10:01:42.201011878 +0000 UTC m=+6279.766003880" watchObservedRunningTime="2025-12-01 10:01:42.211580315 +0000 UTC m=+6279.776572307" Dec 01 10:01:47 crc kubenswrapper[5004]: I1201 10:01:47.343812 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-xmsbt/crc-debug-2m5n9"] Dec 01 10:01:47 crc kubenswrapper[5004]: I1201 10:01:47.346762 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xmsbt/crc-debug-2m5n9" Dec 01 10:01:47 crc kubenswrapper[5004]: I1201 10:01:47.497251 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d56de7fc-21b5-4ced-8bfa-6429afcd72e0-host\") pod \"crc-debug-2m5n9\" (UID: \"d56de7fc-21b5-4ced-8bfa-6429afcd72e0\") " pod="openshift-must-gather-xmsbt/crc-debug-2m5n9" Dec 01 10:01:47 crc kubenswrapper[5004]: I1201 10:01:47.497477 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6hjv\" (UniqueName: \"kubernetes.io/projected/d56de7fc-21b5-4ced-8bfa-6429afcd72e0-kube-api-access-j6hjv\") pod \"crc-debug-2m5n9\" (UID: \"d56de7fc-21b5-4ced-8bfa-6429afcd72e0\") " pod="openshift-must-gather-xmsbt/crc-debug-2m5n9" Dec 01 10:01:47 crc kubenswrapper[5004]: I1201 10:01:47.600978 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d56de7fc-21b5-4ced-8bfa-6429afcd72e0-host\") pod \"crc-debug-2m5n9\" (UID: \"d56de7fc-21b5-4ced-8bfa-6429afcd72e0\") " pod="openshift-must-gather-xmsbt/crc-debug-2m5n9" Dec 01 10:01:47 crc kubenswrapper[5004]: I1201 10:01:47.601102 5004 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-j6hjv\" (UniqueName: \"kubernetes.io/projected/d56de7fc-21b5-4ced-8bfa-6429afcd72e0-kube-api-access-j6hjv\") pod \"crc-debug-2m5n9\" (UID: \"d56de7fc-21b5-4ced-8bfa-6429afcd72e0\") " pod="openshift-must-gather-xmsbt/crc-debug-2m5n9" Dec 01 10:01:47 crc kubenswrapper[5004]: I1201 10:01:47.601659 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d56de7fc-21b5-4ced-8bfa-6429afcd72e0-host\") pod \"crc-debug-2m5n9\" (UID: \"d56de7fc-21b5-4ced-8bfa-6429afcd72e0\") " pod="openshift-must-gather-xmsbt/crc-debug-2m5n9" Dec 01 10:01:47 crc kubenswrapper[5004]: I1201 10:01:47.625455 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6hjv\" (UniqueName: \"kubernetes.io/projected/d56de7fc-21b5-4ced-8bfa-6429afcd72e0-kube-api-access-j6hjv\") pod \"crc-debug-2m5n9\" (UID: \"d56de7fc-21b5-4ced-8bfa-6429afcd72e0\") " pod="openshift-must-gather-xmsbt/crc-debug-2m5n9" Dec 01 10:01:47 crc kubenswrapper[5004]: I1201 10:01:47.670036 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-xmsbt/crc-debug-2m5n9" Dec 01 10:01:47 crc kubenswrapper[5004]: W1201 10:01:47.712439 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd56de7fc_21b5_4ced_8bfa_6429afcd72e0.slice/crio-f71402c202e92bf0646502cfdf6e8c1dbaf8cbc30d0d05b9e04d0c4fd63917e7 WatchSource:0}: Error finding container f71402c202e92bf0646502cfdf6e8c1dbaf8cbc30d0d05b9e04d0c4fd63917e7: Status 404 returned error can't find the container with id f71402c202e92bf0646502cfdf6e8c1dbaf8cbc30d0d05b9e04d0c4fd63917e7 Dec 01 10:01:48 crc kubenswrapper[5004]: I1201 10:01:48.259137 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xmsbt/crc-debug-2m5n9" event={"ID":"d56de7fc-21b5-4ced-8bfa-6429afcd72e0","Type":"ContainerStarted","Data":"f71402c202e92bf0646502cfdf6e8c1dbaf8cbc30d0d05b9e04d0c4fd63917e7"} Dec 01 10:02:03 crc kubenswrapper[5004]: I1201 10:02:03.240038 5004 trace.go:236] Trace[610529331]: "Calculate volume metrics of catalog-content for pod openshift-marketplace/redhat-operators-mrv6v" (01-Dec-2025 10:02:01.941) (total time: 1297ms): Dec 01 10:02:03 crc kubenswrapper[5004]: Trace[610529331]: [1.297625342s] [1.297625342s] END Dec 01 10:02:03 crc kubenswrapper[5004]: E1201 10:02:03.582741 5004 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6ab858aed98e4fe57e6b144da8e90ad5d6698bb4cc5521206f5c05809f0f9296" Dec 01 10:02:03 crc kubenswrapper[5004]: E1201 10:02:03.596835 5004 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:container-00,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6ab858aed98e4fe57e6b144da8e90ad5d6698bb4cc5521206f5c05809f0f9296,Command:[chroot /host bash -c echo 'TOOLBOX_NAME=toolbox-osp' > /root/.toolboxrc ; rm -rf \"/var/tmp/sos-osp\" && mkdir 
-p \"/var/tmp/sos-osp\" && sudo podman rm --force toolbox-osp; sudo --preserve-env podman pull --authfile /var/lib/kubelet/config.json registry.redhat.io/rhel9/support-tools && toolbox sos report --batch --all-logs --only-plugins block,cifs,crio,devicemapper,devices,firewall_tables,firewalld,iscsi,lvm2,memory,multipath,nfs,nis,nvme,podman,process,processor,selinux,scsi,udev,logs,crypto --tmp-dir=\"/var/tmp/sos-osp\" && if [[ \"$(ls /var/log/pods/*/{*.log.*,*/*.log.*} 2>/dev/null)\" != '' ]]; then tar --ignore-failed-read --warning=no-file-changed -cJf \"/var/tmp/sos-osp/podlogs.tar.xz\" --transform 's,^,podlogs/,' /var/log/pods/*/{*.log.*,*/*.log.*} || true; fi],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:TMOUT,Value:900,ValueFrom:nil,},EnvVar{Name:HOST,Value:/host,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host,ReadOnly:false,MountPath:/host,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-j6hjv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod crc-debug-2m5n9_openshift-must-gather-xmsbt(d56de7fc-21b5-4ced-8bfa-6429afcd72e0): ErrImagePull: rpc error: code = Canceled desc = copying 
config: context canceled" logger="UnhandledError" Dec 01 10:02:03 crc kubenswrapper[5004]: E1201 10:02:03.598385 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"container-00\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openshift-must-gather-xmsbt/crc-debug-2m5n9" podUID="d56de7fc-21b5-4ced-8bfa-6429afcd72e0" Dec 01 10:02:04 crc kubenswrapper[5004]: E1201 10:02:04.451815 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"container-00\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6ab858aed98e4fe57e6b144da8e90ad5d6698bb4cc5521206f5c05809f0f9296\\\"\"" pod="openshift-must-gather-xmsbt/crc-debug-2m5n9" podUID="d56de7fc-21b5-4ced-8bfa-6429afcd72e0" Dec 01 10:02:17 crc kubenswrapper[5004]: I1201 10:02:17.625980 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xmsbt/crc-debug-2m5n9" event={"ID":"d56de7fc-21b5-4ced-8bfa-6429afcd72e0","Type":"ContainerStarted","Data":"85b42fb039d635fe67a4574b59d1e44b7ab85b12d9f6109a13f8b77262a51031"} Dec 01 10:02:17 crc kubenswrapper[5004]: I1201 10:02:17.649372 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-xmsbt/crc-debug-2m5n9" podStartSLOduration=1.7461474799999999 podStartE2EDuration="30.64934541s" podCreationTimestamp="2025-12-01 10:01:47 +0000 UTC" firstStartedPulling="2025-12-01 10:01:47.715614462 +0000 UTC m=+6285.280606444" lastFinishedPulling="2025-12-01 10:02:16.618812392 +0000 UTC m=+6314.183804374" observedRunningTime="2025-12-01 10:02:17.642174396 +0000 UTC m=+6315.207166388" watchObservedRunningTime="2025-12-01 10:02:17.64934541 +0000 UTC m=+6315.214337392" Dec 01 10:03:12 crc kubenswrapper[5004]: I1201 10:03:12.233023 5004 generic.go:334] "Generic (PLEG): container finished" podID="d56de7fc-21b5-4ced-8bfa-6429afcd72e0" 
containerID="85b42fb039d635fe67a4574b59d1e44b7ab85b12d9f6109a13f8b77262a51031" exitCode=0 Dec 01 10:03:12 crc kubenswrapper[5004]: I1201 10:03:12.233120 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xmsbt/crc-debug-2m5n9" event={"ID":"d56de7fc-21b5-4ced-8bfa-6429afcd72e0","Type":"ContainerDied","Data":"85b42fb039d635fe67a4574b59d1e44b7ab85b12d9f6109a13f8b77262a51031"} Dec 01 10:03:13 crc kubenswrapper[5004]: I1201 10:03:13.379040 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xmsbt/crc-debug-2m5n9" Dec 01 10:03:13 crc kubenswrapper[5004]: I1201 10:03:13.428041 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-xmsbt/crc-debug-2m5n9"] Dec 01 10:03:13 crc kubenswrapper[5004]: I1201 10:03:13.443673 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-xmsbt/crc-debug-2m5n9"] Dec 01 10:03:13 crc kubenswrapper[5004]: I1201 10:03:13.514037 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j6hjv\" (UniqueName: \"kubernetes.io/projected/d56de7fc-21b5-4ced-8bfa-6429afcd72e0-kube-api-access-j6hjv\") pod \"d56de7fc-21b5-4ced-8bfa-6429afcd72e0\" (UID: \"d56de7fc-21b5-4ced-8bfa-6429afcd72e0\") " Dec 01 10:03:13 crc kubenswrapper[5004]: I1201 10:03:13.514153 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d56de7fc-21b5-4ced-8bfa-6429afcd72e0-host\") pod \"d56de7fc-21b5-4ced-8bfa-6429afcd72e0\" (UID: \"d56de7fc-21b5-4ced-8bfa-6429afcd72e0\") " Dec 01 10:03:13 crc kubenswrapper[5004]: I1201 10:03:13.514269 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d56de7fc-21b5-4ced-8bfa-6429afcd72e0-host" (OuterVolumeSpecName: "host") pod "d56de7fc-21b5-4ced-8bfa-6429afcd72e0" (UID: "d56de7fc-21b5-4ced-8bfa-6429afcd72e0"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:03:13 crc kubenswrapper[5004]: I1201 10:03:13.515330 5004 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d56de7fc-21b5-4ced-8bfa-6429afcd72e0-host\") on node \"crc\" DevicePath \"\"" Dec 01 10:03:13 crc kubenswrapper[5004]: I1201 10:03:13.521609 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d56de7fc-21b5-4ced-8bfa-6429afcd72e0-kube-api-access-j6hjv" (OuterVolumeSpecName: "kube-api-access-j6hjv") pod "d56de7fc-21b5-4ced-8bfa-6429afcd72e0" (UID: "d56de7fc-21b5-4ced-8bfa-6429afcd72e0"). InnerVolumeSpecName "kube-api-access-j6hjv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:03:13 crc kubenswrapper[5004]: I1201 10:03:13.617545 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j6hjv\" (UniqueName: \"kubernetes.io/projected/d56de7fc-21b5-4ced-8bfa-6429afcd72e0-kube-api-access-j6hjv\") on node \"crc\" DevicePath \"\"" Dec 01 10:03:14 crc kubenswrapper[5004]: I1201 10:03:14.258383 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f71402c202e92bf0646502cfdf6e8c1dbaf8cbc30d0d05b9e04d0c4fd63917e7" Dec 01 10:03:14 crc kubenswrapper[5004]: I1201 10:03:14.258442 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-xmsbt/crc-debug-2m5n9" Dec 01 10:03:14 crc kubenswrapper[5004]: I1201 10:03:14.609616 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-xmsbt/crc-debug-8vwjp"] Dec 01 10:03:14 crc kubenswrapper[5004]: E1201 10:03:14.610213 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d56de7fc-21b5-4ced-8bfa-6429afcd72e0" containerName="container-00" Dec 01 10:03:14 crc kubenswrapper[5004]: I1201 10:03:14.610230 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="d56de7fc-21b5-4ced-8bfa-6429afcd72e0" containerName="container-00" Dec 01 10:03:14 crc kubenswrapper[5004]: I1201 10:03:14.610493 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="d56de7fc-21b5-4ced-8bfa-6429afcd72e0" containerName="container-00" Dec 01 10:03:14 crc kubenswrapper[5004]: I1201 10:03:14.611378 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xmsbt/crc-debug-8vwjp" Dec 01 10:03:14 crc kubenswrapper[5004]: I1201 10:03:14.642834 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-px5sv\" (UniqueName: \"kubernetes.io/projected/45be9646-6ae6-45fa-9c94-ae7768d9db18-kube-api-access-px5sv\") pod \"crc-debug-8vwjp\" (UID: \"45be9646-6ae6-45fa-9c94-ae7768d9db18\") " pod="openshift-must-gather-xmsbt/crc-debug-8vwjp" Dec 01 10:03:14 crc kubenswrapper[5004]: I1201 10:03:14.643121 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/45be9646-6ae6-45fa-9c94-ae7768d9db18-host\") pod \"crc-debug-8vwjp\" (UID: \"45be9646-6ae6-45fa-9c94-ae7768d9db18\") " pod="openshift-must-gather-xmsbt/crc-debug-8vwjp" Dec 01 10:03:14 crc kubenswrapper[5004]: I1201 10:03:14.745228 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/45be9646-6ae6-45fa-9c94-ae7768d9db18-host\") pod \"crc-debug-8vwjp\" (UID: \"45be9646-6ae6-45fa-9c94-ae7768d9db18\") " pod="openshift-must-gather-xmsbt/crc-debug-8vwjp" Dec 01 10:03:14 crc kubenswrapper[5004]: I1201 10:03:14.745400 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/45be9646-6ae6-45fa-9c94-ae7768d9db18-host\") pod \"crc-debug-8vwjp\" (UID: \"45be9646-6ae6-45fa-9c94-ae7768d9db18\") " pod="openshift-must-gather-xmsbt/crc-debug-8vwjp" Dec 01 10:03:14 crc kubenswrapper[5004]: I1201 10:03:14.745436 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-px5sv\" (UniqueName: \"kubernetes.io/projected/45be9646-6ae6-45fa-9c94-ae7768d9db18-kube-api-access-px5sv\") pod \"crc-debug-8vwjp\" (UID: \"45be9646-6ae6-45fa-9c94-ae7768d9db18\") " pod="openshift-must-gather-xmsbt/crc-debug-8vwjp" Dec 01 10:03:14 crc kubenswrapper[5004]: I1201 10:03:14.764253 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-px5sv\" (UniqueName: \"kubernetes.io/projected/45be9646-6ae6-45fa-9c94-ae7768d9db18-kube-api-access-px5sv\") pod \"crc-debug-8vwjp\" (UID: \"45be9646-6ae6-45fa-9c94-ae7768d9db18\") " pod="openshift-must-gather-xmsbt/crc-debug-8vwjp" Dec 01 10:03:14 crc kubenswrapper[5004]: I1201 10:03:14.774422 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d56de7fc-21b5-4ced-8bfa-6429afcd72e0" path="/var/lib/kubelet/pods/d56de7fc-21b5-4ced-8bfa-6429afcd72e0/volumes" Dec 01 10:03:14 crc kubenswrapper[5004]: I1201 10:03:14.935067 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-xmsbt/crc-debug-8vwjp" Dec 01 10:03:15 crc kubenswrapper[5004]: I1201 10:03:15.270591 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xmsbt/crc-debug-8vwjp" event={"ID":"45be9646-6ae6-45fa-9c94-ae7768d9db18","Type":"ContainerStarted","Data":"aa186de15b2aa28115e645a43d57d491956c742c6f6f0c1523437ed5365371f5"} Dec 01 10:03:15 crc kubenswrapper[5004]: E1201 10:03:15.598525 5004 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod45be9646_6ae6_45fa_9c94_ae7768d9db18.slice/crio-e4870a8d51858be9e9d5a86f36212e4ee37167d8742ae17af56649d8b8102824.scope\": RecentStats: unable to find data in memory cache]" Dec 01 10:03:16 crc kubenswrapper[5004]: I1201 10:03:16.282043 5004 generic.go:334] "Generic (PLEG): container finished" podID="45be9646-6ae6-45fa-9c94-ae7768d9db18" containerID="e4870a8d51858be9e9d5a86f36212e4ee37167d8742ae17af56649d8b8102824" exitCode=0 Dec 01 10:03:16 crc kubenswrapper[5004]: I1201 10:03:16.282152 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xmsbt/crc-debug-8vwjp" event={"ID":"45be9646-6ae6-45fa-9c94-ae7768d9db18","Type":"ContainerDied","Data":"e4870a8d51858be9e9d5a86f36212e4ee37167d8742ae17af56649d8b8102824"} Dec 01 10:03:17 crc kubenswrapper[5004]: I1201 10:03:17.432597 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-xmsbt/crc-debug-8vwjp" Dec 01 10:03:17 crc kubenswrapper[5004]: I1201 10:03:17.613206 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/45be9646-6ae6-45fa-9c94-ae7768d9db18-host\") pod \"45be9646-6ae6-45fa-9c94-ae7768d9db18\" (UID: \"45be9646-6ae6-45fa-9c94-ae7768d9db18\") " Dec 01 10:03:17 crc kubenswrapper[5004]: I1201 10:03:17.613432 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-px5sv\" (UniqueName: \"kubernetes.io/projected/45be9646-6ae6-45fa-9c94-ae7768d9db18-kube-api-access-px5sv\") pod \"45be9646-6ae6-45fa-9c94-ae7768d9db18\" (UID: \"45be9646-6ae6-45fa-9c94-ae7768d9db18\") " Dec 01 10:03:17 crc kubenswrapper[5004]: I1201 10:03:17.613598 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/45be9646-6ae6-45fa-9c94-ae7768d9db18-host" (OuterVolumeSpecName: "host") pod "45be9646-6ae6-45fa-9c94-ae7768d9db18" (UID: "45be9646-6ae6-45fa-9c94-ae7768d9db18"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:03:17 crc kubenswrapper[5004]: I1201 10:03:17.614085 5004 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/45be9646-6ae6-45fa-9c94-ae7768d9db18-host\") on node \"crc\" DevicePath \"\"" Dec 01 10:03:17 crc kubenswrapper[5004]: I1201 10:03:17.621443 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45be9646-6ae6-45fa-9c94-ae7768d9db18-kube-api-access-px5sv" (OuterVolumeSpecName: "kube-api-access-px5sv") pod "45be9646-6ae6-45fa-9c94-ae7768d9db18" (UID: "45be9646-6ae6-45fa-9c94-ae7768d9db18"). InnerVolumeSpecName "kube-api-access-px5sv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:03:17 crc kubenswrapper[5004]: I1201 10:03:17.715785 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-px5sv\" (UniqueName: \"kubernetes.io/projected/45be9646-6ae6-45fa-9c94-ae7768d9db18-kube-api-access-px5sv\") on node \"crc\" DevicePath \"\"" Dec 01 10:03:18 crc kubenswrapper[5004]: I1201 10:03:18.301752 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xmsbt/crc-debug-8vwjp" event={"ID":"45be9646-6ae6-45fa-9c94-ae7768d9db18","Type":"ContainerDied","Data":"aa186de15b2aa28115e645a43d57d491956c742c6f6f0c1523437ed5365371f5"} Dec 01 10:03:18 crc kubenswrapper[5004]: I1201 10:03:18.302200 5004 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aa186de15b2aa28115e645a43d57d491956c742c6f6f0c1523437ed5365371f5" Dec 01 10:03:18 crc kubenswrapper[5004]: I1201 10:03:18.302402 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xmsbt/crc-debug-8vwjp" Dec 01 10:03:18 crc kubenswrapper[5004]: I1201 10:03:18.794824 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-xmsbt/crc-debug-8vwjp"] Dec 01 10:03:18 crc kubenswrapper[5004]: I1201 10:03:18.808693 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-xmsbt/crc-debug-8vwjp"] Dec 01 10:03:19 crc kubenswrapper[5004]: I1201 10:03:19.953873 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-xmsbt/crc-debug-w8d9z"] Dec 01 10:03:19 crc kubenswrapper[5004]: E1201 10:03:19.954800 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45be9646-6ae6-45fa-9c94-ae7768d9db18" containerName="container-00" Dec 01 10:03:19 crc kubenswrapper[5004]: I1201 10:03:19.954818 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="45be9646-6ae6-45fa-9c94-ae7768d9db18" containerName="container-00" Dec 01 10:03:19 crc 
kubenswrapper[5004]: I1201 10:03:19.955106 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="45be9646-6ae6-45fa-9c94-ae7768d9db18" containerName="container-00" Dec 01 10:03:19 crc kubenswrapper[5004]: I1201 10:03:19.956255 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xmsbt/crc-debug-w8d9z" Dec 01 10:03:20 crc kubenswrapper[5004]: I1201 10:03:20.101219 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxrg9\" (UniqueName: \"kubernetes.io/projected/2687c749-a7bc-401c-923b-14e9fafd6db1-kube-api-access-mxrg9\") pod \"crc-debug-w8d9z\" (UID: \"2687c749-a7bc-401c-923b-14e9fafd6db1\") " pod="openshift-must-gather-xmsbt/crc-debug-w8d9z" Dec 01 10:03:20 crc kubenswrapper[5004]: I1201 10:03:20.101364 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2687c749-a7bc-401c-923b-14e9fafd6db1-host\") pod \"crc-debug-w8d9z\" (UID: \"2687c749-a7bc-401c-923b-14e9fafd6db1\") " pod="openshift-must-gather-xmsbt/crc-debug-w8d9z" Dec 01 10:03:20 crc kubenswrapper[5004]: I1201 10:03:20.203470 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2687c749-a7bc-401c-923b-14e9fafd6db1-host\") pod \"crc-debug-w8d9z\" (UID: \"2687c749-a7bc-401c-923b-14e9fafd6db1\") " pod="openshift-must-gather-xmsbt/crc-debug-w8d9z" Dec 01 10:03:20 crc kubenswrapper[5004]: I1201 10:03:20.203626 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2687c749-a7bc-401c-923b-14e9fafd6db1-host\") pod \"crc-debug-w8d9z\" (UID: \"2687c749-a7bc-401c-923b-14e9fafd6db1\") " pod="openshift-must-gather-xmsbt/crc-debug-w8d9z" Dec 01 10:03:20 crc kubenswrapper[5004]: I1201 10:03:20.203822 5004 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-mxrg9\" (UniqueName: \"kubernetes.io/projected/2687c749-a7bc-401c-923b-14e9fafd6db1-kube-api-access-mxrg9\") pod \"crc-debug-w8d9z\" (UID: \"2687c749-a7bc-401c-923b-14e9fafd6db1\") " pod="openshift-must-gather-xmsbt/crc-debug-w8d9z" Dec 01 10:03:20 crc kubenswrapper[5004]: I1201 10:03:20.224273 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxrg9\" (UniqueName: \"kubernetes.io/projected/2687c749-a7bc-401c-923b-14e9fafd6db1-kube-api-access-mxrg9\") pod \"crc-debug-w8d9z\" (UID: \"2687c749-a7bc-401c-923b-14e9fafd6db1\") " pod="openshift-must-gather-xmsbt/crc-debug-w8d9z" Dec 01 10:03:20 crc kubenswrapper[5004]: I1201 10:03:20.276947 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xmsbt/crc-debug-w8d9z" Dec 01 10:03:21 crc kubenswrapper[5004]: I1201 10:03:21.096730 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45be9646-6ae6-45fa-9c94-ae7768d9db18" path="/var/lib/kubelet/pods/45be9646-6ae6-45fa-9c94-ae7768d9db18/volumes" Dec 01 10:03:21 crc kubenswrapper[5004]: I1201 10:03:21.344885 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xmsbt/crc-debug-w8d9z" event={"ID":"2687c749-a7bc-401c-923b-14e9fafd6db1","Type":"ContainerStarted","Data":"e64f2f1769005949755460029d869b2dd11105f04f6ae137c64577842aa2c5ca"} Dec 01 10:03:22 crc kubenswrapper[5004]: I1201 10:03:22.356956 5004 generic.go:334] "Generic (PLEG): container finished" podID="2687c749-a7bc-401c-923b-14e9fafd6db1" containerID="d0827187c893632f0ff8b1101ee73ea461e9acd0bcd43243e4db43c2dccd5ef6" exitCode=0 Dec 01 10:03:22 crc kubenswrapper[5004]: I1201 10:03:22.357029 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xmsbt/crc-debug-w8d9z" 
event={"ID":"2687c749-a7bc-401c-923b-14e9fafd6db1","Type":"ContainerDied","Data":"d0827187c893632f0ff8b1101ee73ea461e9acd0bcd43243e4db43c2dccd5ef6"} Dec 01 10:03:22 crc kubenswrapper[5004]: I1201 10:03:22.400315 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-xmsbt/crc-debug-w8d9z"] Dec 01 10:03:22 crc kubenswrapper[5004]: I1201 10:03:22.410997 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-xmsbt/crc-debug-w8d9z"] Dec 01 10:03:23 crc kubenswrapper[5004]: I1201 10:03:23.509358 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xmsbt/crc-debug-w8d9z" Dec 01 10:03:23 crc kubenswrapper[5004]: I1201 10:03:23.619933 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2687c749-a7bc-401c-923b-14e9fafd6db1-host\") pod \"2687c749-a7bc-401c-923b-14e9fafd6db1\" (UID: \"2687c749-a7bc-401c-923b-14e9fafd6db1\") " Dec 01 10:03:23 crc kubenswrapper[5004]: I1201 10:03:23.620046 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2687c749-a7bc-401c-923b-14e9fafd6db1-host" (OuterVolumeSpecName: "host") pod "2687c749-a7bc-401c-923b-14e9fafd6db1" (UID: "2687c749-a7bc-401c-923b-14e9fafd6db1"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:03:23 crc kubenswrapper[5004]: I1201 10:03:23.620214 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mxrg9\" (UniqueName: \"kubernetes.io/projected/2687c749-a7bc-401c-923b-14e9fafd6db1-kube-api-access-mxrg9\") pod \"2687c749-a7bc-401c-923b-14e9fafd6db1\" (UID: \"2687c749-a7bc-401c-923b-14e9fafd6db1\") " Dec 01 10:03:23 crc kubenswrapper[5004]: I1201 10:03:23.620773 5004 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2687c749-a7bc-401c-923b-14e9fafd6db1-host\") on node \"crc\" DevicePath \"\"" Dec 01 10:03:23 crc kubenswrapper[5004]: I1201 10:03:23.627918 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2687c749-a7bc-401c-923b-14e9fafd6db1-kube-api-access-mxrg9" (OuterVolumeSpecName: "kube-api-access-mxrg9") pod "2687c749-a7bc-401c-923b-14e9fafd6db1" (UID: "2687c749-a7bc-401c-923b-14e9fafd6db1"). InnerVolumeSpecName "kube-api-access-mxrg9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:03:23 crc kubenswrapper[5004]: I1201 10:03:23.723322 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mxrg9\" (UniqueName: \"kubernetes.io/projected/2687c749-a7bc-401c-923b-14e9fafd6db1-kube-api-access-mxrg9\") on node \"crc\" DevicePath \"\"" Dec 01 10:03:24 crc kubenswrapper[5004]: I1201 10:03:24.380351 5004 scope.go:117] "RemoveContainer" containerID="d0827187c893632f0ff8b1101ee73ea461e9acd0bcd43243e4db43c2dccd5ef6" Dec 01 10:03:24 crc kubenswrapper[5004]: I1201 10:03:24.380393 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-xmsbt/crc-debug-w8d9z" Dec 01 10:03:24 crc kubenswrapper[5004]: I1201 10:03:24.773164 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2687c749-a7bc-401c-923b-14e9fafd6db1" path="/var/lib/kubelet/pods/2687c749-a7bc-401c-923b-14e9fafd6db1/volumes" Dec 01 10:03:49 crc kubenswrapper[5004]: I1201 10:03:49.186447 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_2920a19e-815a-4972-83c6-7c85f961f88a/aodh-api/0.log" Dec 01 10:03:49 crc kubenswrapper[5004]: I1201 10:03:49.479162 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_2920a19e-815a-4972-83c6-7c85f961f88a/aodh-evaluator/0.log" Dec 01 10:03:49 crc kubenswrapper[5004]: I1201 10:03:49.533574 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_2920a19e-815a-4972-83c6-7c85f961f88a/aodh-listener/0.log" Dec 01 10:03:49 crc kubenswrapper[5004]: I1201 10:03:49.546583 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_2920a19e-815a-4972-83c6-7c85f961f88a/aodh-notifier/0.log" Dec 01 10:03:49 crc kubenswrapper[5004]: I1201 10:03:49.799920 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-585d58d86-6cl4z_ce432d6a-76cf-408f-88e5-de133c8a555b/barbican-api/0.log" Dec 01 10:03:49 crc kubenswrapper[5004]: I1201 10:03:49.832688 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-585d58d86-6cl4z_ce432d6a-76cf-408f-88e5-de133c8a555b/barbican-api-log/0.log" Dec 01 10:03:49 crc kubenswrapper[5004]: I1201 10:03:49.920019 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-58bccc5494-trhd5_c77ef8f3-6312-48ca-9e64-49e0db910168/barbican-keystone-listener/0.log" Dec 01 10:03:50 crc kubenswrapper[5004]: I1201 10:03:50.174381 5004 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_barbican-worker-58d7b4cb4c-z6xck_b463310a-8c0e-462e-a746-94a664a21ebe/barbican-worker/0.log" Dec 01 10:03:50 crc kubenswrapper[5004]: I1201 10:03:50.206056 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-58d7b4cb4c-z6xck_b463310a-8c0e-462e-a746-94a664a21ebe/barbican-worker-log/0.log" Dec 01 10:03:50 crc kubenswrapper[5004]: I1201 10:03:50.299427 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-58bccc5494-trhd5_c77ef8f3-6312-48ca-9e64-49e0db910168/barbican-keystone-listener-log/0.log" Dec 01 10:03:50 crc kubenswrapper[5004]: I1201 10:03:50.492794 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-l7cvw_76e7add3-c357-40c8-b77c-dd408f7315d2/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 10:03:50 crc kubenswrapper[5004]: I1201 10:03:50.718114 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_5cdcf7de-1cb2-4071-b7e9-cdf08f6ff5c7/ceilometer-central-agent/0.log" Dec 01 10:03:50 crc kubenswrapper[5004]: I1201 10:03:50.729720 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_5cdcf7de-1cb2-4071-b7e9-cdf08f6ff5c7/proxy-httpd/0.log" Dec 01 10:03:50 crc kubenswrapper[5004]: I1201 10:03:50.772223 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_5cdcf7de-1cb2-4071-b7e9-cdf08f6ff5c7/sg-core/0.log" Dec 01 10:03:50 crc kubenswrapper[5004]: I1201 10:03:50.827956 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_5cdcf7de-1cb2-4071-b7e9-cdf08f6ff5c7/ceilometer-notification-agent/0.log" Dec 01 10:03:51 crc kubenswrapper[5004]: I1201 10:03:51.184169 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_c809558d-b402-4195-b03a-dbc3a1b96707/cinder-api-log/0.log" Dec 01 10:03:51 crc kubenswrapper[5004]: I1201 
10:03:51.230539 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_c809558d-b402-4195-b03a-dbc3a1b96707/cinder-api/0.log" Dec 01 10:03:51 crc kubenswrapper[5004]: I1201 10:03:51.415793 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_2ca33de3-6acb-4792-acd0-47790a8d0ee6/cinder-scheduler/0.log" Dec 01 10:03:51 crc kubenswrapper[5004]: I1201 10:03:51.476933 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_2ca33de3-6acb-4792-acd0-47790a8d0ee6/probe/0.log" Dec 01 10:03:51 crc kubenswrapper[5004]: I1201 10:03:51.601855 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-hfxnb_54a59a5d-6353-4420-9600-3fdfbaa42595/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 10:03:51 crc kubenswrapper[5004]: I1201 10:03:51.727092 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-p7x8j_604b379e-d0f5-469b-abcd-6c9717007b3f/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 10:03:51 crc kubenswrapper[5004]: I1201 10:03:51.863736 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5596c69fcc-l9pb7_7d1249df-92dc-4f52-8db5-8088870a934c/init/0.log" Dec 01 10:03:52 crc kubenswrapper[5004]: I1201 10:03:52.043615 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5596c69fcc-l9pb7_7d1249df-92dc-4f52-8db5-8088870a934c/init/0.log" Dec 01 10:03:52 crc kubenswrapper[5004]: I1201 10:03:52.148804 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5596c69fcc-l9pb7_7d1249df-92dc-4f52-8db5-8088870a934c/dnsmasq-dns/0.log" Dec 01 10:03:52 crc kubenswrapper[5004]: I1201 10:03:52.185377 5004 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-6frqs_1d1fef94-adb6-4276-8448-af6b16e5d9ff/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 10:03:52 crc kubenswrapper[5004]: I1201 10:03:52.506767 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_bbd9e9ac-7391-43f2-9f0c-a10d4dd9f81a/glance-log/0.log" Dec 01 10:03:52 crc kubenswrapper[5004]: I1201 10:03:52.538099 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_bbd9e9ac-7391-43f2-9f0c-a10d4dd9f81a/glance-httpd/0.log" Dec 01 10:03:52 crc kubenswrapper[5004]: I1201 10:03:52.621215 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_2394661d-62f2-4367-bd1e-662a248df799/glance-httpd/0.log" Dec 01 10:03:52 crc kubenswrapper[5004]: I1201 10:03:52.758957 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_2394661d-62f2-4367-bd1e-662a248df799/glance-log/0.log" Dec 01 10:03:53 crc kubenswrapper[5004]: I1201 10:03:53.266963 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-engine-74fd44f8dd-bd4pt_1c0efc3a-6559-4f89-8d20-c11f7b4f291b/heat-engine/0.log" Dec 01 10:03:53 crc kubenswrapper[5004]: I1201 10:03:53.493799 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-snd4j_8edacfd3-e002-434d-bd0a-15400e279c58/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 10:03:53 crc kubenswrapper[5004]: I1201 10:03:53.535506 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-api-697fddcf97-nrt68_7a6dfa68-ad11-4ad2-8596-951823d970cc/heat-api/0.log" Dec 01 10:03:53 crc kubenswrapper[5004]: I1201 10:03:53.690392 5004 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_heat-cfnapi-54fdd54654-fhlqp_8a88be62-a3b8-4a6c-b226-e0e203d4c022/heat-cfnapi/0.log" Dec 01 10:03:53 crc kubenswrapper[5004]: I1201 10:03:53.778254 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-r8g28_66d71651-3d44-4bef-8259-19908c843f85/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 10:03:54 crc kubenswrapper[5004]: I1201 10:03:54.063162 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29409661-kbzbs_d92bc2f4-8519-45f5-bf8c-f825ce955687/keystone-cron/0.log" Dec 01 10:03:54 crc kubenswrapper[5004]: I1201 10:03:54.074102 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29409721-qqnhs_7765b30a-ec51-4fb4-982a-b47fe84e37a7/keystone-cron/0.log" Dec 01 10:03:54 crc kubenswrapper[5004]: I1201 10:03:54.257468 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-6d7ffc7f79-7hlcz_78648b2d-8d56-42a1-8152-80e6e1a1b201/keystone-api/0.log" Dec 01 10:03:54 crc kubenswrapper[5004]: I1201 10:03:54.281458 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_7e571289-bc5b-4304-bca2-994e61086e68/kube-state-metrics/0.log" Dec 01 10:03:54 crc kubenswrapper[5004]: I1201 10:03:54.405393 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-2p8kr_ebe1a3e0-aa10-4437-adea-bcb08ba1e7ce/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 10:03:54 crc kubenswrapper[5004]: I1201 10:03:54.703770 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_logging-edpm-deployment-openstack-edpm-ipam-zrjqn_39f7f17f-adaa-4dd0-a59a-b40406b85353/logging-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 10:03:54 crc kubenswrapper[5004]: I1201 10:03:54.903721 5004 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_mysqld-exporter-0_4941cf08-e742-4085-a85d-5d64305aec32/mysqld-exporter/0.log" Dec 01 10:03:55 crc kubenswrapper[5004]: I1201 10:03:55.321170 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-cb845f59f-m4f5q_8d456c05-b8d3-43ca-ae93-9b0a5b111296/neutron-httpd/0.log" Dec 01 10:03:55 crc kubenswrapper[5004]: I1201 10:03:55.345757 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-kt6tb_c7804cf5-e266-43a4-a6da-6e0686d4897c/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 10:03:55 crc kubenswrapper[5004]: I1201 10:03:55.367309 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-cb845f59f-m4f5q_8d456c05-b8d3-43ca-ae93-9b0a5b111296/neutron-api/0.log" Dec 01 10:03:55 crc kubenswrapper[5004]: I1201 10:03:55.982700 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_87fd3c52-cd06-4af8-9399-7ed081bc8799/nova-cell0-conductor-conductor/0.log" Dec 01 10:03:56 crc kubenswrapper[5004]: I1201 10:03:56.283639 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_6ac2f2ad-2604-4972-896b-07e23c74ac8d/nova-cell1-conductor-conductor/0.log" Dec 01 10:03:56 crc kubenswrapper[5004]: I1201 10:03:56.411856 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_5481ba31-c298-4eb4-ab74-e4fd53d46316/nova-api-log/0.log" Dec 01 10:03:56 crc kubenswrapper[5004]: I1201 10:03:56.645825 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_95dcb289-4473-451f-bd94-18d680b4f4f0/nova-cell1-novncproxy-novncproxy/0.log" Dec 01 10:03:56 crc kubenswrapper[5004]: I1201 10:03:56.715487 5004 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-65pft_5b434b81-6f6d-48f1-b569-4a31ef7abbec/nova-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 10:03:56 crc kubenswrapper[5004]: I1201 10:03:56.852537 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_5481ba31-c298-4eb4-ab74-e4fd53d46316/nova-api-api/0.log" Dec 01 10:03:56 crc kubenswrapper[5004]: I1201 10:03:56.966203 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_c51ed16a-b6da-4bc9-9d47-18ec809ba124/nova-metadata-log/0.log" Dec 01 10:03:57 crc kubenswrapper[5004]: I1201 10:03:57.295465 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_d68b7ff7-f7f8-442f-870d-94f82b0842c1/nova-scheduler-scheduler/0.log" Dec 01 10:03:57 crc kubenswrapper[5004]: I1201 10:03:57.397062 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_bb30b7a7-42e1-421b-8673-7f3c8f5cfae3/mysql-bootstrap/0.log" Dec 01 10:03:57 crc kubenswrapper[5004]: I1201 10:03:57.578613 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_bb30b7a7-42e1-421b-8673-7f3c8f5cfae3/galera/0.log" Dec 01 10:03:57 crc kubenswrapper[5004]: I1201 10:03:57.579259 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_bb30b7a7-42e1-421b-8673-7f3c8f5cfae3/mysql-bootstrap/0.log" Dec 01 10:03:58 crc kubenswrapper[5004]: I1201 10:03:58.110936 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_04a6dd3a-f297-40b9-b480-0239383b9460/mysql-bootstrap/0.log" Dec 01 10:03:58 crc kubenswrapper[5004]: I1201 10:03:58.293402 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_04a6dd3a-f297-40b9-b480-0239383b9460/mysql-bootstrap/0.log" Dec 01 10:03:58 crc kubenswrapper[5004]: I1201 10:03:58.336587 5004 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_openstack-galera-0_04a6dd3a-f297-40b9-b480-0239383b9460/galera/0.log" Dec 01 10:03:58 crc kubenswrapper[5004]: I1201 10:03:58.524747 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_a4252410-0df2-4381-97da-772a062f3f8b/openstackclient/0.log" Dec 01 10:03:58 crc kubenswrapper[5004]: I1201 10:03:58.601535 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-tbwk4_cfb808bc-b59f-492b-a3aa-d817263501a5/openstack-network-exporter/0.log" Dec 01 10:03:58 crc kubenswrapper[5004]: I1201 10:03:58.832402 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-8qrgn_0b64b7d6-2fe8-43b0-9632-84e70a749fe9/ovsdb-server-init/0.log" Dec 01 10:03:59 crc kubenswrapper[5004]: I1201 10:03:59.015402 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-8qrgn_0b64b7d6-2fe8-43b0-9632-84e70a749fe9/ovsdb-server-init/0.log" Dec 01 10:03:59 crc kubenswrapper[5004]: I1201 10:03:59.085418 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-8qrgn_0b64b7d6-2fe8-43b0-9632-84e70a749fe9/ovs-vswitchd/0.log" Dec 01 10:03:59 crc kubenswrapper[5004]: I1201 10:03:59.104047 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-8qrgn_0b64b7d6-2fe8-43b0-9632-84e70a749fe9/ovsdb-server/0.log" Dec 01 10:03:59 crc kubenswrapper[5004]: I1201 10:03:59.341910 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-r8x2b_effd853b-0b95-4749-8119-88fcfaf8b0c0/ovn-controller/0.log" Dec 01 10:03:59 crc kubenswrapper[5004]: I1201 10:03:59.577335 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-fpw2q_b127ecf6-67a1-486c-b90a-147c4953f2d4/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 10:03:59 crc kubenswrapper[5004]: I1201 10:03:59.642043 5004 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_c51ed16a-b6da-4bc9-9d47-18ec809ba124/nova-metadata-metadata/0.log" Dec 01 10:03:59 crc kubenswrapper[5004]: I1201 10:03:59.659690 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_e7d0d76d-60c9-4ba6-b29b-8c6bc8a4d499/openstack-network-exporter/0.log" Dec 01 10:03:59 crc kubenswrapper[5004]: I1201 10:03:59.833816 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_e7d0d76d-60c9-4ba6-b29b-8c6bc8a4d499/ovn-northd/0.log" Dec 01 10:03:59 crc kubenswrapper[5004]: I1201 10:03:59.837523 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_c5ea92c9-9b0a-473a-872f-a78f27946432/openstack-network-exporter/0.log" Dec 01 10:03:59 crc kubenswrapper[5004]: I1201 10:03:59.908071 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_c5ea92c9-9b0a-473a-872f-a78f27946432/ovsdbserver-nb/0.log" Dec 01 10:04:00 crc kubenswrapper[5004]: I1201 10:04:00.150390 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_6362259b-bac4-4df3-ad0c-d76511731aae/openstack-network-exporter/0.log" Dec 01 10:04:00 crc kubenswrapper[5004]: I1201 10:04:00.337039 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_6362259b-bac4-4df3-ad0c-d76511731aae/ovsdbserver-sb/0.log" Dec 01 10:04:00 crc kubenswrapper[5004]: I1201 10:04:00.671391 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_b0dfa0be-7482-49a2-adc0-425cedd5c597/init-config-reloader/0.log" Dec 01 10:04:00 crc kubenswrapper[5004]: I1201 10:04:00.852653 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-695797f65d-5b4rw_e3715c3c-b54f-4ff0-808a-51dd69614417/placement-api/0.log" Dec 01 10:04:00 crc kubenswrapper[5004]: I1201 10:04:00.896793 5004 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_placement-695797f65d-5b4rw_e3715c3c-b54f-4ff0-808a-51dd69614417/placement-log/0.log" Dec 01 10:04:00 crc kubenswrapper[5004]: I1201 10:04:00.961308 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_b0dfa0be-7482-49a2-adc0-425cedd5c597/init-config-reloader/0.log" Dec 01 10:04:00 crc kubenswrapper[5004]: I1201 10:04:00.997882 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_b0dfa0be-7482-49a2-adc0-425cedd5c597/config-reloader/0.log" Dec 01 10:04:01 crc kubenswrapper[5004]: I1201 10:04:01.336089 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_b0dfa0be-7482-49a2-adc0-425cedd5c597/prometheus/0.log" Dec 01 10:04:01 crc kubenswrapper[5004]: I1201 10:04:01.370499 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_b0dfa0be-7482-49a2-adc0-425cedd5c597/thanos-sidecar/0.log" Dec 01 10:04:01 crc kubenswrapper[5004]: I1201 10:04:01.491904 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_56831219-9428-45a8-8888-869bc645d080/setup-container/0.log" Dec 01 10:04:01 crc kubenswrapper[5004]: I1201 10:04:01.677108 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_56831219-9428-45a8-8888-869bc645d080/setup-container/0.log" Dec 01 10:04:01 crc kubenswrapper[5004]: I1201 10:04:01.684028 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_56831219-9428-45a8-8888-869bc645d080/rabbitmq/0.log" Dec 01 10:04:01 crc kubenswrapper[5004]: I1201 10:04:01.757477 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_af2896c5-aa9f-47d7-ba02-ebea4bbd29ed/setup-container/0.log" Dec 01 10:04:02 crc kubenswrapper[5004]: I1201 10:04:02.017121 5004 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-shf6w_cf82dc00-6575-4d77-9cc5-00fddb8957e0/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 10:04:02 crc kubenswrapper[5004]: I1201 10:04:02.035373 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_af2896c5-aa9f-47d7-ba02-ebea4bbd29ed/setup-container/0.log" Dec 01 10:04:02 crc kubenswrapper[5004]: I1201 10:04:02.119663 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_af2896c5-aa9f-47d7-ba02-ebea4bbd29ed/rabbitmq/0.log" Dec 01 10:04:02 crc kubenswrapper[5004]: I1201 10:04:02.336420 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-n2xnj_a7cd8c79-5eb5-4883-b64c-f83dff955fd0/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 10:04:02 crc kubenswrapper[5004]: I1201 10:04:02.492289 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-8ttnq_5c56494f-200d-4f94-ad70-1f53e7b5d1fe/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 10:04:02 crc kubenswrapper[5004]: I1201 10:04:02.636014 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-g5zz7_33c18f99-b111-4d1d-bcc0-003f3a58deee/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 10:04:02 crc kubenswrapper[5004]: I1201 10:04:02.877878 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-jhdl8_47faa1f0-89dc-4cf2-a026-b425da197aaf/ssh-known-hosts-edpm-deployment/0.log" Dec 01 10:04:03 crc kubenswrapper[5004]: I1201 10:04:03.127369 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-7977b5fddc-g9fhv_2842f079-806d-4bdd-8218-db4b2b000259/proxy-server/0.log" Dec 01 10:04:03 crc kubenswrapper[5004]: I1201 10:04:03.238605 5004 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-clktj_e5a97177-1085-4eab-a646-c3b849dc73b5/swift-ring-rebalance/0.log" Dec 01 10:04:03 crc kubenswrapper[5004]: I1201 10:04:03.312192 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-7977b5fddc-g9fhv_2842f079-806d-4bdd-8218-db4b2b000259/proxy-httpd/0.log" Dec 01 10:04:03 crc kubenswrapper[5004]: I1201 10:04:03.484650 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_64792f42-dd08-4537-bce9-a632e644cf5a/account-auditor/0.log" Dec 01 10:04:03 crc kubenswrapper[5004]: I1201 10:04:03.526096 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_64792f42-dd08-4537-bce9-a632e644cf5a/account-reaper/0.log" Dec 01 10:04:03 crc kubenswrapper[5004]: I1201 10:04:03.592926 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_64792f42-dd08-4537-bce9-a632e644cf5a/account-replicator/0.log" Dec 01 10:04:03 crc kubenswrapper[5004]: I1201 10:04:03.730868 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_64792f42-dd08-4537-bce9-a632e644cf5a/account-server/0.log" Dec 01 10:04:03 crc kubenswrapper[5004]: I1201 10:04:03.745480 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_64792f42-dd08-4537-bce9-a632e644cf5a/container-auditor/0.log" Dec 01 10:04:03 crc kubenswrapper[5004]: I1201 10:04:03.976503 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_64792f42-dd08-4537-bce9-a632e644cf5a/container-updater/0.log" Dec 01 10:04:03 crc kubenswrapper[5004]: I1201 10:04:03.992687 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_64792f42-dd08-4537-bce9-a632e644cf5a/container-server/0.log" Dec 01 10:04:04 crc kubenswrapper[5004]: I1201 10:04:04.037741 5004 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_64792f42-dd08-4537-bce9-a632e644cf5a/container-replicator/0.log" Dec 01 10:04:04 crc kubenswrapper[5004]: I1201 10:04:04.042979 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_64792f42-dd08-4537-bce9-a632e644cf5a/object-auditor/0.log" Dec 01 10:04:04 crc kubenswrapper[5004]: I1201 10:04:04.228977 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_64792f42-dd08-4537-bce9-a632e644cf5a/object-expirer/0.log" Dec 01 10:04:04 crc kubenswrapper[5004]: I1201 10:04:04.496547 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_64792f42-dd08-4537-bce9-a632e644cf5a/object-server/0.log" Dec 01 10:04:04 crc kubenswrapper[5004]: I1201 10:04:04.533091 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_64792f42-dd08-4537-bce9-a632e644cf5a/object-replicator/0.log" Dec 01 10:04:04 crc kubenswrapper[5004]: I1201 10:04:04.626884 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_64792f42-dd08-4537-bce9-a632e644cf5a/object-updater/0.log" Dec 01 10:04:04 crc kubenswrapper[5004]: I1201 10:04:04.744594 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_64792f42-dd08-4537-bce9-a632e644cf5a/swift-recon-cron/0.log" Dec 01 10:04:04 crc kubenswrapper[5004]: I1201 10:04:04.801680 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_64792f42-dd08-4537-bce9-a632e644cf5a/rsync/0.log" Dec 01 10:04:04 crc kubenswrapper[5004]: I1201 10:04:04.908315 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-6h4fj_194e6eb7-a03b-4482-a871-368a20922a87/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 10:04:05 crc kubenswrapper[5004]: I1201 10:04:05.160928 5004 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_telemetry-power-monitoring-edpm-deployment-openstack-edpm-rs57q_6a233136-7248-4994-a3d6-0108bbf72fef/telemetry-power-monitoring-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 10:04:05 crc kubenswrapper[5004]: I1201 10:04:05.478333 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_04b163a0-721a-44ce-b030-1aad4f09cbcb/test-operator-logs-container/0.log" Dec 01 10:04:05 crc kubenswrapper[5004]: I1201 10:04:05.688953 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-lw9tq_995d1b4f-5178-4522-989d-ba19dbc7316f/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 10:04:06 crc kubenswrapper[5004]: I1201 10:04:06.402181 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_b624b6f4-e294-427a-94ac-358b5be6897b/tempest-tests-tempest-tests-runner/0.log" Dec 01 10:04:08 crc kubenswrapper[5004]: I1201 10:04:08.728766 5004 patch_prober.go:28] interesting pod/machine-config-daemon-fvdgt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:04:08 crc kubenswrapper[5004]: I1201 10:04:08.729299 5004 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:04:14 crc kubenswrapper[5004]: I1201 10:04:14.661245 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_08dfc50d-80b8-4885-826d-4a8314b46234/memcached/0.log" Dec 01 10:04:35 crc kubenswrapper[5004]: I1201 
10:04:35.824386 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_232b41aac13e001dc857b9795d2b854afae448fcc1728af84c1ef8a48428nq7_911b89dd-ee3e-4349-b52d-36e0199aabba/util/0.log" Dec 01 10:04:36 crc kubenswrapper[5004]: I1201 10:04:36.140838 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_232b41aac13e001dc857b9795d2b854afae448fcc1728af84c1ef8a48428nq7_911b89dd-ee3e-4349-b52d-36e0199aabba/pull/0.log" Dec 01 10:04:36 crc kubenswrapper[5004]: I1201 10:04:36.141441 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_232b41aac13e001dc857b9795d2b854afae448fcc1728af84c1ef8a48428nq7_911b89dd-ee3e-4349-b52d-36e0199aabba/pull/0.log" Dec 01 10:04:36 crc kubenswrapper[5004]: I1201 10:04:36.174441 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_232b41aac13e001dc857b9795d2b854afae448fcc1728af84c1ef8a48428nq7_911b89dd-ee3e-4349-b52d-36e0199aabba/util/0.log" Dec 01 10:04:36 crc kubenswrapper[5004]: I1201 10:04:36.317940 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_232b41aac13e001dc857b9795d2b854afae448fcc1728af84c1ef8a48428nq7_911b89dd-ee3e-4349-b52d-36e0199aabba/pull/0.log" Dec 01 10:04:36 crc kubenswrapper[5004]: I1201 10:04:36.319157 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_232b41aac13e001dc857b9795d2b854afae448fcc1728af84c1ef8a48428nq7_911b89dd-ee3e-4349-b52d-36e0199aabba/util/0.log" Dec 01 10:04:36 crc kubenswrapper[5004]: I1201 10:04:36.336836 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_232b41aac13e001dc857b9795d2b854afae448fcc1728af84c1ef8a48428nq7_911b89dd-ee3e-4349-b52d-36e0199aabba/extract/0.log" Dec 01 10:04:36 crc kubenswrapper[5004]: I1201 10:04:36.580339 5004 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-rp7vl_9bca89aa-3367-4bff-b070-c191fcae5f2f/manager/0.log" Dec 01 10:04:36 crc kubenswrapper[5004]: I1201 10:04:36.584142 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-rp7vl_9bca89aa-3367-4bff-b070-c191fcae5f2f/kube-rbac-proxy/0.log" Dec 01 10:04:36 crc kubenswrapper[5004]: I1201 10:04:36.620949 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-6dpk9_166912d9-e0b0-40b8-8e26-9c86183d7952/kube-rbac-proxy/0.log" Dec 01 10:04:36 crc kubenswrapper[5004]: I1201 10:04:36.826842 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-lfs45_c6e6ae59-9f58-4856-b200-d42d1e1e23ed/manager/0.log" Dec 01 10:04:36 crc kubenswrapper[5004]: I1201 10:04:36.866047 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-lfs45_c6e6ae59-9f58-4856-b200-d42d1e1e23ed/kube-rbac-proxy/0.log" Dec 01 10:04:36 crc kubenswrapper[5004]: I1201 10:04:36.871233 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-6dpk9_166912d9-e0b0-40b8-8e26-9c86183d7952/manager/0.log" Dec 01 10:04:37 crc kubenswrapper[5004]: I1201 10:04:37.025400 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-668d9c48b9-5c6kq_67dcdfb2-70ae-4444-b271-dd83dcb37756/kube-rbac-proxy/0.log" Dec 01 10:04:37 crc kubenswrapper[5004]: I1201 10:04:37.179364 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-668d9c48b9-5c6kq_67dcdfb2-70ae-4444-b271-dd83dcb37756/manager/0.log" Dec 01 10:04:37 crc kubenswrapper[5004]: I1201 
10:04:37.270772 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-zgpn6_b931f322-0c4e-4019-a11e-616c80d1e5f1/kube-rbac-proxy/0.log" Dec 01 10:04:37 crc kubenswrapper[5004]: I1201 10:04:37.348902 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-zgpn6_b931f322-0c4e-4019-a11e-616c80d1e5f1/manager/0.log" Dec 01 10:04:37 crc kubenswrapper[5004]: I1201 10:04:37.438593 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-78524_aa866b8d-174b-4fab-a55d-cc2bcdef5526/kube-rbac-proxy/0.log" Dec 01 10:04:37 crc kubenswrapper[5004]: I1201 10:04:37.769510 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-78524_aa866b8d-174b-4fab-a55d-cc2bcdef5526/manager/0.log" Dec 01 10:04:37 crc kubenswrapper[5004]: I1201 10:04:37.829397 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-hzgf4_f87b36f7-2558-4823-85fc-6b6e9090b1d7/kube-rbac-proxy/0.log" Dec 01 10:04:38 crc kubenswrapper[5004]: I1201 10:04:38.062404 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-hzgf4_f87b36f7-2558-4823-85fc-6b6e9090b1d7/manager/0.log" Dec 01 10:04:38 crc kubenswrapper[5004]: I1201 10:04:38.070725 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-vv9hf_3521a18b-f34e-4107-9e34-048a9827a2fe/kube-rbac-proxy/0.log" Dec 01 10:04:38 crc kubenswrapper[5004]: I1201 10:04:38.114772 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-vv9hf_3521a18b-f34e-4107-9e34-048a9827a2fe/manager/0.log" Dec 01 
10:04:38 crc kubenswrapper[5004]: I1201 10:04:38.284960 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-546d4bdf48-59dzd_cb3f7f28-c99e-44d4-b534-83889924b531/kube-rbac-proxy/0.log" Dec 01 10:04:38 crc kubenswrapper[5004]: I1201 10:04:38.360984 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-546d4bdf48-59dzd_cb3f7f28-c99e-44d4-b534-83889924b531/manager/0.log" Dec 01 10:04:38 crc kubenswrapper[5004]: I1201 10:04:38.483917 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6546668bfd-6l4k4_8d8f2a7f-1da5-45ca-8dd0-1fa87e3d46fe/kube-rbac-proxy/0.log" Dec 01 10:04:38 crc kubenswrapper[5004]: I1201 10:04:38.521122 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6546668bfd-6l4k4_8d8f2a7f-1da5-45ca-8dd0-1fa87e3d46fe/manager/0.log" Dec 01 10:04:38 crc kubenswrapper[5004]: I1201 10:04:38.585273 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-55g57_24fb1ec9-065a-464d-9797-8020c38f81e8/kube-rbac-proxy/0.log" Dec 01 10:04:38 crc kubenswrapper[5004]: I1201 10:04:38.710847 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-55g57_24fb1ec9-065a-464d-9797-8020c38f81e8/manager/0.log" Dec 01 10:04:38 crc kubenswrapper[5004]: I1201 10:04:38.728872 5004 patch_prober.go:28] interesting pod/machine-config-daemon-fvdgt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:04:38 crc kubenswrapper[5004]: I1201 10:04:38.728942 5004 prober.go:107] "Probe failed" 
probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:04:38 crc kubenswrapper[5004]: I1201 10:04:38.796295 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-pk7mn_67bfafa3-790f-4b23-8bef-8b5da60bf6dc/kube-rbac-proxy/0.log" Dec 01 10:04:38 crc kubenswrapper[5004]: I1201 10:04:38.889478 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-pk7mn_67bfafa3-790f-4b23-8bef-8b5da60bf6dc/manager/0.log" Dec 01 10:04:39 crc kubenswrapper[5004]: I1201 10:04:39.055681 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-nm7jp_616c962a-6fda-4c1b-a377-51d721a17616/kube-rbac-proxy/0.log" Dec 01 10:04:39 crc kubenswrapper[5004]: I1201 10:04:39.170690 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-nm7jp_616c962a-6fda-4c1b-a377-51d721a17616/manager/0.log" Dec 01 10:04:39 crc kubenswrapper[5004]: I1201 10:04:39.232122 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-wqpmp_d418620e-19a8-4171-94f7-2dba61ca8b6a/kube-rbac-proxy/0.log" Dec 01 10:04:39 crc kubenswrapper[5004]: I1201 10:04:39.304473 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-wqpmp_d418620e-19a8-4171-94f7-2dba61ca8b6a/manager/0.log" Dec 01 10:04:39 crc kubenswrapper[5004]: I1201 10:04:39.401895 5004 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd42mdlz_fde3e479-59b7-4b8b-82c8-38b346fd3409/kube-rbac-proxy/0.log" Dec 01 10:04:39 crc kubenswrapper[5004]: I1201 10:04:39.457924 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd42mdlz_fde3e479-59b7-4b8b-82c8-38b346fd3409/manager/0.log" Dec 01 10:04:39 crc kubenswrapper[5004]: I1201 10:04:39.940694 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-h2x7z_893e9b84-6818-4442-aad3-528d7b7f24b2/registry-server/0.log" Dec 01 10:04:39 crc kubenswrapper[5004]: I1201 10:04:39.988728 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-688447bc77-l2kbh_58986db9-c1c0-4caf-a239-211791e82dc2/operator/0.log" Dec 01 10:04:40 crc kubenswrapper[5004]: I1201 10:04:40.201263 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-tnmn9_bff56810-ae93-4fca-a568-cc88e971c1d8/kube-rbac-proxy/0.log" Dec 01 10:04:40 crc kubenswrapper[5004]: I1201 10:04:40.278611 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-tnmn9_bff56810-ae93-4fca-a568-cc88e971c1d8/manager/0.log" Dec 01 10:04:40 crc kubenswrapper[5004]: I1201 10:04:40.530030 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-p468t_9d4afad8-bd75-403d-b4d0-d7f01e1a6e5d/kube-rbac-proxy/0.log" Dec 01 10:04:40 crc kubenswrapper[5004]: I1201 10:04:40.566547 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-p468t_9d4afad8-bd75-403d-b4d0-d7f01e1a6e5d/manager/0.log" Dec 01 10:04:40 crc kubenswrapper[5004]: I1201 
10:04:40.676738 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-l4rkg_ed5bf034-cb91-4b02-97a4-c63a8506e527/operator/0.log" Dec 01 10:04:40 crc kubenswrapper[5004]: I1201 10:04:40.798106 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-c96bj_a1d2c2cd-d1c5-490e-99ee-e0ab5c18ebc4/kube-rbac-proxy/0.log" Dec 01 10:04:40 crc kubenswrapper[5004]: I1201 10:04:40.947048 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-c96bj_a1d2c2cd-d1c5-490e-99ee-e0ab5c18ebc4/manager/0.log" Dec 01 10:04:41 crc kubenswrapper[5004]: I1201 10:04:41.027824 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-bcd9b8768-5phd6_5425cd72-5745-4b0f-ab14-b697c726d75f/kube-rbac-proxy/0.log" Dec 01 10:04:41 crc kubenswrapper[5004]: I1201 10:04:41.222009 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-8n2qh_bfa6d181-b802-48de-8c57-4b8b7a8f1e07/kube-rbac-proxy/0.log" Dec 01 10:04:41 crc kubenswrapper[5004]: I1201 10:04:41.357325 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-8n2qh_bfa6d181-b802-48de-8c57-4b8b7a8f1e07/manager/0.log" Dec 01 10:04:41 crc kubenswrapper[5004]: I1201 10:04:41.501985 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-6c4968b65-xg6h2_38bf4275-c95e-4b2d-88fe-aeace2e41983/manager/0.log" Dec 01 10:04:41 crc kubenswrapper[5004]: I1201 10:04:41.515840 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-bcd9b8768-5phd6_5425cd72-5745-4b0f-ab14-b697c726d75f/manager/0.log" Dec 01 
10:04:41 crc kubenswrapper[5004]: I1201 10:04:41.527184 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-769dc69bc-nfdnl_f1d1796c-7fa3-4a90-bfb2-cc257a69ba58/kube-rbac-proxy/0.log" Dec 01 10:04:41 crc kubenswrapper[5004]: I1201 10:04:41.594721 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-769dc69bc-nfdnl_f1d1796c-7fa3-4a90-bfb2-cc257a69ba58/manager/0.log" Dec 01 10:05:00 crc kubenswrapper[5004]: I1201 10:05:00.311163 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-nsn58_7a169e83-de91-4038-95b3-aa57f9b50861/control-plane-machine-set-operator/0.log" Dec 01 10:05:00 crc kubenswrapper[5004]: I1201 10:05:00.510733 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-b9vjf_c3ebf4d5-102a-4552-b30b-cbacb3a779fa/machine-api-operator/0.log" Dec 01 10:05:00 crc kubenswrapper[5004]: I1201 10:05:00.519090 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-b9vjf_c3ebf4d5-102a-4552-b30b-cbacb3a779fa/kube-rbac-proxy/0.log" Dec 01 10:05:08 crc kubenswrapper[5004]: I1201 10:05:08.729388 5004 patch_prober.go:28] interesting pod/machine-config-daemon-fvdgt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:05:08 crc kubenswrapper[5004]: I1201 10:05:08.729951 5004 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: 
connect: connection refused" Dec 01 10:05:08 crc kubenswrapper[5004]: I1201 10:05:08.730001 5004 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" Dec 01 10:05:08 crc kubenswrapper[5004]: I1201 10:05:08.730795 5004 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"497710a2817802c21762db587a6989ef9d9b667ad5011a6d0d79313f386386ad"} pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 10:05:08 crc kubenswrapper[5004]: I1201 10:05:08.730855 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" containerName="machine-config-daemon" containerID="cri-o://497710a2817802c21762db587a6989ef9d9b667ad5011a6d0d79313f386386ad" gracePeriod=600 Dec 01 10:05:09 crc kubenswrapper[5004]: I1201 10:05:09.669852 5004 generic.go:334] "Generic (PLEG): container finished" podID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" containerID="497710a2817802c21762db587a6989ef9d9b667ad5011a6d0d79313f386386ad" exitCode=0 Dec 01 10:05:09 crc kubenswrapper[5004]: I1201 10:05:09.670489 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" event={"ID":"f9977ebb-82de-4e96-8763-0b5a84f8d4ce","Type":"ContainerDied","Data":"497710a2817802c21762db587a6989ef9d9b667ad5011a6d0d79313f386386ad"} Dec 01 10:05:09 crc kubenswrapper[5004]: I1201 10:05:09.670524 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" event={"ID":"f9977ebb-82de-4e96-8763-0b5a84f8d4ce","Type":"ContainerStarted","Data":"bb7b2f98452ad6358b81c3ab624dff561c4061a1923f5ef6c17db875889257ec"} Dec 01 
10:05:09 crc kubenswrapper[5004]: I1201 10:05:09.670544 5004 scope.go:117] "RemoveContainer" containerID="1888096bc7eea8f4ed362bde431fc15761a89c0c03d561f41b82ce51165e9213" Dec 01 10:05:12 crc kubenswrapper[5004]: I1201 10:05:12.845989 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-xgcbt_efe385b8-0001-4bc8-9071-9c273ee27982/cert-manager-controller/0.log" Dec 01 10:05:13 crc kubenswrapper[5004]: I1201 10:05:13.026966 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-nb5zh_877958bb-ae47-4876-93be-dfa4393fabca/cert-manager-cainjector/0.log" Dec 01 10:05:13 crc kubenswrapper[5004]: I1201 10:05:13.120596 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-v6wk2_6d54b10a-717e-4952-b3ac-49705bee10b5/cert-manager-webhook/0.log" Dec 01 10:05:27 crc kubenswrapper[5004]: I1201 10:05:27.305909 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7fbb5f6569-624tf_21e94157-2305-46da-a56e-59dce2baa4ad/nmstate-console-plugin/0.log" Dec 01 10:05:27 crc kubenswrapper[5004]: I1201 10:05:27.532650 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-jdhw7_1027139c-9eed-42ec-8ba6-43c330579482/nmstate-handler/0.log" Dec 01 10:05:27 crc kubenswrapper[5004]: I1201 10:05:27.677389 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-phx6c_255659ba-de9a-4177-8a9f-42b2169ca1b8/kube-rbac-proxy/0.log" Dec 01 10:05:27 crc kubenswrapper[5004]: I1201 10:05:27.703614 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-phx6c_255659ba-de9a-4177-8a9f-42b2169ca1b8/nmstate-metrics/0.log" Dec 01 10:05:27 crc kubenswrapper[5004]: I1201 10:05:27.839115 5004 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-operator-5b5b58f5c8-tvfkn_1ac61d16-3eff-40e4-af81-79516560f041/nmstate-operator/0.log" Dec 01 10:05:27 crc kubenswrapper[5004]: I1201 10:05:27.940189 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f6d4c5ccb-kl9rs_70434b12-7582-4851-b32d-034f4c21603a/nmstate-webhook/0.log" Dec 01 10:05:39 crc kubenswrapper[5004]: I1201 10:05:39.764651 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-56fbccf5c9-5kcrv_3ef42d0b-a102-4112-b592-aa6d481041c7/kube-rbac-proxy/0.log" Dec 01 10:05:39 crc kubenswrapper[5004]: I1201 10:05:39.820669 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-56fbccf5c9-5kcrv_3ef42d0b-a102-4112-b592-aa6d481041c7/manager/0.log" Dec 01 10:05:40 crc kubenswrapper[5004]: I1201 10:05:40.178240 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-6j4k7"] Dec 01 10:05:40 crc kubenswrapper[5004]: E1201 10:05:40.178992 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2687c749-a7bc-401c-923b-14e9fafd6db1" containerName="container-00" Dec 01 10:05:40 crc kubenswrapper[5004]: I1201 10:05:40.179019 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="2687c749-a7bc-401c-923b-14e9fafd6db1" containerName="container-00" Dec 01 10:05:40 crc kubenswrapper[5004]: I1201 10:05:40.179400 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="2687c749-a7bc-401c-923b-14e9fafd6db1" containerName="container-00" Dec 01 10:05:40 crc kubenswrapper[5004]: I1201 10:05:40.181926 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6j4k7" Dec 01 10:05:40 crc kubenswrapper[5004]: I1201 10:05:40.205338 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6j4k7"] Dec 01 10:05:40 crc kubenswrapper[5004]: I1201 10:05:40.283502 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb64f32d-2761-4a60-9feb-517c4bcf8249-utilities\") pod \"redhat-operators-6j4k7\" (UID: \"eb64f32d-2761-4a60-9feb-517c4bcf8249\") " pod="openshift-marketplace/redhat-operators-6j4k7" Dec 01 10:05:40 crc kubenswrapper[5004]: I1201 10:05:40.283636 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28njm\" (UniqueName: \"kubernetes.io/projected/eb64f32d-2761-4a60-9feb-517c4bcf8249-kube-api-access-28njm\") pod \"redhat-operators-6j4k7\" (UID: \"eb64f32d-2761-4a60-9feb-517c4bcf8249\") " pod="openshift-marketplace/redhat-operators-6j4k7" Dec 01 10:05:40 crc kubenswrapper[5004]: I1201 10:05:40.283866 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb64f32d-2761-4a60-9feb-517c4bcf8249-catalog-content\") pod \"redhat-operators-6j4k7\" (UID: \"eb64f32d-2761-4a60-9feb-517c4bcf8249\") " pod="openshift-marketplace/redhat-operators-6j4k7" Dec 01 10:05:40 crc kubenswrapper[5004]: I1201 10:05:40.386571 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28njm\" (UniqueName: \"kubernetes.io/projected/eb64f32d-2761-4a60-9feb-517c4bcf8249-kube-api-access-28njm\") pod \"redhat-operators-6j4k7\" (UID: \"eb64f32d-2761-4a60-9feb-517c4bcf8249\") " pod="openshift-marketplace/redhat-operators-6j4k7" Dec 01 10:05:40 crc kubenswrapper[5004]: I1201 10:05:40.386891 5004 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb64f32d-2761-4a60-9feb-517c4bcf8249-catalog-content\") pod \"redhat-operators-6j4k7\" (UID: \"eb64f32d-2761-4a60-9feb-517c4bcf8249\") " pod="openshift-marketplace/redhat-operators-6j4k7" Dec 01 10:05:40 crc kubenswrapper[5004]: I1201 10:05:40.386995 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb64f32d-2761-4a60-9feb-517c4bcf8249-utilities\") pod \"redhat-operators-6j4k7\" (UID: \"eb64f32d-2761-4a60-9feb-517c4bcf8249\") " pod="openshift-marketplace/redhat-operators-6j4k7" Dec 01 10:05:40 crc kubenswrapper[5004]: I1201 10:05:40.387503 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb64f32d-2761-4a60-9feb-517c4bcf8249-catalog-content\") pod \"redhat-operators-6j4k7\" (UID: \"eb64f32d-2761-4a60-9feb-517c4bcf8249\") " pod="openshift-marketplace/redhat-operators-6j4k7" Dec 01 10:05:40 crc kubenswrapper[5004]: I1201 10:05:40.387584 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb64f32d-2761-4a60-9feb-517c4bcf8249-utilities\") pod \"redhat-operators-6j4k7\" (UID: \"eb64f32d-2761-4a60-9feb-517c4bcf8249\") " pod="openshift-marketplace/redhat-operators-6j4k7" Dec 01 10:05:40 crc kubenswrapper[5004]: I1201 10:05:40.449630 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28njm\" (UniqueName: \"kubernetes.io/projected/eb64f32d-2761-4a60-9feb-517c4bcf8249-kube-api-access-28njm\") pod \"redhat-operators-6j4k7\" (UID: \"eb64f32d-2761-4a60-9feb-517c4bcf8249\") " pod="openshift-marketplace/redhat-operators-6j4k7" Dec 01 10:05:40 crc kubenswrapper[5004]: I1201 10:05:40.549312 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6j4k7" Dec 01 10:05:41 crc kubenswrapper[5004]: I1201 10:05:41.410780 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6j4k7"] Dec 01 10:05:42 crc kubenswrapper[5004]: I1201 10:05:42.226992 5004 generic.go:334] "Generic (PLEG): container finished" podID="eb64f32d-2761-4a60-9feb-517c4bcf8249" containerID="e36bdc4ce9b520ea4757567149ac849df3b35a76ab96ea29f6e775a535ffd9b9" exitCode=0 Dec 01 10:05:42 crc kubenswrapper[5004]: I1201 10:05:42.227043 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6j4k7" event={"ID":"eb64f32d-2761-4a60-9feb-517c4bcf8249","Type":"ContainerDied","Data":"e36bdc4ce9b520ea4757567149ac849df3b35a76ab96ea29f6e775a535ffd9b9"} Dec 01 10:05:42 crc kubenswrapper[5004]: I1201 10:05:42.227294 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6j4k7" event={"ID":"eb64f32d-2761-4a60-9feb-517c4bcf8249","Type":"ContainerStarted","Data":"9cb2ac3319ac27951543d3d9b6a1c61fb09afa9fe05de9128de7e051077b0ad2"} Dec 01 10:05:54 crc kubenswrapper[5004]: I1201 10:05:54.377889 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6j4k7" event={"ID":"eb64f32d-2761-4a60-9feb-517c4bcf8249","Type":"ContainerStarted","Data":"5a822ec6812f8eba0ddea95230a6fb8789890e6bbd9090f883b24aff33411e64"} Dec 01 10:05:56 crc kubenswrapper[5004]: I1201 10:05:56.137963 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_cluster-logging-operator-ff9846bd-cxlpj_8ea8ea18-7ae8-44e4-9381-10948b9b47f6/cluster-logging-operator/0.log" Dec 01 10:05:56 crc kubenswrapper[5004]: I1201 10:05:56.419242 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_collector-qdtrq_55505d98-5690-43fc-b3a9-87f4d3d8db26/collector/0.log" Dec 01 10:05:56 crc kubenswrapper[5004]: I1201 10:05:56.569768 5004 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-compactor-0_c1f3b2ee-067a-4887-875a-c9ca05cb65b6/loki-compactor/0.log" Dec 01 10:05:56 crc kubenswrapper[5004]: I1201 10:05:56.939888 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-6c96ff8676-nnfwd_851b5f03-b03f-4a8b-9000-1fa733fb7465/gateway/0.log" Dec 01 10:05:56 crc kubenswrapper[5004]: I1201 10:05:56.942873 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-distributor-76cc67bf56-mjgbt_462fa983-5357-44cf-afb3-4803b227bcfa/loki-distributor/0.log" Dec 01 10:05:57 crc kubenswrapper[5004]: I1201 10:05:57.129927 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-6c96ff8676-nnfwd_851b5f03-b03f-4a8b-9000-1fa733fb7465/opa/0.log" Dec 01 10:05:57 crc kubenswrapper[5004]: I1201 10:05:57.303448 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-6c96ff8676-nph2r_11a613c6-725b-4e91-867b-58b8d664dd55/opa/0.log" Dec 01 10:05:57 crc kubenswrapper[5004]: I1201 10:05:57.333513 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-6c96ff8676-nph2r_11a613c6-725b-4e91-867b-58b8d664dd55/gateway/0.log" Dec 01 10:05:57 crc kubenswrapper[5004]: I1201 10:05:57.415821 5004 generic.go:334] "Generic (PLEG): container finished" podID="eb64f32d-2761-4a60-9feb-517c4bcf8249" containerID="5a822ec6812f8eba0ddea95230a6fb8789890e6bbd9090f883b24aff33411e64" exitCode=0 Dec 01 10:05:57 crc kubenswrapper[5004]: I1201 10:05:57.415866 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6j4k7" event={"ID":"eb64f32d-2761-4a60-9feb-517c4bcf8249","Type":"ContainerDied","Data":"5a822ec6812f8eba0ddea95230a6fb8789890e6bbd9090f883b24aff33411e64"} Dec 01 10:05:57 crc kubenswrapper[5004]: I1201 10:05:57.437537 5004 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-index-gateway-0_9be864a5-1434-4402-ac67-a8cc8d09090a/loki-index-gateway/0.log" Dec 01 10:05:57 crc kubenswrapper[5004]: I1201 10:05:57.643530 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-ingester-0_346e7dcd-bc03-4c7f-b0b0-8e5206230152/loki-ingester/0.log" Dec 01 10:05:57 crc kubenswrapper[5004]: I1201 10:05:57.777587 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-querier-5895d59bb8-wdd4k_a551f476-d9e6-4e1c-9f48-60939bd6b6ff/loki-querier/0.log" Dec 01 10:05:57 crc kubenswrapper[5004]: I1201 10:05:57.943728 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-query-frontend-84558f7c9f-pkxkw_1955c798-b6bd-4194-8097-889c0e86c90b/loki-query-frontend/0.log" Dec 01 10:05:58 crc kubenswrapper[5004]: I1201 10:05:58.436364 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6j4k7" event={"ID":"eb64f32d-2761-4a60-9feb-517c4bcf8249","Type":"ContainerStarted","Data":"c6e33fede820fe6f1556982e46d2ceb54d498af73d3f198a65dc5e012ac4b15c"} Dec 01 10:05:58 crc kubenswrapper[5004]: I1201 10:05:58.461607 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-6j4k7" podStartSLOduration=2.672260757 podStartE2EDuration="18.461575321s" podCreationTimestamp="2025-12-01 10:05:40 +0000 UTC" firstStartedPulling="2025-12-01 10:05:42.229052167 +0000 UTC m=+6519.794044149" lastFinishedPulling="2025-12-01 10:05:58.018366731 +0000 UTC m=+6535.583358713" observedRunningTime="2025-12-01 10:05:58.454772556 +0000 UTC m=+6536.019764538" watchObservedRunningTime="2025-12-01 10:05:58.461575321 +0000 UTC m=+6536.026567303" Dec 01 10:06:00 crc kubenswrapper[5004]: I1201 10:06:00.550865 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-operators-6j4k7" Dec 01 10:06:00 crc kubenswrapper[5004]: I1201 10:06:00.551503 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-6j4k7" Dec 01 10:06:01 crc kubenswrapper[5004]: I1201 10:06:01.614701 5004 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-6j4k7" podUID="eb64f32d-2761-4a60-9feb-517c4bcf8249" containerName="registry-server" probeResult="failure" output=< Dec 01 10:06:01 crc kubenswrapper[5004]: timeout: failed to connect service ":50051" within 1s Dec 01 10:06:01 crc kubenswrapper[5004]: > Dec 01 10:06:10 crc kubenswrapper[5004]: I1201 10:06:10.603240 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-6j4k7" Dec 01 10:06:10 crc kubenswrapper[5004]: I1201 10:06:10.685209 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-6j4k7" Dec 01 10:06:11 crc kubenswrapper[5004]: I1201 10:06:11.212841 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6j4k7"] Dec 01 10:06:11 crc kubenswrapper[5004]: I1201 10:06:11.375077 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mrv6v"] Dec 01 10:06:11 crc kubenswrapper[5004]: I1201 10:06:11.375315 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-mrv6v" podUID="de6dd90c-9ef5-4754-8979-7c4efaf00386" containerName="registry-server" containerID="cri-o://c13c42ffcda8fe14761e34915935c6dd1ef7244d8e923ad5fd0132a9255e3a16" gracePeriod=2 Dec 01 10:06:11 crc kubenswrapper[5004]: I1201 10:06:11.610158 5004 generic.go:334] "Generic (PLEG): container finished" podID="de6dd90c-9ef5-4754-8979-7c4efaf00386" containerID="c13c42ffcda8fe14761e34915935c6dd1ef7244d8e923ad5fd0132a9255e3a16" exitCode=0 
Dec 01 10:06:11 crc kubenswrapper[5004]: I1201 10:06:11.610269 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mrv6v" event={"ID":"de6dd90c-9ef5-4754-8979-7c4efaf00386","Type":"ContainerDied","Data":"c13c42ffcda8fe14761e34915935c6dd1ef7244d8e923ad5fd0132a9255e3a16"} Dec 01 10:06:12 crc kubenswrapper[5004]: I1201 10:06:12.004244 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mrv6v" Dec 01 10:06:12 crc kubenswrapper[5004]: I1201 10:06:12.113668 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4gbr6\" (UniqueName: \"kubernetes.io/projected/de6dd90c-9ef5-4754-8979-7c4efaf00386-kube-api-access-4gbr6\") pod \"de6dd90c-9ef5-4754-8979-7c4efaf00386\" (UID: \"de6dd90c-9ef5-4754-8979-7c4efaf00386\") " Dec 01 10:06:12 crc kubenswrapper[5004]: I1201 10:06:12.113878 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de6dd90c-9ef5-4754-8979-7c4efaf00386-catalog-content\") pod \"de6dd90c-9ef5-4754-8979-7c4efaf00386\" (UID: \"de6dd90c-9ef5-4754-8979-7c4efaf00386\") " Dec 01 10:06:12 crc kubenswrapper[5004]: I1201 10:06:12.114026 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de6dd90c-9ef5-4754-8979-7c4efaf00386-utilities\") pod \"de6dd90c-9ef5-4754-8979-7c4efaf00386\" (UID: \"de6dd90c-9ef5-4754-8979-7c4efaf00386\") " Dec 01 10:06:12 crc kubenswrapper[5004]: I1201 10:06:12.114771 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de6dd90c-9ef5-4754-8979-7c4efaf00386-utilities" (OuterVolumeSpecName: "utilities") pod "de6dd90c-9ef5-4754-8979-7c4efaf00386" (UID: "de6dd90c-9ef5-4754-8979-7c4efaf00386"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:06:12 crc kubenswrapper[5004]: I1201 10:06:12.127919 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de6dd90c-9ef5-4754-8979-7c4efaf00386-kube-api-access-4gbr6" (OuterVolumeSpecName: "kube-api-access-4gbr6") pod "de6dd90c-9ef5-4754-8979-7c4efaf00386" (UID: "de6dd90c-9ef5-4754-8979-7c4efaf00386"). InnerVolumeSpecName "kube-api-access-4gbr6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:06:12 crc kubenswrapper[5004]: I1201 10:06:12.216629 5004 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de6dd90c-9ef5-4754-8979-7c4efaf00386-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 10:06:12 crc kubenswrapper[5004]: I1201 10:06:12.216849 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4gbr6\" (UniqueName: \"kubernetes.io/projected/de6dd90c-9ef5-4754-8979-7c4efaf00386-kube-api-access-4gbr6\") on node \"crc\" DevicePath \"\"" Dec 01 10:06:12 crc kubenswrapper[5004]: I1201 10:06:12.230302 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de6dd90c-9ef5-4754-8979-7c4efaf00386-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "de6dd90c-9ef5-4754-8979-7c4efaf00386" (UID: "de6dd90c-9ef5-4754-8979-7c4efaf00386"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:06:12 crc kubenswrapper[5004]: I1201 10:06:12.319366 5004 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de6dd90c-9ef5-4754-8979-7c4efaf00386-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 10:06:12 crc kubenswrapper[5004]: I1201 10:06:12.625637 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-mrv6v" Dec 01 10:06:12 crc kubenswrapper[5004]: I1201 10:06:12.632698 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mrv6v" event={"ID":"de6dd90c-9ef5-4754-8979-7c4efaf00386","Type":"ContainerDied","Data":"12957fa2fe9b7bb8cf3ab23a3740cb786aea61ab5057d8a9cfa8416e82327015"} Dec 01 10:06:12 crc kubenswrapper[5004]: I1201 10:06:12.633207 5004 scope.go:117] "RemoveContainer" containerID="c13c42ffcda8fe14761e34915935c6dd1ef7244d8e923ad5fd0132a9255e3a16" Dec 01 10:06:12 crc kubenswrapper[5004]: I1201 10:06:12.723583 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mrv6v"] Dec 01 10:06:12 crc kubenswrapper[5004]: I1201 10:06:12.729045 5004 scope.go:117] "RemoveContainer" containerID="0f2c1c05059774506536e55da080ecd4f13ef4505b8b01a641d1c41a5878533d" Dec 01 10:06:12 crc kubenswrapper[5004]: I1201 10:06:12.737825 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-mrv6v"] Dec 01 10:06:12 crc kubenswrapper[5004]: I1201 10:06:12.763246 5004 scope.go:117] "RemoveContainer" containerID="6f198c91cee3005c2107c1cc39e1949efa5e5d55b945caebc0783eee5f64459d" Dec 01 10:06:12 crc kubenswrapper[5004]: I1201 10:06:12.777957 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de6dd90c-9ef5-4754-8979-7c4efaf00386" path="/var/lib/kubelet/pods/de6dd90c-9ef5-4754-8979-7c4efaf00386/volumes" Dec 01 10:06:15 crc kubenswrapper[5004]: I1201 10:06:15.614214 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-5t9hg_53c42c23-7bb0-4e51-ab58-3355b224864c/kube-rbac-proxy/0.log" Dec 01 10:06:15 crc kubenswrapper[5004]: I1201 10:06:15.800400 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-5t9hg_53c42c23-7bb0-4e51-ab58-3355b224864c/controller/0.log" Dec 01 10:06:15 crc 
kubenswrapper[5004]: I1201 10:06:15.875574 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l6x9r_113ca366-80ad-475e-829f-fcbb4a67e642/cp-frr-files/0.log" Dec 01 10:06:16 crc kubenswrapper[5004]: I1201 10:06:16.357301 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l6x9r_113ca366-80ad-475e-829f-fcbb4a67e642/cp-metrics/0.log" Dec 01 10:06:16 crc kubenswrapper[5004]: I1201 10:06:16.381158 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l6x9r_113ca366-80ad-475e-829f-fcbb4a67e642/cp-reloader/0.log" Dec 01 10:06:16 crc kubenswrapper[5004]: I1201 10:06:16.410255 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l6x9r_113ca366-80ad-475e-829f-fcbb4a67e642/cp-frr-files/0.log" Dec 01 10:06:16 crc kubenswrapper[5004]: I1201 10:06:16.443203 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l6x9r_113ca366-80ad-475e-829f-fcbb4a67e642/cp-reloader/0.log" Dec 01 10:06:16 crc kubenswrapper[5004]: I1201 10:06:16.608277 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l6x9r_113ca366-80ad-475e-829f-fcbb4a67e642/cp-frr-files/0.log" Dec 01 10:06:16 crc kubenswrapper[5004]: I1201 10:06:16.635847 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l6x9r_113ca366-80ad-475e-829f-fcbb4a67e642/cp-metrics/0.log" Dec 01 10:06:16 crc kubenswrapper[5004]: I1201 10:06:16.686722 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l6x9r_113ca366-80ad-475e-829f-fcbb4a67e642/cp-reloader/0.log" Dec 01 10:06:16 crc kubenswrapper[5004]: I1201 10:06:16.731923 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l6x9r_113ca366-80ad-475e-829f-fcbb4a67e642/cp-metrics/0.log" Dec 01 10:06:16 crc kubenswrapper[5004]: I1201 10:06:16.880401 5004 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-l6x9r_113ca366-80ad-475e-829f-fcbb4a67e642/cp-metrics/0.log" Dec 01 10:06:16 crc kubenswrapper[5004]: I1201 10:06:16.891290 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l6x9r_113ca366-80ad-475e-829f-fcbb4a67e642/cp-reloader/0.log" Dec 01 10:06:16 crc kubenswrapper[5004]: I1201 10:06:16.912289 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l6x9r_113ca366-80ad-475e-829f-fcbb4a67e642/cp-frr-files/0.log" Dec 01 10:06:16 crc kubenswrapper[5004]: I1201 10:06:16.947745 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l6x9r_113ca366-80ad-475e-829f-fcbb4a67e642/controller/0.log" Dec 01 10:06:17 crc kubenswrapper[5004]: I1201 10:06:17.097080 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l6x9r_113ca366-80ad-475e-829f-fcbb4a67e642/frr-metrics/0.log" Dec 01 10:06:17 crc kubenswrapper[5004]: I1201 10:06:17.121551 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l6x9r_113ca366-80ad-475e-829f-fcbb4a67e642/kube-rbac-proxy/0.log" Dec 01 10:06:17 crc kubenswrapper[5004]: I1201 10:06:17.206171 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l6x9r_113ca366-80ad-475e-829f-fcbb4a67e642/kube-rbac-proxy-frr/0.log" Dec 01 10:06:17 crc kubenswrapper[5004]: I1201 10:06:17.351279 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l6x9r_113ca366-80ad-475e-829f-fcbb4a67e642/reloader/0.log" Dec 01 10:06:17 crc kubenswrapper[5004]: I1201 10:06:17.473808 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7fcb986d4-26q9v_bebabc29-870f-4604-bda6-e77a3db6a5ed/frr-k8s-webhook-server/0.log" Dec 01 10:06:17 crc kubenswrapper[5004]: I1201 10:06:17.601045 5004 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_metallb-operator-controller-manager-7f9c5fbb9c-ljsdl_81957152-e7e6-490b-a819-fa6d1a57c822/manager/0.log" Dec 01 10:06:17 crc kubenswrapper[5004]: I1201 10:06:17.764276 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-74999fff7b-9rfrt_1d901b5c-40a1-4f35-8f0e-b9de6884d503/webhook-server/0.log" Dec 01 10:06:18 crc kubenswrapper[5004]: I1201 10:06:18.030338 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-cqvjk_0a3f9662-dd78-4183-8bb4-ccc7b6b17ed3/kube-rbac-proxy/0.log" Dec 01 10:06:18 crc kubenswrapper[5004]: I1201 10:06:18.745920 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-cqvjk_0a3f9662-dd78-4183-8bb4-ccc7b6b17ed3/speaker/0.log" Dec 01 10:06:19 crc kubenswrapper[5004]: I1201 10:06:19.158343 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l6x9r_113ca366-80ad-475e-829f-fcbb4a67e642/frr/0.log" Dec 01 10:06:32 crc kubenswrapper[5004]: I1201 10:06:32.204358 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8bsmpc_8659be81-88bd-4a0a-b117-c72f2c9e9035/util/0.log" Dec 01 10:06:32 crc kubenswrapper[5004]: I1201 10:06:32.413846 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8bsmpc_8659be81-88bd-4a0a-b117-c72f2c9e9035/pull/0.log" Dec 01 10:06:32 crc kubenswrapper[5004]: I1201 10:06:32.431793 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8bsmpc_8659be81-88bd-4a0a-b117-c72f2c9e9035/pull/0.log" Dec 01 10:06:32 crc kubenswrapper[5004]: I1201 10:06:32.453003 5004 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8bsmpc_8659be81-88bd-4a0a-b117-c72f2c9e9035/util/0.log" Dec 01 10:06:32 crc kubenswrapper[5004]: I1201 10:06:32.602209 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8bsmpc_8659be81-88bd-4a0a-b117-c72f2c9e9035/pull/0.log" Dec 01 10:06:32 crc kubenswrapper[5004]: I1201 10:06:32.637956 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8bsmpc_8659be81-88bd-4a0a-b117-c72f2c9e9035/util/0.log" Dec 01 10:06:32 crc kubenswrapper[5004]: I1201 10:06:32.666331 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8bsmpc_8659be81-88bd-4a0a-b117-c72f2c9e9035/extract/0.log" Dec 01 10:06:32 crc kubenswrapper[5004]: I1201 10:06:32.806233 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fghhzz_bbc10d54-27c3-4dcb-beb7-d1b675428a2c/util/0.log" Dec 01 10:06:33 crc kubenswrapper[5004]: I1201 10:06:33.010193 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fghhzz_bbc10d54-27c3-4dcb-beb7-d1b675428a2c/util/0.log" Dec 01 10:06:33 crc kubenswrapper[5004]: I1201 10:06:33.018133 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fghhzz_bbc10d54-27c3-4dcb-beb7-d1b675428a2c/pull/0.log" Dec 01 10:06:33 crc kubenswrapper[5004]: I1201 10:06:33.049777 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fghhzz_bbc10d54-27c3-4dcb-beb7-d1b675428a2c/pull/0.log" Dec 01 
10:06:33 crc kubenswrapper[5004]: I1201 10:06:33.190638 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fghhzz_bbc10d54-27c3-4dcb-beb7-d1b675428a2c/util/0.log" Dec 01 10:06:33 crc kubenswrapper[5004]: I1201 10:06:33.191417 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fghhzz_bbc10d54-27c3-4dcb-beb7-d1b675428a2c/extract/0.log" Dec 01 10:06:33 crc kubenswrapper[5004]: I1201 10:06:33.267455 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fghhzz_bbc10d54-27c3-4dcb-beb7-d1b675428a2c/pull/0.log" Dec 01 10:06:33 crc kubenswrapper[5004]: I1201 10:06:33.396954 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210gbrgb_06544def-087a-4ce3-ae2b-af6a06799add/util/0.log" Dec 01 10:06:33 crc kubenswrapper[5004]: I1201 10:06:33.591406 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210gbrgb_06544def-087a-4ce3-ae2b-af6a06799add/pull/0.log" Dec 01 10:06:33 crc kubenswrapper[5004]: I1201 10:06:33.596526 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210gbrgb_06544def-087a-4ce3-ae2b-af6a06799add/pull/0.log" Dec 01 10:06:33 crc kubenswrapper[5004]: I1201 10:06:33.632932 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210gbrgb_06544def-087a-4ce3-ae2b-af6a06799add/util/0.log" Dec 01 10:06:33 crc kubenswrapper[5004]: I1201 10:06:33.792086 5004 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210gbrgb_06544def-087a-4ce3-ae2b-af6a06799add/pull/0.log" Dec 01 10:06:33 crc kubenswrapper[5004]: I1201 10:06:33.821503 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210gbrgb_06544def-087a-4ce3-ae2b-af6a06799add/extract/0.log" Dec 01 10:06:33 crc kubenswrapper[5004]: I1201 10:06:33.851836 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210gbrgb_06544def-087a-4ce3-ae2b-af6a06799add/util/0.log" Dec 01 10:06:33 crc kubenswrapper[5004]: I1201 10:06:33.976830 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fn57ss_78cb161c-a9d6-4fd5-9144-6564ca31cd33/util/0.log" Dec 01 10:06:34 crc kubenswrapper[5004]: I1201 10:06:34.143129 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fn57ss_78cb161c-a9d6-4fd5-9144-6564ca31cd33/pull/0.log" Dec 01 10:06:34 crc kubenswrapper[5004]: I1201 10:06:34.167217 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fn57ss_78cb161c-a9d6-4fd5-9144-6564ca31cd33/util/0.log" Dec 01 10:06:34 crc kubenswrapper[5004]: I1201 10:06:34.175954 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fn57ss_78cb161c-a9d6-4fd5-9144-6564ca31cd33/pull/0.log" Dec 01 10:06:34 crc kubenswrapper[5004]: I1201 10:06:34.320661 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fn57ss_78cb161c-a9d6-4fd5-9144-6564ca31cd33/pull/0.log" Dec 01 
10:06:34 crc kubenswrapper[5004]: I1201 10:06:34.341353 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fn57ss_78cb161c-a9d6-4fd5-9144-6564ca31cd33/util/0.log" Dec 01 10:06:34 crc kubenswrapper[5004]: I1201 10:06:34.412250 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fn57ss_78cb161c-a9d6-4fd5-9144-6564ca31cd33/extract/0.log" Dec 01 10:06:34 crc kubenswrapper[5004]: I1201 10:06:34.658145 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83kl2cw_749c2b48-2544-41c1-8dc8-716e9e459232/util/0.log" Dec 01 10:06:34 crc kubenswrapper[5004]: I1201 10:06:34.993791 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83kl2cw_749c2b48-2544-41c1-8dc8-716e9e459232/pull/0.log" Dec 01 10:06:35 crc kubenswrapper[5004]: I1201 10:06:35.001534 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83kl2cw_749c2b48-2544-41c1-8dc8-716e9e459232/util/0.log" Dec 01 10:06:35 crc kubenswrapper[5004]: I1201 10:06:35.006234 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83kl2cw_749c2b48-2544-41c1-8dc8-716e9e459232/pull/0.log" Dec 01 10:06:35 crc kubenswrapper[5004]: I1201 10:06:35.154113 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83kl2cw_749c2b48-2544-41c1-8dc8-716e9e459232/util/0.log" Dec 01 10:06:35 crc kubenswrapper[5004]: I1201 10:06:35.230170 5004 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83kl2cw_749c2b48-2544-41c1-8dc8-716e9e459232/extract/0.log" Dec 01 10:06:35 crc kubenswrapper[5004]: I1201 10:06:35.233341 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83kl2cw_749c2b48-2544-41c1-8dc8-716e9e459232/pull/0.log" Dec 01 10:06:35 crc kubenswrapper[5004]: I1201 10:06:35.341653 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-zsqrg_14df59e3-a048-40c2-9400-9accbd0badd7/extract-utilities/0.log" Dec 01 10:06:35 crc kubenswrapper[5004]: I1201 10:06:35.564582 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-zsqrg_14df59e3-a048-40c2-9400-9accbd0badd7/extract-utilities/0.log" Dec 01 10:06:35 crc kubenswrapper[5004]: I1201 10:06:35.581940 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-zsqrg_14df59e3-a048-40c2-9400-9accbd0badd7/extract-content/0.log" Dec 01 10:06:35 crc kubenswrapper[5004]: I1201 10:06:35.585244 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-zsqrg_14df59e3-a048-40c2-9400-9accbd0badd7/extract-content/0.log" Dec 01 10:06:35 crc kubenswrapper[5004]: I1201 10:06:35.781405 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-zsqrg_14df59e3-a048-40c2-9400-9accbd0badd7/extract-utilities/0.log" Dec 01 10:06:35 crc kubenswrapper[5004]: I1201 10:06:35.782080 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-zsqrg_14df59e3-a048-40c2-9400-9accbd0badd7/extract-content/0.log" Dec 01 10:06:36 crc kubenswrapper[5004]: I1201 10:06:36.002958 5004 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-4j42r_6b9a10fb-aabb-45e5-b0ce-156df39ce402/extract-utilities/0.log" Dec 01 10:06:36 crc kubenswrapper[5004]: I1201 10:06:36.268248 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-4j42r_6b9a10fb-aabb-45e5-b0ce-156df39ce402/extract-utilities/0.log" Dec 01 10:06:36 crc kubenswrapper[5004]: I1201 10:06:36.295476 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-4j42r_6b9a10fb-aabb-45e5-b0ce-156df39ce402/extract-content/0.log" Dec 01 10:06:36 crc kubenswrapper[5004]: I1201 10:06:36.301408 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-4j42r_6b9a10fb-aabb-45e5-b0ce-156df39ce402/extract-content/0.log" Dec 01 10:06:36 crc kubenswrapper[5004]: I1201 10:06:36.507684 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-4j42r_6b9a10fb-aabb-45e5-b0ce-156df39ce402/extract-content/0.log" Dec 01 10:06:36 crc kubenswrapper[5004]: I1201 10:06:36.588796 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-4j42r_6b9a10fb-aabb-45e5-b0ce-156df39ce402/extract-utilities/0.log" Dec 01 10:06:36 crc kubenswrapper[5004]: I1201 10:06:36.772850 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-5ffg4_2370cb7e-e860-40eb-a3a2-0a711f1e05b1/marketplace-operator/0.log" Dec 01 10:06:36 crc kubenswrapper[5004]: I1201 10:06:36.821921 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-kwt85_408c336f-4cb7-4ebd-80c3-53bf49c6b884/extract-utilities/0.log" Dec 01 10:06:36 crc kubenswrapper[5004]: I1201 10:06:36.886788 5004 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-zsqrg_14df59e3-a048-40c2-9400-9accbd0badd7/registry-server/0.log" Dec 01 10:06:37 crc kubenswrapper[5004]: I1201 10:06:37.089552 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-kwt85_408c336f-4cb7-4ebd-80c3-53bf49c6b884/extract-content/0.log" Dec 01 10:06:37 crc kubenswrapper[5004]: I1201 10:06:37.148083 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-kwt85_408c336f-4cb7-4ebd-80c3-53bf49c6b884/extract-utilities/0.log" Dec 01 10:06:37 crc kubenswrapper[5004]: I1201 10:06:37.168550 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-4j42r_6b9a10fb-aabb-45e5-b0ce-156df39ce402/registry-server/0.log" Dec 01 10:06:37 crc kubenswrapper[5004]: I1201 10:06:37.173789 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-kwt85_408c336f-4cb7-4ebd-80c3-53bf49c6b884/extract-content/0.log" Dec 01 10:06:37 crc kubenswrapper[5004]: I1201 10:06:37.356308 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-kwt85_408c336f-4cb7-4ebd-80c3-53bf49c6b884/extract-content/0.log" Dec 01 10:06:37 crc kubenswrapper[5004]: I1201 10:06:37.361577 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-kwt85_408c336f-4cb7-4ebd-80c3-53bf49c6b884/extract-utilities/0.log" Dec 01 10:06:37 crc kubenswrapper[5004]: I1201 10:06:37.389446 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-6j4k7_eb64f32d-2761-4a60-9feb-517c4bcf8249/extract-utilities/0.log" Dec 01 10:06:37 crc kubenswrapper[5004]: I1201 10:06:37.688905 5004 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-kwt85_408c336f-4cb7-4ebd-80c3-53bf49c6b884/registry-server/0.log" Dec 01 10:06:37 crc kubenswrapper[5004]: I1201 10:06:37.706026 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-6j4k7_eb64f32d-2761-4a60-9feb-517c4bcf8249/extract-utilities/0.log" Dec 01 10:06:37 crc kubenswrapper[5004]: I1201 10:06:37.710993 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-6j4k7_eb64f32d-2761-4a60-9feb-517c4bcf8249/extract-content/0.log" Dec 01 10:06:37 crc kubenswrapper[5004]: I1201 10:06:37.731128 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-6j4k7_eb64f32d-2761-4a60-9feb-517c4bcf8249/extract-content/0.log" Dec 01 10:06:37 crc kubenswrapper[5004]: I1201 10:06:37.968775 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-6j4k7_eb64f32d-2761-4a60-9feb-517c4bcf8249/extract-content/0.log" Dec 01 10:06:37 crc kubenswrapper[5004]: I1201 10:06:37.980767 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-6j4k7_eb64f32d-2761-4a60-9feb-517c4bcf8249/extract-utilities/0.log" Dec 01 10:06:38 crc kubenswrapper[5004]: I1201 10:06:38.052930 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-6j4k7_eb64f32d-2761-4a60-9feb-517c4bcf8249/registry-server/0.log" Dec 01 10:06:50 crc kubenswrapper[5004]: I1201 10:06:50.872982 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-668cf9dfbb-gwtbl_89a7714c-cec3-46e6-8cdb-016669fcf18e/prometheus-operator/0.log" Dec 01 10:06:51 crc kubenswrapper[5004]: I1201 10:06:51.082835 5004 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-5bf4d79569-4nf7p_3a6db099-c25a-4d19-8fa1-8269429274fc/prometheus-operator-admission-webhook/0.log" Dec 01 10:06:51 crc kubenswrapper[5004]: I1201 10:06:51.207767 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-5bf4d79569-bhs7g_386e0e23-8226-45e3-a1b5-38fc4fb44eec/prometheus-operator-admission-webhook/0.log" Dec 01 10:06:51 crc kubenswrapper[5004]: I1201 10:06:51.304450 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-d8bb48f5d-kfwr8_c2573e8e-c093-4ddb-b02d-f4f7e270b97a/operator/0.log" Dec 01 10:06:51 crc kubenswrapper[5004]: I1201 10:06:51.436015 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-ui-dashboards-7d5fb4cbfb-5htt9_1ff890d7-d00c-4b87-86d6-3eb403821ee3/observability-ui-dashboards/0.log" Dec 01 10:06:51 crc kubenswrapper[5004]: I1201 10:06:51.496528 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5446b9c989-9pms6_1579721a-7166-4152-9703-97b893433c9a/perses-operator/0.log" Dec 01 10:07:05 crc kubenswrapper[5004]: I1201 10:07:05.326475 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-56fbccf5c9-5kcrv_3ef42d0b-a102-4112-b592-aa6d481041c7/manager/0.log" Dec 01 10:07:05 crc kubenswrapper[5004]: I1201 10:07:05.330997 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-56fbccf5c9-5kcrv_3ef42d0b-a102-4112-b592-aa6d481041c7/kube-rbac-proxy/0.log" Dec 01 10:07:30 crc kubenswrapper[5004]: I1201 10:07:30.293018 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-mhj88"] Dec 01 10:07:30 crc kubenswrapper[5004]: E1201 10:07:30.294056 5004 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="de6dd90c-9ef5-4754-8979-7c4efaf00386" containerName="registry-server" Dec 01 10:07:30 crc kubenswrapper[5004]: I1201 10:07:30.294070 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="de6dd90c-9ef5-4754-8979-7c4efaf00386" containerName="registry-server" Dec 01 10:07:30 crc kubenswrapper[5004]: E1201 10:07:30.294104 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de6dd90c-9ef5-4754-8979-7c4efaf00386" containerName="extract-content" Dec 01 10:07:30 crc kubenswrapper[5004]: I1201 10:07:30.294110 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="de6dd90c-9ef5-4754-8979-7c4efaf00386" containerName="extract-content" Dec 01 10:07:30 crc kubenswrapper[5004]: E1201 10:07:30.294145 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de6dd90c-9ef5-4754-8979-7c4efaf00386" containerName="extract-utilities" Dec 01 10:07:30 crc kubenswrapper[5004]: I1201 10:07:30.294152 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="de6dd90c-9ef5-4754-8979-7c4efaf00386" containerName="extract-utilities" Dec 01 10:07:30 crc kubenswrapper[5004]: I1201 10:07:30.294375 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="de6dd90c-9ef5-4754-8979-7c4efaf00386" containerName="registry-server" Dec 01 10:07:30 crc kubenswrapper[5004]: I1201 10:07:30.299339 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mhj88" Dec 01 10:07:30 crc kubenswrapper[5004]: I1201 10:07:30.314317 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mhj88"] Dec 01 10:07:30 crc kubenswrapper[5004]: I1201 10:07:30.474456 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wsc44\" (UniqueName: \"kubernetes.io/projected/3c4504be-7de5-4997-85ad-14340f1ce362-kube-api-access-wsc44\") pod \"community-operators-mhj88\" (UID: \"3c4504be-7de5-4997-85ad-14340f1ce362\") " pod="openshift-marketplace/community-operators-mhj88" Dec 01 10:07:30 crc kubenswrapper[5004]: I1201 10:07:30.474681 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c4504be-7de5-4997-85ad-14340f1ce362-utilities\") pod \"community-operators-mhj88\" (UID: \"3c4504be-7de5-4997-85ad-14340f1ce362\") " pod="openshift-marketplace/community-operators-mhj88" Dec 01 10:07:30 crc kubenswrapper[5004]: I1201 10:07:30.474752 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c4504be-7de5-4997-85ad-14340f1ce362-catalog-content\") pod \"community-operators-mhj88\" (UID: \"3c4504be-7de5-4997-85ad-14340f1ce362\") " pod="openshift-marketplace/community-operators-mhj88" Dec 01 10:07:30 crc kubenswrapper[5004]: I1201 10:07:30.577057 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wsc44\" (UniqueName: \"kubernetes.io/projected/3c4504be-7de5-4997-85ad-14340f1ce362-kube-api-access-wsc44\") pod \"community-operators-mhj88\" (UID: \"3c4504be-7de5-4997-85ad-14340f1ce362\") " pod="openshift-marketplace/community-operators-mhj88" Dec 01 10:07:30 crc kubenswrapper[5004]: I1201 10:07:30.577184 5004 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c4504be-7de5-4997-85ad-14340f1ce362-utilities\") pod \"community-operators-mhj88\" (UID: \"3c4504be-7de5-4997-85ad-14340f1ce362\") " pod="openshift-marketplace/community-operators-mhj88" Dec 01 10:07:30 crc kubenswrapper[5004]: I1201 10:07:30.577234 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c4504be-7de5-4997-85ad-14340f1ce362-catalog-content\") pod \"community-operators-mhj88\" (UID: \"3c4504be-7de5-4997-85ad-14340f1ce362\") " pod="openshift-marketplace/community-operators-mhj88" Dec 01 10:07:30 crc kubenswrapper[5004]: I1201 10:07:30.578258 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c4504be-7de5-4997-85ad-14340f1ce362-catalog-content\") pod \"community-operators-mhj88\" (UID: \"3c4504be-7de5-4997-85ad-14340f1ce362\") " pod="openshift-marketplace/community-operators-mhj88" Dec 01 10:07:30 crc kubenswrapper[5004]: I1201 10:07:30.578264 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c4504be-7de5-4997-85ad-14340f1ce362-utilities\") pod \"community-operators-mhj88\" (UID: \"3c4504be-7de5-4997-85ad-14340f1ce362\") " pod="openshift-marketplace/community-operators-mhj88" Dec 01 10:07:30 crc kubenswrapper[5004]: I1201 10:07:30.606550 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wsc44\" (UniqueName: \"kubernetes.io/projected/3c4504be-7de5-4997-85ad-14340f1ce362-kube-api-access-wsc44\") pod \"community-operators-mhj88\" (UID: \"3c4504be-7de5-4997-85ad-14340f1ce362\") " pod="openshift-marketplace/community-operators-mhj88" Dec 01 10:07:30 crc kubenswrapper[5004]: I1201 10:07:30.630201 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mhj88" Dec 01 10:07:31 crc kubenswrapper[5004]: I1201 10:07:31.278760 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mhj88"] Dec 01 10:07:31 crc kubenswrapper[5004]: I1201 10:07:31.476728 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mhj88" event={"ID":"3c4504be-7de5-4997-85ad-14340f1ce362","Type":"ContainerStarted","Data":"7b964483503ec0eb63b2b39a80996d81fb9321d84b8e161535ad849fbcf6a923"} Dec 01 10:07:32 crc kubenswrapper[5004]: I1201 10:07:32.488211 5004 generic.go:334] "Generic (PLEG): container finished" podID="3c4504be-7de5-4997-85ad-14340f1ce362" containerID="17b2219f9c4991c8fc44ddcdf13b50f36f1820cc553f2a06a73176c62cd55c2f" exitCode=0 Dec 01 10:07:32 crc kubenswrapper[5004]: I1201 10:07:32.488310 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mhj88" event={"ID":"3c4504be-7de5-4997-85ad-14340f1ce362","Type":"ContainerDied","Data":"17b2219f9c4991c8fc44ddcdf13b50f36f1820cc553f2a06a73176c62cd55c2f"} Dec 01 10:07:32 crc kubenswrapper[5004]: I1201 10:07:32.496904 5004 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 01 10:07:34 crc kubenswrapper[5004]: I1201 10:07:34.510241 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mhj88" event={"ID":"3c4504be-7de5-4997-85ad-14340f1ce362","Type":"ContainerStarted","Data":"75b68982803dc483cc07a19b467d464cc4f5977e8915d8ac46aa6d3df19ff6d2"} Dec 01 10:07:35 crc kubenswrapper[5004]: I1201 10:07:35.528537 5004 generic.go:334] "Generic (PLEG): container finished" podID="3c4504be-7de5-4997-85ad-14340f1ce362" containerID="75b68982803dc483cc07a19b467d464cc4f5977e8915d8ac46aa6d3df19ff6d2" exitCode=0 Dec 01 10:07:35 crc kubenswrapper[5004]: I1201 10:07:35.528613 5004 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/community-operators-mhj88" event={"ID":"3c4504be-7de5-4997-85ad-14340f1ce362","Type":"ContainerDied","Data":"75b68982803dc483cc07a19b467d464cc4f5977e8915d8ac46aa6d3df19ff6d2"} Dec 01 10:07:36 crc kubenswrapper[5004]: I1201 10:07:36.542014 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mhj88" event={"ID":"3c4504be-7de5-4997-85ad-14340f1ce362","Type":"ContainerStarted","Data":"a82cca24011ada194261b1384919e9f0c00cb852d07065cdd5b90c1b2c476248"} Dec 01 10:07:36 crc kubenswrapper[5004]: I1201 10:07:36.565763 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-mhj88" podStartSLOduration=3.007886374 podStartE2EDuration="6.565717245s" podCreationTimestamp="2025-12-01 10:07:30 +0000 UTC" firstStartedPulling="2025-12-01 10:07:32.491038727 +0000 UTC m=+6630.056030709" lastFinishedPulling="2025-12-01 10:07:36.048869598 +0000 UTC m=+6633.613861580" observedRunningTime="2025-12-01 10:07:36.558958131 +0000 UTC m=+6634.123950123" watchObservedRunningTime="2025-12-01 10:07:36.565717245 +0000 UTC m=+6634.130709237" Dec 01 10:07:38 crc kubenswrapper[5004]: I1201 10:07:38.729017 5004 patch_prober.go:28] interesting pod/machine-config-daemon-fvdgt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:07:38 crc kubenswrapper[5004]: I1201 10:07:38.729084 5004 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:07:40 crc kubenswrapper[5004]: I1201 10:07:40.630704 5004 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-mhj88" Dec 01 10:07:40 crc kubenswrapper[5004]: I1201 10:07:40.631319 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-mhj88" Dec 01 10:07:40 crc kubenswrapper[5004]: I1201 10:07:40.682070 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-mhj88" Dec 01 10:07:41 crc kubenswrapper[5004]: I1201 10:07:41.657799 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-mhj88" Dec 01 10:07:41 crc kubenswrapper[5004]: I1201 10:07:41.765213 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mhj88"] Dec 01 10:07:43 crc kubenswrapper[5004]: I1201 10:07:43.619772 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-mhj88" podUID="3c4504be-7de5-4997-85ad-14340f1ce362" containerName="registry-server" containerID="cri-o://a82cca24011ada194261b1384919e9f0c00cb852d07065cdd5b90c1b2c476248" gracePeriod=2 Dec 01 10:07:44 crc kubenswrapper[5004]: I1201 10:07:44.336396 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mhj88" Dec 01 10:07:44 crc kubenswrapper[5004]: I1201 10:07:44.526083 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c4504be-7de5-4997-85ad-14340f1ce362-catalog-content\") pod \"3c4504be-7de5-4997-85ad-14340f1ce362\" (UID: \"3c4504be-7de5-4997-85ad-14340f1ce362\") " Dec 01 10:07:44 crc kubenswrapper[5004]: I1201 10:07:44.526303 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c4504be-7de5-4997-85ad-14340f1ce362-utilities\") pod \"3c4504be-7de5-4997-85ad-14340f1ce362\" (UID: \"3c4504be-7de5-4997-85ad-14340f1ce362\") " Dec 01 10:07:44 crc kubenswrapper[5004]: I1201 10:07:44.526335 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wsc44\" (UniqueName: \"kubernetes.io/projected/3c4504be-7de5-4997-85ad-14340f1ce362-kube-api-access-wsc44\") pod \"3c4504be-7de5-4997-85ad-14340f1ce362\" (UID: \"3c4504be-7de5-4997-85ad-14340f1ce362\") " Dec 01 10:07:44 crc kubenswrapper[5004]: I1201 10:07:44.533355 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c4504be-7de5-4997-85ad-14340f1ce362-kube-api-access-wsc44" (OuterVolumeSpecName: "kube-api-access-wsc44") pod "3c4504be-7de5-4997-85ad-14340f1ce362" (UID: "3c4504be-7de5-4997-85ad-14340f1ce362"). InnerVolumeSpecName "kube-api-access-wsc44". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:07:44 crc kubenswrapper[5004]: I1201 10:07:44.534939 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c4504be-7de5-4997-85ad-14340f1ce362-utilities" (OuterVolumeSpecName: "utilities") pod "3c4504be-7de5-4997-85ad-14340f1ce362" (UID: "3c4504be-7de5-4997-85ad-14340f1ce362"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:07:44 crc kubenswrapper[5004]: I1201 10:07:44.628896 5004 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c4504be-7de5-4997-85ad-14340f1ce362-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 10:07:44 crc kubenswrapper[5004]: I1201 10:07:44.628939 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wsc44\" (UniqueName: \"kubernetes.io/projected/3c4504be-7de5-4997-85ad-14340f1ce362-kube-api-access-wsc44\") on node \"crc\" DevicePath \"\"" Dec 01 10:07:44 crc kubenswrapper[5004]: I1201 10:07:44.631782 5004 generic.go:334] "Generic (PLEG): container finished" podID="3c4504be-7de5-4997-85ad-14340f1ce362" containerID="a82cca24011ada194261b1384919e9f0c00cb852d07065cdd5b90c1b2c476248" exitCode=0 Dec 01 10:07:44 crc kubenswrapper[5004]: I1201 10:07:44.631827 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mhj88" event={"ID":"3c4504be-7de5-4997-85ad-14340f1ce362","Type":"ContainerDied","Data":"a82cca24011ada194261b1384919e9f0c00cb852d07065cdd5b90c1b2c476248"} Dec 01 10:07:44 crc kubenswrapper[5004]: I1201 10:07:44.631864 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mhj88" event={"ID":"3c4504be-7de5-4997-85ad-14340f1ce362","Type":"ContainerDied","Data":"7b964483503ec0eb63b2b39a80996d81fb9321d84b8e161535ad849fbcf6a923"} Dec 01 10:07:44 crc kubenswrapper[5004]: I1201 10:07:44.631888 5004 scope.go:117] "RemoveContainer" containerID="a82cca24011ada194261b1384919e9f0c00cb852d07065cdd5b90c1b2c476248" Dec 01 10:07:44 crc kubenswrapper[5004]: I1201 10:07:44.632034 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mhj88" Dec 01 10:07:44 crc kubenswrapper[5004]: I1201 10:07:44.657700 5004 scope.go:117] "RemoveContainer" containerID="75b68982803dc483cc07a19b467d464cc4f5977e8915d8ac46aa6d3df19ff6d2" Dec 01 10:07:44 crc kubenswrapper[5004]: I1201 10:07:44.681713 5004 scope.go:117] "RemoveContainer" containerID="17b2219f9c4991c8fc44ddcdf13b50f36f1820cc553f2a06a73176c62cd55c2f" Dec 01 10:07:44 crc kubenswrapper[5004]: I1201 10:07:44.761157 5004 scope.go:117] "RemoveContainer" containerID="a82cca24011ada194261b1384919e9f0c00cb852d07065cdd5b90c1b2c476248" Dec 01 10:07:44 crc kubenswrapper[5004]: E1201 10:07:44.776525 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a82cca24011ada194261b1384919e9f0c00cb852d07065cdd5b90c1b2c476248\": container with ID starting with a82cca24011ada194261b1384919e9f0c00cb852d07065cdd5b90c1b2c476248 not found: ID does not exist" containerID="a82cca24011ada194261b1384919e9f0c00cb852d07065cdd5b90c1b2c476248" Dec 01 10:07:44 crc kubenswrapper[5004]: I1201 10:07:44.776761 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a82cca24011ada194261b1384919e9f0c00cb852d07065cdd5b90c1b2c476248"} err="failed to get container status \"a82cca24011ada194261b1384919e9f0c00cb852d07065cdd5b90c1b2c476248\": rpc error: code = NotFound desc = could not find container \"a82cca24011ada194261b1384919e9f0c00cb852d07065cdd5b90c1b2c476248\": container with ID starting with a82cca24011ada194261b1384919e9f0c00cb852d07065cdd5b90c1b2c476248 not found: ID does not exist" Dec 01 10:07:44 crc kubenswrapper[5004]: I1201 10:07:44.776809 5004 scope.go:117] "RemoveContainer" containerID="75b68982803dc483cc07a19b467d464cc4f5977e8915d8ac46aa6d3df19ff6d2" Dec 01 10:07:44 crc kubenswrapper[5004]: E1201 10:07:44.781898 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code 
= NotFound desc = could not find container \"75b68982803dc483cc07a19b467d464cc4f5977e8915d8ac46aa6d3df19ff6d2\": container with ID starting with 75b68982803dc483cc07a19b467d464cc4f5977e8915d8ac46aa6d3df19ff6d2 not found: ID does not exist" containerID="75b68982803dc483cc07a19b467d464cc4f5977e8915d8ac46aa6d3df19ff6d2" Dec 01 10:07:44 crc kubenswrapper[5004]: I1201 10:07:44.782205 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75b68982803dc483cc07a19b467d464cc4f5977e8915d8ac46aa6d3df19ff6d2"} err="failed to get container status \"75b68982803dc483cc07a19b467d464cc4f5977e8915d8ac46aa6d3df19ff6d2\": rpc error: code = NotFound desc = could not find container \"75b68982803dc483cc07a19b467d464cc4f5977e8915d8ac46aa6d3df19ff6d2\": container with ID starting with 75b68982803dc483cc07a19b467d464cc4f5977e8915d8ac46aa6d3df19ff6d2 not found: ID does not exist" Dec 01 10:07:44 crc kubenswrapper[5004]: I1201 10:07:44.782239 5004 scope.go:117] "RemoveContainer" containerID="17b2219f9c4991c8fc44ddcdf13b50f36f1820cc553f2a06a73176c62cd55c2f" Dec 01 10:07:44 crc kubenswrapper[5004]: E1201 10:07:44.783235 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17b2219f9c4991c8fc44ddcdf13b50f36f1820cc553f2a06a73176c62cd55c2f\": container with ID starting with 17b2219f9c4991c8fc44ddcdf13b50f36f1820cc553f2a06a73176c62cd55c2f not found: ID does not exist" containerID="17b2219f9c4991c8fc44ddcdf13b50f36f1820cc553f2a06a73176c62cd55c2f" Dec 01 10:07:44 crc kubenswrapper[5004]: I1201 10:07:44.783333 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17b2219f9c4991c8fc44ddcdf13b50f36f1820cc553f2a06a73176c62cd55c2f"} err="failed to get container status \"17b2219f9c4991c8fc44ddcdf13b50f36f1820cc553f2a06a73176c62cd55c2f\": rpc error: code = NotFound desc = could not find container 
\"17b2219f9c4991c8fc44ddcdf13b50f36f1820cc553f2a06a73176c62cd55c2f\": container with ID starting with 17b2219f9c4991c8fc44ddcdf13b50f36f1820cc553f2a06a73176c62cd55c2f not found: ID does not exist" Dec 01 10:07:44 crc kubenswrapper[5004]: I1201 10:07:44.798347 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c4504be-7de5-4997-85ad-14340f1ce362-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3c4504be-7de5-4997-85ad-14340f1ce362" (UID: "3c4504be-7de5-4997-85ad-14340f1ce362"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:07:44 crc kubenswrapper[5004]: I1201 10:07:44.837616 5004 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c4504be-7de5-4997-85ad-14340f1ce362-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 10:07:44 crc kubenswrapper[5004]: I1201 10:07:44.973489 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mhj88"] Dec 01 10:07:44 crc kubenswrapper[5004]: I1201 10:07:44.986735 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-mhj88"] Dec 01 10:07:46 crc kubenswrapper[5004]: I1201 10:07:46.777717 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c4504be-7de5-4997-85ad-14340f1ce362" path="/var/lib/kubelet/pods/3c4504be-7de5-4997-85ad-14340f1ce362/volumes" Dec 01 10:08:08 crc kubenswrapper[5004]: I1201 10:08:08.729467 5004 patch_prober.go:28] interesting pod/machine-config-daemon-fvdgt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:08:08 crc kubenswrapper[5004]: I1201 10:08:08.730196 5004 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:08:22 crc kubenswrapper[5004]: I1201 10:08:22.064605 5004 scope.go:117] "RemoveContainer" containerID="85b42fb039d635fe67a4574b59d1e44b7ab85b12d9f6109a13f8b77262a51031" Dec 01 10:08:38 crc kubenswrapper[5004]: I1201 10:08:38.729101 5004 patch_prober.go:28] interesting pod/machine-config-daemon-fvdgt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:08:38 crc kubenswrapper[5004]: I1201 10:08:38.729688 5004 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:08:38 crc kubenswrapper[5004]: I1201 10:08:38.729745 5004 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" Dec 01 10:08:38 crc kubenswrapper[5004]: I1201 10:08:38.730610 5004 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"bb7b2f98452ad6358b81c3ab624dff561c4061a1923f5ef6c17db875889257ec"} pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 10:08:38 crc kubenswrapper[5004]: I1201 10:08:38.730664 5004 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" containerName="machine-config-daemon" containerID="cri-o://bb7b2f98452ad6358b81c3ab624dff561c4061a1923f5ef6c17db875889257ec" gracePeriod=600 Dec 01 10:08:38 crc kubenswrapper[5004]: E1201 10:08:38.869836 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 10:08:39 crc kubenswrapper[5004]: I1201 10:08:39.212928 5004 generic.go:334] "Generic (PLEG): container finished" podID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" containerID="bb7b2f98452ad6358b81c3ab624dff561c4061a1923f5ef6c17db875889257ec" exitCode=0 Dec 01 10:08:39 crc kubenswrapper[5004]: I1201 10:08:39.212973 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" event={"ID":"f9977ebb-82de-4e96-8763-0b5a84f8d4ce","Type":"ContainerDied","Data":"bb7b2f98452ad6358b81c3ab624dff561c4061a1923f5ef6c17db875889257ec"} Dec 01 10:08:39 crc kubenswrapper[5004]: I1201 10:08:39.213003 5004 scope.go:117] "RemoveContainer" containerID="497710a2817802c21762db587a6989ef9d9b667ad5011a6d0d79313f386386ad" Dec 01 10:08:39 crc kubenswrapper[5004]: I1201 10:08:39.213774 5004 scope.go:117] "RemoveContainer" containerID="bb7b2f98452ad6358b81c3ab624dff561c4061a1923f5ef6c17db875889257ec" Dec 01 10:08:39 crc kubenswrapper[5004]: E1201 10:08:39.214181 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 10:08:54 crc kubenswrapper[5004]: I1201 10:08:54.759265 5004 scope.go:117] "RemoveContainer" containerID="bb7b2f98452ad6358b81c3ab624dff561c4061a1923f5ef6c17db875889257ec" Dec 01 10:08:54 crc kubenswrapper[5004]: E1201 10:08:54.760167 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 10:09:02 crc kubenswrapper[5004]: I1201 10:09:02.476669 5004 generic.go:334] "Generic (PLEG): container finished" podID="481141e8-7f09-4b09-b15d-4621f4ff7bab" containerID="ce164add57bfc59dcddfda167ea69e4adfcbb581de265b86a2460d41e4dcc79b" exitCode=0 Dec 01 10:09:02 crc kubenswrapper[5004]: I1201 10:09:02.476767 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xmsbt/must-gather-7cnt7" event={"ID":"481141e8-7f09-4b09-b15d-4621f4ff7bab","Type":"ContainerDied","Data":"ce164add57bfc59dcddfda167ea69e4adfcbb581de265b86a2460d41e4dcc79b"} Dec 01 10:09:02 crc kubenswrapper[5004]: I1201 10:09:02.477942 5004 scope.go:117] "RemoveContainer" containerID="ce164add57bfc59dcddfda167ea69e4adfcbb581de265b86a2460d41e4dcc79b" Dec 01 10:09:02 crc kubenswrapper[5004]: I1201 10:09:02.767947 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-xmsbt_must-gather-7cnt7_481141e8-7f09-4b09-b15d-4621f4ff7bab/gather/0.log" Dec 01 10:09:06 crc kubenswrapper[5004]: I1201 10:09:06.759495 5004 scope.go:117] "RemoveContainer" 
containerID="bb7b2f98452ad6358b81c3ab624dff561c4061a1923f5ef6c17db875889257ec" Dec 01 10:09:06 crc kubenswrapper[5004]: E1201 10:09:06.760421 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 10:09:10 crc kubenswrapper[5004]: I1201 10:09:10.537089 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-xmsbt/must-gather-7cnt7"] Dec 01 10:09:10 crc kubenswrapper[5004]: I1201 10:09:10.539519 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-xmsbt/must-gather-7cnt7" podUID="481141e8-7f09-4b09-b15d-4621f4ff7bab" containerName="copy" containerID="cri-o://62c34251faf762eef045feb5e0a6c341c1273373da979163d027c1f4821cee50" gracePeriod=2 Dec 01 10:09:10 crc kubenswrapper[5004]: I1201 10:09:10.553746 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-xmsbt/must-gather-7cnt7"] Dec 01 10:09:11 crc kubenswrapper[5004]: I1201 10:09:11.049317 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-xmsbt_must-gather-7cnt7_481141e8-7f09-4b09-b15d-4621f4ff7bab/copy/0.log" Dec 01 10:09:11 crc kubenswrapper[5004]: I1201 10:09:11.052880 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-xmsbt/must-gather-7cnt7" Dec 01 10:09:11 crc kubenswrapper[5004]: I1201 10:09:11.198889 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/481141e8-7f09-4b09-b15d-4621f4ff7bab-must-gather-output\") pod \"481141e8-7f09-4b09-b15d-4621f4ff7bab\" (UID: \"481141e8-7f09-4b09-b15d-4621f4ff7bab\") " Dec 01 10:09:11 crc kubenswrapper[5004]: I1201 10:09:11.199054 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-28ql9\" (UniqueName: \"kubernetes.io/projected/481141e8-7f09-4b09-b15d-4621f4ff7bab-kube-api-access-28ql9\") pod \"481141e8-7f09-4b09-b15d-4621f4ff7bab\" (UID: \"481141e8-7f09-4b09-b15d-4621f4ff7bab\") " Dec 01 10:09:11 crc kubenswrapper[5004]: I1201 10:09:11.206454 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/481141e8-7f09-4b09-b15d-4621f4ff7bab-kube-api-access-28ql9" (OuterVolumeSpecName: "kube-api-access-28ql9") pod "481141e8-7f09-4b09-b15d-4621f4ff7bab" (UID: "481141e8-7f09-4b09-b15d-4621f4ff7bab"). InnerVolumeSpecName "kube-api-access-28ql9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:09:11 crc kubenswrapper[5004]: I1201 10:09:11.303493 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-28ql9\" (UniqueName: \"kubernetes.io/projected/481141e8-7f09-4b09-b15d-4621f4ff7bab-kube-api-access-28ql9\") on node \"crc\" DevicePath \"\"" Dec 01 10:09:11 crc kubenswrapper[5004]: I1201 10:09:11.376707 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/481141e8-7f09-4b09-b15d-4621f4ff7bab-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "481141e8-7f09-4b09-b15d-4621f4ff7bab" (UID: "481141e8-7f09-4b09-b15d-4621f4ff7bab"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:09:11 crc kubenswrapper[5004]: I1201 10:09:11.408835 5004 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/481141e8-7f09-4b09-b15d-4621f4ff7bab-must-gather-output\") on node \"crc\" DevicePath \"\"" Dec 01 10:09:11 crc kubenswrapper[5004]: I1201 10:09:11.595067 5004 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-xmsbt_must-gather-7cnt7_481141e8-7f09-4b09-b15d-4621f4ff7bab/copy/0.log" Dec 01 10:09:11 crc kubenswrapper[5004]: I1201 10:09:11.595338 5004 generic.go:334] "Generic (PLEG): container finished" podID="481141e8-7f09-4b09-b15d-4621f4ff7bab" containerID="62c34251faf762eef045feb5e0a6c341c1273373da979163d027c1f4821cee50" exitCode=143 Dec 01 10:09:11 crc kubenswrapper[5004]: I1201 10:09:11.595393 5004 scope.go:117] "RemoveContainer" containerID="62c34251faf762eef045feb5e0a6c341c1273373da979163d027c1f4821cee50" Dec 01 10:09:11 crc kubenswrapper[5004]: I1201 10:09:11.595721 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-xmsbt/must-gather-7cnt7" Dec 01 10:09:11 crc kubenswrapper[5004]: I1201 10:09:11.634962 5004 scope.go:117] "RemoveContainer" containerID="ce164add57bfc59dcddfda167ea69e4adfcbb581de265b86a2460d41e4dcc79b" Dec 01 10:09:11 crc kubenswrapper[5004]: I1201 10:09:11.695273 5004 scope.go:117] "RemoveContainer" containerID="62c34251faf762eef045feb5e0a6c341c1273373da979163d027c1f4821cee50" Dec 01 10:09:11 crc kubenswrapper[5004]: E1201 10:09:11.695858 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62c34251faf762eef045feb5e0a6c341c1273373da979163d027c1f4821cee50\": container with ID starting with 62c34251faf762eef045feb5e0a6c341c1273373da979163d027c1f4821cee50 not found: ID does not exist" containerID="62c34251faf762eef045feb5e0a6c341c1273373da979163d027c1f4821cee50" Dec 01 10:09:11 crc kubenswrapper[5004]: I1201 10:09:11.696020 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62c34251faf762eef045feb5e0a6c341c1273373da979163d027c1f4821cee50"} err="failed to get container status \"62c34251faf762eef045feb5e0a6c341c1273373da979163d027c1f4821cee50\": rpc error: code = NotFound desc = could not find container \"62c34251faf762eef045feb5e0a6c341c1273373da979163d027c1f4821cee50\": container with ID starting with 62c34251faf762eef045feb5e0a6c341c1273373da979163d027c1f4821cee50 not found: ID does not exist" Dec 01 10:09:11 crc kubenswrapper[5004]: I1201 10:09:11.696157 5004 scope.go:117] "RemoveContainer" containerID="ce164add57bfc59dcddfda167ea69e4adfcbb581de265b86a2460d41e4dcc79b" Dec 01 10:09:11 crc kubenswrapper[5004]: E1201 10:09:11.696739 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce164add57bfc59dcddfda167ea69e4adfcbb581de265b86a2460d41e4dcc79b\": container with ID starting with 
ce164add57bfc59dcddfda167ea69e4adfcbb581de265b86a2460d41e4dcc79b not found: ID does not exist" containerID="ce164add57bfc59dcddfda167ea69e4adfcbb581de265b86a2460d41e4dcc79b" Dec 01 10:09:11 crc kubenswrapper[5004]: I1201 10:09:11.696794 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce164add57bfc59dcddfda167ea69e4adfcbb581de265b86a2460d41e4dcc79b"} err="failed to get container status \"ce164add57bfc59dcddfda167ea69e4adfcbb581de265b86a2460d41e4dcc79b\": rpc error: code = NotFound desc = could not find container \"ce164add57bfc59dcddfda167ea69e4adfcbb581de265b86a2460d41e4dcc79b\": container with ID starting with ce164add57bfc59dcddfda167ea69e4adfcbb581de265b86a2460d41e4dcc79b not found: ID does not exist" Dec 01 10:09:12 crc kubenswrapper[5004]: I1201 10:09:12.774427 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="481141e8-7f09-4b09-b15d-4621f4ff7bab" path="/var/lib/kubelet/pods/481141e8-7f09-4b09-b15d-4621f4ff7bab/volumes" Dec 01 10:09:19 crc kubenswrapper[5004]: I1201 10:09:19.758981 5004 scope.go:117] "RemoveContainer" containerID="bb7b2f98452ad6358b81c3ab624dff561c4061a1923f5ef6c17db875889257ec" Dec 01 10:09:19 crc kubenswrapper[5004]: E1201 10:09:19.760194 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 10:09:22 crc kubenswrapper[5004]: I1201 10:09:22.177678 5004 scope.go:117] "RemoveContainer" containerID="e4870a8d51858be9e9d5a86f36212e4ee37167d8742ae17af56649d8b8102824" Dec 01 10:09:25 crc kubenswrapper[5004]: I1201 10:09:25.674866 5004 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-marketplace-sm858"] Dec 01 10:09:25 crc kubenswrapper[5004]: E1201 10:09:25.676967 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c4504be-7de5-4997-85ad-14340f1ce362" containerName="extract-content" Dec 01 10:09:25 crc kubenswrapper[5004]: I1201 10:09:25.676997 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c4504be-7de5-4997-85ad-14340f1ce362" containerName="extract-content" Dec 01 10:09:25 crc kubenswrapper[5004]: E1201 10:09:25.677039 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="481141e8-7f09-4b09-b15d-4621f4ff7bab" containerName="gather" Dec 01 10:09:25 crc kubenswrapper[5004]: I1201 10:09:25.677051 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="481141e8-7f09-4b09-b15d-4621f4ff7bab" containerName="gather" Dec 01 10:09:25 crc kubenswrapper[5004]: E1201 10:09:25.677096 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c4504be-7de5-4997-85ad-14340f1ce362" containerName="registry-server" Dec 01 10:09:25 crc kubenswrapper[5004]: I1201 10:09:25.677105 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c4504be-7de5-4997-85ad-14340f1ce362" containerName="registry-server" Dec 01 10:09:25 crc kubenswrapper[5004]: E1201 10:09:25.677154 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c4504be-7de5-4997-85ad-14340f1ce362" containerName="extract-utilities" Dec 01 10:09:25 crc kubenswrapper[5004]: I1201 10:09:25.677163 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c4504be-7de5-4997-85ad-14340f1ce362" containerName="extract-utilities" Dec 01 10:09:25 crc kubenswrapper[5004]: E1201 10:09:25.677180 5004 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="481141e8-7f09-4b09-b15d-4621f4ff7bab" containerName="copy" Dec 01 10:09:25 crc kubenswrapper[5004]: I1201 10:09:25.677187 5004 state_mem.go:107] "Deleted CPUSet assignment" podUID="481141e8-7f09-4b09-b15d-4621f4ff7bab" 
containerName="copy" Dec 01 10:09:25 crc kubenswrapper[5004]: I1201 10:09:25.677522 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="481141e8-7f09-4b09-b15d-4621f4ff7bab" containerName="copy" Dec 01 10:09:25 crc kubenswrapper[5004]: I1201 10:09:25.677550 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="481141e8-7f09-4b09-b15d-4621f4ff7bab" containerName="gather" Dec 01 10:09:25 crc kubenswrapper[5004]: I1201 10:09:25.677596 5004 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c4504be-7de5-4997-85ad-14340f1ce362" containerName="registry-server" Dec 01 10:09:25 crc kubenswrapper[5004]: I1201 10:09:25.680484 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sm858" Dec 01 10:09:25 crc kubenswrapper[5004]: I1201 10:09:25.706890 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sm858"] Dec 01 10:09:25 crc kubenswrapper[5004]: I1201 10:09:25.855237 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z24qd\" (UniqueName: \"kubernetes.io/projected/bf9d1c6d-4aa4-4372-a35c-a30bc55a180f-kube-api-access-z24qd\") pod \"redhat-marketplace-sm858\" (UID: \"bf9d1c6d-4aa4-4372-a35c-a30bc55a180f\") " pod="openshift-marketplace/redhat-marketplace-sm858" Dec 01 10:09:25 crc kubenswrapper[5004]: I1201 10:09:25.855716 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf9d1c6d-4aa4-4372-a35c-a30bc55a180f-catalog-content\") pod \"redhat-marketplace-sm858\" (UID: \"bf9d1c6d-4aa4-4372-a35c-a30bc55a180f\") " pod="openshift-marketplace/redhat-marketplace-sm858" Dec 01 10:09:25 crc kubenswrapper[5004]: I1201 10:09:25.855787 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/bf9d1c6d-4aa4-4372-a35c-a30bc55a180f-utilities\") pod \"redhat-marketplace-sm858\" (UID: \"bf9d1c6d-4aa4-4372-a35c-a30bc55a180f\") " pod="openshift-marketplace/redhat-marketplace-sm858" Dec 01 10:09:25 crc kubenswrapper[5004]: I1201 10:09:25.958319 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf9d1c6d-4aa4-4372-a35c-a30bc55a180f-catalog-content\") pod \"redhat-marketplace-sm858\" (UID: \"bf9d1c6d-4aa4-4372-a35c-a30bc55a180f\") " pod="openshift-marketplace/redhat-marketplace-sm858" Dec 01 10:09:25 crc kubenswrapper[5004]: I1201 10:09:25.958720 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf9d1c6d-4aa4-4372-a35c-a30bc55a180f-utilities\") pod \"redhat-marketplace-sm858\" (UID: \"bf9d1c6d-4aa4-4372-a35c-a30bc55a180f\") " pod="openshift-marketplace/redhat-marketplace-sm858" Dec 01 10:09:25 crc kubenswrapper[5004]: I1201 10:09:25.958812 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z24qd\" (UniqueName: \"kubernetes.io/projected/bf9d1c6d-4aa4-4372-a35c-a30bc55a180f-kube-api-access-z24qd\") pod \"redhat-marketplace-sm858\" (UID: \"bf9d1c6d-4aa4-4372-a35c-a30bc55a180f\") " pod="openshift-marketplace/redhat-marketplace-sm858" Dec 01 10:09:25 crc kubenswrapper[5004]: I1201 10:09:25.958807 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf9d1c6d-4aa4-4372-a35c-a30bc55a180f-catalog-content\") pod \"redhat-marketplace-sm858\" (UID: \"bf9d1c6d-4aa4-4372-a35c-a30bc55a180f\") " pod="openshift-marketplace/redhat-marketplace-sm858" Dec 01 10:09:25 crc kubenswrapper[5004]: I1201 10:09:25.959120 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/bf9d1c6d-4aa4-4372-a35c-a30bc55a180f-utilities\") pod \"redhat-marketplace-sm858\" (UID: \"bf9d1c6d-4aa4-4372-a35c-a30bc55a180f\") " pod="openshift-marketplace/redhat-marketplace-sm858" Dec 01 10:09:25 crc kubenswrapper[5004]: I1201 10:09:25.995229 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z24qd\" (UniqueName: \"kubernetes.io/projected/bf9d1c6d-4aa4-4372-a35c-a30bc55a180f-kube-api-access-z24qd\") pod \"redhat-marketplace-sm858\" (UID: \"bf9d1c6d-4aa4-4372-a35c-a30bc55a180f\") " pod="openshift-marketplace/redhat-marketplace-sm858" Dec 01 10:09:26 crc kubenswrapper[5004]: I1201 10:09:26.015337 5004 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sm858" Dec 01 10:09:26 crc kubenswrapper[5004]: I1201 10:09:26.531284 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sm858"] Dec 01 10:09:26 crc kubenswrapper[5004]: I1201 10:09:26.773092 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sm858" event={"ID":"bf9d1c6d-4aa4-4372-a35c-a30bc55a180f","Type":"ContainerStarted","Data":"7d15c596da24cc9a9b1fcc94f40ddbe04066db9dbc9cb61996553ce87a5edd63"} Dec 01 10:09:26 crc kubenswrapper[5004]: I1201 10:09:26.773406 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sm858" event={"ID":"bf9d1c6d-4aa4-4372-a35c-a30bc55a180f","Type":"ContainerStarted","Data":"03b5c5c05a51f5d5eb2dda4b1f38ca6223716ed441cf21139657653208243560"} Dec 01 10:09:27 crc kubenswrapper[5004]: I1201 10:09:27.810426 5004 generic.go:334] "Generic (PLEG): container finished" podID="bf9d1c6d-4aa4-4372-a35c-a30bc55a180f" containerID="7d15c596da24cc9a9b1fcc94f40ddbe04066db9dbc9cb61996553ce87a5edd63" exitCode=0 Dec 01 10:09:27 crc kubenswrapper[5004]: I1201 10:09:27.811612 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-sm858" event={"ID":"bf9d1c6d-4aa4-4372-a35c-a30bc55a180f","Type":"ContainerDied","Data":"7d15c596da24cc9a9b1fcc94f40ddbe04066db9dbc9cb61996553ce87a5edd63"} Dec 01 10:09:27 crc kubenswrapper[5004]: I1201 10:09:27.811947 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sm858" event={"ID":"bf9d1c6d-4aa4-4372-a35c-a30bc55a180f","Type":"ContainerStarted","Data":"d227b6f11735ff002012a078c3330236a93930ffe85a47a2845434009341f17c"} Dec 01 10:09:28 crc kubenswrapper[5004]: I1201 10:09:28.827663 5004 generic.go:334] "Generic (PLEG): container finished" podID="bf9d1c6d-4aa4-4372-a35c-a30bc55a180f" containerID="d227b6f11735ff002012a078c3330236a93930ffe85a47a2845434009341f17c" exitCode=0 Dec 01 10:09:28 crc kubenswrapper[5004]: I1201 10:09:28.827796 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sm858" event={"ID":"bf9d1c6d-4aa4-4372-a35c-a30bc55a180f","Type":"ContainerDied","Data":"d227b6f11735ff002012a078c3330236a93930ffe85a47a2845434009341f17c"} Dec 01 10:09:29 crc kubenswrapper[5004]: I1201 10:09:29.271091 5004 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-j7crd"] Dec 01 10:09:29 crc kubenswrapper[5004]: I1201 10:09:29.274234 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-j7crd" Dec 01 10:09:29 crc kubenswrapper[5004]: I1201 10:09:29.281835 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-j7crd"] Dec 01 10:09:29 crc kubenswrapper[5004]: I1201 10:09:29.360370 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/153bd0d7-316b-49f3-a602-1ec3cf9f6023-catalog-content\") pod \"certified-operators-j7crd\" (UID: \"153bd0d7-316b-49f3-a602-1ec3cf9f6023\") " pod="openshift-marketplace/certified-operators-j7crd" Dec 01 10:09:29 crc kubenswrapper[5004]: I1201 10:09:29.360637 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/153bd0d7-316b-49f3-a602-1ec3cf9f6023-utilities\") pod \"certified-operators-j7crd\" (UID: \"153bd0d7-316b-49f3-a602-1ec3cf9f6023\") " pod="openshift-marketplace/certified-operators-j7crd" Dec 01 10:09:29 crc kubenswrapper[5004]: I1201 10:09:29.360672 5004 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2frdp\" (UniqueName: \"kubernetes.io/projected/153bd0d7-316b-49f3-a602-1ec3cf9f6023-kube-api-access-2frdp\") pod \"certified-operators-j7crd\" (UID: \"153bd0d7-316b-49f3-a602-1ec3cf9f6023\") " pod="openshift-marketplace/certified-operators-j7crd" Dec 01 10:09:29 crc kubenswrapper[5004]: I1201 10:09:29.463186 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/153bd0d7-316b-49f3-a602-1ec3cf9f6023-catalog-content\") pod \"certified-operators-j7crd\" (UID: \"153bd0d7-316b-49f3-a602-1ec3cf9f6023\") " pod="openshift-marketplace/certified-operators-j7crd" Dec 01 10:09:29 crc kubenswrapper[5004]: I1201 10:09:29.463405 5004 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/153bd0d7-316b-49f3-a602-1ec3cf9f6023-utilities\") pod \"certified-operators-j7crd\" (UID: \"153bd0d7-316b-49f3-a602-1ec3cf9f6023\") " pod="openshift-marketplace/certified-operators-j7crd" Dec 01 10:09:29 crc kubenswrapper[5004]: I1201 10:09:29.463436 5004 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2frdp\" (UniqueName: \"kubernetes.io/projected/153bd0d7-316b-49f3-a602-1ec3cf9f6023-kube-api-access-2frdp\") pod \"certified-operators-j7crd\" (UID: \"153bd0d7-316b-49f3-a602-1ec3cf9f6023\") " pod="openshift-marketplace/certified-operators-j7crd" Dec 01 10:09:29 crc kubenswrapper[5004]: I1201 10:09:29.464225 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/153bd0d7-316b-49f3-a602-1ec3cf9f6023-utilities\") pod \"certified-operators-j7crd\" (UID: \"153bd0d7-316b-49f3-a602-1ec3cf9f6023\") " pod="openshift-marketplace/certified-operators-j7crd" Dec 01 10:09:29 crc kubenswrapper[5004]: I1201 10:09:29.464328 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/153bd0d7-316b-49f3-a602-1ec3cf9f6023-catalog-content\") pod \"certified-operators-j7crd\" (UID: \"153bd0d7-316b-49f3-a602-1ec3cf9f6023\") " pod="openshift-marketplace/certified-operators-j7crd" Dec 01 10:09:29 crc kubenswrapper[5004]: I1201 10:09:29.485021 5004 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2frdp\" (UniqueName: \"kubernetes.io/projected/153bd0d7-316b-49f3-a602-1ec3cf9f6023-kube-api-access-2frdp\") pod \"certified-operators-j7crd\" (UID: \"153bd0d7-316b-49f3-a602-1ec3cf9f6023\") " pod="openshift-marketplace/certified-operators-j7crd" Dec 01 10:09:29 crc kubenswrapper[5004]: I1201 10:09:29.606719 5004 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-j7crd" Dec 01 10:09:30 crc kubenswrapper[5004]: I1201 10:09:29.842006 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sm858" event={"ID":"bf9d1c6d-4aa4-4372-a35c-a30bc55a180f","Type":"ContainerStarted","Data":"323fbb2311446a44c2a1c7916ffc81033b55e8625e8a0a572f4974e21ddb9f1f"} Dec 01 10:09:30 crc kubenswrapper[5004]: I1201 10:09:29.867305 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-sm858" podStartSLOduration=2.371325334 podStartE2EDuration="4.867284116s" podCreationTimestamp="2025-12-01 10:09:25 +0000 UTC" firstStartedPulling="2025-12-01 10:09:26.760720432 +0000 UTC m=+6744.325712414" lastFinishedPulling="2025-12-01 10:09:29.256679214 +0000 UTC m=+6746.821671196" observedRunningTime="2025-12-01 10:09:29.857600971 +0000 UTC m=+6747.422592963" watchObservedRunningTime="2025-12-01 10:09:29.867284116 +0000 UTC m=+6747.432276098" Dec 01 10:09:30 crc kubenswrapper[5004]: I1201 10:09:30.518438 5004 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-j7crd"] Dec 01 10:09:30 crc kubenswrapper[5004]: W1201 10:09:30.528010 5004 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod153bd0d7_316b_49f3_a602_1ec3cf9f6023.slice/crio-0b30505ef6726c496955e8c2d40dcd727147e3d59ff421d2d5eefcd72a49a84e WatchSource:0}: Error finding container 0b30505ef6726c496955e8c2d40dcd727147e3d59ff421d2d5eefcd72a49a84e: Status 404 returned error can't find the container with id 0b30505ef6726c496955e8c2d40dcd727147e3d59ff421d2d5eefcd72a49a84e Dec 01 10:09:30 crc kubenswrapper[5004]: I1201 10:09:30.854347 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j7crd" 
event={"ID":"153bd0d7-316b-49f3-a602-1ec3cf9f6023","Type":"ContainerStarted","Data":"b109e0cebcf8a891f253c397792cd8a663e69eb9cd887b9d40fb651c05ae6bf8"} Dec 01 10:09:30 crc kubenswrapper[5004]: I1201 10:09:30.854393 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j7crd" event={"ID":"153bd0d7-316b-49f3-a602-1ec3cf9f6023","Type":"ContainerStarted","Data":"0b30505ef6726c496955e8c2d40dcd727147e3d59ff421d2d5eefcd72a49a84e"} Dec 01 10:09:31 crc kubenswrapper[5004]: I1201 10:09:31.890257 5004 generic.go:334] "Generic (PLEG): container finished" podID="153bd0d7-316b-49f3-a602-1ec3cf9f6023" containerID="b109e0cebcf8a891f253c397792cd8a663e69eb9cd887b9d40fb651c05ae6bf8" exitCode=0 Dec 01 10:09:31 crc kubenswrapper[5004]: I1201 10:09:31.890600 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j7crd" event={"ID":"153bd0d7-316b-49f3-a602-1ec3cf9f6023","Type":"ContainerDied","Data":"b109e0cebcf8a891f253c397792cd8a663e69eb9cd887b9d40fb651c05ae6bf8"} Dec 01 10:09:32 crc kubenswrapper[5004]: I1201 10:09:32.768895 5004 scope.go:117] "RemoveContainer" containerID="bb7b2f98452ad6358b81c3ab624dff561c4061a1923f5ef6c17db875889257ec" Dec 01 10:09:32 crc kubenswrapper[5004]: E1201 10:09:32.769710 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 10:09:32 crc kubenswrapper[5004]: I1201 10:09:32.903176 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j7crd" 
event={"ID":"153bd0d7-316b-49f3-a602-1ec3cf9f6023","Type":"ContainerStarted","Data":"86f73834deb8edf680524467e631ca1a819e09e229f733505fe1942bfa2c0d31"} Dec 01 10:09:33 crc kubenswrapper[5004]: I1201 10:09:33.921032 5004 generic.go:334] "Generic (PLEG): container finished" podID="153bd0d7-316b-49f3-a602-1ec3cf9f6023" containerID="86f73834deb8edf680524467e631ca1a819e09e229f733505fe1942bfa2c0d31" exitCode=0 Dec 01 10:09:33 crc kubenswrapper[5004]: I1201 10:09:33.921121 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j7crd" event={"ID":"153bd0d7-316b-49f3-a602-1ec3cf9f6023","Type":"ContainerDied","Data":"86f73834deb8edf680524467e631ca1a819e09e229f733505fe1942bfa2c0d31"} Dec 01 10:09:35 crc kubenswrapper[5004]: I1201 10:09:35.949815 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j7crd" event={"ID":"153bd0d7-316b-49f3-a602-1ec3cf9f6023","Type":"ContainerStarted","Data":"f6c42e98737ad27f2a470202262d80927f0195875ad296e2fa80d0a520c337fe"} Dec 01 10:09:35 crc kubenswrapper[5004]: I1201 10:09:35.979649 5004 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-j7crd" podStartSLOduration=3.1254880050000002 podStartE2EDuration="6.97962548s" podCreationTimestamp="2025-12-01 10:09:29 +0000 UTC" firstStartedPulling="2025-12-01 10:09:30.857205237 +0000 UTC m=+6748.422197219" lastFinishedPulling="2025-12-01 10:09:34.711342712 +0000 UTC m=+6752.276334694" observedRunningTime="2025-12-01 10:09:35.965982829 +0000 UTC m=+6753.530974821" watchObservedRunningTime="2025-12-01 10:09:35.97962548 +0000 UTC m=+6753.544617462" Dec 01 10:09:36 crc kubenswrapper[5004]: I1201 10:09:36.015515 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-sm858" Dec 01 10:09:36 crc kubenswrapper[5004]: I1201 10:09:36.015596 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/redhat-marketplace-sm858" Dec 01 10:09:36 crc kubenswrapper[5004]: I1201 10:09:36.068723 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-sm858" Dec 01 10:09:37 crc kubenswrapper[5004]: I1201 10:09:37.010716 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-sm858" Dec 01 10:09:38 crc kubenswrapper[5004]: I1201 10:09:38.260750 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sm858"] Dec 01 10:09:38 crc kubenswrapper[5004]: I1201 10:09:38.982077 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-sm858" podUID="bf9d1c6d-4aa4-4372-a35c-a30bc55a180f" containerName="registry-server" containerID="cri-o://323fbb2311446a44c2a1c7916ffc81033b55e8625e8a0a572f4974e21ddb9f1f" gracePeriod=2 Dec 01 10:09:39 crc kubenswrapper[5004]: I1201 10:09:39.452544 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sm858" Dec 01 10:09:39 crc kubenswrapper[5004]: I1201 10:09:39.512984 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf9d1c6d-4aa4-4372-a35c-a30bc55a180f-utilities\") pod \"bf9d1c6d-4aa4-4372-a35c-a30bc55a180f\" (UID: \"bf9d1c6d-4aa4-4372-a35c-a30bc55a180f\") " Dec 01 10:09:39 crc kubenswrapper[5004]: I1201 10:09:39.513329 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf9d1c6d-4aa4-4372-a35c-a30bc55a180f-catalog-content\") pod \"bf9d1c6d-4aa4-4372-a35c-a30bc55a180f\" (UID: \"bf9d1c6d-4aa4-4372-a35c-a30bc55a180f\") " Dec 01 10:09:39 crc kubenswrapper[5004]: I1201 10:09:39.513647 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z24qd\" (UniqueName: \"kubernetes.io/projected/bf9d1c6d-4aa4-4372-a35c-a30bc55a180f-kube-api-access-z24qd\") pod \"bf9d1c6d-4aa4-4372-a35c-a30bc55a180f\" (UID: \"bf9d1c6d-4aa4-4372-a35c-a30bc55a180f\") " Dec 01 10:09:39 crc kubenswrapper[5004]: I1201 10:09:39.514082 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf9d1c6d-4aa4-4372-a35c-a30bc55a180f-utilities" (OuterVolumeSpecName: "utilities") pod "bf9d1c6d-4aa4-4372-a35c-a30bc55a180f" (UID: "bf9d1c6d-4aa4-4372-a35c-a30bc55a180f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:09:39 crc kubenswrapper[5004]: I1201 10:09:39.514902 5004 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf9d1c6d-4aa4-4372-a35c-a30bc55a180f-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 10:09:39 crc kubenswrapper[5004]: I1201 10:09:39.521323 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf9d1c6d-4aa4-4372-a35c-a30bc55a180f-kube-api-access-z24qd" (OuterVolumeSpecName: "kube-api-access-z24qd") pod "bf9d1c6d-4aa4-4372-a35c-a30bc55a180f" (UID: "bf9d1c6d-4aa4-4372-a35c-a30bc55a180f"). InnerVolumeSpecName "kube-api-access-z24qd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:09:39 crc kubenswrapper[5004]: I1201 10:09:39.535817 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf9d1c6d-4aa4-4372-a35c-a30bc55a180f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bf9d1c6d-4aa4-4372-a35c-a30bc55a180f" (UID: "bf9d1c6d-4aa4-4372-a35c-a30bc55a180f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:09:39 crc kubenswrapper[5004]: I1201 10:09:39.607762 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-j7crd" Dec 01 10:09:39 crc kubenswrapper[5004]: I1201 10:09:39.607819 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-j7crd" Dec 01 10:09:39 crc kubenswrapper[5004]: I1201 10:09:39.617383 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z24qd\" (UniqueName: \"kubernetes.io/projected/bf9d1c6d-4aa4-4372-a35c-a30bc55a180f-kube-api-access-z24qd\") on node \"crc\" DevicePath \"\"" Dec 01 10:09:39 crc kubenswrapper[5004]: I1201 10:09:39.617418 5004 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf9d1c6d-4aa4-4372-a35c-a30bc55a180f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 10:09:39 crc kubenswrapper[5004]: I1201 10:09:39.658413 5004 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-j7crd" Dec 01 10:09:39 crc kubenswrapper[5004]: I1201 10:09:39.998077 5004 generic.go:334] "Generic (PLEG): container finished" podID="bf9d1c6d-4aa4-4372-a35c-a30bc55a180f" containerID="323fbb2311446a44c2a1c7916ffc81033b55e8625e8a0a572f4974e21ddb9f1f" exitCode=0 Dec 01 10:09:39 crc kubenswrapper[5004]: I1201 10:09:39.998400 5004 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sm858" Dec 01 10:09:39 crc kubenswrapper[5004]: I1201 10:09:39.998395 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sm858" event={"ID":"bf9d1c6d-4aa4-4372-a35c-a30bc55a180f","Type":"ContainerDied","Data":"323fbb2311446a44c2a1c7916ffc81033b55e8625e8a0a572f4974e21ddb9f1f"} Dec 01 10:09:39 crc kubenswrapper[5004]: I1201 10:09:39.998605 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sm858" event={"ID":"bf9d1c6d-4aa4-4372-a35c-a30bc55a180f","Type":"ContainerDied","Data":"03b5c5c05a51f5d5eb2dda4b1f38ca6223716ed441cf21139657653208243560"} Dec 01 10:09:39 crc kubenswrapper[5004]: I1201 10:09:39.998626 5004 scope.go:117] "RemoveContainer" containerID="323fbb2311446a44c2a1c7916ffc81033b55e8625e8a0a572f4974e21ddb9f1f" Dec 01 10:09:40 crc kubenswrapper[5004]: I1201 10:09:40.034754 5004 scope.go:117] "RemoveContainer" containerID="d227b6f11735ff002012a078c3330236a93930ffe85a47a2845434009341f17c" Dec 01 10:09:40 crc kubenswrapper[5004]: I1201 10:09:40.037316 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sm858"] Dec 01 10:09:40 crc kubenswrapper[5004]: I1201 10:09:40.050916 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-sm858"] Dec 01 10:09:40 crc kubenswrapper[5004]: I1201 10:09:40.058785 5004 scope.go:117] "RemoveContainer" containerID="7d15c596da24cc9a9b1fcc94f40ddbe04066db9dbc9cb61996553ce87a5edd63" Dec 01 10:09:40 crc kubenswrapper[5004]: I1201 10:09:40.063589 5004 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-j7crd" Dec 01 10:09:40 crc kubenswrapper[5004]: I1201 10:09:40.122955 5004 scope.go:117] "RemoveContainer" containerID="323fbb2311446a44c2a1c7916ffc81033b55e8625e8a0a572f4974e21ddb9f1f" Dec 01 10:09:40 crc 
kubenswrapper[5004]: E1201 10:09:40.123356 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"323fbb2311446a44c2a1c7916ffc81033b55e8625e8a0a572f4974e21ddb9f1f\": container with ID starting with 323fbb2311446a44c2a1c7916ffc81033b55e8625e8a0a572f4974e21ddb9f1f not found: ID does not exist" containerID="323fbb2311446a44c2a1c7916ffc81033b55e8625e8a0a572f4974e21ddb9f1f" Dec 01 10:09:40 crc kubenswrapper[5004]: I1201 10:09:40.123398 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"323fbb2311446a44c2a1c7916ffc81033b55e8625e8a0a572f4974e21ddb9f1f"} err="failed to get container status \"323fbb2311446a44c2a1c7916ffc81033b55e8625e8a0a572f4974e21ddb9f1f\": rpc error: code = NotFound desc = could not find container \"323fbb2311446a44c2a1c7916ffc81033b55e8625e8a0a572f4974e21ddb9f1f\": container with ID starting with 323fbb2311446a44c2a1c7916ffc81033b55e8625e8a0a572f4974e21ddb9f1f not found: ID does not exist" Dec 01 10:09:40 crc kubenswrapper[5004]: I1201 10:09:40.123424 5004 scope.go:117] "RemoveContainer" containerID="d227b6f11735ff002012a078c3330236a93930ffe85a47a2845434009341f17c" Dec 01 10:09:40 crc kubenswrapper[5004]: E1201 10:09:40.123814 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d227b6f11735ff002012a078c3330236a93930ffe85a47a2845434009341f17c\": container with ID starting with d227b6f11735ff002012a078c3330236a93930ffe85a47a2845434009341f17c not found: ID does not exist" containerID="d227b6f11735ff002012a078c3330236a93930ffe85a47a2845434009341f17c" Dec 01 10:09:40 crc kubenswrapper[5004]: I1201 10:09:40.123848 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d227b6f11735ff002012a078c3330236a93930ffe85a47a2845434009341f17c"} err="failed to get container status 
\"d227b6f11735ff002012a078c3330236a93930ffe85a47a2845434009341f17c\": rpc error: code = NotFound desc = could not find container \"d227b6f11735ff002012a078c3330236a93930ffe85a47a2845434009341f17c\": container with ID starting with d227b6f11735ff002012a078c3330236a93930ffe85a47a2845434009341f17c not found: ID does not exist" Dec 01 10:09:40 crc kubenswrapper[5004]: I1201 10:09:40.123868 5004 scope.go:117] "RemoveContainer" containerID="7d15c596da24cc9a9b1fcc94f40ddbe04066db9dbc9cb61996553ce87a5edd63" Dec 01 10:09:40 crc kubenswrapper[5004]: E1201 10:09:40.124096 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d15c596da24cc9a9b1fcc94f40ddbe04066db9dbc9cb61996553ce87a5edd63\": container with ID starting with 7d15c596da24cc9a9b1fcc94f40ddbe04066db9dbc9cb61996553ce87a5edd63 not found: ID does not exist" containerID="7d15c596da24cc9a9b1fcc94f40ddbe04066db9dbc9cb61996553ce87a5edd63" Dec 01 10:09:40 crc kubenswrapper[5004]: I1201 10:09:40.124123 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d15c596da24cc9a9b1fcc94f40ddbe04066db9dbc9cb61996553ce87a5edd63"} err="failed to get container status \"7d15c596da24cc9a9b1fcc94f40ddbe04066db9dbc9cb61996553ce87a5edd63\": rpc error: code = NotFound desc = could not find container \"7d15c596da24cc9a9b1fcc94f40ddbe04066db9dbc9cb61996553ce87a5edd63\": container with ID starting with 7d15c596da24cc9a9b1fcc94f40ddbe04066db9dbc9cb61996553ce87a5edd63 not found: ID does not exist" Dec 01 10:09:40 crc kubenswrapper[5004]: I1201 10:09:40.772013 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf9d1c6d-4aa4-4372-a35c-a30bc55a180f" path="/var/lib/kubelet/pods/bf9d1c6d-4aa4-4372-a35c-a30bc55a180f/volumes" Dec 01 10:09:42 crc kubenswrapper[5004]: I1201 10:09:42.058981 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-j7crd"] Dec 01 
10:09:42 crc kubenswrapper[5004]: I1201 10:09:42.059203 5004 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-j7crd" podUID="153bd0d7-316b-49f3-a602-1ec3cf9f6023" containerName="registry-server" containerID="cri-o://f6c42e98737ad27f2a470202262d80927f0195875ad296e2fa80d0a520c337fe" gracePeriod=2 Dec 01 10:09:42 crc kubenswrapper[5004]: I1201 10:09:42.656510 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-j7crd" Dec 01 10:09:42 crc kubenswrapper[5004]: I1201 10:09:42.800104 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/153bd0d7-316b-49f3-a602-1ec3cf9f6023-utilities\") pod \"153bd0d7-316b-49f3-a602-1ec3cf9f6023\" (UID: \"153bd0d7-316b-49f3-a602-1ec3cf9f6023\") " Dec 01 10:09:42 crc kubenswrapper[5004]: I1201 10:09:42.800195 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2frdp\" (UniqueName: \"kubernetes.io/projected/153bd0d7-316b-49f3-a602-1ec3cf9f6023-kube-api-access-2frdp\") pod \"153bd0d7-316b-49f3-a602-1ec3cf9f6023\" (UID: \"153bd0d7-316b-49f3-a602-1ec3cf9f6023\") " Dec 01 10:09:42 crc kubenswrapper[5004]: I1201 10:09:42.800359 5004 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/153bd0d7-316b-49f3-a602-1ec3cf9f6023-catalog-content\") pod \"153bd0d7-316b-49f3-a602-1ec3cf9f6023\" (UID: \"153bd0d7-316b-49f3-a602-1ec3cf9f6023\") " Dec 01 10:09:42 crc kubenswrapper[5004]: I1201 10:09:42.801192 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/153bd0d7-316b-49f3-a602-1ec3cf9f6023-utilities" (OuterVolumeSpecName: "utilities") pod "153bd0d7-316b-49f3-a602-1ec3cf9f6023" (UID: "153bd0d7-316b-49f3-a602-1ec3cf9f6023"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:09:42 crc kubenswrapper[5004]: I1201 10:09:42.816949 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/153bd0d7-316b-49f3-a602-1ec3cf9f6023-kube-api-access-2frdp" (OuterVolumeSpecName: "kube-api-access-2frdp") pod "153bd0d7-316b-49f3-a602-1ec3cf9f6023" (UID: "153bd0d7-316b-49f3-a602-1ec3cf9f6023"). InnerVolumeSpecName "kube-api-access-2frdp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:09:42 crc kubenswrapper[5004]: I1201 10:09:42.853843 5004 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/153bd0d7-316b-49f3-a602-1ec3cf9f6023-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "153bd0d7-316b-49f3-a602-1ec3cf9f6023" (UID: "153bd0d7-316b-49f3-a602-1ec3cf9f6023"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:09:42 crc kubenswrapper[5004]: I1201 10:09:42.903794 5004 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/153bd0d7-316b-49f3-a602-1ec3cf9f6023-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 10:09:42 crc kubenswrapper[5004]: I1201 10:09:42.904592 5004 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2frdp\" (UniqueName: \"kubernetes.io/projected/153bd0d7-316b-49f3-a602-1ec3cf9f6023-kube-api-access-2frdp\") on node \"crc\" DevicePath \"\"" Dec 01 10:09:42 crc kubenswrapper[5004]: I1201 10:09:42.904631 5004 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/153bd0d7-316b-49f3-a602-1ec3cf9f6023-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 10:09:43 crc kubenswrapper[5004]: I1201 10:09:43.036092 5004 generic.go:334] "Generic (PLEG): container finished" podID="153bd0d7-316b-49f3-a602-1ec3cf9f6023" 
containerID="f6c42e98737ad27f2a470202262d80927f0195875ad296e2fa80d0a520c337fe" exitCode=0 Dec 01 10:09:43 crc kubenswrapper[5004]: I1201 10:09:43.036141 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j7crd" event={"ID":"153bd0d7-316b-49f3-a602-1ec3cf9f6023","Type":"ContainerDied","Data":"f6c42e98737ad27f2a470202262d80927f0195875ad296e2fa80d0a520c337fe"} Dec 01 10:09:43 crc kubenswrapper[5004]: I1201 10:09:43.036477 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j7crd" event={"ID":"153bd0d7-316b-49f3-a602-1ec3cf9f6023","Type":"ContainerDied","Data":"0b30505ef6726c496955e8c2d40dcd727147e3d59ff421d2d5eefcd72a49a84e"} Dec 01 10:09:43 crc kubenswrapper[5004]: I1201 10:09:43.036175 5004 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-j7crd" Dec 01 10:09:43 crc kubenswrapper[5004]: I1201 10:09:43.036545 5004 scope.go:117] "RemoveContainer" containerID="f6c42e98737ad27f2a470202262d80927f0195875ad296e2fa80d0a520c337fe" Dec 01 10:09:43 crc kubenswrapper[5004]: I1201 10:09:43.065540 5004 scope.go:117] "RemoveContainer" containerID="86f73834deb8edf680524467e631ca1a819e09e229f733505fe1942bfa2c0d31" Dec 01 10:09:43 crc kubenswrapper[5004]: I1201 10:09:43.078129 5004 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-j7crd"] Dec 01 10:09:43 crc kubenswrapper[5004]: I1201 10:09:43.121046 5004 scope.go:117] "RemoveContainer" containerID="b109e0cebcf8a891f253c397792cd8a663e69eb9cd887b9d40fb651c05ae6bf8" Dec 01 10:09:43 crc kubenswrapper[5004]: I1201 10:09:43.121275 5004 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-j7crd"] Dec 01 10:09:43 crc kubenswrapper[5004]: I1201 10:09:43.157775 5004 scope.go:117] "RemoveContainer" containerID="f6c42e98737ad27f2a470202262d80927f0195875ad296e2fa80d0a520c337fe" Dec 01 
10:09:43 crc kubenswrapper[5004]: E1201 10:09:43.158436 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6c42e98737ad27f2a470202262d80927f0195875ad296e2fa80d0a520c337fe\": container with ID starting with f6c42e98737ad27f2a470202262d80927f0195875ad296e2fa80d0a520c337fe not found: ID does not exist" containerID="f6c42e98737ad27f2a470202262d80927f0195875ad296e2fa80d0a520c337fe" Dec 01 10:09:43 crc kubenswrapper[5004]: I1201 10:09:43.158497 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6c42e98737ad27f2a470202262d80927f0195875ad296e2fa80d0a520c337fe"} err="failed to get container status \"f6c42e98737ad27f2a470202262d80927f0195875ad296e2fa80d0a520c337fe\": rpc error: code = NotFound desc = could not find container \"f6c42e98737ad27f2a470202262d80927f0195875ad296e2fa80d0a520c337fe\": container with ID starting with f6c42e98737ad27f2a470202262d80927f0195875ad296e2fa80d0a520c337fe not found: ID does not exist" Dec 01 10:09:43 crc kubenswrapper[5004]: I1201 10:09:43.158534 5004 scope.go:117] "RemoveContainer" containerID="86f73834deb8edf680524467e631ca1a819e09e229f733505fe1942bfa2c0d31" Dec 01 10:09:43 crc kubenswrapper[5004]: E1201 10:09:43.158989 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86f73834deb8edf680524467e631ca1a819e09e229f733505fe1942bfa2c0d31\": container with ID starting with 86f73834deb8edf680524467e631ca1a819e09e229f733505fe1942bfa2c0d31 not found: ID does not exist" containerID="86f73834deb8edf680524467e631ca1a819e09e229f733505fe1942bfa2c0d31" Dec 01 10:09:43 crc kubenswrapper[5004]: I1201 10:09:43.159032 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86f73834deb8edf680524467e631ca1a819e09e229f733505fe1942bfa2c0d31"} err="failed to get container status 
\"86f73834deb8edf680524467e631ca1a819e09e229f733505fe1942bfa2c0d31\": rpc error: code = NotFound desc = could not find container \"86f73834deb8edf680524467e631ca1a819e09e229f733505fe1942bfa2c0d31\": container with ID starting with 86f73834deb8edf680524467e631ca1a819e09e229f733505fe1942bfa2c0d31 not found: ID does not exist" Dec 01 10:09:43 crc kubenswrapper[5004]: I1201 10:09:43.159053 5004 scope.go:117] "RemoveContainer" containerID="b109e0cebcf8a891f253c397792cd8a663e69eb9cd887b9d40fb651c05ae6bf8" Dec 01 10:09:43 crc kubenswrapper[5004]: E1201 10:09:43.159309 5004 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b109e0cebcf8a891f253c397792cd8a663e69eb9cd887b9d40fb651c05ae6bf8\": container with ID starting with b109e0cebcf8a891f253c397792cd8a663e69eb9cd887b9d40fb651c05ae6bf8 not found: ID does not exist" containerID="b109e0cebcf8a891f253c397792cd8a663e69eb9cd887b9d40fb651c05ae6bf8" Dec 01 10:09:43 crc kubenswrapper[5004]: I1201 10:09:43.159340 5004 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b109e0cebcf8a891f253c397792cd8a663e69eb9cd887b9d40fb651c05ae6bf8"} err="failed to get container status \"b109e0cebcf8a891f253c397792cd8a663e69eb9cd887b9d40fb651c05ae6bf8\": rpc error: code = NotFound desc = could not find container \"b109e0cebcf8a891f253c397792cd8a663e69eb9cd887b9d40fb651c05ae6bf8\": container with ID starting with b109e0cebcf8a891f253c397792cd8a663e69eb9cd887b9d40fb651c05ae6bf8 not found: ID does not exist" Dec 01 10:09:44 crc kubenswrapper[5004]: I1201 10:09:44.775074 5004 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="153bd0d7-316b-49f3-a602-1ec3cf9f6023" path="/var/lib/kubelet/pods/153bd0d7-316b-49f3-a602-1ec3cf9f6023/volumes" Dec 01 10:09:47 crc kubenswrapper[5004]: I1201 10:09:47.759388 5004 scope.go:117] "RemoveContainer" containerID="bb7b2f98452ad6358b81c3ab624dff561c4061a1923f5ef6c17db875889257ec" Dec 01 
10:09:47 crc kubenswrapper[5004]: E1201 10:09:47.760359 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 10:09:59 crc kubenswrapper[5004]: I1201 10:09:59.759715 5004 scope.go:117] "RemoveContainer" containerID="bb7b2f98452ad6358b81c3ab624dff561c4061a1923f5ef6c17db875889257ec" Dec 01 10:09:59 crc kubenswrapper[5004]: E1201 10:09:59.760734 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 10:10:14 crc kubenswrapper[5004]: I1201 10:10:14.759060 5004 scope.go:117] "RemoveContainer" containerID="bb7b2f98452ad6358b81c3ab624dff561c4061a1923f5ef6c17db875889257ec" Dec 01 10:10:14 crc kubenswrapper[5004]: E1201 10:10:14.761084 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 10:10:26 crc kubenswrapper[5004]: I1201 10:10:26.759445 5004 scope.go:117] "RemoveContainer" 
containerID="bb7b2f98452ad6358b81c3ab624dff561c4061a1923f5ef6c17db875889257ec" Dec 01 10:10:26 crc kubenswrapper[5004]: E1201 10:10:26.760392 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 10:10:41 crc kubenswrapper[5004]: I1201 10:10:41.758917 5004 scope.go:117] "RemoveContainer" containerID="bb7b2f98452ad6358b81c3ab624dff561c4061a1923f5ef6c17db875889257ec" Dec 01 10:10:41 crc kubenswrapper[5004]: E1201 10:10:41.759773 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 10:10:55 crc kubenswrapper[5004]: I1201 10:10:55.758599 5004 scope.go:117] "RemoveContainer" containerID="bb7b2f98452ad6358b81c3ab624dff561c4061a1923f5ef6c17db875889257ec" Dec 01 10:10:55 crc kubenswrapper[5004]: E1201 10:10:55.761510 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 10:11:09 crc kubenswrapper[5004]: I1201 10:11:09.758757 5004 scope.go:117] 
"RemoveContainer" containerID="bb7b2f98452ad6358b81c3ab624dff561c4061a1923f5ef6c17db875889257ec" Dec 01 10:11:09 crc kubenswrapper[5004]: E1201 10:11:09.759594 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 10:11:23 crc kubenswrapper[5004]: I1201 10:11:23.759053 5004 scope.go:117] "RemoveContainer" containerID="bb7b2f98452ad6358b81c3ab624dff561c4061a1923f5ef6c17db875889257ec" Dec 01 10:11:23 crc kubenswrapper[5004]: E1201 10:11:23.759951 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 10:11:34 crc kubenswrapper[5004]: I1201 10:11:34.759680 5004 scope.go:117] "RemoveContainer" containerID="bb7b2f98452ad6358b81c3ab624dff561c4061a1923f5ef6c17db875889257ec" Dec 01 10:11:34 crc kubenswrapper[5004]: E1201 10:11:34.760451 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 10:11:46 crc kubenswrapper[5004]: I1201 10:11:46.759604 
5004 scope.go:117] "RemoveContainer" containerID="bb7b2f98452ad6358b81c3ab624dff561c4061a1923f5ef6c17db875889257ec" Dec 01 10:11:46 crc kubenswrapper[5004]: E1201 10:11:46.761686 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 10:11:58 crc kubenswrapper[5004]: I1201 10:11:58.759795 5004 scope.go:117] "RemoveContainer" containerID="bb7b2f98452ad6358b81c3ab624dff561c4061a1923f5ef6c17db875889257ec" Dec 01 10:11:58 crc kubenswrapper[5004]: E1201 10:11:58.761082 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 10:12:11 crc kubenswrapper[5004]: I1201 10:12:11.759411 5004 scope.go:117] "RemoveContainer" containerID="bb7b2f98452ad6358b81c3ab624dff561c4061a1923f5ef6c17db875889257ec" Dec 01 10:12:11 crc kubenswrapper[5004]: E1201 10:12:11.760201 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 10:12:24 crc kubenswrapper[5004]: I1201 
10:12:24.759631 5004 scope.go:117] "RemoveContainer" containerID="bb7b2f98452ad6358b81c3ab624dff561c4061a1923f5ef6c17db875889257ec" Dec 01 10:12:24 crc kubenswrapper[5004]: E1201 10:12:24.760453 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 10:12:37 crc kubenswrapper[5004]: I1201 10:12:37.759096 5004 scope.go:117] "RemoveContainer" containerID="bb7b2f98452ad6358b81c3ab624dff561c4061a1923f5ef6c17db875889257ec" Dec 01 10:12:37 crc kubenswrapper[5004]: E1201 10:12:37.760186 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 10:12:50 crc kubenswrapper[5004]: I1201 10:12:50.759380 5004 scope.go:117] "RemoveContainer" containerID="bb7b2f98452ad6358b81c3ab624dff561c4061a1923f5ef6c17db875889257ec" Dec 01 10:12:50 crc kubenswrapper[5004]: E1201 10:12:50.760234 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 10:13:04 crc 
kubenswrapper[5004]: I1201 10:13:04.758712 5004 scope.go:117] "RemoveContainer" containerID="bb7b2f98452ad6358b81c3ab624dff561c4061a1923f5ef6c17db875889257ec" Dec 01 10:13:04 crc kubenswrapper[5004]: E1201 10:13:04.759599 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 10:13:18 crc kubenswrapper[5004]: I1201 10:13:18.760737 5004 scope.go:117] "RemoveContainer" containerID="bb7b2f98452ad6358b81c3ab624dff561c4061a1923f5ef6c17db875889257ec" Dec 01 10:13:18 crc kubenswrapper[5004]: E1201 10:13:18.762026 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 01 10:13:32 crc kubenswrapper[5004]: I1201 10:13:32.766785 5004 scope.go:117] "RemoveContainer" containerID="bb7b2f98452ad6358b81c3ab624dff561c4061a1923f5ef6c17db875889257ec" Dec 01 10:13:32 crc kubenswrapper[5004]: E1201 10:13:32.767646 5004 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fvdgt_openshift-machine-config-operator(f9977ebb-82de-4e96-8763-0b5a84f8d4ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" podUID="f9977ebb-82de-4e96-8763-0b5a84f8d4ce" Dec 
01 10:13:45 crc kubenswrapper[5004]: I1201 10:13:45.759691 5004 scope.go:117] "RemoveContainer" containerID="bb7b2f98452ad6358b81c3ab624dff561c4061a1923f5ef6c17db875889257ec" Dec 01 10:13:46 crc kubenswrapper[5004]: I1201 10:13:46.669425 5004 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fvdgt" event={"ID":"f9977ebb-82de-4e96-8763-0b5a84f8d4ce","Type":"ContainerStarted","Data":"0c4148c57ed8e5ffabe0c5d6eb083651f094d83bb39932f4a14cdbc4b2044062"}